Nine Reasons Why 8GB Is Only Just Enough (For A Professional Business Software Developer)

by Jon Davis 13. February 2009 21:29

Today I installed 8GB on my home workstation/playstation. I had 8GB lying around already from a voluntary purchase for a prior workplace (I took my RAM back and put the work-provided RAM back in before I left that job) but the brand of RAM didn’t work correctly on my home PC’s motherboard. It’s all good now though, some high quality performance RAM from OCZ and my Windows 7 system self-rating on the RAM I/O jumped from 5.9 to 7.2.

At my new job I had to request a RAM upgrade from 2GB to 4GB. (Since it’s 32-bit XP I couldn’t go any higher.) I asked about when 64-bit Windows Vista or Windows 7 would be put on the table for consideration as an option for employees, I was told “there are no plans for 64-bit”.

The same thing happened with my last short-term gig. Good God, corporate IT folks everywhere are stuck in the year 2002. I can barely function at 4GB, can’t function much at all at 2GB.

By quadrupling the performance of your employee's system, you’d effectively double the productivity of your employee; it’s like getting a new employee for free.

If you are a multi-role developer and aren’t already saturating at least 4GB of RAM you are throwing away your employer’s money, and if you are IT and not providing at least 4GB RAM to developers and actively working on adding corporate support for 64-bit for employees’ workstations you are costing the company a ton of money due to productivity loss!! I don’t know how many times I’ve seen people restart their computers or sit and wait for 2 minutes for Visual Studio to come up because their machine is bogged down on a swap file. That was “typical” half a decade ago, but it’s not acceptable anymore. The same is true of hard drive space. Fast 1 Terabyte hard drives are available for less than $100 these days, there is simply no excuse. For any employee who makes more than X (say, $25,000), for Pete’s sake, throw in an extra $1000-$2000 or so more and get the employee two large (24-inch) monitors, at least 1TB hard drive(s) (ideally 4 drives in a RAID-0+1 array), 64-bit Windows Server 2008 / Windows Vista / Windows 7, a quad-core CPU, and 8 GB of some high performance (800+ GHz) RAM. It’s not that that’s another $2,000 or so to lose; it’s that just $2,000 will save you many thousands more dough. By quadrupling the performance of your employee's system, you’d effectively double the productivity of your employee; it’s like getting a new employee for free. And if you are the employee, making double of X (say, more than $50,000), and if your employer could somehow allow it (and they should, shame on them if they don’t and they won’t do it themselves), you should go out and get your own hardware upgrades. Make yourself twice as productive, and earn your pay with pride.

In a business environment, whether one is paid by the hour or salaried (already expected to work X hours a week, which is effectively loosely translated to hourly anyway), time = money. Period. This is not about developers enjoying a luxury, it’s about them saving time and employers saving money.

Note to the morons who argue “this is why developers are writing big, bloated software that suck up resources” .. Dear moron, this post is from the perspective of an actual developer’s workstation, not a mere bit-twiddling programmer—a developer, that is, who wears many hats and must not just write code but manage database details, work with project plans, document technical details, electronically collaborate with teammates, test and debug, etc., all in one sitting. Nothing in here actually recommends or even contributes to writing big, bloated software for an end user. The objective is productivity, your skills as a programmer are a separate concern. If you are producing bad, bloated code, the quality of the machine on which you wrote the code has little to nothing to contribute to that—on the contrary, a poor developer system can lead to extremely shoddy code because the time and patience required just to manage to refactor and re-test become such a huge burden. If you really want to test your code on a limited machine, you can rig VMWare / VirtualPC / VirtualBox to temporarily run with lesser RAM, etc. You shouldn’t have to punish yourself with poor productivity while you are creating the output. Such punishment is far more monetarily expensive than the cost of RAM.

I can think of a lot of reasons for 8+ GB RAM, but I’ll name a handful that matter most to me.

  1. Windows XP / Server 2003 alone takes up half a gigabyte of RAM (Vista / Server 2008 takes up double that). Scheduled tasks and other processes cause the OS to peak out at some 50+% more. Cost: 512-850MB. Subtotal @nominal: ~512MB; @peak: 850MB
  2. IIS isn’t a huge hog but it’s a big system service with a lot of responsibility. Cost: 50-150. Subtotal @nominal: ~550MB; @peak 1GB.
  3. Microsoft Office and other productivity applications should need to be used more than one at a time, as needed. For more than two decades, modern computers have supported a marvelous feature called multi-tasking. This means that if you have Outlook open, and you double-click a Microsoft Word attachment, and upon reading it you realize that you need to update your Excel spreadsheet, which in your train of thought you find yourself updating an Access database, and then you realize that these updates result in a change of product features so you need to reflect these details in your PowerPoint presentation, you should have been able to open each of these applications without missing a beat, and by the time you’re done you should be able to close all these apps in no more than one passing second per click of the [X] close button of each app. Each of these apps takes up as much as 100MB of RAM, Outlook typically even more, and Outlook is typically always open. Cost: 150-1GB. Subtotal @nominal: 700MB; @peak 2GB.
  4. Every business software developer should have his own copy of SQL Server Developer Edition. Every instance of SQL Server Developer Edition takes up a good 25MB to 150MB of RAM just for the core services, multiplied by each of the support services. Meanwhile, Visual Studio 2008 Pro and Team Edition come with SQL Server 2005 Express Edition, not 2008, so for some of us that means two installations of SQL Server Express. Both SQL Server Developer Edition and SQL Server Express Edition are ideal to have on the same machine since Express doesn’t have all the features of Developer and Developer doesn’t have the flat-file support that is available in Express. SQL Server sitting idly costs a LOT of CPU, so quad core is quite ideal. Cost: @nominal: 150MB, @peak 512MB. Subtotal @nominal: 850MB; @peak: 2.5GB. We haven’t even hit Visual Studio yet.
  5. Except in actual Database projects (not to be confused with code projects that happen to have database support), any serious developer would use SQL Server Management Studio, not Visual Studio, to access database data and to work with T-SQL tasks. This would be run alongside Visual Studio, but nonetheless as a separate application. Cost: 250MB. Subtotal @nominal: 1.1GB; @peak: 2.75GB.
  6. Visual Studio itself takes the cake. With ReSharper and other popular add-ins like PowerCommands installed, Visual Studio just started up takes up half a gig of RAM per instance. Add another 250MB for a typical medium-size solution. And if you, like me lately, work in multiple branches and find yourself having to edit several branches for different reasons, one shouldn’t have to close out of Visual Studio to open the next branch. That’s productivity thrown away. This week I was working with three branches; that’s 3 instances. Sample scenario: I’m coding away on my sandbox branch, then a bug ticket comes in and I have to edit the QA/production branch in an isolated instance of Visual Studio for a quick fix, then I get an IM from someone requesting an immediate resolution to something in the developer branch. Lucky I didn’t open a fourth instance. Eventually I can close the latter two instances down and continue with my sandbox environment. Case in point: Visual Studio costs a LOT of RAM. Cost @nominal 512MB, @peak 2.25GB. Subtotal @nominal: 1.6GB; @peak: 5GB.
  7. Your app being developed takes up RAM. This could be any amount, but don’t forget that Visual Studio instantiates independent web servers and loads up bloated binaries for debugging. If there are lots of services and support apps involved, they all stack up fast. Cost @nominal: 50MB, @peak 750MB. Subtotal @nominal: 1.65GB; @peak: 5.75GB.
  8. Internet Explorer and/or your other web browsers take up plenty of RAM. Typically 75MB for IE to be loaded, plus 10-15MB per page/tab. And if you’re anything like me, you’ll have lots and lots and LOTS of pages/tabs by the end of the day; by noon I typically end up with about four or five separate IE windows/processes, each with 5-15 tabs. (Mind you, all or at least most of them are work-related windows, such as looking up internal/corporate documents on the intranet or tracking down developer documentation such as API specs, blogs, and forum posts.) Cost @nominal: 100MB; @peak: 512MB. Subtotal @nominal: 1.75GB; @peak: 6.5GB.
  9. No software solution should go untested on as many platforms as is going to be used in production. If it’s a web site, it should be tested on IE 6, IE 7, and IE 8, as well as current version of Opera, Safari 3+, Firefox 1.5, Firefox 2, and Firefox 3+. If it’s a desktop app, it should be tested on every compatible version of the OS. If it’s a cross-platform compiled app, it should be tested on Windows, Mac, and Linux. You could have an isolated set of computers and/or QA staff to look into all these scenarios, but when it comes to company time and productivity, the developer should test first, and he should test right on his own computer. He should not have to shutdown to dual-boot. He should be using VMWare (or Virtual PC, or VirtualBox, etc). Each VMWare instance takes up the RAM and CPU of a normal system installation; I can’t comprehend why it is that some people think that a VMWare image should only take up a few GB of hard drive space and half a gig of RAM; it just doesn’t work that way. Also, in a distributed software solution with multiple servers involved, firing up multiple instances of VMWare for testing and debugging should be mandatory. Cost @nominal: 512MB; @peak: 4GB. Subtotal @nominal: 2.25GB; @peak: 10.5GB.

Total peak memory (64-bit Vista SP1 which was not accounted in #1): 11+GB!!!

Now, you could argue all day long that you can “save money” by shutting down all those “peak” processes to use less RAM rather than using so much. I’d argue all day long that you are freaking insane. The 8GB I bought for my PC cost me $130 from Dell. Buy, insert, test, save money. Don’t be stupid and wasteful. Make yourself productive.

Two More Things About Windows 7

by Jon Davis 17. January 2009 04:44

I wanted to add two more early comments on my experience with Windows 7 so far.

  1. My favorite new feature of Windows 7 is the ability to take a normal / restored window and maximize it by just dragging it to the top of the screen and letting it dock. I can restore it in the same way, just drag the title bar off the top of the screen and it becomes restored again. And if I want it to fill the right half of the screen, I can dock it to the right, and likewise for the left side. WAY too handy, it's one of those things that makes you wonder, why haven't we been doing it that way for years?
  2. My new big fat pet peeve that makes me think someone at Microsoft is a little nutty and insane is the Virtual Store security "feature" that was implemented in Windows Server 2008 has ended up in Windows 7 as well. I cannot express enough how much I ABSOLUTELY HATE THIS HORRIBLE FEATURE.
    • To fix it now you need to open "Security Configuration Management" where you'll find Local Policies -> Security Options -> "Virtualize file and registry write failures to per-user locations" and disable the thing.

On a side note, in a previous blog entry I told a "story" about how I had to use an external USB-based DVD drive to install Windows because the IDE drive wasn't detected. Well, with everything installed I was still unable to use my built-in drive. I could dual-boot to Vista and use the same drive all day, so this is clearly a driver issue. And I know I'm not the only one with the problem; as Google reveals, it's one of the big well-known let-downs of the Windows 7 beta.

Windows 7 Beta first Impressions

by Jon Davis 14. January 2009 04:47

Everyone has already made Windows 7 first impression comments, but I had to see Windows 7 for myself, as I always do wth Windows pre-releases. So here are my first experiences. I tried the earlier PDC release, downloaded from a torrent, but I got an error after booting from the DVD saying that it could not locate an installer file.

Windows could not collect information for [OSImage] since the specified image file [install.wim] does not exist.

I chalked it up to a bad torrent download and tossed the copy.

Then Microsoft released Beta 1 this month. I tried downloading this torrent again, and the download was inturrupted. I tried to restore the download process and no seeds were found after hours. I found another torrent, and after about half a day and half-downloaded I realized Microsoft had actually released this version to the open public for anyone to download, so deleted that torrent and started download again, this time straight from Microsoft.

The next day, the download having been completed while I was sleeping, I burned it to DVD-RW and gave it a run. Guess what?

Windows could not collect information for [OSImage] since the specified image file [install.wim] does not exist.

Oh, poop. So the original download wasn't any more flawed on this part than this one is, it's something else.

I tried booting the DVD in VMWare on another PC, and it worked! Aha! It's a hardware problem, perhaps a DVD driver problem. My computer is only about one and a half years old, but the DVD drive is about four years old. I Googled around a bit for more information on this ridiculous error, and the only advice I could find were two suggestions:

  1. Some commented, "You probably found an old DVD-RW from behind a sofa. Use a new DVD-R and that'll fix it right up." Hm. Doubtful. I burned another DVD-RW (same brand, roughly the same condition) and this time I checked off the "Verify" option on my burner software, and it checked out. Still got the error. It was at this point that I tried it on VMWare, and it got past this error, so no, it's not a bad disc. I suppose it could have to do with the failure of the other drive, on the other PC, to read the disc, though. In other words, the drive might have failed, not the disc.
  2. Someone said, "I was using an old USB-attached DVD drive that the BIOS enabled me to boot the disc from, but after installing an IDE-based DVD drive in the actual computer the error went away." Well that stinks, because I'm using an IDE-based DVD drive, it's never given me any problems except that it often refuses to burn discs.

So I pondered, I'm leaning towards the #2 scenario as a clue, I know Microsoft was trying to thin down the core surface area in Windows 7 and I bet this is a lack of some drivers for my drive. But I wonder if "new" is the keyword here, not the form (IDE vs USB).

I just happened to have a external USB-based DVD drive I recently purchased at Amazon. USB, but new. Could it work? I ran to the back room and grabbed it, brought it back in, stretched it across the room to the outlet, configured the BIOS to boot from USB, and booted the Windows 7 DVD. I went to install and....... yes!! It got past the error.

So here's the first first impression: While I greatly appreciate Microsoft's attempt to slim down the core dependency set of Windows and its drivers set, in this area (CD/DVD drive support) they chopped off WAY too much. Perhaps driver support isn't the issue here, but if it is, this IS a bug. There are a LOT of people who were power users 4 years ago, who invested in the latest and greatest back then, and had no Windows version but XP, and were reluctant to switch to Vista because of the corners that Windows 7 rounded out. These years-old systems are more than adequate, surely, for Windows 7 performance-wise, but the CD/DVD drivers are right there along with USB subsystem and SATA as being most needed for success. Fix this, guys, this is a BUG, not a mere risky compromise (intentional droppage of legacy hardware support). Microsoft can't afford to lose THIS hardware.

I experienced no other hardware glitches, fortunately, and even my audio hardware was working, and the Aero experience working right from post-setup first boot. There was only one other hardware-related annoyance, and that is that my two monitors were backwards.. I had to mouse far to the right to access the left monitor. Yes, this is configurable with the Control Panel, but I got annoyed watching setup and dealing with dialog boxes, etc., while everything was backwards and the setup didn't have the Control Panel available to me. It would've been nice, I suppose, if there was one optional button during setup that brought up the Monitors dialog, but at least the Monitors dialog isn't accessed through the wholly inappropriately named (in Vista's time) "Personalization" dialog, which was SO ridiculously placed since monitor setup (resolution, monitor placement monitor drivers, color depth, etc) has little to nothing to do with personalization. Might as well rename Control Panel to "Personalizations".. but they got it, I'm glad.

The new Windows 7 is all about rounding off the corners and adding the polishing touches that Windows Vista only touched on and inspired.

  1. More ever-present Aero Glass experience, with lots of smooth animations and roll-overs.
  2. Explorer.exe got a huge overhaul with Aero and usability enhancements.
    • As is very well known, the ubiquitous taskbar that has been around through Windows 95, Windows NT 4, Windows 98, Windows ME, Windows NT, Windows 2000, Windows XP, Windows Server 2003, Windows Vista, and Windows Server 2008 (did I miss one somewhere? surely I did ..) is now no more. There is no longer a taskbar. There is a bar down there, but it's more like a "smartbar"; the Quick Launch toolbar and the taskbar have sorta merged. It's all very much inspired, no doubt, by the Mac OS X's dock, which frankly disgusts me. But so far I don't have a hatred of the Windows 7 smartbar thingmajig. I do very strongly believe that someone (i.e. Stardock), if not Microsoft themselves, will be pushing a "Windows Vista taskbar" as an add-on accessory to Windows 7, for those people who preferred it, as there is now a rather obvious market for it.
    • The awesome feature in the Windows Vista desktop compositing system that enabled Direct3D and high definition video to be managed in an already D3D desktop environment, advantages of which were only slightly touched upon by Windows key + Tab and taskbar mouseover tooltip previews, both showing these windows re-displayed in distorted, small form in realtime with no performance loss, has been expanded upon in Windows 7. I'm still discovering these, but the most obvious feature is the smartbar mouseover with Internet Explorer showing each tab and letting you pick a tab as it is rendered in real-time. I hope to find a lot more such scenarios
  3. Paint, Calculator, and Wordpad have finally been rewritten with an Office 2007 feel. We no longer have to puke on the Windows 95 versions. I didn't see if Notepad was replaced with something anywhere near the simplicity yet completeness of Notepad2. But I doubt Notepad was touched, which if not is a shame. But at least there's always Notepad2. *cough*
  4. In general, the things in Windows such as in the Control Panel that got moved around a lot in Vista and that everyone complained about, such as me complaining about Monitor settings showing up under stupid Personalization, have been rearranged again. Generally, things are just better and more thought out. Vista was a trial run in this matter, Windows 7 beta is just more thought through. There are still quirky "features" but nothing I've found so far that is just blaringly wrong. I do think that the personalization bits are now too broken apart but this might just be a style issue that needs some getting used to. Microsoft seems to be leaning more than before towards the Apple/Mozilla approach of pursuing minimalist options while burying advanced features down in an obvious "Advanced" click-trail. Themes are consolidated sets now, a little more like Win95 Plus! themes in the sense of consolidation, and not so much isolated background, color, and sound options. But those options as individual settings are still there. In fact, Sounds is now (finally) a personalization configuration, as it should be.
  5. You start off with a big fish. Literally. It's a nice painting (of a fish). But come on. It's a fish! I went to choose a different background image, and, while I could very possibly be mistaken, I think the number of background images you can choose from has been slashed by half since Vista, and the new offerings in the theme picker don't look as good. Boooo!
  6. Other people ran the numbers so I didnt do any testing, but the general consensus is that Windows 7 performs closer to Windows XP's performance than Windows Vista's performance. (Read: It's very performant.)
  7. The max system rating has been nudged up from 5.9 to 7.9. My score of 5.7 on Windows Vista went up to 5.9 in Windows 7... but given the max of 7.9 my year-and-a-half old PC is no longer 0.2 from ceiling. *sob*
  8. I was impressed that the color palettes across all themes, just like IE 8 beta on Vista, are way too bright. It's ugly and uncomfortable. It's not easily configurable to make darker, either.
  9. I haven't stressed Windows 7 yet with software to see how stable it is, but one of the first apps I downloaded was Google Chrome and that puked. All of Windows froze up while I was doing something else, too, but I don't remember what it was, and that sort of thing is something I'd expect from a Beta. 

I have one other complaint. Windows Vista and Office 2007 introduced some really nice glow animations on buttons. Windows 7 pushes the Office 2007 glow animations and transition animations everywhere. The new smartbar (taskbar replacement) has a really, really cool "you just clicked me!" gradient animation that is almost magical. It's nice, but the animations are so slow they're actually rather obnoxious. For example, in the new Calculator, if you simply hover over and click on a button, yeah, blue-gray turns amber, but mouse-away and it seems to take a full three or four seconds for it to animate back to the original color. It's artistically nice, but it's just too long, and I think it will be too distracting. It might actually produce some serious usability issues, fast-moving users are going to be forced to slow down because their "feedback loop" they're getting on the screen is going to all be just a big blur. I really don't like that. It's already making me a little nauseous. Weird huh.

I think Vista's close/maximize/minimize effects the animation timings just right in this matter. Office 2007 ribbon buttons were just over the edge in my taste (too slow), and I could be wrong but Windows 7 in various places feels like it tripled the Office 2007 animation timings (very, very slow).

Be the first to rate this post

  • Currently 0/5 Stars.
  • 1
  • 2
  • 3
  • 4
  • 5

Tags: ,

Computers and Internet | General Technology | Operating Systems

How My Microsoft Loyalty Is Degrading

by Jon Davis 7. June 2008 16:59

I've sat in this seat and often pronounced my discontent with Microsoft or a Microsoft technology, while still proclaiming myself to be a Microsoft enthusiast. Co-workers have often called me a Microsoft or Windows bigot. People would even give me written job recommedations pronouncing me as "one who particularly knows and understands Microsoft technologies".

But lately over the last year or two I've been suffering from malcontent, and I've lost that Microsoft spirit. I'm trying to figure out why. What went wrong? What happened?

Maybe it was Microsoft's selection of Ray Ozzie as the new Chief Software Architect. Groove (which was Ozzie's legacy) was a curious beast, but surely not a multi-billion-dollar revenue product, at best it was a network-based software experiment. Groove's migration to Microsoft under the Office umbrella would have been a lot more exciting if only it was quickly adopted into the MSDN vision and immediately given expansive and rich MSDN treatment, which it was not. Instead, it was gradually rolled in, and legacy SDK support just sort of tagged along or else "fell off" in the transition. Groove was brought in as an afterthought, not as a premier new Microsoft offering. Groove could have become the new Outlook, a rich, do-it-all software platform that brought consolidation of the team workflows and data across teams and diperate working groups, but instead it became just a simple little "IM client on steroids and then some" and I quickly abandoned it as soon as I discovered that key features such as directory sharing weren't supported on 64-bit Windows. So to bring Ozzie in and have him sit in that chair, and then have that kind of treatment of Ozzie's own Groove--Groove being only an example but an important, symbolic one--really makes me think that Microsoft doesn't know what on earth it's doing!! Even I could have sat in that chair and had a better, broader sense of software operations and retainment of vision, not that I'm jealous or would have pursued that chair. The day I heard Ozzie was selected, I immediately moaned, "Oh no, Microsoft is stuck on the network / Internet bandwagon, and has forgotten their roots, the core software platforms business!!" The whole fuzzy mesh thing that Microsoft is about to push is a really obvious example of where Microsoft is going as a result of bringing in Ozzie, and I hardly find a network mesh compelling as a software platform when non-Microsoft alternatives can so easily and readily exist.

Maybe it's Microsoft's audacity to abandon their legacies in their toolsets, such as they have done with COM and with VB6. There still remains zero support for easily building COM objects using the Visual Studio toolsets, and I will continue to grumble about this until an alternative component technology is supported by Microsoft that is native to the metal (or until I manage to get comfortable with C/C++ linked libraries, which is a skill I still have to develop 100% during my spare time, which is a real drag when there is no accountability or team support). I'm still floored by how fast Microsoft abandoned DNA for .NET -- I can completely, 100% understand it, DNA reached its limits and needed a rewrite / rethink from the bottom up, but the swappage of strategies is still a precedent that leaves me with a bad taste in my mouth. I want my personal investments in software discovery to be worth something. I'm also discouraged--the literal sense of the word, I'm losing courage and confidence--by the drastic, even if necessary, evolutionary changes Microsoft keeps doing to its supported languages. C# 2 (with stuff like generics support) is nothing like C# 1, and C# 3 (with var and LINQ) is nothing like C# 2. Now C# 4 is being researched and developed, with new support for dynamic language interop (basically, weak typing), which is as exciting as LINQ was, but I have yet to adopt even LINQ, and getting LINQ support in CLR object graphs is a notorious nightmare, not that I would know but everyone who tries it is pronouncing it as horrible and massive. Come to think of it, it's Microsoft's interop strategy that has been very frustrating. COM is not Remoting, and Remoting is not WCF. WCF isn't even supported in Mono, and so for high performance, small overhead interprocess communications, what's the best strategy really? I could use WCF today but what if WCF is gone and forgotten in five years?

Maybe it's the fact that I don't have time to browse the blogs of Microsoft's developer staff. They have a lot of folks over there, and while it's pretty tempting to complain that Microsoft "codes silently in a box", the truth is that there are some pretty good blogs being published from Microsofties, such as "If broken it is, fix it you should", albeit half of which I don't even understand without staring at it for a very long time. Incidentally, ScottGu does a really good job of "summing up" all the goings on, so thumbs-up on that one.

I think a lot of my abandonment of loyalty to Microsoft has to do with the sincerity of my open complaint about Internet Explorer, how it is the most visible and therefore most important development platform coming from Redmond but so behind-the-times and non-innovative versus the hard work that the Webkit and Mozilla teams are putting their blood, sweat, and tears over, that things like this [http://digg.com/tech_news/Time_breakdown_of_modern_web_design_PICTURE] get posted on my wall at the office, cheerily.

Perhaps it's the over-extended yet limited branding Microsoft did with Vista, making things like this [http://reviews.cnet.com/8301-13549_7-9947498-30.html] actually make me nearly shed a tear or two over what Windows branding has become. That Windows Energy background look looks neat, but it's also very forthright and "timestamped", kind of like how disco in the 70's and synth-pop in the 80's were "timestamped", they sounded neat in their day but quickly became difficult to listen to. That's what happens with too strong of an artistic statement. Incidentally, Apple's Aqua interface is also "timestamped", but at least it's not defaulting with a strong artistic statement plastered all over the entire screen. I like the Vista taskbar, but what's up with the strict black, why can't that or other visual aspects be tweaked? At least it's mostly-neutral (who wants a bright blue or yellow taskbar?), but it's still just a bit imposing IMO.

I'll bet it has to do with the horrifying use of a virtualized Program Files directory in Windows Server 2008 where the practice was unannounced. This is the sort of practice that makes it VERY difficult to trust Microsoft Windows going forward at all. If Windows is going to put things in places that are different from where I as a user told them to be placed, then we have a behavioral disconnect--software should exist to serve me and do as I command, not to protect me from myself while deceiving me.

At the end of it all, I think my degrading sense of loyalty could just be the simple fact that I really am trying to spread out and discover and appreciate what the other players are doing. I mentioned before that Mac OS X is still the ultimate, uber OS, but now that I have it, I confess, I had no idea. Steve Jobs is brilliant, and it's also profound how much of OS X is open source, basically all of the UNIXy bits, which says a lot about OSS. Mind you, parts of the Mac I genuinely do not like and have never liked, such as the single menubar, which violates very key and important rules for UI design. I also generally find it difficult to manage multiple applications running at once, for which I much prefer the Windows taskbar over the Dock if only because it's more predictable, and although it violates UI principles I prefer Alt+Tab for all windows rather than Command+Tab just for applications because every window is its own "workflow" regardless of who owns it. But, among other things, building on PostScript for rendering, for example, was a fantastic idea; on the other hand, Microsoft's ClearType would have been difficult to achieve if Windows used PostScript for rendering. Anyway, meanwhile, learning and exposing myself to UNIX/Linux based software is good for me as a growing software developer, and impossible to cleanly discover in Windows-land without using virtual machines.

In other words, the only way one can spread out and discover the non-Microsoft ways of doing things, and appreciate the process of doing so, is to stop swearing by the Microsoft way to begin with, and approach the whole thing with an open mind. In the end, the Microsoft way may still prove to be the best, but elimination of bias (on both sides) is an ideal goal to be achieved before pursuing long-term personal growth in software.

Currently rated 3.9 by 10 people

  • Currently 3.9/5 Stars.
  • 1
  • 2
  • 3
  • 4
  • 5

Tags:

General Technology | Operating Systems | Software Development | Microsoft Windows | Mac OS X

Lists Of Microsoft's Fame And Shame - 2008

by Jon Davis 5. April 2008 20:44

Since everyone loves to pick on Microsoft, I think we can summarize exactly what has caused such a commotion among technology enthusiasts. The areas where Microsoft has been given most reputational grief have been where the bar was raised higher by a third party, or else where a third party has made people scratch their heads and wonder, "Why am I using Microsoft's technology, anyway?" So I'd like to suggest a list of technologies that shame Microsoft.

To Microsoft's Fame

Before I get started, I want to point out the areas that Microsoft has excelled in:

  1. Microsoft Word & Microsoft Outlook
    • Word processing has become a staple of the computing world. Apart from web browsing and e-mail, the word processor is the next most important and relevant "killer app" of computing technology. Microsoft Word continues to astound us with major new features with every release. It has evolved in sophistication regularly. It is not without its quirks -- for example, there's nothing more annoying than running out of callout diagram graphics due to fixed memory allocation -- but Microsoft has consistently tried to keep Word up-to-date with the demands of its users, and no other word processor can compare with its overall user experience.
    • Microsoft Outlook might feel a little sluggish for some, and I still resent the fact that Microsoft never implemented NNTP support within Microsoft Outlook, but it is still the ultimate app for the office. Maintaining e-mails, calendaring, and task lists in one application, it is always the first app to run when I get in at work, and the last app to close, if I ever close it. And it gets the most attention every day.
      • Microsoft Outlook's greatest area for growth is project management support. Outlook would be a natural environment for managing projects. If the communications and collaboration environment that Outlook already is was consolidated with issue tracking, resource allocation (perhaps merge with MS Project), and even basic SCRUM, professionals would be flocking to the platform all over again. There's always Sharepoint for team collaboration, but Outlook being a desktop application with tuned responsiveness, it blows any AJAX web application out of the water (and please don't even think Silverlight, with its lack of OLE integration, limited [or no] drag-and-drop, limited contextual menu support, and no windowing support). I think Microsoft has regularly missed opportunities to make Outlook the ultimate "portal" for all things related to collaboration.
  2. Visual Studio, .NET, and C# - http://msdn.microsoft.com/
    • A few years ago I sent a big gripe e-mail to one of Apple's feedback e-mail addresses. It said something along the lines of, "You guys just don't get it. There's a really good reason why more apps are written for Windows rather than for the Mac, and it isn't because Windows is superior. It's because Microsoft pours boatloads of its money into developer support. The MSDN program is probably more important for Microsoft than Windows itself. The return on investment that Microsoft gets with all of its investments with development tools is a no-brainer. If you guys would establish a 'developer network', provide refined developer tools, and make being a Mac developer one of the most exciting and rewarding things about your technology, people would be flocking to your platform." I still believe that this is true. Ironically, a few months after that e-mail there was a huge developer tools push by Apple. Heh..
    • Visual Studio and the core Microsoft SDKs blow all of the competitors, even the latest and greatest iterations of IDEs like Eclipse and NetBeans, completely out of the water. Don't get me wrong, I love what I see in focused IDEs like Aptana. But as a do-it-all toolset, Visual Studio 2008 is just insane. I actually don't think there's anything, except for easy COM object development (*sob* I still miss VB6), that Visual Studio can't do. C# (which is Microsoft's invention and part of the Visual Studio and MSDN strategies) is an incredibly elegant language, even more so than Java, and that's saying a lot because Java as a language was very nice. Using Visual Studio for web development is also very rewarding; I work with it every day.
      • Microsoft is doing something very right with CodePlex and Microsoft's open source initiatives. However, I think that Microsoft should put the plug back in on their idea that they prototyped during the VS 2005 beta of community-generated libraries directly integrated in a community browser. This is a huge feature of Eclipse and NetBeans alike, and if we could get our third party open source Visual Studio plug-ins and API libraries from a common interface it would be quite ideal for the developer community. Perhaps an open source initiative can be established for this.
    • I'm still not quite sold on WPF because of its bloat, and Silverlight 2 is still pretty painfully stripped-down, but what it does introduce is very, very exciting. Adobe Flash would have easily become a product technology to shame Microsoft, but when it comes to what Silverlight 2 and Blend 2.5 promise and are already delivering in beta form, Microsoft has taken the higher ground. Granted, Flash has the user base. But being a geek, I don't care; I firmly believe that the user base will follow the superior toolset.
      • I think that Microsoft still needs to implement a few things before Silverlight will really become a "killer app" technology platform, keeping in mind that for every Flash-based banner ad or RIA, there is a Flash game being introduced on the web:
        1. Limited windowing support. Please. I want to open a Silverlight window, without opening a web browser window with another isolated Silvelight app. Let me. 
        2. GDI+-esque bitmap manipulation support. For example, we should be able to render to a canvas, buffer to a bitmap, and reuse the bitmap as we like. Let us render pixels. I don't know what kind of installation footprint a rasterization API would introduce, but it seems like it would be pretty light.
        3. WPF-esque 3D support by befriending OpenGL. Please. Every platform supports OGL. EVERY PLATFORM!
    • DirectX is the responsible runtime API for, what, 90% of modern commercial electronic games today? At least, that is certainly accurate of PC games. Direct3D is feature-rich, setting the bar for the programmability of a video card's GPU (using a Shader Model programming API), to say nothing of supporting a complete set of interfaces for generating 2D and 3D scenes with lighting and high resolution textures. XACT offers a complete audio API and toolset, after years of one tool experiment after another for audio and music. DirectX also has networking APIs and input device APIs (flight sticks, gamepads, steering wheels, etc.). All your game engine needs are met with DirectX ... except for the game engine itself. That's where XNA comes in.
    • Direct3D is clearly superior in featureset than OpenGL.
    • DirectX as a suite of APIs specifically targeting Windows is vastly superior to SDL.
    • While I still scratch my head wondering why XNA and WPF are so completely isolated, and that there is neither XAML rendering support in XNA nor XNA features in WPF, I am very impressed with Microsoft's XNA. XNA had a warm welcome to the community in 2007, I feel. But XNA was quickly forgotten by the general developer community by the end of 2007, until XNA Game Studio Express v2.0 was released.
      • Microsoft's XNA strategy has been very thorough, with the Creators Club web site completing the big picture. But what Microsoft still needs to do with XNA to continue to gain and retain amateur game developers and establish the "YouTube For Games" community that it intends to foster is to convince developers to deploy Windows game install packages today, and to feature XNA games on Windows before the roll out XNA games on Xbox Live Marketplace. Microsoft should create a web service driven "XNA Amateur Games Browser" for Windows, now! It should be an optional Windows Update download for Windows Vista Ultimate. Microsoft has really blown it in gaining and retaining amateur game developers' attention on the Microsoft Windows platform. XNA has been a missed opportunity; the technology and toolset are solid, but XNA is till marketed, intentionally or not, as an Xbox technology that requires $99 to participate in, which is tragic. Most game developers would opt for spending that kind of money on such cross-platform technologies as Torque and Unity (http://unity3d.com/). Microsoft XNA needs an InstantAction-like community before it seeks out the Xbox Live Marketplace community.
  3. Windows Server 2008
    • In some ways, Window Server 2008 represents the culmination of all things heavily tested, refined, tuned, and applicable for both simple and complex scenarios and for executing both simple and complex computing applications. It compares quite closely with Mac OS X. Mac OS X, being built on a UNIX foundation, comes straight out of the box with the rich featureset of UNIX network and system tools, to say nothing of its extensibility to support additional UNIX apps that can run naturally and stably on it. But not only does Windows Server 2008 support all of the essentials of an operating system workstation as well as an IT server, it also has a UNIX compatibility subsystem, and on top of that it sets a MUCH higher bar in many areas of server technology that currently other platforms simply do not support. Just browsing the optional features one can install onto Windows Server straight out of the box, suddenly even the beta-quality grab bag of nifty new technologies one can choose in a modern Linux distro is not able to compare, even in features alone.
    • Not long ago, I posted a blog entry indicating that I'd prefer Windows Vista Ultimate SP1 over Windows Server 2008 as a web developer workstation. I think I may have to retract that opinion. Other people have posted performance comparisons of these two operating systems and have found that Windows Server 2008 performs significantly better than Vista SP1. This is very disappointing as I love Vista and was very much looking forward to SP1 picking up he pace to be on par with Windows Server 2008. 
      • I have three workstation-related complaints for Windows Server 2008, based on my watching my co-worker's / buddy's experiences with using it as a workstation:
        1. No sidebar even with Vista experience installed?!
        2. COD 4 BSOD's on Windows Server 2008 on an nVidia Quadro 1500 card that worked fine for me on Windows Server 2003, as well as for me on Vista x86 and on Vista x64. Sup widdat?
        3. What's with that awful addressbar/progressbar locking up Windows Explorer functionality just because the OS is (I guess) indexing system contents?? My buddy couldn't even right-click a folder and view Properties at times because of this stupid indexing lock-up. This went on for a month or so before it apparently went away on its own, we figured it finally managed to index everything, or something. But this one thing kept me from making the switch from Server 2003!! 
  4. IIS 7.0 & WCF
    • IIS is now fully programmable on all parts of a request pipeline, even at the protocol level (IIS is no longer a web server, it is a network application server).
    • Microsoft has taken their experiences of the nightmares that came about with COM/DCOM/COM+, MTS, .NET 1.x Remoting, and ASP.NET Web Services, and made a simple yet pretty complete solution for it all. As long as you code all your software around data contracts, you have WCF-handshakeable, interopable software that can cross most any boundary. Hosted on IIS 7, said software can cross any physical boundary.

To Microsoft's Shame

These things said, here's a list of technologies and third party products where I think Microsoft should be paying closer attention as they bring shame upon Microsoft.

    • When Scott Guthrie came here to Scottsdale (that's where I live) this year, Hamid Shojaee from Axosoft did a little presentation for his company's products, and he used presentation software that kept me blinking in awe. Although judging from the presentation the presentation software he used looked like it was pretty lightweight in features (I found out later it was Keynote for Mac), based on Hamid's presentation Keynote had one thing that made me realize that Microsoft is going about its PowerPoint strategy all wrong. I realized that rock-solid, eye-catching presentations are all about being flicker-free, with full 3D fly-ins and no visible pause between tween frames. I noticed this about Silverlight; when I look at http://www.quiksilver-europe.com/ and hover my mouse over the video player, I see something that Flash can't do, which is look frame-free and flicker-free because its animation engine is time-based, not frame-based. Between WPF and Silverlight, Microsoft already has the technology to support all this. If the next version of PowerPoint is not overhauled to look this smooth, Microsoft should be ashamed of themselves!
  1. Firefox and Webkit (TIE)
    • Once upon a time, Microsoft innovated in the web browser technology market. They introduced a powerful software plug-in model with ActiveX and pushed it out on Internet scale. They invented the fully programmable HTML DOM.
    • Eventually,
      • C# was introduced.
      • Firefox came on the scene.
      • George W. Bush was elected president.
      • Microsoft got complacent about their web browser strategy, and made it official that they would never innovate on the browser again.
    • The effects of Microsoft taking their genius Internet Explorer innovations staff (the Trident team) off of Internet Explorer and onto WPF and Silverlight has taken its toll. As web technology has evolved, Internet Explorer has become the uber-pimple of the computing world. It's the annoying, ugly blemish that people want to pinch and pop but not only won't go away but it's right out there for everyone to see on the face of Windows and you can't get rid of it. It has a slow release schedule, its dev team has been silent towards the community, and it is clearly not a part of Microsoft's MSDN strategy, yet at the same time it is one of the most prominent and heavily used development platforms that runs on Windows. I'm greatly looking forward to IE8, but Microsoft is still playing catch-up with the other browsers.
    • Microsoft has a lot of nerve to suggest that the different meanings of the word standards ("standard as in popularity? standard as in typical? standard as in standards-body documented standard?") are applicable to web technology. If you're going to put yourself out there on the web and interoperate with a platform-agnostic network, to the extent of those agnostic technologies (HTML, XHTML, CSS, Javascript, etc.) there is only one definition of standards, and that is the definition of standards that comes from the international standards bodies, in this case the W3C. Microsoft can do what they want with XAML, VBScript, and ActiveX, but if they're not going to submit to standards bodies on platform-agnostic technologies, they should drop IE and adopt Firefox or WebKit, or else they will risk their users doing as much which in effect would significantly lessen the necessity of Windows (the necessary host operating system of IE).
    • Mozilla, Safari, and Opera leaders are actively leading in innovating on the web standards, like HTML 5, a practice that Microsoft started in the early days of the web, and later abandoned. Microsoft is still not actively participating in these discussions.
    • XPCOM and XPI! Make it so!!
    • Earlier in this blog post I mentioned Windows Server 2008 being most like Mac OS X. But Mac OS X still sets certain standards for the Ultimate Operating System. Granted, Windows Vista and/or Windows Server 2008 is still the OS of choice for practical use because of the rich developer tools that Microsoft offers and because of the extent of third party apps that are available because of it. But Mac OS X still sets the bar for
      1. True "it just works" plug-and-play functionality. I don't know what Apple is doing to make things "just work", but all of the iterations of the Mac have always been rediculously clean and easy to use, for both hardware add-ons and software installations.
      2. Application packaging; Mac apps continue the trend today that they've always had, of getting both Apple apps and third party apps (except MS Office for Mac) all presenting themselves as a nicely packaged, self-contained file, rather than a gajillion DLLs among one or several shell-executable files. Mac users don't use a Start menu because they don't need one.
      3. Platform interopability. Virtualization is not platform interop. Microsoft's UNIX compatibility layer is a step in the right direction, but it isn't enough. Windows is still proprietary Windows; UNIX apps have to be re-compiled to work on the UNIX compatibility layer, and for that one might as well use Cygwin which is far easier and "funner" to use (which isn't saying it's fun). A more appropriate approach might be andLinux.org's.
      4. With Mac OS X's UNIX based core, the rich suite of well-established UNIX applications are at OS X's disposal, including Apache Web Server. Apache in itself is not all that astounding in contrast to IIS, but it is one little tool in a long list of applications that make any non-Windows computer user compelled to stick with the Mac.
      5. The new file browsing features in the latest version of the Mac that put Windows Vista's thumbnail and slide show views to shame are absolutely astounding.
      6. The new video conferencing features in the latest version of the Mac are also astounding. Windows doesn't have anything like that.
    • I've said it before, and I'll say it again: The Start menu on a mobile device is perhaps the suckiest, stupidest design idea ever invented and implemented in mainstream technology. To the same extent, though, the iPhone's multi-touch, naturalistic, responsive interface is perhaps the greatest interface design ever conceived and implemented in mainstream technology. The bar has been set about 3 times higher than it was; now Microsoft needs to measure up with its Windows Mobile strategy, and until they do I will never buy a Windows Mobile device, not when there is the iPhone, now with binary SDK available for 3rd party developers. (The 30% commission requirement only ups the quality requirement for the third parties.)
    • Q: What if a different, solid commercial database was built versus Microsoft SQL Server?
      • A: It might be called Oracle, which is butt-ugly and has a cuture of its own full of down-trodden people who don't smile.
    • Q: What if someone open-sourced a complete database server for production web and entrprise apps as free alternative to SQL Server?
      • A: It might be called mySQL, which is feature-incomplete, and until recently lacking in administrative tools worth touching with a ten foot pole.
      • A: It might also be called Firebird, and although it has the maturity timescale and featureset of a complete DB, it also lacks the polish, usability, documentation, and presentation that geeks of today demand. (That or maybe I just don't like their web site.)
    • Q: What if Microsoft built a tiny-scale, .NET-based version of their SQL Server product?
      • A: It would be called SQL Server Compact Edition, and it would suck (meaning, lacking of features and being generally non-innovative)
    • Q: What if someone built a tiny-scale, native database engine that could be used anywhere?
      • A: It would be called SQLite, and although it rocks, it is a half-baked solution with no associated administration tools or C# managed API hooks in its core implementation.
    • Q: So what if someone built a Microsoft SQL Server look-alike, with decent performance, deep joins, T-SQL transactions, T-SQL sprocs, triggers, and managed assembly plug-ins, all in managed code, and with elegant administration tools, and produced it into a small compiled codebase that can be deployed on anything from medium scale web or enterprise apps to tiny-scale mobile applications?
      • A: It would be called VistaDB, and Microsoft should be astounded, if not absolutely frightened.
    • Every time I see another search indexing technology show up on Windows, I moan. "Index Server". "Office Search". "Microsoft Search". "Desktop Search". "Microsoft Search Server". For goodness sake, come up with and standardize on a solid API already!!
    • Apache Lucene is a blazingly fast text indexing software library. It can be used as a super-fast alternative not only to Google but also SQL Server.
    • Apache Solr, a server implementation around the Lucene search engine, is to search technology is what, say, early iterations of mySQL might have been to SQL databases. It is a buggy but functional demonstration of a complete and rediculously powerful data indexing and querying engine that can be used with both REST and JSON queries.
    • Microsoft Search Server is just a stupid pre-fab web page, a Googlish front-end to a spidered web site, proof that Microsoft just doesn't get it. Microsoft's indexing strategy is limited to file scanning and reproduction, a la Nutch. Microsoft clearly hasn't figured out how technologies like Lucene can render database engines like SQL Server's Full-Text Indexing obsolete.
  2. jQuery and the tersed Javascript community
    • Where jQuery shames Microsoft is where it also shames some of the other Javascript libraries. jQuery is to Javascript and DOM objects what LINQ is to C# and database objects, XML, and managed objects. It trivializes querying them, collecting them, calling on them, and performing operations on a group of them in one line of execution code (without a for loop).
    • The team that jQuery shames is not the LINQ team, but rather the ASP.NET AJAX team that implemented the Javascript framework. (A nice job they did, by the way, but heck, jQuery shames everybody!) See, Javascript already has a language community and culture, like C# and Java have community and culture. The Javascript community favors terseness and shortcuts by way of minified libraries that do much with little effort. Even in the simplest sense, this might mean short, terse, lower-cased code. Microsoft favors Pascal Casing and long namespaces. They went halfway and shortened the "System" namespace in Javascript to "Sys", but it still uses long, Pascal-Cased namespaces and OO-esque coding style rather than terse functional programming coding style.
    • Part of the terseness and "easy calling" approach pursued by the Javascript community is the simplistic approach of using one-liner databinding of HTML forms to REST URIs. The same community also typically interoperates with an MVC-oriented server architecture like PHPCake or Ruby On Rails. ASP.NET AJAX's approach, meanwhile, is to maintain viewstate and pass everything including the kitchen sink in a slow, klunky ASP.NET page postback lifecycle that could, and should, be cleaned up with a RESTful WCF / MVC AJAX view lifecycle.
    • Any time a corporation makes a commercial product that sells well and is very innovative but is being purchased primarily as a workaround for a failure of Windows or other Microsoft product, it brings awful shame upon Microsoft. In this case, the shame is the failure on Microsoft's part to support unencrypted QAM on Windows Media Center.
  3. Microsoft's own Xbox Live Dashboard / Marketplace -- Hello, Windows Team??
    • No competing technology platform has trivialized the usefulness of Microsoft Windows like Microsoft's own Xbox Live Dashboard and Marketplace. This was a huge missed opportunity: the rich, consolidated, packaged user experience that is enjoyed on the Xbox 360 could have been transferred to the Microsoft Windows platform.
    • Web pages (@live.com) don't cut it. Putting Microsoft Downloads on Silverlight doesn't cut it. You have a rich Presentation Foundation on Windows that could have been used to deploy rich interactive experiences, as well as even ridiculous DRM functionality built into Windows Vista; why isn't all of this being featured in Windows Vista Ultimate?!
    • Valve's Steam cuts it for the game side of things, but it lacks the WPF wow and full-screen experience that Microsoft could have introduced, and it lacks media purchases and downloads, so it isn't enough.
    • Windows Media Center (which, by the way, I do use several hours every single day) would have cut it, if only it natively and more seamlessly supported the same package-extensibility featureset, marketplace integration, and Games category (where marketplace demos can be downloaded and played) that are enjoyed on the Xbox.
  4. SVN (added 4/19/2008)
    • Nothing makes Microsoft seem, to the software community, more like a company that "knows" software only from the confines of its own innovations and culture than SVN does. On this technology alone, it sometimes seems like they live in a box and engineer in a cave.
    • Visual Source Safe is not version control. It's change control. The difference is as much cultural as it is functional; think a bunch of productive engineers in an agile group ("update", "OK, merged"), versus a bunch of wedgie-suffering tightwads in a red-tape-overwhelmed corporation ("can you please check that in so I can edit some of the code?")
    • I tried and failed to install Team Foundation Server three times and never got it right. The list of steps is a full page long, and each step takes several minutes of installing stuff -- set up Windows, figure out whether or not to set up Active Directory, set up SQL Server Standard (not any edition but Standard!), set up Windows SharePoint Services (don't confuse it with Office SharePoint! Don't confuse the version number!), optionally configure SharePoint for Active Directory, etc., etc. In the end, I always had something up and running, but when I would go load the SharePoint instance up in a web browser it would give me some stupid IIS error. Was I not supposed to hit it with a web browser? I don't know; the Help file didn't say.
    • I don't consider myself a genius, and I don't consider myself a moron either. I consider myself having slightly-better-than-average intelligence. I think my I.Q. was measured at 115 when I was a kid, whoopty doo. But I can set up an SVN server and SVN client (w/ TortoiseSVN) without a lot of effort, as well as a few free issue tracking web sites like Gemini. I don't have Visual Studio integration (though you can use AnkhSVN or VisualSVN), but I do have version control and a tracking system.
    • This blog post is very telling of the whole cultural situation over there in Washington.
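To make the Solr bullet above concrete, here is a minimal C# sketch of querying a Solr instance over plain HTTP (which is all the REST interface really is). This is a sketch under assumptions, not production code: it assumes a Solr server at the default http://localhost:8983 with documents that have a title field; the field name and query are hypothetical.

    using System;
    using System.Net;
    using System.Web; // reference System.Web.dll for HttpUtility

    class SolrQueryDemo
    {
        static void Main()
        {
            // Solr exposes its query engine as a plain HTTP GET;
            // wt=json asks for the response as JSON instead of XML.
            string q = HttpUtility.UrlEncode("title:lucene");
            string url = "http://localhost:8983/solr/select?q=" + q + "&wt=json";

            using (WebClient client = new WebClient())
            {
                Console.WriteLine(client.DownloadString(url)); // raw JSON result set
            }
        }
    }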

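And to make the jQuery-vs-LINQ comparison concrete, here is a minimal C# sketch of the shared "query the set, then operate on the whole set" style. The NavLink class is a hypothetical stand-in for DOM elements; the jQuery one-liner in the comment is the DOM-side equivalent.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Hypothetical stand-in for a DOM element.
    class NavLink
    {
        public string Href;
        public bool Visible;
        public string CssClass;
    }

    class LinqVsJQueryDemo
    {
        static void Main()
        {
            var links = new List<NavLink>
            {
                new NavLink { Href = "/home",  Visible = true  },
                new NavLink { Href = "/admin", Visible = false }
            };

            // jQuery does this to DOM nodes in one terse line, no for loop:
            //   $("ul.nav li a:visible").addClass("active");
            // LINQ gives C# the same query-then-operate terseness over managed objects:
            foreach (var link in links.Where(l => l.Visible))
                link.CssClass = "active";

            Console.WriteLine(links.Count(l => l.CssClass == "active")); // prints 1
        }
    }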
Notice that I didn't mention things like mySQL, Ruby, PHP, Apache Web Server, Flash/Flex, Java, or Ubuntu Linux as key items on the list. All of these, while in some cases being innovative and even heavily used, simply don't match up with the depth of features, usability, and/or stability of those mentioned above. mySQL is a half-baked wannabe, Rails' founder presents himself to supporters offering "constructive criticism" with a big "F*** YOU" on the overhead projector, PHP is at a technical level no more special than ASP Classic using Javascript, Apache now needs to contend with IIS 7, Flash/Flex has been ousted (for its toolset) by Silverlight, Blend, and Visual Studio, Java has been beaten by C# / .NET, and Ubuntu is just another Linux distro that wants to mean something special but comes up short (yes, even with Compiz Fusion). They're good, but not good enough to bring shame upon Microsoft, or else Microsoft has finally managed to catch up and get a little ahead.

So How Can You Graphically Install Ubuntu 7.10 On Generic VGA, Anyway??

by Jon Davis 30. March 2008 21:13

I noticed in my Start menu that I had installed VirtualBox a few weeks ago. In a bored moment, I fired it up for the first time and started up the Installer/LiveCD for Ubuntu 7.10.

When I went to install Ubuntu, I got stuck right from the get-go. I couldn't click on 'Next'! The installer screen was way too large, and the high resolution VGA drivers weren't installed yet (as Ubuntu wasn't installed yet) so I couldn't change the resolution.

Classic moment of pure ludicrous idiocy here. Those Linux folks are always so smug, with such attitude, they deserve shame when they screw up this bad. Yay for corporations with coordinated QA teams!!

And yes, I did try using tab + spacebar. Got me to the next screen (time zone map), but tab doesn't work to change button focus on that screen; once the drop-down list has focus it won't let go of it with tab. 

Geek buddy says, "That's normal. Your environment can't support graphical mode installation. Graphical mode installation is for systems that can support it. Yours can't, because your VM video card isn't on the built-in drivers list."

That's crap. Hardware vendors, not OS distros, provide hardware drivers. Generic VGA @ 800x600 is a well-established minimum common standard. You install the hi-res video driver post-install; the fact that OS distros often have the driver bundled is just a bonus. Besides, I am in graphical mode!! If it's not supported because of resolution, it should say, "Sorry, you must reboot and enter Text mode to install, because in graphical mode we want to be promiscuous with your screen real estate when installing, and we don't know how to do that with your hardware." But that would still suck. Best to just scale down these ridiculous installation screens! Or, at *least*, set a maximum window height to the desktop and insert ugly window scrollbars if the height has maxed out.

Sure, perhaps I can track down valid hardware drivers (in this case VirtualBox drivers) and activate them somehow at runtime, just to get to the Next button. That's not the point. Sure, I can choose install in text mode. That's not the point, either. The point is that this is lunacy. If they just scaled down these windows, the user experience would have been acceptable. It's like these Linux people DEMAND and ENFORCE that you geek out just to get yourself initiated. Yet they keep bragging about how user-friendly Ubuntu and other distros like it are.

Open Source Operating Systems Written In C#

by Jon Davis 6. March 2008 10:44

Over at http://www.codeplex.com/singularity I came across mention of these...

Cool! I'll have to poke at these.

Uber Workstation: Windows Vista vs. Windows Server 2008

by Jon Davis 25. February 2008 12:08

I have always been adamant that as a web developer it is far better to use Windows Server 2003 rather than Windows XP as your primary workstation. This view became necessary primarily because Windows XP had a stripped-down set of IIS services: namely, it was IIS 5.1 rather than IIS 6.0, and it was constrained to not allow multiple virtual hosts on the same machine. This made XP worthless to me; as a web developer, being forced to build entire web applications as "subwebs" made things infinitely more difficult to develop against. For example, you could never have a simple hyperlink that starts with a slash ("/"). You had to build everything around the ASP/ASP.NET coding model of application root ("~/"), which required you to move all of your hyperlinks to server-side code (<asp:Hyperlink>, or <img src="<%= ResolveUrl("~/") %>images/bleah.gif">).
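For anyone who never suffered the subweb dance, here is a minimal sketch of that "~/" workaround in an ASP.NET code-behind; the page class and image path are hypothetical.

    using System;
    using System.Web.UI;

    public partial class Home : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // "~" expands to the application root at runtime, so the same
            // code survives being hosted as a subweb under XP's IIS.
            string src = ResolveUrl("~/images/bleah.gif");
            // Hosted as a subweb at /MyApp: src == "/MyApp/images/bleah.gif"
            // Hosted at the site root:     src == "/images/bleah.gif"
        }
    }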

No more. Windows Vista supports multiple web sites on IIS. Microsoft perhaps got tired of basically every web developer on the planet expressing their animosity toward the Windows team for crippling IIS without even so much as an alternate "IIS add-on for MSDN Universal subscribers" or something. It's full-blown IIS 7, same as in Windows Server 2008.

Now that Windows Server 2008 is released, the inevitable questions should be asked (rather than the answers assumed based on prior experience with XP / 2003): does Windows Server 2008 have any new features that Windows Vista doesn't have, that a typical ASP.NET web developer would want on his workstation, and does Windows Vista have any undesirable features that are not present in Windows Server 2008 that cannot be removed from Vista?

While the answer to both of these questions was "yes" for XP/2003, for Vista/2008 I believe the general answer to both is "no".

In Windows 2008 there are a gajillion new services that the next wave of Internet technologies will need on hand for regular development. For developers of one of these next-gen technologies, Server 2008 might be essential. But for basic ASP.NET and WCF development (in other words, for most web developers), Vista can suffice.

And Server 2008 doesn't really filter out anything from the Vista experience, except that the Vista experience is an option rather than mandatory. That's nice; but if it's going to be used for a workstation, it makes sense to just add it. The only problem is, it's not a complete Vista experience; you don't get the Sidebar, for instance, and Call of Duty 4 crashes for a co-worker / friend who agreed to be a Windows Server 2008-as-a-workstation guinea pig. And to be honest, I feel a lot more uncomfortable with all the undesirable new bells and whistles of Server 2008 being available to my workstation than with them missing from a Vista environment.

The only features I saw in Server 2008 that I didn't see in Vista that might be worth something to me were: Multipath I/O, TCP port sharing, and hypervisor (native virtualization) support (which is still in beta). Actually, Vista might have the first two of the three, I don't recall. But I already have VMWare Workstation, which I continue to prefer over that awful Virtual PC platform. Meanwhile, pretty much all of the other stuff, while some of it may be valuable, it's all so server-oriented and not development-oriented that it would make more sense to move that stuff to a VM or external environment anyway.

So my tentative conclusion is that Vista Ultimate is already the ideal environment for a web developer. With it, you have all the basics that you need to build multiple IIS solutions and to test basic WCF solutions. Meanwhile you get to keep the fluff you like (and I do like some fluff on my workstation, gimme Sidebar and stuff), while you can still kill off the fluff you don't like.

Why Not Big RAM, RAM Disk, and 10 Gigabit Switches and Adapters?

by Jon Davis 9. November 2007 14:54

This is just something I've been pondering lately. I've been doing a lot of work this year with Lucene.Net (a port of Lucene, which is written in Java, to .NET) to manage a search engine. In our configuration, it uses RAMDirectory objects to retain the indexes in memory, then it searches the indexed content as though it were on disk. It takes up a lot of RAM, but it's very performant. A search query, including the network load of transferring the XML-based query and the XML-based result set (over Windows Communication Foundation), typically lands in the range of about 0.05 seconds over a gigabit switch using standard, low-end, modern server hardware.

We don't just spider our sites with this stuff as with Google (or Nutch). We manually index our content using real field names and values per index, very similar to SQL Server tables except that you can have multiple same-name fields with different values in the same index record ("document"), which is great for multiple keywords. If we could get it to properly join indexes on fields the way SQL Server joins tables on fields, as well as to perform arbitrary functions or delegates as query parameters (which is DOABLE!!), we'd have ourselves something useful enough for us to consider throwing SQL Server completely out the window for read-only tasks and getting a hundredfold performance boost. Yes, I just said that!!
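Here is a minimal sketch of the RAMDirectory approach described above, written against the Lucene.Net 2.x API of that era; the field names and query are hypothetical, not our actual schema.

    using Lucene.Net.Analysis.Standard;
    using Lucene.Net.Documents;
    using Lucene.Net.Index;
    using Lucene.Net.QueryParsers;
    using Lucene.Net.Search;
    using Lucene.Net.Store;

    class RamIndexDemo
    {
        static void Main()
        {
            // The whole index lives in RAM; no disk I/O on the query path.
            RAMDirectory dir = new RAMDirectory();
            IndexWriter writer = new IndexWriter(dir, new StandardAnalyzer(), true);

            Document doc = new Document();
            doc.Add(new Field("title", "Lucene as a read-only database",
                Field.Store.YES, Field.Index.TOKENIZED));
            // Multiple same-name fields in one document ("record"), which a
            // SQL Server column can't do -- great for multiple keywords:
            doc.Add(new Field("keyword", "lucene", Field.Store.YES, Field.Index.UN_TOKENIZED));
            doc.Add(new Field("keyword", "search", Field.Store.YES, Field.Index.UN_TOKENIZED));
            writer.AddDocument(doc);
            writer.Close();

            IndexSearcher searcher = new IndexSearcher(dir);
            Query query = new QueryParser("title", new StandardAnalyzer()).Parse("database");
            Hits hits = searcher.Search(query);
            System.Console.WriteLine(hits.Length()); // prints 1
        }
    }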

Because of the load we put on RAM, trying to keep the I/O off the SCSI adapter and limit it to the memory bus, all of this has led me to question why network and RAM capacities have not evolved nearly as fast as hard drive capacities. It seems to me that a natural and clean way of managing the performance of any high-traffic, database-driven web site is to minimize the I/O contention, period. I hear about people spending big money on redundant database servers with all these terabytes of storage space, but then only, say, 16 GB of RAM and gigabit switch. And that's fine, I guess, considering how when the scale goes much higher than that, the prices escalate out of control.

That, then, is my frustration. I want 10 gigabit switches and adapters NOW. I want 128GB RAM on a single motherboard NOW. I want 512GB solid state drives NOW. And I want it all for less than fifteen grand. Come on, industry. Hurry up. :P

But assuming that the hardware became available, this kind of architectural shift would be a shift, indeed, that would also directly affect how server-side software is constructed. Microsoft Windows and SQL Server, in my opinion, should be overhauled. Windows should natively support RAM disks. Microsoft yanked an in-memory OLE-DB database provider a few years ago and I never understood why. And while I realize that SQL Server needs to be rock-solid for reliably persisting committed database transactions to long-term storage, there should be greater design flexibility in the database configuration, and greater runtime flexibility, such as in the Transact-SQL language, that determines how transactions persist (lazily or atomically).

Maybe I missed stuff that's already there, which is actually quite likely. I'm not exactly an extreme expert on SQL Server. I just find these particular aspects of data service optimizations an area of curiosity.

Don't Spread Yourself Too Thin

by Jon Davis 2. September 2007 20:32

At work, in any workplace really, the leaders and supporters (including myself, in whatever role I play as a consultant or software engineer or otherwise) are notably hindered, and their work devalued, in direct proportion to how spread out they are. I tend to look upon myself with shame, for instance, when I see myself as a "jack of all trades, master of none", which is something I really don't want to be. I'd much rather be "knowledgeable of much, master of much", even if "much" does not come anywhere close to "all". I feel like I should be able to afford that since I am single, have no life except the life in front of my computer(s), and my sole hobby is software. Even so, I struggle to master my skills and to stay on top of current technologies.

When we take our talents and spread them too wide in the workplace, we find ourselves making excuses for not taking the time to produce quality work. Mediocre output starts getting thrown out there, and no one is expected to apologize because everyone was too busy to take the time to pursue excellence in the output.

I've started to notice that the rule of "spreading yourself thin makes you worthless" is a universal truth that doesn't just apply to people but also to software. I've been "nerding out" on alternative operating systems so that's where I'm applying this lately. The Haiku OS project is distinct from some other projects in this way. Tidbits of wisdom like this have been sticking in my head:

http://haiku-os.org/documents/dev/what_are_you_looking_at

While we developers are used to getting into "the zone" and flinging code that (mostly) works the way it should, this is not the kind of focus to which I am referring. When developing an application, the main thing that must not remain as the sole focus is the technology. Technology is the means to the goal. Should we focus on the cool stuff we can build into software, we lose sight of what it is ultimately supposed to do — just like running a marathon while staring at your feet, one tends to not see problems until one quite literally runs into them. What is the goal? The ultimate goal is to write good, fast, stable software which does exactly what it is supposed to do — and nothing else — while making it accessible to the intended type of person, and only one person of that type.

This goes along with so many different ideologies in software architecture discussions that had me scratching my head as I had a hard time swallowing them, like, "Don't add members to your classes that are not going to meet the needs of the documented requirements, no matter how useful they seem." It took me a few days to accept that rule, because I constantly have ideas popping up in my head about how useful some method might be, even though I couldn't yet point to a requirement for it. But I later learned why it was an important rule. Once it gets added, it becomes a potential bug. All code is a potential bug. The bugs get QA'd on behalf of the requirements, not on behalf of the presence of the class members. Of course, TDD (test-driven development) and agile / XP practices introduce the notion of "never write code until you write a failing test", so that your new experimental method becomes integrated with the whole test case suite. But now you've just doubled to quadrupled your workload. Is it worth it? Well, it might not matter for a few simple methods, but if that becomes a pattern then eventually you have a measurable percentage of experimental code versus actual, practical requirements implementations.
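To illustrate that "failing test first" rule, here is a minimal NUnit-style sketch; the Slug class and its test are hypothetical, not from any real project.

    using NUnit.Framework;

    // Step 1: write the test first. It fails (SlugFor doesn't exist yet,
    // so it doesn't even compile), which counts as the initial failing test.
    [TestFixture]
    public class SlugTests
    {
        [Test]
        public void SlugFor_LowercasesAndHyphenatesTitles()
        {
            Assert.AreEqual("hello-world", Slug.SlugFor("Hello World"));
        }
    }

    // Step 2: write just enough production code to make the test pass,
    // and nothing speculative beyond it.
    public static class Slug
    {
        public static string SlugFor(string title)
        {
            return title.ToLowerInvariant().Replace(' ', '-');
        }
    }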

There is one thing to keep in mind about the Haiku quote above, though, and that is the fact that the above philosophy applies to applications, not to operating systems. The above-quoted article is about building user interfaces, which is an applications-level subject. Whereas the operating system should have, at one level, a reverse philosophy: technology over function. But let me qualify that: in an operating system, the technology IS the function. I say that because an operating system IS and MUST BE a software runtime platform. Therefore, the public-facing API (which is the technology) is key. But you're still focused on the requirements of the functions. You still have a pre-determined set of public-facing API interfaces that were determined ahead of time in order to meet the needs of some specific applications functionality. Those MUST be specific, and their implementations MUST be strict. There MUST be no unsupported public interfaces. And because the OS is the runtime environment for software, the underlying technology, whether assembly code vs. C runtime API vs. Common Language Runtime vs. Java Virtual Machine, etc., is all quite relevant in an operating system. I'd say that it's only relevant at the surface tier, but it's also relevant at lower tiers because each technology is an integration point. In the Singularity operating system, for instance, which is 99.9% C# code, it's relevant down to the deepest tier.

The reason why I bring up Haiku OS vs. other open-source initiatives, as if they had contrasting interests, is because they very much do. Haiku does in fact have a very specific and absolute milestone to reach, and that is compatibility with BeOS 5. It is not yet trying to "innovate", and despite any blog posts indicating opinions otherwise, that may in fact be a very good thing indeed. What happened with the Linux community is that all players are sprawled out across the whole universe of software ideas and concepts, but the end result is a huge number of instances of software projects, all of them mediocre and far too many of them being buggy and error-prone (like Samba, for instance).

More than that, Haiku, as indicated in the above quote, seems to have a pretty strict philosophy: "focus on good, solid, bug-free output of limited features, rather than throwing in every mediocre thing but the kitchen sink". Indeed, that's what Linux seems to do (throw in every mediocre thing but the kitchen sink). There's not much going on in the Haiku world. But with what little it does have, it sure does make me smile. I suppose some of that .. or maybe a lot of it .. has to do with the aesthetic design talent involved on the project. Linux is *functional*, and aesthetics are slapped on as an afterthought. Haiku appears at first glance as rather clean, down to its code. And clean is beautiful.

Going around adding futuristic stubs to code is something I've been guilty of in the past, but I've found it to be a truly awful, horrible practice, so much so it makes me moan with disgust and terror. It makes a mess of your public-facing APIs, where you have to keep turning to documentation (and breaking tests) to discover what's implemented and what is not. And it leaves key milestone code in a perpetual state of being unable to be RTM'd (released to manufacturing). The best way to write software is to take the most basic functional requirement, implement it, test it thoroughly until it works in all predictable scenarios, add the next piece of functionality, test that piece thoroughly (while testing the first piece of functionality all over again, in an automated fashion), and so on, until all of the requirements are met. In the case of an operating system, this is the only way to build a solid, stable, RTM-ready system.

Microsoft published some beta Managed DirectX 2.0 code a while back, and I posted on their newsgroups, "Why on earth are you calling this 'beta' when the definition of 'beta', as opposed to 'alpha', is that the functionality is SUPPOSED to be there but is buggy? Only 'alpha' releases should include empty, unimplemented stubs, yet you guys throw this stuff out there calling it 'beta'. How are we supposed to test this stuff if the stubs are there but we don't know if they're implemented until we try them?" Shortly after I posted that rant, they dropped the Managed DirectX 2.0 initiative and announced that the XNA initiative was going to completely replace it.  I obviously don't think that my post was the reason why they dropped the MDX2 initiative, but I do think it started a chain of discussions inside Microsoft that made them start rethinking all of their decision-making processes all the way around (not exclusively, but perhaps including, the beta / alpha issue I raised). Even if just one of their guys saw my rant and thought, "You know, I think this whole MDX2 thing was a mistake anyway, we should be focusing on XNA," I think my little rant triggered some doubts. 

The ReactOS project also had me scratching my head. Instead of littering the OS with placeholders of wanted Windows XP / Vista UI and API features while the kernel is still painfully unfinished (which indeed I have observed), what the ReactOS team should be doing is saying: okay, we're going to set these milestones, and we're not going to add a single stub or a single line of code for the next milestone until the current milestone has been reached. Our milestones are specifically and clearly defined:

  1. Get a primitive kernel booting off a hard drive. Execute some code at startup. Master introspection of the basic hardware (BIOS, hard drives, memory, CPU, keyboard, display, PCI, PCIE, USB controllers).
    • Test, test, TEST!! Stop here!! Do not pass Go! You may not proceed until all tests PASS!!
  2. Basic Execution OS. Implement a working but basic Windows NT-like kernel, HAL, FAT16/FAT32 filesystem, a basic user-mode runtime, a basic Ethernet + IPv4 network stack, and a DOS-style command line system for controlling and testing user-mode programs.
    • This implements a basic operating system that will execute C/C++ code and allows for future development of Win32 code and applications.
    • Test, test, TEST!! Stop here!! Do not pass Go! You may not proceed until all tests PASS!! 
    • Making this milestone flawless will result in an RTM-ready operating system that can compete with classic, old-school UNIX and MS-DOS.
  3. Basic Server OS. Implement Windows 98-level featureset of Win32 API functionality (except those that were deprecated) using Windows XP as the Win32 API design and stability standard, excluding Window handles and all GUI-related features.
    • This includes threading and protected memory if those were not already reached in Milestone 1.
    • Console only! No GUI yet.
    • Add registry, users, user profiles.
    • Add a basic COM registration subsystem.
    • Add NTFS support, including ACLs.
    • Add Windows Services support.
    • Complete the IPv4 network stack, make it solid.
    • This implements a second-generation command-line operating system that will execute multi-threaded Win32 console applications.
    • Test, test, TEST!! Stop here!! Do not pass Go! You may not proceed until all tests PASS!! 
    • Making this milestone flawless will result in an RTM-ready operating system that competes with Windows 2000 Server.
  4. Basic Workstation OS. Focus on Win32 GUI API and Windows XP-compatible video hardware driver support. Prep the HAL and Win32 APIs for future DirectX compatibility. Add a very lightweight GUI shell (one that does NOT try to look like Windows Explorer but that provides mouse-driven functionality to accomplish tasks).
    • This implements a lightweight compatibility layer for some Windows apps. This is a key milestone because it brings the mouse and the GUI into the context, and allows the GUI-driven public to begin testing and developing for the system.
    • Test, test, TEST!! Stop here!! Do not pass Go! You may not proceed until all tests PASS!!
    • This milestone brings the operating system to its current state (at 0.32), except that by having a stricter milestone and release schedule the operating system is now extremely stable.
  5. CLR support, Level 1
    • Execute MSIL (.NET 2.0 compatible).
    • Write new Windows services, such as a Web server, using the CLR.
    • This is huge. It adds .NET support and gives Microsoft .NET and the Mono project a run for their money. And the CLR alone gives understanding to the question of why Microsoft was originally tempted to call Windows Server 2003 "Windows Server .NET".
    • Test, test, TEST!! Stop here!! Do not pass Go! You may not proceed until all tests PASS!!
    • This introduces another level of cross-platform compatibility, and makes the operating system an alternative to Windows Server 2003 with .NET.
  6. CLR support, Level 2, and full DirectX 9 compatibility
    • Execute MSIL (.NET 3.0 compatible), including and primarily
      • "Avalon" / Windows Presentation Foundation (XAML-to-Direct3D, multimedia, etc)
      • "Indigo" / Windows Communication Foundation
      • WF (Workflow)
      • .. do we care about InfoCard?

These are some really difficult if not impossible hurdles to jump; there isn't even any real applications functionality in that list, except for APIs, a Web server, and a couple lightweight shells (console and GUI). But that's the whole point. From there, you can pretty much allow the OS to take on a life of its own (the Linux symptom). The end result, though, is a very clean and stable operating system core that has an RTM version from the get-go in some form, rather than a walking, bloated monster of a zillion features that are half- or non-implemented and that suffers a Blue Screen of Death at every turn.

PowerBlog proved to be an interesting learning opportunity for me in this regard. There are a number of things I did very wrong in that project, and a number of things I did quite right. Unfortunately, the number of things I did wrong outweighed the other, most notably:

  • Starting out with too many features I wanted to implement all at once
  • Starting implementation with a GUI rather than with the engine
  • Having no knowledge of TDD (test-driven development)
  • Implementing too many COM and Internet Explorer integration points (PowerBlog doesn't even compile on Windows Vista or on .NET v2.0)
  • Not paying close enough attention to server integration, server features (like comments and trackbacks), and Atom
  • Not getting around to implementing automatic photo insertion / upload support, even though I basically understood how it needed to be done on the blog article editor side (it was nearly implemented on the MetaWeblog API side)
  • Not enough client/server tests, nor mastery of C# to implement them quickly, nor knowledge of how to go about them in a virtualized manner
  • Too many moving parts for a single person to keep track of, including a script and template editor with code highlighting

What I did right:

  • Built the GUI with a specific vision of functionality and user experience in mind, with moderate success
  • Built the engine in a separate, pluggable library, with an extensibility model
  • The product did work very well, on a specific machine, under specific requirements (my personal blog), and to that end for my purposes it was a really killer app that was right on par with the current version of Windows Live Writer and FAR more functional (if a bit uglier)
  • Fully functional proof of concept and opportunity to discover the full life-cycle of product development including requirements, design, implementation, marketing, and sales (to some extent, but not so much customer support), and gain experience with full-blown VB6, COM/ActiveX, C#, XML, SOAP, XML-RPC

I suppose the more I think about the intricacies of PowerBlog and how I went about them, I have more and more pride in what I accomplished but that no one will ever appreciate but me. In fact, I started out writing this blog post as, "I am SO ashamed of PowerBlog," but I deleted all that because when I think about all the really cool features I successfully implemented, from a geeky perspective, wow, PowerBlog was really a lot of fun.

That said, it did fail, and it failed because I did not focus on a few, basic, simple objectives and test them from the bottom up.

I can't stop thinking about how globally applicable these principles are at every level, both specifically in software and broadly in my career and the choices I make in my everyday life. It's no different than a flask of water. Our time and energy can be spilled out all over the table, or carefully focused on meeting specific objectives. The latter is what separates a world of refined gold from a world full of mediocrity and waste.

There is one good thing to say about spreading yourself out in the first place, though. Spreading out and doing mediocre things solely with the intent of learning the idiosyncrasies of all that gets touched is key to knowing how best to develop the things that are being focused on. One can spend months testing out different patterns for a simple piece of functionality, but experiencing real-world failures and wasted effort on irrelevant but similar things makes a person more effective and productive at choosing a pattern for the specific instance and making it work. In other words, as the saying goes, a person who never fails is never going to succeed, because he never tried. In the broad-and-unfocused versus specific-and-focused philosophy, this means that pondering the bigger scope of things, testing them out, seeing that they didn't work, and realizing that the biggest problem was one of philosophy (I didn't FOCUS!) not only makes focusing more driven, but the experience gained from "going broad" will help during the implementation of the focused output, if only in understanding the variations of scenarios of each implementation.
