Open-Source Desktop: Giving It Another Go

by Jon Davis 31. August 2007 15:02

A week or so ago I posted a blog entry describing why I feel that Linux simply isn't the long-term answer to the need for an open-source, community-supported desktop operating system. I got some good feedback on this, as well as some not-so-helpful feedback ("here's another troll", "this goes out to everyone ELSE, not to Jon Davis, who has clearly made up his mind", etc.). I also commented that Haiku looks beautiful and is quite promising since it is based on (that is, inspired by and compatible with) BeOS, which is the closest thing I've seen yet to an OS done right, but that Haiku won't be the answer, either, until it gets past R1, which may or may not ever happen. Meanwhile, I brought up ReactOS and how it isn't the answer, either, because if we wanted Windows we could just install Windows. (You couldn't say that about Haiku / BeOS, because BeOS is no longer available.) Over the last week I downloaded the latest ReactOS build and ran it in VMWare. It's not nearly as far along as Haiku is in terms of stability (not to mention front-end aesthetic talent). Finally, one of my biggest complaints about Linux--the ridiculously arcane file system layout, which never seems to go away--seemed to have been resolved in Gobo Linux, until I realized that it's even worse: everything is Proper Cased, and since Linux uses a case-sensitive file naming system (which sucks), that makes Gobo Linux nearly unusable for administration; you have to constantly check the case of each and every letter rather than just trust that everything will be lower case.

I came across a few interesting tidbits of information since that post. I also got my old laptop back from repair (they replaced the keyboard, which was missing its 'O' key)--an Acer Aspire 5050 that I bought at Wal-Mart about ten or eleven months ago--and I decided that since that laptop has since been replaced, before I go pawn it off I should format the drives and actually try installing Ubuntu Linux on it, so that no one can say I've only tried Ubuntu within VMWare. Unfortunately, the latest Ubuntu Live CD doesn't even boot on my newer Toshiba Satellite X205-S9359, so I couldn't even try installing it onto the old laptop's hard drives. Sheesh.

Compiz Fusion seems to be for Linux what Aero is for Windows, at least in theory. I still have not gotten it to work; "GL Desktop", which I assume is related, doesn't seem to do anything when I turn it on from the System menu. OpenGL itself does work--I ran the OpenGL implementation of Tux Racer, and it worked beautifully. I tried the drivers from ATI/AMD, but the stuff won't execute. Changing the Compositing option in the xorg.conf file doesn't help. *sigh* Oh well, I'll keep tinkering. UPDATE: I did get it to work, partially. At least, I get the wobbly windows. Not much else, though; for instance, I can't get the Emerald themes to turn on. It seems there are some limitations on my video card chipset such that 3D support was disabled (even though my video chipset fully supports high-performance 3D).
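For reference, the compositing knob I'm talking about is this stock xorg.conf stanza (the section and option names are standard X.Org; whether flipping it actually helps depends entirely on the driver, and X has to be restarted for the change to take effect):

```
Section "Extensions"
    Option "Composite" "Enable"
EndSection
```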

Device support for Linux at install time is improving, as are the tools for device support, but it is still an awful mess. I have never seen so many files fly across my terminal screen in my life just to try to install the ALSA audio driver. And with it installed, there's still no sound. Hello, guys? If there's no sound, the audio control panels shouldn't behave like everything's hunky-dory. As for the specific sound card driver, I've scoured Google and the Ubuntu forums regarding ALC883 support, and it's clear that other people are having trouble with this sound card, so I'll have to keep tinkering with it. But that's beside the point with regard to it being a mess. Scanning the forums for support, it confounds me how comfortable people are with opening up configuration files and toying with them; the only difference now is that they use gedit instead of vi. Holy cow, someone needs to give these Linux developers a lesson on UX! It has nothing to do with editing in a Windows-like editor--I have actually finally gotten used to vi, and prefer not to have to move my hand to a mouse to reach the scrollbar.
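For what it's worth, the workaround the forums keep suggesting for ALC883-class codecs is to pass a model hint to the snd-hda-intel module. The file path and model string below are only one example of the pattern; the correct value varies by motherboard, which is exactly the kind of trial-and-error config-file tinkering I'm complaining about:

```
# /etc/modprobe.d/alsa-base (exact location varies by distro)
# Force a codec model hint for the Realtek ALC883; "3stack-6ch" is
# only an example -- the right value depends on the board wiring.
options snd-hda-intel model=3stack-6ch
```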

The ideal desktop operating system should not use antiquated techniques like service-proprietary configuration files, or configure / make / make install--not even if that stuff is hidden from view by some wrapper shell, which in the long run only makes things more complex. In fact, compiling anything outside of a JIT'er seems ridiculously arcane to me. Call it a matter of opinion, but that's one thing I really like about having a central authority (like Microsoft) to basically say, this is exactly how hardware drivers should be deployed. Don't get me wrong, Windows is a mess of its own with its Registry mess, etc. But then why do you think the notion of open source desktops has gotten me curious (and critical) lately?

On that point, yes, I get it: I get that Linux's advantage is that, being an open system, it is necessary for stuff to be "recompiled into the system". But it really makes me wonder why Bill Gates, rather than Linus Torvalds, is given the Borg treatment, when assimilation is done in Linux at a technical level at runtime in much the same way Microsoft traditionally did it using business agreements. For that matter, what's so wonderful about a system being "open" for some source code to compile against any of its many flavors and then (crossing fingers) maybe run, as opposed to having just a few "flavors" and putting time and energy (and, yes, money) into making sure that the stuff has already been compiled, is already known to run, and will almost certainly "just work", assuming there are no hardware driver conflicts?

Wouldn't it be ideal if there were a cleaner pluggable hardware abstraction model that the operating system exposed, one that drivers could just plug into more cleanly than the nerdy way the stuff is managed now? Isn't that essentially what a kernel is supposed to do, along with executing user-level applications? Virtual device drivers suddenly popped into my head, a la VMWare and its virtual machines and virtual devices. Abstraction is so cool. Say, why can't each basic hardware function that software expects to be able to use--file system I/O, video card / display, audio, keyboard, mouse--be cleanly tucked into a clean API that the hardware manufacturers' drivers sit on top of, rather than vice versa? Why can't we put them into a virtual hardware sandbox? Why can't hypervisors be taken to such an extent as to allow physical base-level hardware to be virtualized, so that each hardware device driver "sees" a virtualized reality?
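Just to make the daydream concrete, here's a toy sketch (in Python, with every name invented for illustration) of what I mean by drivers plugging in beneath a clean OS-defined contract. Real drivers deal in memory-mapped I/O and interrupts, not objects, so this is purely conceptual:

```python
from abc import ABC, abstractmethod

# Conceptual sketch of a "virtual hardware sandbox": the OS defines a
# small contract per device class, and vendor drivers implement it.
# All class and method names here are invented for illustration.

class AudioDevice(ABC):
    """The contract the OS exposes; a vendor driver plugs in beneath it."""

    @abstractmethod
    def write_samples(self, samples: bytes) -> int:
        """Queue PCM samples; return the number of bytes accepted."""

class NullAudioDriver(AudioDevice):
    """A trivial driver: accepts everything, plays nothing."""

    def write_samples(self, samples: bytes) -> int:
        return len(samples)

def play(device: AudioDevice, samples: bytes) -> int:
    # Applications (and the rest of the OS) only ever see the contract,
    # never the driver's internals.
    return device.write_samples(samples)

print(play(NullAudioDriver(), b"\x00" * 16))  # 16
```

The point of the sketch is simply that swapping the driver never changes the code above it, which is what the nerdy config-file approach fails to guarantee.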

Of course, then performance and virtualization management become the huge issues.

More importantly, this isn't a particularly realistic notion, since currently hardware drivers literally read/write to/from memory spaces that the kernel maps to the physical device, and execute by way of things like IRQ events. Sometimes I wonder, though, why even that shouldn't be rethought. But now I'm getting into real physical hardware design space, so it's not like I can just pull up a trusty C compiler and recompile a new motherboard. Besides, putting hardware device manufacturers into a software sandbox certainly stifles their opportunities to innovate.

Over last weekend I thought it would be cool, if naive, to actually spawn off YAOS (Yet Another Operating System), derived from nothing, but appended with virtual support for Win32 (like WINE) and Linux (like Cygwin), but inherently have its own system. After all, that is essentially what hypervisor operating systems propose to do. The difference is that it would be ideal if the hypervisor operating system itself could be a viable operating system.

VMWare ESX and Xen, being hypervisor operating systems, run on the Linux kernel variants.

Windows Server 2008, having hypervisor support (however limited, I'm not sure), runs on, well, Windows.

These are good starts on hypervisor concepts, but why not take the opportunity to flush out this legacy stuff and build a hypervisor-supporting system that can also be a new OS? Oh, how I would love it if, once R1 stabilizes, the Haiku operating system added hypervisor support!

I'll post here as I continue to tinker with Ubuntu on real hardware.

Pimping Out My Satellite

by Jon Davis 24. August 2007 00:10

I decided to buy a laptop to replace the cheap one I had bought at Wal-Mart about ten months ago. It was a rare (read: unpopular) Acer laptop. It was just a cheap $700 thing (it could sell now for $500-600 new), and I upgraded the RAM and hard drive, but my biggest problem with it was that the screen size and resolution were just too small and weak. Then the 'O' key fell off the keyboard while I was typing. It's still under warranty and still out getting its free repair. It took me several months to get around to sending it off because it was still usable. I prefer to use a laptop for everyday home use (even though I've got a couple of desktop machines sitting around at home, I like using the laptop while the TV is in view, not to mention the obvious need to take it with me on the road and to the office). With the 'O' key missing, I was still able to type by pressing into the little hole there, which is why it took so long for me to get around to sending it off for repair, but it was awkward enough that I hardly used the laptop at all.

Due to some handy alignment of the moons and stars, credit, and cash, I was able to have an almost unlimited budget for a replacement, on the expectation that I will sell my now-obsolete laptop along with some other stuff. I definitely wanted to spend four digits on an upgrade, and I wanted something I could depend on more regularly: performant, with lots of memory and plenty of hard drive space. Since I'm a gamer, I was also looking for DirectX 10 support. I was looking at the revised XPS line of Dell laptops, namely the Dell XPS M1330, but as handy as ultraportables are, I like big keyboards, and I especially wanted a high-resolution display (in the 1600-pixel range for width). The Dell XPS M1710 looked too similar to (or was the same model as) a co-worker's machine (I like to be somewhat unique), and the Dell XPS M2010 isn't really a laptop, nor even a "portable desktop replacement", so much as a full-blown desktop PC with a handle.

There's a Fry's Electronics not far from where I work, so at the end of the day yesterday (Wednesday) I went over there to see if they had anything interesting. I wasn't impressed; everything they have--everything almost everyone has, it seems--is either expensive, useless mini-gizmo gadgetry like the Sony UX Micro PC or just a bunch of cheap low-end consumer stuff in the $500-900 range. But out of three or four aisles and forty or so laptop models, they did have five or six mid-range to high-end consumer PCs. I wasn't impressed with those, either, except that one of them really kept drawing me in. It was a Toshiba Satellite X205-S9359, selling at about $500-1000 over my planned price range, but the more I looked at it the more I felt compelled to consider it. Besides looking absolutely stunning on its own, the bulleted list of features on the display decals had me raising my eyebrows:

  • 1680 x 1050 resolution WSXGA TruBrite display @ 17 inches
    • this is perfect; Dell's high-end laptops have had even greater resolutions but just too small, making me squint
  • Intel Core 2 Duo processor (T7300)
  • GeForce 8700M GT (DirectX 10 compatible) with 512MB VRAM
  • 320GB hard drive space (two drives)
  • 1Gb/s (gigabit) LAN
  • .. and some other, rather expensive stuff I didn't care about like ..
    • HD DVD-ROM
    • USB HD TV Tuner
    • 4 Harman Kardon Speakers with subwoofer
    • Dolby Home Theater technology
    • Built-in webcam
    • HDMI output
    • firewire / IEEE 1394
    • fingerprint scanner
    • 2GB RAM
    • Bluetooth

Since I already have an Xbox 360 with the HD-DVD add-on, and I have an external HD TV tuner that I bought at a recent CompUSA going out of business sale and I'm not using it, and I have an extra webcam lying around, and I knew I wanted to upgrade the RAM to 4GB which meant eBaying the 2GB, and I have no need for fingerprint security, it seemed to be an obvious waste of money. But I bought it anyway, because the processor, display resolution, future-readiness of the gaming graphics, and keyboard quality could not have been more perfect. Nobody else hit the nail on the head so perfectly from what I could tell, except for HP. I could've gone with HP. I didn't because this one was right here, I could put my hands on it, plus I could buy a RAM upgrade to 4GB while there at Fry's.

The hard drive speed is the only other spec that needed an upgrade. 5600 RPM is faster than 4200 RPM, but it's still slow, and I can feel it. Hitachi has a 200GB 7200 RPM drive, and this laptop supports two drives, so I bought two of those Hitachis today over the Internet. I'll eBay the 5600 RPM drives after the 7200 RPM drives arrive.

Since using all 4GB of upgraded RAM requires Windows Vista 64-bit (you can use the /3GB switch on 32-bit Windows, but that is prone to running out of kernel resources, and device hardware address space still claims the upper portion of the 32-bit address range), the new question becomes: does Toshiba support it? After all, at this point the only scenario in which Windows Vista 64-bit doesn't work out for most people is a lack of hardware driver support. Fortunately, most OEM hardware vendors have caught up with the demand for 64-bit drivers; this was not true just months ago, but it seems to be true now. Unfortunately, however, Toshiba is not among those vendors.
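To put rough numbers on the addressing problem (the reserved amount below is illustrative, not measured; the real figure varies by machine and video card):

```python
# Why a 32-bit OS can't see all 4GB of RAM: devices claim part of the
# 32-bit address space. The MMIO figure is an illustrative assumption.
addr_space = 2 ** 32            # 4 GiB of 32-bit addressable space
mmio_reserved = 768 * 2 ** 20   # e.g. ~768 MiB mapped to video RAM and devices
usable_ram = addr_space - mmio_reserved
print(usable_ram / 2 ** 30)     # 3.25 -- GiB of RAM actually visible
```

A 64-bit OS sidesteps the whole squeeze because the physical RAM can be mapped above the 4GB line, out of the devices' way.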

None of the drivers that Toshiba provides on their support web site for the Satellite X205-S9359 are even labeled as 32-bit, yet they are all essentially 32-bit only. It's almost as if they're living in some kind of wacky dreamland where 64-bit operating systems don't exist, so there's no reason to differentiate the downloads. This is ironic, because the Toshiba hardware (Core 2 Duo) fully supports an x64 operating system, despite the lack of drivers.

On the other hand, many of the downloads that Toshiba's support web site provides are OEM software packages that are dual-format 32-bit & 64-bit. I was able to get the essentials installed, but the video card drivers--the most important driver after the LAN driver--had to be obtained from a third-party source, which had me uncomfortable, of course, as I posted previously. Even so, I have full resolution with Aero support, and Lord of the Rings Online at maximum quality settings looks absolutely stunning.

Among the hardware devices on the laptop that I noticed are working with Vista 64-bit:

  • video adapter (using the third-party drivers) and Direct3D support
  • LAN
  • audio
  • webcam (full software install worked)

Among the drivers that wouldn't install or don't seem to be configured correctly in Vista 64-bit:

  • the fingerprint scanner / software, despite the same version being available in 64-bit format for purchase at the OEM manufacturer's web site
  • one or two of the Intel chipset driver installers
  • Bluetooth software wouldn't detect the hardware

Not yet tested:

  • HD-DVD playback
  • HDMI output
  • Wireless LAN
  • DirectX 10 functionality on the video card
  • TV tuner

UPDATE: To follow up on the "not yet tested" list, these proved to work:

  • Wireless LAN
  • DirectX 10 functionality (using the third-party drivers)

.. and these proved not to work:

  • HD-DVD playback
  • TV tuner

I'll have to reinstall everything when my hard drive upgrades arrive, but so far the test run seems to be going successfully. There's a 15-day return policy at Fry's that I had expected to take advantage of, but I am feeling more and more confident that there will be no need to return it. On a side note, one additional disappointment is that there is NO media / restore disc provided with the laptop, so if you're not as self-sufficient as I am, with my MSDN access to Windows Vista Ultimate, etc., you'll have to plan on going through hassle-channels rather than just fetching a disc. The first thing I did was use the "back up my computer" function in Windows Vista so I can restore the system to the original configuration (I backed up to the second hard drive). But except for Windows itself, the software, bloatware included, was available from Toshiba's web site.

With my basic (even if costly) customizations, this is by far the most expensive computer purchase I have ever made in my life. That said, though, it's also going to be the coolest and most versatile gaming and software development workstation PC I've ever had.

Gobo Linux: My Wish Has Come True!

by Jon Davis 20. August 2007 21:39

Over the weekend I was whining (a lot) about the lame directory structure in Linux and how ridiculously impossible the gobbledygook is to understand at a glance. Someone pointed me to a link which explains the mess that it is, and when I looked at it I discovered exactly what I was dreaming of: a version of Linux that has the whole structure cleaned up and making sense!!!

Gobo Linux!  ...

Downloading now!!

Too bad, though, the directory names are in Mixed Case. Since Linux is case-sensitive, this'll be cumbersome... 

UPDATE: Poking around at it now. Yeah, those Mixed Case names are a real bear. The concept is precisely what I had in mind -- refactor the whole directory structure, and then just use symbolic links for the old system in order to have compatibility. But not THESE file names, not Mixed Case. This makes me wish I could just go and run with GoboLinux and make a distribution of Gobo that goes back to lower-case names....
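The compatibility-symlink trick is easy to demo in a throwaway sandbox directory (the paths below are made-up examples of the pattern, not GoboLinux's actual layout):

```shell
# A clean, readable tree up front; the legacy path survives as a symlink.
mkdir -p /tmp/gobo-demo/Programs /tmp/gobo-demo/System
ln -sfn Programs /tmp/gobo-demo/usr   # old-style path resolves into the new tree
readlink /tmp/gobo-demo/usr           # prints: Programs
```

Old software keeps resolving its hard-coded paths, while a human browsing the tree sees only the sane names -- which is the whole appeal of the approach.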

Haiku: Still Not The Desktop Answer

by Jon Davis 19. August 2007 23:05

Alright, now that I've downloaded the demo VM I previously linked to, the one that has several BeOS apps installed, I've reached a new conclusion about Haiku.

My conclusion is this: Haiku is to BeOS what ReactOS is to Windows NT 4.0. It's a whole lot of effort to emulate an old architecture all the way down to the bottom-most level, but being a third-party rehash it will never emulate the original perfectly, and it will always reflect the old milestone of the OS it set out to emulate.

In other words, Haiku OS is not a futuristic operating system, it's a remake of an old one, and it has some problems.

  • It's focused solely on legacy support. What a horrible way to build a new operating system; yes, I know, the advantage is that you get all this BeOS software up and running on the system right away, but a) it will never work like it did on Be, and b) software on Be in general only had a few short-lived years to mature. Or let me put it this way: I'm really happy and excited for Haiku being what it is--a nostalgic OS that might still have a future once they wrap up R1 (whenever that happens, if ever) and get started on R2. But until R1 is out of the way, there's nothing much I, for one, can do; I'm NOT going to write BeOS software targeting R1, considering R1 emulates a Windows 98-generation operating system.
  • I posted last night that I thought its base directory structure was perfect. Poking around now with a more loaded system, I retract that. For example, I don't like the idea of apps being installed in three different places: in the root directory, in a "home" directory, and in the "beos" directory. WAY too confusing. You have "develop" and "apps" directories in root, in home, and in beos, but they should all be in root, and there should be only one instance of the set.
  • I see no user profile support here. I don't recall, did BeOS have a login process? The old Mac environment didn't force you to log in and access the OS through a user profile, but in my opinion that is simply the evolution of operating system technology, like multiprocessing. Without a user profile, you have a single-user machine, which makes the OS good for, yep, good ol' mobile devices. (No wonder BeOS was sold to Palm!)
  • The yellow/orange titlebar taking up only the width of the title is not just an aesthetic statement, nor does it appeal to my inner sense of geek, as in, "Cool, a window that isn't perfectly rectangular". It's actually an annoyance. My mouse scrolls on the right, always. It accesses the deskbar on the right. It usually resizes the window from the right. Why must I move the mouse all the way to the top left in order to relocate the window or hit the close button? And yes, I know you can move the window around using the border. But it's such a tiny little border. I've got a 20" monitor running at a resolution of 1600x1200, running Haiku in a little 800x600 VMWare window. It's dang hard for me to find the one or two pixels of border width where I can hold the mouse button down to drag the window; I'd rather move the mouse to the top left where the titlebar is. Which brings me back to my original complaint: make it flush right, please. Not to mention, it makes a terribly '90s-ish aesthetic statement. (I can get left-aligned narrow title bars in Microsoft Windows using Stardock's WindowBlinds. Needless to say, I have no interest in using it.)
  • Half the apps lock up when I try to close out of them. This is because it's pre-alpha, though, so no real surprise there.

I notice Fedora there in my VMWare VMs list, and I ask myself, how does that make me feel? I feel like I miss Fedora, with all its many features and completeness. Which is ironic because I just posted a long post last night about how unimpressed I am with Linux. I'm reminded yet again of what it takes to build an OS. These guys have been at it for six years, yet they don't have an Alpha release done yet. No wonder the geeks hesitate, despite the tools. Without many millions of dollars on hand to invest in people's efforts, and at least as many years, there's just not enough motivation to get something like that done, or done right.

I must admit, though, the idea of seeing a fork of Haiku once it reaches some stable state, and producing an alternate "distro" with all new rules for toolset and GUI, appeals to my imagination. But now suddenly I am more able to understand why there are hundreds of distros of Linux.

I suppose my interest in alternate OS's is related to the wish for a good, approachable, maintainable, community-embraceable OS that abandons UNIX and other unapproachable geekiness, as well as proprietary and abusive underpinnings like Microsoft's inconsistent use and overuse of the Windows registry. I'd love it if I could just sit down with some architects and plan out how a good operating system should be built. As I said before, BeOS came (and Haiku comes) close, but it's already old and just short of archaic considering all that we enjoy in Windows Vista and Mac OS X, from both a programmer and a user perspective.

I will keep a close eye on Haiku for a while. Were I to just sit down and construct an OS (and I were a god, which I ain't, and if I were a billionaire, which I also am not), I'd probably start with Haiku, because I like the simplicity of its basic setup--no "/var/", "/opt/", or other gobbledygook, and libraries are pretty plainly organized--and I like its performance, basic device support, SMP support, and the total elimination of text-mode bootup junk. I'd add a registry, and build it on something like SQLite or Berkeley DB, something like Elektra. I'd make it user-profile oriented, and put users in a 'users' directory, like Vista. I'd make sure the file system had all the key features of NTFS, like security and self-repair... or heck, maybe just add full NTFS support and disk checking. In so doing I'd make sure ACLs (access control lists) were supported per-file, per-directory, and per other objects. Then I'd tack on cross-platform APIs like Java and Mono. Then I'd get a killer C++ IDE added and fully functional, something on par with Eclipse for Java (oh, and for Java support I'd make sure Eclipse worked well). Then I'd tackle the 3D graphics API, pushing beyond OpenGL to look for a Windows Presentation Foundation look-alike that had a small footprint but supported 3D and other multimedia. I'd rewrite the shell in that API, taking a cue from Windows Vista Aero. (The goal being not to make it bloated but to make it both fast and powerful, by harnessing the power of modern video cards and delivering powerful, clean, and convenient programming APIs.)
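The registry-on-an-embedded-database idea is simple enough to sketch. Here's a hypothetical miniature in Python on SQLite (every name is made up for illustration; a real implementation would need typed values, ACLs, and transactions):

```python
import sqlite3

# A registry as one table of hierarchical, path-style keys in an
# embedded database. Illustrative sketch only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE registry (key TEXT PRIMARY KEY, value TEXT)")

def reg_set(key, value):
    # Insert or overwrite the value at the given path-style key.
    conn.execute(
        "INSERT INTO registry (key, value) VALUES (?, ?) "
        "ON CONFLICT(key) DO UPDATE SET value = excluded.value",
        (key, value),
    )

def reg_get(key):
    # Return the stored value, or None if the key doesn't exist.
    row = conn.execute(
        "SELECT value FROM registry WHERE key = ?", (key,)
    ).fetchone()
    return row[0] if row else None

reg_set("/System/Shell/Theme", "Aero-like")
print(reg_get("/System/Shell/Theme"))  # Aero-like
```

Deleting an app's whole branch then becomes one DELETE with a key prefix, which is exactly the one-branch-per-app cleanliness I wish the Windows registry enforced.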

Or, I'd rebuild something from scratch like what I've heard of Singularity, and tackle features on a per-application-requirements basis.

My original silly idea, which led to last night's long-winded rant, was that someone should sit down with Linux and just completely refactor the whole thing top to bottom: throwing out legacy compatibility but keeping the GNU tools; cleaning the whole thing up, starting with the elimination of those oddball directories; replacing the X Window subsystem with one that is more to-the-metal, basic, and user- and developer-friendly (and that starts up first, to hide the text-mode startup junk); killing off or completely deprecating old-school tools and ways of doing things; adding hidden symbolic links from those old, ugly paths to a compatibility subdirectory; and reworking developer tools like Java and Python, one by one, to be compatible with this new system. I'd sell it as a whole new Linux "distro", like Debian but much more heavily reworked.

But then I realized, all of that is almost what Haiku does. Except Haiku's focus is to reproduce Be, not to refactor Linux top-to-bottom without constraints. I think the latter is more attainable, useful, and future-friendly.

Honestly, I don't know. I'm awfully naive, but not so naive as to think that my imagining these things should be taken seriously. As much of a geek I am, I'm not qualified to even try to play any part of any of this unless it was high-level. I understand general, high-level architecture well enough, but I couldn't write a device driver, let alone a full-blown kernel, nor do I know enough people who would have an interest in taking on roles in building the hundreds of different parts of a complete OS.

Even so, it's fun to dream, isn't it.


Computers and Internet | Operating Systems | Software Development

Haiku is getting close

by Jon Davis 19. August 2007 18:57

So I only recently (as in, last night) stumbled upon Haiku, the complete open source operating system built from scratch to mirror, and have binary compatibility with, good old BeOS. It's not done yet--not even in a public beta yet. But after six years (the project just had its 6th birthday), I've played with its base VMWare image and I can say that it's looking really solid. Definitely simple and lightweight right now, sure, but it's coming along, seems very fast and stable, and looks to be, by far, good enough to start enticing a lot more geeks to participate. When it can recompile itself within itself (and I believe they mention that that milestone is really close), so that you don't need Linux to compile the OS, I think that will be the beginning of a new OS era.

One quick observation of the Haiku community: IMO it mirrors the BeOS community (the two are mostly, though not entirely, one and the same), and I think it suffers from a bit too much antagonism toward commercial OS's (namely Windows), which I really abhor. I think Windows Vista and Mac OS X are both very fine operating systems, despite their many flaws, and a decade ago I always thought it was ever so annoying, and crippling to progress, for the BeOS community (and now the Haiku community) to constantly berate proven ideas in use in Windows and in Mac OS and dismiss them as horrible and stupid, just because the ideas were introduced in those operating systems. For example, the Registry. I was around in the Windows 3.1 days, and I thought the registry was a really good idea. I still think so; I just think Microsoft abuses it, and I think MS should have published and followed some better best practices for where to put installation audits, so that if an app gets manually removed there is only one branch that needs to be deleted. But the idea of a consistent central repository of configuration information--for installed apps that integrate with the OS as well as for the system itself--is a very good one. Anyway, Be folks (Haiku folks) just whine and complain about that stuff: waa, I hate Windows, waa. The problem is that their hatred of all things exclusive to Windows has traditionally blinded them. Microsoft spends hundreds of millions of dollars researching operating system best practices and usability. They've made a lot of mistakes, but their results, especially their exclusive "features" like the registry, should not go completely unnoticed.

I got so annoyed by it that circa 1999 or so I tried to start a web site called BeCritical, criticizing the Be design for things like neglecting to inform the user of a busy state when double-clicking an application icon, just for the sake of trying not to be like Windows. I even conducted and posted a real e-mail-based interview with their role model, Alan Cooper, author of the book About Face, to gain his insight. But I was hosting the web site locally in my home office (as I do now with this blog), and the whole site was lost to a bad hard drive (or something) shortly thereafter; I hadn't learned to back up my stuff, and so obviously I didn't rebuild it.

Last night I wasn't sure how to get Be OS software installed on my copy of the VMWare image, since a web browser hasn't been added yet. I saw the FTP command line client is present, but using that assumes that what I might try out is FTP-accessible (UPDATE: wget is in there as well).

But today I found a better way to sample BeOS software on this thing. Someone in the community has started to compile a big huge VMWare image of a bunch of BeOS software apps running within Haiku. I'm downloading now ...


Computers and Internet | Operating Systems

Why Linux Isn't The Open Source Desktop Answer

by Jon Davis 19. August 2007 01:59

On the PC, Windows wins, Linux loses, and I'm not even cheering for either side. The Mac is just off on the side, looking pretty like a squad of cheerleaders, but I have my eyes fixated straight up at the stars above. Even though I am sitting on the Windows side of the field, with my money invested in the team only because that's where I know the money will come back to me, I'm feeling bummed out, and more than a little bored, with this two- or three-sided competition.

Let's face it, there are only three major computer operating systems with any relevance in the modern personal computer marketplace: Mac OS, *nix, and Windows. And since Mac OS is, as of OS X, a rewrite built on Mach and BSD, a variant of *nix, there are actually only two; but since so much originality remains invested in the Mac environment, it deserves recognition of its own.

And then there was BeOS, a truly original operating system that was complete, thorough, fast, cool, beautiful, happily geeky yet user-friendly enough for my mother to use, but tragically lost to a failure to win the market. After its demise, its intellectual assets went the way of mobile computing, which reminds me a great deal of the exact same process experienced by GEOS (my first exposure to a *real* user operating system) a decade prior, which in my view was the most innovative output of pure genius ever to execute in 64 kilobytes of memory. GEOS gave the early Macintosh a run for its money, much like BeOS did ten years later, but both--and especially the latter--struggled to retain (or, for all I know, even obtain) profitability.

The obvious reason for their failure is perhaps that the Macintosh and Windows operating systems already had their foot in the door, and consumers couldn't handle choosing between more than two systems. But I tend to think it had more to do with hardware. GEOS ran on the Commodore 64 when the C64 was already proving itself antiquated, and except for the Commodore 128 (technically, the MOS Technology 6502, till 16-bit GEOS came about for x86) it didn't really have enough hardware on which to install itself, at least not soon enough to matter. The Be operating system, meanwhile, started out on the PowerPC CPU. In fact, it was the true power of the PowerPC that initiated the dream of Be. Be's founder was a former Apple employee who was frustrated by the fact that the Macintosh operating system simply wasn't taking advantage of the hardware, and BeOS was intent on harnessing that power. Meanwhile, PowerPC chips were manufactured by IBM and Motorola, but they weren't exactly easy to come by on a PC except in Apple Macintosh hardware and proprietary systems. (Said proprietary systems included BeOS's own hardware, the BeBox.) When that proved to limit the BeOS market too heavily, BeOS re-targeted itself to the i386 platform, but by then it was too late; the excitement of a fresh new geeky operating system had waned, and now it was only a novelty.

To be more precise, and to agree with the observations of the general public, there was just too little reason for people to switch to Be. Not enough hardware driver support. Not enough software. Windows and Mac met those needs fine. Be had something for everyone, to be sure, but by the time the corporate funds ran out, it just wasn't nearly enough. Being a closed-source system on a commercial budget, the project simply dried up. An operating system of that magnitude really needs a decade of development in order to take on the industry.

Then along comes Linus Torvalds. The guy slaps together (okay, that's harsh; he carefully knits together) a Unix-like kernel that runs on the i386 platform and makes it open source. The Linux operating system is born. Suddenly the UNIX crowd--a huge crowd in the computing realm--has a zero-budget operating system for its microcomputer workstations that is stable, clean, fun, and cool (and isn't FreeBSD).

A decade later, we have a gajillion Linux "distros" (customized, typically branded distributions of the Linux operating system), a whole lot of eye candy, development tools, ridiculously commonplace mention in geek and business tabloids, and a ubiquitous following among the non-Microsoft geek community. Hardly a day goes by that the word "Ubuntu" doesn't cross my web browser window at least two or three times; and if that ever stops, it will only be because another distro has replaced it.

So here I am now with VMWare, playing once again with an instance of Fedora (another Linux distro), and scratching my head wondering why, in one day, I have had to delete the entire thing and start over three times just to get everything installed and up to date, when it has all been made so click-easy. I wonder why I have lost at least three non-consecutive nights at the office to setting up Linux distros, if tools like 'yum' make it so quick and simple.

And of course the answer to my wonderings is consistently the same as it was many years ago when Linux first showed itself in my living room: despite the limitless extent of scripts, quick commands, and distribution GUI tools, it's still the same geek operating system that demands flawless geek sequencing of configuration and implementation. What do you do when RPM dependency trees get out of sync or out of whack? Spend many hours rebuilding them, or start over from scratch. What do you do when the VM gets suddenly reset (because a whole frickin' gigabyte of physical RAM didn't get allocated to it) while 'pup' extras are downloading and installing? Either go back, uninstall everything that got (half-)installed, and reinstall it all, or just start over and format the drive. Then there's the matter of the whole stack of libraries for things like MySQL. And, oh, the pain of getting MonoDevelop to just install on Fedora 7, with missing Gtk# libraries leading me down a hell-hole path of looking for GTK+, Pango, GLib, ATK, Cairo, configure this, make that, oh I give up, please just kill me now!!

In fact, there is a laundry list of reasons why I think this whole operating system is just not the answer for converting Windows developers.

1. The directory structure isn't. It isn't a structure. It's a mess. I mean, what is the *real* difference between, oh, let's see, opt, usr, and var? I realize that some things just start trickling together into the same place, like putting configuration bits into a directory called "etc". But, I mean, for goodness' sake... putting configuration files into a folder named after the abbreviation for "etcetera"? Putting system-wide executable programs into a directory that looks like it's named after the user ("/usr/bin")? Putting the user's files into a directory called "home"? There is no rhyme or reason to this. It just is. And it is, because the geeks are used to it; they know where to put this stuff like they know how to deal with their siblings. But the so-called structure is maintained by memory and familiarity, not by sensibility.

Both the Macintosh and Windows operating systems make sense with their system directory structures. Or at least the Macintosh I once knew, with System 6 and System 7, did (I haven't looked at OS X closely yet). The operating system was in a directory called "System". Fonts were put into a pseudo-directory (a "suitcase") in the System directory called "Fonts". User documents were put into "Documents", applications into "Applications". And on Windows, it's a bit messier but still makes more sense than opt/var/usr. The operating system files go into "Windows", the name of the operating system. Software applications go into "Program Files" (with a space in the name that has driven us all crazy, I know, but at least it's obvious what it's for). Shared DLLs now go into Program Files, in a subdirectory called "Common Files". User documents and preferences once went into a horrible place called "Profiles" in the Windows directory, but that was moved to "Documents and Settings" in the root directory, and then in Windows Vista it moved again to just "Users". A lot of moving around, but always in pursuit of sensible structure. And inside the user's profile directory you have a whole new world, especially in Vista, with an organized directory structure for documents, multimedia files, temp files, app settings, and more. Each directory is sensibly named; no geeky bull. As Microsoft so annoyingly puts it, it's "people-ready".

And don't get me started on how brilliantly simple, perfect, readable, yet sufficiently geeky (terse and lowercase) Be OS's directory naming convention was...
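For the record, the conventional Unix layout does have definitions; the Filesystem Hierarchy Standard spells them out, even if the names themselves betray nothing. A quick sketch, runnable in any POSIX shell (the directory meanings are paraphrased from the FHS; whether they feel "sensible" is exactly the point in dispute):

```shell
# Probe the standard top-level directories; the names are historical,
# the meanings are decreed by the Filesystem Hierarchy Standard (FHS).
for d in /etc /usr /var /opt /home; do
  [ -d "$d" ] && echo "$d is present"
done
# /etc  : host-specific configuration files (no longer really "etcetera")
# /usr  : shareable, read-only programs and data (not the "user's" anything)
# /var  : variable data, i.e. logs, mail spools, caches
# /opt  : add-on application software packages
# /home : users' personal directories
```

Which is to say: you can memorize the decree, but you can't derive it from the names.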

2. It's consistent in things that users don't want to see, and inconsistent in things that they do. So many distros, each one with its own touches. So many window managers, each one with its own capabilities and layouts. But no one is cleaning up the ugly bits. All that stuff is tucked away, swept under the bed, hidden behind a nice GNOME/KDE/other user interface that attempts (and fails, miserably) to shield the novice end user from being exposed to the senseless geekiness that is the UNIX underpinnings of a very old operating system. And even where the shielding exists, it doesn't work. For instance, Fedora 7 has a GUI configurator for the Apache web server, but if you use it you will break Apache, because you'll end up with two different configuration files in the "etc" directory and, between the two of them, two bindings to port 80. This doesn't just get fixed through the update pipeline (or hadn't when this bug bit me) because Fedora's contributors are busy working on so many other bugs. This is the big problem with having so many distros: the efforts are forked across a couple hundred repeat attempts to provide a custom solution to the same problem, and no one is ultimately accountable for leadership except the volunteers behind a particular instance.
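To illustrate the failure mode (a hypothetical reconstruction of the kind of collision described above, not the literal Fedora 7 files): when a hand-maintained file and a tool-generated file both claim the same port, Apache refuses to start.

```apache
# /etc/httpd/conf/httpd.conf  (the hand-edited configuration)
Listen 80

# /etc/httpd/conf.d/gui-generated.conf  (hypothetical file written by the GUI tool;
# everything in conf.d/ is pulled in by an Include directive)
Listen 80

# Result at startup: an "Address already in use: could not bind to
# address 0.0.0.0:80" error, and httpd exits.
```

Neither file is wrong on its own; the breakage lives in the combination, which is precisely what a GUI configurator is supposed to prevent.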

3. It is bound to its legacy. Thanks to the very careful and successful handiwork of incredibly smart software programmers, the ancient operating system has evolved with total support for both legacy and modern architectures. That fact should bring a smile to any Linux geek's face, but it is not a good thing. Think Windows 95. Windows 98. Windows ME. Each of these was an evolution of a really, really poorly architected operating system, one that Microsoft seemed to expand with bubble gum and tape. (In actuality, bubble gum and tape were not used. Microsoft had a bit more money than that, enough to afford caulk and nails, of the brittle sort. The problem was a broken foundation, which was ultimately replaced with the reworked NT4/2000 codebase.) Linux fortunately has a very stable foundation on which to add all these new evolutionary features, but the support it carries for legacy software and the great multitude of development libraries is also an undying foundation that can never be declared antiquated, because so much depends on it.

I'm glad that the Linux foundational underpinnings are constantly improving; I know that the Linux kernel team(s) work hard to stay up to date with the latest and greatest, things like new chipset hardware, multiprocessing (years ago), and now hypervisors (recently). But it still walks and talks like a penguin--as in, like a geek who knows how to use Emacs, can explain the big difference between '/var' and '/usr', and knows why we use 'init 3' to break out of the GUI. Evolution has come in the form of additional software libraries that run on top of this awkward, wave-bouncing boat that miraculously stays afloat without sinking.

4. The revised GUI applications are improving dramatically, but the classic software itself doesn't make sense. Look at text-mode Emacs. If you know how to use Emacs, skip this, but if you don't, look at it. Can you figure out how to use it just by looking at it? Poking at it? No? OK, try "man emacs", or google for help. What's that? Going to take you days to get started? Okay, then, let me know when you can start being productive. I expect to hear from you next week. Hopefully that's not too soon.

5. Without authoritative decision makers, you end up with chaos. And that's exactly what Linux is--organized chaos. Or is it chaotic organization? The stuff "just works", when it does. But wow, is it a mess of gobbledygook, with no one to account for the mess that it is. With the old Macintosh System 6 / 7 (again, I don't know what OS X looks like), applications were cleanly organized into a self-dependent, fully consolidated application file. Preferences were dropped into a Preferences directory. In Windows, applications are cleanly tucked into "Program Files\Company\App Name" or "Program Files\App Name". Shared libraries are pretty much always .dll's, and DLLs are always either 'C'-invokable libraries (which may or may not be COM-invokable) or CLR assemblies. App settings go into a pseudo file system created just for configurations--the Windows registry--and follow standard path conventions like HKCU\Software\[App]\@setting=value. And when software is installed, it must register itself as an installed app and be uninstallable from "Add/Remove Programs" (or "Programs and Features" in Vista). Linux does have RPM, but the dependency trees, as well as the potential corruption thereof, are too painful to deal with.

A clean design starts with a designer, and consistency with a clean design depends on an authority figure who can sign off on it. With Linux, which has no one designer, you have all sorts of files in all kinds of different "languages" (.o, .class, .pl, .bleah), no file typing ("chmod +x bleah"), and you can put /any/thing/any/where/.and/yet/it/will/make/sense/to/some/geek. Everything that works, works for the people who established it and for the people who learned it well enough to build dependencies upon it. But the fact that any one thing might work fine doesn't change the fact that the sum of all its parts is more akin to a zoo than an organization, which makes it extremely difficult for the average person to adopt.
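The "no file typing" gripe is easy to demonstrate: nothing about a Unix file's name or contents marks it as a program; only a permission bit does. A minimal sketch (the file name "bleah" is just this article's running example):

```shell
# Write a tiny script. Its name carries no extension, and the file
# system records no "type" for it; it is just bytes in a file.
printf '#!/bin/sh\necho hello from bleah\n' > bleah

./bleah 2>/dev/null || echo "not executable yet"

chmod +x bleah   # flip the execute permission bit; nothing else changes
./bleah          # prints: hello from bleah

rm bleah
```

The same bytes go from "Permission denied" to a running program with no change to the file's name or contents, which is the whole point: the "type" lives in a mode bit only a geek would think to look for.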

6. The GUI subsystem (X Window) is hardly performant, and it's erratic. Macintosh rebuilt its GUI subsystem on PDF technology for crisp anti-aliasing, yet its down-to-the-metal optimizations make it a clean, fast, and beautiful environment. Microsoft Windows Vista's Aero subsystem takes it to the next level and channels everything through Direct3D, taking advantage of video card optimizations to make the user experience very smooth and responsive. But Linux? From what I can tell, it still pushes everything through the X11 client/server protocol, which was designed for the network and is one of the slowest (if most versatile) communications channels on an operating system. The advantage is that you can redirect windowing instructions to another machine (like you can with Windows' Remote Desktop), even through SSH; the downside is that you have limited performance, and you have to minimize the instructions and make the instruction set "smarter" to do things like OpenGL or other graphics-intensive work. In the end, whenever I use Linux locally I feel like I'm using Remote Desktop. The mouse is slow and erratic, and everything feels a few milliseconds behind me. Everything seems bound by elastic bands. Whereas when I'm in Windows doing the same things, everything is very responsive; the mouse, in particular, feels flawless and perfectly tuned to my hardware and physical movements. (The Windows kernel seems to manage the mouse in an isolated, high-priority system thread, separate from everything else, which is why no matter how slow other things are, it is always responsive and true to physical movement.) And I don't suppose I can ever dream of getting VMWare to resize the client OS screen resolution at runtime without unloading X Window and restarting it.
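The network heritage shows up directly in how X clients find their server: the DISPLAY environment variable is a network address, not a device handle. A small illustration (the values shown are typical conventions, not pulled from any particular system):

```shell
# DISPLAY has the form host:display.screen. An empty host means
# "use a local socket"; `ssh -X` conventionally exports something
# like localhost:10.0 on the remote end to tunnel X11 over SSH.
DISPLAY="localhost:10.0"
host=${DISPLAY%%:*}
rest=${DISPLAY#*:}
echo "X server host:  ${host:-<local socket>}"
echo "display.screen: $rest"
```

Every window, in other words, is addressed the way you'd address a remote machine, even when the "remote machine" is six inches from your keyboard.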

But there are some things I like about Linux.

I like that there is a "clean" command-line environment where administration can be performed without a GUI. I still miss Windows 95/98 being able to "boot into DOS mode", switching straight to the MS-DOS 7 on top of which Windows 95 was still built. Windows NT/2000/XP/2003/Vista forces you to have a useless mouse in your face even if you boot into "Safe Mode with Command Prompt".

I like the support for significant development languages and tools, like Java and Python. I can learn such things in Windows and deploy to Linux. (Why I would want to, if it runs in Windows, I don't know, but some of the stuff I have to support is built on that crazy zoo of a foundation that Linux is, such as Perl and Apache and lots of Linux-specific add-on modules that get dropped into those weird directories.)

I like the geek love that Linux enjoys. Windows doesn't get that love; it only gets hate, from the Linux lovers. Then again, the local user's group for Microsoft technologies definitely enjoys some Windows and .NET love, so never mind.

I like the network orientation of Linux. It's not the answer, but it does set a precedent.

I like the artistic contributions for the more recent UI elements, like GNOME. I think the fonts are lame, but more for inconsistent display behavior between windows and programs than for the font designs.

But I can totally live without those things. I'd much rather live without those things than have to put up with the mess that Linux is. Let's face it, Linux isn't the open-source answer for a sensible operating system.

The ReactOS project isn't the answer, either. If we want Windows, we'll get Windows. We don't need an open-source look-alike of Windows. What this is about is the need for an innovative and original open-source operating system that is not bound by legacy constraints.

Be OS came ever so close, if only it were still available and open source! But Haiku OS (not Linux-based, and inspired by Be OS) seems REALLY interesting. Could it be the answer??

With virtualization now commonplace thanks to VMWare, Virtual PC, Parallels, and Xen, I can't help but wonder why people in the geek community haven't already gone back to the drawing board to rethink operating systems.

Microsoft gave me a glimmer of hope with their Singularity project. Singularity is a fresh, from-scratch operating system that Microsoft built using a little bit of assembly and C/C++ but mostly a variant of C#, all the way down to the metal. It completely drops all ties to legacy support and instead focuses on the future. It's so oriented around cheap, isolated processes that hypervisors become a moot concept. It is Microsoft's opportunity to take two decades of lessons learned with Windows and prototype an OS with no legacy constraints and those lessons applied. And although it's not tuned for raw performance (which is why assembly and C++ lovers will inevitably hate the Singularity project idea), it's still performant, and it runs circles around modern OSes when it comes to inter-process communication.

Sadly, the Singularity project isn't open-source. That is, Microsoft Research did "open" it up to a few professors in a few select universities, but this isn't an open source operating system intended to be consumed by the general global geek community.

This is why I'm getting a bit excited about the idea that maybe it's time for the geeks to get a clue. VMWare has debugger support down to the hardware level, and even rewind-and-replay support. The tools are ripe for dreaming up, building, and playing with a whole new operating system, one that is sensibly designed, responsibly organized, and yet open to community contributions. This would be an opportunity for people to learn how to organize and delegate community representatives, to provide the world a fresh, clean, newly designed operating system that is free to use, freely consumable, free to break apart, free to extend, yet having a single, organized, moderated, carefully planned public "distro", all the way down to the metal, with no legacy constraints and sights set on the future. If the open-source community really wants to compete with Microsoft, it should pay attention to what Microsoft is doing besides duplicating Windows Explorer with Nautilus, and look at how Microsoft is constantly rethinking and refactoring its core design, from the kernel (dropping Win9x for WinNT) to the directory structures to the windowing subsystem being channeled straight through Direct3D (with Aero).

Believe it: you can have your GNU and your OSI (Open Source Initiative), and lose your *nix. It's okay if you do. I'll be cheering you on, even if I'm the only one in the stands doing the wave. But not until we get away from usr/var/etc, and stop taking geeky, old-school, antiquated foundations for granted. They're expensive, and they're not worth keeping around.

Rediscover The Web With StumbledUpon

by Jon Davis 6. August 2007 02:44

It would seem I have not been doing my homework for a couple of years -- the homework of surfing the trash and trinkets scattered about the web. In one day, though, I crammed into my mental digestive system all the Internet junk food I'll need for the next several weeks.

I did it by going to StumbleUpon, letting the toolbar install itself after I registered, and then just kept clicking the Stumble button.

What a beautiful way to turn a completely unproductive and boring day into a completely unproductive but light-heartedly entertaining one.

Of course, my day was not complete without firing up Guild Wars and Lord of the Rings Online for two or three hours.



About the author

Jon Davis (aka "stimpy77") has been a programmer, developer, and consultant for web and Windows software solutions professionally since 1997, with experience ranging from OS and hardware support to DHTML programming to IIS/ASP web apps to Java network programming to Visual Basic applications to C# desktop apps.
Software in all forms is also his sole hobby, whether playing PC games or tinkering with programming them. "I was playing Defender on the Commodore 64," he reminisces, "when I decided at the age of 12 or so that I want to be a computer programmer when I grow up."

Jon was previously employed as a senior .NET developer at a very well-known Internet services company whom you're more likely than not to have directly done business with. However, this blog and all of its content have no affiliation with, and are not representative of, his former employer in any way.
