Nine Reasons Why 8GB Is Only Just Enough (For A Professional Business Software Developer)

by Jon Davis 13. February 2009 21:29

Today I installed 8GB on my home workstation/playstation. I already had 8GB lying around from a voluntary purchase at a prior workplace (I took my own RAM back and reinstalled the work-provided RAM before I left that job), but that brand of RAM didn't work correctly on my home PC's motherboard. It's all good now, though: with some high-quality performance RAM from OCZ, my Windows 7 system's self-rating on RAM I/O jumped from 5.9 to 7.2.

At my new job I had to request a RAM upgrade from 2GB to 4GB. (Since it's 32-bit XP, I couldn't go any higher.) When I asked when 64-bit Windows Vista or Windows 7 would be put on the table as an option for employees, I was told "there are no plans for 64-bit."

The same thing happened at my last short-term gig. Good God, corporate IT folks everywhere are stuck in the year 2002. I can barely function at 4GB, and I can't function much at all at 2GB.


If you are a multi-role developer and aren't already saturating at least 4GB of RAM, you are throwing away your employer's money; and if you are in IT and not providing developers with at least 4GB of RAM, and actively working on adding corporate support for 64-bit workstations, you are costing the company a ton of money in lost productivity! I don't know how many times I've seen people restart their computers, or sit and wait two minutes for Visual Studio to come up, because their machine is bogged down in a swap file. That was "typical" half a decade ago, but it's not acceptable anymore. The same is true of hard drive space. Fast 1-terabyte hard drives are available for less than $100 these days; there is simply no excuse. For any employee who makes more than X (say, $25,000), for Pete's sake, throw in an extra $1,000-$2,000 and get the employee two large (24-inch) monitors, at least 1TB of hard drive space (ideally four drives in a RAID 0+1 array), 64-bit Windows Server 2008 / Windows Vista / Windows 7, a quad-core CPU, and 8GB of high-performance (800+ MHz) RAM. It's not that that's another $2,000 to lose; it's that a mere $2,000 will save you many thousands more. By quadrupling the performance of your employee's system, you'd effectively double the productivity of your employee; it's like getting a new employee for free. And if you are the employee, making double of X (say, more than $50,000), and your employer could somehow allow it (and they should; shame on them if they don't and they won't do it themselves), you should go out and buy your own hardware upgrades. Make yourself twice as productive, and earn your pay with pride.

In a business environment, whether one is paid by the hour or salaried (already expected to work X hours a week, which loosely translates to hourly anyway), time = money. Period. This is not about developers enjoying a luxury; it's about them saving time and employers saving money.

Note to the morons who argue "this is why developers are writing big, bloated software that sucks up resources": Dear moron, this post is written from the perspective of an actual developer's workstation, not a mere bit-twiddling programmer's. A developer, that is, who wears many hats and must not just write code but manage database details, work with project plans, document technical details, electronically collaborate with teammates, test and debug, etc., all in one sitting. Nothing here recommends, or even contributes to, writing big, bloated software for an end user. The objective is productivity; your skills as a programmer are a separate concern. If you are producing bad, bloated code, the quality of the machine on which you wrote it has little to nothing to do with that. On the contrary, a poor developer system can lead to extremely shoddy code, because the time and patience required just to refactor and re-test become such a huge burden. If you really want to test your code on a limited machine, you can rig VMware / Virtual PC / VirtualBox to temporarily run with less RAM, etc. You shouldn't have to punish yourself with poor productivity while you are creating the output. Such punishment is far more monetarily expensive than the cost of RAM.

I can think of a lot of reasons for 8+ GB RAM, but I’ll name a handful that matter most to me.

  1. Windows XP / Server 2003 alone takes up half a gigabyte of RAM (Vista / Server 2008 takes up double that). Scheduled tasks and other processes cause the OS to peak out at some 50+% more. Cost: 512-850MB. Subtotal @nominal: ~512MB; @peak: 850MB
  2. IIS isn’t a huge hog but it’s a big system service with a lot of responsibility. Cost: 50-150. Subtotal @nominal: ~550MB; @peak 1GB.
  3. Microsoft Office and other productivity applications often need to be used more than one at a time. For more than two decades, modern computers have supported a marvelous feature called multi-tasking. This means that if you have Outlook open, and you double-click a Microsoft Word attachment, and upon reading it you realize that you need to update your Excel spreadsheet, which in turn has you updating an Access database, and then you realize that these updates result in a change of product features that you need to reflect in your PowerPoint presentation, you should have been able to open each of these applications without missing a beat, and by the time you're done you should be able to close all these apps in no more than one passing second per click of each app's [X] close button. Each of these apps takes up as much as 100MB of RAM, Outlook typically even more, and Outlook is typically always open. Cost: 150MB-1GB. Subtotal @nominal: 700MB; @peak: 2GB.
  4. Every business software developer should have his own copy of SQL Server Developer Edition. Every instance of SQL Server Developer Edition takes up a good 25MB to 150MB of RAM just for the core services, multiplied by each of the support services. Meanwhile, Visual Studio 2008 Pro and Team Edition come with SQL Server 2005 Express Edition, not 2008, so for some of us that means two installations of SQL Server Express. Both SQL Server Developer Edition and SQL Server Express Edition are ideal to have on the same machine since Express doesn’t have all the features of Developer and Developer doesn’t have the flat-file support that is available in Express. SQL Server sitting idly costs a LOT of CPU, so quad core is quite ideal. Cost: @nominal: 150MB, @peak 512MB. Subtotal @nominal: 850MB; @peak: 2.5GB. We haven’t even hit Visual Studio yet.
  5. Except in actual Database projects (not to be confused with code projects that happen to have database support), any serious developer would use SQL Server Management Studio, not Visual Studio, to access database data and to work with T-SQL tasks. This would be run alongside Visual Studio, but nonetheless as a separate application. Cost: 250MB. Subtotal @nominal: 1.1GB; @peak: 2.75GB.
  6. Visual Studio itself takes the cake. With ReSharper and other popular add-ins like PowerCommands installed, a freshly started Visual Studio takes up half a gig of RAM per instance. Add another 250MB for a typical medium-size solution. And if you, like me lately, work in multiple branches and find yourself having to edit several branches for different reasons, you shouldn't have to close out of Visual Studio to open the next branch. That's productivity thrown away. This week I was working with three branches; that's three instances. Sample scenario: I'm coding away on my sandbox branch, then a bug ticket comes in and I have to edit the QA/production branch in an isolated instance of Visual Studio for a quick fix, then I get an IM from someone requesting an immediate resolution to something in the developer branch. Luckily I didn't open a fourth instance. Eventually I can close the latter two instances down and continue with my sandbox environment. Case in point: Visual Studio costs a LOT of RAM. Cost @nominal: 512MB; @peak: 2.25GB. Subtotal @nominal: 1.6GB; @peak: 5GB.
  7. Your app being developed takes up RAM. This could be any amount, but don’t forget that Visual Studio instantiates independent web servers and loads up bloated binaries for debugging. If there are lots of services and support apps involved, they all stack up fast. Cost @nominal: 50MB, @peak 750MB. Subtotal @nominal: 1.65GB; @peak: 5.75GB.
  8. Internet Explorer and/or your other web browsers take up plenty of RAM. Typically 75MB for IE to be loaded, plus 10-15MB per page/tab. And if you’re anything like me, you’ll have lots and lots and LOTS of pages/tabs by the end of the day; by noon I typically end up with about four or five separate IE windows/processes, each with 5-15 tabs. (Mind you, all or at least most of them are work-related windows, such as looking up internal/corporate documents on the intranet or tracking down developer documentation such as API specs, blogs, and forum posts.) Cost @nominal: 100MB; @peak: 512MB. Subtotal @nominal: 1.75GB; @peak: 6.5GB.
  9. No software solution should go untested on as many platforms as are going to be used in production. If it's a web site, it should be tested on IE 6, IE 7, and IE 8, as well as the current version of Opera, Safari 3+, Firefox 1.5, Firefox 2, and Firefox 3+. If it's a desktop app, it should be tested on every compatible version of the OS. If it's a cross-platform compiled app, it should be tested on Windows, Mac, and Linux. You could have an isolated set of computers and/or QA staff to look into all these scenarios, but when it comes to company time and productivity, the developer should test first, and he should test right on his own computer. He should not have to shut down to dual-boot. He should be using VMware (or Virtual PC, or VirtualBox, etc.). Each VMware instance takes up the RAM and CPU of a normal system installation; I can't comprehend why some people think a VMware image should only take up a few GB of hard drive space and half a gig of RAM; it just doesn't work that way. Also, in a distributed software solution with multiple servers involved, firing up multiple instances of VMware for testing and debugging should be mandatory. Cost @nominal: 512MB; @peak: 4GB. Subtotal @nominal: 2.25GB; @peak: 10.5GB.
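For what it's worth, the peak figures above tally up in a few lines (values in MB, pulled from the list; my rounding differs slightly from the running subtotals):

```python
# Peak RAM cost per item from the list above, in MB (approximate).
peak_mb = {
    "OS (XP/2003, peaked)": 850,
    "IIS": 150,
    "Office apps": 1024,
    "SQL Server editions": 512,
    "SQL Server Management Studio": 250,
    "Visual Studio, three instances": 2304,
    "app under development": 750,
    "web browsers": 512,
    "VMware test images": 4096,
}

total_gb = sum(peak_mb.values()) / 1024
print(round(total_gb, 1))  # roughly 10.2GB, before 64-bit Vista's extra overhead
```

Add roughly a gigabyte for a 64-bit OS itself and you land at the 11+GB total.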

Total peak memory (with 64-bit Vista SP1, which was not accounted for in #1): 11+GB!!!

Now, you could argue all day long that you can “save money” by shutting down all those “peak” processes to use less RAM rather than using so much. I’d argue all day long that you are freaking insane. The 8GB I bought for my PC cost me $130 from Dell. Buy, insert, test, save money. Don’t be stupid and wasteful. Make yourself productive.

Cascading .NET Versioning In The Path

by Jon Davis 16. December 2007 15:21

The .NET Framework v3.0 and v3.5 both broke some versioning rules, and I suppose there was good reason. Version 3.0 was just version 2.0 plus a few new DLLs for WPF, WCF, WF, and CardSpace. But those new technologies were huge, huge enough to give the .NET Framework a new version number. Meanwhile, though, the C# language and .NET Framework 3.5 broke consistency again, in a way that was unnecessary, and with double the damage. C# 3.0 runs on .NET 3.5; that is way too confusing. C# 3.0 compiles to the .NET 2.0 CLR; that is way too confusing, too. .NET Framework 3.5 builds on top of .NET Framework 3.0 and 2.0. That would make sense, except that v3.0 and v2.0 are sort of separate entities.

If you open up C:\Windows\Microsoft.NET\Framework, you'll find each of these .NET Framework versions deployed in its own directory:

08/29/2007  12:30 AM    <DIR>          v1.0.3705
08/29/2007  01:01 AM    <DIR>          v1.1.4322
12/15/2007  03:13 PM    <DIR>          v2.0.50727
11/02/2006  08:15 AM    <DIR>          v3.0
11/19/2007  09:49 PM    <DIR>          v3.5
08/28/2007  05:22 PM    <DIR>          VJSharp

Incidentally, notice that v3.0 and v3.5 don't have build numbers, while the others do. Refer back to my rant about inconsistency. (Personally, I never liked the build numbers being there anyway.) Meanwhile, VJSharp is completely unversioned.

Then there are tons of utility files that do not belong in this directory at all, and Microsoft allowed a mess to be made in here:

08/29/2007  05:39 AM    <DIR>          1028
08/29/2007  06:58 AM    <DIR>          1030
08/29/2007  02:57 AM    <DIR>          1031
08/29/2007  03:10 AM    <DIR>          1035
08/29/2007  04:13 AM    <DIR>          1036
08/29/2007  06:05 AM    <DIR>          1040
08/29/2007  03:55 AM    <DIR>          1042
08/29/2007  07:27 AM    <DIR>          1043
08/29/2007  06:31 AM    <DIR>          1044
08/29/2007  05:16 AM    <DIR>          1046
08/29/2007  03:39 AM    <DIR>          1049
08/29/2007  03:23 AM    <DIR>          1053
08/29/2007  04:33 AM    <DIR>          2052
08/29/2007  04:53 AM    <DIR>          3082
11/01/2006  11:33 PM            72,704 NETFXSBS10.exe
02/20/2003  06:44 PM            36,354 NETFXSBS10.hkf
09/18/2006  02:32 PM            41,392 netfxsbs12.hkf
11/19/2007  09:09 PM            16,896 sbscmp10.dll
11/19/2007  09:09 PM            16,896 sbscmp20_mscorwks.dll
11/19/2007  09:09 PM            16,896 sbscmp20_perfcounter.dll
11/01/2006  11:33 PM             5,120 sbs_diasymreader.dll
11/01/2006  11:33 PM             5,120 sbs_iehost.dll
11/01/2006  11:33 PM             5,120 sbs_microsoft.jscript.dll
11/01/2006  11:33 PM             5,632 sbs_microsoft.vsa.vb.codedomprocessor.dll
11/01/2006  11:33 PM             5,120 sbs_mscordbi.dll
11/01/2006  11:33 PM             5,120 sbs_mscorrc.dll
11/01/2006  11:33 PM             5,120 sbs_mscorsec.dll
11/01/2006  11:33 PM             5,120 sbs_system.configuration.install.dll
11/01/2006  11:33 PM             5,120
11/01/2006  11:33 PM             5,120 sbs_system.enterpriseservices.dll
11/01/2006  11:33 PM             5,120 sbs_VsaVb7rt.dll
11/01/2006  11:33 PM             5,120 sbs_wminet_utils.dll
11/19/2007  09:09 PM            16,896 SharedReg12.dll

Now we're one step closer to looking like the infamous Windows Registry. Each Microsoft employee or department gets to put his own file or entry wherever he wants, see? I wonder how many Microsoft employees have decided to move their office desks to the middle of the front lobby, besides the nice front desk lady(ies).

Anyway, the reason I posted this is that, ironically, the .NET Framework does not appear in the PATH, for some strange reason. I've always had to manually add a "DOTNET" environment variable pointing to C:\Windows\Microsoft.NET\Framework\v1.1.4322 or ...\v2.0.50727, then add %DOTNET% to my PATH. I used this regularly for tools like my own AssemblyLister so I could easily pre-JIT my assemblies for faster boot time (even if the trade-off is slower runtime performance). But this broke with v3.0, and it is still broken in v3.5, because there is no CLR-and-Framework directory that runs .NET Framework v3.5. .NET Framework v3.5, like v3.0, is just some add-on assemblies on top of v2.0. But if I were to reference only the v2.0 directory, I would only have the v2.0 framework (plus the GAC, which, fortunately, contains the v3.0 and v3.5 DLL assemblies, but not their utilities).

Fortunately, you can cascade the .NET versions in the PATH. I don't know why I didn't do this a long time ago, but the PATH environment variable, which is a semi-colon delimited list of directories in which to search for files in the file system shell without referencing their complete paths, is already a cascading list. In other words, when searching for a file using the PATH, the first directory listed is scanned first, then the second, and so on.
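The cascading lookup amounts to a first-match-wins scan. A minimal sketch (Python purely for illustration; the resolve function and its name are mine, not any shell's real API):

```python
import os

# Sketch of how a shell resolves a file name against PATH:
# scan each directory in listed order and return the first match.
def resolve(filename, path_string):
    for directory in path_string.split(os.pathsep):
        candidate = os.path.join(directory.strip(), filename)
        if os.path.isfile(candidate):
            return candidate  # first directory listed wins
    return None
```

Put the v3.5 directory ahead of v2.0.50727 in the search string, and any file present in both resolves to the v3.5 copy.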

One thing I like about .NET Framework versioning is a commitment Microsoft made back when they were working on v2: the .NET Framework will try to be forwards-compatible and will always be backwards-compatible with assemblies targeting different versions of the CLR. This means that a v1.1 assembly can run in a v2.0 CLR, and a v2.0 assembly might be able to run in a v1.1 CLR. In the end, it's just MSIL that the CLR compiles down to machine code at runtime (unless it's pre-JITted).

So as long as you're using an environment (such as cmd.exe or PowerShell) that splits the PATH string into multiple directories, recursively expands any %variables% it finds, and scans the directories one by one in order, you can effectively use a single environment variable to reference all of your .NET Framework directories at once, with the newer .NET Framework files taking priority over the older ones. To do this, just add each .NET Framework version directory, starting with v3.5, then v3.0, then v2.0, then v1.1 (if v1.1 is installed), to a DOTNET environment variable, and then add %DOTNET% to your PATH.

  • DOTNET = C:\Windows\Microsoft.NET\Framework\v3.5; C:\Windows\Microsoft.NET\Framework\v3.0; C:\Windows\Microsoft.NET\Framework\v2.0.50727; C:\Windows\Microsoft.NET\Framework\v1.1.4322; C:\Windows\Microsoft.NET\Framework\v1.0.3705
  • PATH (virtually on %PATH%, not literally) = %PATH%;%DOTNET% 

By "virtually on %PATH%" above, all I mean is that you would replace %PATH% with its current value and then append ";%DOTNET%".
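The same assembly of the variable can be sketched programmatically (Python purely for illustration; build_dotnet_var is a made-up helper, and the paths are the standard install locations listed above):

```python
import os

# The framework directories, newest first, as listed earlier in this post.
FRAMEWORK_DIRS = [
    r"C:\Windows\Microsoft.NET\Framework\v3.5",
    r"C:\Windows\Microsoft.NET\Framework\v3.0",
    r"C:\Windows\Microsoft.NET\Framework\v2.0.50727",
    r"C:\Windows\Microsoft.NET\Framework\v1.1.4322",
    r"C:\Windows\Microsoft.NET\Framework\v1.0.3705",
]

def build_dotnet_var(dirs, must_exist=True):
    """Join the directories into a semicolon-delimited value suitable for
    a DOTNET environment variable, optionally skipping versions that
    aren't actually installed on this machine."""
    chosen = [d for d in dirs if os.path.isdir(d)] if must_exist else list(dirs)
    return ";".join(chosen)
```

Because the join preserves order, the newest framework always wins the PATH scan.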

If I wanted to use the 64-bit versions of the .NET Framework, I could do the same using C:\Windows\Microsoft.NET\Framework64\, while also adding the x86 paths for compatibility.

  • DOTNETX64 =
    C:\Windows\Microsoft.NET\Framework64\v3.5; C:\Windows\Microsoft.NET\Framework64\v3.0; C:\Windows\Microsoft.NET\Framework64\v2.0.50727; C:\Windows\Microsoft.NET\Framework\v3.5; C:\Windows\Microsoft.NET\Framework\v3.0; C:\Windows\Microsoft.NET\Framework\v2.0.50727; C:\Windows\Microsoft.NET\Framework\v1.1.4322; C:\Windows\Microsoft.NET\Framework\v1.0.3705
    - or -

    C:\Windows\Microsoft.NET\Framework64\v3.5; C:\Windows\Microsoft.NET\Framework64\v3.0; C:\Windows\Microsoft.NET\Framework64\v2.0.50727; %DOTNET%

Unfortunately, though, this brings another issue into the mix. Which one do you want in your PATH, %DOTNET% or %DOTNETX64%? This is an important question. I do ASP.NET development in 32-bit, and I debug my apps in explicit x86 CPU build mode (because Visual Studio doesn't let me perform edit-and-continue in 64-bit, which is the default environment for the "Any CPU" build mode). But without compiling to 64-bit, CLR assemblies can hit the 32-bit address space ceiling and quickly run out of memory (an OutOfMemoryException or something similar); this has already happened to me while building a search engine server based on Lucene.NET.

To be honest, I'm still not sure. I'm going to try using the 64-bit path variable (%DOTNETX64%) for now and see if it brings me any problems. I think Windows is defaulting to that one already. Meanwhile, though, I can still continue to target x86 CPUs in my builds.

So to test this out, I try the "where" command in Vista to see the cascading effect. (Note that any time you change the PATH environment variable, or any environment variable, you must close the command shell window and restart it before the variables will propagate. They will not propagate into the GUI shell until you log out and log back in.)

C:\Users\Jon>where ngen

So it seems to work, which is nice.

Meanwhile, I still need an environment variable pointing to the single directory for my primary CLR, which contains ngen.exe and csc.exe and other CLR essentials.

  • CLRDIR = C:\Windows\Microsoft.NET\Framework64\v2.0.50727

There's one last change I need to make, though. It might make more sense to change the environment variable %DOTNET% to %NETFXX86%, and then change %DOTNETX64% to %NETFX%. This way, apps that target a %NETFX% environment variable can properly target the environment's complete and CPU-targeted environment rather than just focus solely on x86 compatibility.

So, here's what I have in the end:

  • NETFXX86 =
  • NETFXX64 =
  • NETFX = %NETFXX64%; %NETFXX86%
  • CLRDIR = C:\Windows\Microsoft.NET\Framework64\v2.0.50727
  • PATH =
    C:\Program Files\Microsoft SDKs\Windows\v6.0A\Bin; %SystemRoot%\system32; %SystemRoot%; %SystemRoot%\System32\Wbem; %SYSTEMROOT%\System32\WindowsPowerShell\v1.0\; C:\Program Files (x86)\Microsoft SQL Server\90\Tools\binn\; C:\Program Files (x86)\Microsoft SQL Server\80\Tools\Binn\; C:\Program Files\Microsoft SQL Server\90\DTS\Binn\; C:\Program Files\Microsoft SQL Server\90\Tools\binn\; C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\; C:\Program Files (x86)\Microsoft SQL Server\90\Tools\Binn\VSShell\Common7\IDE\; C:\Program Files (x86)\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies\; C:\Program Files (x86)\QuickTime\QTSystem\; C:\Program Files (x86)\Common Files\Adobe\AGL; C:\Windows\SUA\common\; C:\Windows\SUA\usr\lib\; C:\Program Files\Microsoft Network Monitor 3\; E:\filez\dev\IronPython-1.1; C:\Program Files (x86)\GnuWin32\bin; C:\ruby\bin; C:\MinGW\bin; C:\cygwin\bin;

Now I can use "native" .NET apps like csc.exe or ngen.exe, and have all tools and assemblies on hand, without manually loading the SDK command shell.

I created an EXE to auto-configure this: SetNetFxEnvVars.exe [source] Note that it will require an immediate reboot. Note also that if you're using Internet Explorer, in order to run the EXE you must download it and then unblock it (right-click it, choose Properties, then click Unblock).



Pimping Out My Satellite

by Jon Davis 24. August 2007 00:10

I decided to buy a laptop to replace the cheap one I had bought at Wal-Mart about ten months ago. It was a rare (read: unpopular) Acer laptop, just a cheap $700 thing (it could sell now for $500-600 new), and I upgraded the RAM and hard drive, but my biggest problem with it was that the screen size and resolution were just too small and weak. Then the 'O' key fell off the keyboard while I was typing. It's still under warranty and is out getting its free repair; it took me several months to get around to sending it off because it was still usable. I prefer to use a laptop for everyday home use (even though I've got a couple of desktop machines sitting around at home, I like using the laptop while the TV is in view, not to mention the obvious need to take it with me on the road and to the office). With that 'O' key missing, I was still able to type by pressing into the little hole there, which is why it took so long for me to send it off for repair, but it was awkward enough that I hardly used the laptop at all.

Due to some handy alignment of the moons and stars, credit, and cash, I had an almost unlimited budget for a replacement, on the expectation that I would sell my obsoleted laptop along with some other stuff. I definitely wanted to spend four digits on an upgrade, and I wanted something I could depend on more regularly: performant, with lots of memory and plenty of hard drive space. Since I'm a gamer, I was also looking for DirectX 10 support. I was looking at the revised XPS line of Dell laptops, namely the Dell XPS M1330, but as handy as ultraportables are, I like big keyboards, and I especially wanted a high-resolution display (in the 1600-pixel range for width). The Dell XPS M1710 looked too similar to (or was the same model as) a co-worker's machine (I like to be somewhat unique), and the Dell XPS M2010 isn't really a laptop, nor even a "portable desktop replacement," so much as a full-blown desktop PC with a handle.

There's a Fry's Electronics not far from where I work, so at the end of the day yesterday (Wednesday) I went over there to see if they had anything interesting. I wasn't impressed; everything they have (everything almost everyone has, it seems) is either expensive, useless mini-gizmo gadgetry like the Sony UX Micro PC or just a bunch of cheap low-end consumer stuff in the $500-900 range. But out of three or four aisles and forty or so laptop models, they did have five or six mid-range to high-end consumer PCs. I wasn't impressed with those, either, except that one of them kept drawing me back. It was a Toshiba Satellite X205-S9359, selling at about $500-1000 over my planned price range, but the more I looked at it the more I felt compelled to consider it. Besides looking absolutely stunning on its own, the bulleted list of features on the display decals had me raising my eyebrows:

  • 1680 x 1050 resolution WSXGA TruBrite display @ 17 inches
    • this is perfect; Dell's high-end laptops have had even greater resolutions but just too small, making me squint
  • Intel Core 2 Duo processor (T7300)
  • GeForce 8700M GT (DirectX 10 compatible) with 512MB VRAM
  • 320GB hard drive space (two drives)
  • 1Gb/s (gigabit) LAN
  • .. and some other, rather expensive stuff I didn't care about like ..
    • HD DVD-ROM
    • USB HD TV Tuner
    • 4 Harman Kardon Speakers with subwoofer
    • Dolby Home Theater technology
    • Built-in webcam
    • HDMI output
    • firewire / IEEE 1394
    • fingerprint scanner
    • 2GB RAM
    • Bluetooth

Since I already have an Xbox 360 with the HD-DVD add-on, and I have an external HD TV tuner that I bought at a recent CompUSA going out of business sale and I'm not using it, and I have an extra webcam lying around, and I knew I wanted to upgrade the RAM to 4GB which meant eBaying the 2GB, and I have no need for fingerprint security, it seemed to be an obvious waste of money. But I bought it anyway, because the processor, display resolution, future-readiness of the gaming graphics, and keyboard quality could not have been more perfect. Nobody else hit the nail on the head so perfectly from what I could tell, except for HP. I could've gone with HP. I didn't because this one was right here, I could put my hands on it, plus I could buy a RAM upgrade to 4GB while there at Fry's.

The hard drive speed is the only other spec that needed an upgrade. 5400 RPM is faster than 4200 RPM, but it's still slow, and I can feel it. Hitachi has a 200GB 7200 RPM drive, and this laptop supports two drives, so I bought two of those Hitachis today over the Internet. I'll eBay the 5400 RPM drives after the 7200 RPM drives arrive.

Since using all 4GB of the upgraded RAM requires 64-bit Windows Vista (on 32-bit Windows, device hardware address space claims the upper region of the 4GB range; the /3GB switch only changes the user/kernel split and is prone to running out of kernel resources), the new question becomes: does Toshiba support it? After all, at this point the only scenario in which 64-bit Windows Vista does not work for most people is a lack of hardware driver support. Fortunately, for the most part OEM hardware vendors have caught up with the demand for 64-bit drivers; this was not true just months ago, but it seems to be true now. Unfortunately, however, Toshiba is not among those vendors.

None of the drivers that Toshiba provides on their support web site for the Satellite X205-S9359 are even labeled as 32-bit, yet they are all essentially 32-bit. It's almost as if they're living in some kind of wacky dreamland where 64-bit operating systems don't exist, so there's no reason to differentiate the downloads. This is ironic, because the Toshiba hardware (Core 2 Duo) fully supports an x64 operating system, despite the lack of drivers.

On the other hand, many of the downloads that Toshiba's support web site provides are OEM software packages that are dual-format 32-bit & 64-bit. I was able to get the essentials installed, but the video card drivers (the most important drivers after the LAN driver) had to be obtained here: This made me uncomfortable, of course, as I posted here: Even so, I have full resolution with Aero support, and Lord of the Rings Online at maximum quality settings looks absolutely stunning.

Among the hardware devices on the laptop that I noticed are working with Vista 64-bit:

  • video adapter (using the drivers) and Direct3D support
  • LAN
  • audio
  • webcam (full software install worked)

 Among the drivers that wouldn't install or don't seem to be configured correctly in Vista 64-bit:

  • the fingerprint scanner / software, despite the same version being available in 64-bit format for purchase at the OEM manufacturer's web site
  • one or two of the Intel chipset driver installers
  • Bluetooth software wouldn't detect the hardware

Not yet tested:

  • HD-DVD playback
  • HDMI output
  • Wireless LAN
  • DirectX 10 functionality on the video card
  • TV tuner

 UPDATE: To follow-up, among the "not yet tested" list, these proved to work:

- Wireless LAN
- DirectX 10 functionality (using drivers)

.. and these proved not to work:

- HD-DVD playback
- TV tuner

I'll have to reinstall everything when my hard drive upgrades arrive, but so far the test run seems to be going successfully. There's a 15-day return policy at Fry's that I had expected to take advantage of, but I am feeling more and more confident that there will be no need to return it. On a side note, one additional disappointment is that there is NO media / restore disc provided with the laptop, so if you're not as self-sufficient as I am, with my MSDN access to Windows Ultimate and so on, you'll have to plan on going through hassle-channels rather than fetching a disc. The first thing I did was use the "back up my computer" function in Windows Vista so I can restore the system to its original configuration (I backed up to the second hard drive). But except for Windows itself, the software, including the bloatware, is available from Toshiba's web site.

With my basic (even if costly) customizations, this is by far the most expensive computer purchase I have ever made in my life. That said, though, it's also going to be the coolest and most versatile gaming and software development workstation PC I've ever had.



About the author

Jon Davis (aka "stimpy77") has been a programmer, developer, and consultant for web and Windows software solutions professionally since 1997, with experience ranging from OS and hardware support to DHTML programming to IIS/ASP web apps to Java network programming to Visual Basic applications to C# desktop apps.
Software in all forms is also his sole hobby, whether playing PC games or tinkering with programming them. "I was playing Defender on the Commodore 64," he reminisces, "when I decided at the age of 12 or so that I want to be a computer programmer when I grow up."

Jon was previously employed as a senior .NET developer at a very well-known Internet services company with whom you're more likely than not to have directly done business. However, this blog and all of its content have no affiliation with, and are not representative of, his former employer in any way.
