Technology Status Update 2016

by Jon Davis 10. July 2016 09:09

Hello, peephole. (people.) Just a little update. I've been keeping this blog online for some time, but my most recent entries have been so negative that I keep having to see that negativity every time I check to make sure the blog's up, lol. I'm tired of it, so I thought I'd post something positive.

My current job is one I hope to keep for years and years to come, and if that doesn't work out I'll be looking for one just like it and trying to keep that for years and years to come. I'm so done with contracting and consulting (except the occasional mentoring session on code mentor -dot- io). I'm still developing, of course, and as technology keeps changing, here's what's up as I see it.

  1. Azure is relevant. 
    The world really has shifted to the cloud, and the majority of companies are finally offloading their hosting to it. AWS, Azure, take your pick: everyone who hates Microsoft will obviously choose AWS, but Azure is the obvious choice for Microsoft-stack folks, and there is nothing meaningful AWS has that Azure doesn't at this point. The amount of stuff on Azure is sufficiently terrifying in quantity and supposed quality to give me a thrill. So I'm done with hating on Azure. After all their marketing and nagging and pushing, Microsoft has crossed a threshold of market saturation such that I am adequately impressed. I guess that means I have to be an Azure fan now, too. Fine. Yay Azure, woo. -.-
  2. ASP.NET is officially rebooted. 
    So I hear this thing called ASP.NET Core 1.0, formerly known as ASP.NET 5, formerly known as ASP.NET vNext, has RTM'd, and I hear it's, like, super duper important. It snuck by me, and I haven't mastered it, but I know it enough to know a few things (and to hazard the sketch after this list):
    • It's a total redux by means of redo. It's like the Star Trek reboot except it’s smaller and there are fewer planets it can manage, but it’s exactly like the Star Trek reboot in that it will probably implode yours.
    • If you've built your career on ASP.NET and you want to continue living on ASP.NET's laurels, now is not the time to master ASP.NET Core 1.0. Give it another year or two to mature.
    • If you're stuck on or otherwise fascinated by non-Microsoft operating systems, namely Mac and Linux, but you want to use the Microsoft programming stack, you absolutely must learn and master ASP.NET Core 1.0 and EF7.
    • If all you liked from ASP.NET Core 1.0 was the dynamic configs and build-time transpiles, you don't need ASP.NET Core for that LOL LOL ROFLMAO LOL LOL LOL *cough*
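    For the curious, here's a minimal sketch of what an ASP.NET Core 1.0 app looks like, as far as I can tell. Hedged: I haven't mastered this thing yet, so treat this as a sketch of the new hosting model, not gospel.

        using Microsoft.AspNetCore.Builder;
        using Microsoft.AspNetCore.Hosting;
        using Microsoft.AspNetCore.Http;

        public class Startup
        {
            // Configure() is where the request pipeline gets wired up.
            public void Configure(IApplicationBuilder app)
            {
                app.Run(async context =>
                    await context.Response.WriteAsync("Hello from ASP.NET Core 1.0"));
            }
        }

        public class Program
        {
            public static void Main(string[] args)
            {
                // Kestrel is the new cross-platform web server; this self-hosts the app.
                new WebHostBuilder()
                    .UseKestrel()
                    .UseStartup<Startup>()
                    .Build()
                    .Run();
            }
        }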
  3. The Aurelia JavaScript framework is nearly ready.
    Overall, JavaScript framework trends have stopped. Companies are building upon AngularJS 1.x. Everyone who's behind is talking about React as if it were new and suddenly relevant (it isn't new anymore). Everyone still implementing Knockout is out of the loop and will die off soon enough. jQuery is still ubiquitous and yet ignored as a thing, but meanwhile it just turned v3.

    I don't know what to think about things anymore. Angular 2.0 requires TypeScript; people hate TypeScript because they hate transpilers. People are still comparing TypeScript with CoffeeScript. People are dumb. If it wasn't for people I might like Angular 2.0, and for that matter I'd be all over AureliaJS, which is much nicer but just doesn't have Google as a titanic marketing arm. In the end, let's just get stuff done, guys. Build stuff. Don't worry about frameworks. Learn them all as you need them.
  4. Node.js is fading and yet slowly growing in relevance.
    Do you remember... oh heck, unless you're graying, probably not... anyway, do you remember back in the day when the dynamic Internet was first loosed on the public, when C/C++ and Perl were used to execute from cgi-bin, and if you wanted to add dynamic stuff to a web site you had to learn Perl, and maybe find Perl pearls and plop them into your own cgi-bin? Yeah, no, I never really learned Perl either, but I did notice the trend. And in the end, what did C/C++ and Perl mean to us up until the last decade? Answer: ubiquitous availability, but not web server functionality; just an ever-present availability for scripts, utilities, hacks, and whatever. That is where Node.js is headed. Node.js for anything web-related has become, and will continue to be, a gigantic mess of disorganized, everyone-is-equal, noisily integrated modules that sort of work but will never be as stable in built compositions as more carefully organized platforms. Frankly, I see Node.js being more relevant as a workstation runtime than a server runtime. Right now I'm looking at maybe poking at it in a TFS build environment, but not so much for hosting things.
    I will always have a bitter taste in my mouth from Node.js after trying to get socket.io integrated with Express and watching the whole thing just crumble, with no documentation or community help to resolve it. And this happened not just once on the job (never resolved before I walked away) but also during a code-mentor mentoring session (which we didn't figure out), even after a good year or so of platform maturity following the first instance. I still like Node.js but will no longer be trying to build a career on it.
  5. Pay close attention and learn up on Swagger aka OpenAPI. 
    Remember when -- oh wait, no, unless you're graying, nevermind -- anyway, once upon a time something called SOAP came out, and it brought with it a self-documentation feature: a combination of WSDL and some really handy generated HTML scaffolding, built into web services, that would let you manually test SOAP-based services by filling out a self-generated form. Well, now that JSON-based REST is the entirety of the playing field, we need the same self-documentation. That's where Swagger came in a couple of years ago, and everyone uses it now. Swagger needs some serious overhauling--someone needs to come up with a Swagger-compliant UI built on more modular and configurable components, for example--but as a drop-in self-documentation feature for REST services it fits the bill.
    • Swagger can be had on .NET using a lib called Swashbuckle. If you use OData, there is a lib called Swashbuckle.OData. We use it very, very heavily where I work. (I was the one who found it and brought it in.) "Make sure it shows up and works in Swagger" is a requirement for all REST/OData APIs we build now. (See the sketch just after this list.)
    • Swagger is now OpenAPI, but it's still Swagger; there are not yet any OpenAPI artifacts that I know of other than Swagger itself. Which is lame. Swagger is ugly. Featureful, but ugly, and non-modular.
    • Microsoft is listed as a contributing member of the OpenAPI committee, but I don't know what that means, and I don't see any generic output from OpenAPI yet. I'm worried that Microsoft will build a black box (rather than white box) Swagger-compliant alternative for ASP.NET Core.
    • There are other curious ones to pay attention to, but I don't see them as significantly supported by the .NET community yet (maybe I haven't looked hard enough).
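    To show how cheap the Swashbuckle drop-in is on classic ASP.NET Web API, here's a sketch from memory; the version and title strings are made up, and the config surface may vary by Swashbuckle version:

        using System.Web.Http;
        using Swashbuckle.Application;

        public static class SwaggerConfig
        {
            public static void Register()
            {
                // Self-documentation plus the interactive test UI at /swagger.
                GlobalConfiguration.Configuration
                    .EnableSwagger(c => c.SingleApiVersion("v1", "Acme REST API"))
                    .EnableSwaggerUi();
            }
        }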
  6. OData v4 has potential but is implementation-heavy and sorely needs a v5. 
    A lot of investments have been made in OData v4 as a web-based facade to Entity Framework data resources. It's the foundation of everything the team I'm with is working on, and I've learned to hate it. LOL. But I see its potential. I hope the investments continue, because it is sorely missing fundamental features like:
    • MS OData needs better navigation property filtering and security checking, whether by optionally redirecting navigation properties to EDM-mapped controller routes (yes, taking a performance hit) or some other means
    • MS OData '/$count' breaks when [ODataRoute] is declared, boo.
    • OData spec sorely needs "DISTINCT" feature
    • $select needs to be smarter about returning anonymous models and not just eliminating fields; if all you want is one field from a nested navigation property of a nested navigation property (the equivalent of LINQ's .Select(x => new { ID = x.ID, DesiredField2 = x.Child.Child2.DesiredField2 }), or in OData something like $expand=Child($expand=Child2($select=DesiredField2))), in the OData result set you will have to dive into an array and then into another array to find the one desired field
    • MS OData output serialization is very slow and CPU-heavy
    • Custom actions and functions, and making them exposed to Swagger via Swashbuckle.OData, make me want to pull my hair out. It sometimes takes two hours of screaming and choking people to set up a route in OData that would take me two minutes in Web API, and in the end I end up with a weirdly namespaced function name in the route like /OData/Widgets/Acme.GetCompositeThingmajig(4); there's no getting away from even the default namespace, and your EDM definition must be an EXACT match to what is clearly and obviously spelled out in the C# controller implementation, or you die. I mean, if Swashbuckle / Swashbuckle.OData can mostly figure it out without making us dress up in a weird Halloween costume, surely Microsoft's EDM generator should have been able to. (The sketch below shows the ceremony.)
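    To illustrate that ceremony, here is roughly what exposing one custom bound function takes in Web API OData v4. All the names are hypothetical stand-ins; this is a sketch from memory, not our production code:

        using System.Web.Http;
        using System.Web.OData;
        using System.Web.OData.Builder;
        using System.Web.OData.Extensions;
        using System.Web.OData.Routing;

        public class Widget
        {
            public int ID { get; set; }
        }

        public static class ODataFunctionCeremony
        {
            public static void Register(HttpConfiguration config)
            {
                var builder = new ODataConventionModelBuilder();
                builder.EntitySet<Widget>("Widgets");

                // EDM side: the name, parameters, and return type must agree
                // EXACTLY with the controller below, or nothing routes.
                var func = builder.EntityType<Widget>().Collection
                    .Function("GetCompositeThingmajig");
                func.Parameter<int>("id");
                func.Returns<string>();
                // The model's namespace ("Default" unless you set builder.Namespace,
                // e.g. to "Acme") gets baked into the URL; there's no escaping it.

                config.MapODataServiceRoute("odata", "OData", builder.GetEdmModel());
            }
        }

        public class WidgetsController : ODataController
        {
            // GET /OData/Widgets/Default.GetCompositeThingmajig(id=4)
            [HttpGet]
            [ODataRoute("Widgets/Default.GetCompositeThingmajig(id={id})")]
            public IHttpActionResult GetCompositeThingmajig(int id)
            {
                return Ok("composite thingmajig for " + id);
            }
        }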
  7. "Simple CRUD apps" vs "messaging-oriented DDD apps"
    has become the new Microsoft vs Linux or C# vs Java or SQL vs NoSQL. 

    The war is really ugly. Over the last two or three years, people have really been talking about how microservices and reaction-oriented software have turned the software industry upside down. Those who hop on the bandwagon neglect to learn when to choose simpler tooling chains for simple jobs; meanwhile, those who refuse to jump on the bandwagon use some really harsh, cruel words to describe the trend ("idiots", "morons", etc.). We need to learn to love and embrace all of these forms of software, allow them to grow us up, and know when to choose which pattern for which job.
    • Simple CRUD apps can still accomplish most business needs, making them preferable most of the time
      • .. but they don't scale well
      • .. and they require relatively little development knowledge to build and grow
    • Non-transactional message-oriented solutions, and related patterns like CQRS-ES, scale out well but scale developers' and testers' comprehension very poorly; they have an exponential complexity footprint. But for the thrill seekers they can be, frankly, hella fun and interesting, so long as they are not built upon ancient ESB systems like SAP and so long as people can integrate in software planning war rooms. (A minimal sketch of the shape follows this list.)
    • Disparate data sourcing, as with DDD with partial data replication, is a DBA's nightmare. DBAs will always hate it, their opinions will always be biased, and they will always be right in their own minds that going that route is wrong and foolish. They will sometimes be completely correct.
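    To make the vocabulary concrete, here is the smallest possible sketch of the messaging shape being argued over. The names are entirely hypothetical, and a real system would add a bus, persistence, and an event store:

        // A command expresses intent; an event records what happened.
        public class PlaceOrder  { public int OrderId; public decimal Total; }
        public class OrderPlaced { public int OrderId; public decimal Total; }

        public class OrderCommandHandler
        {
            // In CQRS the write side handles commands and publishes events.
            public OrderPlaced Handle(PlaceOrder command)
            {
                // ... validate, apply to the domain aggregate, persist the event ...
                return new OrderPlaced { OrderId = command.OrderId, Total = command.Total };
            }
        }

        public class OrderSummaryReadModel
        {
            // The read side is eventually consistent: it reacts to events after
            // the fact rather than inside the write transaction.
            public void When(OrderPlaced e) { /* update a denormalized summary row */ }
        }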

  8. Integrated functional unit tests are more valuable than TDD-style purist unit tests. That's my new conclusion about developer testing in 2016. The purist TDD mindset still has a role in the software developer's life, but there is real value in automated integration tests, and when things like Entity Framework are heavily in play, apparently it's better to build upon LocalDB automation than Moq. (A sketch follows below.)
    At least, that’s what my current employer has forced me to believe. Sadly, the purist TDD mindset that I tried to adopt and bring to the table was not even slightly appreciated. I don’t know if I’m going to burn in hell for being persuaded out of a purist unit testing mindset or not. We shall see, we shall see.
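    For what it's worth, such a test looks something like this. A hedged sketch: it assumes EF6 and NUnit, the model is hypothetical, and your LocalDB instance name may differ:

        using System.Data.Entity;
        using System.Linq;
        using NUnit.Framework;

        public class Customer
        {
            public int Id { get; set; }
            public string Name { get; set; }
        }

        public class ShopContext : DbContext
        {
            // A throwaway LocalDB database, created by EF on first use.
            public ShopContext() : base(
                @"Server=(localdb)\MSSQLLocalDB;Database=ShopTests;Integrated Security=true") { }
            public DbSet<Customer> Customers { get; set; }
        }

        [TestFixture]
        public class CustomerIntegrationTests
        {
            [Test]
            public void Saving_a_customer_round_trips_through_EF_and_real_SQL()
            {
                using (var db = new ShopContext())
                {
                    db.Customers.Add(new Customer { Name = "Contoso" });
                    db.SaveChanges();
                    // No mocks: EF, the provider, and the SQL engine all get exercised.
                    Assert.IsTrue(db.Customers.Any(c => c.Name == "Contoso"));
                }
            }
        }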
  9. I'm hearing some weird and creepy rumors I don't think I like, about SQL Server moving to Linux and eventually getting itself renamed. I don't like it; I think it's unnecessary. Microsoft should just create another product and let SQL Server be SQL Server for Windows forever. Careers are built on such things. Bad Microsoft! Windows 8, the .NET Framework version-name fiascos, Lync vs Skype for Business... when will you ever learn to stop breaking marketing details to fix what is already successful?!
  10. Speaking of SQL Server, SQL Server 2016 has RTM'd, and the full-blown SSMS 2016 is free.
  11. On-premises TFS 2015 only just acquired gated check-in build support in a recent update. Seriously, like, what the heck, Microsoft? It's also super buggy; you get a nasty error message in Visual Studio while monitoring its progress. This is laughable.
    • Clear message from Microsoft: "If you want a premium TFS experience, Azure / Visual Studio Online is where you have to go." Microsoft is no longer a shrink-wrapped product company; they sell shrink-wrapped software only for the legacy folks, as an afterthought. They are a hosted-platform company now, all the way.
      • This means that Windows 10 machines, including Nokia devices, are moving toward being subscription boxes with dumb client deployments. Boo.
  12. Another rumor I've heard is that Microsoft is going to abandon the game industry.

    The Xbox platform was awesome because Microsoft was all in. But they're not all in anymore, and it shows, and so now as they look at their lackluster profits, what did they expect?
    • Microsoft: either stay all-in with Xbox and Windows 10 both (dudes, have you seen Steam's Big Picture mode? no excuse!) or say goodbye to the consumer market forever. Seriously. Because we who thrive on the Microsoft platform are also gamers. I would recommend knocking yourselves over to partner with Valve and co-own the whole entertainment world like the oligarchies you both are, since Valve did so well at keeping the Windows PC relevant to the gaming market.

For the most part I've probably lived under a rock, I'm sure. I've been too busy enjoying my new 2016 Subaru WRX (a 4-door racecar), which I am probably going to sell in the next year because I didn't get out of debt first, but not before getting a Kawasaki Vulcan S ABS Café as my first motorized two-wheeler, riding that between playing Steam games, going camping, and exploring other ways to appreciate being alive on this planet. Maybe someday I'll learn to help the homeless and unfed, as I should. BTW, in the end I happen to know that "love God and love people" are the only two things that matter in life. The rest is fluff. But I'm so selfish; man, do I enjoy fluff. I feel like such a jerk. Those who know me know that I am one. God help me.

(Photos: the 2016 Kawasaki Vulcan S ABS Café, among others.)
Top row: Fluff that doesn’t matter and distracts me from matters of substance.
Bottom row: Matters of substance.


Tags:

ASP.NET | Blog | Career | Computers and Internet | Cool Tools | LINQ | Microsoft Windows | Open Source | Software Development | Web Development | Windows

Still Stuck In Dell Hell

by Jon Davis 7. December 2009 19:37

Just sent this off to Dell tonight.

So I placed an order (Purchase # ------------ [for a Studio 16 XPS Intel i7 laptop]) back in September, the order was delayed to October, it was then re-ordered behind my back to be delivered in November, then it was delayed again behind my back to be delivered in December, and then finally, behind my back, my order was cancelled.

That was weeks ago. I have still not been reimbursed for this cancelled order.

Look, you can ship the order (NOW!!) and keep the money, or you can cancel the order and return the money. You may not have both. Shame on you guys for trying to pull a scam like this off!

Please resolve this, now.

Thank you,
Jon Davis


Tags:

Computer Hardware | Computers and Internet

Dell: Wow. Just Wow. I Really Liked You. This Is Saddening.

by Jon Davis 9. November 2009 03:42

I'm trying to figure out if I should be extremely mad. But I'm really very sad, and amazed. And annoyed.

I always admired Dell, especially for some of their higher-end laptops and some of the innovations they’ve made. And I always figured that made-to-order meant made-upon-order, moderately efficiently, considering how long they’ve been in business and how successful they’ve been as the #1 PC vendor in the world. McDonald’s wouldn’t survive, after all, if their food took an hour to prepare.

I placed an initial order way back in early September, taking advantage of my Dell Financing available credit, but splitting my payment between Dell Financial and a credit card. The order was submitted online and received, and all was well, but that night/morning I received an e-mail saying that the order had been "rejected". Excuse me? I wasn't sure what to think. I called them up to find out what was going on. Half an hour later, after getting forwarded over and over again, I was finally told that I placed my order with one cent ($0.01) more than my available credit with Dell Financial. Huh?!! The order form figured that bit out. If that's what happened, it wasn't my doing, it was the order form. Anyway, I cancelled that order, after repeating my order number and security information at least five more times.

A week or two later, in the latter half of September, I placed another order for the same computer (the Studio XPS 16 with the Intel i7), and this time, while splitting payments, I was very careful to round down the Dell Financing part by $10 (for example, $1259 would become $1250) so that there would be no "one penny too much" overcharge like last time, with the rest on my credit card. I submitted that order; it was received and accepted. That night I got another rejection e-mail, and a day or two after that I received a phone call from a Dell customer service rep saying that my order had been declined because too much was charged to my Dell Financing account again. I asked how much had been charged to it. He quoted a number that was waaay over what I had manually entered on the order form, and I knew it couldn't have been a mistake on my part, because it wasn't rounded down, and I had triple-checked while producing that order that it was rounded down.

I stayed on the phone with that customer service rep to be sure that this order was fixed. But they had to produce another order (cancelling the original order) and make another charge to another credit card because they couldn't make an instant release on the card they already charged. I registered a complaint with him and by the time the call was finished I'd been talking with him for a good 45 minutes or so. Then I went online and saw that the new order was not accessible to my profile's order history. I sent an e-mail to that same CS rep and he assisted me on how to merge that order with my online profile. Kudos to that fella, Rajesh_R-AT-Dell.com.

So now that my order was associated with my online profile--it was now the end of September--I could not get a delivery date, as it was listed as N/A. I e-mailed Rajesh R again about that and received no reply, but the next time I checked, the delivery estimate was showing up as the end of October, a FULL FRIGGIN MONTH out, which I thought was absolutely ridiculous, but survivable. I am, fortunately, getting by with my existing non-Dell gear; I was just looking forward to using the newer hardware which I had already "paid" for.

Well, that estimated delivery date came and went. When I looked online shortly before that date, I was shocked to see that it had been bumped out to around November 7--another week and a half out!! What gives?!

So I waited on that. If you look at the calendar, it's November 9. And what does my order status say?

Oh, apparently I'm only just getting started!! My order was CANCELLED AND REMADE ("Changed"), without my permission or involvement, with a new order date (not mine!) of 10/29/2009 and a new estimated delivery date of December 2, 2009. The order status is also no longer "In Production", it has been rolled back to "Order Processing" which means it's waiting for payment, and it's been there for a week. (Guess who's not placing any bets I'll get my September order by Christmas?!) No explanation. I think I could maintain some sanity if Dell would just say that critical parts were backordered. But they don't show me anything like that. No e-mails, either. Just *shrug* "Changed". As if they'll ship it when they feel like it.

Had this been an oddball personal experience, I would have a bit of hope that this will "just work itself out", but after scouring these forums both here and elsewhere for similar testimonies, I'm realizing that this seems to be standard practice with Dell.

The problem is, not only is this bad customer service, it's very close to illegal. It is certainly unlawful to collect money and not deliver on goods purchased, and it might be unlawful, if perhaps only in civil court, to collect money without producing any clear expectations as to a delivery timeline, a reasonable effort to meet that timeline, or a reasonable explanation. When I purchase from Amazon, I receive delivery within a week. When I purchase from strangers on eBay, I normally receive delivery within a couple of weeks. In all cases, a rough estimate with reasonable accuracy is provided. In Dell's case, however, it seems clear that they are intentionally stalling. I suspect perhaps their Dell Financing is under-funded; I don't know, and I don't care, but for Dell to cop out on orders this frequently is clearly unacceptable and should not be tolerated by its customer base, nor for that matter by Dell management.

It's for this reason that I feel it may be fellow customers' responsibility to see about instigating change--starting with online pressure from the likes of http://ihatedell.org and perhaps going so far as a class action suit. I really don't know what to think. I just think that all of this is startlingly evil and wrong.


Tags:

Computer Hardware | Computers and Internet

The Perfect Photos To Illustrate The Sadly Typical Software Development Process

by Jon Davis 8. October 2009 22:43


http://thereifixedit.com/

I just wanted to post a quickie here to link to a site that, besides being great and funny in itself, surprised me with how perfectly every single photo over there illustrates some kind of software system I've touched, whether recently or long ago.

For example,

 

[image]

This one reminds me of the countless memory leaks we have to put up with when garbage collectors fail or are absent.

 

[image]

And this one reminds me of ol’ Windows ME, racing stripe and all.

 

[image]

The ugly do-it-all “make this site your home page” portal web sites that cluttered the web just three years ago, Yahoo! being among them.

 

[image]

“We’ll refactor later.”

  

[image]

The company’s legacy software with broken APIs and/or endpoints we simply don’t have the resources to support anymore.

 

[image]

I swear this is a virtual photo representing every developer’s workstation at the office. It takes half an hour to boot our machines.

  

[image]

Manual deployment. It takes several of us to push a web site out to the servers, plus QA to approve the closure of the deployment ticket.

  

[image]

A very nice admin interface that no one but the devs will ever see. I wish we had an admin interface that spiffy and complete.

 

[image]

This is our dev server, a virtual machine on an overloaded VM host with limited RAM. We have CruiseControl.NET on it and it takes about 30 minutes for it to build and deploy to test/QA on a single run.

 

[image]

This is what happens when your business partner uses Java and you’re using WCF. Add SSL, and voilà!

 

[image]

The Microsoft Office COM APIs.

  

[image: a round knob in a rectangular hole]

(From udhay in the comments:) A perfect example of the so-called work-around.

 

.. Ohh this could go on forever. You get the idea. Have fun.

New CAPTCHA Services Being Reworked

by Jon Davis 2. August 2009 16:35

So a few posts back I mentioned CAPTCHAThisYo.com (CAPTCHA This, Yo!), a web site name that took me all of about 5 seconds of creative exploration to come up with. I was inspired by reCAPTCHA and other similar CAPTCHA services, but wanted to provide a more diverse, multi-format CAPTCHA web site/service that exposes a number of various CAPTCHA algorithms that multiple web sites can use.

The prototype failed when I deployed it and browsers' cross-site scripting protections blocked it. I felt so stupid; I've been doing this stuff for over a decade and hadn't paid attention to the browsers' evolution of blocking cross-site script calls. Stuff I was able to do years ago I can't do anymore because of browser security constraints.

So I spent some time looking at various workarounds. First I looked at Flash as a client-side proxy, which had me stumped because it just plain wouldn't work and wouldn't provide any feedback. Then I discovered JSONP, which is what Yahoo!'s APIs use. This had me stumped, too, but see my previous post; it didn't work at first, either, but that's because the browser won't perform lately-invoked script references from 'localhost'. Once I got around that, I decided JSONP would be the cross-domain scripting method of choice.

I might be abandoning CAPTCHAThisYo.com, not because I want to abandon a CAPTCHA service but a) because it’s a ridiculous name, and b) because the scope changed. I figured out how to make the CAPTCHA algorithm a public-submissions based community without asking users to upload .NET interface-implementing assemblies to a server that runs stuff automatically. I also discovered that OpenCAPTCHA.org and OpenCAPTCHA.net are available, so I snagged those, and humanticate.com/humanticate.net (a word merge of “human” and “authenticate”) as well.

I spent this evening compiling a rough draft of a spec that I'll eventually have posted on OpenCAPTCHA.org. The rough draft doc has been written. The basic idea is simple: there are two (2) types of services, a Challenge service and a Challenge-Answer provider service. A Challenge-Answer provider service returns a JSON object consisting of a challenge (question, image, etc.) and an answer (an array of possible acceptable answers). This Challenge-Answer provider is invoked either by a Challenge service, which passes the challenge to the client, which passes the user's answer to the consuming web site's server, which then calls the Challenge service to validate the answer; or else directly by a web site's server, which retains the answer for validation on its own, without a Challenge server acting in the middle. The former is easier to implement; the latter is more performant.
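To give a feel for the JSONP plumbing, here's a sketch of what a Challenge-Answer provider response might look like as a classic ASP.NET handler. Everything here (the handler name, the JSON fields, the sample challenge) is an illustrative assumption, not the actual spec:

    using System.Web;

    public class ChallengeAnswerHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            // JSONP: the client names a callback; we wrap the JSON in a call to it,
            // which lets a <script src="..."> reference cross domains legally.
            string callback = context.Request.QueryString["callback"] ?? "callback";
            string json =
                "{ \"challenge\": \"What is 3 + 4?\", \"answers\": [\"7\", \"seven\"] }";
            context.Response.ContentType = "application/javascript";
            context.Response.Write(callback + "(" + json + ");");
        }
    }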

So these are going to be the web site / service URLs:

http://www.opencaptcha.org/ – Will define the OpenCAPTCHA.org spec 
http://www.opencaptcha.net/ – Will expose a formal list of OpenCAPTCHA.org spec compliant service providers (Challenge services and Challenge-Answer provider services) 
http://www.humanticate.com/ – Will be a branded CAPTCHA service that complies with OpenCAPTCHA.org spec. CAPTCHA algorithms are proprietary. 
http://open.humanticate.com/ - Will be a community-driven sandbox of CAPTCHA providers where user-created algorithms can be rated and commented on.


Tags:

Computers and Internet | Web Development

CAPTCHA This, Yo! - Early Alpha

by Jon Davis 14. July 2009 01:54

I've posted a mini-subproject:

http://www.CAPTCHAThisYo.com/

The site is self-explanatory. The idea is simple. I want CAPTCHA. I don't want to support CAPTCHA in my apps. I just want to drop in a one-liner snippet somewhere and call it done. I think other people share the same desire. So I now offer CAPTCHA as a drop-in CAPTCHA app. I did all the work for myself so that I don't have to do that work. I went through all that trouble so that I don't have to go through the trouble .... Wait, ...

Seriously, it's not typical CAPTCHA, and it's Not Quite Done Yet (TM). It's something that'll evolve. Right now there isn't even any hard-to-read graphic CAPTCHA.

But what I'd like to do is have an ever-growing CAPTCHA questions library and, by default, have it just rotate through them randomly. The questions might range from shape detection to riddles to basic math. I'd really like to have some kind of community uploads thingmajig with ratings, so that people can basically share their own CAPTCHA solutions and they all run inside the same captchathisyo.com CAPTCHA engine. I'm just not sure yet how to pull that off.

Theoretically, I could take the approach Microsoft took when C# was initially released (long, long ago, in a galaxy far, far away): they had a cool insect sandbox game where you could write a .NET class that implements some interface and then send it up to the server, and it would just run as an insect on the server. The objective of the game was to write the biggest killer/eater. I'm not sure how feasible the idea of opening up all .NET uploads to the server is, but it's something I'm pondering.
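If I ever do open up community submissions, the contract would presumably be something like the following. This is purely hypothetical; nothing like it exists yet:

    // Hypothetical contract a community-submitted CAPTCHA algorithm would implement.
    public interface ICaptchaChallenge
    {
        string RenderChallenge();          // the question/image markup shown to the user
        bool IsCorrect(string userAnswer); // server-side validation of the answer
    }

    // Example implementation: a trivial basic-math riddle.
    public class BasicMathChallenge : ICaptchaChallenge
    {
        public string RenderChallenge() { return "What is 3 + 4?"; }
        public bool IsCorrect(string userAnswer) { return userAnswer.Trim() == "7"; }
    }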

Anyway, the concept is sound and the prototype has been deployed, but I still need to work out cross-site scripting limitations, so bear with me. I also still need to find a designer to make something beautiful out of it. That said, feel free to use it and give feedback. Stand by.


Tags:

C# | Computers and Internet | Cool Tools | Pet Projects | Techniques | Web Development

Running Out Of Resources (Even With Lots Of RAM Installed)

by Jon Davis 10. March 2009 13:33

“.. Ask yourself if you have ever encountered any of the following problems :-

    1. You try to open an application but it refuses to load or it starts to load and then it disappears!
    2. You try to open or use an application but you get an "Out Of Memory" error message!
    3. One of your running applications inexplicably quits!
    4. When you right-click on your application, nothing happens! The command menu refuses to pop-up!
    5. Your web browser simply refuses to load any more windows or tabs!
    6. Your application is missing some menus or toolbars!
    7. You get the following error messages :
               Initialization of the dynamic library <system>\system32\user32.dll failed. The process is terminating abnormally.
               Initialization of the dynamic library <system>\system32\kernel32.dll failed. The process is terminating abnormally.”

I'm shocked I never blogged this. Over at Coding Horror there was an article posted in whose comments section someone pointed me here:

http://www.techarp.com/showarticle.aspx?artno=238&pgno=0

It’s a 3-page article describing how to reconfigure your desktop heap so that you don’t keep running out of system resources. I’ve gone back to 32-bit XP w/ 4GB RAM at the office and needed to look this up again because I was still getting erratic behavior—more so, actually, because more RAM meant less hesitancy to open things up. A co-worker got the RAM, too, so I had to look it up yet again. It’s definitely worth looking at if you open lots of apps and/or web browser windows like I do.

In case that article goes down, it goes something like this:

Go to HKEY_LOCAL_MACHINE -> SYSTEM -> CurrentControlSet -> Control -> Session Manager -> SubSystems -> Key: Windows

If you check the Value data for the Windows key, it should be something like this:

%SystemRoot%\system32\csrss.exe ObjectDirectory=\Windows SharedSection=1024,3072,512 Windows=On SubSystemType=Windows ServerDll=basesrv,1 ServerDll=winsrv:UserServerDllInitialization,3 ServerDll=winsrv:ConServerDllInitialization,2 ProfileControl=Off MaxRequestThreads=16

The portion of interest is "SharedSection=1024,3072,512". The three values under SharedSection determine how much memory in kilobytes (KB) is allocated to each component of the desktop heap.

The first value is the shared heap size, common to all desktops. It's used to store the global handle table and shared system settings.

By default, it's set to 1024KB. You generally do not need to modify this value.

The second value is the desktop heap size for each desktop associated with the "interactive" window station. It's used to store user objects like hooks, menus, strings and windows.

By default, it's set to 3072KB. The more users log into the system, the more desktops are created. Consequently, the total "interactive" desktop heap size will increase to reflect the number of desktops created. But each desktop will only have an "interactive" desktop heap of 3072KB.

The third value is the desktop heap size for each desktop associated with the "non-interactive" window station.

By default, it's set to 512KB. But if this value is not present, the size of the "non-interactive" window station will be the same as that of the "interactive" window station.

Every service process created under a user account will be given a new desktop in a "non-interactive" window station created by the Service Control Manager (SCM). Therefore, each of these services will consume the amount of desktop heap, as specified in the third SharedSection value.

I’m guessing here but it looks like the desktop heap size should only be limited if you’re using Windows Server and Terminal Services, as I’m not sure how you can measure “the more users that log onto the desktop” outside of Terminal Services. Even Remote Desktop is a single-desktop experience.

I have mine reconfigured to:

%SystemRoot%\system32\csrss.exe ObjectDirectory=\Windows SharedSection=1024,6144,512 Windows=On SubSystemType=Windows ServerDll=basesrv,1 ServerDll=winsrv:UserServerDllInitialization,3 ServerDll=winsrv:ConServerDllInitialization,2 ProfileControl=Off MaxRequestThreads=16

Not sure if that's right, but the system does still boot and I haven't run into the problem much anymore.
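If you just want to eyeball your current SharedSection values without spelunking through regedit, here's a quick C# sketch. It only reads the value; the key and value names are exactly the ones quoted above:

    using System;
    using Microsoft.Win32;

    class DesktopHeapCheck
    {
        static void Main()
        {
            using (RegistryKey key = Registry.LocalMachine.OpenSubKey(
                @"SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems"))
            {
                string windows = (string)key.GetValue("Windows");
                // Pull out just the SharedSection=x,y,z portion.
                foreach (string part in windows.Split(' '))
                    if (part.StartsWith("SharedSection="))
                        Console.WriteLine("Desktop heap sizes (KB): " +
                            part.Substring("SharedSection=".Length));
            }
        }
    }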


Tags:

Computers and Internet | Microsoft Windows

Nine Reasons Why 8GB Is Only Just Enough (For A Professional Business Software Developer)

by Jon Davis 13. February 2009 21:29

Today I installed 8GB on my home workstation/playstation. I had 8GB lying around already from a voluntary purchase for a prior workplace (I took my RAM back and put the work-provided RAM back in before I left that job), but that brand of RAM didn't work correctly on my home PC's motherboard. It's all good now, though: with some high-quality performance RAM from OCZ, my Windows 7 system self-rating on RAM I/O jumped from 5.9 to 7.2.

At my new job I had to request a RAM upgrade from 2GB to 4GB. (Since it's 32-bit XP, I couldn't go any higher.) When I asked about when 64-bit Windows Vista or Windows 7 would be put on the table for consideration as an option for employees, I was told "there are no plans for 64-bit".

The same thing happened with my last short-term gig. Good God, corporate IT folks everywhere are stuck in the year 2002. I can barely function at 4GB, and can't function much at all at 2GB.


If you are a multi-role developer and aren't already saturating at least 4GB of RAM, you are throwing away your employer's money, and if you are IT and not providing at least 4GB of RAM to developers, and not actively working on adding corporate support for 64-bit on employees' workstations, you are costing the company a ton of money in productivity loss!! I don't know how many times I've seen people restart their computers or sit and wait for 2 minutes for Visual Studio to come up because their machine is bogged down in a swap file. That was "typical" half a decade ago, but it's not acceptable anymore. The same is true of hard drive space. Fast 1-terabyte hard drives are available for less than $100 these days; there is simply no excuse. For any employee who makes more than X (say, $25,000), for Pete's sake, throw in an extra $1,000-$2,000 or so more and get the employee two large (24-inch) monitors, at least 1TB of hard drive (ideally 4 drives in a RAID 0+1 array), 64-bit Windows Server 2008 / Windows Vista / Windows 7, a quad-core CPU, and 8GB of high-performance (800+ MHz) RAM. It's not $2,000 lost; it's $2,000 that will save you many thousands more. By quadrupling the performance of your employee's system, you'd effectively double the productivity of your employee; it's like getting a new employee for free. And if you are the employee, making double that X (say, more than $50,000), and if your employer could somehow allow it (and they should; shame on them if they don't and won't do it themselves), you should go out and get your own hardware upgrades. Make yourself twice as productive, and earn your pay with pride.

In a business environment, whether one is paid by the hour or salaried (already expected to work X hours a week, which is effectively loosely translated to hourly anyway), time = money. Period. This is not about developers enjoying a luxury; it's about them saving time and employers saving money.

Note to the morons who argue "this is why developers are writing big, bloated software that sucks up resources": Dear moron, this post is from the perspective of an actual developer's workstation, not a mere bit-twiddling programmer's—a developer, that is, who wears many hats and must not just write code but manage database details, work with project plans, document technical details, electronically collaborate with teammates, test and debug, etc., all in one sitting. Nothing here actually recommends or even contributes to writing big, bloated software for an end user. The objective is productivity; your skills as a programmer are a separate concern. If you are producing bad, bloated code, the quality of the machine on which you wrote it has little to nothing to do with that—on the contrary, a poor developer system can lead to extremely shoddy code, because the time and patience required just to manage to refactor and re-test become such a huge burden. If you really want to test your code on a limited machine, you can rig VMWare / VirtualPC / VirtualBox to temporarily run with less RAM, etc. You shouldn't have to punish yourself with poor productivity while you are creating the output. Such punishment is far more monetarily expensive than the cost of RAM.

I can think of a lot of reasons for 8+ GB RAM, but I’ll name a handful that matter most to me.

  1. Windows XP / Server 2003 alone takes up half a gigabyte of RAM (Vista / Server 2008 takes up double that). Scheduled tasks and other processes cause the OS to peak out at some 50+% more. Cost: 512-850MB. Subtotal @nominal: ~512MB; @peak: 850MB
  2. IIS isn't a huge hog, but it's a big system service with a lot of responsibility. Cost: 50-150MB. Subtotal @nominal: ~550MB; @peak: 1GB.
  3. Microsoft Office and other productivity applications should be usable more than one at a time, as needed. For more than two decades, modern computers have supported a marvelous feature called multi-tasking. This means that if you have Outlook open, and you double-click a Microsoft Word attachment, and upon reading it you realize that you need to update your Excel spreadsheet, and in that train of thought you find yourself updating an Access database, and then you realize that these updates result in a change of product features, so you need to reflect these details in your PowerPoint presentation, you should be able to open each of these applications without missing a beat, and by the time you're done you should be able to close all these apps in no more than one passing second per click of each app's [X] close button. Each of these apps takes up as much as 100MB of RAM, Outlook typically even more, and Outlook is typically always open. Cost: 150MB-1GB. Subtotal @nominal: 700MB; @peak: 2GB.
  4. Every business software developer should have his own copy of SQL Server Developer Edition. Every instance of SQL Server Developer Edition takes up a good 25MB to 150MB of RAM just for the core services, multiplied by each of the support services. Meanwhile, Visual Studio 2008 Pro and Team Edition come with SQL Server 2005 Express Edition, not 2008, so for some of us that means two installations of SQL Server Express. Both SQL Server Developer Edition and SQL Server Express Edition are ideal to have on the same machine since Express doesn’t have all the features of Developer and Developer doesn’t have the flat-file support that is available in Express. SQL Server sitting idly costs a LOT of CPU, so quad core is quite ideal. Cost: @nominal: 150MB, @peak 512MB. Subtotal @nominal: 850MB; @peak: 2.5GB. We haven’t even hit Visual Studio yet.
  5. Except in actual Database projects (not to be confused with code projects that happen to have database support), any serious developer would use SQL Server Management Studio, not Visual Studio, to access database data and to work with T-SQL tasks. This would be run alongside Visual Studio, but nonetheless as a separate application. Cost: 250MB. Subtotal @nominal: 1.1GB; @peak: 2.75GB.
  6. Visual Studio itself takes the cake. With ReSharper and other popular add-ins like PowerCommands installed, Visual Studio just started up takes up half a gig of RAM per instance. Add another 250MB for a typical medium-size solution. And if you, like me lately, work in multiple branches and find yourself having to edit several branches for different reasons, one shouldn’t have to close out of Visual Studio to open the next branch. That’s productivity thrown away. This week I was working with three branches; that’s 3 instances. Sample scenario: I’m coding away on my sandbox branch, then a bug ticket comes in and I have to edit the QA/production branch in an isolated instance of Visual Studio for a quick fix, then I get an IM from someone requesting an immediate resolution to something in the developer branch. Lucky I didn’t open a fourth instance. Eventually I can close the latter two instances down and continue with my sandbox environment. Case in point: Visual Studio costs a LOT of RAM. Cost @nominal 512MB, @peak 2.25GB. Subtotal @nominal: 1.6GB; @peak: 5GB.
  7. Your app being developed takes up RAM. This could be any amount, but don’t forget that Visual Studio instantiates independent web servers and loads up bloated binaries for debugging. If there are lots of services and support apps involved, they all stack up fast. Cost @nominal: 50MB, @peak 750MB. Subtotal @nominal: 1.65GB; @peak: 5.75GB.
  8. Internet Explorer and/or your other web browsers take up plenty of RAM. Typically 75MB for IE to be loaded, plus 10-15MB per page/tab. And if you’re anything like me, you’ll have lots and lots and LOTS of pages/tabs by the end of the day; by noon I typically end up with about four or five separate IE windows/processes, each with 5-15 tabs. (Mind you, all or at least most of them are work-related windows, such as looking up internal/corporate documents on the intranet or tracking down developer documentation such as API specs, blogs, and forum posts.) Cost @nominal: 100MB; @peak: 512MB. Subtotal @nominal: 1.75GB; @peak: 6.5GB.
  9. No software solution should go untested on as many platforms as is going to be used in production. If it’s a web site, it should be tested on IE 6, IE 7, and IE 8, as well as current version of Opera, Safari 3+, Firefox 1.5, Firefox 2, and Firefox 3+. If it’s a desktop app, it should be tested on every compatible version of the OS. If it’s a cross-platform compiled app, it should be tested on Windows, Mac, and Linux. You could have an isolated set of computers and/or QA staff to look into all these scenarios, but when it comes to company time and productivity, the developer should test first, and he should test right on his own computer. He should not have to shutdown to dual-boot. He should be using VMWare (or Virtual PC, or VirtualBox, etc). Each VMWare instance takes up the RAM and CPU of a normal system installation; I can’t comprehend why it is that some people think that a VMWare image should only take up a few GB of hard drive space and half a gig of RAM; it just doesn’t work that way. Also, in a distributed software solution with multiple servers involved, firing up multiple instances of VMWare for testing and debugging should be mandatory. Cost @nominal: 512MB; @peak: 4GB. Subtotal @nominal: 2.25GB; @peak: 10.5GB.

Total peak memory (64-bit Vista SP1 which was not accounted in #1): 11+GB!!!

Now, you could argue all day long that you can “save money” by shutting down all those “peak” processes to use less RAM rather than using so much. I’d argue all day long that you are freaking insane. The 8GB I bought for my PC cost me $130 from Dell. Buy, insert, test, save money. Don’t be stupid and wasteful. Make yourself productive.

Let Me Google That For You

by Jon Davis 9. February 2009 20:59

I was hanging out in the C# IRC channel on FreeNode and someone blew his top with a newbie coder with a few "RTFM" statements.

He gave a funny link. Instead of saying "Google it" or giving a Google query URL, one can use:

http://www.letmegooglethatforyou.com/?q= << your query here, i.e.

http://www.letmegooglethatforyou.com/?q=how+do+I+do+threading+in+C%23

Windows 7 Beta First Impressions

by Jon Davis 14. January 2009 04:47

Everyone has already made their Windows 7 first-impression comments, but I had to see Windows 7 for myself, as I always do with Windows pre-releases. So here are my first experiences. I tried the earlier PDC release, downloaded from a torrent, but I got an error after booting from the DVD saying that it could not locate an installer file:

Windows could not collect information for [OSImage] since the specified image file [install.wim] does not exist.

I chalked it up to a bad torrent download and tossed the copy.

Then Microsoft released Beta 1 this month. I tried downloading a torrent again, and the download was interrupted. I tried to resume the download, and no seeds were found after hours. I found another torrent, and after about half a day with it half-downloaded, I realized Microsoft had actually released this version to the open public for anyone to download, so I deleted that torrent and started the download again, this time straight from Microsoft.

The next day, the download having been completed while I was sleeping, I burned it to DVD-RW and gave it a run. Guess what?

Windows could not collect information for [OSImage] since the specified image file [install.wim] does not exist.

Oh, poop. So the original download wasn't any more flawed in this regard than this one; it's something else.

I tried booting the DVD in VMWare on another PC, and it worked! Aha! It's a hardware problem, perhaps a DVD driver problem. My computer is only about one and a half years old, but the DVD drive is about four years old. I Googled around a bit for more information on this ridiculous error, and the only advice I could find were two suggestions:

  1. Someone commented, "You probably found an old DVD-RW from behind a sofa. Use a new DVD-R and that'll fix it right up." Hm. Doubtful. I burned another DVD-RW (same brand, roughly the same condition), and this time I checked off the "Verify" option in my burner software; it checked out. Still got the error. It was at this point that I tried it in VMWare, and it got past this error, so no, it's not a bad disc. I suppose it could have to do with the other PC's drive failing to read the disc, though. In other words, the drive might have failed, not the disc.
  2. Someone said, "I was using an old USB-attached DVD drive that the BIOS enabled me to boot the disc from, but after installing an IDE-based DVD drive in the actual computer the error went away." Well, that stinks, because I'm using an IDE-based DVD drive; it's never given me any problems except that it often refuses to burn discs.

So I pondered. I'm leaning towards the #2 scenario as a clue: I know Microsoft was trying to thin down the core surface area in Windows 7, and I bet this is a lack of drivers for my drive. But I wonder if "new" is the keyword here, not the form (IDE vs USB).

I just happened to have an external USB-based DVD drive I recently purchased from Amazon. USB, but new. Could it work? I ran to the back room and grabbed it, brought it back in, stretched it across the room to the outlet, configured the BIOS to boot from USB, and booted the Windows 7 DVD. I went to install and....... yes!! It got past the error.

So here's the first first impression: While I greatly appreciate Microsoft's attempt to slim down the core dependency set of Windows and its driver set, in this area (CD/DVD drive support) they chopped off WAY too much. Perhaps driver support isn't the issue here, but if it is, this IS a bug. There are a LOT of people who were power users 4 years ago, who invested in the latest and greatest back then, had no Windows version but XP, and were reluctant to switch to Vista because of the rough corners that Windows 7 has now rounded out. These years-old systems are surely more than adequate for Windows 7 performance-wise, but CD/DVD drivers are right up there with the USB subsystem and SATA as most needed for success. Fix this, guys; this IS a bug, not a mere risky compromise (intentional droppage of legacy hardware support). Microsoft can't afford to lose THIS hardware.

I experienced no other hardware glitches, fortunately; even my audio hardware was working, and the Aero experience worked right from the post-setup first boot. There was only one other hardware-related annoyance: my two monitors were backwards. I had to mouse far to the right to access the left monitor. Yes, this is configurable in the Control Panel, but I got annoyed watching setup and dealing with dialog boxes, etc., while everything was backwards and setup didn't have the Control Panel available to me. It would've been nice, I suppose, if there were one optional button during setup that brought up the Monitors dialog. But at least the Monitors dialog is no longer accessed through the wholly inappropriately named (in Vista's time) "Personalization" dialog, which was SO ridiculously placed, since monitor setup (resolution, monitor placement, monitor drivers, color depth, etc.) has little to nothing to do with personalization. Might as well rename Control Panel to "Personalizations"... but they got it; I'm glad.

The new Windows 7 is all about rounding off the corners and adding the polishing touches that Windows Vista only touched on and inspired.

  1. More ever-present Aero Glass experience, with lots of smooth animations and roll-overs.
  2. Explorer.exe got a huge overhaul with Aero and usability enhancements.
    • As is very well known, the ubiquitous taskbar that has been around through Windows 95, Windows NT 4, Windows 98, Windows ME, Windows NT, Windows 2000, Windows XP, Windows Server 2003, Windows Vista, and Windows Server 2008 (did I miss one somewhere? surely I did ..) is now no more. There is no longer a taskbar. There is a bar down there, but it's more like a "smartbar"; the Quick Launch toolbar and the taskbar have sorta merged. It's all very much inspired, no doubt, by the Mac OS X's dock, which frankly disgusts me. But so far I don't have a hatred of the Windows 7 smartbar thingmajig. I do very strongly believe that someone (i.e. Stardock), if not Microsoft themselves, will be pushing a "Windows Vista taskbar" as an add-on accessory to Windows 7, for those people who preferred it, as there is now a rather obvious market for it.
    • The awesome feature of the Windows Vista desktop compositing system that enabled Direct3D and high-definition video to be managed in an already-D3D desktop environment, the advantages of which were only slightly touched upon by Windows key + Tab and the taskbar mouseover tooltip previews (both showing these windows re-displayed in distorted, small form in realtime with no performance loss), has been expanded upon in Windows 7. I'm still discovering these, but the most obvious feature is the smartbar mouseover with Internet Explorer, showing each tab and letting you pick one as it is rendered in real time. I hope to find a lot more such scenarios.
  3. Paint, Calculator, and WordPad have finally been rewritten with an Office 2007 feel. We no longer have to puke on the Windows 95 versions. I didn't check whether Notepad was replaced with something anywhere near the simplicity-yet-completeness of Notepad2, but I doubt Notepad was touched, which, if so, is a shame. But at least there's always Notepad2. *cough*
  4. In general, the things in Windows, such as in the Control Panel, that got moved around a lot in Vista and that everyone complained about (such as me complaining about Monitor settings showing up under stupid Personalization) have been rearranged again. Generally, things are just better and more thought out. Vista was a trial run in this matter; the Windows 7 beta is just more thought through. There are still quirky "features", but nothing I've found so far that is just glaringly wrong. I do think that the personalization bits are now too broken apart, but this might just be a style issue that needs some getting used to. Microsoft seems to be leaning more than before towards the Apple/Mozilla approach of pursuing minimalist options while burying advanced features down an obvious "Advanced" click-trail. Themes are consolidated sets now, a little more like Win95 Plus! themes in the sense of consolidation, and not so much isolated background, color, and sound options. But those options as individual settings are still there. In fact, Sounds is now (finally) a personalization configuration, as it should be.
  5. You start off with a big fish. Literally. It's a nice painting (of a fish). But come on. It's a fish! I went to choose a different background image, and, while I could very possibly be mistaken, I think the number of background images you can choose from has been slashed by half since Vista, and the new offerings in the theme picker don't look as good. Boooo!
  6. Other people ran the numbers, so I didn't do any testing, but the general consensus is that Windows 7 performs closer to Windows XP's performance than Windows Vista's. (Read: it's very performant.)
  7. The max system rating has been nudged up from 5.9 to 7.9. My score of 5.7 on Windows Vista went up to 5.9 in Windows 7... but given the new max of 7.9, my year-and-a-half-old PC is no longer 0.2 from the ceiling. *sob*
  8. I was unimpressed that the color palettes across all themes, just like IE 8 beta on Vista, are way too bright. It's ugly and uncomfortable. It's not easily configurable to make things darker, either.
  9. I haven't stressed Windows 7 yet with software to see how stable it is, but one of the first apps I downloaded was Google Chrome, and that puked. All of Windows froze up while I was doing something else too, but I don't remember what it was; that sort of thing is something I'd expect from a beta.

I have one other complaint. Windows Vista and Office 2007 introduced some really nice glow animations on buttons. Windows 7 pushes the Office 2007 glow and transition animations everywhere. The new smartbar (taskbar replacement) has a really, really cool "you just clicked me!" gradient animation that is almost magical. It's nice, but the animations are so slow they're actually rather obnoxious. For example, in the new Calculator, if you simply hover over and click on a button, yeah, blue-gray turns amber, but mouse away and it seems to take a full three or four seconds for it to animate back to the original color. It's artistically nice, but it's just too long, and I think it will be too distracting. It might actually produce some serious usability issues: fast-moving users are going to be forced to slow down because the "feedback loop" they're getting on the screen is going to be just a big blur. I really don't like that. It's already making me a little nauseous. Weird, huh.

I think Vista's close/maximize/minimize effects the animation timings just right in this matter. Office 2007 ribbon buttons were just over the edge in my taste (too slow), and I could be wrong but Windows 7 in various places feels like it tripled the Office 2007 animation timings (very, very slow).


Tags:

Computers and Internet | General Technology | Operating Systems


 



