Technology Status Update 2016

by Jon Davis 10. July 2016 09:09

Hello, peephole. (people.) Just a little update. I've been keeping this blog online for some time, but my most recent blog entries are always so negative that I keep having to see that negativity every time I check to make sure the blog's up, lol. I'm tired of it, so I thought I'd post something positive.

My current job is one I hope to keep for years and years to come, and if that doesn't work out I'll be looking for one just like it and trying to keep that one for years and years to come. I'm so done with contracting and consulting (except the occasional mentoring session on code mentor -dot- io). I'm still developing, of course, and as technology is changing, here's what's up as I see it.

  1. Azure is relevant. 
    The world really has shifted to the cloud, and the majority of companies, finally, are offloading their hosting to it. AWS, Azure, take your pick; everyone who hates Microsoft will obviously choose AWS, but Azure is the obvious choice for Microsoft-stack folks, and there is nothing meaningful AWS has that Azure doesn't at this point. The amount of stuff on Azure is terrifying enough in quantity and supposed quality to give me a thrill. So I'm done with hating on Azure; after all their marketing and nagging and pushing, Microsoft has crossed a threshold of market saturation such that I am adequately impressed. I guess that means I have to be an Azure fan, too, now. Fine. Yay Azure, woo. -.-
  2. ASP.NET is officially rebooted. 
    So I hear this thing called ASP.NET Core 1.0, formerly known as ASP.NET 5, formerly known as ASP.NET vNext, has RTM'd, and I hear it's like super duper important. It snuck by me; I haven't mastered it, but I know it enough to know a few things (a minimal sketch of the new hosting model follows this list):
    • It's a total redux by means of redo. It's like the Star Trek reboot except it’s smaller and there are fewer planets it can manage, but it’s exactly like the Star Trek reboot in that it will probably implode yours.
    • If you've built your career on ASP.NET and you want to continue living on ASP.NET's laurels, now is not the time to master ASP.NET Core 1.0. Give it another year or two to mature. 
    • If you're stuck on or otherwise fascinated by non-Microsoft operating systems, namely Mac and Linux, but you want to use the Microsoft programming stack, you absolutely must learn and master ASP.NET Core 1.0 and EF7.
    • If all you liked from ASP.NET Core 1.0 was the dynamic configs and build-time transpiles, you don't need ASP.NET Core for that LOL LOL ROFLMAO LOL LOL LOL *cough*
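    For the curious, here is roughly what the new self-hosted startup model looks like in ASP.NET Core 1.0. This is a minimal hello-world sketch from memory, not official guidance, so treat the details as illustrative:

        // Program.cs + Startup.cs for a bare-bones ASP.NET Core 1.0 app.
        // The web host is built explicitly; Kestrel is the new cross-platform server.
        using Microsoft.AspNetCore.Builder;
        using Microsoft.AspNetCore.Hosting;
        using Microsoft.AspNetCore.Http;

        public class Startup
        {
            // The middleware pipeline replaces the old Global.asax / HttpModule model.
            public void Configure(IApplicationBuilder app)
            {
                app.Run(async context =>
                    await context.Response.WriteAsync("Hello from ASP.NET Core 1.0"));
            }
        }

        public class Program
        {
            public static void Main(string[] args)
            {
                var host = new WebHostBuilder()
                    .UseKestrel()              // cross-platform web server
                    .UseStartup<Startup>()     // wire up the Startup class above
                    .Build();

                host.Run();
            }
        }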
  3. The Aurelia Javascript framework is nearly ready.
    Overall, JavaScript framework trends have stopped. Companies are building upon AngularJS 1.x. Everyone who's behind is talking about React as if it were new and suddenly newly relevant (it isn't new anymore). Everyone still implementing Knockout is out of the loop and will die off soon enough. jQuery is still ubiquitous and yet ignored as a thing; meanwhile, it just turned v3.

    I don’t know what to think about things anymore. Angular 2.0 requires TypeScript, people hate TypeScript because they hate transpilers. People are still comparing TypeScript with CoffeeScript. People are dumb. If it wasn’t for people I might like Angular 2.0, and for that matter I’d be all over AureliaJS, which is much nicer but just doesn’t have Google as the titanic marketing arm. In the end, let’s just get stuff done, guys. Build stuff. Don’t worry about frameworks. Learn them all as you need them.
  4. Node.js is fading and yet slowly growing in relevance.
    Do you remember .. oh heck, unless you're graying, probably not, anyway .. do you remember back in the day when the dynamic Internet was first loosed on the public, and C/C++ and Perl were used to execute from cgi-bin, and if you wanted to add dynamic stuff to a web site you had to learn Perl and maybe find Perl pearls and plop them into your own cgi-bin? Yeah, no, I never really learned Perl either, but I did notice the trend. In the end, what did C/C++ and Perl mean to us up until the last decade? Answer: ubiquitous availability, but not web server functionality, just an ever-present availability for scripts, utilities, hacks, and whatever. That is where Node.js is headed. Node.js for anything web-related has become, and will continue to be, a gigantic mess of disorganized, everyone-is-equal, noisily integrated modules that sort of work but will never be as stable in built compositions as more carefully organized platforms. Frankly, I see Node.js being more relevant as a workstation runtime than as a server runtime. Right now I'm looking at maybe poking at it in a TFS build environment, but not so much for hosting things.
    I will always have a bitter taste in my mouth with Node.js after trying to get socket.io integrated with Express and watching the whole thing just crumble, with no documentation or community help to resolve it. This happened not just once on the job (never resolved before I walked away) but also during a code-mentor mentoring session (which we didn't figure out), even after a good year or so of platform maturity following the first instance. I still like Node.js but will no longer be trying to build a career on it.
  5. Pay close attention and learn up on Swagger aka OpenAPI. 
    Remember when -- oh wait, no, unless you're graying, .. nevermind .. anyway -- once upon a time something called SOAP came out, and with it came a self-documentation feature: a combination of WSDL and some really handy generated HTML scaffolding built into web services that would let you manually test SOAP-based services by filling out a self-generated form. Well, now that JSON-based REST is the entirety of the playing field, we need the same self-documentation. That's where Swagger came in a couple of years ago, and everyone uses it now. Swagger needs some serious overhauling--someone needs to come up with a Swagger-compliant UI built on more modular and configurable components, for example--but as a drop-in self-documentation feature for REST services it fits the bill.
    • Swagger can be had on .NET using a lib called Swashbuckle. If you use OData, there is a lib called Swashbuckle.OData. We use it very, very heavily where I work. (I was the one who found it and brought it in.) "Make sure it shows up and works in Swagger" is a requirement for all REST/OData APIs we build now. (A minimal setup sketch follows this list.)
    • Swagger is now OpenAPI, but it's still Swagger; there are not yet any OpenAPI artifacts that I know of other than Swagger itself. Which is lame. Swagger is ugly. Featureful, but ugly, and non-modular.
    • Microsoft is listed as a contributing member of the OpenAPI committee, but I don't know what that means, and I don't see any generic output from OpenAPI yet. I'm worried that Microsoft will build a black box (rather than white box) Swagger-compliant alternative for ASP.NET Core.
    • Other curious ones to pay attention to are out there as well, but I don't see them as significantly supported by the .NET community yet (maybe I haven't looked hard enough).
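    To give a sense of what "drop-in" means with Swashbuckle, the one-time setup in an ASP.NET Web API project is roughly the following. This is a minimal sketch from memory (the generated SwaggerConfig.cs exposes far more options), so treat the specifics as illustrative:

        // Enables the Swagger JSON endpoint and the swagger-ui front end
        // for an ASP.NET Web API project via Swashbuckle.
        using System.Web.Http;
        using Swashbuckle.Application;

        public static class SwaggerConfig
        {
            public static void Register(HttpConfiguration config)
            {
                config
                    .EnableSwagger(c =>
                    {
                        // Title and version shown at the top of the generated docs
                        c.SingleApiVersion("v1", "Acme REST API");
                    })
                    .EnableSwaggerUi();   // interactive docs served at /swagger
            }
        }

    (Swashbuckle.OData hooks into this same configuration so that OData routes show up alongside plain Web API routes.)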
  6. OData v4 has potential but is implementation-heavy and sorely needs a v5. 
    A lot of investments have been made in OData v4 as a web-based facade to Entity Framework data resources. It's the foundation of everything the team I'm with is working on, and I've learned to hate it. LOL. But I see its potential. I hope investments continue because it is sorely missing fundamental features like
    • MS OData needs better navigation property filtering and security checking, whether by optionally redirecting navigation properties to EDM-mapped controller routes (yes, taking a performance hit) or some other means
    • MS OData '/$count' breaks when [ODataRoute] is declared, boo.
    • OData spec sorely needs "DISTINCT" feature
    • $select needs to be smarter about returning anonymous models and not just eliminating fields; if all you want is one field in a nested navigation property in a nested navigation property (the equivalent of LINQ's .Select(x => new { ID = x.ID, DesiredField2 = x.Child.Child2.DesiredField2 })), in the OData result set you will have to dive into an array and then into another array to find the one desired field
    • MS OData output serialization is very slow and CPU-heavy
    • Custom actions and functions, and making them exposed to Swagger via Swashbuckle.OData, make me want to pull my hair out. It sometimes takes two hours of screaming and choking people to set up a route in OData that would take me two minutes in Web API, and in the end I end up with a weird namespaced function name in the route like /OData/Widgets/Acme.GetCompositeThingmajig(4); there's no getting away from even the default namespace, and your EDM definition must be an EXACT match to what is clearly and obviously spelled out in the C# controller implementation or you die. I mean, if Swashbuckle / Swashbuckle.OData can mostly figure most of it out without making us dress up in a weird Halloween costume, surely Microsoft's EDM generator should have been able to. (A rough sketch of this kind of setup follows this list.)
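    For context, here is roughly the shape of the Web API OData v4 wiring being complained about above: the EDM model builder plus a collection-bound function. Widget and GetCompositeThingmajig are just the throwaway names from the rant, and this is a from-memory sketch rather than a verified recipe:

        // Web API OData v4 setup: entity set plus one collection-bound function.
        // The function ends up namespace-qualified in the URL, e.g.
        //   /OData/Widgets/Default.GetCompositeThingmajig(id=4)
        // ("Default." unless the builder's Namespace is changed).
        using System.Web.Http;
        using System.Web.OData.Builder;
        using System.Web.OData.Extensions;

        public class Widget
        {
            public int ID { get; set; }
            public string Name { get; set; }
        }

        public static class ODataConfig
        {
            public static void Register(HttpConfiguration config)
            {
                var builder = new ODataConventionModelBuilder();
                builder.EntitySet<Widget>("Widgets");

                // This EDM declaration must match the controller method exactly,
                // or the route quietly fails to resolve.
                builder.EntityType<Widget>()
                       .Collection
                       .Function("GetCompositeThingmajig")
                       .Returns<string>()
                       .Parameter<int>("id");

                config.MapODataServiceRoute("odata", "OData", builder.GetEdmModel());
            }
        }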
  7. "Simple CRUD apps" vs "messaging-oriented DDD apps"
    has become the new Microsoft vs Linux or C# vs Java or SQL vs NoSQL. 

    The war is really ugly. Over the last two or three years people have really been talking about how microservices and reaction-oriented software have turned the software industry upside down. Those who hop on the bandwagon neglect to learn when to choose simpler tool chains for simple jobs; meanwhile, those who refuse to jump on the bandwagon are using some really harsh, cruel words to describe the trend ("idiots", "morons", etc.). We need to learn to love and embrace all of these forms of software, allow them to grow us up, and know when to choose which pattern for which job.
    • Simple CRUD apps can still accomplish most business needs, making them preferable most of the time
      • .. but they don't scale well
      • .. and they require relatively little development knowledge to build and grow
    • Non-transactional message-oriented solutions and related patterns like CQRS-ES scale out well but scale developers' and testers' comprehension very poorly; they have an exponential complexity footprint. But for the thrill seekers they can be, frankly, hella fun and interesting, so long as they are not built upon ancient ESB systems like SAP and so long as people can integrate in software planning war rooms. (A bare-bones sketch of the messaging style follows this list.)
    • Disparate data sourcing as with DDD with partial data replication is a DBA's nightmare. DBAs will always hate it, their opinions will always be biased, and they will always be right in their minds that it is wrong and foolish to go that route. They will sometimes be completely correct.
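    To illustrate what "messaging-oriented" means at its most basic, here is a bare-bones, framework-agnostic sketch (no particular ESB or CQRS library implied): intent is expressed as a command message and routed to a handler, rather than a controller calling CRUD methods directly.

        // Framework-agnostic sketch of the messaging-oriented style.
        public class ShipOrderCommand
        {
            public int OrderId { get; set; }
            public string Carrier { get; set; }
        }

        public interface IHandle<TCommand>
        {
            void Handle(TCommand command);
        }

        public class ShipOrderHandler : IHandle<ShipOrderCommand>
        {
            public void Handle(ShipOrderCommand command)
            {
                // Load the order aggregate, apply the change, then publish an
                // OrderShipped event for other services / read models to react to.
            }
        }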

  8. Integrated functional unit tests are more valuable than TDD-style purist unit tests. That's my new conclusion about developer testing in 2016. The purist TDD mindset still has a role in the software developer's life, but there is real value in automated integration tests, and when things like Entity Framework are heavily in play, apparently it's better to build upon LocalDB automation than upon Moq. (A rough sketch of what that looks like follows below.)
    At least, that’s what my current employer has forced me to believe. Sadly, the purist TDD mindset that I tried to adopt and bring to the table was not even slightly appreciated. I don’t know if I’m going to burn in hell for being persuaded out of a purist unit testing mindset or not. We shall see, we shall see.
  9. I'm hearing some weird and creepy rumors I don't think I like about SQL Server moving to Linux and eventually getting itself renamed. I don't like it, I think it's unnecessary. Microsoft should just create another product. Let SQL Server be SQL Server for Windows forever. Careers are built on such things. Bad Microsoft! Windows 8, .NET Framework version name fiascos, Lync vs Skype for Business, when will you ever learn to stop breaking marketing details to fix what is already successful??!
  10. Speaking of SQL Server, SQL Server 2016 is RTM'd, and full blown SSMS 2016 is free.
  11. On-premises TFS 2015 only just acquired gated check-in build support in a recent update. Seriously, like, what the heck, Microsoft? It's also super buggy; you get a nasty error message in Visual Studio while monitoring its progress. This is laughable.
    • Clear message from Microsoft: "If you want a premium TFS experience, Azure / Visual Studio Online is where you have to go." Microsoft is no longer a shrink-wrapped product company; they sell shrink-wrapped software only for the legacy folks as an afterthought. They are a hosted-platform company now, all the way.
      • This means that Windows 10 machines, including Nokia devices, are moving toward being subscription boxes with dumb client deployments. Boo.
  12. Another rumor I've heard is that Microsoft is going to abandon the game industry.

    The Xbox platform was awesome because Microsoft was all in. But they're not all in anymore, and it shows, and so now as they look at their lackluster profits, what did they expect?
    • Microsoft: Either stay all-in with Xbox and also Windows 10 (dudes, have you seen Steam's Big Picture mode? no excuse!) or say goodbye to the consumer market forever. Seriously. Because we who thrive on the Microsoft platform are also gamers. I would recommend knocking yourselves over to partner with Valve to co-own the whole entertainment world like the oligarchies that you both are, since Valve did so well at keeping the Windows PC relevant to the gaming market.

For the most part I've probably lived under a rock, I'm sure. I've been too busy enjoying my new 2016 Subaru WRX (a 4-door racecar), which I am probably going to sell in the next year because I didn't get out of debt first, but not before getting a Kawasaki Vulcan S ABS Café as my first motorized two-wheeler and riding that between playing Steam games, going camping, and exploring other ways to appreciate being alive on this planet. Maybe someday I'll learn to help the homeless and unfed, as I should. BTW, in the end I happen to know that "love God and love people" are the only two things that matter in life. The rest is fluff. But I'm so selfish; man, do I enjoy fluff. I feel like such a jerk. Those who know me know that I am one. God help me.

[Photos. Top row: fluff that doesn't matter and distracts me from matters of substance (including the 2016 Kawasaki Vulcan S ABS Café). Bottom row: matters of substance.]


Tags:

ASP.NET | Blog | Career | Computers and Internet | Cool Tools | LINQ | Microsoft Windows | Open Source | Software Development | Web Development | Windows

Announcing Fast Koala, an alternative to Slow Cheetah

by Jon Davis 17. July 2015 19:47

So this is a quick FYI for teh blogrollz that I have recently been working on a little Visual Studio extension that will do for web apps what Slow Cheetah refused to do. It enables build-time transformations for both web apps and for Windows apps and classlibs. 

Here's the extension: https://visualstudiogallery.msdn.microsoft.com/7bc82ddf-e51b-4bb4-942f-d76526a922a0  

Here's the Github: https://github.com/stimpy77/FastKoala

Either link will explain more.
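For those who haven't seen them, "build-time transformations" here means the familiar XDT config transforms. A typical transform file looks roughly like this (an illustrative example only; see the links above for how Fast Koala actually wires them into the build):

    <?xml version="1.0"?>
    <!-- Example XDT transform (e.g. Web.Debug.config): at build time, the matching
         appSetting in the base config has its value replaced. -->
    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <appSettings>
        <add key="ApiBaseUrl" value="http://localhost:8080/"
             xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
      </appSettings>
    </configuration>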

Gemli.Data v0.3.0 Released

by Jon Davis 12. October 2009 03:57

I reached a milestone this late night with The Gemli Project and decided to go ahead and release it. This release follows an earlier blog post describing some syntactical sugar that was added along with pagination and count support, here:

http://www.jondavis.net/techblog/post/2009/10/05/GemliData-Pagination-and-Counts.aspx

Remember those tests I kept noting that I still needed to add to test the deep loading and deep saving functionality? No? Well, I remember them. I kept procrastinating on them. And as it turns out, I've discovered that a lesson has been taught to me several times in the past, and again now while trying to implement a few basic tests, a lesson I never realized was being taught to me until finally just this late night. The lesson goes something like this:

If in your cocky self-confidence you procrastinate the tests of a complex system because it is complex and because you are confident, you are guaranteed to watch them fail when you finally add them.

Mind you, the “system” in context is not terribly complex, but I’ll confess there was tonight a rather ugly snippet of code to wade through, particularly while Jay Leno or some other show was playing in the background to keep me distracted. On that latter note, it wasn’t until I switched to ambient music that I was finally able to fix a bug that I’d spent hours staring at.

This milestone release represents 130 total unit tests over the lifetime of the project (about 10-20 or so of them new, not sure exactly how many), not nearly enough, but enough to feel a lot better about how Gemli is shaping up. I still need to add a lot more tests, but what these tests exposed was a slew of buggy relationship inferences that needed to be taken care of. I felt the urgency to release now rather than continue adding tests first because the previous release was just not suitably functional on the relationships side.

 

Gemli Project v0.2 Released

by Jon Davis 21. September 2009 03:57

Earlier this late night I posted the first major overhaul update of Gemli since a month ago. This brings a huge number of design fixes and new features to Gemli.Data and makes it even more usable. It's still alpha, though, as there's a lot more testing to be done, not to mention still some new features to be added.

http://gemli.codeplex.com/Release/ProjectReleases.aspx?ReleaseId=33281

From the release notes:

  • Added to DataProviderBase: SupportsTransactions { get; } , BeginTransaction(), and BeginTransaction(IsolationLevel)
  • Facilitate an application default provider: Gemli.Data.Providers.ProviderDefaults.AppProvider = myDBProvider .. This allows DataModels and DataModelCollections to automatically use the application's data provider when one hasn't already been associated with the model (so you can, for example, invoke .Save() on a new model without setting its provider manually)
  • Replace (and drop) TypeConvertor with DbTypeConverter.
  • Operator overloads on DataModelQuery; you can now use == instead of IsEqualTo(), but this comes at the cost of chainability. (IsEqualTo() is still there, and still supports chaining.)
  • Added SqlDbType attribute property support. DataType attribute value is no longer a DbType and is instead a System.Type. The properties DbType and SqlDbType have been added, and setting any of these three properties will automatically translate and populate the other two.
  • WhereMappedColumn is now WhereColumn
  • Facilitate optional behavior of inferring from all properties not marked as ignore, rather than the behavior of inferring from none of the unattributed properties unless no properties are attributed. The latter behavior--the old behavior--currently remains the default. The new behavior can be applied with DataModelTableMappingAttribute's PropertyLoadBehavior property
    • This may change to where the default is InferProperties.NotIgnored (the new behavior), still pondering on this.
  • Add sorting to MemoryDataProvider
  • TempMappingTable is now DataModelMap.RuntimeMappingTable
  • ForeignDataModel.ForeignXXXX was backwards and is now RelatedXXXX (multiple members)
  • DataModelFieldMappingAttribute is now DataModelColumnAttribute.
  • DataModelMap.FieldMappings is now DataModelMap.ColumnMappings.
  • DataModelTableMappingAttribute is now DataModelTableAttribute.
  • DataModelForeignKeyAttribute is now ForeignKeyAttribute.
  • The XML serialization of all elements now follows these renamings and is also now camelCased instead of ProperCase.
  • XML loading of mapping configuration is getting close, if it isn't already there. Need to test.
  • Added DataModelMap.LoadMappings(filePath)


Tags:

C# | Cool Tools | Pet Projects | Software Development

PowerShell 2 In Windows 7 Comes With A Windows Shell

by Jon Davis 1. September 2009 01:48

Here’s something I overlooked about Windows 7 RTM. Not only does it come with PowerShell v2 (I didn’t overlook that) but it also comes with an “ISE”—an Integrated Scripting Environment. The “ISE” gives you three “panes” or sub-windows to work with PowerShell from within a single containing window: an input console, an output console, and a syntax highlighting text editor for script editing and debugging.


It does let you specify a layout. However, firing this thing up I immediately felt like I was stuck in Windows Live applications’ CandyLand. It has a notably consumer feel, and I’m afraid that system administrators and developers will tend to shy away from it simply because of that. Why Microsoft didn’t just reuse their Visual Studio Integrated Shell is beyond me.

Nonetheless, this is a nice addition to the Windows 7 and PowerShell combination/suite and will no doubt prove to be very handy for those who want to casually tinker with PS scripting without several different windows open or dishing out dough for the basic functionality of a PS script debugger.

Another much wanted feature finally arrives: Remote PS Shells

 

I’m still eyeing PowerShell Plus, albeit just a tiny fraction of a hair less now because of this.

CAPTCHA This, Yo! - Early Alpha

by Jon Davis 14. July 2009 01:54

I've posted a mini-subproject:

http://www.CAPTCHAThisYo.com/

The site is self-explanatory. The idea is simple. I want CAPTCHA. I don't want to support CAPTCHA in my apps. I just want to drop in a one-liner snippet somewhere and call it done. I think other people share the same desire. So I now support CAPTCHA as a CAPTCHA app. I did all the work for myself so that I don't have to do that work. I went through all that trouble so that I don't have to go through the trouble .... Wait, ...

Seriously, it's not typical CAPTCHA, and it's Not Quite Done Yet (TM). It's something that'll evolve. Right now there isn't even any hard-to-read graphic CAPTCHA.

But what I'd like to do is have an ever-growing CAPTCHA questions library and, by default, have it just rotate through them randomly. The questions might range from shape detection to riddles to basic math. I'd really like to have some kind of community uploads thingmajig with ratings, so that people can basically share their own CAPTCHA solutions and they all run inside the same captchathisyo.com CAPTCHA engine. I'm just not sure yet how to pull that off.

Theoretically, I could take the approach Microsoft took when C# was initially released (long, long ago, in a galaxy far, far away): they had a cool insect sandbox game where you could write a .NET class that implements some interface and then send it up to the server, and it would just run as an insect on the server. The objective of the game was to write the biggest killer/eater. I'm not sure how feasible the idea is of opening up all .NET uploads to the server, but it's something I'm pondering.

Anyway, the concept has been prototyped, and the deployed prototype is sound, but I still need to work out cross-site scripting limitations, so bear with me. I also still need to find a designer to make something beautiful out of it. That said, feel free to use it and give feedback. Stand by.


Tags:

C# | Computers and Internet | Cool Tools | Pet Projects | Techniques | Web Development

Quickie Alarm

by Jon Davis 13. May 2009 23:39

I flew out to California on Friday night to visit my siblings, and when I arrived I rented a car and drove around in the middle of the night looking for a cheap place to sleep. After an hour of getting completely lost I finally found a Motel 6 (*shudder*), climbed into bed, and then realized that there’s no alarm clock in the room. Great.

So I threw together another alarm clock on my laptop. I went through the trouble of giving it a nice touch, so it took an hour or two rather than a minute or two, but it did the job and I was proud and still managed to sleep well.

And now I’m sharing it here online. This is freeware with source code included. (Uses free Developer Express controls.)


.. click ‘Go’ and ..


Download: http://www.jondavis.net/codeprojects/QuickieAlarm/QuickieAlarm.zip

Nine Reasons Why 8GB Is Only Just Enough (For A Professional Business Software Developer)

by Jon Davis 13. February 2009 21:29

Today I installed 8GB on my home workstation/playstation. I had 8GB lying around already from a voluntary purchase for a prior workplace (I took my RAM back and put the work-provided RAM back in before I left that job), but that brand of RAM didn’t work correctly on my home PC’s motherboard. It’s all good now, though: with some high-quality performance RAM from OCZ, my Windows 7 system self-rating on RAM I/O jumped from 5.9 to 7.2.

At my new job I had to request a RAM upgrade from 2GB to 4GB. (Since it’s 32-bit XP I couldn’t go any higher.) When I asked when 64-bit Windows Vista or Windows 7 would be put on the table for consideration as an option for employees, I was told “there are no plans for 64-bit”.

The same thing happened with my last short-term gig. Good God, corporate IT folks everywhere are stuck in the year 2002. I can barely function at 4GB, can’t function much at all at 2GB.

By quadrupling the performance of your employee's system, you’d effectively double the productivity of your employee; it’s like getting a new employee for free.

If you are a multi-role developer and aren’t already saturating at least 4GB of RAM, you are throwing away your employer’s money, and if you are IT and not providing at least 4GB RAM to developers and actively working on adding corporate support for 64-bit for employees’ workstations, you are costing the company a ton of money due to productivity loss!! I don’t know how many times I’ve seen people restart their computers or sit and wait for 2 minutes for Visual Studio to come up because their machine is bogged down on a swap file. That was “typical” half a decade ago, but it’s not acceptable anymore. The same is true of hard drive space. Fast 1 Terabyte hard drives are available for less than $100 these days; there is simply no excuse. For any employee who makes more than X (say, $25,000), for Pete’s sake, throw in an extra $1000-$2000 or so more and get the employee two large (24-inch) monitors, at least 1TB hard drive(s) (ideally 4 drives in a RAID-0+1 array), 64-bit Windows Server 2008 / Windows Vista / Windows 7, a quad-core CPU, and 8 GB of some high performance (800+ MHz) RAM. It’s not that that’s another $2,000 or so to lose; it’s that just $2,000 will save you many thousands more. By quadrupling the performance of your employee's system, you’d effectively double the productivity of your employee; it’s like getting a new employee for free. And if you are the employee, making double of X (say, more than $50,000), and if your employer could somehow allow it (and they should, shame on them if they don’t and they won’t do it themselves), you should go out and get your own hardware upgrades. Make yourself twice as productive, and earn your pay with pride.

In a business environment, whether one is paid by the hour or salaried (already expected to work X hours a week, which is effectively loosely translated to hourly anyway), time = money. Period. This is not about developers enjoying a luxury, it’s about them saving time and employers saving money.

Note to the morons who argue “this is why developers are writing big, bloated software that suck up resources” .. Dear moron, this post is from the perspective of an actual developer’s workstation, not a mere bit-twiddling programmer—a developer, that is, who wears many hats and must not just write code but manage database details, work with project plans, document technical details, electronically collaborate with teammates, test and debug, etc., all in one sitting. Nothing in here actually recommends or even contributes to writing big, bloated software for an end user. The objective is productivity, your skills as a programmer are a separate concern. If you are producing bad, bloated code, the quality of the machine on which you wrote the code has little to nothing to contribute to that—on the contrary, a poor developer system can lead to extremely shoddy code because the time and patience required just to manage to refactor and re-test become such a huge burden. If you really want to test your code on a limited machine, you can rig VMWare / VirtualPC / VirtualBox to temporarily run with lesser RAM, etc. You shouldn’t have to punish yourself with poor productivity while you are creating the output. Such punishment is far more monetarily expensive than the cost of RAM.

I can think of a lot of reasons for 8+ GB RAM, but I’ll name a handful that matter most to me.

  1. Windows XP / Server 2003 alone takes up half a gigabyte of RAM (Vista / Server 2008 takes up double that). Scheduled tasks and other processes cause the OS to peak out at some 50+% more. Cost: 512-850MB. Subtotal @nominal: ~512MB; @peak: 850MB
  2. IIS isn’t a huge hog but it’s a big system service with a lot of responsibility. Cost: 50-150. Subtotal @nominal: ~550MB; @peak 1GB.
  3. Microsoft Office and other productivity applications often need to be used more than one at a time, as needed. For more than two decades, modern computers have supported a marvelous feature called multi-tasking. This means that if you have Outlook open, and you double-click a Microsoft Word attachment, and upon reading it you realize that you need to update your Excel spreadsheet, which in your train of thought leads you to update an Access database, and then you realize that these updates result in a change of product features so you need to reflect these details in your PowerPoint presentation, you should have been able to open each of these applications without missing a beat, and by the time you’re done you should be able to close all these apps in no more than one passing second per click of the [X] close button of each app. Each of these apps takes up as much as 100MB of RAM, Outlook typically even more, and Outlook is typically always open. Cost: 150-1GB. Subtotal @nominal: 700MB; @peak 2GB.
  4. Every business software developer should have his own copy of SQL Server Developer Edition. Every instance of SQL Server Developer Edition takes up a good 25MB to 150MB of RAM just for the core services, multiplied by each of the support services. Meanwhile, Visual Studio 2008 Pro and Team Edition come with SQL Server 2005 Express Edition, not 2008, so for some of us that means two installations of SQL Server Express. Both SQL Server Developer Edition and SQL Server Express Edition are ideal to have on the same machine since Express doesn’t have all the features of Developer and Developer doesn’t have the flat-file support that is available in Express. SQL Server sitting idly costs a LOT of CPU, so quad core is quite ideal. Cost: @nominal: 150MB, @peak 512MB. Subtotal @nominal: 850MB; @peak: 2.5GB. We haven’t even hit Visual Studio yet.
  5. Except in actual Database projects (not to be confused with code projects that happen to have database support), any serious developer would use SQL Server Management Studio, not Visual Studio, to access database data and to work with T-SQL tasks. This would be run alongside Visual Studio, but nonetheless as a separate application. Cost: 250MB. Subtotal @nominal: 1.1GB; @peak: 2.75GB.
  6. Visual Studio itself takes the cake. With ReSharper and other popular add-ins like PowerCommands installed, Visual Studio just started up takes up half a gig of RAM per instance. Add another 250MB for a typical medium-size solution. And if you, like me lately, work in multiple branches and find yourself having to edit several branches for different reasons, one shouldn’t have to close out of Visual Studio to open the next branch. That’s productivity thrown away. This week I was working with three branches; that’s 3 instances. Sample scenario: I’m coding away on my sandbox branch, then a bug ticket comes in and I have to edit the QA/production branch in an isolated instance of Visual Studio for a quick fix, then I get an IM from someone requesting an immediate resolution to something in the developer branch. Lucky I didn’t open a fourth instance. Eventually I can close the latter two instances down and continue with my sandbox environment. Case in point: Visual Studio costs a LOT of RAM. Cost @nominal 512MB, @peak 2.25GB. Subtotal @nominal: 1.6GB; @peak: 5GB.
  7. Your app being developed takes up RAM. This could be any amount, but don’t forget that Visual Studio instantiates independent web servers and loads up bloated binaries for debugging. If there are lots of services and support apps involved, they all stack up fast. Cost @nominal: 50MB, @peak 750MB. Subtotal @nominal: 1.65GB; @peak: 5.75GB.
  8. Internet Explorer and/or your other web browsers take up plenty of RAM. Typically 75MB for IE to be loaded, plus 10-15MB per page/tab. And if you’re anything like me, you’ll have lots and lots and LOTS of pages/tabs by the end of the day; by noon I typically end up with about four or five separate IE windows/processes, each with 5-15 tabs. (Mind you, all or at least most of them are work-related windows, such as looking up internal/corporate documents on the intranet or tracking down developer documentation such as API specs, blogs, and forum posts.) Cost @nominal: 100MB; @peak: 512MB. Subtotal @nominal: 1.75GB; @peak: 6.5GB.
  9. No software solution should go untested on as many platforms as is going to be used in production. If it’s a web site, it should be tested on IE 6, IE 7, and IE 8, as well as current version of Opera, Safari 3+, Firefox 1.5, Firefox 2, and Firefox 3+. If it’s a desktop app, it should be tested on every compatible version of the OS. If it’s a cross-platform compiled app, it should be tested on Windows, Mac, and Linux. You could have an isolated set of computers and/or QA staff to look into all these scenarios, but when it comes to company time and productivity, the developer should test first, and he should test right on his own computer. He should not have to shutdown to dual-boot. He should be using VMWare (or Virtual PC, or VirtualBox, etc). Each VMWare instance takes up the RAM and CPU of a normal system installation; I can’t comprehend why it is that some people think that a VMWare image should only take up a few GB of hard drive space and half a gig of RAM; it just doesn’t work that way. Also, in a distributed software solution with multiple servers involved, firing up multiple instances of VMWare for testing and debugging should be mandatory. Cost @nominal: 512MB; @peak: 4GB. Subtotal @nominal: 2.25GB; @peak: 10.5GB.

Total peak memory (64-bit Vista SP1 which was not accounted in #1): 11+GB!!!

Now, you could argue all day long that you can “save money” by shutting down all those “peak” processes to use less RAM rather than using so much. I’d argue all day long that you are freaking insane. The 8GB I bought for my PC cost me $130 from Dell. Buy, insert, test, save money. Don’t be stupid and wasteful. Make yourself productive.

EntitySpaces 2009 Q1 WCF Demo

by Jon Davis 25. January 2009 16:45

I created a new WCF demo for EntitySpaces, one of the most popular ORM solutions available for .NET which now comes with its own code generator (no longer relies on CodeSmith or myGeneration). The demo is bundled in the Release Candidate for v2009 Q1. (The developer version is released, trial version will be released tomorrow.) This one includes both console and Windows Forms clients, and a console-based service, for showing the barebones basics of what it takes to get EntitySpaces working with WCF. Both full proxies (EntitySpaces runtime libraries referenced on the client) and lightweight proxies/stubs (*no* EntitySpaces runtime libraries referenced on the client) are demonstrated, but the lightweight demo is currently limited to a console app.

Next on my plate will be a WPF demo for the lightweight proxies/stubs. No guarantees...

Anyway, here’s the documentation that went with the demo. It got posted on the EntitySpaces blog.

http://www.entityspaces.net/blog/2009/01/25/EntitySpaces+2009+And+WCF.aspx


Tags:

C# | Cool Tools | Pet Projects | Software Development | SQL Server | WCF

SiteCore: Is the ultimate CMS also the ultimate ASP.NET web app in general?!

by Jon Davis 5. November 2008 20:11

 Important note: Please read this article with a grain of salt and note my follow-up post.

At my previous job as a web developer at a B2C magazine publisher, the developer team I was on was looking for CMS solutions that could be piggybacked to quickly build and sell web sites for advertisers: like microsites, except scalable enough to easily integrate e-commerce and educational content. For full-out magazine content, the company had been using Krang, a CMS written in Perl for MySQL, which as such was not a consideration for this initiative since we were ASP.NET developers. We looked at a number of smallish CMS systems for ASP.NET -- Umbraco, Graffiti, the SubSonic web starter kit, etc. Most of these were laughable as options; even Umbraco had to be shrugged off. DotNetNuke wasn't an option because it's a prefab portal, not really a CMS, and frankly it's embarrassing (read: ugly). SharePoint wasn't even worth considering because it, too, is a portal, and such a bear and a weird mutt of poorly integrated technologies and technology designs; plus it's an installation nightmare, and we didn't have the hours it takes to get the thing installed when we knew we wouldn't likely like it. We also considered CMSs using "wiki" nomenclature, pseudo-wikis if you will. But those were not appropriate for the job.

But I recently began a job transition to another magazine publisher, this one B2B. The move was mostly coincidence, but they had the same need for a CMS--though on a larger scale, i.e. for their magazine web sites, not for microsites--and had already made the decision before I was interviewed. They'd researched many, many CMSs and considered all of them, but the one that won in their research, hands down, was one I'd never heard of called SiteCore.


SiteCore's desktop uses a XAML-to-XHTML engine called Sheer UI to create a beautiful content editor and developer user interface.

I'm not entirely sure why I hadn't heard of SiteCore; they've actually been around since ASP.NET v1.0. But I believe it probably has to do with the fact that they've apparently gone through a very large number and extent of overhauls and evolutions, such that those who evaluated SiteCore even recently might not recognize what sort of monumental, breathtaking system this puppy is today.

I just completed training and am now a certified SiteCore developer. But before I took the training I was completely stoked about what was coming--the opportunity to work with this software.

SiteCore v6 is built upon ASP.NET 3.5, and it is impressively respectful of maintaining an ASP.NET-oriented developer/extensibility workflow. That is to say, unlike other CMS's that try to break out of the box by using IronPython scripts (like Umbraco), odd proprietary XML markup (SharePoint's CAML), or burial in a virtual machine image with a PHP front-end (like MindTouch Deki, yuck), SiteCore allows you to extend the CMS using your choice of ASPX pages with placeholders, ASCX controls with placeholders, C# code-only web controls, and .NET-extendable XSLT templates. Everything else, from articles to data records to drop-down options to security features to workflow states, is just an "item", which is just a tree node with configurable settings. From what I can tell, on the developer & administrator end there aren't a lot of .NET 3.5 features in use right now--no LINQ, for example--but that's not the point. The point is that SiteCore makes the perfect CMS core of a site that you can extend with ASP.NET 3.5 with ease and without having to shoehorn anything, because SiteCore is built to be respectful of pure ASP.NET. Even the pipelines and events are configurable and extensible, as all of the pipeline stages and event handlers are declared in a huge web.config. The web.config file also exposes niceties like Lucene.net for the content developers, and multi-site configurations. And yes, you use Visual Studio 2008 to extend a SiteCore-based site.

And these are just the fundamental aspects of what makes this product so awesome. Wait till you see the fluff!! SiteCore exposes to content developers, editors, and designers a user interface so profoundly rich it would make the Microsoft Office Live team pee in their pants. (And being that Microsoft is no stranger to SiteCore, they being a SiteCore customer, I'm sure they did exactly that.) SiteCore uses a proprietary dialect of XAML called "Sheer UI" that it converts to XHTML+JavaScript to emit a gorgeous Windows XP-like desktop with a Start menu and everything. The point of this is not to 'wow' (which it does anyway) but to provide a focused workspace of tools that can be used to access CMS content, data structures ("data templates"), workflow management, security settings, and new feature development tools.

The best part about SiteCore, which I just found out about today, is that there is a free, one-user version you can download called SiteCore Xpress and play with right now.

The funny thing is, there are really very few "bad smells" with this system. The whole thing just feels right. It was designed right. It doesn't feel awkward, kludgy, or "shortcutted" in any way. If it does smell, it has more of a new car smell than a body odor smell.

They say a picture is worth a thousand words, but then a video is worth a million.

Now, be aware, the pro versions of SiteCore don't come cheap. In my opinion, you get what you pay for, because this product is just so profoundly good; it's kinda like Apple's expensive products in that it offers a fixed set of functions done right, with no weirdness, just goodness. Where SiteCore differs from Apple, though, is in its completeness. After sitting through 3 days of training this week I kept having to pull up my jaw, because the level of detail those guys went through to cover all aspects of real CMS, and yet to do it cleanly without making a mess, is just .. plain .. insane. I mean, just to throw some examples out there ...

  • Tweakable URL routing
  • *Real* content versioning, with a side-by-side diff tool.
  • Extensive (i.e. unlimited) multi-language support for content (and multi-lingual content editing / development in a few different prefab languages)
  • Extensive ACL-like security on a per-content-item basis
  • Field-level security on content fields!
  • Multiple inheritance for tree items!
  • Consolidated editing and publishing environment (that can be optionally staged in isolation)
  • Configurable workflows (i.e. approval processes)
  • Application-extensible -- build your own applications for the SiteCore desktop experience (for content editors to experience) using simple .ASPX files (no Sheer UI XML required, that stuff just runs the core)
  • Per-item performance metering (i.e. in a "tooltip window", this content item takes 10ms to render)
  • On-the-fly field editing while in preview mode
  • An API fully exposing everything you need to dynamically create or access CMS data (see the sketch after this list)
  • A nice "developer network" web site with detailed documentation and guides
  • A security model that even lets you use Active Directory!

.. the list would just go on and on.
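As a tiny illustration of that last point about the API, programmatic item access looks roughly like this (a from-memory sketch; the path and field name are just examples):

    // Everything in SiteCore is an "item": a tree node addressable by path or ID
    // through the context database, with fields accessible by name.
    using Sitecore.Data.Items;

    public class HomePageTitleReader
    {
        public static string GetHomePageTitle()
        {
            // Sitecore.Context.Database resolves to the appropriate database
            // (e.g. web or master) for the current site context.
            Item home = Sitecore.Context.Database.GetItem("/sitecore/content/Home");
            return home["Title"];
        }
    }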

The best part about SiteCore, which I just found out about today, is that there is a free, one-user version you can download called SiteCore Xpress and play with right now. From what I can tell, there are no other strings attached, i.e. no logo or ad requirements and no trial period, other than it only allows for one user (administrator), can only run on one server, and isn't licensed for commercial use. [More info] Unfortunately the one other down side (and possible deal-killer, but not likely) is that the Xpress version is v5.3, not v6. But at least it's v5.3 and not v5.

From what I've learned, v5.3 was about the time when SiteCore really reached a turning point and started making big inroads into U.S. sales. (SiteCore is apparently developed in Denmark.) v5.3 being the last product update before the very recently released (Sep '08?) v6, it's still a very good product. I didn't use v5.3, but from what I know, I believe the things that are different between v5.3 and v6 are:

  • v5.3 is based on .NET 3.0 / ASP.NET 2.0, not .NET 3.5 / ASP.NET 3.5.
  • v5.3 has a proprietary thing called "masters", which were dropped in v6. In v5.3, new items had to be instantiated with templates+masters, but in v6 only templates are required.
  • The security subsystem might've been overhauled or tweaked out, such as its newly revised configuration UI.
  • v5.3 had huge performance improvements over previous versions. v6 probably has huge performance improvements on top of those, particularly as the core now takes advantage of ASP.NET 3.5 (if indeed it does, I don't know)
  • A ridiculous bug was fixed in v6 where, in v5.3, cached content would process the layout anyway before returning the cached version (duh); rumor is SiteCore isn't owning up to fixing it in v5.x.
  • v6 has a slick optional Page Editor which is basically a preview of the actual page (with formatting) but with inline editable regions that actually edit the CMS items on the fly.
  • v6 has a new validation component (to slap your content editors around a bit with a fish)
  • v6 has a new Grid Designer that's a nice addition but really isn't necessary

There's a review on v6 over here: http://www.cmswire.com/cms/web-cms/web-cms-sitecore-6-customer-driven-usability-002840.php

Anyway, this isn't a sales pitch, as I'm just a new user, not a business partner (or at least, not yet! *grin*), but it's something I wanted to share. I love cool developer technology, particularly for .NET, and of all the ASP.NET solutions I've seen on the market so far, open source or commercial, CMS or otherwise, I think SiteCore v6 sets the bar for all web applications to try to measure up to. Literally. And if you don't believe me, go download SiteCore Xpress and see for yourself.



Tags:

Cool Tools | Web Development


 


About the author

Jon Davis (aka "stimpy77") has been a programmer, developer, and consultant for web and Windows software solutions professionally since 1997, with experience ranging from OS and hardware support to DHTML programming to IIS/ASP web apps to Java network programming to Visual Basic applications to C# desktop apps.
 
Software in all forms is also his sole hobby, whether playing PC games or tinkering with programming them. "I was playing Defender on the Commodore 64," he reminisces, "when I decided at the age of 12 or so that I want to be a computer programmer when I grow up."

Jon was previously employed as a senior .NET developer at a very well-known Internet services company whom you're more likely than not to have directly done business with. However, this blog and all of jondavis.net have no affiliation with, and are not representative of, his former employer in any way.


