Paying Attention To DinnerNow.net - Microsoft Killer App Best Practices Tutorial for Everything .NET 3.0

by Jon Davis 31. July 2007 18:01

I've been noticing my RSS feeds from Channel9 plugging some "DinnerNow" thing that sounded like some third-party company sharing an architectural trick up its sleeve, so I kinda shrugged it off. After all, how many software tricks are up people's sleeves at sourceforge.net or elsewhere?

But being a big new fan of ARCast.TV podcasts, I came across the Architecture Overview and Workflow piece, which I didn't realize was Part 1 of the DinnerNow bits until I started listening. Listening to it now, I'm realizing that this is something rather special.

DinnerNow, which I've only just downloaded and haven't actually seen yet (I'm only hearing these guys talk about it in the podcast), is apparently NOT a Starter Kit-style mini-solution like IBuySpy was (that dorky little shopping cart web site starter kit that Microsoft hacked together in ASP.NET back in the v1.1 days as a proof of concept).

Rather, DinnerNow is a full-blown software sample of an online restaurant food ordering web service application, one that I had wanted to build commercially for years (along with a hundred other ideas I had). It is a top-to-bottom, soup-to-nuts, thorough and complete implementation of the entire solution--the servers, the restaurants, the buyer--demonstrating all of the awesome components of the latest long-released .NET Framework technologies, including:

  • Windows Communication Foundation (WCF) on the restaurant and web server
  • Windows Workflow Foundation (WF) on the restaurant and web server
  • Windows Presentation Foundation (WPF) for the restaurant (kiosk @ kitchen)
  • ASP.NET AJAX 1.0 and JSON-based synchronization
  • CardSpace for user security
  • PowerShell for things like administrative querying
  • Microsoft Management Console (MMC) for administrative querying, graphically
  • Windows Mobile / .NET Compact Framework
  • Windows Management Instrumentation (WMI)

Ron Jacobs (the ARCast host) introduced this well with a statement I agree with, something along the lines of, "As architects, what we often try to do is look at someone else's software that was written successfully," and learn and discover new and/or best practices from it. In fact, that's probably the most important task in the learning process. If learning how to learn is essential in software architecture, then learn this essential point: both understanding new technology and discovering best practices come from seeing it for yourself.

Update: Ugh. With this many technologies being featured come many prerequisites. I had all of the above, but the setup requires everything to be exact. In other words, start with a fresh Virtual Machine running 32-bit Vista and add the prerequisites. XP? Screwed. Win2003? Screwed. 64-bit? Screwed. Orcas Beta 2 installed? Screwed. And so the screwing goes. An optional Virtual PC image from MSDN Premium downloads with everything installed would have been nice. And I'm not hearing anything from the CodePlex forum / Issue Tracker, not a peep from its "maintainers". I think this project was abandoned. What bad timing, to put it on ARCast.tv now...

Another update: Microsoft did post a refresh build for Visual Studio 2008 Beta 2. I noticed it on CodePlex, and they also started replying (for a change) to the multiple discussion threads and Issue Tracker posts there. Among those replies were comments about 64-bit support: not supported, by design [which I knew], but they said that with workarounds [which I posted multiple reports about] a 64-bit OS target build should be possible. I saw all this a few days ago; I should have posted this follow-up earlier, but it seems a Microsoft employee posted a comment here first. (Sorry.)

With passion comes passionate resentment when things go wrong. It's a fair give and take; on the other hand, whatever.

Multi-line SQL strings in C# and Stored Procs

by Jon Davis 25. July 2007 10:20

Here's a tip. When dealing with SQL strings, try to minimize the multi-line delimiters like quotation marks and concatenators, so that you can easily read the SQL string.

Bad: 

            string sBrokerSQL = "SELECT MemberId, UserId, PackageConfigId"
                + " FROM Broker"
                + " WHERE"
                + "     PushToSystem = 1"
                + "     AND UserId IS NOT NULL";

Worse:

            StringBuilder sbSql = new StringBuilder();
            sbSql.Append("SELECT MemberId, UserId, PackageConfigId");
            sbSql.Append(" FROM Broker");
            sbSql.Append(" WHERE");
            sbSql.Append("      PushToSystem = 1");
            sbSql.Append("      AND UserId IS NOT NULL");

Better:

            string sBrokersSQL = @"
                SELECT MemberId, UserId, PackageConfigId
                FROM Broker
                WHERE
                    PushToSystem = 1
                    AND UserId IS NOT NULL";

With the last sample, you can copy and paste the SQL string right into SQL Server Management Studio (or, for SQL 2000 users, into SQL Server Query Analyzer). This formatting is perfectly valid, and SQL doesn't care about extra whitespace, as long as the extra spaces don't land inside a SQL-quoted value (which should be passed with SqlParameter anyway). If you need to add concatenated bits, you can either use String.Format, or you can break out and restart a "@" sequence again.

            string sBrokersSQL = @"
                SELECT MemberId, UserId, PackageConfigId
                FROM Broker
                WHERE
                    PushToSystem = " + iPush + @"
                    AND UserId IS NOT NULL";

            string sBrokersSQL = String.Format(@"
                SELECT MemberId, UserId, PackageConfigId
                FROM Broker
                WHERE
                    PushToSystem = {0}
                    AND UserId IS NOT NULL", iPush);
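Since the values in the WHERE clause should really be passed as SqlParameters anyway, here's a minimal sketch of how the verbatim-string style combines with a parameterized SqlCommand. (The connection string here and the Broker table are stand-ins for illustration, not from any real project.)

            using System;
            using System.Data.SqlClient;

            class BrokerQuery
            {
                static void Main()
                {
                    // Hypothetical connection string; substitute your own.
                    string connString = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True";

                    string sBrokersSQL = @"
                        SELECT MemberId, UserId, PackageConfigId
                        FROM Broker
                        WHERE
                            PushToSystem = @PushToSystem
                            AND UserId IS NOT NULL";

                    using (SqlConnection conn = new SqlConnection(connString))
                    using (SqlCommand cmd = new SqlCommand(sBrokersSQL, conn))
                    {
                        // The value travels as a typed parameter rather than being
                        // concatenated into the SQL text.
                        cmd.Parameters.AddWithValue("@PushToSystem", 1);
                        conn.Open();
                        using (SqlDataReader reader = cmd.ExecuteReader())
                        {
                            while (reader.Read())
                            {
                                Console.WriteLine(reader["MemberId"]);
                            }
                        }
                    }
                }
            }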
 

Likewise, you can follow the same principle when working with dynamically generated SQL in a stored procedure. When concatenating strings to build a SQL query at runtime, a single-quoted T-SQL string literal can span line breaks without raising an error the way, say, Visual Basic would. It behaves similarly to a C# literal with the "@" prefix.


NUnit Test Case

by Jon Davis 25. July 2007 00:19

So after going through that primer, I snagged some very interesting points about what a "test case" is and what rules a test case abides by, thereby qualifying a snippet of code as a usable unit test for NUnit (a minimal example follows the list):

  • A test case is a programmer test (think low-level or class-level, as in programmatic).
  • A test case is a self-validating test (in NUnit, using Assert).
  • A test case can be automatically discovered by a test runner (in C# / NUnit, using attributes like [Test]).
  • A test case can be executed independently of other test cases.
  • Test cases are units of organization and execution, and are grouped into test suites.
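
To make those rules concrete, here's a minimal sketch of what such a test case can look like in C# with NUnit. The Calculator class and its Add method are made up for illustration; they aren't from the book or from NUnit itself.

    using NUnit.Framework;

    // A made-up class under test, just for illustration.
    public class Calculator
    {
        public int Add(int a, int b)
        {
            return a + b;
        }
    }

    [TestFixture]   // lets the test runner discover this class
    public class CalculatorTests
    {
        [Test]      // marks a discoverable, independently runnable test case
        public void Add_TwoPositiveNumbers_ReturnsSum()
        {
            Calculator calc = new Calculator();  // no reliance on other test cases
            Assert.AreEqual(5, calc.Add(2, 3));  // self-validating via Assert
        }
    }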


Test-Driven Development in Microsoft .NET (Ch. 1)

by Jon Davis 24. July 2007 21:45

Last night I read the first chapter of Agile Principles, Patterns, and Practices in C#. Summary:  Essentially, the Agile Manifesto, expounded.

This afternoon I just cracked open the introduction and Chapter 1 of Test-Driven Development in Microsoft .NET (Microsoft Professional). Summary: The basic, trivialized definition of TDD is 1) never write code without starting with a failing test, and 2) never repeat yourself. Test-driven development is apparently not about sitting around testing code like a QA engineer stuck in hell, like I thought it would be. It's about writing code in tiny increments until each increment is working flawlessly, virtually eliminating the process of debugging. (Funny, that sounds very similar to a quote I saw after installing #develop a year or two ago. Something like: "The best way to write code is always to start with one small requirement, test it thoroughly, then add to it and repeat the process, until all requirements are complete.")

The section called "Red/Green/Refactor" describes the cycle (I've seen this before, in a diagram in the Presenter First PDF from Atomic Object); a tiny example of the first pass through it follows the list:

  1. Write the test code.
  2. Compile the test code. (It should fail because you haven't implemented anything yet.)
  3. Implement just enough to compile.
  4. Run the test and see it fail.
  5. Implement just enough to make the test pass.
  6. Run the test and see it pass.
  7. Refactor for clarity and to eliminate duplication.
  8. Repeat from the top.
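
As an illustration of that first pass (using made-up names, not an example from the book), the test below is written before the class it exercises exists, the class is then stubbed out just enough to compile, and the hard-coded stub guarantees the first run fails before you make it pass:

    using NUnit.Framework;

    [TestFixture]
    public class SimpleQueueTests
    {
        [Test]
        public void NewQueue_IsEmpty()
        {
            // Step 1: write the test first. Until SimpleQueue exists,
            // step 2 (compiling the test code) fails.
            SimpleQueue queue = new SimpleQueue();
            Assert.IsTrue(queue.IsEmpty);
        }
    }

    // Step 3: implement just enough to compile. The hard-coded false makes
    // step 4 (run the test and see it fail) happen on purpose; step 5 is
    // changing it to the simplest thing that passes.
    public class SimpleQueue
    {
        public bool IsEmpty
        {
            get { return false; }
        }
    }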

This is an older book, a couple of years old, but still relevant. It focuses on NUnit, which everyone is still using, although today I discovered MbUnit, which seems to have more features; some recommend starting with NUnit until you need some of MbUnit's extras. I've also installed TestDriven.net, but I'm not sure what it offers yet.

Tonight before I get out of here (I'm stuck at the office and it's 10pm..) I am going to go through Appendix A of this TDD book, which is an NUnit primer.

Update: *click*. This happened to me when I realized that programmer tests are executed in bulk with the click of a button, and grow as your project grows, so if requirements change you can see what a tweak breaks just by re-running your programmer tests. No more manual trudging, no more "if it breaks, enter debug mode" start-to-finish manual tests. Those still have to happen, too, but before they do you can quickly and easily see exactly where the red flags are raised.


PowerShell Community Extensions on x64

by Jon Davis 24. July 2007 21:39

I ran into this error after installing PowerShell Community Extensions on Windows Server 2003 R2 (64-bit):

    Cannot load Windows PowerShell snap-in [..] because of the following error:
    No Windows PowerShell Snap-ins are available for version 1. 			

This is one of the lamest error messages in recent history. The solution was found here: http://www.eggheadcafe.com/software/aspnet/30117657/powershell-on-64bit-serv.aspx

.. where I just swapped out the referenced DLL with C:\Program Files (x86)\PowerShell Community Extensions\Pscx.dll, and then again with C:\Program Files (x86)\PowerShell Community Extensions\Pscx.Core.dll:

    C:\> C:\Windows\Microsoft.NET\Framework64\v2.0.50727\installutil.exe 
         "C:\Program Files (x86)\PowerShell Community Extensions\Pscx.dll" 

In that directory I found the post-install .bat file that attempted to do this. Apparently, it failed.


Paying Attention to MVP, Presenter First, Castle, MonoRail, Igloo, et al

by Jon Davis 23. July 2007 00:34

In order to be an excellent software developer, one must be able to communicate and interoperate with others' software excellently. Lately, deafening chatter has been overwhelming the software communities about MVC, MVP, Presenter First, Castle, Spring, MonoRail, and so on. To my shame, I was so focused for so long on embracing OOP and the C# language and the .NET Framework and general Microsoft APIs and technologies that I overlooked these essential patterns, practices, and tools used by software professionals the world over. If I had only swallowed MVC and XP years ago, I would not be struggling so badly to play catch-up.

So far I haven't actually written a single line of code based on these patterns. I have looked over the shoulder of our team architect, who was trying to get into it, and in so doing got a snapshot of what MVP code "looks like" (lots of interfaces and event handlers). After listening to Atomic Object's ARCast podcast interviews, I've read Atomic Object's PDF presentation on Presenter First. I've realized the value of mock objects, and I've heard about Rhino Mocks. I've learned about Inversion of Control and dependency injection, and about implementation tools for these in Spring.NET and in the Castle Project. I've been looking around for how MVP is supposed to work correctly in a web-based environment, where view state is already URI-controlled (unlike a GUI app, where the controller / presenter can push a view change more immediately). I've come across MonoRail and Igloo, but except for coding shortcuts I still don't see a solution to this problem.
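
For what it's worth, here's a minimal sketch of the shape that MVP code took when I looked over his shoulder: the view hides behind an interface, and the presenter subscribes to the view's events and holds the logic. The names (ILoginView, LoginPresenter) are made up for illustration and aren't from any particular framework.

    using System;

    // The view is abstracted behind an interface, so the presenter never
    // touches a concrete UI class (and can be tested against a mock view).
    public interface ILoginView
    {
        string UserName { get; }
        string Password { get; }
        event EventHandler LoginRequested;
        void ShowError(string message);
    }

    // The presenter wires itself to view events and carries the logic.
    public class LoginPresenter
    {
        private readonly ILoginView view;

        public LoginPresenter(ILoginView view)
        {
            this.view = view;
            this.view.LoginRequested += OnLoginRequested;
        }

        private void OnLoginRequested(object sender, EventArgs e)
        {
            if (this.view.UserName == null || this.view.UserName.Length == 0)
            {
                this.view.ShowError("User name is required.");
                return;
            }
            // ... authenticate, then tell the view (or a navigator) what to show next.
        }
    }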

But I haven't actually started using any of them.

I still haven't figured out whether these processes, patterns, and tools are directly related to those of Agile and XP. I do know that all of these are directly tied to unit testing and testable software--a critical process of software development I have tended to abhor, to my awful, disgusting shame.

I have an "Agile Princples, Patterns, and Practices in C#" book sitting on my lap, and I'm trying to figure out whether I should delve into this book, or if I should start tinkering with Castle to lead to MonoRails, or tinker with NUnit, or with NMock, or Rhino Mocks, or what. Maybe it doesn't matter, as long as I delve into one of these and progress myself.

I am certain, however, that by the end of this year, I had better know and be practicing all of the essentials of the above, or I will not feel confident in calling myself "excellent".

Update: After reading the forewords and the prefaces of Agile Principles, Patterns, and Practices in C#, I am persuaded that this is the path I should take right now. This is a book I should try to read end-to-end. While the subject is not directly related to MVC/MVP/PF, etc., I'm persuaded that XP / Agile programming is a skill that is mandatory for a software professional to understand. It is about working with people, test-driven development, changeable software, and core values. MVP is just one technique for process and pattern, and it does not fit all. XP/Agile knowledge, however, might actually be one-size-fits-all. We'll see.


Mono Needs Help

by Jon Davis 21. July 2007 19:30

The Mono Project over at http://www.mono-project.org/ really needs help, mainly with quality control and distribution maintenance.

For a long time, I have been very glad for Mono, always smiling when I see it mentioned in the press, such as in SD Times articles, or in announcements and Internet buzz like the one about Moonlight, the Silverlight-to-Mono port project. It has always annoyed me when people would dis the Mono project as nothing but intellectual tinkering, because I thrive in .NET, and as much as I admire Linux I need a transitional environment before I can embrace it. I have full confidence in the Mono team's ability to write excellent .NET-compatible code that runs on Linux.

But there are some painful quality control issues going on with Mono, and personally I think it just needs one or two people to jump in and help out, if the team is willing to embrace another helping hand. Otherwise, they need to get their own act together.

The biggest issues, I believe, are with GTK# and with the MonoDevelop IDE. While the Mono team would likely argue that an IDE is not part of a programming SDK like .NET, one must appreciate the fact that in the Windows world, .NET and Visual Studio are like peanut butter and jelly--each can be eaten on bread alone, but the experience is incomplete unless they are together.

I have no problem with Mono or MonoDevelop being incomplete. My issue is with the broken installation process. The whole idea of .NET on the Microsoft front was to eliminate "DLL hell", both by allowing folks to install multiple versions of a DLL side by side and GAC them, and by enabling and even encouraging software developers to distribute non-GAC'd copies of libraries whose revised functionality still serves an application that expects an older version.

The MonoDevelop solution, or at least the one that is distributed by the Fedora project's Extras repository (yum install monodevelop), installs all of the dependencies, but with incompatible versions of GTK# (et al.). So when you fire up MonoDevelop, in addition to getting an error about MonoQuery.addin (not sure what that one's about), if you start a new GTK# project you get a compile error saying that the Gtk namespace cannot be found, despite GTKSharp clearly showing up in the References.

I have installed Fedora at least five times in the last week or two, in VMWare, trying different yum / rpm installation sequences, trying to figure out where I went wrong. I have reached the conclusion that I wasn't going wrong--the MonoDevelop and/or Mono teams are the ones who did wrong.

One might argue that it's Fedora's problem, since they were doing the distro. Wrong again; the Mono project web site's download links are distribution-targeted, such as for SuSE and Fedora, but the Fedora links are for Fedora 5 (that's TWO major releases old) and are strongly versioned for Fedora 5 when installed. When I use the noarch installer, I get an error message at the end of installation--"it appears that some graphics applications might not run correctly, please install those libraries individually"--and MonoDevelop still doesn't "automagically" fix itself.

The GAC is overrated. The CLR's inherent backward-compatibility support for future-versioned libraries is as key to Mono's success as side-by-side version installations are. This is the fundamental problem with using RPM technology for Mono: you cannot install an older version of GTK# when a newer version is already installed. You can do a force install, but at what cost? What breaks?

I'm still trying to get this stupid thing going. But be sure, I would rather drop Mono than drop Fedora 7 for Fedora 5 or for SuSE (which I also have installed on a VM, and I'm unimpressed with it).


The Secretly Lame Idea For My Blog Title

by Jon Davis 17. July 2007 23:43

Anyone who knows what System.Reflection.Emit actually means, or does, will either hate my use of that namespace as my blog title and think it's incredibly stupid and lame, or will "get" it. To be honest, I chose it tongue-in-cheek, knowing that people would probably roll their eyes. But only the obnoxious Microsoft-haters and the anti-geeks!

Literal Background

To the uninformed geeks: System.Reflection.Emit is a namespace (or a section) of the Microsoft .NET Framework that allows a software program to dynamically generate MSIL at runtime. This has many uses and can be helpful, for instance, for things like creating dynamic languages.

To the non-geeks: System.Reflection.Emit is something that computer programmers can use to write a software program that automatically writes and runs computer software by itself. No, not as in like a computer virus. No, not as in like artificial intelligence. More like, here's a pattern of data, you figure out what to do with it. Make sense? No? Nevermind.
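
For the geeks, here's a minimal sketch of what that looks like in practice; it builds a made-up two-integer adder at runtime with DynamicMethod and then invokes it as a delegate (the names here are just for illustration):

    using System;
    using System.Reflection.Emit;

    class EmitDemo
    {
        delegate int BinaryOp(int a, int b);

        static void Main()
        {
            // Build a brand-new method body, in memory, at runtime.
            DynamicMethod add = new DynamicMethod(
                "Add", typeof(int), new Type[] { typeof(int), typeof(int) },
                typeof(EmitDemo).Module);

            ILGenerator il = add.GetILGenerator();
            il.Emit(OpCodes.Ldarg_0);   // push the first argument
            il.Emit(OpCodes.Ldarg_1);   // push the second argument
            il.Emit(OpCodes.Add);       // add them
            il.Emit(OpCodes.Ret);       // return the result

            BinaryOp op = (BinaryOp)add.CreateDelegate(typeof(BinaryOp));
            Console.WriteLine(op(2, 3));  // prints 5
        }
    }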

Philosophical Background

So why name a blog after this namespace? Well, I am no robot, but if you think about it, our brains run on software. Oh, sure, we have spirits, and those of us who are Christians know that there's a lot more going on in someone's head (or heart) than software. Nonetheless, technically and in the physical realm, the human mind runs on software.

Imagine, then, that the computer software that our minds are could "automatically" write new software, and run it, at runtime. Well, that's exactly what our brains do. An "idea" is just a dynamically generated software class, generated from multiple sources (mostly the experiences of the mind) that may or may not be instantiated (idea put to work).

Consider also that "Emit" itself is a word that means "to output". So here I am blogging; I am thinking, pondering new ideas, and I'm putting it out there. I'm emitting dynamically generated classes (ideas). Get it?

An Important Confession of Genuine Lameness

Personally, I haven't done much with the System.Reflection.Emit namespace. This isn't to say I haven't tried or tinkered with it. I've spent a few days poking at it. But the results were not much different from what would happen if I poked at an animal's brain while it was living. Mostly things just got gross.


Map My Mind

by Jon Davis 17. July 2007 02:25

At my last job, a business partner was taking notes using some mind mapping software. I had never seen such software before; it got my attention. She managed to capture an outline bullet on every little facet of information that was divulged in a day-long meeting consisting of some thirty or so sales staff all of whom had an opinion about the requirements about the product we were about to implement. I was spellbound. 

The software she was using was MindJet MindManager Pro (http://www.mindjet.com/us/).

Eventually I got PDFs and Word docs of her output, and when I had to help compile a legible 200-or-so-page design document from it (one among a zillion other reasons why I quit that job), I quickly realized that as handy as the tool seemed to be, it didn't seem very useful from where I sat on the receiving end of its output. This is NOT a documentation tool!! It is a brainstorming tool, and realistically its output can only be fully comprehended by the person who prepared it.

Nonetheless, there have been a few times in the last year or so that I really wanted such a tool. Actually, it was just for pet projects, not so much for work. But I cannot afford MindJet's offering at $350. So I quickly found FreeMind (http://freemind.sourceforge.net/wiki/index.php/Main_Page). It's Java-based, but I forgive. Some parts of it feel a bit obsolete, the diagramming feels a little unpolished, and the icons it seems to demand that you place are rather cheesy. But at the basic level, it does the job.

I came across MindApp (http://www.mindapp.com/), which costs $29 after a free trial--far more affordable than the $350 offering from MindJet. After completing my brain dump with it, I felt it was perhaps worth the $29 for some extra polish and formatting features, but it has quirks, such as messed-up font sizes in the HTML output. However, I didn't find myself compelled; the gap between it and FreeMind is tiny compared to the gap between it and MindJet's MindManager. I want more.

On a side topic, wanna know what I was brain-dumping? Well, a couple months or so ago I heard about another public database going down (this time a free TV listings service). Wouldn't it be nice, I thought, if public access databases were maintained by the Internet community rather than just one company that could shrug its shoulders one day and walk away? This had me thinking about my old Peer-to-Peer file sharing idea...

For many years, before Azureus, before Morpheus, before Kazaa, even before Napster (but somewhere close to the days of Metallica's relevance), I had an idea about peer-to-peer technology. Specifically, seeding a distributed database by injecting metadata (i.e., XML attachments) into NNTP posts in an alt.* Usenet newsgroup--posts that would contain IP addresses, DNS hostnames, and/or URIs, along with functional metadata, describing the whereabouts of a peer service seed. This, in my mind, took dedicated seed hosts for distributed peer-to-peer networks out of the equation; Usenet already propagates the metadata that would be needed to seed something like that.

But acknowledging that some level of organized seeding is necessary, I registered these domain names:

  • DistributedDB.org
  • DistributedDB.net
  • DistributedDB.info

The objective for a site using these domain names is NOT just P2P networks--that backstory was just an example of a trigger that led me down this path. Rather, the objective is to give people a place either to dump tiny data records or to proxy or alias their own network services. The service would be free, but with some maximum number of records or other limitations.

Here's what my rather small brain dump looks like.. (oh, and yeah, it's alright, I know you can't read it...)

[Image: mind map of the brain dump]

I'll divulge more later. Need sleep.


Testing LiveWriter Plug-ins

by Jon Davis 16. July 2007 03:41
Da! ...

Sample Code:

    static void Main(string[] args)
    {
        Console.WriteLine(
            "Checking to see how this LiveWriter "
            + "plug-in looks on this server template.");
    }

CoolBanana



 
