TDD: Here I Go!!

by Jon Davis 4. April 2009 11:57

About a week and a half ago I finally started breaking ground (that is, created a Visual Studio solution and started writing code with the intention of it being used in production) on a big composite pet project I've been considering for a very long time. I won't get into what the project is about (at least not now), but I will say that parts of it form a multi-faceted software foundation/framework. So once I'm done, I'll be able to take a bunch of the code I wrote here and re-apply it to a completely different project. This is important to me, as I am seriously in need of gaining some inertia in pet project development and deployment.

So right away I started writing tests to "prove out" each little piece of code as I wrote it. I'd heard the TDD gospel and wanted to convert, but I didn't have a lot of TDD practice under my belt. Even so, right away I had about 75% code coverage, which felt pretty good compared to the 0% of projects past. Every time I wrote a little bit of code, I hit Ctrl+R, A to be sure nothing I had just done failed. This worked fine at the beginning, while I was working with simple, lightweight objects.

Pretty soon I found myself implementing abstract provider/service classes, and then creating at least one or two implementations of each. Then I wrote this big dependency object that gets passed back and forth. By the time I had some prototype code written, I realized that my tests would never break, because I hadn't been writing code around my tests; I had been writing my tests around the code. At that point I was clueless as to whether the code I had just written would even work, because it was all plumbing code. I wanted to at least try something and see if the basics were working, so I created a console application and wrote some code with Console.Write()'s to get a visual on the output. But by then things were getting messy. The dependency object I had created was tailored to the one implementation class of my abstract provider/service object. I now had a throw-away console app in my solution that didn't belong. And my test coverage was down to something like 5%.

That's when I realized I was going about this all wrong. For months I had a hard time understanding why TDD proponents argued that writing code before tests is "putting the cart before the horse," when writing tests before code seemed so backwards to me. But now it started to click. For many years, I've been designing code in my head, then implementing those designs in code; if it works, great, and if not, I fix the bugs. I'm starting to see now how I need to retrain myself. It's okay to design code in your head. You just need to document those designs in tests. At first, they won't even compile. Fine. But at least document your designs in tests.
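Here's the sort of thing I mean, as a minimal sketch in Visual Studio Team Test style (the Ctrl+R, A runner), with every name hypothetical. None of the types under test exist yet, so this won't even compile at first, and that's the point; the test is the documented design:

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class FeedServiceDesignTests
    {
        [TestMethod]
        public void Publish_NewEntry_ShowsUpInTheFeed()
        {
            // None of these types exist yet; this test documents the design.
            // Once they're stubbed in, the test compiles and fails red.
            IFeedProvider provider = new InMemoryFeedProvider();
            var service = new FeedService(provider);

            service.Publish(new FeedEntry("Hello, world"));

            Assert.AreEqual(1, service.GetFeed().Count);
        }
    }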

Funny, I haven't ever heard anyone make the connection, but writing tests that mock out the way you want your code to work is to a programmer exactly what wireframes are to web design. One should not design a web page by throwing some HTML into a web site, trying to make it look nice and complete, and then showing it to the customer for the first time and saying, "Here you go, it's done. Now tell me what's wrong with it." Rather, a professional web production team would first collect the requirements of the web site (in workflow verbiage), throw together some wireframe mock-ups, create Photoshop design comps in parallel, and then move on to implementation, all the while following up with the customer for feedback at each and every step along the way. TDD is exactly the same thing for the programmer. The tests are the wireframes and business rules, just as a web application spec has bullet-point business rule text that serves as manual tests for all parties involved, both to define and to validate the implementation.

Even if there is no third-party customer to approve your "wireframes" and you're working solo, the analogy of "wireframing a web design" should still apply. A web designer/engineer can, but shouldn't, create a web site before wireframing it. He should start with a napkin sketch and go from there. Likewise, one can, but shouldn't, write code before writing tests for that code, because tests are not just validations that assumed functionality works; they are also definitions of that assumed functionality, and without definitions of assumed functionality there are really no coding objectives.

The hardest part of TDD for me so far in this new experience has been realizing I wasn't doing TDD in the first place, and then going back and scrapping all the code I had just written, whether it was good or not. I knew it had a little bit of bad mixed in with the good, and that should not be acceptable. Awareness of "a little bit of bad code buried in there" typically means mostly bad code in the long run, and a lot of wasted time, because bad code spreads and affects everything else, like yeast.

One other thing about these incidents: I've also been re-learning the theory of YAGNI (you ain't gonna need it!). I was getting too comfortable with introducing extra method signatures on objects that were not getting properly tested to begin with, and for which my use cases were purely theoretical, not truly intentional. I've argued on this blog in the past that YAGNI is extremely naive and not the appropriate attitude when writing software in smaller shops (like one- or two-man bands), because the multi-role developer often knows exactly what he needs and should not be kept from adding what he needs in order to do his job. That's fine; ignore YAGNI, so long as you're taking an "undisciplined," less formal approach to software, or are writing software that is not intended to be reused the way a framework is. In my case, however, I'm writing framework bits, and I must balance ease of use by way of high versatility against ease of use by way of simplicity. Other programmers, including myself at some later point in time, should not be confused with options. Programmers are software users, just like their end-users are software users. The end-users use the mouse and click simple things, whereas programmers use framework code to write code; either way, they're all users. The Apple and Google approach is to keep it simple, stupid (KISS): don't overwhelm the user with options, especially if two options reach the same outcome. I should define one "best practices" path and introduce another path only when it is needed, not when it is slightly and occasionally convenient.

Part of the reason I write like this is to drive it into my own head. I still settle for less than what I preach sometimes.

There's one other thing: tests are part of the TDD strategy, but I'm also beginning to think that once I've made enough headway into these framework bits to actually write some code that passes tests, I might also start writing some application/implementation project code. Call it a manual test if you like. The tests themselves should not lose priority. But I think it makes sense, to an extent, to see code in action as soon as it's feasible in order to prove out code design prototypes. Writing application code should NOT actually add tests, but should rather take advantage of what's already passing and help guide the tests as they are being written. In this light, the only problem I have with TDD is that it is spoken of too often in terms of unit and integration tests, whereas the whole lifecycle of designing with tests, and of testing, must be much broader than unit and integration tests. TDD starts with unit tests, but unit tests, by definition, only test one tiny unit of functionality. In order to really drive your design and development through tests, you need to be able to test workflows, too. Starting from point A, given n parameters, we should ultimately arrive at B, and this might involve several steps.
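To illustrate (hypothetical names again; this is a sketch, not working code), a workflow test reads less like a unit test and more like a story, several steps from point A to point B:

    [TestMethod]
    public void NewUser_CanRegisterLogInAndPostAnEntry()
    {
        // Point A: given these parameters...
        var site = new SiteFacade();   // hypothetical workflow entry point
        var user = site.Register("jon", "p@ssw0rd");

        // ...several steps later...
        site.LogIn("jon", "p@ssw0rd");
        site.PostEntry(user, "First post!");

        // ...we should ultimately arrive at point B.
        Assert.AreEqual(1, site.GetEntriesBy(user).Count);
    }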

Oh, I still need to finish watching Rob Conery's MVC Storefront videos, in which he, too, learned TDD. I got about ten or so videos in before I caught up, then stopped watching while he kept streaming them out.

Tags: C#

MVC, TDD, ORM, WCF, OMG LOL

by Jon Davis 23. April 2008 22:05

Rob Conery, creator of the SubSonic ORM and scaffolding framework for ASP.NET, has been transparently publishing, in video format, his understanding of TDD (Test-Driven Development, or Test-Driven Design) in his presentations on the reworking of the ASP.NET Storefront Starter Kit using the ASP.NET MVC toolkit.

Surprisingly, the series has little to nothing to do with the storefront project itself. It's more of... what I just said: Rob's transparent discovery process of TDD and of keeping his process in check. That said, I thought it was a rarity among webcasts, something very much worth watching. Rob, who was recently brought into the Microsoft fold, has a way of making Microsoft look and feel human like the rest of us, using common sense and even humbly adjusting his mindset and worldview according to the wise words of industry professionals as they give feedback. (This is something Microsoft has been getting good at lately, stepping out of its big Redmond-sized box and learning a bit from the rest of the planet, which is why I still root for them despite my relentless criticisms.)

Rob's discovery process is actually a very good opportunity for the rest of us to learn from. Rob came in prepared, he presented well, and the content and message are very good. While it was mostly review for me, TDD brings such a different mindset to software that I feel like I need as many such "reviews" as I can find, so that I can get it ingrained and rooted in my mental patterns.

That said, I am still greener than I want to be at ASP.NET MVC. This is just an issue of experience; we're using it at work, but I haven't been given the opportunity to implement with it yet; the other fellows have. Same with SubSonic.

I'm a big fan of SubSonic. I like its approach to ORM and its scaffolding.

Lately, though, I've been tinkering a bit with one of the software industry's best-kept secrets among ridiculously handy ORM solutions: EntitySpaces. Mike Griffin's work with ORM tools has been around longer than Rob Conery's, and I've admired his work on both MyGeneration (a free, open source, and in many ways much better alternative to CodeSmith) and EntitySpaces for a couple of years now. EntitySpaces is still pioneering in the ORM space; it has had a super-sweet ORM query object model in production for much longer than SubSonic's recent "super-query" tweaks, which are still in beta. Makes us all wonder ...


[Photos: Rob Conery and Mike Griffin]

ROFLMAO.

Aside from the additional fact that ES works cleanly and happily on the Mac and on Linux with Mono, and aside from the additional fact that SubSonic is less about ORM and more about "building web sites that build themselves" (not everyone is building ASP.NET web sites; some people are writing software and just need a focused ORM solution), one thing I've noticed recently about ES that makes it stand out from a lot of the other solutions out there is that an investment in ES on the server side makes for a really smooth and effortless transition to the client when you throw WCF into the mix, if all the client needs is the data models. The WCF proxy support in MyGeneration + ES makes client/server integration over WCF a snap. Part of this magic is also in Visual Studio: when you right-click References and add a Service Reference, the proxy objects are brought over from WSDL (or a WSDL-equivalent proxy definition), and you can begin coding against a seemingly rich object model on the client right away.
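For instance (names hypothetical; the client class below stands in for whatever "Add Service Reference" would generate from the service's WSDL), consuming the service really does feel like coding against a local object model:

    // The generated proxy derives from ClientBase<T>; the call marshals
    // over the wire, but the code reads like ordinary local object access.
    using (var client = new ProductServiceClient())
    {
        Product product = client.GetProductById(42);
        Console.WriteLine(product.Name);
    }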

For me, when I was tinkering with ES over WCF while still green to ES, this brought TDD (actually, integration testing) back into the picture. I found myself taking what I learned from Rob Conery on TDD and designing my services by executing tests; specifically, integration tests, which isn't pure TDD, but I trusted WCF. I kind of had to; I needed to invoke the interfaces so that I could step through the debugger and introspect the objects, and see what was going on under the covers as I invoked code over the wire. This isn't pure TDD by principle, but it is "test-driven design" in the sense that I found myself writing tests that simulate real-world behavior in order to design how I want the code to function.
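Here's a rough sketch of the kind of test I mean, reusing the hypothetical proxy from above. It hits the live service endpoint, which is exactly what makes it an integration test rather than a unit test:

    using NUnit.Framework;

    [TestFixture]
    public class ProductServiceIntegrationTests
    {
        [Test]
        public void GetProductById_ReturnsPopulatedEntityOverTheWire()
        {
            using (var client = new ProductServiceClient())
            {
                Product product = client.GetProductById(42);

                // A breakpoint here lets me introspect exactly what
                // came back over the wire.
                Assert.IsNotNull(product);
                Assert.AreEqual(42, product.Id);
            }
        }
    }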

TDD, meanwhile, has made me think a lot about what I wrote half a year or so ago about Design Top-Down, Implement Bottom-Up. I didn't get much feedback on that post, but I, too, was being transparent about what I thought was a good idea. My word choice in that pattern seems foreign to TDD principles at first, if not contradictory, but the more I think about it, the more I think it is actually very much complementary:

  • In TDD, the tests simulate, by way of invocation, the top-down design and verify the bottom-up implementations, and
  • The implementations are kept in check by these simulations of top-down usage.

So I think TDD is really the glue, or a type of glue, that makes Top-Down-Bottom-Up work. I think the only big differences in perspective here are in vantage point: TDD's perspective is forwards, an "I poke at you first, before you yelp" view, whereas Top-Down-Bottom-Up assumes that encapsulation is a layer, and that the design process takes priority over the implementation. (Or something.) It's all pretty much the same thing. And actually, while TDD validates the Top-Down-Bottom-Up process, TDBU validates TDD, too, because done right it uses TDD to prove out the stability and rock-solid implementation of an end product.

I'm feeling now like haacking (pun intended, for those who know) together a SuperText blog engine using a mish-mash of funzy geek stuff I want to keep pushing myself with, including ES, ASP.NET MVC, TDD, REST, WCF, VistaDB, Silverlight, JavaScript/AJAX, and Internet Explorer. Just kidding, I won't use IE ...

... not that I haven't built a complete blogging solution before.

NUnit Test Case

by Jon Davis 25. July 2007 00:19

So after going through that primer, I snagged some very interesting points about what a "test case" is and what rules a test case abides by to qualify a snippet of code as a usable unit test for NUnit (a minimal example follows the list):

  • A test case is a programmer test (think low-level or class-level, as in programmatic).
  • A test case is a self-validating test (in NUnit, using Assert).
  • A test case can be automatically discovered by a test runner (in C# / NUnit, using attributes like [Test]).
  • A test case can be executed independently of other test cases.
  • Test cases are units of organization and execution, and are grouped into test suites.
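Here's a minimal NUnit example (with a hypothetical Calculator class) that checks each of those boxes:

    using NUnit.Framework;

    [TestFixture]                                   // groups test cases into a suite
    public class CalculatorTests
    {
        [Test]                                      // discovered automatically by the runner
        public void Add_TwoAndThree_ReturnsFive()
        {
            var calc = new Calculator();            // low-level, class-level programmer test
            Assert.AreEqual(5, calc.Add(2, 3));     // self-validating via Assert
        }

        [Test]                                      // executes independently of the test above
        public void Add_NegativeNumbers_ReturnsNegativeSum()
        {
            var calc = new Calculator();
            Assert.AreEqual(-5, calc.Add(-2, -3));
        }
    }

    // Hypothetical class under test, included so the sketch is self-contained.
    public class Calculator
    {
        public int Add(int a, int b) { return a + b; }
    }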


Test-Driven Development in Microsoft .NET (Ch. 1)

by Jon Davis 24. July 2007 21:45

Last night I read the first chapter of Agile Principles, Patterns, and Practices in C#. Summary: essentially, the Agile Manifesto, expounded.

This afternoon I cracked open the introduction and Chapter 1 of Test-Driven Development in Microsoft .NET (Microsoft Professional). Summary: the basic, trivialized definition of TDD is 1) never write code without starting with a failing test, and 2) never repeat yourself. Test-driven development is apparently not about sitting around testing code like a QA engineer stuck in hell, as I thought it would be. It's about writing code in tiny increments until each increment works flawlessly, virtually eliminating the process of debugging. (Funny, that sounds very similar to a quote I saw after installing #develop a year or two ago. Something like: "The best way to write code is always to start with one small requirement, test it thoroughly, then add to it and repeat the process, until all requirements are complete.")

The section called "Red/Green/Refactor" outlines the cycle (I've seen this before, in a diagram in the Presenter First PDF from Atomic Object); a small sketch of one pass through it follows the list:

  1. Write the test code.
  2. Compile the test code. (It should fail because you haven't implemented anything yet.)
  3. Implement just enough to compile.
  4. Run the test and see it fail.
  5. Implement just enough to make the test pass.
  6. Run the test and see it pass.
  7. Refactor for clarity and to eliminate duplication.
  8. Repeat from the top.
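To make that concrete, here's one tiny pass through the cycle in NUnit, with a hypothetical Greeter class:

    using NUnit.Framework;

    // Steps 1-2: write the test first. It won't compile yet,
    // because Greeter doesn't exist.
    [TestFixture]
    public class GreeterTests
    {
        [Test]
        public void Greet_Jon_ReturnsHelloJon()
        {
            Assert.AreEqual("Hello, Jon", new Greeter().Greet("Jon"));
        }
    }

    // Step 3: stub just enough to compile (say, return null), then run the
    // test and watch it fail (red). Step 5: implement just enough to pass
    // (green), as below. Step 7: refactor, then repeat from the top.
    public class Greeter
    {
        public string Greet(string name)
        {
            return "Hello, " + name;
        }
    }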

This is an older book, a couple of years old, but still relevant. It focuses on NUnit, which everyone is still using, although today I discovered MbUnit, which seems to have more features; some recommend starting with NUnit until you need some of MbUnit's extra features. I've also installed TestDriven.net, but I'm not sure what it offers yet.

Tonight before I get out of here (I'm stuck at the office and it's 10pm...) I am going to go through Appendix A of this TDD book, which is an NUnit primer.

Update: *click*. This happened for me when I realized that programmer tests are executed in bulk with the click of a button and grow as your project grows, so if requirements change, you can see what breaks from a tweak just by re-running your programmer tests. No more manual trudging, no more "if it breaks, enter debug mode" start-to-finish manual tests. Those have to happen, too, but before they do you can quickly and easily see exactly where red flags are raised before you even get that far.





 



