Keys To Web 3.0 Design and Development: I Have A Theory About Litter vs. Meat!

by Jon Davis 19. November 2008 21:21

I have a theory.

While Web 2.0 is generally well-defined and understood, Web 3.0 is still being defined and speculated upon every day. However, most folks agree that a Web 3.0 web site has certain characteristics that set it apart from all web sites in the past. Among them--not everyone agrees on this, but many do--is a site's general support for "Open API X," such as OpenID. According to Wikipedia:

Nova Spivack defines Web 3.0 as the third decade of the Web (2010–2020) during which he suggests several major complementary technology trends will reach new levels of maturity simultaneously including:

  • transformation of the Web from a network of separately siloed applications and content repositories to a more seamless and interoperable whole.
  • ubiquitous connectivity, broadband adoption, mobile Internet access and mobile devices;
  • network computing, software-as-a-service business models, Web services interoperability, distributed computing, grid computing and cloud computing;
  • open technologies, open APIs and protocols, open data formats, open-source software platforms and open data (e.g. Creative Commons, Open Data License);
  • open identity, OpenID, open reputation, roaming portable identity and personal data;
  • the intelligent web, Semantic Web technologies such as RDF, OWL, SWRL, SPARQL, GRDDL, semantic application platforms, and statement-based datastores;
  • distributed databases, the "World Wide Database" (enabled by Semantic Web technologies); and
  • intelligent applications, natural language processing,[2] machine learning, machine reasoning, autonomous agents.

But the simplest and most requisite of the Web 3.0 characteristics is the old Web 2.0 ideal: that a Web 2.0/3.0 web site be built around a "semantic web," whereby a spider or other "robotic" application can make as much sense of a web site just by looking around as humans do today.

This is not my theory, this is fact--fact, that is, that these are the popular opinions and speculations of what "Web 3.0" actually is and how it is achieved. My theory, however, is one perspective on how to achieve a few basics of Web 2.0+/3.0 ideals in a web site. It goes something like this:

Objective: Isolate content from UI, advertising, and other fluff. This will enable SEO opportunities and facilitate semantic web mechanisms by giving "robots" a truly clean document right out of the oven. It can also give developers and content authors workflows that are consistently and entirely isolated from the merged production rendering and end-user experience.

Theory: An SEO-rich semantic web site can be achieved when the valuable content associated with a URL is returned by that URL (with strict formatting and microformats, and with isolated XML aggregation such as RSS/Atom), and all navigation, advertising, and other non-essential "content" is late-injected with script.

This theory boils down to just a few rules. These are Web 2.0 rules that used to be ideals, but a Web 3.0 site would require ALL of them to be carried out without any fail points.

SEO and aggregation know neither style nor script - use this to your advantage!

This is the crux of my theory. CSS designers have been enjoying this fact for years: you can dump ugly, plain-vanilla, raw content on a page and convert it into an absolutely beautiful, breathtaking document while changing no markup whatsoever beyond a .css file reference. But unlike CSS experts, script gurus tend to be far from consistent. Most search engines and content aggregators don't bother with dynamic AJAX calls and document.write's. Granted, some search engines are getting smarter--smart enough to process document.write's--but it will still be years before they support content injected at runtime when indexing a web page; by the time they start doing that, they might as well start aggregating Flex "MXML" and Silverlight "XAML" instances while they're at it (yeah right).

But instead of treating this as "something to keep in mind", what would happen if web developers built everything on a page around that rule, by breaking it down into several rules:

  1. Keep in mind that SEO (search engine optimization), aggregation-friendliness, and the semantic web all share similar requirements from a web site:
    • SEO-friendly data is good data
      • URL path - Use relevant keywords, use slugs, and prefer hyphens over underscores or spaces (spaces end up looking like %20's and may confuse the spiders / aggregators).
      • HTML <title> and <meta> keyword tags
      • Use of <h1>, <h2>, <h3>, <h4>, <h5>, <h6> tags and the like is absolutely critical. Never just use formatted <div>'s.
      • Always put the real content at the top of the document since the bottom part could be truncated.
      • Never, never, NEVER spam any page on your site to increase SEO. It doesn't work anymore and will get you blacklisted.
      • Never detect and customize the web page experience on behalf of the search engine such that the search engine sees something different than the user would see. Also blacklistable.
    • Non-content data in the document will be perceived as content if not treated with isolation.
  2. In planning, draw a line between "meat" content--your article paragraphs, your metadata, your title, your copyright line, etc--and everything else--your top-nav links, your left-nav sidebar, your advertisements, your widgets and buttons, etc.
  3. Label everything else as "litter content", because to a "robot" such as a spider or aggregator that's exactly what it is. A robot has no use for advertisements or sidebar except for the sitemap hyperlinks. Regular use of the verbiage "litter content" would be appropriate during design and development cycles because it forces you as a web developer to keep runtime isolation in mind in everything that you do.
  4. Treat a single web page as a multi-tier, multi-component application, not as generated HTML output. Specifically,
    • "Meat/SEO content" should be generated prior to client consumption, in either flat HTML files or as dynamically server-generated HTML (i.e. PHP or ASPX output). From the very first character to the very last line of "meat content", this file should be completely clean, rid of non-SEO content.
    • What the client receives should be NOTHING except for real, raw "meat/SEO content", along with what the client would perceive to be tiny bits of safely ignorable litter markup. Litter markup (tags) in itself is not litter content.
    • This litter markup would consist of all the attribute-decorated, XHTML-compliant placeholder tags and script blocks needed for the client to perform a secondary fetch or otherwise isolated insertion of litter content at runtime.
    • The litter content can be AJAX-fetched, or it can be appended to the tail end of the document. Runtime script then "injects" the content into its placeholder regions. For example, using jQuery, one can simply use: $("#litter_contentID").appendTo("#placeholderID"); in a script block at the bottom of the page to inject litter content into a placeholder region of the page (see the fuller sketch after this list).
  5. Microformat everything. Anything that cannot be microformatted is "litter content" and should also be microformatted as basically ignorable. If no microformat standard exists for a particular region of content on a web page, look harder, and then otherwise invent your own.
  6. Supply XML equivalents to all types of content. This goes back to Web 2.0 ideals but never went away; have RSS/Atom aggregation support for all forms of content, whether for media galleries or for e-commerce product listings.
  7. Use proper cache controls on EVERY component of the page, including script links. In doing so, much of the potential "popping" of content, both meat and litter, can be minimized or altogether avoided. Theoretically.
  8. The litter content can modify the display characteristics of the meat content, but never the other way around. HTML injected with script as litter content should be viewed as transforming content into a thing of usability exactly the same way as CSS has already transformed content into a thing of beauty.
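To make rule 4 concrete, here is a minimal sketch of what the late injection could look like in a script block at the bottom of an otherwise "meat-only" page. The placeholder IDs and litter URLs are hypothetical, and both variations described above (AJAX-fetching the litter versus shipping it at the tail end of the document) are shown:

    $(document).ready(function () {
        // Variation A: AJAX-fetch the litter content after the meat has rendered.
        // Assumes empty placeholder tags such as <div id="nav_placeholder"></div>.
        $("#nav_placeholder").load("/litter/top-nav.html");
        $("#ads_placeholder").load("/litter/ads.html");

        // Variation B: litter content appended to the tail end of the document
        // inside a hidden <div id="litter_sidebar">, moved into place at runtime.
        $("#litter_sidebar").children().appendTo("#sidebar_placeholder");
    });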

Here is a mind map that demonstrates these rules for e-commerce, CMS, and social web sites:

[Mind map image: A new lightweight way to look at semantic web production]

There are two very important questions that must be considered with regard to this overall theory.

  1. Does the user see any ugly "popping" of litter content, or is the page otherwise slow to load? While such "popping" can be unavoidable on the first hit to the site, subsequent eye sores such as this should be avoided if possible.
  2. Can developer workflow and productivity be sustained--and ideally benefit--under this redefinition of what a web page consists of?

For the most part, the answers are likely favorable, depending entirely on the inventiveness of the web developer and the tricks he conceives.

These are rules worth building around. Assuming the theory is correct, it could be one of several bases for the way we--or at least I--would want to build web sites for the next decade.



Keys To Web 3.0 Design and Development When Using ASP.NET

by Jon Davis 9. October 2008 05:45

You can skip the following boring story as it's only a prelude to the meat of this post.

As I've been sitting at my job lately trying to pull off my web development ninja skillz, I feel like my hands are tied behind my back because I'm there temporarily as a consultant to add features, not to refactor. The current task at hand involves adding a couple of additional properties to a key user component in a rich web application. This requires a couple of extra database columns and a bit of HTML interaction to collect the new settings. All in all, about 15 minutes, right? Slap the columns into the database, update the SQL SELECT query, throw on a couple of ASP.NET controls, add some data binding, and you're done, right? Surely not more than an hour, right?

Try three hours, just to add the columns to the database! The HTML is driven by a data "business object" that isn't a business object at all, just a data layer that has method stubs for invoking stored procedures and returns only DataTables. There are four types of "objects" based on the table being modified, and each type has its own stored procedure that ultimately proxies out to the base type's stored procedure, so that means at least five stored procedures for each CRUD operation affected by the addition. Overall, about 10 database objects were touched and as many C# data layer objects as well. Add to that a proprietary XML file that is used to map these data objects' DataTable columns, both in (parameters) and out (fields).

That's just the data. Then on the ASP.NET side, to manage event properties there's a control inheriting another control that is contained by another control that is contained by two other controls before it finally shows up on the page. Changes to the properties are a mix of hard-wired bindings to the lowest base control (properties) for some of the user's settings, while for most of the rest of the user's settings on the same page, CLR events (event args) are raised by the controls and captured by the page that contains it all. There are at least five different events, one for each "section" of properties. To top it off, in my shame, I added both another "SaveXXX" event and yet another way of passing the data--I use a series of FindControl(..) invocation chains to get to the buried control and fetch the setting I wanted to add to the database and/or translate back out to the view. (I would have done better than to add more kludge, but I couldn't without being enticed to refactor, which I couldn't do; it's a temporary contract and the boss insisted that I not.)

To top it all off, even the simple CRUD stored procedures alone take far longer than an eye blink, and they feel showstopping when invoked from code. It takes about five seconds to handle each postback on this page, and I'm running locally (with a networked SQL Server instance).

The guys who architected all this are long gone. This wasn't the first time I've been baffled by the output of an architect who tries too hard to do the architectural deed while forgetting that his job is not only to be declarative on all layers but also to balance it with performance and making the developers' lives less complicated. In order for the team to be agile, the code must be easily adaptable.

Plus the machine I was given is, just like everyone else's, a cheap Dell with 2GB RAM and a 17" LCD monitor. (At my last job, which I quit, I had a 30-inch monitor and 4GB RAM which I replaced without permission and on my own whim with 8GB.) I frequently get OutOfMemoryExceptions from Visual Studio when trying to simply compile the code.

There are a number of reasons I can pinpoint to describe exactly why this web application has been so horrible to work with. Among them,

  • The architecture violates the KISS principle. The extremities of the data layer prove to be confounding, and burying controls inside controls (compositing) and then forking instances of them is a severe abuse of ASP.NET "flexibility".
  • OOP principles were completely ignored. Not a single data layer inherits from another. There is no business object among the "Business" objects' namespace, only data invocation stubs that wrap stored procedure execution with a transactional context, and DataTables for output. No POCO objects to represent any of the data or to reuse inherited code.
  • Tables, not stored procedures, should be used in basic CRUD operations. One should use stored procedures only in complex operations where multiple two-way queries must be accomplished to get a job done. Good for operations, bad for basic data I/O and model management.
  • Way too much emphasis on relying on the Web Forms "featureset" and lifecycle (event raising, viewstate hacking, control compositing, etc.) to accomplish functionality, and way too little understanding and utilization of the basic birds and butterflies (HTML and script).
  • Way too little attention to developer productivity: failure to move the development database local to the developers, to provide adequate RAM, and to provide adequate screen real estate for managing hundreds of database objects and hundreds of thousands of lines of code.
  • The development manager's admission of the sadly ignorant and costly attitude that "managers don't care about cleaning things up and refactoring, they just want to get things done and be done with it"--I say "ignorant and costly" because my billable hours were more than quadrupled versus having clean, editable code to begin with.
  • New features are not testable in isolation -- in fact, they aren't even compilable in isolation. I can compile and do lightweight testing of the data layer without more than a few heartbeats, but it takes two minutes to compile the web site just to see where my syntax or other compiler-detected errors are in my code additions (and I haven't been sleeping well lately so I'm hitting the Rebuild button and monitoring the Errors window an awful lot). 

Even as I study (ever so slowly) for MCPD certification for my own reasons while I'm at home (spare me the biased anti-Microsoft flames on that, I don't care), I'm finding that Microsoft end developers (Morts) and Microsofties (Redmondites) alike are struggling with the bulk of their own technology, heaping upon themselves the knowledge of their own infrastructure before fully appreciating the beauty and simplicity of the pure basics. Fortunately, Microsoft has had enough, and they've been long and hard at the drawing board reinventing ASP.NET with ASP.NET MVC. But my interests are not entirely, or not necessarily, MVC-related.

All I really want is for this big fat pillow to be taken off of my face, and all these multiple layers of coats and sweatshirts and mittens and ski pants and snow boots to be taken off me, so I can stomp around wearing just enough of what I need to be decent. I need to breathe, I need to move around, and I need to be able to do some ninja kung fu.

These experiences I've had with ASP.NET solutions often make me sit around brainstorming how I'd build the same solutions differently. It's always easy to be everyone's skeptic, and it requires humility to acknowledge that just because you didn't write something or it isn't in your style or flavor doesn't mean it's bad or poorly produced. Sometimes, however, it is. And most solutions built with Web Forms, actually, are.

My frustration isn't just with Web Forms. It's with corporations that build upon Internet Explorer rather than HTML+Javascript. It's with most ASP.NET web applications adopting a look-and-feel that seems to grow in a box controlled by Redmondites, with few artistic deviators rocking the boat. It's with server-driven view management rather than smart clients in script and markup. It's with nearly all development frameworks that cater to the ASP.NET crowd being built for IIS (the server) and not for the browser (the client).

I intend to do my part, although intentions are easy and actions can be hard. I've helped design an elaborate client-side MVC framework before, with great pride, and I'm thinking about doing it again, implementing it myself (I didn't have the luxury of a real-world implementation [i.e. a site] last time; I only helped design it and wrote some of the core code), and open sourcing it for the ASP.NET crowd. I'm also thinking about building a certain kind of ASP.NET solution I've frequently needed to work with (CRM? CMS? Social? something else? *grin* I won't say just yet) that takes advantage of certain principles.

What principles? I need to establish these before I even begin. These have already worked their way into my head and my attitude and are already an influence in every choice I make in web architecture, and I think they're worth sharing.

1. Think dynamic HTML, not dynamically generated HTML. Think of HTML like food; do you want your fajitas sizzling when they arrive, enjoyed fresh on your plate with a fork and knife, or do you prefer your food preprocessed and shoved into your mouth like a dripping wet ball of finger-food sludge? As much as I love C#, and acknowledge the values of Java, PHP, Ruby on Rails, et al, the proven king and queen of the web right now, for most of the web's past, and for the indefinite future are the HTML DOM and Javascript. This has never been truer than now with jQuery, MooTools, and other (I'd rather not list them all) significant scripting libraries that have flooded the web development industry with client-side empowerment. Now with Microsoft adopting jQuery as a core asset for ASP.NET's future, there's no longer any excuse. Learn to develop the view for the client, not for the server.

Why? Because despite the fact that client-side debugging tools are less evolved than on the server (no edit-and-continue in VS, for example, and FireBug is itself buggy), the overhead of managing presentation logic in a (server) context that doesn't relate to the user's runtime is just too much to deal with sometimes. Server code often takes time to recompile, whereas scripts don't typically require compilation at all. While in theory there is plenty of control on the server to debug what's needed while you have control of it in your own predictable environment, in practice there are just too many stop-edit-retry cycles going on in server-oriented view management.

And here's why that is. The big reason to move the view to the client is that developers are writing WAY too much view, business, and data mangling logic in the same scope and context. Client-driven view management nearly forces the developer to isolate view logic from data. In ASP.NET Web Forms, your 3 tiers are the database, data+view mangling on the server, and finally whatever poor and unlucky little animal (browser) has to suffer with the resulting HTML. ASP.NET MVC changes that to essentially five tiers: the database, the models, the controller, the server-side view template, and finally whatever poor and unlucky little animal has to suffer with the resulting HTML. (Okay, Microsoft might be changing that by adopting jQuery and promising a client solution; we'll see.)

Most importantly, client-driven views make for a much richer, more interactive UIX (User Interface/eXperience); you can, for example, reveal/hide or enable/disable a set of sub-questions depending on whether the user checks a checkbox, with instant gratification. The ASP.NET Web Forms model would have it automatically perform a form post to refresh the page with the area enabled/disabled/revealed/hidden depending on the checked state. The difference is profound--a millisecond or two versus an entire second or two.
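As a quick illustration of that checkbox example--a hedged sketch only, with element IDs made up for illustration--the entire interaction can live on the client in a few lines of jQuery:

    // reveal or hide a block of sub-questions instantly, with no postback
    $(document).ready(function () {
        $("#petQuestions").hide(); // hidden until the user opts in
        $("#hasPets").change(function () {
            if (this.checked) {
                $("#petQuestions").show();
            } else {
                $("#petQuestions").hide();
            }
        });
    });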

2. Abandon ASP.NET Web Forms. RoR implements a good model; try gleaning from that. ASP.NET MVC might be the way of the future. But frankly, most of the insanely popular web solutions on the Internet are PHP-driven these days, and I'm betting that's because PHP has a coding model similar to classic ASP. No MVC stubs. No code-behinds. All that stuff can be tailored into a site as a matter of discipline (one of the reasons why PHP added OOP), but you're not forced into a one-size-fits-all paradigm; you just write your HTML templates and go.

Why? Web Forms is a bear. Its only two advantages are the ability to drag-and-drop functionality onto a page and watch it go, and premier vendor (Microsoft / Visual Studio / MSDN) support. But it's difficult to optimize, difficult to templatize, difficult to abstract away from business logic layers (difficult at least in that it requires intentional discipline), and it puts way too much emphasis on the lifecycle of the page hit and postback. Look around at the ASP.NET Web Forms solutions out there. Web Forms is crusty like Visual Basic is crusty. It was created for, and is mostly used by, corporate grunts who build B2B (business-to-business) or internal apps. The rest of the web sites that use ASP.NET Web Forms suffer greatly from the painful code bloat of the Web Forms coding model and the horrible end-user costs of page bloat and round-trip navigation.

Kudos to Guthrie, et al, who developed Web Forms, it is a neat technology, but it is absolutely NOT a one-size-fits-all platform any more than my winter coat from Minnesota is. So congratulations to Microsoft for picking up the ball and working on ASP.NET MVC.

3. Use callbacks, not postbacks. Sometimes a single little control, like a textbox that behaves like an auto-suggest combobox, just needs a dedicated URL to perform an AJAX query against. But also, in ASP.NET space, I envision the return of multiple <form>'s, with DHTML-based page MVC controllers powering them all, driving them through AJAX/XmlHttpRequest.

Why? Clients can be smart now. They should do the view processing, not the server. The browser standard has finally arrived to such a place that most people have browsers capable of true DOM/DHTML and Javascript with JSON and XmlHttpRequest support.

Clearing and redrawing the screen is as bad as 1980s BBS ANSI screen redraws. It's obsolete. We don't need to write apps that way. Postbacks are cheap; don't be cheap. Be agile; use patterns, practices, and techniques that save development time and energy while avoiding the loss of a fluid user experience. <form action="someplace" /> should *always* have an onsubmit handler that returns false but runs an AJAX-driven post. The page should *optionally* redirect, but more likely only the area of the form or a region of the page (a containing DIV perhaps) should be replaced with the results of the post. Retain your header and sidebar in the user experience, and don't even let the content area go white for a split second. Buffer the HTML and display it when ready.
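Here is a rough sketch of what that onsubmit discipline can look like with jQuery; the form ID, region ID, and URL are hypothetical, and the server is assumed to return just the HTML fragment for the affected region:

    $(document).ready(function () {
        $("#commentForm").submit(function () {
            // post in the background instead of navigating away
            $.post("/comments/save", $(this).serialize(), function (html) {
                // replace only the affected region; header and sidebar stay put
                $("#commentRegion").html(html);
            });
            return false; // suppress the normal full-page post
        });
    });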

ASP.NET AJAX has region refreshes already, but still supports only <form runat="server" /> (limit 1), and the code-behind model of ASP.NET AJAX remains the same. Without discipline of changing from postback to callback behavior, it is difficult to isolate page posts from componentized view behavior. Further, <form runat="server" /> should be considered deprecated and obsolete. Theoretically, if you *must* have ViewState information you can drive it all with Javascript and client-side controllers assigned to each form.

ASP.NET MVC can manage callbacks uniformly by defining a REST URL suffix, prefix, or querystring, and then assigning a JSON handler view to that URL, for example ~/employee/profile/jsmith?view=json might return the Javascript object that represents employee Joe Smith's profile. You can then use Javascript to pump HTML generated at the client into view based on the results of the AJAX request.
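On the client side, consuming such a JSON view might look something like the following sketch (the URL comes from the example above; the name and title fields on the returned object are assumptions for illustration):

    $.getJSON("/employee/profile/jsmith?view=json", function (profile) {
        // generate the HTML at the client from the returned object
        $("#profilePanel").html(
            "<h2>" + profile.name + "</h2>" +
            "<p>" + profile.title + "</p>");
    });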

4. By default, allow users to log in without visiting a login page. A slight tangent (or so it would seem), this is a UI design constraint, something that has been a pet peeve of mine ever since I realized that it's totally unnecessary to have a login page. If you don't want to put ugly Username/Password fields on the header or sidebar, use AJAX.

Why? Because if a user visits your site, sees something interesting, and clicks on a link, but membership is required, the entire user experience is interrupted by a login screen. Instead, fade out to 60%, show a DHTML pop-up login, and fade in and continue forward. The user never leaves the page before seeing the link or functionality being accessed.

Imagine if Microsoft Windows' UAC, OS X's Keychain, or GNOME's sudo auth did a total clear-screen and ignored your action whenever it needed an Administrator password. Thankfully it doesn't work that way; the flow is paused with a small dialog box, not flat-out interrupted.
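A rough sketch of the in-place login, assuming a hidden overlay div and login panel already sit in the markup (all of the IDs and the /login URL here are hypothetical):

    function showLoginOverlay(onSuccess) {
        $("#pageOverlay").fadeTo("fast", 0.6); // fade the page out to 60%
        $("#loginPanel").show();
        $("#loginForm").submit(function () {
            $.post("/login", $(this).serialize(), function (result) {
                if (result === "ok") {
                    $("#loginPanel").hide();
                    $("#pageOverlay").fadeOut("fast");
                    onSuccess(); // continue the interrupted action
                } else {
                    $("#loginError").text("Login failed; please try again.");
                }
            });
            return false; // never leave the page
        });
    }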

5. Abandon the Internet Explorer "standard". This goes out to the corporate folks who target IE. I am not saying this as an anti-IE bigot; in fact, I'm saying this in Internet Explorer's favor. Internet Explorer 8 (currently not yet released, still in beta) introduces better web standards support than previous versions of Internet Explorer, and it's not nearly as far behind the trail of Firefox and WebKit (Safari, Chrome) as Internet Explorer 7 is. With this reality, web developers can finally and safely build W3C-compliant web applications without worrying too much about which browser the user is using, and instead ask the user to get the latest version.

Why? Supporting multiple different browsers typically means writing more than one version of a view. That means developer productivity is lost. That means features get stripped out due to time constraints. That means your web site is crappier. That means users will be upset because they're not getting as much of what they want. That means fewer users will come. And that means less money. So take on the "write once, run anywhere" mantra (which was once Java's slogan back in the mid-90s) by writing W3C-compliant code, and leave behind only those users who refuse to update their favorite browsers, and you'll get a lot more done while reaching a broader market--if not now, then very soon, perhaps half a year after IE 8 is released. Use Javascript libraries like jQuery to handle most of the browser differences that are left over, while at the same time being empowered to add a lot of UI functionality without postbacks. (Did I mention that postbacks are evil?)

6. When hiring, favor HTML+CSS+Javascript gurus who have talent and an eye for good UIX (User Interface/eXperience) over ASP.NET+database gurus. Yeah! I just said that!

Why? Because the web runs on the web! Surprisingly, most employers don't have any idea and have this all upside down. They favor database gurus as gods and look down upon UIX developers as children. But the fact is I've seen more ASP.NET+SQL guys who halfway know that stuff and know little of HTML+Javascript than I have seen AJAX pros, and honestly pretty much every AJAX pro is bright enough and smart enough to get down and dirty with BLL and SQL when the time comes. Personally, I can see why HTML+CSS+Javascript roles are paid less (sometimes a lot less) than the server-oriented developers--any script kiddie can learn HTML!--but when it comes to professional web development they are ignored WAY too much because of only that. The web's top sites require extremely brilliant front-end expertise, including Facebook, Hotmail, Gmail, Flickr, YouTube, MSNBC--even Amazon.com which most prominently features server-generated content but yet also reveals a significant amount of client-side expertise.

I've blogged it before and I'll mention it again: the one, first, and most recent time I ever had to personally fire a co-worker (due to my boss being out of town, my having the authority, and my boss requesting it of me over the phone) was when I was working with an "imported" contractor who had a master's degree and full Microsoft certification, but could not copy two simple hyperlinks with revised URLs in less than 5-10 minutes while I watched. The whole office was in a gossiping frenzy: "What? Couldn't create a hyperlink? Who doesn't know HTML?! How could anyone not know HTML?!" But I realized that the core fundamentals have been taken for granted by us as technologists to such an extent that we've forgotten how important it is to value them in our hiring processes.

7. ADO.NET direct SQL code or ORM. Pick one. Don't just use data layers. Learn OOP fundamentals. The ActiveRecord pattern is nice. Alternatively, if it's a really lightweight web solution, just go back to writing plain-Jane SQL with ADO.NET. If you're using C# 3.0, which of course you are in the context of this blog entry, then use LINQ-to-SQL or LINQ-to-Entities. On the ORM side, however, I'm losing favor with some of them because they often cater to a particular crowd. I'm slow to say "enterprise" because, frankly, too many people assume the word "enterprise" for their solutions when they are anything but. Even web sites running at tens of thousands of hits a day and generating hundreds of thousands of dollars of revenue every month aren't necessarily "enterprise". The term "enterprise" is more of a people-management inference than a stability or quality effort. It's about getting many people on your team using the same patterns and not having loose and abrupt access to thrash the database. For that matter, the corporate slacks-and-tie crowd of ASP.NET "Morts" can often relate to "enterprise" and not even realize it. But for a very small team (10 or fewer), and especially for a micro ISV (developers numbering 5 or fewer) with a casual and agile attitude, take the word "enterprise" with a grain of salt. You don't need a gajillion layers of red tape. For that matter, though, smaller teams are usually small because of tighter budgets, and that usually means tighter deadlines, and that means developer productivity must reign right there alongside stability and performance. So find an ORM solution that emphasizes productivity (minimal maintenance and easily adaptable), and don't you dare trade routine refactoring for task-oriented focus, as you'll end up just wasting everyone's time in the long run. Always include refactoring to simplicity in your maintenance schedule.

Why? Why go raw with ADO.NET direct SQL or choose an ORM? Because some people take the data layer WAY too far. Focus on what matters; take the effort to avoid the effort of fussing with the data tier. Data management is less important than most teams seem to think. The developer's focus should be on the UIX (User Interface/eXperience) and the application functionality, not on how the data is stored. There are three areas where the typical emphasis on data management is agreeably important: stability, performance (both of which are why we choose SQL Server over, oh, I dunno, XML files?), and queryability. The latter is important both for the application and for decision makers. But a fourth requirement is routinely overlooked, and that is the emphasis on being able to establish a lightweight developer workflow for working with data so that you can create features quickly and adapt existing code easily. Again, this is why a proper understanding of OOP--how to apply it, when to use it, etc.--is emphasized all the time by yours truly. Learn the value of abstraction, of inheritance, and of encapsulating interfaces (resulting in polymorphism). Your business objects should not be much more than POCO objects with application-realized properties. Adding a new simple data-persisted object, or modifying an existing one with, say, a new column, should not take more than a minute of one's time. Spend the rest of that time instead on how best to impress the user with a snappy, responsive user interface.

8. Callback-driven content should derive equally easily from your server, your partner's site, or some strange web service all the way in la-la land. We're aspiring for Web 3.0 now, but what happened to Web 2.0? We're building on top of it! Web 2.0 brought us mashups, single sign-ons, and cross-site social networking. Facebook Applications are a classic demonstration of an excelling Web 2.0 student graduating and turning into a Web 3.0 student. The problem is, to keep the momentum going, who's driving this rig? If it's not you, you're missing out on the 3.0 vision.

Why? Because now you can. Hopefully by now you've already shifted the bulk of the view logic over to the client. And you've empowered your developers to focus on the front-end UIX. Now, though, the client view is empowered to do more. It still has to derive content from you, but in a callback-driven architecture, the content is URL-defined. As long as security implications are resolved, you now have the entire web at your [visitors'] disposal! Now turn it around to yourself and make your site benefit from it!

If you're already invoking web services, get that stuff off your servers! Web services queried from the server cost bandwidth and add significant time overhead before the page is released from the buffer to the client. The whole time you're fetching the results of a web service you're querying, the client is sitting there looking at a busy animation or a blank screen. Don't let that happen! Throw the client a bone and let it fetch the external resources on its own.
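For cross-domain content, a JSONP request is one way to let the browser do the fetching (jQuery treats a callback=? parameter in $.getJSON as JSONP). The partner URL and the fields on each item in this sketch are invented for illustration:

    $.getJSON("http://partner.example.com/products.json?callback=?", function (items) {
        var list = $("#partnerProducts");
        for (var i = 0; i < items.length; i++) {
            // render the partner's data client-side; the server never proxied it
            list.append("<li>" + items[i].name + "</li>");
        }
    });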

9. Pay attention to the UIX design styles of the non-ASP.NET Web 2.0/3.0 communities. There is such a thing as a "Web 2.0 look", whether we like to admit it or not; we web developers evolved and came up with innovations worth standardizing on, why can't designers evolve and come up with visual innovations worth standardizing on? If the end user's happiness is our goal, how are features and stable and performant code more important than aesthetics and ease of use? The problem is, one perspective of what "the Web 2.0 look" actually looks like is likely very different from another's or my own. I'm not speaking of heavy gloss or diagonal lines. I most certainly am not talking about the "bubble gum" look. (I jokingly mutter "Let's redesign that with diagonal lines and beveled corners!" now and then, but when I said that to my previous boss and co-worker, both of whom already looked down on me WAY more than they deserved to do, neither of them understood that I was joking. Or, at least, they didn't laugh or even smile.) No, but I am talking about the use of artistic elements, font choices and font styles, and layout characteristics that make a web site stand out from the crowd as being highly usable and engaging. 

Let's demonstrate, shall we? Here are some sites and solutions that deserve some praise. None of them are ASP.NET-oriented.

  • http://www.javascriptmvc.com/ (ugly colors but otherwise nice layout and "flow"; all functionality driven by Javascript; be sure to click on the "tabs")
  • http://www.deskaway.com/ (ignore the ugly logo but otherwise take in the beauty of the design and workflow; elegant font choice)
  • http://www.mosso.com/ (I really admire the visual layout of this JavaServer Pages driven site; fortunately I love the fact that they support ASP.NET on their product)
  • http://www.feedburner.com/ (these guys did a redesign not too terribly long ago; I really admire their selective use of background patterns, large-font textboxes, hover effects, and overall aesthetic flow)
  • http://www.phpbb.com/ (stunning layout, rock solid functionality, universal acceptance)
  • http://www.joomla.org/ (a beautiful and powerful open source CMS)
  • http://goplan.org/ (I don't like the color scheme but I do like the sheer simplicity)
  • .. for that matter I also love the design and simplicity of http://www.curdbee.com/

Now here are some ASP.NET-oriented sites. They are some of the most popular ASP.NET-driven sites and solutions, but their design characteristics, frankly, feel like the late 90s.

  • http://www.dotnetnuke.com/ (one of the most popular CMS/portal options in the open source ASP.NET community .. and, frankly, I hate it)
  • http://www.officelive.com/ (sign in and discover a lot of features with a "smart client" feel, but somehow it looks and feels slow, kludgy, and unrefined; I think it's because Microsoft doesn't get out much)
  • http://communityserver.com/ (it looks like a step in the right direction, but there's an awful lot of smoke and mirrors; follow the Community link and you'll see the best of what the ASP.NET community has to offer in the way of forums .. which frankly doesn't impress me as much as phpBB)
  • http://www.dotnetblogengine.net/ (my blog uses this, and I like it well enough, but it's just one niche, and that's straight-and-simple blogs)
  • http://subsonicproject.com/ (the ORM technology is very nice, but the site design is only "not bad", and the web site starter kit leaves me shrugging with a shiver)

Let's face it, the ASP.NET community is not driven by designers.

Why? Why do I ramble on about such fluffy things? Because at my current job (see the intro text) the site design is a dump of one feature hastily slapped on after another, and although the web app has a lot of features and plenty of AJAX to empower it here and there, it is, for the most part, an ugly and disgusting piece of cow dung in the area of UIX (User Interface/eXperience). AJAX functionality is based on third-party components that "magically just work" while gobs and gobs of gobbledygook code on the back end attempt to wire everything together, and what AJAX is there is both rare and slow, encumbered by page bloat and server bloat. The front-end appearance is amateurish, and I'm disheartened as a web developer to work with it.

Such seems to be the makeup of way too many ASP.NET solutions that I've seen.

10. Componentize the client. Use "controls" on the client in the same way you might use .ASCX controls on the server, and in the process of doing this, implement a lifecycle and communications subsystem on the client. This is what I want to do, and again I'm thinking of coming up with a framework to pursue it to complement Microsoft's and others' efforts. If someone else (i.e. Microsoft) beats me to it, fine. I just hope theirs is better than mine.

Why? Well if you're going to emphasize the client, you need to be able to have a manageable development workflow.

ASP.NET thrives on the workflows of quick-tagging (<asp:XxxXxx runat="server" />) and drag-and-drop, and that's all part of the equation of what makes it so popular. But that's not all ASP.NET is good for. ASP.NET's greatest strengths are two: IIS and the CLR (namely the C# language). The quality of integration of C# with IIS is incredible. You get near-native-compiled-quality code with scripted-text-file ease of deployment, and the deployment is native to the server (no proxying, a la Apache->Tomcat->Java, or even FastCGI->PHP). So why not utilize these other benefits to seed a Javascript-based view rather than to generate the entirety of the view?
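To give a flavor of what I mean by componentizing the client--this is only a sketch of the idea, not an existing framework, and every name in it is invented--a client-side "control" could bind to a placeholder element, initialize itself, and talk to other controls only through a small publish/subscribe bus, much like .ASCX controls talk through events on the server:

    // a tiny publish/subscribe bus for control-to-control communication
    var PageBus = {
        handlers: {},
        subscribe: function (topic, fn) {
            (this.handlers[topic] = this.handlers[topic] || []).push(fn);
        },
        publish: function (topic, data) {
            var fns = this.handlers[topic] || [];
            for (var i = 0; i < fns.length; i++) fns[i](data);
        }
    };

    // a client-side "control" with a simple lifecycle: construct, then init()
    function SearchBoxControl(containerId) {
        this.container = $("#" + containerId);
    }
    SearchBoxControl.prototype.init = function () {
        this.container.html('<input type="text" /><button>Go</button>');
        var self = this;
        this.container.find("button").click(function () {
            PageBus.publish("search", self.container.find("input").val());
        });
    };

    // another control elsewhere on the page listens instead of being wired directly
    PageBus.subscribe("search", function (term) {
        $("#results").load("/search?q=" + encodeURIComponent(term));
    });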

On the competitive front, take a look at http://www.wavemaker.com/. Talk about drag-and-drop coding for smart client-side applications, driven by a rich server back-end (Java). This is some serious competition indeed.

11. RESTful URIs, not postback or Javascript inline resets of entire pages. Too many developers of AJAX-driven smart client web apps are bragging about how the user never leaves the page. This is actually not ideal.

Why? Every time the primary section of content changes, in my opinion, it should have a URI, and that should be reflected (somehow) in the browser's Address field. Even if it's going to be impossible to make the URL SEO-friendly (because there are no predictable hyperlinks that are spiderable), the user should be able to return to the same view later, without stepping through a number of steps of logging in and clicking around. This is partly the very definition of the World Wide Web: All around the world, content is reflected with a URL.
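Under 2008-era browser constraints, one lightweight way to honor this is to record the current view in the URL fragment so it can be bookmarked and restored later; the view names and URLs in this sketch are hypothetical:

    // when the primary content changes, give it an addressable fragment
    function showView(viewName) {
        $("#mainContent").load("/views/" + viewName + ".html");
        window.location.hash = viewName; // e.g. http://example.com/app#inbox
    }

    // on page load (or re-entry from a bookmark), restore the recorded view
    $(document).ready(function () {
        var view = window.location.hash.replace("#", "");
        if (view.length > 0) {
            showView(view);
        }
    });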

12. Glean from the others. Learn CakePHP. Build a simple symfony or Code Igniter site. Watch the Ruby On Rails screencasts and consider diving in. And have you seen Jaxer lately?!

And absolutely, without hesitation, learn jQuery, which Microsoft will be supporting from here on out in Visual Studio and ASP.NET. Discover the plug-ins and try to figure out how you can leverage them in an ASP.NET environment.

Why? Because you've lived in a box for too long. You need to get out and smell the fresh air. Look at the people as they pass you by. You are a free human being. Dare yourself to think outside the box. Innovate. Did you know that most innovations come from gleaning from other people's imaginative ideas and implementations and reapplying them in your own world, using your own tools? Why should Ruby on Rails have a coding workflow that's better than ASP.NET's? Why should PHP be a significantly more popular platform on the public web than ASP.NET--what makes it so special besides being completely free of Redmondite ties? Can you interoperate with it? Have you tried? How can the innovations of Jaxer be applied to the IIS 7 and ASP.NET scenario--what can you do to see something as earth-shattering inside this Mortian realm? How can you leverage jQuery to make your web site do things you wouldn't have dreamed of trying to do otherwise? Or at least, how can you apply it to make your web application more responsive and interactive than the typical junk you've been pumping out?

You can be a much more productive developer. The whole world is at your fingertips, you only need to pay attention to it and learn how to leverage it to your advantage.

 

And these things, I believe, are what will drive the Web 1.0 Morts in the direction of Web 3.0, building on the hard work of yesteryear's progress and making the most of the most powerful, flexible, stable, and comprehensive server and web development technology currently in existence--ASP.NET and Visual Studio--by breaking out of their molds and entering into the new frontier.




 



