Automatically Declaring Namespaces in Javascript (namespaces.js)

by Jon Davis 25. September 2012 18:24

Namespaces in Javascript are a pattern that many untrained or undisciplined developers fail to adopt, but they are an essential strategy for maintainability and for avoiding collisions in Javascript source.

Part of the problem with namespaces is that if you have a complex client-side solution with several Javascript objects scattered across several files but they all pertain to the same overall solution, you may end up with very long, nested namespaces like this:

var ad = AcmeCorporation.Foo.Bar.WidgetFactory.createWidget('advertisement');

I personally am not opposed to long namespaces, so long as they can be shortened with aliases when their length gets in the way.

var wf = AcmeCorporation.Foo.Bar.WidgetFactory;
var ad = wf.createWidget('advertisement');

The problem I have run into, however, is that when I have multiple .js files in my project and I am not 100% sure of their load order, I may run into errors. For example:

// acme.WidgetFactory.js
AcmeCorporation.Foo.Bar.WidgetFactory = {
    createWidget: function(e) {
        return new otherProvider.Widget(e);
    }
};

This may throw an error immediately because even though I’m declaring the WidgetFactory namespace, I am not certain that these namespaces have been defined:

  • AcmeCorporation
  • AcmeCorporation.Foo
  • AcmeCorporation.Foo.Bar

So again if any of those are missing, the code in my acme.WidgetFactory.js file will fail.
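
To make the failure concrete, here is a minimal sketch (assuming `AcmeCorporation` has not been declared anywhere yet):

```javascript
// Sketch of the failure mode: assigning through an undeclared root
// identifier throws a ReferenceError before any assignment happens.
var caught = null;
try {
  AcmeCorporation.Foo.Bar.WidgetFactory = {}; // AcmeCorporation is undefined
} catch (e) {
  caught = e;
}
// caught is a ReferenceError ("AcmeCorporation is not defined")
```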

So then I clutter it with code that looks like this:

// acme.WidgetFactory.js
if (!window['AcmeCorporation']) window['AcmeCorporation'] = {};
if (!AcmeCorporation.Foo) AcmeCorporation.Foo = {};
if (!AcmeCorporation.Foo.Bar) AcmeCorporation.Foo.Bar = {};
AcmeCorporation.Foo.Bar.WidgetFactory = {
    createWidget: function(e) {
        return new otherProvider.Widget(e);
    }
};

This is frankly not very clean. It adds a lot of overhead to my productivity just to get started writing code.

So today, to complement my using.js solution (which dynamically loads scripts), I have cobbled together a very simple script that dynamically defines a namespace in a single line of code:

// acme.WidgetFactory.js
namespace('AcmeCorporation.Foo.Bar');
AcmeCorporation.Foo.Bar.WidgetFactory = {
    createWidget: function(e) {
        return new otherProvider.Widget(e);
    }
};
/* or, alternatively ..
namespace('AcmeCorporation.Foo.Bar.WidgetFactory');
AcmeCorporation.Foo.Bar.WidgetFactory.createWidget = function(e) {
    return new otherProvider.Widget(e);
};
*/

As you can see, a function called “namespace” splits the dot-notation and creates the nested objects on the global namespace to allow for the nested namespace to resolve correctly.

Note that this will not overwrite or clobber an existing namespace; it will only ensure that the namespace exists.

a = {};
a.b = {};
a.b.c = 'dog';
namespace('a.b.c');
alert(a.b.c); // alerts with "dog"

Where you will still need to be careful, if you are not sure of load order, is that every name along the dot-notation path should be declared only as a namespace and never assigned a defined object; otherwise, assigning a defined object manually may clobber nested namespaces and nested objects.

namespace('a.b.c');
a.b.c.d = 'dog';
a.b.c.e = 'bird';
// in another script ..
a.b = {
    c : {
        d : 'cat'
    }
};
// in consuming script / page
alert(a.b.c); // alerts '[object Object]'
alert(a.b.c.d); // alerts 'cat'
alert(a.b.c.e); // alerts 'undefined'
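
One defensive pattern, then, is for every file to declare its namespace and attach members individually, never assigning an object literal over a parent node. Here is a sketch of that discipline; `namespaceOn` is a hypothetical variant that walks an explicit root object (rather than the global object) so the example is self-contained, and it returns the leaf so members can be attached in one expression:

```javascript
// Hypothetical variant of namespace() taking an explicit root object.
// It creates any missing nodes along the path and returns the leaf.
function namespaceOn(root, ns) {
  var parts = ns.split('.');
  var g = root;
  for (var i = 0; i < parts.length; ++i) {
    if (!(parts[i] in g)) g[parts[i]] = {};
    g = g[parts[i]];
  }
  return g;
}

var app = {};

// File 1 (loaded in any order) attaches its member:
namespaceOn(app, 'a.b.c').d = 'dog';

// File 2 (loaded in any order) attaches its member too:
namespaceOn(app, 'a.b.c').e = 'bird';

// Because neither file assigned over app.a.b, both members survive:
// app.a.b.c.d === 'dog' and app.a.b.c.e === 'bird'
```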

Here’s the download if you want it as a script file [EDIT: the linked resource has since been modified and has grown significantly], and here is its [original] content:

function namespace(ns) {
    var g = function() { return this; }();
    ns = ns.split('.');
    for (var i = 0, n = ns.length; i < n; ++i) {
        var x = ns[i];
        if (x in g === false) g[x] = {};
        g = g[x];
    }
}

The above is actually written by commenter "steve" (sjakubowsi -AT- hotmail -dot-com). Here is the original solution that I had come up with:

namespace = function(n) {
    var s = n.split('.');
    var exp = 'var ___v=undefined;try {___v=x} catch(e) {} if (___v===undefined)x={}';
    var e = exp.replace(/x/g, s[0]);
    eval(e);
    for (var i = 1; i < s.length; i++) {
        var ns = '';
        for (var p = 0; p <= i; p++) {
            if (ns.length > 0) ns += '.';
            ns += s[p];
        }
        e = exp.replace(/x/g, ns);
        eval(e);
    }
}
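
One caveat worth noting about the global-object trick in steve's version above: `function(){return this}()` only yields the global object in non-strict code; in strict mode, a bare function call binds `this` to `undefined`. A strict-safe sketch (using `globalThis`, a modern addition that is an assumption here, with a non-strict `Function` body as a fallback) might look like:

```javascript
// Hedged sketch: a strict-mode-safe way to resolve the global object
// before walking the dot-notation path.
function namespaceSafe(ns) {
  var g = typeof globalThis !== 'undefined'
    ? globalThis
    : Function('return this')(); // Function-constructed bodies are non-strict
  var parts = ns.split('.');
  for (var i = 0; i < parts.length; ++i) {
    if (!(parts[i] in g)) g[parts[i]] = {};
    g = g[parts[i]];
  }
}

namespaceSafe('Acme.Util');
// Acme.Util now exists on the global object without clobbering anything
```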


Why jQuery Plugins Use String-Referenced Function Invocations

by Jon Davis 25. September 2012 10:52

Some time ago (years ago) I cobbled together a jQuery plug-in or two that at the time I was pretty proud of, but in retrospect I’m pretty embarrassed. One of these plugins was jqDialogForms. The embarrassment was not due to its styling—the point of it was that it could be skinnable, I just didn’t have time to create sample skins—nor was the embarrassment due to the functional conflict with jQuery UI’s dialog component, because I had a specific vision in mind which included modeless parent/child ownership, opening by simple DOM reference or by string, and automatic form serialization to JSON. Were I to do all this again I would probably just extend jQuery UI with syntactical sugar, and move form serialization to another plugin, but all that is a tangent from the purpose of this blog post. My embarrassment with jqDialogForms is with the patterns and conventions I chose in contradiction to jQuery’s unique patterns.

Since then I have abandoned (or perhaps neglected) jQuery plugins development, but I still formed casual and sometimes uneducated opinions along the way. One of the patterns that had irked me was jQuery UI’s pattern of how its components’ functions are invoked:

$('#mydiv').accordion( 'disable' );
$('#myauto').autocomplete( 'search' , [value] );
$('#prog').progressbar( 'value' , [value] );

Notice that the actual functions being invoked are identified with a string parameter into another function. I didn’t like this, and I still think it’s ugly. This came across to me as “the jQuery UI” way, and I believed that this contradicted “the jQuery way”, so for years I have been baffled as to how jQuery could have adopted jQuery UI as part of its official suite.

Then recently I came across this, and was baffled even more:

http://docs.jquery.com/Plugins/Authoring

Under no circumstance should a single plugin ever claim more than one namespace in the jQuery.fn object.

(function( $ ){

  $.fn.tooltip = function( options ) { 
    // THIS
  };
  $.fn.tooltipShow = function( ) {
    // IS
  };
  $.fn.tooltipHide = function( ) { 
    // BAD
  };
  $.fn.tooltipUpdate = function( content ) { 
    // !!!  
  };

})( jQuery );

This is discouraged because it clutters up the $.fn namespace. To remedy this, you should collect all of your plugin’s methods in an object literal and call them by passing the string name of the method to the plugin.

(function( $ ){

  var methods = {
    init : function( options ) { 
      // THIS 
    },
    show : function( ) {
      // IS
    },
    hide : function( ) { 
      // GOOD
    },
    update : function( content ) { 
      // !!! 
    }
  };

  $.fn.tooltip = function( method ) {
    
    // Method calling logic
    if ( methods[method] ) {
      return methods[ method ].apply( this, Array.prototype.slice.call( arguments, 1 ));
    } else if ( typeof method === 'object' || ! method ) {
      return methods.init.apply( this, arguments );
    } else {
      $.error( 'Method ' +  method + ' does not exist on jQuery.tooltip' );
    }    
  
  };

})( jQuery );

// calls the init method
$('div').tooltip(); 

// calls the init method
$('div').tooltip({
  foo : 'bar'
});

// calls the hide method
$('div').tooltip('hide'); 
// calls the update method
$('div').tooltip('update', 'This is the new tooltip content!'); 

This type of plugin architecture allows you to encapsulate all of your methods in the plugin's parent closure, and call them by first passing the string name of the method, and then passing any additional parameters you might need for that method. This type of method encapsulation and architecture is a standard in the jQuery plugin community and is used by countless plugins, including the plugins and widgets in jQueryUI.

What baffled me was not their initial reasoning pertaining to namespaces. I completely understand the need to keep plugins’ namespaces in their own bucket. What baffled me was how this was considered a solution. Why not simply use this?

$('#mythingamajig').mySpecialNamespace.mySpecialFeature.doSomething( [options] );

To try to prove that I could make both myself and the “official” jQuery team happy, I cobbled this test together ..

(function($) {
    
    $.fn.myNamespace = function() {
        var fn = 'default';
        
        var args = $.makeArray(arguments);
        if (args.length > 0 && typeof args[0] === 'string' && $.fn.myNamespace[args[0]]) {
            fn = args[0];
            args = args.slice(1);
        }
        $.fn.myNamespace[fn].apply(this, args);
    };
    $.fn.myNamespace.default = function() {
        var s = '\n';
        var i=0;
        $(arguments).each(function() {            
            s += 'arg' + (++i).toString() + '=' + this + '\n';
        });
        alert('Default' + s);
        
    };
    $.fn.myNamespace.alternate = function() {
        var s = '\n';
        var i=0;
        $(arguments).each(function() {            
            s += 'arg' + (++i).toString() + '=' + this + '\n';
        });
        alert('Alternate' + s);
        
    };

    $().myNamespace('asdf', 'xyz');
    $().myNamespace.default('asdf', 'xyz');
    $().myNamespace('default', 'asdf', 'xyz');
    $().myNamespace.alternate('asdf', 'xyz');
    $().myNamespace('alternate', 'asdf', 'xyz');
    
})(jQuery);

Notice the last few lines in there ..

    $().myNamespace('asdf', 'xyz');
    $().myNamespace.default('asdf', 'xyz');
    $().myNamespace('default', 'asdf', 'xyz');
    $().myNamespace.alternate('asdf', 'xyz');
    $().myNamespace('alternate', 'asdf', 'xyz');

When this worked as I hoped, I originally set about making this blog post a “plugin generator plugin” that would make plug-in creation really simple and also enable the above calling convention. But when I got past the first few passing tests and added a few more, I realized I had failed to notice a critical detail: the this context, and chainability.

In JavaScript, navigating a namespace as with $.fn.myNamespace.something.somethingelse doesn’t execute any code along the dot-notation path. Without the execution of functional code, there is no binding for the this context, which should be the jQuery-wrapped selection, and as such there is nothing to return as a chainable object. (I realize that it is possible to execute code with modern JavaScript getters and setters, but not all modern browsers support getters and setters, and certainly not all commonly used browsers do.) This was something that I, as a C# developer, found easy to forget and overlook, because in C# we take the passing around of context in property getters for granted.
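
To illustrate the distinction without jQuery, here is a self-contained sketch (where `Wrapper` is a hypothetical stand-in for the jQuery object, assuming getter support): a property getter runs code during dot-notation navigation and can therefore capture the receiver, which is exactly what a plain namespace object cannot do.

```javascript
// Hypothetical stand-in for a jQuery-like wrapper object.
function Wrapper(name) { this.name = name; }

// A getter executes during property access, so it can capture `this`
// (the receiver) and hand it to the nested namespace's functions.
Object.defineProperty(Wrapper.prototype, 'myNamespace', {
  get: function () {
    var ctx = this; // captured receiver, analogous to the jQuery selection
    return {
      doSomething: function () {
        return ctx; // returning the receiver preserves chainability
      }
    };
  }
});

var w = new Wrapper('demo');
var chained = w.myNamespace.doSomething();
// chained === w, so a chain can continue from the original object
```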

Surprisingly, this technical reasoning for the string-based function identifier for jQuery plug-in function invocations was not mentioned on the jQuery Plugins documentation site, nor was it mentioned in the Pluralsight video-based training I recently perused. It seemed like what Pluralsight’s trainer was saying was, “You can use $().mynamespace.function1(), but that’s obscure! Use a string parameter instead!” And I’m like, “No, it is not obscure! Calling a function by string is obscure, because you can’t easily identify it as a function reference distinct from a parameter value!”

The only way to retain the this context while removing the string-based function reference is to invoke it along the way.

$().myNamespace().myFunction('myOption1', true, false);

Notice the parentheses after .myNamespace. That is a wholly different convention that few in jQuery-land are used to, but I do think that it is far more readable than ..

$().myNamespace('myFunction', 'myOption1', true, false);

I still like the former; it is more readable, and I remain unsure as to why the latter is the accepted convention. My guess is that a confused user might try to chain back to jQuery right after .myNamespace() rather than after executing a nested function. And that, I suppose, demonstrates how the former pattern is contrary to jQuery’s chainability design of every().invocation().just().returns().jQuery.
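
As a self-contained sketch of that invoke-along-the-way convention (again with a hypothetical `Wrapper` in place of jQuery): calling `.myNamespace()` as a function gives it the chance to bind its receiver, so the nested functions keep their context without any string dispatch.

```javascript
// Hypothetical wrapper standing in for the jQuery object.
function Wrapper(name) { this.name = name; }

// myNamespace is invoked (note the parentheses), which binds `this`
// and lets the returned object close over it.
Wrapper.prototype.myNamespace = function () {
  var ctx = this;
  return {
    myFunction: function (option) {
      // real work would happen here against ctx; return it for chaining
      return ctx;
    }
  };
};

var w = new Wrapper('demo');
var result = w.myNamespace().myFunction('myOption1');
// result === w: the chain returns to the original wrapper
```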


Personal Status Update [September 19, 2012]

by Jon Davis 19. September 2012 12:16

I know, I know, I promised I’d pick up some steam on my blog and then suddenly I got quiet. Here’s the deal .. when I started picking up a bit more steam a couple months ago, I was looking for my next permanent job. Well, I found it. And, I’m kind of floored by its potential, as well as the preexisting knowledge that I’d be surrounded by some amazing people as well as be given high expectations of technical and professional maturity. This is my dream job.

I’ll post more but for now I need to catch up on some Pluralsight training, perhaps (finally) get some Microsoft certifications (MCPD and perhaps at some point even MCM), get a couple project successes behind me, and ultimately prove my worth to my employer. They’re watching. o.O

Oh, by the way, remember using.js? I created it waaay back in 2008 and it was surprisingly popular. Anyway, today I finally moved it over to GitHub. https://github.com/stimpy77/using.js Recent Javascript design patterns training on Pluralsight as a refresher got me motivated to blow the dust off of it and put it in a more publicly enjoyable space. Cheers.


ASP.NET MVC 4: Where Have All The Global.asax Routes Gone?

by Jon Davis 23. June 2012 03:03

I ran into this a few days back and had been meaning to blog about it, so here it finally is while it’s still interesting information.

In ASP.NET MVC 1.0, 2.0, and 3.0, routes are defined in the Global.asax.cs file in a method called RegisterRoutes(..).

[Screenshot: RegisterRoutes(..) in ASP.NET MVC 3’s Global.asax.cs]

It had become an almost unconscious navigate-and-click routine for me to open Global.asax.cs up to diagnose routing errors and to introduce new routes. So upon starting a new ASP.NET MVC 4 application with Visual Studio 11 RC (or Visual Studio 2012 RC, whichever it will be called), it took me by surprise to find that the RegisterRoutes method is no longer defined there. In fact, the MvcApplication class defined in Global.asax.cs contains only 8 lines of code! I panicked when I saw this. Where do I edit my routes?!

[Screenshot: the minimal Global.asax.cs in ASP.NET MVC 4]

What kept me befuddled for far too long (quite a bit longer than a couple seconds, shame on me!) was the fact that these lines of code, when not actually read and only glanced at, look similar to the Application_Start() from the previous iteration of ASP.NET MVC:

[Screenshot: Application_Start() in ASP.NET MVC 3’s Global.asax.cs]

Eventually I squinted and paid closer attention to the difference, and then I realized that the RegisterRoutes(..) method is being invoked still but it is managed in a separate configuration class. Is this class an application settings class? Is it a POCO class? A wrapper class for a web.config setting? Before I knew it I was already right-clicking on RegisterRoutes and choosing Go To Definition ..

[Screenshot: Go To Definition on RegisterRoutes in Global.asax.cs]

Under Tools –> Options –> Projects and Solutions –> General I have Track Active Item in Solution Explorer enabled, so upon right-clicking an object member reference in code and choosing “Go To Definition” I always glance over at Solution Explorer to see where it navigates to in the tree. This is where I immediately found the new config files:

[Screenshot: the new App_Start folder in Solution Explorer]

.. in a new App_Start folder, which contains FilterConfig.cs, RouteConfig.cs, and BundleConfig.cs, as named by the invoking code in Global.asax.cs. And to answer my own question, these are POCO classes, each with a static method (i.e. RegisterRoutes).

I like this change. It’s a minor refactoring that cleans up code. I don’t understand the naming convention of App_Start, though. It seems like it should be called “Config” or something, or else Global.asax.cs should be moved into App_Start as well since Application_Start() lives in Global.asax.cs. But whatever. Maintaining configuration details in one big Global.asax.cs file gets to be a bit of a pain sometimes especially in growing projects so I’m very glad that such configuration details are now tucked away in their own dedicated spaces.

I am curious but have not yet checked to determine whether App_Start as a new ASP.NET folder has any inherent behaviors associated with it, such as for example post-edit auto-compilation. I’m doubtful.

In future blog post(s), perhaps my next post, I’ll go over some of the other changes in ASP.NET MVC 4.


LiveStream No Longer Free, Now $45 Per Month

by Jon Davis 1. March 2012 13:18

This is not a developer post at all; I wanted to share it with the world to see if it might get picked up by Google, as I am somewhat bewildered that a Google search does not reveal this information at all.

I have been enjoying a side personal interest in video production for several months now. I recently found a venue to focus this interest—a weekly group event at my church. I’ve become the videographer there during Wednesday evening meetings, and I’ve revived the group’s vision of LiveStreaming the meetings by providing the tools needed to get the job done, i.e. connectors from my camera, converters to a PC, etc.

Unfortunately, after some recent changes at LiveStream that are still underway, it appears that with the “New Livestream” the ability to produce and publish live content or share video clips for free is going away completely. It is apparently being replaced by “Producer Accounts,” which will apparently cost $45 each month. This is what I posted on their forums @ http://www.livestream.com/forum/showpost.php?p=31605&postcount=40:

There does appear to be a Search field at the top of the site, don't know if it was recently added or not, but it doesn't seem to work to find either of the two channels I support. I agree that category browsing MUST be returned.

My concern, however, is that the profile page on the "new" site shows nothing. The old page for http://www.livestream.com/{channelname} works "fine", but in the new site with http://new.livestream.com/accounts/79748 it's flat disabled. There is no support whatsoever for even producing live content for free. The messaging when logged in says, "Upgrade to a Producer Account to Create Events - We see that you currently have a New Livestream Viewer Account (Free). To create events and post text, video clips, photos and live video, you'll require a New Livestream Producer Account." Even video clips and live video requires a "producer account" which starts at $45/mo.

Presumably, the old site and its featureset are going to go away completely. So, apparently, it's over, guys. No more free LiveStreaming.

These observations could be temporary or perhaps misread, but it looked pretty obvious to me that this is what is going on.

[Screenshot: the “Upgrade to a Producer Account” message]

*click* on Upgrade Now ..

[Screenshot: Producer Account pricing]

UPDATE: Heard back from LiveStream:

"The reason your channel is not showing up in search is most likely because it has not yet been verified. To be clear, the original Livestream isn't going anywhere. We plan to continue to offer free streaming for the foreseeable future."


Adding Compiled .ResX Resources To NuGet Packages

by Jon Davis 16. November 2011 17:05

At my current workplace, we are using NuGet internally for managing internal ASP.NET MVC 3 project templates. If you open up Visual Studio’s Tools -> Options dialog window and expand Package Manager, you will find a “Package Sources” section where you can define locations for acquiring NuGet packages. We have a second location defined in this section with the path being a directory on our LAN. There are several advantages of taking this approach to managing team resources, not the least of which is the fact that updating our template source code repositories over SVN will not, and should not, update (read: break) the development workflows for people already working on their projects unless those individuals manually invoke an Update-Package command from the Package Manager Console. This scenario is obviously not ideal for many teams but it is quite useful for ours since each project instance has its own lifecycle and is short-lived.

One of the dependencies of our ASP.NET MVC templates is the utilization of .resx files for managing various pieces of content such as the labels on the forms. Having a resource file gives specific members of the organization a very specific and anticipated location to update the content to suit the needs of the project instance. The content items are also programmatically accessible when the Access Modifier is set to public; a .Designer.cs file is generated and injected into the project automatically by Visual Studio, appearing as a generated code-behind file for the .resx file, which exposes the API required to programmatically read content from this file by item name so that the developer does not have to stream the .resx out manually as an embedded resource stream. Microsoft .NET also automatically pulls the content from the suitable resource file when the culture is added to the filename, based on the culture of the user’s current thread; in other words, if a resource file is called FormFields.resx, but the user is French-Canadian (fr-CA), then the content in FormFields.fr-CA.resx is automatically used. Of course, we have to manually synchronize the user’s culture with the thread (that is a separate discussion), but the point is that it can be beneficial, and in our case it is, to utilize .resx resources in a project.

Unfortunately, when installing a NuGet package that contains .resx file content, the generated designer C# file that exposes programmatic access to the items in the resource file does not get added correctly, nor is the metadata applied to the .resx file that declares the resource’s Access Modifier to be “Public”, which is how Visual Studio knows to inject that generated file. Our template’s C# codebase depends upon the presence of the generated C# object members for the .resx, so in the absence of the generated code-behind file the project importing this template will not compile. Our workaround had been to open up the .resx file, change the Access Modifier back to “Public”, save it, and Rebuild. This worked fine, but it has been a huge annoyance.

So we decided to look into automatically fixing this within the Install.ps1 PowerShell script which NuGet invokes upon installing a package. Visual Studio’s DTE automation object and its Project object are both injected to the Install.ps1 script that PowerShell invokes.

param($installPath, $toolsPath, $package, $project)

We store our resources in a Resources directory in the project, so iterating through the project items to identify our .resx files was straightforward enough.

$resitems = @{}
foreach ($item in $project.ProjectItems) 
{
    if ($item.Name -eq "Resources") {
        $resources = $item
        foreach ($resitem in $resources.ProjectItems) {
            if ($resitem.Name.ToLower().EndsWith(".resx")) {
                $resitems.Add("Resources\" + $resitem.Name, $resitem.FileNames(0))
            }
        }
    }
}

Unfortunately, I found no way within EnvDTE automation to modify the properties of the .resx file that pertain to the generated file! At best, the ProjectItem object exposes a Properties member that lists out various bits of metadata, but I found nothing here that can be changed to cause the .resx file to use a generated file.

The best I could find and tried to play with was a property called “IsDesignTimeBuildInput” that I thought I could apply to the .Designer.cs file, but attempting to set this value to true proved unfruitful:

# where $cb2 is the .Designer.cs file
$cb2.Properties.Item("IsDesignTimeBuildInput").Value = $TRUE

.. results in ..
Exception setting "Value": "Invalid number of parameters. (Exception from HRESULT: 0x8002000E (DISP_E_BADPARAMCOUNT))"

I did manage to get a code-behind file added to the .resx file, however.

switch ($item.ProjectItems) { default {
	if ($_.Name.ToLower() -eq $resitem.Name.ToLower().Replace(".resx", ".designer.cs")) {
		$hasCodeBehind = $TRUE
		$codebehinditem = $_
	}
}}
if ($hasCodeBehind -eq $TRUE) {
	$fn = $resitem.FileNames(0)
	$cbfn = $codebehinditem.FileNames(0)
	$codebehinditem.Remove()
	$cb2 = $resitem.ProjectItems.AddFromFile($cbfn)
}

At this point, it would prove obviously beneficial to use a comparison tool such as Beyond Compare (which I used) to compare the contents of the .resx file, the .Designer.cs file, and the .csproj file (the Visual Studio project file) between my half-restored NuGet injection and a properly working project instance. Doing this, I found that there are absolutely no changes made to the .resx file to toggle its code-behind / generator behavior, and of course the .Designer.cs is just the output of the generator so it has no flags, either. All of this metadata is therefore made to the project (.csproj) file.

And since there do not seem to be any EnvDTE interfaces to support these project file changes, it seems that the change must be made in the project XML directly. This can cause all kinds of unpredictable problems, the least of which is an ugly dialog box for the user, “Project has changed, reload?” Nonetheless, this is what’s working for us, and it’s better than a broken build that requires us to manually open the .resx file.

The full solution:

param($installPath, $toolsPath, $package, $project)

#script to fix code-behind for resx
set-alias Write-Host -Name whecho
whecho "Restoring resource code-behinds (this may cause the project to be reloaded with a dialog) ..."
$resitems = @{}
foreach ($item in $project.ProjectItems) 
{
    if ($item.Name -eq "Resources") {
        $resources = $item
        foreach ($resitem in $resources.ProjectItems) {
            $codebehinditem = $NULL
            if ($resitem.Name.ToLower().EndsWith(".resx")) {
                $hasCodeBehind = $FALSE
                switch ($item.ProjectItems) { default {
                    if ($_.Name.ToLower() -eq $resitem.Name.ToLower().Replace(".resx", ".designer.cs")) {
                        $hasCodeBehind = $TRUE
                        $codebehinditem = $_
                    }
                }}
                if ($hasCodeBehind -eq $TRUE) {
                    $fn = $resitem.FileNames(0)
                    $cbfn = $codebehinditem.FileNames(0)
                    $codebehinditem.Remove()
                    $cb2 = $resitem.ProjectItems.AddFromFile($cbfn)
                }
                $resitems.Add("Resources\" + $resitem.Name, $resitem.FileNames(0))
                whecho $resitem.Name
            }
        }
    }
}
$project.Save($project.FullName)
$projxml = [xml](get-content $project.FullName)
$ns = New-Object System.Xml.XmlNamespaceManager $projxml.NameTable
$defns = "http://schemas.microsoft.com/developer/msbuild/2003"
$ns.AddNamespace("csproj", $defns)
foreach ($item in $resitems.GetEnumerator()) {
	$xpath = "//csproj:Project/csproj:ItemGroup/csproj:Compile[@Include=`"" + $item.Name.Replace(".resx", ".Designer.cs") + "`"]"
	$resxDesignerNode = $projxml.SelectSingleNode($xpath, $ns)
	
	if ($resxDesignerNode -ne $NULL) {
	
		$autogen = $projxml.CreateElement('AutoGen', $defns)
		$autogen.InnerText = 'True'
		$resxDesignerNode.AppendChild($autogen)
		
		$designtime = $projxml.CreateElement('DesignTime', $defns)
		$designtime.InnerText = 'True'
		$resxDesignerNode.AppendChild($designtime)
	}
	
	$xpath = "//csproj:Project/csproj:ItemGroup/csproj:EmbeddedResource[@Include=`"" + $item.Name + "`"]"
	$resxNode = $projxml.SelectSingleNode($xpath, $ns)

	$generator = $projxml.CreateElement('Generator', $defns)
	$generator.InnerText = 'PublicResXFileCodeGenerator'
	$resxNode.AppendChild($generator)
	
	if ($resxDesignerNode -ne $NULL) {
		$lastGenOutput = $projxml.CreateElement('LastGenOutput', $defns)
		$lastGenOutput.InnerText = $item.Name.Replace("Resources\", "").Replace(".resx", ".Designer.cs")
		$resxNode.AppendChild($lastGenOutput)
	}

}
$projxml.Save($project.FullName)

UPDATE: Just an update on this, we have abandoned this approach to editing the XML. The project XML can be manipulated in-memory using the MSBuild automation object. Hints of what to do are found here:

http://nuget.codeplex.com/discussions/254095


Upgrade / Update Rooted Android HTC EVO 4G Including HBOOT 2.16 Update

by Jon Davis 10. October 2011 06:50

I am just putting this out there for the Googlebots to pick up because I just went through hell trying to gather this information. The Android rooting community is awfully crude.

I had a somewhat stale version of Unrevoked-rooted Android installed on my HTC EVO 4G phone and it had 2.10.001 of HBOOT installed. Sprint pushed a new system update a month and a half or two months ago, and any such system update requires backing up (Titanium Backup), updating, re-rooting, and restoring. The latest from the Android rooting community is http://revolutionary.io which seems to require v2.15 or v2.16 of HBOOT. In my case, I got this unfriendly error: supersonic with 2.10.001 is not supported at this time.

So the obvious next path to take is to update HBOOT. Google for “how do I update HBOOT” or “update HBOOT”' or “upgrade HBOOT” or “download latest HBOOT”, etc., and you get a gajillion hits to absolute crap. Tons of forum.xda-developers.com forum posts, et al, but with no explanation on how to upgrade HBOOT. Eventually I figured that maybe the latest OTA update package download might bundle in an HBOOT update, so I kicked off a download of PC36IMG_SuperSonic_GB_Sprint_WWE_4.53.651.1_Radio_2.15.00.0808_NV_2.15_release_209995_signed.zip

This seemed to do the trick. Before I ran that, though, I should add, revolutionary.io’s web site referenced an IRC channel. Someone on the Unrevoked team proved INCREDIBLY helpful months ago when that was the primary method of rooting the EVO 4G, so I figured that was a likely good path. All I wanted to do was to validate that the above-mentioned .zip file was the correct approach, as well as seek assistance on steps to properly deploy in case my guesswork fails me.

Unfortunately, the guys I ran into on the revolutionary.io team got bitten by the old IRC social crudeness bug. Not only did they complain that my questions were more appropriate for a “general Android discussion channel” and point me at http://revolutionary.io/topic.jpg, but when I said that it couldn’t be more on-topic, since their own EXE indicated that only v2.15/2.16 is supported, they immediately banned me from the channel and blocked my IP from their web server. (As if I didn’t have other access points.)

Just be wary. Rude and crude people lurk in those parts.

To root access developers, you might find more success in life if you replace that whole “ban the n00bs” mentality with a donation button.


YouTube Is No Longer For Leeroy Jenkins

by Jon Davis 28. August 2011 19:22
I am still very happy to keep my focus on web development as part of my day job, but for the last several months I have been getting personally acquainted with the World Wide Web's second or third most viewed web site. Not Facebook, not Google, we all know about those. These are fast getting supplanted by a web site and social network that is antiquating those two sites.

I am referring of course to YouTube.

The last year or so has seen a jaw-dropping surge of activity on YouTube--not just in my own free time but in statistically everyone's Internet use on the whole. Once known mainly for tired memes like the dancing baby of 1996, the fake vlogs of lonelygirl15, and the laughably bizarre martial arts moves of Star Wars fans, YouTube has recently been revolutionized by the broad availability of camera-enabled smartphones, iPads, and HD-video-ready cameras.

Two months ago, more than 30 hours of video content were being uploaded to YouTube every minute. Today, invigorated by the proliferation of high quality video support in smartphones, cameras, and iPads, that number has grown to approximately 50 hours of video content per minute, and climbing. This is clearly the year of YouTube. Even I have begun dinking around with producing YouTube content, partly out of a huge curiosity I've had since I was a child in videography, photography, video editing, and video effects, and partly out of interest in the social, interactive network that YouTube is.

I have also been attempting to use the video camera as a new sort of mirror, to reexamine my personal self and my life at home. The whole process has been surprisingly revealing and transformative. To boldly hold up a video camera, point it at oneself in one's own home, and say, "This is me, my life, how I live" is, regardless of whether one makes such content public, a life-changing way to examine oneself through "another pair of eyes," so to speak. This is especially true for me, living alone, without a family (so far), with no one giving me direct daily feedback on how I think and live. On the other hand, were I married, I think we would have a blast as a family sharing real collaborative content with other vloggers rather than me going it alone.

But as far as the social network of YouTube goes, for me, YouTube has replaced both television and PC/Xbox gaming as the entertainment venue of choice. I no longer watch TV, except to watch the news and Conan O'Brien. MMORPGs have no charm anymore; LOTRO (and for that matter World of Warcraft) once enticed me with its dazzling graphics and fun gameplay, but the key ingredient in MMORPGs is the idea of doing fun stuff with other human beings around the world in a surreal way. YouTube is like an MMORPG, in a sense, too, but it is that MMORPG known as "real life", and I am entranced by the magic of watching my "friends"--that is, my favorite YouTube vloggers--crack jokes at each other, make music together, discover natural beauty of the Earth together, enjoy adorable pets together, travel the world together, or just slow down and be artistic.

Don't call it narcissism. When you have a video camera in your hands and flip the "REC" switch, anything and everything becomes a resource for creative content generation, and it's perfectly logical to take advantage of the most pliable, animate, and controllable piece of material at one's disposal: oneself. On the other hand, should one be so lucky as to have other individuals, surroundings, or other animate subject matter available, one can forget oneself and focus on that instead.

I credit the bulk of my fascination to the perfect blending of high definition video cameras, the HD video hosting that YouTube provides, and high bandwidth from Internet service providers. High definition video has become the new "nice graphics" of last decade's graphics cards and gameplay; where I used to enjoy PC games like Unreal Tournament and Guild Wars because the graphics were so rich in detail, now I can watch a fresh 1080p high definition video produced by a fellow YouTube vlogger, and the content is so realistic it is actually real. ;)  It's still only being rendered on my computer monitor, but you can even watch YouTube videos with 3D glasses, and produce such video content relatively cheaply.

Vlogging (short for "video blogging") is taking over [written-form] blogging, and this is becoming more real by the hour. In fact, what programming I have been doing at home has involved abandoning (temporarily) the blogging software I was writing about just a couple months ago in order to work on some new desktop vlogging software I may or may not sell someday, if only for the occasional paid-for Starbucks coffee. It takes advantage of the YouTube publishing API and alleviates the problem I saw and experienced with YouTube's video upload page balking frequently on my erratic Wi-Fi connection. 

For those following for business-related interests, vlogging's growing popularity presents both an opportunity and a problem for Internet monetization. YouTube has a closed but ever-present monetization model: owned by Google, it is Google. If you want to make money on YouTube, you need to produce compelling content, associate your YouTube account with a Google AdSense account, and get people to watch that content. AdWords ads are then displayed directly over and alongside the videos played by the viewing user. This is the traditional revenue model, and it has succeeded even for amateur vloggers who turned professional rather quickly. Cory Williams' "Mean Kitty" music video turned him into a star; in fact, it was disclosed on Tyra Banks' talk show a couple years ago that he was (at least at that time) raking in some $20,000 per month after that silly homemade video was produced. (He didn't want that disclosed, but it's very interesting to know.)

Understand clearly: I have no intention of dropping my career in software and web development. To make it in the vlogging world you need to be physically attractive or exceptionally creative, and I am neither (though I have a few creative talents I exploit); Cory Williams is both. But that does not keep me from finding amazing opportunities in YouTube as both an entertainment and social venue. It becomes a real-life fascination when events like VidCon prove to be so much fun. At other entertainment-based social gatherings such as BlizzCon and ComicCon, you are surrounded by strangers who are either out of character or in full costume and looking silly. At VidCon and the like, you are meeting and discovering the same people you saw and "befriended" online through the exchange of video content, in their real and same form.

Businesses seeking to exploit the opportunities of the YouTube community need as much creativity as the YouTubers themselves. There has never been a more interesting time to engage in guerilla marketing. The most jaw-dropping, amazing marketing campaign I have ever seen on the web occurred this year with Wrigley's 5 gum. Between amazing event stunts captured on video, highly unusual "seeding" tactics (that link is to my own video with my own ugly face! .. be warned!), and an astounding set of Hollywood-esque, sci-fi-oriented interactive web sites, they genuinely freaked people out, convinced some that the world was going to come to an end or that mind control was going on, and shocked everyone who was paying attention. To be honest, I think they took it too far. People became angry that it was all about mere chewing gum. On the other hand, it was probably cheaper yet more effective for them to engage in guerilla marketing than to just dump a big, boring advertisement on traditional television.

Honestly, I think there can be simpler exploits. Target, for example, has really blown me away with their YouTube channel, where they pay more for the content production and less for the distribution (YouTube is free!), although I've noticed some of their YouTube ads have made it onto traditional television, too. It could also significantly benefit a business or organization to participate in, sponsor, or host an event that collaborates with YouTube "players". For example, Maxis promoted Darkspore by inviting YouTube vlogger KatersOneSeven to visit their office for a promotional round of vlogging about the game's release. More recently, a "YouStars" event might as well have been sponsored by Poland's department of tourism, because the recent round of vlogs produced at that hosted event has really put Poland on the map this month.

I must also make mention of video generator web sites, such as Animoto and Xtranormal. These are interesting examples of third-party creative efforts to work with the opportunities of online video content production, assisting end users who have no video cameras or know-how with tools that give them an outlet for creativity. While initial tinkerings can be had for free, everything worthwhile comes at a price, and that means monetization for good tool makers. Obviously, credit also goes to the fine consumer-level desktop applications such as iMovie, Pinnacle Studio, Sony moviEZ, and Sony Vegas. Such applications, especially iMovie, sometimes bundle a number of creative "movie-production-in-a-can" prefabbed generators, as well as transitions, text effects, and video and audio effects.

The problem of YouTube, which I suppose is also an opportunity, is that YouTube is still a video uploading and viewing web site with social features, not a complete social network. You cannot even publish a video just to your friends list, for example, which I personally find very frustrating; I had a lot of content I ended up deleting because it wasn't appropriate for public viewing, but I didn't want it "private," either. This seems to open the door for alternative web development.

I have been pondering the viability of an external YouTube-like site that exploits the YouTube API, perhaps even looks and feels like YouTube (while clearly not being YouTube), and offers features YouTube doesn't, such as sharing "Unlisted" videos with the people on one's Friends list. There seems to be an absence of significant external web applications built on the YouTube APIs that have compelling statistical followings. I am still unsure whether this is because people are somehow unwilling to get involved with external web sites as YouTubers, or because too few attempts have been made to make it all work. I suspect the latter. YouTube has been integrating at the embedded-video level quite successfully for several years, but actually searching for videos, browsing them, and enjoying a YouTube channel on such a web site as if on YouTube itself is something I just have not seen yet, or at least have not seen done cleanly and in a trustworthy manner. I was hoping to see something like a re-made channel concept on Tumblr, but after firing Tumblr up and poking at it for a day or two I discovered it is nowhere near supportive of such a thing.
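As a concrete hedge on what such an external site would spend most of its time doing: fetching a channel's videos reduces to plain GET requests against YouTube's public data API. The sketch below (Python, using today's YouTube Data API v3; at the time this post was written, the older GData API filled this role) builds the search request for a channel's newest uploads. The API key and channel ID are placeholders, and the parameter names are taken from the public v3 reference:

```python
# Sketch: building a YouTube Data API v3 search request for a channel's
# recent uploads -- the kind of call an external "YouTube-like" site
# would make. The key and channel ID below are placeholders.
from urllib.parse import urlencode

API_BASE = "https://www.googleapis.com/youtube/v3/search"

def build_channel_search_url(api_key, channel_id, max_results=10):
    """Return the GET URL that lists a channel's newest videos."""
    params = {
        "key": api_key,           # developer API key (placeholder)
        "channelId": channel_id,  # whose uploads to list
        "part": "snippet",        # titles, descriptions, thumbnails
        "order": "date",          # newest first
        "type": "video",
        "maxResults": max_results,
    }
    return API_BASE + "?" + urlencode(params)

url = build_channel_search_url("MY_KEY", "UC_x5XG1OV2P6uZZ5FSM9Ttw")
# Fetching this URL (e.g. with urllib.request.urlopen) returns JSON whose
# items[].snippet entries carry the per-video metadata a site would render.
```

An external site would render those snippets into its own channel pages, layering on the features YouTube lacks (friends-only sharing, richer browsing) while YouTube still hosts the video itself.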

You also cannot produce any content for YouTube besides video, video organizing, video descriptions, and commenting, whereas Facebook and other social networks let individuals and companies produce any form of content by way of a plug-in architecture, a la "Facebook applications". This is another area where an external web site that takes advantage of YouTube's API for its content and membership features could greatly enhance the whole experience, if only it could be implemented well and gain sufficient popularity.

I'd like to see where this goes. Seriously. One does not have to be a video producer or AdWords marketer to be able to exploit YouTube, and it's time to start getting creative about all this. What are your thoughts?

Who Cares About Blog Software?

by Jon Davis 8. July 2011 02:02

It dawned on me tonight that there are a few people out there who actually read my blog, mostly people I don’t know, but almost all of them people who write ASP.NET software. I suppose this would make me happy, except for the fact that sometimes, such as with my previous post and the last couple months of “I’m gonna do something here” posts, I really make myself look like .. well, like someone who could be labeled lots of different ways, depending on the reader, but several such ways could be derogatory.

Truth be told, at this point in 2011 I confess I don't care deeply about blog software. Innovation in blog software is no longer my motivator for creating another one, if only because it's been done, many times, not just by other .NET developers but even by me. Yet I move forward, still not knowing for sure what the next couple of weeks will hold for this strategy. I could easily have called it done already after last weekend's nearly successful implementation, but there were changes I wanted to make, changes that I found tonight I could not make without actually starting over, thanks to the limitations of Entity Framework Code First Magical Unicorn Edition 4.1™.

Tonight I have to take an honest additional look at my motives, as I began to do here already. If I don’t have a deep care about blogging software, why reinvent the wheel?

It’s more about having some limited degree of independence from a prepackaged application, rather than installing a prefab website like WordPress, which is something one of my non-technical family members could do if they put their mind to it. If one cannot build the blogging platform upon which he blogs, what right has he to say that he is a broadly experienced web developer in the first place?

On the other hand, I have drawn the line at some point in the architecture, in this case by using components intended to make development a very lightweight exercise. And perhaps the tech is not the place where I should be growing myself. Perhaps the re-selection and adoption of, and participation in, others’ efforts is a better path.

For example, even as I have grown bored of BlogEngine.net since switching to ASP.NET MVC, and even though I am disheartened by the motives of some members of the Orchard project community, there are still other projects out there worth delving into.

Two of them are worth noting right now:

ASP.NET MVC CMS (Using CommonLibrary.NET) – This one is a mouthful and hardly a pretty name. What I like about it is its heavy use of highly reusable “make-my-development-life-better” code libraries, namely CommonLibrary.NET, which has a ton going for it. The package claims to be a CMS but appears to be more of a blog engine, at least in its prefab implementation; although it looks a bit ugly on the aesthetic side, it looks feature-rich, and it’s tempting to fork it, migrate it to MVC 3 + Razor, and give it a prettier name. I have been looking for something I can use as a strong foundation, and on the surface this looks like a very accommodating solution, although I have yet to actually try it out.

Another solution worth checking out is AtomSite. This one is a couple years old but it feels a little bit like it was pretty much exactly what I was going to do anyway, although it is based on MVC 1.0 so I’d definitely migrate it to MVC 3 + Razor as I would with the other solution mentioned above.

If I choose to run with one of these, I really feel like I’d need to try to make it my own—fork it, name it, port it to MVC 3 + Razor, theme it, spruce it up, and make it sexy. This would probably mean finding preexisting designs such as for WordPress and porting them. I don’t do original creative design work very well.

At the end of doing all that, though, I still fear I’d end up just going back to my own blog engine idea, because I still have this lingering desire to build upon something that was more independent .. oh brother .. why are you even bothering to read this?! I’m just taking notes and bemoaning my indecision here. At the end of the day, I still need to step away from the computer and have a life.


Changes Are Coming

by Jon Davis 6. July 2011 23:30

Well, it's been a wonderful ride, nearly half a decade working with BlogEngine.net's great blogging software. But it's time to move on.

Orchard, it was very nice to meet you. You have a wonderful future ahead of you, and I was honored to have known you, even just a little. Unfortunately, you and I are each looking for something different. 

WordPress, you are like a beautiful, sexy whore, tantalizing on the outside and known by everybody and his brother, but quite honestly I'm not sure I want to see you naked more than I already have.

I'm frickin' Jon Davis. I've been doing software and web development for 14, nearly 15, years now, and doggonit, I should assume myself to be "all that" by now. Actually, blog engines should be like "Hello World" to me at this point. I suppose the only reason I've been too shy to do it thus far is that the first time I started building a complete blogging solution, eight or nine years ago (I stopped working on it six or seven years ago), the thing I built proved to be an oddball hunk of an over-programmed desktop application that I had primarily leveraged to grow broad technical talents. It was a learning opportunity, not a proper blogging solution, and it smelled of adolescence. (To this day it won't even compile, because .NET 2.0 broke it.)

In the meantime, I've moved on. I've been an employer-focused career guy for the last five or six years, having little time for major projects like that, but still growing both in technical skill set and in broad understanding of Internet markets and culture.

But I kind of miss blogging. I used to be a prolific blogger. I sometimes browse my blog posts from years ago and find interesting tidbits of knowledge; in fact, sometimes I actually learn from my prior writings, because I forget the things I had learned and blogged about and come back to re-learn them. Meanwhile, I'll sometimes find blog posts that are a little bizarre--thoughtful in prose, yet ridiculous in their findings. That's okay. My goal is to get myself to think again, and not be continuously caught up in a daily grind whereby neither my career nor my technically minded side life has any meaning.

Last weekend, over two or three days, I created a new blog engine. (Anyone who knows me well knows that I've been tinkering with social platform development on my own time for some years, but this one was from scratch.) I successfully ported all of my blog posts and comments from BlogEngine.net to my new engine and got it to render in blog form using my own NUnit-tested ASP.NET MVC implementation. I would have already replaced BlogEngine.net here on my site with my own engine, were it not for the fact that, using Entity Framework Code First, I ran into snags getting the generated database schema to align correctly with long-term strategies. And as much as I'd be delighted to prove out my ability to rush a new blog engine out the door, I don't necessarily want to rush a database schema, especially if I intend to someday share the schema and codebase with the world.

And I never said I was going to open-source this. I might, but I also want to commercialize it as a hosted service. I'll likely do both.

But it's coming, and here are my dreamy if possibly ridiculous plans for it:

 

  1. Blogging with comments and image attachments. Nothing special here. But I want to support using the old-skool MetaWeblog API, so that'll definitely be there, as well as the somewhat newer AtomPub protocol.
  2. Syndication with RSS and Atom. Again, nothing special here.
  3. As a blogging framework it will be a WebMatrix-ready web site (not web application). Even though it will use ASP.NET MVC it will be WebMatrix gallery-loadable and Notepad-customizable. The controllers/models will just be precompiled. Note that this is already working and proven out; the depth and detail of customizability (such as a good file management pattern for having multiple themes preinstalled) have not been sorted out yet, though.
  4. AppHarbor-deployable. AppHarbor is awesome! Everything I'm doing here is going to ultimately target AppHarbor. Right now the blog you're looking at is temporarily hosted on a private server, but I want that to end soon as this server is flaky.
  5. Down-scalable. I am prototyping this with SQL Server Compact Edition 4.0, with no stored procedures. Once the project begins to mature, I'll start supporting optimizations for upwards-scalable platforms like SQL Server with optimized stored procedures, etc., but for now flexibility for the little guy who's coming from WordPress to my little blog engine is the focus.
  6. Phase 1 goal: BlogEngine.net v1.4.5.0 approximate feature equivalence (minus prefab templates and extra features I don't use). BlogEngine.net is currently at v2.x, and I haven't really looked much at v2.x yet, but as of this blog post I'm still using v1.4.5.0, and rather than upgrade I just want to swap it out with something of my own that does roughly what BlogEngine.net does. This includes commenting, categories, widgets, a solid blog editor, and strong themeability. I won't be creating a lot of prefab themes, but if I'm going to produce something of my own I want to expose at least the compiled parts of it for others to reuse, and I'm extremely picky about cleanliness of templates, such that they can be easily updated and CSS swappages with minimal server-side code changes can go very far.
  7. Phase 2 goal: Tumblr approximate feature equivalence (minus prefab templates). All I mean by this is that blog posts won't just be blog posts, they'll be content item posts of various forms--blog posts, microblog posts, photo posts, video posts, etc. Still browsable sorted descending by date, but the content type is swappable. In my current implementation, a blog entry is just a custom content type, and blogs are declared in an isolated class library from the core content engine. I also want to support importing feeds from other sources, such as RSS feeds from Flickr or YouTube. Tumblr approximate equivalence also means being mobile-ready. Tumblr is a very smartphone-friendly service, and this is going to be a huge area of focus.
  8. Phase 3 goal: WordPress approximate equivalence (minus prefab templates). Yeah I know, to suggest WordPress equivalence after already baking in something of a BlogEngine.net and Tumblr functionality equivalence, this is sorta-kinda a step backwards on the content engine side. But it's a huge step forward in these areas:
    • Elegance in administration / management .. the blogger has to live there, after all
    • Configurability - WordPress has a lot of custom options, making it really a blog on steroids
    • Modularity - rich support for plug-ins or "modules" so that whether or not many people use this thing, whoever does use it can take advantage of its extensibility
    • Richer themeability - WordPress themes are far from CSS drops, they are practically engine replacements, but that is as much its beauty as it is its shortcoming. You can make it what you want, really, by swapping out the theme.
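On the down-scalable point (item 5), targeting SQL Server Compact 4.0 under Entity Framework Code First is, on the configuration side, largely a connection-string concern. A hypothetical web.config fragment is sketched below; the connection name `BlogContext` and the `Blog.sdf` filename are placeholders rather than details from this post, though `System.Data.SqlServerCe.4.0` is the actual SQL CE 4.0 ADO.NET provider name:

```xml
<connectionStrings>
  <!-- SQL Server Compact 4.0 stores the whole database in a single .sdf
       file; |DataDirectory| resolves to the site's App_Data folder. -->
  <add name="BlogContext"
       connectionString="Data Source=|DataDirectory|Blog.sdf"
       providerName="System.Data.SqlServerCe.4.0" />
</connectionStrings>
```

EF Code First picks this up when the DbContext class shares the connection-string name, so scaling up to full SQL Server later should mostly mean swapping the connection string rather than the code.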
Non-goals include creating a full-on CMS. I have no interest in trying to build something that competes directly with Orchard, and frankly I think Orchard's real goals are already met by Umbraco, which is a fantastic CMS. But Umbraco is nothing like WordPress; WP is really just a glorified blog engine. If anything, I want to compete a little bit with WordPress. And I do think I can compete with WordPress better than Orchard does; even though Orchard seems to be trying to do just that (compete with WordPress), its implementation goals are more in line with Umbraco's, and those goals are just not compatible, because WordPress is a very focused kind of application with a very specific kind of content management.
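On the MetaWeblog support in item 1: the API is a small set of XML-RPC methods, so supporting it mostly means wiring up an XML-RPC endpoint. As a hedged illustration (in Python rather than this blog's .NET stack, purely for brevity), this sketch marshals a `metaWeblog.newPost` call the way a desktop blogging client would send it; the blog id, credentials, and endpoint URL are placeholders:

```python
# Sketch: what a MetaWeblog newPost call looks like on the wire.
# The blog id, credentials, and endpoint below are hypothetical.
import xmlrpc.client

post = {
    "title": "Hello from a MetaWeblog client",
    "description": "<p>Body HTML goes here.</p>",
    "categories": ["Pet Projects"],
}

# Marshal the request exactly as an XML-RPC client library would POST it
# to the blog's endpoint; the final True is the "publish now" flag.
request_xml = xmlrpc.client.dumps(
    ("blog-id-1", "username", "secret", post, True),
    methodname="metaWeblog.newPost",
)

# A live client would instead do something like:
#   proxy = xmlrpc.client.ServerProxy("http://example.com/metaweblog")
#   new_id = proxy.metaWeblog.newPost("blog-id-1", "username", "secret", post, True)
```

Those same five parameters (blog id, username, password, post struct, publish flag) are what a server-side implementation must accept and map onto its own post model.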
 
And don't worry, I don't actually think I could ever literally compete with WordPress, as if to produce something better. I for one strongly believe that it's completely okay to go and build yet another mousetrap, even if mine is of lesser ideals than the status quo. There's nothing wrong with doing that. People use what they want to use, and I don't like the LAMP stack or PHP all that much, otherwise I'd readily embrace WordPress. Then again, I'd probably still create my own WordPress after embracing WordPress, perhaps just as I am going to create my own BlogEngine.net after embracing BlogEngine.net.

 


Tags: ASP.NET | Blog | Pet Projects


 

Powered by BlogEngine.NET 1.4.5.0
Theme by Mads Kristensen

About the author

Jon Davis (aka "stimpy77") has been a programmer, developer, and consultant for web and Windows software solutions professionally since 1997, with experience ranging from OS and hardware support to DHTML programming to IIS/ASP web apps to Java network programming to Visual Basic applications to C# desktop apps.
 
Software in all forms is also his sole hobby, whether playing PC games or tinkering with programming them. "I was playing Defender on the Commodore 64," he reminisces, "when I decided at the age of 12 or so that I want to be a computer programmer when I grow up."

Jon was previously employed as a senior .NET developer at a very well-known Internet services company whom you're more likely than not to have directly done business with. However, this blog and all of jondavis.net have no affiliation with, and are not representative of, his former employer in any way.
