Media Center Uses Flash For Some Internet TV Video Playback

by Jon Davis 24. October 2009 19:14

With Windows 7 installed, the Media Center desktop showed “Star Trek” (the original series) as something I could watch. I clicked on it (as something to laugh at), Media Center came up, it asked me to agree to the terms of Internet TV, and then it showed me three seasons of Star Trek to choose from. I selected an episode, and it showed me another dialog, asking me to agree to the terms of installing the Internet TV Update for Adobe Flash! So apparently the show runs on Flash video.

image

Why is this a big deal? Because Microsoft already has a lot invested in video streaming, and their own technologies compete with Flash. Why would they use Flash in their own Media Center? Was it because CBS refused to re-encode their videos to WMV? Or does this mean that Microsoft has already caved?

Visual Studio’s and .NET’s Most Overlooked Want Items

by Jon Davis 24. October 2009 17:19

There are some features missing from Visual Studio and .NET whose absence frankly ticks me off at times and makes me go looking for alternative IDEs and toolsets, or plan on building my own tooling. I’ve done an initial evaluation of VS 2010 Beta 2 for each of these and confirmed that the issues remain. I’m going to keep appending to this with occasional tweaks as time goes on, so feel free to post comments and I’ll add them to my “prayer list”. :)

Keep in mind, this is not a “most wanted” list but a “most overlooked” list. That is, I think the demand is a lot greater than people seem to appreciate, perhaps because they’ve gone on all these years living with the way things are and have grown numb to it.

  1. Javascript brace matching (FB 340268) – Anyone who spends more than a few occasional hours writing Javascript within Visual Studio has no doubt been frustrated by the fact that, despite IntelliSense’s best (yet usually broken) effort to auto-indent on line breaks (a feature, by the way, that I’m constantly fighting when writing JSON and the bodies of closures), the editor completely fails those of us transitioning from C#, where we’ve been spoiled by simple brace matching. The fact is, there is absolutely no brace matching in the Javascript editor.
     
    By “brace matching” I’m referring to the feature in C#’s editor where typing a “}” momentarily highlights both the “{” and the “}” to help you keep track of your context and ensure that all of your braces have been properly closed. Unlike typical C# code, Javascript code with all its deeply nested closures can get pretty hairy when it comes to nested brace blocks. Put the two problems together (a broken auto-indent feature and no brace matching) and the best you can do sometimes is fire up the WHOLE FREAKING APPLICATION being developed, navigate to the page or context you’re working on, and see if the browser’s Javascript parser pukes on entry.
     
    Seriously, Microsoft, where’s the Javascript love? Your half-hearted IntelliSense support for jQuery was nice (too bad there’s STILL no auto-completion in VS 2010 for summary documentation, no IntelliSense while writing the XML documentation or references, and not even color coding / syntax highlighting of XML documentation), but it doesn’t even meet us halfway toward our needs for an IDE that recognizes Javascript as one of the most critical programming languages for all web development.
     
    We want you, Microsoft, to treat Javascript as a first-class language, not as an afterthought, because Javascript is the web’s programming language as much as HTML is. If you have to abandon Active Scripting as your IntelliSense provider and rewrite from scratch using JScript.NET or Jint or somesuch in order to give Javascript IntelliSense greater functionality, so be it; do what it takes, because this is clearly quite broken as it is.
     
    That said, brace matching alone hardly amounts to treating Javascript as a first-class language. It’s a simple feature, and we need it.
     
    I haven’t yet checked whether Aptana Studio (for which I personally have a professional license I don’t use) does any better in this department, but I’m getting close to desperate and should follow up on that. Unfortunately, then the problem becomes integration with a .NET-oriented codebase. Time to switch away from .NET just for that? No, but the question does come up in my mind. Seriously.
     
  2. Configureless Entities – I’ll let my Gemli.Data subproject do the talking on this one. I’ve already done all my whining and complaining by way of starting to take some action, so I’d rather not repeat myself yet again. I’ve seen the blog articles highlighting the new POCO support in LINQ-to-Entities 4.0, and it looks as verbose and awkward as Fluent NHibernate.
     
  3. Configureless Routing – I haven’t said much about this, but another sub-project I’ve added to Gemli’s road map is fixing the ASP.NET MVC routing engine by replacing it altogether with one that actually works. I understand where ASP.NET MVC’s URL routing engine came from. It came from copying Ruby on Rails. And everyone loves Ruby on Rails, so it must be good, right? Ruby on Rails is one of the big sources pushing the whole “Convention Over Configuration” idea, so if we just copy everything that makes RoR tick we’ll have our CoC, right? Am I the only one who is puzzled by the nonsense of the routing engine’s realities NOT conforming to CoC? In order to have a URL routed to a controller’s action and parameters, you HAVE TO open up global.asax.cs and drum up a mapping—yes, a MAPPING!—of that URL to your controller’s method. Visual Studio starts you off with a default one, with a parameter mapping of “id”, to get you started (see the snippet below); I honestly don’t think I’ve ever used it. There is regex-based wildcarding and pattern-matching support, which is I guess where Microsoft gets off saying, “see? it’s config-free!”, but you still have to declare your mappings.
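    For reference, this is the kind of mapping I mean: the default route the Visual Studio project template drops into global.asax.cs (reproduced from memory, so treat it as approximate):

    routes.MapRoute(
        "Default",                                              // Route name
        "{controller}/{action}/{id}",                           // URL with parameters
        new { controller = "Home", action = "Index", id = "" }  // Parameter defaults
    );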
     
    It gets worse: not only do you have to add your route mappings manually, but half the time when I declare them they simply don’t work. Just yesterday I came across a declared route where MyMethod(string myParam) was explicitly routed as such, yet when it executed it ended up going to MyOtherMethod() by way of the “wonderful” wildcarding feature. I don’t recall how I fixed it, but there’s another broken piece in this routing mess: the order in which you declare your mappings matters a great deal, so you *must* move the wildcarded declarations down to the bottom of your declarations.
     
    We shouldn’t even be talking about manual mappings. MVC routing should automatically map to controllers, their actions, and their parameters, using reflection, in the same way that old-school web paradigms classically mapped the file system and file names directly to a slash-delimited URL naming structure.
     
    The Gemli sub-project will, on the application’s initialization, scan for all classes that inherit from controllers, assume that all their public methods are action names, and regex-pattern-match all primitive types to parsable strings. Any method with a parameter of a complex type or struct will be disqualified as an MVC action, at least as far as this auto-mapping is concerned. Route customizations via C# attributes, as well as other means, will also be added; a rough sketch of the scanning idea follows. Honestly, this sort of routing engine is not very hard to write. I’ll get around to working on it within the next week or two or three, and expect to have it out there in working order a week or two after I get started. (Keep in mind I only do this stuff on weekends and occasional evenings.)
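    The sketch, for what it’s worth; none of these names are actual Gemli (or ASP.NET MVC) APIs, it’s just the shape of the convention scan:

    // Hypothetical sketch of convention-based route discovery; the names here
    // are made up for illustration and are not real Gemli or ASP.NET MVC APIs.
    using System;
    using System.Linq;
    using System.Reflection;

    public static class ConventionRouteScanner
    {
        public static void RegisterAll(Assembly assembly)
        {
            var controllers = assembly.GetTypes()
                .Where(t => !t.IsAbstract && t.Name.EndsWith("Controller"));

            foreach (var controller in controllers)
            {
                var prefix = controller.Name.Substring(0, controller.Name.Length - "Controller".Length);
                var actions = controller.GetMethods(BindingFlags.Public | BindingFlags.Instance | BindingFlags.DeclaredOnly);

                foreach (var action in actions)
                {
                    // Disqualify any action with a complex-typed (non-parsable) parameter.
                    if (action.GetParameters().Any(p => !IsParsable(p.ParameterType))) continue;

                    var url = prefix + "/" + action.Name + "/" + string.Join("/",
                        action.GetParameters().Select(p => "{" + p.Name + "}").ToArray());
                    Console.WriteLine(url); // in real life: register this with the route table
                }
            }
        }

        // "Parsable" here means a primitive-ish type that can be regex-matched from a URL segment.
        static bool IsParsable(Type t)
        {
            return t.IsPrimitive || t.IsEnum || t == typeof(string) ||
                   t == typeof(decimal) || t == typeof(DateTime) || t == typeof(Guid);
        }
    }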
     
  4. Deploy-as-Windows-App Web Deployment – Long, long ago, in a galaxy far, far away, there was a web server created called Cassini that allowed you to execute ASP.NET without running IIS. A few people adopted and even extended Cassini, but it was always just Cassini. Cassini was then bundled into Visual Studio itself for localhost debugging, but deployment of web applications continued to target IIS, and thus ends the story of Cassini.
      
    Remember ClickOnce? That technology that Firefox users hated because Microsoft snuck its ClickOnce support add-in into Firefox without users’ permission, until this week when Mozilla banned it from the browser? Yeah, .. that ClickOnce technology was actually a brilliant idea, but it didn’t work too well because users had too little control. But what about the inverse? ClickOnce allowed for installing Windows Forms and WPF desktop applications straight from the web. But what about a technology that auto-packages a locally running (localhost-only access) web app so that it can be distributed shrink-wrapped, executed as an EXE, and launched from the Start menu?
     
    It’s really quite silly that we can’t create web applications and deploy them as standalone EXEs that run in a web browser. I mean, we can. But we have to track down that old Cassini codebase or find one of the third-party distributions of it, such as UltiDev’s. Even then you have a bunch of moving parts, and you still have to figure out for yourself how to package and distribute your solution as something self-starting. IIS 7.5 supports standalone instances, but that requires Windows 7 and feature activation .. and, well, IIS.
     
    Requiring IIS, or even auto-rigging a standalone web server, just to run a simple web app seems like a configuration burden to me. The new WebPI paradigm is nice, but it’s really more for web dev experts/admins and developers and for Internet web app discovery; it’s not useful for software vendors writing software for people who just want to use a simple application at home, with or without Internet access and without any knowledge of web dev. Using ASP.NET as a Windows desktop app view engine would be huge, I think, because it would give software vendors a transition path from a preexisting, already-built Internet web site to a Windows desktop application. Not to mention a transition path for individual web developers who would like to know more about desktop solutions.
      
    I don’t know, maybe the feature isn’t hotly requested because there isn’t much demand, but I think there’s little demand because there’s been too little innovation in this area on Microsoft’s side; the practice has been used for over a decade, but it’s almost always people outside of Redmond doing it. Do I have an immediate need for this? Admittedly, no, I don’t; however, the more I think about it, the more my mind wanders into creative ideas about what ASP.NET MVC could introduce to the desktop publishing crowd (the original desktop weblog engine, Userland Radio, from which blogging was invented, ran on a self-hosted web engine), the PowerShell crowd (yeah, hey, look at this), social networking (did anyone else notice that Opera 10’s Unite is now proxying media and file sharing from the Opera Windows app itself using Internet-accessible, sharable URLs?), the reporting and finance crowds (remember Microsoft Money? it was built on DHTML), and more. A bare-bones sketch of the localhost piece follows.
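    This is all I mean by the localhost piece; a bare-bones sketch using HttpListener from the BCL (not actual ASP.NET hosting, and the port and markup are made up):

    // Bare-bones sketch: a web "app" that runs as a plain desktop EXE with no IIS
    // and no Cassini, using HttpListener. A real solution would host the ASP.NET
    // pipeline; this only shows the self-contained, localhost-only shape of it.
    using System;
    using System.Diagnostics;
    using System.Net;
    using System.Text;

    class StandaloneWebApp
    {
        static void Main()
        {
            var listener = new HttpListener();
            listener.Prefixes.Add("http://localhost:8088/");  // localhost-only access
            listener.Start();

            // Behave like a desktop app: pop the default browser open at ourselves.
            Process.Start("http://localhost:8088/");

            while (true)
            {
                HttpListenerContext context = listener.GetContext();
                byte[] body = Encoding.UTF8.GetBytes(
                    "<html><body><h1>Hello from a shrink-wrapped web app</h1></body></html>");
                context.Response.ContentType = "text/html";
                context.Response.OutputStream.Write(body, 0, body.Length);
                context.Response.Close();
            }
        }
    }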

This is a small list. There was at least one other thing I wanted to mention in a post like this, but I forgot what it was. As I said, I’m going to grow this list over time as I continue to recall the things I always wanted but never bothered to mention.

Why Am I Scared Of C# 4.0?

by Jon Davis 21. October 2009 23:21

Microsoft is doing a good thing, and I should be excited to be tinkering with the new Visual Studio 2010 Beta 2 that just came out this week. Fact is, I haven’t prioritized enough time to do much of anything with any beta of VS 2010 yet. Limited energy levels caused by sloppy living habits and everyday work stress might have a bit to do with it, but I also must admit that, while I greatly look forward to the future, I’m simply not prepared to make it the present. I, like so many people in this field, am limited significantly by the constraints of my employer’s best interests and the limited availability of my time (or, in my case, the juggling of free-time prioritization between Gemli, Stargate Universe, StumbleUpon surfing, and other geeky activities).

With respect to Gemli, much of what that project is about (including tons of wishlist-scoped details) is based on the limitations of Microsoft’s current RTM’d offerings. Visual Studio 2010 and .NET 4.0 address a lot of things I wanted, making parts of the future-scoped bits of Gemli more or less moot. On the other hand, the parts that .NET 4.0 doesn’t address will likely be greatly enhanced by some of its new features, things like the dynamic object.

And in any case, the objective of Gemli, or of any other open source project for that matter, is to have the tooling needed to produce and maintain software in an efficient and stable manner, so there’s nothing wrong with continuous improvement and evolution of the tools we use. I should, then, be excited about what Visual Studio 2010 and .NET 4.0 bring to the table in the context of everything I’m doing, and not be holding back.

Yet, I do hold back. It could be fear, but it’s not that simple.

Perhaps the biggest problem that keeps me from tinkering is that one cannot tinker with “legacy” (RTM-based) projects in the new tooling environment without permanently upgrading the projects/solutions. It would be really neat if I could check out Gemli from TFS into VS 2010 and dabble in C# 4.0 features, using #ifdef-like directives to isolate “legacy” (C# 3.0) and “modern” (C# 4.0) features and code sets, along the lines of the sketch below. That way, producing a .NET 3.5 flavor and a separate .NET 4.0 flavor of the framework would be as easy as recompiling with the selected solution configuration. And indeed I could do that, if I were willing to switch to VS 2010. But I can’t, given that it’s an open source solution that I want other people to be able to open in the current RTM version of Visual Studio (2008); nor do I want to cut off that version of Visual Studio the moment VS 2010 is released.
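To be clear about the kind of split I mean, here’s a rough sketch; “NET40” is a compilation symbol I’d define myself in the .NET 4.0 solution configuration, and the member names are made up for illustration:

// Rough sketch of the #if split; "NET40" is a symbol I'd define only in the
// .NET 4.0 solution configuration, and the members here are invented examples.
public static class SettingsReader
{
#if NET40
    // "Modern" flavor: late-bound access via C# 4.0's dynamic keyword.
    public static object GetSetting(dynamic source, string name)
    {
        return source.Settings[name];
    }
#else
    // "Legacy" flavor: plain reflection, so the same file still builds as C# 3.0 / .NET 3.5.
    public static object GetSetting(object source, string name)
    {
        var settingsProperty = source.GetType().GetProperty("Settings");
        var settings = (System.Collections.IDictionary)settingsProperty.GetValue(source, null);
        return settings[name];
    }
#endif
}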

But since VS 2010 cannot open a VS 2008 solution without upgrading it – or at least I assume it can’t; such was the behavior of the VS 2005 -> VS 2008 transition – the benefits of the VS 2010 / .NET 4.0 tooling are limited for the most part to greenfield projects and self-education in future technology. Neither of these is particularly practical in the absence of an employer who is willing, and indeed desirous, to stay on the cutting edge with a future-scoped product/project release. And believe me, this is not a complaint about my current employer. Few employers want to take such risks; they exist, but unless you’re talking about an innovative startup company, they’re really quite rare.

On the other hand, Visual Studio 2010 Beta 2 comes with a go-live license. If that’s not a sign that the core functionality of the toolset is stable, I don’t know what a better sign would look like. Technically, anyone can start a Visual Studio 2010 Beta 2 project with the intent to deploy at any time, if indeed they want to (it is technically still a beta), and if they’re looking at a long-term development process, the RTM of .NET 4 / VS 2010 will come before they release. Otherwise, they can release anyway.

So really, making the switch right now, once and for all, might even make sense. The only catch or downside is the abandonment of last year’s tech. That would be the downfall for an open source project. But it doesn’t have to be the case for everything I’m working on; if there’s no intent for a project to be shared with unknown outsiders, making the switch might as well be considered safe, even. Would you be able to place that bet? If so, how do you categorize yourself and where you work?

Is It An Entity, Or Is It A Data Model?

by Jon Davis 13. October 2009 00:16

One of the big dilemmas I came across before making Gemli a publicly accessible resource (open source project) was what to call the objects being loaded and saved. I’d heard “entities” and I’d heard “data models”, and I didn’t know the difference.

Truth be told, I’m still not entirely sure I know the difference. But after Wikipedia searches and other small research I came to the conclusion that:

  • An “Entity” is a business object that is closer to being a domain object.
  • A “Data Model” is more of a description of something as it relates to data; it is closer to a database record, including its schema.

This distinction was important to me because Gemli is built to support POCO with minimal code, but it also needs to work with the data at a database-integration level, so I needed to know what to call the two kinds of objects (the POCO object and the object that actually represents what gets loaded from and saved to the database). Initially, I called the object that represented the database record a “DataEntity”, and the property on it that spit the POCO object back out was called “InstanceObject” or something like that. Really weird. Bleaah.

Anyway, now any POCO object that is suitable for being wrapped and serialized to the database is called in Gemli an “Entity”. Once wrapped into database-translatable semantics, it is called a “Data Model”.

This is why some generic interfaces in Gemli take a TEntity and some take a TModel. They are not named differently because they are inconsistent, they are named differently because a different kind of type is expected. If you pass in a TEntity, you’re probably dealing with a generic interface that will spit that TEntity back out in its original form. If you pass in a TModel, you’re probably dealing with a query object or a data provider that can only work with data models using Gemli’s data modeling semantics.

var Bob = new Person("Bob Smith"); // Bob is an Entity. Person is any ol' POCO class.
var BobRecord = new DataModel<Person>(Bob); // BobRecord is a data model.
BobRecord.Save(); // Bob is a new customer/employee/thingamajig.

var SueRecord = DataModel<Person>.NewQuery()
    .WhereProperty["FirstName"].IsEqualTo("Sue")
    .SelectFirst(); // Sue has a record in the database because she is an
                    // existing employee (and yes lambdas will come to 
                    // Gemli, sheesh!)
Person Sue = SueRecord.Entity; // Sue is a person, too!!

Not sure if this discussion is valuable to anyone, but that’s Gemli’s story on the word choices.

Gemli.Data v0.3.0 Released

by Jon Davis 12. October 2009 03:57

I reached a milestone this late night with The Gemli Project and decided to go ahead and release it. This release follows an earlier blog post describing some syntactical sugar that was added along with pagination and count support, here:

http://www.jondavis.net/techblog/post/2009/10/05/GemliData-Pagination-and-Counts.aspx

Remember those tests I kept noting that I still needed to add to test the deep loading and deep saving functionality? No? Well, I remember them. I kept procrastinating on them. And as it turns out, a lesson has been taught to me several times in the past, and again now while trying to implement a few basic tests, a lesson I never realized was being taught to me until just this late night. The lesson goes something like this:

If in your cocky self-confidence you procrastinate the tests of a complex system because it is complex and because you are confident, you are guaranteed to watch them fail when you finally add them.

Mind you, the “system” in context is not terribly complex, but I’ll confess there was a rather ugly snippet of code to wade through tonight, particularly with Jay Leno or some other show playing in the background to keep me distracted. On that latter note, it wasn’t until I switched to ambient music that I was finally able to fix a bug I’d spent hours staring at.

This milestone release represents 130 total unit tests over the lifetime of the project (about 10-20 or so of them new, I’m not sure exactly how many), not nearly enough, but enough to feel a lot better about what Gemli is shaping up to be. I still need to add a lot more tests, but what these tests exposed was a slew of buggy relationship inferences that needed to be taken care of. I felt the urgency to release now, rather than continue adding tests first, because the previous release was just not suitably functional on the relationships side.

 

The Perfect Photos To Illustrate The Sadly Typical Software Development Process

by Jon Davis 8. October 2009 22:43

image

http://thereifixedit.com/

I just wanted to post a quickie here to link to a site that, while great and funny in itself, surprised me with how perfectly every single photo over there illustrates some kind of software system I’ve touched, whether recently or long ago.

For example,

 

image

This one reminds me of the countless number of memory leaks we have to put up with when garbage collectors fail or are absent.

 

image

And this one reminds me of ol’ Windows ME, racing stripe and all.

 

image 

The ugly do-it-all “make this site your home page” portal web sites that cluttered the web just three years ago, Yahoo! being among them.

 

image

“We’ll refactor later.”

  

image

The company’s legacy software with broken APIs and/or endpoints we simply don’t have the resources to support anymore.

 

image

I swear this is a virtual photo representing every developer’s workstation at the office. It takes half an hour to boot our machines.

  

image

Manual deployment. It takes several of us to push a web site out to the servers, plus QA to approve the closure of the deployment ticket.

  

image

A very nice admin interface that no one but the devs will ever see. I wish we had an admin interface that spiffy and complete.

 

image 

This is our dev server, a virtual machine on an overloaded VM host with limited RAM. We have CruiseControl.NET on it and it takes about 30 minutes for it to build and deploy to test/QA on a single run.

 

image

This is what happens when your business partner uses Java and you’re using WCF. Add SSL, and voila!

 

image

The Microsoft Office COM APIs.

  

Epic-Kludge-Photo-RoundKnobInARectangleHole

(From udhay in comments:) A perfect example of the so-called work-around.

 

.. Ohh this could go on forever. You get the idea. Have fun.

Why Hello There, Expression Tree

by Jon Davis 6. October 2009 20:53

I sought some enlightenment on FreeNode IRC in the C# channel regarding how to get strong property references without invoking them, and without resorting to delegate implementations.

The problem I’ve been faced with is that nasty string-based property specifier in Gemli.Data’s DataModelQuery<TModel>:

var women = DataModel<Person>.NewQuery()
    .WhereProperty["Gender"].IsEqualTo(Gender.Female)
    .SelectMany().Unwrap<Person>();
women.ForEach( woman => Hug(woman) );

I first asked the same IRC channel for initial thoughts on the direction Gemli was going, and that stupid string-based indexer was the first thing to come up. “It looks brittle.” Agreed. This is the part that bugs me the most about Gemli.Data at the moment—well, tied with the lack of individual field selection and aggregate function support.

But LINQ-to-SQL doesn’t have this problem, which is interesting because LINQ-to-SQL doesn’t have any proprietary, Microsoft-alone-can-see-it, under-the-covers magic about it. Yet with LINQ you get strong property references without the properties ever being invoked as CLR properties.

var women = (from woman in People   
             where woman.Gender == Gender.Female   
             select woman).ToList();   
women.ForEach( woman => Feed(woman) );  

How? LINQ-to-SQL doesn’t even execute that binary comparison in the CLR; it converts it to T-SQL.

Up until today, I had assumed that this was just a lambda expression, and that the CLR object reaching LINQ’s “where” clause must surely be a class inheritor that mocks its base with no literal implementation. I mentioned this in the IRC channel, though, and was immediately told it’s done with expression trees. I was given a sample of Expression<Func<T, U>>. What? I then came across this: http://www.interact-sw.co.uk/iangblog/2005/09/30/expressiontrees .. among a bunch of other articles and blog posts.
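Here’s the gist as I understand it so far, in a minimal stand-alone sketch (Person and FirstName are just stand-ins): the compiler hands you the comparison as a data structure you can walk, rather than a delegate it has already compiled, which is how a provider like LINQ-to-SQL can translate it into T-SQL instead of executing it.

using System;
using System.Linq.Expressions;

public class Person
{
    public string FirstName { get; set; }
}

public static class ExpressionPeek
{
    public static void Main()
    {
        // The compiler builds an expression tree here; nothing is executed,
        // and the FirstName property getter is never invoked.
        Expression<Func<Person, bool>> isSue = p => p.FirstName == "Sue";

        var comparison = (BinaryExpression)isSue.Body;     // the '==' node
        var property = (MemberExpression)comparison.Left;  // p.FirstName
        var value = (ConstantExpression)comparison.Right;  // "Sue"

        // A query provider would emit SQL here instead of printing.
        Console.WriteLine("[{0}] = '{1}'", property.Member.Name, value.Value);
        // Output: [FirstName] = 'Sue'

        // And when you do want CLR execution, compile it into a delegate.
        Func<Person, bool> predicate = isSue.Compile();
        Console.WriteLine(predicate(new Person { FirstName = "Sue" }));  // True
    }
}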

Looks like I still have a lot of homework this week, because I’ve never looked at expression trees in C# before. It makes me feel like I’m behind, but there’s joy in knowing that I’m catching up, and once I am caught up it’ll open up all kinds of new doors for Gemli.

(I might follow up on this blog entry inline; I just wanted to get something mentioned now, since I’m heading to bed and don’t want to lose the thought.)

Gemli.Data: Pagination and Counts

by Jon Davis 5. October 2009 03:54

It occurred to me that I’m not blogging enough about how Gemli is shaping up, other than mentioning the project as a whole in completely different blog posts.

In the dev branch I’ve dropped the abstract base DataModelQuery class (only DataModelQuery<TModel> remains) and replaced it with an IDataModelQuery interface. This was a long-needed change, as I was really only using the base class for its interface anyway. It was being used because I didn’t necessarily know the TModel type until runtime, namely in the DataProviderBase class, which handles all the deep loads with client-side-joins-by-default behavior. But an interface works for that just as well, and meanwhile the query chaining syntax on the base class was totally inconsistent and in some cases useless.

Some recent changes that have been made in the dev branch include some more syntactical sugar, namely the support for loading and saving directly from a DataModel/DataModel<T> class or from a DataModelQuery<TModel> plus support for pagination and getting record counts.

Quick Loads

From DataModel:

List<MyPoco> collectionOfMyPoco = DataModel<MyPoco>.LoadAll().Unwrap<MyPoco>();

There are also Load() and LoadMany() on DataModel, but since they require a DataModelQuery object as a parameter, and DataModelQuery has new SelectFirst() and SelectMany() that do the same thing (forward the task on to the DataProvider), I’m not sure they’re needed and I might just delete them.

From DataModelQuery:

List<MyPoco> collectionOfMyPoco = DataModel<MyPoco>.NewQuery()
    .WhereProperty["Amount"].IsGreaterThan(5000m)
    .SelectMany().Unwrap<MyPoco>();

Pagination

Since Gemli is being built for web development, pagination support with pagination semantics (“page 3”, rather than “the rows between rows X and Y”) is a must. Many things besides data grids require pagination—actually, pretty much any list needs pagination if you don’t intend to show as many items on a page as there are in the database table.

Client-side pagination is implemented by default, but only if DB server-side pagination is not handled by the data provider. I still intend to add DB-side pagination for SQL Server.

// get rows 61-80, or page 4 @ 20 items per page, of my filtered query
var myListOfStuff = DataModel<Stuff>.NewQuery()
    .WhereProperty["IsActive"].IsEqualTo(true)
    .Page[4].OfItemsPerPage(20)
    .SelectMany().Unwrap<Stuff>();

To add server-side pagination support for a proprietary database, you can inherit DbDataProvider and override the two variations of CreateCommandBuilder() (one takes a DataModel, for saving; the other takes a DataModelQuery, for loading). The command builder object has a HandlePagination property that can be assigned a delegate. The delegate can then add or modify properties on the command builder to inject the appropriate SQL text into the command. A purely hypothetical sketch follows.
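Hypothetically, the override might look something like this; apart from DbDataProvider, CreateCommandBuilder(), and HandlePagination, which are described above, every member name and the delegate signature here are invented for illustration and are not Gemli’s actual API:

// Hypothetical sketch only: aside from DbDataProvider, CreateCommandBuilder()
// and HandlePagination (described above), these names and signatures are
// invented, just to illustrate DB-side paging via SQL Server's ROW_NUMBER().
public class SqlServerPagingProvider : DbDataProvider
{
    protected override DbCommandBuilder CreateCommandBuilder(IDataModelQuery query)
    {
        var builder = base.CreateCommandBuilder(query);
        builder.HandlePagination = (DbCommandBuilder b, int page, int itemsPerPage) =>
        {
            // Wrap the generated SELECT so the database, not the client,
            // discards the rows outside of the requested page.
            int first = (page - 1) * itemsPerPage + 1;
            int last = page * itemsPerPage;
            b.CommandText =
                "SELECT * FROM (SELECT ROW_NUMBER() OVER (ORDER BY " + b.OrderByClause + ") AS __row, src.* " +
                "FROM (" + b.CommandText + ") src) paged " +
                "WHERE __row BETWEEN " + first + " AND " + last;
        };
        return builder;
    }
}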

Count

Support for a basic SELECT COUNT(*) FROM MyTable WHERE [..conditions..] is a pretty obvious part of any minimal database interaction library (O/RM or what have you). For this reason, GetCount<TModel>(DataModelQuery query) has been added to DataProviderBase, and in DbDataProvider it replaces the normal LoadModel command builder generated text with SELECT COUNT(*)…  The DataModelQuery<TModel> class also has a new SelectCount() method which forwards the invocation to the DataProvider.

long numberOfPeepsInMyNetwork = DataModel<Person>.NewQuery()
    .WhereProperty["Networkid"].IsEqualTo(myNetwork.ID)
    .SelectCount();

All of these changes are still pending another alpha release. But if anyone wants to tinker, it’s all in the dev branch of the source code at http://gemli.codeplex.com/ .

LINQ May Be Coming

I still have a lot of grunt work to do to get this next alpha built, particularly in adding a lot more tests. I still haven’t done enough testing of deep loads and deep saves, and I’m not even confident that deep saves are working correctly at the basic level, as I haven’t yet used them. Once all of that is out of my way, I might start looking at adding LINQ support. I’m not sure how awfully difficult LINQ support will prove to be, but I’m certain that it’s doable.

One of the biggest advantages of LINQ support is strongly typed member checking. For example, in Gemli you currently have to declare your WHERE clauses with a string reference to the member name or the column name, whereas with LINQ the member name can be referenced as strongly typed code. As far as I know, LINQ does this by way of syntactic sugar that ultimately boils down to delegates that literally work with the object directly, depending on how the LINQ support was added.

Another advantage of adding LINQ support would be support for anonymous types. For example, right now, without LINQ, you’re limited to DataModels and the class structures they represent, but LINQ lets you select specific members into your return object rather than the whole enchilada.

LINQ:

// only return ID and Name, not the whole Product
var mySelection = (from p in Product
                   select new { p.ID, p.Name }).ToList();

It’s this underlying behavior that creates patterns I could work with to drag support for SQL’s aggregate functions into play using terse Gemli+LINQ semantics. Right now, it’s literally impossible for Gemli to select the result of an aggregate function (other than Count(*)) unless it is wrapped in a stored procedure on the DB side and then wrapped in a DataModel on Gemli’s side. There’s also no GROUP BY support, etc. I like to believe that LINQ will help me consolidate the interface patterns I need to make all of that work correctly. However, so far I’m still just one coder, as no one has jumped on board with Gemli yet, so we’ll see if LINQ support makes its way in at all.



 



