Introducing The Gemli Project

by Jon Davis 23. August 2009 21:11

I’ve just open-sourced my latest pet project. The project was started around the turn of the year, and its scope has been evolving over the last several months. I started out planning a framework for rapid development and management of a certain type of web app, but the first hurdle, as with anything, was picking a favorite DAL methodology. I tend to lean towards O/RMs, as I hate manually managed DAL code, but I didn’t want to license anything (it would mean sublicensing if I redistribute what I produce), I have some issues with LINQ-to-SQL and LINQ-to-Entities, I find nHibernate difficult to swallow, I have some reservations about SubSonic, and I just don’t have the time or interest to keep perusing the many others. Overall, my biggest complaint with all the O/RM offerings, including Microsoft’s, is that they’re too serious. I wanted something lightweight that I don’t have to think about much when building apps.

So as a project in itself I decided first to roll my own O/RM, in my own ideal flavor. Introducing Gemli.Data! It’s a lightweight O/RM that infers as much as possible using reflection, assumes the most basic database storage scenarios, and helps me keep things as simple as possible.

That’s hype-speak to say that this is NOT a “serious O/RM”; it’s intended more for from-scratch prototype projects, for those of us who begin with C# classes, want to persist them, and don’t want to fuss with the database too much.

I got the code functioning well enough (currently 92 unit tests, all passing) that I felt it was worth it to go ahead and let other people start playing with it. Here it is!

Gemli Project Home: http://www.gemli-project.org/ 

Gemli Project Code: http://gemli.codeplex.com/

Gemli.Data is currently primarily a reflection-based mapping solution. Here’s a tidbit sample of functioning Gemli.Data code (this comes from the CodePlex home page for the project):

// attributes only used where the schema is not inferred
// inferred: [DataModelTableMapping(Schema = "dbo", Table = "SamplePoco")]
public class SamplePoco
{
    // inferred: [DataModelFieldMapping(ColumnName = "ID", IsPrimaryKey = true, IsIdentity = true, 
    //     IsNullable = false, DataType = DbType.Int32)] // note: DbType.Int32 is SQL type: int
    public int ID { get; set; }

    // inferred: [DataModelFieldMapping(ColumnName = "SampleStringValue", IsNullable = true, 
    //     DataType = DbType.String)] // note: DbType.String is SQL type: nvarchar
    public string SampleStringValue { get; set; }

    // inferred: [DataModelFieldMapping(ColumnName = "SampleDecimalValue", IsNullable = true, 
    //     DataType = DbType.Decimal)] // note: DbType.Decimal is SQL type: money
    public decimal? SampleDecimalValue { get; set; }
}

[TestMethod]
public void CreateAndDeleteEntityTest()
{
    var sqlFactory = System.Data.SqlClient.SqlClientFactory.Instance;
    var dbProvider = new DbDataProvider(sqlFactory, TestSqlConnectionString);

    // create my poco
    var poco = new SamplePoco { SampleStringValue = "abc" };

    // wrap and auto-inspect my poco
    var dew = new DataModel<SamplePoco>(poco); // data entity wrapper

    // save my poco
    dew.DataProvider = dbProvider;
    dew.Save(); // auto-synchronizes ID
    // or...
    //dbProvider.SaveModel(dew);
    //dew.SynchronizeFields(SyncTo.ClrMembers); // manually sync ID

    // now let's load it and validate that it was saved
    var mySampleQuery = DataModel<SamplePoco>.NewQuery()
        .WhereProperty["ID"].IsEqualTo(poco.ID); // poco.ID was inferred as IsIdentity so we auto-returned it on Save()
    var data = dbProvider.LoadModel(mySampleQuery);
    Assert.IsNotNull(data); // success!

    // by the way, you can go back to the POCO type, too
    SamplePoco poco2 = data.Entity; // no typecast nor "as" statement
    Assert.IsNotNull(poco2);
    Assert.IsTrue(poco2.ID > 0);
    Assert.IsTrue(poco2.SampleStringValue == "abc");

    // test passed, let's delete the test record
    data.MarkDeleted = true; 
    data.Save();

    // ... and make sure that it has been deleted
    data = dbProvider.LoadModel(mySampleQuery);
    Assert.IsNull(data);

}

Gemli.Data supports strongly typed collections and multiple records, too, of course.

var mySampleQuery = DataModel<SamplePoco>.NewQuery()
    .WhereMappedColumn["SampleStringValue"].IsLike("%bc");
var models = dbProvider.LoadModels(mySampleQuery);
SamplePoco theFirstSamplePocoEntity = models.Unwrap<SamplePoco>()[0];
// or.. SamplePoco theFirstSamplePocoEntity = models[0].Entity;

Anyway, go to the URLs above to look at more of this. It will be continually evolving.

C# | Open Source | Software Development | Pet Projects

CAPTCHA This, Yo! - Early Alpha

by Jon Davis 14. July 2009 01:54

I've posted a mini-subproject:

http://www.CAPTCHAThisYo.com/

The site is self-explanatory. The idea is simple. I want CAPTCHA. I don't want to support CAPTCHA in my apps. I just want to drop in a one-liner snippet somewhere and call it done. I think other people share the same desire. So I now support CAPTCHA as a CAPTCHA app. I did all the work for myself so that I don't have to do that work. I went through all that trouble so that I don't have to go through the trouble .... Wait, ...

Seriously, it's not typical CAPTCHA, and it's Not Quite Done Yet (TM). It's something that'll evolve. Right now there isn't even any hard-to-read graphic CAPTCHA.

But what I'd like to do is have an ever-growing CAPTCHA questions library and, by default, have it just rotate through them randomly. The questions might range from shape detection to riddles to basic math. I'd really like to have some kind of community uploads thingmajig with ratings, so that people can basically share their own CAPTCHA solutions and they all run inside the same captchathisyo.com CAPTCHA engine. I'm just not sure yet how to pull that off.
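A minimal sketch of what that rotating questions library might look like under the hood (all names here are hypothetical, not actual CAPTCHA This, Yo! internals):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: a growing library of question/answer CAPTCHA
// entries, one served at random per challenge.
public class CaptchaQuestion
{
    public string Prompt { get; set; }
    public string Answer { get; set; }
}

public static class CaptchaLibrary
{
    private static readonly List<CaptchaQuestion> Questions =
        new List<CaptchaQuestion>
        {
            new CaptchaQuestion { Prompt = "What is 2 + 3?", Answer = "5" },
            new CaptchaQuestion { Prompt = "Which of these is a shape: dog, circle, red?", Answer = "circle" }
        };
    private static readonly Random Rng = new Random();

    // Pick any question from the library at random.
    public static CaptchaQuestion Next()
    {
        return Questions[Rng.Next(Questions.Count)];
    }
}
```

The community-uploads idea would then boil down to letting people add entries (or entry generators) to that library.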

Theoretically, I could take the approach Microsoft took when C# was initially released (long, long ago, in a galaxy far, far away): they had a cool insect sandbox game where you could write a .NET class that implemented some interface, send it up to the server, and it would just run as an insect on the server. The objective of the game was to write the biggest killer/eater. I'm not sure how feasible the idea of opening up all .NET uploads to the server is, but it's something I'm pondering.

Anyway, the concept has been prototyped, and the deployed prototype is sound, but I still need to work out cross-site scripting limitations, so bear with me. I also still need to find a designer to make something beautiful out of it. That said, feel free to use it and give feedback. Stand by.

C# | Computers and Internet | Cool Tools | Pet Projects | Techniques | Web Development

TDD: Here I Go!!

by Jon Davis 4. April 2009 11:57

About a week and a half ago I finally started breaking ground (that is, created a Visual Studio solution and started writing code, with the intention of it being used in production) on a big composite pet project I've been considering for a very long time. I won't get into what the project is going to be about (at least not now), but I will say parts of it form a multi-faceted software foundation / framework. So, once I'm done, I'll be able to take a bunch of the code I wrote here and re-apply it on a completely different project. This is important to me, as I am seriously in need of gaining some inertia in pet project development and deployments.

So right away I started writing tests to "prove out" each little piece of code as I wrote it. I'd heard the TDD gospel and wanted to convert, but I don't have a lot of TDD practice under my belt. But right away I had about 75% code coverage, which I felt was pretty good compared to 0% of projects in the past. Every time I wrote a little bit of code, I hit Ctrl+R, A to be sure nothing I just did failed. This worked fine, at the beginning, while working with simple, lightweight objects.

Pretty soon I found myself implementing abstract provider/service classes, and then creating at least one or two implementations of them. And then I wrote this big dependency object that gets passed around back and forth. By the time I had some prototype code written, I realized that my tests would not break, because I hadn't been writing code around my tests; I had been writing my tests around the code I had been writing. And at this point I was clueless as to whether the code I had just written would work, because this was all plumbing code. I wanted to at least try something and see if the basics were working, so I created a console application and wrote some code with Console.Write()'s to get a visual on the output. But at this point things were getting messy. The dependency object I had created was tailored to the one implementation class of my abstract provider/service object. I now had a throw-away console app in my solution that didn't belong. And my test coverage was down to something like 5%.

That's when I realized I was going about this all wrong. For months I had a hard time understanding why TDD proponents argued that writing code before their tests is "putting the cart before the horse", when writing tests before code seemed so backwards to me. But now it started to click. For many years, I've been designing code in my head, then implementing those designs in code, and if it works, great, if not, I fix the bugs. I'm starting to see now how I need to retrain myself. It's okay to design code in your head. You just need to document those designs in tests. At first, they won't even compile. Fine. But at least document your designs in tests.
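To make that concrete, here is what "documenting a design in a test" might look like, sketched in the same MSTest style. SlugGenerator and its members are hypothetical names; the class does not exist yet, so this test will not even compile at first, and that's exactly the point:

```csharp
// Written BEFORE SlugGenerator exists. This test IS the design document;
// it defines the intended behavior and won't compile until the class is written.
[TestMethod]
public void Generate_LowercasesAndHyphenatesTitle()
{
    var generator = new SlugGenerator();
    string slug = generator.Generate("Hello World");
    Assert.AreEqual("hello-world", slug);
}
```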

Funny, I haven't ever heard anyone make the connection, but writing tests with mocks of the way you want your code to work is exactly the same to a programmer as wireframes are to web design. One should not design a web page by throwing some HTML into a web site, trying to make it look nice and complete, and then showing it to the customer for the first time and saying, "Here you go, it's done. Now tell me what's wrong with it." Rather, a professional web production team would first collect the requirements (in workflow verbiage) of the web site, throw together some wireframe mock-ups, create Photoshop design comps in parallel, and then go to implementation, all the while following up with the customer for feedback during each and every step along the way. TDD is exactly the same but for the programmer. The tests are wireframes and business rules, just like a web application spec has bullet-point business rule text that serves as manual tests for all parties involved to both define and validate the implementation.

Even if there is no third party customer who would approve your "wireframes" and you're working solo, the analogy of "wireframing a web design" should still apply. A web designer/engineer can, but shouldn't, create a web site before wireframing it. He should start with a napkin, and go from there. Likewise, one can, but shouldn't, write code before writing tests for that code, because tests are not just validations that assumed functionality works but are also definitions of assumed functionality, and without definitions of assumed functionality there are really no coding objectives.

The hardest part of TDD for me so far in this new experience has been realizing I wasn't doing TDD in the first place, and then going back and scrapping all the code I had just written, whether it was good or not. I knew it had a little bit of 'bad' mixed in with the good, and that should not be acceptable. Awareness of "a little bit of bad code buried in there" typically means mostly bad code in the long run, and a lot of wasted time, because bad code cascades and affects everything else, like yeast.

One other thing about these incidents is that I've also been re-learning the theory of YAGNI (you ain't gonna need it!). I was getting too comfortable with the idea of introducing extra method signatures on objects that were not getting properly tested to begin with, and for which my use cases were purely theoretical, not truly intentional. I've argued here in this blog in the past that YAGNI is extremely naive and not the appropriate attitude when writing software in smaller shops (like one- or two-man bands), because the multi-role developer often knows exactly what he needs and should not be withheld from adding what he needs in order to do his job. That's great; ignore YAGNI, so long as you're taking an "undisciplined", less-formal approach to software, or are writing software that is not intended to be reused the way a framework is. However, in my case, I'm writing framework bits, and I must balance ease of use by way of high versatility against ease of use by way of simplicity. Other programmers, including myself at some later point in time, should not be overwhelmed with options. Programmers are software users, just like their end-users are software users. The end-users use the mouse and click simple things, whereas programmers use framework code to write code; fine, either way, they're all users. The Apple and Google approaches are to keep it simple, stupid (KISS). Don't overwhelm the user with options, especially if either of two options reaches the same outcome. I should define one "best practices" path and only introduce the other path when it is needed, not when it is slightly and occasionally convenient.

Part of the reason I write like this is to drive it into my head. I still settle for less than what I preach sometimes.

There's one other thing: tests are a part of the TDD strategy, but I'm also beginning to think that once I have made enough headway into these framework bits to actually write some code that passes tests, I might also start writing some application/implementation project code. Call it a manual test if you like. The tests themselves should not lose priority. But I think it makes sense, to an extent, to see code running in action as soon as it's feasible, in order to prove out code design prototypes. Writing application code should NOT substitute for additional tests, but rather take advantage of what's already passing and help guide the tests as the tests are being written. In this light, the only problem I have with TDD is that TDD is spoken of too often in terms of unit and integration tests, whereas the whole lifecycle of designing with tests, and of testing, must be much broader than unit and integration tests. TDD starts with unit tests, but unit tests, by definition, only test one tiny unit of functionality. In order to really drive your design and development through tests, you need to be able to test workflows, too. Starting from Point A, given n parameters, we should ultimately arrive at B, and this might involve several steps.
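A workflow-level test, as opposed to a unit test, might read something like this (all names are hypothetical): start at Point A with given parameters, step through several operations, and assert arrival at Point B.

```csharp
// A workflow test spanning several steps, not one tiny unit.
// Order, OrderState, and the method names are hypothetical.
[TestMethod]
public void SubmittedAndPaidOrder_ArrivesAtShippedState()
{
    var order = new Order("WidgetSku", 2);            // Point A
    order.Submit();                                   // step 1
    order.ApplyPayment(19.99m);                       // step 2
    order.Ship();                                     // step 3
    Assert.AreEqual(OrderState.Shipped, order.State); // Point B
}
```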

Oh, I still need to watch Rob Conery's MVC Storefront videos where he, too, learned TDD. I got about ten or so videos in until I caught up, and then stopped watching while he kept streaming them out.

C#

C#: Type Inferences For Generic Methods

by Jon Davis 17. March 2009 12:40

C# 3.0 has a cool feature I didn’t realize existed until now. I already knew that you can create generic methods, like so:

protected static T CaseInsensitiveFindValue<T>(Dictionary<string, T> dic, string key)
{
    if (dic.ContainsKey(key)) return dic[key];
    foreach (var kvp in dic)
    {
        if (string.Equals(kvp.Key, key, StringComparison.OrdinalIgnoreCase)) return kvp.Value;
    }
    return default(T);
}

That goes back to 2.0. What I didn’t realize until now was that when you invoke this method, you don’t have to pass the type in as a type argument for T. The type can be inferred!

var value = CaseInsensitiveFindValue(myDictionary, myKey);
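One caveat worth noting: inference works from the method's arguments, so a type parameter that appears only in the return type can never be inferred. A quick demonstration with hypothetical helper methods:

```csharp
using System;
using System.Collections.Generic;

class InferenceDemo
{
    // T is inferable here because it appears in the parameter list.
    static T FirstOrFallback<T>(List<T> list, T fallback)
    {
        return list.Count > 0 ? list[0] : fallback;
    }

    // T appears only in the return type, so it can never be inferred.
    static T CreateDefault<T>()
    {
        return default(T);
    }

    static void Main()
    {
        var numbers = new List<int> { 5, 10 };
        int first = FirstOrFallback(numbers, -1);   // T inferred as int
        // int d = CreateDefault();                 // compile error: cannot infer T
        int d = CreateDefault<int>();               // type argument required here
        Console.WriteLine(first + " " + d);         // prints "5 0"
    }
}
```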

C# | Software Development

Simple Dependency Mapping

by Jon Davis 19. February 2009 07:19

Here’s my idea of simple dependency mapping, without an IoC container. The idea is to drop your binaries into the bin directory, optionally (yes, optionally) name your dependencies somewhere such as in app.config / web.config, and let ‘er rip.

I still hate Castle Windsor, et al, and will never write code around it if I can avoid it.

By the way, this is just DI, not IoC. For IoC without an IoC container, I still like eventing. See my past posts:

Dependency mapping implementation example (conceptual; one can and should use better code than this):

// Scenario infers IMyService as a referenced/shared interface-only type library
// used on both the invoker and the IMyService implementor.

// Example 1:
// match on Type.FullName
// (the cast to Type is needed because GetDependency returns object)
IMyService myService = (IMyService)Activator.CreateInstance(
	(Type)Dependency.GetDependency<IMyService>(ConfigurationManager.AppSettings["myServiceProvider"]));
myConsumer.MyDependency = myService;

// Example 2:
// match on Type.Name case insensitively; NOTE: This is not recommended.
IMyService myOtherService = (IMyService)Activator.CreateInstance(
	(Type)Dependency.GetDependency<IMyService>(ConfigurationManager.AppSettings["myServiceProvider2"], true));
myConsumer.MyDependency = myOtherService;

// Example 3:
// Only one match? Too lazy to name it? Just take it. NOTE: This is not recommended.
IMyService whateverService = (IMyService)Activator.CreateInstance(
	(Type)Dependency.GetDependency<IMyService>());
myConsumer.MyDependency = whateverService;

Code:

using System.Reflection; //, etc...

// ...

public class Dependency
{
    public static Dictionary<Type, List<Type>> KnownServices
        = new Dictionary<Type, List<Type>>();


    /// <summary>
    /// Returns any <see cref="Type"/> in the current <see cref="AppDomain"/>
    /// that is a <typeparamref name="T">T</typeparamref>. 
    /// </summary>
    /// <typeparam name="T">The type in the <see cref="AppDomain"/> that the
    /// return <see cref="Type"/> should be or inherit.</typeparam>
    /// <returns></returns>
    public static object GetDependency<T>()
    {
        return GetDependency<T>(null);
    }

    /// <summary>
    /// Returns any <see cref="Type"/> in the current <see cref="AppDomain"/>
    /// that is a <typeparamref name="T">T</typeparamref>. 
    /// If <paramref name="by_name"/> is not <code>null</code>, only the type(s) whose
    /// <see cref="Type.FullName"/> matches the specified <paramref name="by_name"/> are returned.
    /// </summary>
    /// <typeparam name="T">The type in the <see cref="AppDomain"/> that the
    /// return <see cref="Type"/> should be or inherit.</typeparam>
    /// <param name="by_name">The type name that should match the return value(s).</param>
    /// <example>object myService = Activator.CreateInstance(GetDependency&lt;IMyService&gt;);</example>
    public static object GetDependency<T>(string by_name)
    {
        return GetDependency<T>(by_name, false);
    }

    /// <summary>
    /// Returns any <see cref="Type"/> in the current <see cref="AppDomain"/>
    /// that is a <typeparamref name="T">T</typeparamref>. 
    /// If <paramref name="by_name"/> is not <code>null</code>, only the type(s) matching
    /// the specified name will be returned.
    /// </summary>
    /// <typeparam name="T">The type in the <see cref="AppDomain"/> that the
    /// return <see cref="Type"/> should be or inherit.</typeparam>
    /// <param name="by_name">The type name that should match the return value(s).</param>
    /// <param name="short_name_case_insensitive">If true, ignores the namespace,
    /// casing, and assembly name. For example, a match on type <code>Dog</code>
    /// might return both <code>namespace_a.doG</code> and <code>NamespaceB.Dog</code>.
    /// Otherwise, a match is made only if the <see cref="Type.FullName"/> matches
    /// exactly.
    /// </param>
    /// <returns>A <see cref="Type"/>.</returns>
    /// <example>object myService = Activator.CreateInstance(GetDependency&lt;IMyService&gt;);</example>
    public static object GetDependency<T>(string by_name, bool short_name_case_insensitive)
    {
        if (by_name != null && !short_name_case_insensitive)
        {
            return Type.GetType(by_name, true);
        }
        Init<T>();
        var t_svcs = KnownServices[typeof(T)];
        if (string.IsNullOrEmpty(by_name))
        {
            if (t_svcs.Count == 0) return null;
            if (t_svcs.Count == 1) return t_svcs[0];
            return t_svcs; // more than one, return the whole list
        }
        return t_svcs.Find(t => string.Equals(t.Name, by_name, StringComparison.OrdinalIgnoreCase));
    }

    private static readonly Dictionary<Type, bool> Inited = new Dictionary<Type, bool>();
    private static void Init<T>()
    {
        if (Inited.ContainsKey(typeof(T)) && Inited[typeof(T)]) return;
        if (!KnownServices.ContainsKey(typeof(T)))
            KnownServices.Add(typeof(T), new List<Type>());
        var refAssemblies = new List<Assembly>(AppDomain.CurrentDomain.GetAssemblies());
        foreach (var assembly in refAssemblies)
        {
            Type[] types = assembly.GetTypes();
            foreach (var type in types)
            {
                if (type.IsClass && !type.IsAbstract &&
                    typeof(T).IsAssignableFrom(type))
                {
                    KnownServices[typeof(T)].Add(type);
                }
            }
        }
        Inited[typeof(T)] = true;
    }
}

C# | Software Development

EntitySpaces 2009 Q1 WCF Demo

by Jon Davis 25. January 2009 16:45

I created a new WCF demo for EntitySpaces, one of the most popular ORM solutions available for .NET, which now comes with its own code generator (it no longer relies on CodeSmith or myGeneration). The demo is bundled in the Release Candidate for v2009 Q1. (The developer version is released; the trial version will be released tomorrow.) This one includes both console and Windows Forms clients, and a console-based service, showing the barebones basics of what it takes to get EntitySpaces working with WCF. Both full proxies (EntitySpaces runtime libraries referenced on the client) and lightweight proxies/stubs (*no* EntitySpaces runtime libraries referenced on the client) are demonstrated, but the lightweight demo is currently limited to a console app.

Next on my plate will be a WPF demo for the lightweight proxies/stubs. No guarantees...

Anyway, here’s the documentation that went with the demo. It got posted on the EntitySpaces blog.

http://www.entityspaces.net/blog/2009/01/25/EntitySpaces+2009+And+WCF.aspx

C# | Cool Tools | Pet Projects | Software Development | SQL Server | WCF

XmlSerialized<T> and BinarySerialized<T>

by Jon Davis 21. January 2009 20:46

Here are a couple classes that I threw together for web services and WCF usage where the client is using the same data class library as the server. Microsoft auto-serializes stuff, but when serializing explicitly I prefer to retain an explicit type reference rather than passing and manually deserializing a string or byte array. In other words, if I see a web service API return a string or byte[] as an XML or binary serialization of a complex class, I get cranky, because the client should know, without having to resort to external documentation, how the deserialization is type-mapped.

I'm sure there are a bunch of other uses for these, like saving an object graph to disk in two lines of code, etc. In fact, I'm also using XmlSerialized<T> to convert one compatible object type (full) to another object type (lightweight) and back again.

Being as these were indeed thrown together, I give no guarantees of anything. They worked for me.

Scenario:

// take ..
public string GetMyObject() {
	var myObject = new MyType();
	var sw = new System.IO.StringWriter();
	var serializer = new XmlSerializer(typeof(MyType));
	serializer.Serialize(sw, myObject);
	return sw.ToString();
}
// .. which is a total pain to deserialize, 
// and replace it altogether with ..
public XmlSerialized<MyType> GetMyObject() {
	return new XmlSerialized<MyType>(new MyType());
}
// which can be deserialized either manually or "automagically"
// using .Deserialize().

Usage:

var myObject = new MyType();
// serialize to XML
var myXmlSerializedObject = new XmlSerialized<MyType>(myObject);
// preview the serialized value
string serializedValue = myXmlSerializedObject.SerializedValue;
// create it on the client
myXmlSerializedObject = new XmlSerialized<MyType>(serializedValue);
// and deserialized back to POCO
var myObjectAgain = myXmlSerializedObject.Deserialize();

// binary-serialized and compressed (slower CPU, smaller footprint)
var myBinarySerializedObject = new BinarySerialized<MyType>(myObject, true);
byte[] binaryValue = myBinarySerializedObject.SerializedValue;
bool is_it_compressed = myBinarySerializedObject.IsCompressed;
myObjectAgain = myBinarySerializedObject.Deserialize();
// uncompressed (faster CPU, larger footprint)
var myUncompressedSerializedObject = new BinarySerialized<MyType>(myObject, false);
byte[] uncompressedBinaryValue = myUncompressedSerializedObject.SerializedValue;
is_it_compressed = myUncompressedSerializedObject.IsCompressed;
myObjectAgain = myUncompressedSerializedObject.Deserialize();

The classes:

[Serializable]
[DataContract]
public class XmlSerialized<T>
{
    public XmlSerialized() { }
    public XmlSerialized(string serializedValue)
    {
        this.SerializedValue = serializedValue;
    }
    public XmlSerialized(T value)
    {
        Stream stream = new MemoryStream();
        Serializer.Serialize(stream, value);
        stream.Seek(0, SeekOrigin.Begin);
        StreamReader sr = new StreamReader(stream);
        this.SerializedValue = sr.ReadToEnd();
        sr.Close();
        stream.Close();
    }

    [System.Xml.Serialization.XmlIgnore]
    private string _serializedValue = null;
    [System.Xml.Serialization.XmlElement]
    [DataMember]
    public string SerializedValue
    {
        get { return _serializedValue; }
        set { _serializedValue = value; }
    }

    public T Deserialize()
    {
        Stream s = new MemoryStream();
        StreamWriter sw = new StreamWriter(s);
        sw.Write(this.SerializedValue);
        sw.Flush();
        s.Seek(0, SeekOrigin.Begin);
        T value = (T)Serializer.Deserialize(s);
        return value;

    }

    private static XmlSerializer _Serializer = null;
    private static XmlSerializer Serializer
    {
        get
        {
            if (_Serializer == null) _Serializer = new XmlSerializer(typeof(T));
            return _Serializer;
        }
    }

    public virtual To_T ConvertTo<To_T>()
    {
        var toObj = new XmlSerialized<To_T>(this.SerializedValue);
        return toObj.Deserialize();
    }

    public virtual T ConvertFrom<FromT>(FromT obj)
    {
        var fromObj = new XmlSerialized<FromT>(obj);
        this.SerializedValue = fromObj.SerializedValue;
        return this.Deserialize();
    }

}

[Serializable]
[DataContract]
public class BinarySerialized<T>
{
    [DataMember]
    public bool IsCompressed { get; set; }
    public BinarySerialized(T value, bool compressed)
    {
        this.IsCompressed = compressed;
        var bf = new BinaryFormatter();
        var ms = new MemoryStream();
        bf.Serialize(ms, value);
        byte[] bytes = ms.ToArray();
        SerializedValue = compressed ? Compress(bytes) : bytes;
    }
    public BinarySerialized(T value)
        : this(value, true) {}
    public BinarySerialized(byte[] serializedValue)
    {
        this.SerializedValue = serializedValue;
    }
    public BinarySerialized() { }

    [DataMember]
    public byte[] SerializedValue { get; set; }

    public T Deserialize()
    {
        byte[] bytes = IsCompressed 
            ? Decompress(SerializedValue) 
            : SerializedValue;
        var ms = new MemoryStream(bytes);
        ms.Position = 0;
        var bf = new BinaryFormatter();
        T retval = (T)bf.Deserialize(ms);
        return retval;
    }

    protected virtual byte[] Compress(byte[] byteArray)
    {
        //Prepare for compress
        System.IO.MemoryStream ms = new System.IO.MemoryStream();
        System.IO.Compression.GZipStream sw = new System.IO.Compression.GZipStream(ms,
            System.IO.Compression.CompressionMode.Compress);

        //Compress
        sw.Write(byteArray, 0, byteArray.Length);
        sw.Flush();
        sw.Close();

        return ms.ToArray();
    }


    protected virtual byte[] Decompress(byte[] byteArray)
    {
        //Prepare for decompress
        var ms = new System.IO.MemoryStream(byteArray);
        ms.Position = 0;
        var sr = new System.IO.Compression.GZipStream(ms,
            System.IO.Compression.CompressionMode.Decompress);

        //Reset variable to collect uncompressed result
        int buffer_length = 100;
        byteArray = new byte[buffer_length];

        //Decompress
        int offset = 0;
        while (true)
        {
            if (offset + buffer_length > byteArray.Length)
            {
                byte[] newArray = new byte[offset + buffer_length];
                Array.Copy(byteArray, newArray, byteArray.Length);
                byteArray = newArray;
            }
            int rByte = sr.Read(byteArray, offset, buffer_length);
            if (rByte == 0)
            {
                byte[] retval = new byte[offset];
                Array.Copy(byteArray, retval, offset);
                byteArray = retval;
                break;
            }
            offset += rByte;
        }

        return byteArray;
    }
}

C# | Software Development

LINQ Didn't Replace SQL Queries

by Jon Davis 3. January 2009 04:17

Just observing the obvious here, but about three and a half or so years ago when I heard about C-Omega, which later became LINQ, which then later became a part of C# 3.0, I got the impression that LINQ would perform SQL querying on the fly with integrated SQL-like syntax directly inline with C#, with strong typing and everything. In other words, I thought the language would inherently know ADO.NET/SQL semantics.

It does, but, well, no, it doesn't. LINQ isn't an inline SQL translator at all. It is only syntactic sugar over a set of interfaces to a provider-neutral model for querying different sources of data. Once the provider is set, LINQ is still not translating SQL inline.

LINQ-to-SQL, LINQ-to-Entities, LINQ-to-NHibernate, LINQ-to-YoMamasDB, all of these use ORM templating and database boilerplating with class-specific generated code to match up the generated SQL.

I'm not against ORM, but it's still too much for smaller (read: tiny) quick-and-dirty hacks. In these cases LINQ (-to-anything) would be overkill galore in the database context. I do say this as a complaint: I still have to use plain old ADO.NET for quick-and-dirty SQL invocation additions; there's no way around it without making it not-so-quick-and-dirty.
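For the record, the quick-and-dirty plain ADO.NET fallback looks something like this (the connection string, table, and column names are placeholders):

```csharp
// The quick-and-dirty fallback: no ORM, no codegen, just plain ADO.NET.
using System.Data.SqlClient;

static string GetWidgetName(int id)
{
    using (var conn = new SqlConnection("server=.;database=Scratch;Integrated Security=SSPI"))
    using (var cmd = new SqlCommand("SELECT Name FROM Widget WHERE ID = @id", conn))
    {
        cmd.Parameters.AddWithValue("@id", id);
        conn.Open();
        return (string)cmd.ExecuteScalar();
    }
}
```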

Meanwhile, LINQ-to-Objects and LINQ-to-XML are legit. No boilerplating / generated code there. Very sweet.
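For contrast, here is LINQ-to-Objects earning its keep with zero boilerplate against a plain in-memory array:

```csharp
using System;
using System.Linq;

class LinqToObjectsDemo
{
    static void Main()
    {
        var words = new[] { "alpha", "beta", "gamma", "delta" };

        // No mapping, no generated code; just query the collection in place.
        var query = from w in words
                    where w.Length == 5
                    orderby w
                    select w.ToUpper();

        Console.WriteLine(string.Join(", ", query.ToArray()));
        // prints "ALPHA, DELTA, GAMMA"
    }
}
```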

C#

Silverlight 3: 3D Support, But Is This True 3D?

by Jon Davis 6. December 2008 13:49

Just a couple days after my repeat whine about Silverlight not having a rasterization API and 3D support I saw that Scott Guthrie announced that Silverlight 3 will offer "3D support and GPU hardware acceleration". I nearly crapped my pants when I read that. :)

What's not clear is whether 3D support will come via GPU hardware acceleration. I'm of the firm mindset that true 3D is OpenGL or Direct3D; anything else is fake. So if GPU hardware acceleration is fully applied to the 3D support, it may be using OpenGL / D3D under the covers.

Otherwise, I'm just going to assume that the so-called GPU acceleration is intended to assist with video playback and vector graphics. I base that assumption only on the fact that Adobe Flash "supports 3D" (has an API) but limits its hardware acceleration to video playback and vector graphics. Honestly, I'm not familiar with how hardware acceleration of vector graphics works, but given that they claim it, I believe it's feasible.

To be honest, I think it would be just about as bad if GPU acceleration was applied to 3D support but not to vector graphics.

Now, on "rasterization": I over-use the word, really, but all I wish for is two things: a mutable bitmap buffer where we can plot pixels, and the ability to take "snapshots" of the vector brushes being displayed (i.e. myEllipsis.ToBitmap()) and export them to that same bitmap buffer. The buffer should be reusable for textures on vector artwork as well as for the coming 3D support.

This is NOT an advanced component that would bloat the framework; such a thing is EXTREMELY simple, particularly considering that Silverlight already supports images and manages bitmaps under the covers. It's just so heavily encapsulated that you're limited to URL maps and embedded files. I'm not asking for GDI+ functions! I just want to be able to retain a bitmap in memory, reuse it, and maybe modify it by plotting pixels. There is no workaround for this: one could create his own bitmap class and write to a data stream, but then Silverlight still can't use it (without persisting the data stream somewhere), because it's limited to URLs and embedded files!!
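To be concrete about how little I'm asking for, here's a back-of-napkin sketch of the mutable buffer described above. The class and member names are mine, not any Silverlight API; the point is just a reusable in-memory pixel surface you can plot to and hand back to the framework as a texture.

```csharp
using System;

// Hypothetical sketch: a reusable 32-bit ARGB pixel surface.
public class PixelBuffer
{
    private readonly int[] _pixels;

    public int Width { get; }
    public int Height { get; }

    public PixelBuffer(int width, int height)
    {
        Width = width;
        Height = height;
        _pixels = new int[width * height];
    }

    // Plot a single pixel; this is the entirety of the "advanced" API I want.
    public void PlotPixel(int x, int y, int argb)
    {
        if (x < 0 || x >= Width || y < 0 || y >= Height)
            throw new ArgumentOutOfRangeException();
        _pixels[y * Width + x] = argb;
    }

    public int GetPixel(int x, int y)
    {
        return _pixels[y * Width + x];
    }
}
```

The framework would only need to accept such a buffer wherever it currently accepts a URL-loaded or embedded image.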

I'm starting to get a better picture of Microsoft in the Silverlight context lately. While I greatly respect Microsoft's tools and capacity to think brilliantly, as well as their enthusiast division(s) (XNA, Zune, Xbox, Games for Windows, etc.), I think the fact that those enthusiast division(s) are not very involved in the Silverlight product is seriously undermining the adoptability of Silverlight in competition with Flash. I acknowledge that there's not a lot of money to be made from it, but I at least consider it good marketing.

But on the enterprise business side, I truthfully cannot give Silverlight credit for meeting demand yet, despite the remarkable uses some have found for it (with some effort), because of the lack of high-level controls and APIs, although I fully respect and appreciate the fact that it is on track to meet demand. The new Controls Toolkit is great, but the controls as demonstrated in the samples, and the preexisting datagrid in normal use, all demonstrate that Silverlight has performance issues. Performance issues are of serious concern in business environments every bit as much as in enthusiast niches, because they amount to decreased usability, which in turn amounts to turned-off customers and ultimately lost profits. An example is the horrible scrolling performance, most visible in the datagrid, although I've come to realize that Silverlight's scrolling seems to have improved somewhat since the last beta (when I stopped watching Silverlight because I was so turned off). Still, not improved enough. The Silverlight datagrid looks good until you touch it, and then it makes your computer feel like something from the late 90s.

Actually, scrolling is only one example. It's user responsiveness on the whole; buttons have a similar problem. In fact, all the input objects do. In general, while I don't question or doubt Silverlight's capacity to look beautiful or to run very high-FPS animations, I think where it doesn't shine at all is user input responsiveness. There are exceptions; Microsoft and Silverlight enthusiasts will always have some great demos lying around. But generally I've noticed a problem with user responsiveness, particularly when scrolling or dragging comes into play.

It's not a dealbreaker issue; not so bad that it's unusable. It's just bad enough to keep me from using Silverlight in everything I do. I prefer DHTML and its less-sexy appearance. Responsiveness is more important than appearance, and AJAX gets the job done in most cases. Silverlight meanwhile can fill in gaps where Javascript can't cut it, but then, so can Flash, and by the time you're already choosing to stick with HTML for the basics there's no compelling reason to use Silverlight except for the CLR and WMV support.



C# | Silverlight

LINQ-to-SQL On The Web Client

by Jon Davis 4. December 2008 09:29

This is really old news, but something I didn't realize until last night at a Silverlight users group meeting.

Silverlight 2.0 brought us LINQ-to-Objects and LINQ-to-XML on the client. Bravo, yay, etc., but LINQ-to-SQL was out of the question because you can't make an Internet-based T-SQL connection (i.e. over port 1433), for obvious reasons (*cough* security).

But Service Pack 1 for Visual Studio 2008, which introduced ADO.NET Data Services, also brought with it transparent LINQ-to-SQL support over WCF. This is bizarre to me. I haven't tried it; I've only heard it mentioned. Frankly I'm still worried about security, as it sounds a lot like RDO (Remote Data Objects) from back in the ASP Classic / ActiveX days, which turned out to be a huge security disaster. Nonetheless, I'm curious how this might work securely, and how it might make a modern developer's workflow much, MUCH more pleasant than manually wiring up WCF for data synchronization.



C# | Web Development


 

Powered by BlogEngine.NET 1.4.5.0
Theme by Mads Kristensen

About the author

Jon Davis (aka "stimpy77") has been a programmer, developer, and consultant for web and Windows software solutions professionally since 1997, with experience ranging from OS and hardware support to DHTML programming to IIS/ASP web apps to Java network programming to Visual Basic applications to C# desktop apps.
 
Software in all forms is also his sole hobby, whether playing PC games or tinkering with programming them. "I was playing Defender on the Commodore 64," he reminisces, "when I decided at the age of 12 or so that I want to be a computer programmer when I grow up."

Jon was previously employed as a senior .NET developer at a very well-known Internet services company whom you're more likely than not to have directly done business with. However, this blog and all of jondavis.net have no affiliation with, and are not representative of, his former employer in any way.



