A .NET Server Control for Imaging (Part I)

When I designed Structure Too Big, I knew that an image control would be necessary. I've got hundreds of photos, artwork, etc., and back when I was doing my first web sites (circa 1995), I'd need to bring each photo into an editor, crop it correctly, place a drop shadow, etc. Even with batch processing, it's a pain and frankly not very cool.

This time around, I wanted the web site, as an application, to handle all of this for me. The first step was handling image resizing dynamically. In an earlier site I developed a few years ago, I used AutoImageSize from UnitedBinary (here). It works great in ASP, and the author has created a .NET wrapper. But I preferred a managed solution -- and ideally, I wanted to write it myself using GDI+.

After some initial prototyping, I was pretty happy with the results except for one rather annoying bug in GDI+ when using the High Quality Bicubic filter. The filter occasionally mishandles the image edge, leaving a 1-pixel strip of nastiness along one or two of the image borders.

One requirement I had for the resizer is that it must cache its output to a folder. Even though the output can be cached via fragment caching of the control that invokes this object, there's no need to continually resize images -- it's a fairly expensive operation -- and I wanted the object to handle its own caching.
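For what it's worth, a commonly cited workaround for that edge artifact is to set the wrap mode to TileFlipXY, so the filter samples mirrored edge pixels instead of transparency at the border. A minimal sketch (this helper is illustrative, not the actual resizer code):

```csharp
using System.Drawing;
using System.Drawing.Drawing2D;
using System.Drawing.Imaging;

static Bitmap ResizeImage(Image source, int width, int height)
{
    Bitmap result = new Bitmap(width, height);
    using (Graphics g = Graphics.FromImage(result))
    using (ImageAttributes attrs = new ImageAttributes())
    {
        g.InterpolationMode = InterpolationMode.HighQualityBicubic;

        // TileFlipXY mirrors the image at its edges, so the bicubic filter
        // has real pixels to sample -- avoiding the 1-pixel border artifact.
        attrs.SetWrapMode(WrapMode.TileFlipXY);

        g.DrawImage(source,
            new Rectangle(0, 0, width, height),
            0, 0, source.Width, source.Height,
            GraphicsUnit.Pixel, attrs);
    }
    return result;
}
```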

I've posted the C# class file for this solution, along with documentation, here. Hopefully within the next few days I'll get the actual server control documented, cleaned up, and posted online as well.

A couple of things in the resizer class are worth blogging about. The resizer writes the new images to a cache folder, and I spent quite a bit of time trying to figure out what to name the files. In my original prototype, I simply kept the original name, appending parameters onto the filename. When moving the code out of the prototyping phase, I wanted to change this because the filename revealed too much of the implementation -- generally a bad idea.

After discussing some methods with a buddy, I came up with the following routine to hash the filename into a Guid. The cool part is that, with minor modifications, this is useful in other applications.

private static string HashFile(FileInfo fileinfo, int Width, int Height,
    int Quality, AspectRatio Aspect, string AntiLeechKey) {

    // Pack the resize parameters and the file's last-write time into the
    // Guid's 8 trailing bytes.
    byte[] GuidBytes = new byte[8];
    GuidBytes[0] = (byte)Quality;
    GuidBytes[1] = (byte)(int)Aspect;
    GuidBytes[2] = (byte)fileinfo.LastWriteTimeUtc.Second;
    GuidBytes[3] = (byte)fileinfo.LastWriteTimeUtc.Minute;
    GuidBytes[4] = (byte)fileinfo.LastWriteTimeUtc.Hour;
    GuidBytes[5] = (byte)fileinfo.LastWriteTimeUtc.Day;
    GuidBytes[6] = (byte)fileinfo.LastWriteTimeUtc.Month;
    GuidBytes[7] = (byte)fileinfo.LastWriteTimeUtc.Year;

    // The leading int is a hash of the anti-leech key plus the full path;
    // the two shorts carry the target dimensions.
    Guid FileHash = new Guid(
        string.Format("{0}{1}", AntiLeechKey, fileinfo.FullName).GetHashCode(),
        (short)Width,
        (short)Height,
        GuidBytes[0],
        GuidBytes[1],
        GuidBytes[2],
        GuidBytes[3],
        GuidBytes[4],
        GuidBytes[5],
        GuidBytes[6],
        GuidBytes[7]
        );

    // Strip the dashes and keep the original extension.
    return string.Format("{0}{1}",
        FileHash.ToString().Replace("-",""),
        fileinfo.Extension
        );
}


While the probability of a collision is not zero, there's virtually no chance this code will produce the same value for two different files. You wouldn't want to rely on a string's GetHashCode by itself, but folding in the additional parameters makes a collision extremely unlikely. And if you're looking at the code carefully, yes, I know: converting the year to a byte means the value repeats every 256 years. It's a limitation, but not realistically an issue.
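To make the hashing concrete, here's a hypothetical call. The path and the AspectRatio.Preserve member are made up for illustration, and since HashFile reads the file's LastWriteTimeUtc, the file would need to exist:

```csharp
FileInfo photo = new FileInfo(@"C:\site\images\pictures\family.jpg");

// Same file, same parameters, same key: the name is deterministic, so the
// resizer can find its cached output again on the next request.
string a = HashFile(photo, 320, 240, 85, AspectRatio.Preserve, "key1");
string b = HashFile(photo, 320, 240, 85, AspectRatio.Preserve, "key1");

// A different AntiLeechKey salts the hash, producing a different filename.
string c = HashFile(photo, 320, 240, 85, AspectRatio.Preserve, "key2");

// a == b, while a != c (with overwhelming probability)
```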

The second blog-worthy topic is the antileeching capability in the class. Bandwidth leeching occurs when site A uses content from site B directly in site A's pages; the result is that site B pays the bandwidth and overhead for serving content to site A. One of the best ways to prevent leeching is to implement an HttpHandler that inspects each image request, but this may not be feasible and does have some performance and usage considerations. The implementation here is very simple: an AntiLeech key (a string) is hashed with the filename. Essentially the key acts as a salt, so changing it changes the Guid filename. This allows the image URLs on the site to be seamlessly changed, and the old cached images can be deleted, thus cutting leechers off.
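For comparison, the HttpHandler approach mentioned above might look something like this -- a rough sketch only, with an illustrative class name, not part of the ImageResizer class:

```csharp
using System;
using System.Web;

public class AntiLeechHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        Uri referrer = context.Request.UrlReferrer;

        // Allow requests with no referrer (direct hits, some proxies) or a
        // referrer from our own host; reject everything else.
        if (referrer == null ||
            string.Equals(referrer.Host, context.Request.Url.Host,
                          StringComparison.OrdinalIgnoreCase))
        {
            context.Response.ContentType = "image/jpeg";
            context.Response.WriteFile(context.Request.PhysicalPath);
        }
        else
        {
            context.Response.StatusCode = 403; // likely a leech attempt
        }
    }
}
```

This is where the "performance and usage considerations" come in: every image request now runs through managed code, and Referer headers can be absent or spoofed.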

What I like about this solution is that it introduces no real performance hit, and it also lets you implement it only where you need it. For example, I'm not worried about pictures getting leeched; I'm more concerned about the artwork that ends up as backgrounds on a lot of web sites around the net. (When LoneWolf Design was up, the site averaged 75 GB/month of bandwidth for years, 40% of which was leeched.)

Because only a few images on my site will use this antileeching capability, I've decided to change my key every time the application starts. My Application_Start looks like:

protected void Application_Start(Object sender, EventArgs e) {
    try
    {
        string PurgeFolder = Hitney.Globals.ResizeCacheDirAntiLeechFolder;
        Hitney.Imaging.ImageResizer.PurgeCache(PurgeFolder);
    }
    catch
    {
#if DEBUG
        throw;
#endif
    }
    Application["AntiLeechKey"] = DateTime.Now.ToString("MMddyyyyHH");
}


Because the rest of my site uses static fields in a Global class, I assign this to a static in my Globals class:

public static string AntiLeechKey = (string)System.Web.HttpContext.Current.Application["AntiLeechKey"];


The first call in Application_Start deletes all images currently in the Art Gallery cache; the second statement generates a new key. New images take about 100ms each to generate (completely invisible to the end user). If antileeching were site-wide and there were hundreds or thousands of thumbnails or resized versions to regenerate, this wouldn't perform well enough, and the key would need to be implemented differently. Keep in mind this method runs each time the application restarts, so the key may change too randomly (or too frequently, or not frequently enough) for your needs.

Another option, loosely related to antileeching, is the ShadowDirectories flag. If true (the default), the output cache folder mimics the file structure of your site as it generates new images. For example, an image "/images/pictures/family.jpg" would end up in "[cachefolder]/images/pictures/family.jpg/[guid.jpg]". If you're trying to prevent leeching, set this to false, as that completely hides the original image source: images end up in the root cache folder only (such as "[cachefolder]/[guid.jpg]"). Optionally -- and this is what I've done -- you can set up a folder in the [cachefolder] for these "volatile" images. It just makes content management easier.
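The path mapping above can be sketched as follows. CachePath and its parameters are illustrative, not the actual class code:

```csharp
using System.IO;

static string CachePath(string cacheFolder, string relativeImagePath,
                        string hashedName, bool shadowDirectories)
{
    if (shadowDirectories)
    {
        // "/images/pictures/family.jpg" ->
        //   "[cachefolder]/images/pictures/family.jpg/[guid.jpg]"
        string mirrored = relativeImagePath.TrimStart('/')
            .Replace('/', Path.DirectorySeparatorChar);
        return Path.Combine(Path.Combine(cacheFolder, mirrored), hashedName);
    }

    // Flat layout: "[cachefolder]/[guid.jpg]" -- reveals nothing about
    // where the original image lives.
    return Path.Combine(cacheFolder, hashedName);
}
```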

So with that, the ImageResizer class is complete. The next step is to whack the server control into shape, which I hope to have done soon!