Monthly Archives: May 2010

My current CI setup

I was asked by email what my current CI setup is, and did I have a blog post about it.  Um, actually, no.  Oops.  So, here it is.  As always, it’s a work in progress, and there are lots of unfinished rough edges.  It’s also got some phenomenally cool stuff too.  Thus, without further ado, my current setup for Continuous Integration:

CruiseControl.NET watches for SVN commits; each commit triggers an NAnt build script, which runs:

– MSBuild on one or more solutions

– aspnet_compiler.exe on all the websites (to validate the code in the markup)

– Use YUI Compressor to compress CSS files and compress and combine JavaScript files (yes, the irony isn’t lost here)

– NUnit tests including:

       – Fire up Cassini on each site and ensure a carefully selected page doesn’t blow chunks.  (e.g. no configuration or initialization errors on each site.)

       – Database / code integrity checks, like “do the enum values match the lookup table content?”  (I realize they’re mostly integration tests, and cheesy at that, but it’s a far cry better than the previous state of zero tests and “hope it works out” deployment.)

– Deploy content to test server(s), calling iisreset and stopping / restarting services as necessary.

– Label the CI build via svnrevisionlabeler (so the build number in CCTray matches the SVN version number).

– Email out to those who want the spam how the build did.  (Personally I prefer data pull mechanisms like CCTray.)
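For the curious, the NAnt side of those steps looks roughly like this.  This is a sketch only — the solution names, paths, and target layout are all made up for illustration, and the msbuild task comes from NAntContrib:

```xml
<?xml version="1.0"?>
<project name="ci" default="build">
  <target name="build" depends="compile, precompile-sites, minify, test" />

  <target name="compile">
    <!-- msbuild task ships with NAntContrib -->
    <msbuild project="src\MySolution.sln">
      <property name="Configuration" value="Release" />
    </msbuild>
  </target>

  <target name="precompile-sites">
    <!-- Validate the code in the markup; framework path is illustrative -->
    <exec program="C:\Windows\Microsoft.NET\Framework\v2.0.50727\aspnet_compiler.exe">
      <arg value="-p" />
      <arg path="src\MyWebSite" />
      <arg value="-v" />
      <arg value="/" />
    </exec>
  </target>

  <target name="minify">
    <!-- YUI Compressor is a Java jar -->
    <exec program="java">
      <arg line='-jar tools\yuicompressor.jar --type css -o css\site.min.css css\site.css' />
    </exec>
  </target>

  <target name="test">
    <nunit2>
      <formatter type="Xml" usefile="true" extension=".xml" outputdir="build\test-results" />
      <test assemblyname="build\MyProject.Tests.dll" />
    </nunit2>
  </target>
</project>
```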

There’s also an SVN commit trigger that generates a commit email and sends it out.
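The CruiseControl.NET side of the wiring looks roughly like this — again a sketch, with placeholder names, URLs, and paths, and the svnRevisionLabeller coming from the plugin mentioned above (check the CCNet docs for the exact publisher schema your version wants):

```xml
<cruisecontrol>
  <project name="MyProject">
    <!-- Watch SVN; any commit kicks off a build -->
    <sourcecontrol type="svn">
      <trunkUrl>http://svn.example.com/trunk</trunkUrl>
      <workingDirectory>C:\ci\MyProject</workingDirectory>
    </sourcecontrol>
    <triggers>
      <intervalTrigger seconds="60" />
    </triggers>
    <!-- Make the CCTray build number match the SVN revision -->
    <labeller type="svnRevisionLabeller">
      <url>http://svn.example.com/trunk</url>
    </labeller>
    <tasks>
      <!-- Hand everything off to the NAnt script -->
      <nant>
        <executable>C:\tools\nant\bin\nant.exe</executable>
        <baseDirectory>C:\ci\MyProject</baseDirectory>
        <buildFile>ci.build</buildFile>
      </nant>
    </tasks>
    <publishers>
      <xmllogger />
      <!-- Spam the folks who asked for it -->
      <email from="ci@example.com" mailhost="smtp.example.com" includeDetails="true">
        <users>
          <user name="dev" group="buildwatchers" address="dev@example.com" />
        </users>
        <groups>
          <group name="buildwatchers" notification="always" />
        </groups>
      </email>
    </publishers>
  </project>
</cruisecontrol>
```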

That’s what I’ve got now.

What I want to add to this (given another 257 hours in the day):

– JSLint validation of .js files and hopefully script tags in html

– CSS validation of .css files and hopefully style tags in html

– HTML validation to match the page’s doctype of .aspx and .html pages

– SEO evaluation of .aspx and .html pages by crawling the site

– Database migration via Tarantino or RedGate’s Sql Compare Pro & Sql Data Compare Pro

Once I’ve got these in place, I’ll be confident that the code functions and is of decent quality before I deploy it to the test servers.  Granted, I haven’t validated that it functions correctly, only that it functions completely.  The next step will be to look to Selenium Grid to validate JS works cross-browser and that various pages function as expected.  I hope by then I can also kick-start the idea that writing unit tests to validate the code functions as expected is also a good idea.

Add a bit of duct tape, a sprinkle of insanity, and that’s my CI setup.  Cheers.


.IsNullOrEmpty() for List and Dictionary

string.IsNullOrEmpty() in C# for strings is awesome.  I pass in a string, it tells me if it was null or blank.  Pre-trim it with something like this:

    string.IsNullOrEmpty( ( var ?? "" ).Trim() )

and I know if I’m toast or not before the null reference exception or blank screen.

Well, what if I have a List<T>?  Or a Dictionary<T,U>?  Here’s extension methods I wrote for checking blank-ness:

    public static bool IsNullOrEmpty<T>( this IList<T> List ) {
        return ( List == null || List.Count < 1 );
    }

    public static bool IsNullOrEmpty<T,U>( this IDictionary<T,U> Dictionary ) {
        return ( Dictionary == null || Dictionary.Count < 1 );
    }

The added benefit of this is I can say:

    myList.IsNullOrEmpty()
which is usually more like the thought I had when I started writing the code.  So I also add this method:

    public static bool IsNullOrEmpty( this string String ) {
        return string.IsNullOrEmpty( ( String ?? "" ).Trim() );
    }

so I can call it like this:

    myString.IsNullOrEmpty()
but since I’m coalescing to empty string before trimming, I can just as easily say:

    public static bool IsNullOrEmpty( this string String ) {
        return ( ( String ?? "" ).Trim() == "" );
    }

And for good measure, here’s a similar JavaScript function I wrote to check for blank-ness:

    function isNullOrEmpty(val) {
        var empty = true,
            name = null;

        if ( typeof(val) === 'undefined' || val === null ) {
            return true; // It's null or undefined
        }
        if ( typeof(val) === 'string' ) {
            return ( val === '' ); // It's a string that may or may not be blank
        }
        if ( typeof(val) === 'object' ) {
            if ( val.constructor === Array && val.length === 0 ) {
                return true; // It's an empty array
            }
            for ( name in val ) {
                if ( val.hasOwnProperty( name ) ) {
                    empty = false;
                    break;
                }
            }
            return empty; // It's an object that has or doesn't have data in it
        }
        // It's not null or empty
        return false;
    }

And that, as we say, is null … or empty.  :D

Validating web content in CI

If I had another 257 hours in the day, I’d love to build the ultimate web content validator into the continuous integration process I now have.  After a successful build, I’d start by kicking off a WebDev.WebServer instance of the site, then fire the SEO Toolkit by wrapping it into a .NET library.  Then I’d extend it with custom tasks run on each page download: validating the HTML and CSS via W3C, validating the JavaScript via JSLint, and for HTML content, regexing out script and style tag content, padding the top of a temp file with whitespace to keep the line numbers right, then validating that as CSS and JavaScript as well.  (I’d rather find an offline way to do HTML validation that doesn’t involve Cygwin, as I have enough emulation going on here.)

Perhaps I’d wrap it all in an NAnt task, or just an NUnit test suite that either reflected through the solution for web.configs or took in a list of projects via TestCase or the project’s AppSettings.

(I’d like to be able to authenticate certain requests too, so I can validate the user profile content.  WebCrawler.Settings exposes a Credentials property, though I’ve had more success setting an Authorization header than using HttpWebRequest.Credentials.  Neither gets me through the forms authentication cookie though, as UrlDownloader.WebRequestCreate() has no settings.Cookies.  I’d love a per-url dictionary of “use credentials or don’t”, though I realize that’s totally overkill for the stock use of the SEO Toolkit, and more than likely I can just rescan with a StartUrl inside the profile and an ExternalLinkCriteria of SameFolderAndDeeper.  If all else fails, I’d Reflector out the WebCrawler, or inject an override into UrlDownloader.OnGetContent() in the same dll.)

More than likely, the report from each of the validators is too big for the build log, so I’d save off each report, named by download url and module, and build an index page dynamically to navigate through them all.
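The script-extraction trick is simple enough to sketch.  Pull each script block out of the page and pad it with leading newlines so the validator’s reported line numbers still match the original source — the function name and regex here are illustrative, not from any real library:

```javascript
// Illustrative sketch: extract <script> bodies from HTML, padding each
// with leading newlines so reported line numbers match the source page.
// (Naive regex; assumes no '>' inside the opening tag's attributes.)
function extractScriptBlocks(html) {
    var blocks = [],
        re = /<script\b[^>]*>([\s\S]*?)<\/script>/gi,
        match, contentStart, lines;

    while ( (match = re.exec(html)) !== null ) {
        // The script body starts just after the opening tag's '>'
        contentStart = match.index + match[0].indexOf('>') + 1;
        // How many lines precede the body in the original document?
        lines = html.slice(0, contentStart).split('\n').length - 1;
        // Pad with that many newlines, then append the body
        blocks.push(new Array(lines + 1).join('\n') + match[1]);
    }
    return blocks;
}
```

Feed each padded block through JSLint and its complaints point at the right lines in the original page; the same trick works for style tags and the CSS validator.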
After the report was run, I’d RoboCopy the report tree to a deployment url or find a way to include it into CC.NET’s document list.  Wrap all that up in a bow, and we’ve got the uber-web validator CI engine.  Now where’d I put that other 257 hours in the day?  And will condensing a few weeks’ worth of random thoughts and links this tightly get me into the Google Dungeon?