Andre's Blog

Personal blog of Andre Perusse

Bitten By Windows Genuine Advantage

Last weekend I was home doing some paperwork and I had to send a fax to my insurance agent. I hadn't sent a fax in several months and not since I had reimaged my computer with Vista and Office 2007. So I opened Word and looked in the templates for a fax cover page. Word ships with several templates as standard, but it also includes a link to an Internet-based gallery of templates. I selected one of the Internet-based templates and was informed that this feature required that Microsoft verify that my Office installation was "genuine." Sure, whatever, I thought to myself and I hit the "Continue" button. To my shock, Office reported that the validation had failed! My Office installation wasn't genuine!

That's odd, I mumbled. There's a link that takes you to a more detailed explanation of what is wrong with the validation. I clicked on it and the resulting web page told me my Office installation hadn't been activated yet. It said all I had to do to fix it was run any Office app and it would automatically start the activation process. I shook my head and cursed at Microsoft for inflicting this garbage on me. I double-checked Word and sure enough, it claimed that it was already activated. So I concluded that Microsoft's "genuine" validation routine was on crack. I get all my Microsoft software directly from MSDN, so there's no way my Office installation was phony.

I futzed around with the web site a bit more trying to figure out how I could convince the Great Genuine Validation Gods that I didn't steal this copy of Office. No go. I called the tech support number and was quickly transferred to another call center because I got my copy from MSDN. This new call center told me they couldn't help me on the weekend unless my "business" was experiencing a Severity One emergency. Call back Monday, he said. I was irked as hell about this but there was little I could do. So I hauled out my laptop which had Office installed from the exact same CD. It worked fine, I got my stupid template, and I sent my fax.

Due to my employer's current MSDN configuration, it was difficult for me to log a support incident with Microsoft about this so I didn't bother looking at it again until this evening. I suspected that something went awry with my Office install on Vista so I set about trying to "repair" it from the CD. Nope, no good. I was getting ready to uninstall and reinstall as a last resort when I remembered that I had installed Microsoft Project 2007 at the same time I had installed Office Professional, but I had never run Project at all. Hmmm, I thought to myself, I'll bet the Genuine Gods are pissed off because I've never run Project, so they're not going to let me into their special Internet club! So I ran Project (which installed using a different product key than Office and thus needs its own activation), it immediately wanted to be activated, I activated it and promptly closed it. I went back into Word, tried to open an Internet-based template, went through the "Genuine" validation voodoo, and it worked!

So, my question to Microsoft is: Why the hell did I have to activate Microsoft Project so that I could download a Word template? In what twisted demon dimension does this bent logic make any sense? Good grief. (But I still think the Ribbon interface is way cool.)

Vista Revisited

As you may recall, several months ago I installed the so-called "Release Candidate 1" version of Microsoft's latest consumer operating system, Vista. You may also recall that it was an absolutely horrid experience and I re-installed Windows XP in a matter of days. Well, Vista was officially released recently, so I decided to give it another try.

During my RC1 trial, there were a few deal-breakers that made me install Win XP again. One of these was the inability to control the jet-engine fan speed on my ATI X800XL video card. With XP, I was using Ray Adams' ATI Tray Tools program, which allows for the control of fan speed (and thus the resulting noise) in software. Ray has pretty much declared that he won't be porting ATI Tray Tools to Vista anytime soon, so I had to find another solution. There are no other software solutions, so I tried the hardware route, which consists of replacing the stock cooling fan with a third-party assembly. Much recommended on the Internet was Arctic Cooling's ATI Silencer Rev 5; however, it has been discontinued and is no longer available. My local computer store had a Zalman VF-700 in stock, which reportedly also worked well with my card. After about 40 minutes of work, I had the replacement cooling fan installed, and in the 5-volt mode it is virtually silent. That's one down.

Next, I was distressed that RC1 didn't support my Creative SoundBlaster Audigy 2 sound card, which is connected via coax digital cable to my Logitech Z-5500 speaker system. Creative now has revised "beta" Vista drivers, so I decided to give it a go. As it turns out, the beta drivers work mostly fine and the coax digital out on the sound card works great. That's two down.

Last on my list of deal-breakers with Vista RC1 was the fact that my brand-new (then) Logitech QuickCam Fusion wouldn't work. Logitech released new drivers a day before Vista's official release and the QuickCam now works fine, too. That's three down and none left. So I now have a clean-install of Vista on my home PC, completely replacing XP.

The upgrade wasn't completely without aggravation, however. Vista still refused to recognize my motherboard's integrated Intel RAID array out-of-the-box. I was lucky in that the USB thumb drive I had used to install the driver during my RC1 fiasco had remained untouched since then, so the RAID driver was still on it. The driver loaded without any fuss and Vista installed fine afterward. Next was getting my usual set of software installed, which consists of such things as Microsoft Office and Visual Studio 2005. These are on my hard drive as ISO images downloaded from MSDN. My usual ISO mounting software (Nero ImageDrive) doesn't yet work on Vista, so I had to try something else. MagicISO worked once, but gave me error messages on every restart after that. Virtual CloneDrive seems to work at first, but throws read errors during an install. For Visual Studio 2005, I had to burn the image to a physical DVD in order to get it installed.

I have a couple of last complaints with my Vista experience so far. As I mentioned above, I have an ATI X800XL video card and it has the ability to capture video from a composite or s-video source. Well, it HAD that ability before I upgraded to Vista. ATI's official Vista drivers have dropped support for the video-in feature of this card, and there doesn't appear to be any commitment to restore it in the near future. And, while I was going through the Windows Media Center set-up procedure, my computer blue-screened during my second 5.1 audio sound test.

Despite these issues, Vista remains on my machine and I have no intention of putting XP back on. Most things work fine, and the supplied Microsoft printer drivers were even able to see and print to my networked Brother MFC-8840DN all-in-one unit (though I can't scan or fax from it yet). I'm confident that driver updates from vendors will fix most of my problems over the next couple of months. In the meantime, Vista is a pleasant enough upgrade from XP, if only in the paint job and visual appearance department. As a colleague of mine (who also owns a Mac) said, "It's like using a Mac, except it has applications."

Plextor DVD Writer - R.I.P.

A couple of years ago I picked up my first DVD writer - a Plextor PX-712A. I was kinda proud of it because Plextor is a premium brand and most do-it-yourselfers in the build-your-own-PC world tend to use more mainstream and budget "OEM"-type brands such as LG, LiteOn, or Pioneer. Plextor markets themselves as "The Leader in Reliable CD, DVD, and Digital Video Solutions" so I felt pretty smug that I had paid a little extra and bought quality gear. It wasn't dual-layer, but back then dual-layer writers were just hitting the market anyway and even today dual-layer media is still prohibitively expensive.

I really enjoyed using the Plextor. The tray mechanism has a very nice glide motion where the tray slows down just before it reaches the full open or close state and gives the drive a sort-of "luxury" feel. The unit is heavy and full-length, and the status light has different modes for read vs. write vs. power on. Plextor's web site had several firmware updates, all of which were easily applied. Perhaps the feature I liked most was that it was extremely quiet. Overall, it was just a nice piece of kit to own - the kind of kit that gives you that nice pride-of-ownership feeling.

Unfortunately, this feeling was not meant to last. Over the last several months, things just haven't been working quite right. I don't use the drive all that much, so I never really connected the dots until now, but the unit is no longer reliable. When reading DVDs it almost always quits about half-way through with various read errors. Conversely, it writes disks without any errors, but the resulting disks don't play well in other DVD drives. My ancient Pioneer DVD-ROM drive works much better at reading disks, a fact that pisses me off to no small extent. Here I was being Mr. Smarty-Pants by getting a high-end drive that I was sure would last until the end of the earth. Instead, it gets beat by my bought-it-without-a-thought DVD-ROM unit that must be 4 years old by now.

What really irks me is that this is the second time "high-end" gear has failed me. Before this I had a nice Ricoh CD-writer I bought in 1999 that, again, I was sure would last forever. It also refused to read and write data after a couple of years of extremely light use. The difference with the Ricoh was that it came with a good warranty and I was able to get it repaired for free. The Plextor only had a 1-year warranty that is long expired.

Despite this experience, I was tempted to pick up a new Plextor to replace my busted drive. Their gear just feels so nice. However, a couple of factors changed my mind. First, my motherboard is getting a little long in the tooth and only supports two SATA connectors which are already in use. If I'm plunking down more than $100 for a DVD drive, I ain't buying no old-school PATA garbage. Not that PATA drives aren't as good, I just really like the smaller SATA cables and their tidy appearance. Secondly, the local computer store (which is only two minutes from where I work) doesn't stock Plextor drives, but they had plenty of $41 LG drives. And $41 isn't really an investment by any means - it's more of an impulse buy. So, I picked up an 18x LG drive and you know what? It works great, and it's not even that noisy. And I'll bet it'll last forever. Still, I'm going to miss that luxury tray motion.

Weather RADAR for Canada - a Yahoo Widget

I seem to be obsessed with the weather lately. So are a lot of other people judging from the amount of weather-related widgets and "parts" for public portals. One of my favourites was a Yahoo widget that showed the precipitation RADAR imagery from Environment Canada's web site. Unfortunately, the widget stopped working last year when Environment Canada changed how the web site worked.

So, I took a look at the Yahoo Widget SDK and it seemed like an easy-enough technology to develop with. The main widget file is an XML document that allows you to lay out window areas with text blocks and images. It also allows for the description of timer events and preference settings. The programmatic aspect of the widget is controlled with simple JavaScript, though as I learned the hard way, it's not really the same as using JavaScript in a web page. For example, there is no Image() object in a Yahoo widget. The Image() object is not an intrinsic JavaScript construct - it is actually part of the browser's HTML document object model. So don't try pre-loading any web graphics the good old-fashioned way.
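For context, the skeleton of a widget file looks roughly like this. This is a sketch from memory of the Konfabulator/Yahoo Widgets XML format, not a verbatim template, and fetchLatestRadarUrl() is a hypothetical helper:

```xml
<widget>
  <window title="Weather RADAR">
    <name>mainWindow</name>
    <width>480</width>
    <height>480</height>
  </window>

  <image src="Resources/radar.png">
    <name>radarImage</name>
  </image>

  <!-- fire every 10 minutes to fetch a fresh RADAR image -->
  <timer>
    <name>refreshTimer</name>
    <interval>600</interval>
    <ticking>true</ticking>
    <onTimerFired>
      // plain JavaScript, but no browser DOM here:
      // new Image() and document.* are not available
      radarImage.src = fetchLatestRadarUrl();
    </onTimerFired>
  </timer>
</widget>
```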

Still, once you get the hang of it, developing a widget is pretty straightforward. I'm not really a fan of the absolute positioning required for text blocks, but the technology is simple to understand. I was able to develop most of the Weather RADAR widget in a weekend. Not that this widget is particularly complex, but it's not totally simple either.

Some of my favourite features of this widget are:
  • It automatically refreshes with the latest RADAR image every 10 minutes. This is how often Environment Canada updates the image on their web site.
  • Similar to the actual web site, the user can select which "overlay" images to display on top of the RADAR image. The overlays consist of markers for cities, roads, rivers and more.
  • Double-clicking on the RADAR image will toggle the animation of the image.
  • The image can be resized from 25% to 175% of the original image size.
If you're interested, you can download this widget from my Duotronic site:

Amiga OS is Still Alive - Who Knew?

I was rummaging through Digg this morning and I nearly fell out of my chair when I saw a headline on the front page that Amiga OS 4 had just been released. Ars Technica even posted a 6-page review of the new release. I haven't used an Amiga since 1997, but it appears this release has done a decent job updating the OS. I lost touch with the Amiga world after '97, but this review points out that a company released new Amiga hardware a few years ago based on the PowerPC platform. Unfortunately, that didn't last and you can no longer buy new hardware to run this OS on. This kinda makes me chuckle because it is SOOOO reminiscent of how the Amiga world worked.

So now I'm taking a few moments to stroll down memory lane. Ahh, the good ol' Amiga - I tell ya, that was some machine in its day. I owned an A500, an A3000 (my personal favourite computer of all time), an A3000T, and an A4000 that got "frankensteined" into a ludicrously huge enclosure to run a Video Toaster and Video Flyer system (now THOSE were really fun days - until the money ran out). Absolutely wicked graphics and sound capabilities for the late '80s and early '90s. I remember all the "demo" disks I used to download off of BBS systems (the Internet was barely getting off the ground back then) that were just so totally amazing, with crazy colourful motion graphics and stereo sampled sound. Completely useless, yes, but awesome eye-candy that made the jaws of my PC friends hit the ground on a regular basis. Some of my favourite games still come from that platform: Silk Worm, Speedball, and Arkanoid. In fact, my professional career in computers started with the Amiga (and almost died with it, too) before I was forced to switch to PCs.

Computers really haven't been as much fun for me since those good ol' days. <sigh>  Thanks for all the memories, Amiga.

ASP.NET Sites and SharePoint Servers - A Match Made in Hell

If you've installed SharePoint Services (2003 or 2007) on your web server, you will likely find that many of your ASP.NET sites no longer work properly, or at all. Even if the ASP.NET site is not in the "managed path" of the SharePoint server, it still causes extreme grief. I had hoped that this would be "fixed" in the 2007 release of SharePoint, but apparently it is not.

To get around the problem, you can start with this Knowledgebase Article:

How to enable an ASP.Net application to run on a SharePoint virtual server

which basically tells you to reset the <trust> level to "Full" and re-add all the standard ASP.NET httpModules and httpHandlers. Nice - just what I always wanted to do to get my damn sites working again. Notably missing from this article, however, is the new ASP.NET 2.0 "Profile" feature, which also uses an HttpModule to automatically link a profile with the currently authenticated user. If you're using the Profile feature on your site, be sure to also add this line to your <httpModules> section:

<add name="Profile" type="System.Web.Profile.ProfileModule" />

Hopefully this little tidbit of information will save you the half-a-day of grief I just spent tracking down an "object reference not set to an instance of an object" error when trying to access the Profile object. I knew it was likely a SharePoint issue, but I danced all around the web.config before I was able to figure it out.
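Put together, the relevant portion of web.config ends up looking something like this. The module list here is abbreviated and the exact set you need to re-add depends on which ASP.NET 2.0 features your site uses, so treat it as a sketch:

```xml
<configuration>
  <system.web>
    <trust level="Full" originUrl="" />
    <httpModules>
      <!-- standard modules that SharePoint's config removes -->
      <add name="OutputCache" type="System.Web.Caching.OutputCacheModule" />
      <add name="Session" type="System.Web.SessionState.SessionStateModule" />
      <add name="FormsAuthentication" type="System.Web.Security.FormsAuthenticationModule" />
      <!-- the one the KB article forgets -->
      <add name="Profile" type="System.Web.Profile.ProfileModule" />
    </httpModules>
  </system.web>
</configuration>
```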

Nikon D40 - Photographic Bliss!

Having agonized over the decision of buying a new digital camera for months, I finally picked up a new Nikon D40 DSLR using the plethora of FutureShop gift cards my family got me for Christmas. It's an awesome piece of gear! Let me tell you about my journey to get here.

I bought my first digicam about 4 years ago and it's been a decent camera. It's a 3-megapixel Toshiba PDR-M71 with a 2.8x optical zoom and many manual controls such as shutter speed and aperture. It takes fairly good pictures and was well reviewed in its day. (In fact, I wrote a review on FutureShop's web site and won a FutureShop gift card for my efforts!) However, my biggest complaint with the camera is that it is slow. Slow to power-up, slow to recycle, just plain slow. And of course, being a consumer-level digicam, most of the advanced settings are buried in menus, making it difficult to experiment. Another feature that I yearned for was a better optical zoom - I like to take the odd nature picture and it's difficult to get close up to the animals of nature, you know?

So, for the past several months I've been looking at the "super-zoom" class of cameras, namely the Canon S3 IS and the Sony DSC-H5. They are significantly faster than my old Toshiba, take excellent pictures, and have 12x optical zoom to boot. They also come with the added bonus of "movie mode" allowing you to take pretty good quality videos (my Toshiba has a movie mode, too, but it's only at 320 x 200). However, I've also secretly lusted after the holy grail of digital cameras - the digital SLR. These are the domain of the true photographer and for good reason. DSLRs come with interchangeable lenses, super-fast speed, and more manual controls than you can shake a stick at. But they are damn expensive, and they can lead you down the path of an extremely expensive hobby - some lenses cost well over $1000!

This week I finally made up my mind. I was on my way to FutureShop to pick up a Sony H5. It's a very nice camera and it was in my price range. So I get to the camera counter and I start fiddling around with the H5 (something I have done frequently over the past little while) and it's just kinda missing something, you know? It's kinda small and the viewfinder is actually a tiny LCD screen where it's almost impossible to tell if the frame is in focus. I had gotten spoiled earlier that day because I spent some time at Carsand Mosher trying out a D40 (and a Canon S3). The viewfinder on an SLR is a totally different experience than the standard point-and-shoot digicams. So, I was less than thrilled with the shooting experience with the H5. As luck would have it, though, FutureShop had one Nikon D40 left - an open-box item with a nice discount to boot! There is no way I can turn down a bargain (must be the Scottish blood in me) so I bought it instead.

I've been taking pictures constantly since I bought it. This thing is so FAST that it truly makes it a joy to use. There is no "boot" time, there is no shutter lag, and the recycle time is so fast you hardly notice it! The camera just feels "right" when I'm holding it and it's so easy to use. I get carried away with it and I just keep clicking the shutter button - it doesn't really matter where the camera is pointed. I can't stop myself - it's addictive.

The camera is not perfect, however. I wish I could have bought a lens with more zoom capability - the standard lens that comes with the kit is roughly equivalent to a 3x optical zoom on a point-and-shoot. Also, this being Nikon's entry-level DSLR, it doesn't have exposure bracketing which I like to use as "insurance" when taking pictures (my Toshiba has this feature, oddly enough). For this feature I would either have to spend an extra $300 on Canon's Digital Rebel, or an extra $700 on Nikon's next model up, the D80. I really like the Nikon cameras and an extra $700 just wasn't in my price range.

Still, the D40 is a wicked piece of kit for the money. If you're looking at upgrading your current camera, you should really take a good look at the D40. I'm sure that it will totally change your photographing experience like it has done for me.


Showstopper! - A Book Review

I just finished reading a rather interesting book called Showstopper! written by G. Pascal Zachary in 1994. The book chronicles the development of Windows NT from its inception in the fall of 1988 until its first release in the summer of 1993. It is exceptionally well written, focusing more on the people involved and less on the actual technology being created. In fact, you don't really have to be a technology enthusiast to enjoy the book. Every page is filled with the heartaches and triumphs of the hundreds of programmers, testers, builders, and program managers involved in the creation of this brand new operating system.

People and personalities play a very strong role in this book. The book's primary focus is on the uber-architect and team lead of NT, David Cutler, who on page 2 already has a broken finger and a cracked toe from taking out his frustrations on the walls of Building Two at the Microsoft campus in Redmond. Throughout the book he is portrayed as a dark, brooding drill sergeant who is feared by many and regarded as a hero by others. Cutler came to Microsoft from Digital Equipment Corporation where he and his team developed the VMS operating system, released in 1977, that ran on DEC's VAX computers. At Microsoft, Cutler rules his team with an iron fist and he has a unique vision for this new operating system: it will run on multiple hardware platforms. Up until this time, an operating system was strongly tied to the hardware it ran on. Even UNIX had so many flavours that any software written for it had to be built with a specific version in mind. Though NT started out with a promise of being able to run OS/2 programs (OS/2 was a joint operating system venture between Microsoft and IBM), the commercial failure of OS/2 had the team change gears so that NT would run older DOS and Windows programs instead. The OS/2 debacle showed that customers weren't willing to leave their old programs behind (OS/2 couldn't run programs written for DOS or Windows).

The original schedule for NT called for it to be released in March of 1991. However, many things conspired to push this ship date back by more than two years. The initial CPU targeted for the new OS was Intel's doomed i860 RISC chip, which really didn't exist in any commercial form - the NT team had to assemble their own hardware to fashion an i860 computer on which to run the earliest versions of the OS. The team later had to switch gears to Intel's mainstream x86 series and the new MIPS chips. OS/2 compatibility was dropped in favour of DOS/Windows compatibility. The new file system (NTFS) was delayed by the need to retain compatibility with older file systems. The size of the team mushroomed over the life of the project, and there were several turf wars between various teams, divisions, and program managers.

Over the life of the project, many personal relationships were destroyed as team members spent countless hours at the Microsoft campus. Spouses and children were relegated to second place in favour of the new operating system that would define Microsoft's future. Stock options made millionaires out of many on the team, though some left giving up hundreds of thousands in vested stock just to retain their sanity. Throughout it all Cutler is there watching over everything, making his demands and demanding from his team, cursing and swearing, punching walls, sticking to his guns and making few compromises. At the end of the book I found myself asking, "Why is it that so many great leaders have to be such large assholes?" Honestly, does greatness have to exact such a hefty price?

Whether or not you are a fan of Microsoft or Windows NT is largely irrelevant, as the most interesting aspect of the book is the multitude of personalities and their interactions with each other. And though the book is over 10 years old, it still holds some fascinating insights into the development of large software projects.

Dell Latitude D820 Review

Development in today's world of .NET Framework 2.0 (and now 3.0), Visual Studio 2005 Team System, SQL Server 2005, and other various high-power applications requires sturdy hardware on which to code. Being on the bleeding edge of any technology usually requires the latest and greatest in hardware resources in order to be even moderately productive. After running all of the aforementioned apps on a 3 GHz Pentium 4 desktop with 512 MB of RAM for several months, my employer blessed me with a brand new Dell Latitude D820. And not a moment too soon, either.

I have been using this new laptop for about 3 months now, and it is one of the best laptops I have ever had (previous models I have used include a Compaq M500, two other Dell Latitude models, and an IBM ThinkPad T40). My configuration includes an Intel Core Duo T2600 at 2.16 GHz, 2 GB of RAM (the single most important item, IMHO), a 60 GB 7200 RPM hard drive, a CD-R/DVD drive, and a 1680 x 1050 15.4" widescreen display running on an Intel 945GM adapter. My OS is Windows XP Pro, SP2. Though heavier than a 12" laptop, it's still light enough for everyday travelling while preserving a solid feel and high build quality.

The keyboard is quite nice for a laptop, though I still prefer to use an external keyboard when I can. There are four dedicated feature buttons above the keyboard: two for audio volume, one for mute, and the power button. The keyboard is flanked on either side by speakers. On the left side of the laptop is a slider switch to control the built-in WiFi card. It turns the laptop's WiFi antenna on and off, but it also has a really neat feature where sliding the switch all the way up will illuminate a small green LED if a WiFi signal is detected. This feature works even if the laptop is off, so you can quickly determine if a hotspot is within range without opening the screen and turning on the entire machine. Also on the left side are 1394 (FireWire), audio line-in, and headphone jacks, in addition to the IR port and the PC Card and ExpressCard slots. The back of the machine has the following connections: VGA, 9-pin serial, and 2 USB ports. The right side has an additional 2 USB ports and the CD/DVD drive.

While performing usual development tasks, the machine is extremely responsive, even with three instances of Visual Studio 2005 open, SQL Server Management Studio, several browser windows, and an external 20" display at 1680 x 1050 in addition to the laptop screen in an extended desktop configuration. I attribute this to the dual-core processor and ample RAM. It is as responsive as, or even more so than, my personal hyperthreaded, overclocked P-4 running at 3.5 GHz with 2 GB of RAM and an ATI X800XL video card driving a single 1920 x 1200 display. Hard drive intensive operations still seem quicker on my desktop, though it is running two 200 GB drives in a RAID 0 configuration, so I would expect it to have the edge speed-wise.

Battery life is also quite impressive. Though I haven't benchmarked it properly, I appear to get about 4 hours on a single charge while performing elementary tasks such as web surfing or typing new blog entries. In fact, the only thing that I don't like about this laptop is the display. Cramming a 1680 x 1050 display into a 15.4" screen results in a pretty high dpi, and text that is very clear but microscopically small. This, combined with a viewing angle that dims the display when you're even slightly off-axis, results in a display that is not very pleasant to use. The high resolution is nice, but I generally prefer the much brighter and easier-to-read display on my wife's budget-class eMachines laptop.

The solution to the display issue for everyday use at the office is, of course, adding a docking station and external display. The docking station I use is Dell's XXX unit which provides all the usual ports in addition to a DVI video connector, an S/PDIF digital audio out, and an S-video out. The company-provided external display is a 17" CRT which is sooo 1990's. Not satisfied with such an antique, I bought a Dell 2007WFP 20" widescreen flat panel to use at work. Compared to the built-in laptop display, this unit is an absolute joy to work with.

So, if you are in the market for a new laptop, I heartily recommend the Dell Latitude D820. I would suggest, however, that you opt for the "real" video card option and maybe a larger hard drive. I am so impressed with this unit that I would seriously consider buying one with my own money if I had to buy my own gear.

Team Foundation Build and Web Deployment Projects

Last week I decided it was time to start generating regular builds of the web application my team is developing. We're using Team Foundation Server as our source control system, so it was a natural choice to use Team Foundation Build (TFB) to generate the compiled application. Unfortunately for me, TFB is a version 1 Microsoft product which means, well, there are a few things that don't work quite right or as well as you might hope.

Installing TFB was easy, though I would have thought it would have been automatically installed with the Team Foundation Server (TFS) install. However, your build server might not be your TFS server, so this does make some kind of sense. Configuring a new Build Type was also easy, though this brings up a whole new issue. A Build Type consists of various parameters on the configuration of the compiled code you want. However, there is very little guidance I could find on the rationale for selecting various configuration options. As it turns out, for building a web application, you must be very precise with this configuration or nothing will work. The "Active solution configuration" isn't so important, but the "Active solution platform" must be "Mixed Platforms" or you won't get any satisfaction.
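For reference, the Build Type ultimately lives in a TFSBuild.proj file checked into source control, and the configuration/platform pair shows up there. A trimmed sketch, with the solution path and names as placeholders:

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <SolutionToBuild Include="$(SolutionRoot)\MyWebApp.sln" />
    <!-- "Mixed Platforms" is the part that matters for web solutions -->
    <ConfigurationToBuild Include="Release|Mixed Platforms">
      <FlavorToBuild>Release</FlavorToBuild>
      <PlatformToBuild>Mixed Platforms</PlatformToBuild>
    </ConfigurationToBuild>
  </ItemGroup>
</Project>
```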

Before I continue, there are a few things you must know about the solution I was attempting to build. The solution contains about six projects, and the web site itself has several references to other assemblies. Both of these conditions weigh heavily on how successful you'll be using TFB to build the solution. First, the number of projects, or more precisely, the type of projects included in the solution. The current release of TFB can't deal with either Database Projects (regular ones, not the new stuff in Data Dude) or with Reporting Services Projects. With Database Projects, you'll only get a warning during the build, but with Reporting Services Projects you'll get an error and the build will be marked as "Failed." Now, using the Configuration Manager in Visual Studio, you can indicate which projects are supposed to get built when the solution is built (meaning you can exclude certain projects). This works in Visual Studio, but TFB ignores such settings and tries to build everything in the solution anyway. The only way to fix this is to remove the offending projects from your solution. Very irritating.

Now comes the really fun part. We use a Web Deployment Project (WDP) to compile the web application into a folder that can be simply "xcopy"'d to a deployment folder. On our local developer workstations this works great. On TFB, you're in for a world of hurt. First off, TFB has no idea what a WDP is, so you'll get all kinds of build errors when it sees this and tries to build it. You have to either install the WDP add-in on your build server, or at least copy the relevant files to the proper location on the build server. Once that's done, TFB can perform a build on WDPs, but this will generally fail until you take some extra steps.

First, your web application's external file assembly references have to be set up so that TFB can see them. The best way to do this is to add a "Solution Folder" to your existing solution, and place all assemblies that your web application references in here. Then make sure to re-reference these assemblies in your web project so that it knows where to find the new location. If you do this, then TFB will find all the referenced assemblies because they become part of the solution and are referenced by relative path.
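In a class-library project file, the resulting relative-path reference looks something like the fragment below (the assembly and folder names are made up for illustration); for a web site project, Visual Studio instead records the path in a .dll.refresh file in the bin folder, to the same effect:

```xml
<ItemGroup>
  <Reference Include="MyCompany.BusinessLayer">
    <!-- a relative path keeps the reference resolvable on the build server -->
    <HintPath>..\SolutionAssemblies\MyCompany.BusinessLayer.dll</HintPath>
  </Reference>
</ItemGroup>
```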

Next (and this is the real tricky part), you have to tell all the other projects in your solution upon which the web site depends to copy their compiled assemblies into the web project's bin folder. In Visual Studio, you would do this simply by making a project reference from the web site to another project but the combination of TFB and WDP has no idea how to deal with this. When your WDP is compiled, it will complain that it can't find the appropriate assembly references (errors such as "Type 'MyClass' is not defined") and the build will fail. To get around this, you have to add a post-build event to all projects in the solution upon which the web site is dependent. In my solution, for example, I have two class libraries that the web site depends on - a Business Layer library and a Data Access Library. In both of these projects, I have to specify a post build event command that will copy the compiled assemblies to the bin folder of the web project.

xcopy /Y /S /F "$(TargetPath)" "$(SolutionDir)WebProjectRoot\bin"

(by the way, the quotation marks are very important if your paths have spaces in them, as I learned when xcopy gave me an "exited with code 4" message - very helpful)

Once you have done this and if the planets are aligned just so, your web solution will finally have a successful build in TFB. Of course, the next question is "what do I do once the thing finally builds?" For me, I wanted to be able to navigate to the build server in a web browser so I could see what the latest build looks like (we also use Automaton to perform continuous integration - ironically, Automaton was the easiest component of this entire mess to get set up properly). What I ended up doing was simply creating a virtual directory to the build location for the WDP on the build server. It's always in the same location, so on every build the directory is updated with the latest bits. It works, and now everyone is happy (okay, maybe only I'm happy because it took me four bloody days to get this far and it's nice to finally have a sense of accomplishment).

Here are some references I found helpful in between beating my head on my desk trying to get this all to work:

Web Deployment projects in a Team Build (Martijn Beenes)

Resolving file references in team build (Manish Agarwal)

Team Build, Web Applications, Lemon Juice and Paper Cuts (Rob Conery)

Also, thanks to Luis Fraile and Swaha Miller on MSDN Forums for helping me with this problem.