Andre's Blog

Personal blog of Andre Perusse

Weather RADAR for Canada - a Yahoo Widget

I seem to be obsessed with the weather lately. So are a lot of other people, judging from the number of weather-related widgets and "parts" for public portals. One of my favourites was a Yahoo widget that showed the precipitation RADAR imagery from Environment Canada's web site. Unfortunately, the widget stopped working last year when Environment Canada changed how the web site worked.

So, I took a look at the Yahoo Widget SDK and it seemed like an easy-enough technology to develop with. The main widget file is an XML document that lets you lay out window areas with text blocks and images. It also allows for the description of timer events and preference settings. The programmatic aspect of the widget is controlled with simple JavaScript, though as I learned the hard way, it's not really the same as using JavaScript in a web page. For example, there is no Image() object in a Yahoo widget. The Image() object is not an intrinsic JavaScript construct - it's actually part of the browser's HTML document object model. So don't try pre-loading any web graphics the good old-fashioned way.
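To illustrate, here's the sort of preloading code that works fine in a web page but falls flat in the widget engine (just a sketch - the radar URL shown is made up):

// Fine in a browser, where Image() comes from the HTML DOM:
var frame = new Image();
frame.src = "http://www.weatheroffice.gc.ca/radar/latest.gif"; // hypothetical URL

// In a Yahoo widget there is no DOM, so "new Image()" just blows up.
// Images have to be declared in the widget's XML instead and manipulated
// through the objects the engine creates for them.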

Still, once you get the hang of it, developing a widget is pretty straightforward. I'm not really a fan of the absolute positioning required for text blocks, but the technology is simple to understand. I was able to develop most of the Weather RADAR widget in a weekend. Not that this widget is particularly complex, but it's not totally simple either.

Some of my favourite features of this widget are:
  • It automatically refreshes with the latest RADAR image every 10 minutes. This is how often Environment Canada updates the image on their web site.
  • Similar to the actual web site, the user can select which "overlay" images to display on top of the RADAR image. The overlays consist of markers for cities, roads, rivers and more.
  • Double-clicking on the RADAR image will toggle the animation of the image.
  • The image can be resized from 25% to 175% of the original image size.
If you're interested, you can download this widget from my Duotronic site: http://www.duotronic.com/weatherWidget.

Amiga OS is Still Alive - Who Knew?

I was rummaging through Digg this morning and I nearly fell out of my chair when I saw a headline on the front page that Amiga OS 4 had just been released. Ars Technica even posted a 6-page review of the new release. I haven't used an Amiga since 1997, but it appears this release has done a decent job of updating the OS. I lost touch with the Amiga world after '97, but this review points out that a company released new Amiga hardware a few years ago based on the PowerPC platform. Unfortunately, that didn't last and you can no longer buy new hardware to run this OS on. This kinda makes me chuckle because it is SOOOO reminiscent of how the Amiga world worked.

So now I'm taking a few moments to stroll down memory lane. Ahh, the good ol' Amiga - I tell ya, that was some machine in its day. I owned an A500, an A3000 (my personal favourite computer of all time), an A3000T, and an A4000 that got "frankensteined" into a ludicrously huge enclosure to run a Video Toaster and Video Flyer system (now THOSE were really fun days - until the money ran out). Absolutely wicked graphics and sound capabilities for the late '80s and early '90s. I remember all the "demo" disks I used to download off BBS systems (the Internet was barely getting off the ground back then) that were just so totally amazing, with crazy colourful motion graphics and stereo sampled sound. Completely useless, yes, but awesome eye-candy that made my PC friends' jaws hit the ground on a regular basis. Some of my favourite games still come from that platform: Silkworm, Speedball, and Arkanoid. In fact, my professional career in computers started with the Amiga (and almost died with it, too) before I was forced to switch to PCs.

Computers really haven't been as much fun for me since those good ol' days. <sigh>  Thanks for all the memories, Amiga.

ASP.NET Sites and SharePoint Servers - A Match Made in Hell

If you've installed SharePoint Services (2003 or 2007) on your web server, you will likely find that many of your ASP.NET sites no longer work properly, or at all. Even if the ASP.NET site is not in a "managed path" of the SharePoint server, it still causes extreme grief. I had hoped that this would be "fixed" in the 2007 release of SharePoint, but apparently it is not.

To get around the problem, you can start with this Knowledgebase Article:

How to enable an ASP.Net application to run on a SharePoint virtual server

which basically tells you to reset the <trust> level to "Full" and re-add all the standard ASP.NET httpModules and httpHandlers. Nice - exactly what I always wanted to do just to get my damn sites working again. Notably missing from this article, however, is the new ASP.NET 2.0 "Profile" feature, which also uses an HttpModule to automatically link a profile with the currently authenticated user. If you're using the Profile feature on your site, be sure to also add this line to your <httpModules> section:

<add name="Profile" type="System.Web.Profile.ProfileModule" />
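For context, the relevant corner of web.config ends up looking something like this (just a sketch - these are the standard ASP.NET 2.0 module names as I remember them, so double-check against your machine.config and add back whatever else your site actually uses):

<system.web>
  <trust level="Full" originUrl="" />
  <httpModules>
    <add name="Session" type="System.Web.SessionState.SessionStateModule" />
    <add name="FormsAuthentication" type="System.Web.Security.FormsAuthenticationModule" />
    <add name="RoleManager" type="System.Web.Security.RoleManagerModule" />
    <add name="Profile" type="System.Web.Profile.ProfileModule" />
    <!-- ...plus the rest of the standard modules and the httpHandlers your site needs -->
  </httpModules>
</system.web>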

Hopefully this little tidbit of information will save you the half a day of grief I just spent tracking down an "object reference not set to an instance of an object" error when trying to access the Profile object. I knew it was likely a SharePoint issue, but I danced all around the web.config before I was able to figure it out.

Showstopper!

I just finished reading a rather interesting book called Showstopper! written by G. Pascal Zachary in 1994. The book chronicles the development of Windows NT from its inception in the fall of 1988 until its first release in the summer of 1993. It is exceptionally well written, focusing more on the people involved and less on the actual technology being created. In fact, you don't really have to be a technology enthusiast to enjoy the book. Every page is filled with the heartaches and triumphs of the hundreds of programmers, testers, builders, and program managers involved in the creation of this brand new operating system.


People and personalities play a very strong role in this book. The book's primary focus is the uber-architect and team lead of NT, David Cutler, who by page 2 already has a broken finger and a cracked toe from taking out his frustrations on the walls of Building Two at the Microsoft campus in Redmond. Throughout the book he is portrayed as a dark, brooding drill sergeant who is feared by many and regarded as a hero by others. Cutler came to Microsoft from Digital Equipment Corporation, where he and his team developed the VMS operating system, released in 1977, that ran on DEC's VAX computers. At Microsoft, Cutler rules his team with an iron fist, and he has a unique vision for the new operating system: it will run on multiple hardware platforms. Up until this time, an operating system was strongly tied to the hardware it ran on. Even UNIX had so many flavours that any software written for it had to be built with a specific version in mind. Though NT started out with a promise of being able to run OS/2 programs (OS/2 was a joint operating system venture between Microsoft and IBM), the commercial failure of OS/2 led the team to change gears so that NT would run older DOS and Windows programs instead. The OS/2 debacle showed that customers weren't willing to leave their old programs behind (OS/2 couldn't run programs written for DOS or Windows).


The original schedule for NT called for it to be released in March of 1991. However, many things conspired to push this ship date back by more than two years. The initial CPU targeted for the new OS was Intel's doomed i860 RISC chip, which really didn't exist in any commercial form - the NT team had to assemble their own hardware to fashion an i860 computer on which to run the earliest versions of the OS. The team later had to switch gears to Intel's mainstream x86 series and the new MIPS chips. OS/2 compatibility was dropped in favour of DOS/Windows compatibility. The new file system (NTFS) was delayed by the need to retain compatibility with older file systems. The size of the team mushroomed over the life of the project, and there were several turf wars between various teams, divisions, and program managers.


Over the life of the project, many personal relationships were destroyed as team members spent countless hours at the Microsoft campus. Spouses and children were relegated to second place in favour of the new operating system that would define Microsoft's future. Stock options made millionaires out of many on the team, though some left, giving up hundreds of thousands of dollars in vested stock just to retain their sanity. Throughout it all Cutler is there watching over everything, making demands of his team, cursing and swearing, punching walls, sticking to his guns and making few compromises. At the end of the book I found myself asking, "Why is it that so many great leaders have to be such large assholes?" Honestly, does greatness have to exact such a hefty price?


Whether or not you're a fan of Microsoft or Windows NT is largely irrelevant, as the most interesting aspect of the book is the multitude of personalities and their interactions with each other. And though the book is over 10 years old, it still holds some fascinating insights into the development of large software projects.

Team Foundation Build and Web Deployment Projects

Last week I decided it was time to start generating regular builds of the web application my team is developing. We're using Team Foundation Server as our source control system, so it was a natural choice to use Team Foundation Build (TFB) to generate the compiled application. Unfortunately for me, TFB is a version 1 Microsoft product which means, well, there are a few things that don't work quite right or as well as you might hope.

Installing TFB was easy, though I would have expected it to be installed automatically along with Team Foundation Server (TFS). However, your build server might not be your TFS server, so this does make some kind of sense. Configuring a new Build Type was also easy, though this brings up a whole new issue. A Build Type consists of various parameters describing the configuration of the compiled code you want, but there is very little guidance that I could find on the rationale for selecting the various options. As it turns out, for building a web application, you must be very precise with this configuration or nothing will work. The "Active solution configuration" isn't so important, but the "Active solution platform" must be "Mixed Platforms" or you won't get any satisfaction.
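If you'd rather edit the TFSBuild.proj file that the Build Type wizard generates instead of re-running the wizard, the relevant bits look something like this (the solution name is made up and I'm going from memory on the layout, so treat it as a rough sketch):

<ItemGroup>
  <SolutionToBuild Include="$(SolutionRoot)\MyWebApp\MyWebApp.sln" />
</ItemGroup>
<ItemGroup>
  <ConfigurationToBuild Include="Release|Mixed Platforms">
    <FlavorToBuild>Release</FlavorToBuild>
    <PlatformToBuild>Mixed Platforms</PlatformToBuild>
  </ConfigurationToBuild>
</ItemGroup>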

Before I continue, there are a few things you must know about the solution I was attempting to build. The solution contains about six projects, and the web site itself has several references to other assemblies. Both of these conditions weigh heavily on how successful you'll be using TFB to build the solution. First, the number of projects, or more precisely, the type of projects included in the solution. The current release of TFB can't deal with either Database Projects (regular ones, not the new stuff in Data Dude) or with Reporting Services Projects. With Database Projects, you'll only get a warning during the build, but with Reporting Services Projects you'll get an error and the build will be marked as "Failed." Now, using the Configuration Manager in Visual Studio, you can indicate which projects are supposed to get built when the solution is built (meaning you can exclude certain projects). This works in Visual Studio, but TFB ignores such settings and tries to build everything in the solution anyway. The only way to fix this is to remove the offending projects from your solution. Very irritating.

Now comes the really fun part. We use a Web Deployment Project (WDP) to compile the web application into a folder that can simply be xcopy'd to a deployment folder. On our local developer workstations this works great. On TFB, you're in for a world of hurt. First off, TFB has no idea what a WDP is, so you'll get all kinds of build errors when it sees one and tries to build it. You have to either install the WDP add-in on your build server, or at least copy the relevant files to the proper location on the build server. Once that's done, TFB can perform a build on WDPs, but it will generally fail until you take some extra steps.

First, your web application's external file assembly references have to be set up so that TFB can see them. The best way to do this is to add a "Solution Folder" to your existing solution, and place all assemblies that your web application references in here. Then make sure to re-reference these assemblies in your web project so that it knows where to find the new location. If you do this, then TFB will find all the referenced assemblies because they become part of the solution and are referenced by relative path.
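For the class-library projects in the solution, the re-added reference shows up in the .csproj as a relative HintPath, something like this (the assembly and folder names are made up; a web site project records the same idea as a bin\*.dll.refresh file instead):

<Reference Include="ThirdParty.Logging">
  <SpecificVersion>False</SpecificVersion>
  <HintPath>..\SolutionAssemblies\ThirdParty.Logging.dll</HintPath>
</Reference>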

Next (and this is the really tricky part), you have to tell all the other projects in your solution that the web site depends on to copy their compiled assemblies into the web project's bin folder. In Visual Studio, you would do this simply by making a project reference from the web site to the other project, but the combination of TFB and WDP has no idea how to deal with this. When your WDP is compiled, it will complain that it can't find the appropriate assembly references (errors such as "Type 'MyClass' is not defined") and the build will fail. To get around this, you have to add a post-build event to every project in the solution that the web site depends on. In my solution, for example, I have two class libraries that the web site depends on - a Business Layer library and a Data Access library. In both of these projects, I have to specify a post-build event command that copies the compiled assembly to the bin folder of the web project.

xcopy /Y /S /F "$(TargetPath)" "$(SolutionDir)WebProjectRoot\bin"

(by the way, the quotation marks are very important if your paths have spaces in them, as I learned when xcopy gave me an "exited with code 4" message - very helpful)

Once you have done this and if the planets are aligned just so, your web solution will finally have a successful build in TFB. Of course, the next question is "what do I do once the thing finally builds?" For me, I wanted to be able to navigate to the build server in a web browser so I could see what the latest build looks like (we also use Automaton to perform continuous integration - ironically, Automaton was the easiest component of this entire mess to get set up properly). What I ended up doing was simply creating a virtual directory to the build location for the WDP on the build server. It's always in the same location, so on every build the directory is updated with the latest bits. It works, and now everyone is happy (okay, maybe only I'm happy because it took me four bloody days to get this far and it's nice to finally have a sense of accomplishment).

Here are some references I found helpful in between beating my head on my desk trying to get this all to work:

Web Deployment projects in a Team Build (Martijn Beenes)

Resolving file references in team build (Manish Agarwal)

Team Build, Web Applications, Lemon Juice and Paper Cuts (Rob Conery)

Also, thanks to Luis Fraile and Swaha Miller on MSDN Forums for helping me with this problem.

Google Maps 1 - Live Local 0

I've always used Google Maps when looking for addresses or places and it's been a pretty good experience. The satellite images are really cool, and they even have relatively high-resolution imagery for my location on the Canadian east coast. Now, Microsoft absolutely hates it when some other company enjoys some success in the consumer computer market, so they set out to challenge Google Maps with Live Local, née Virtual Earth. I had checked it out a couple of times but never found it very compelling.

At a recent event in Halifax, however, a Microsoft presenter mentioned Live Local again (in passing really, it wasn't relevant to the presentation) so I decided to give it another look-see today. And here we have another fine example of how Microsoft often fails to live up to the competition. In the two pictures below you can see one image from Google Maps and another from Live Local (bonus points for whoever can identify the location - Darth Mac is excluded, of course). You will notice that the Google Maps version comes in a much higher resolution. You will also notice that Microsoft apparently forgot to pay their satellite imaging bill, and half the image is missing. Good job, guys.

To be fair, there are scenarios where Live Local has the edge. Check out this posting for a different perspective. There's more to Live Local than just satellite imagery, but I saw half my region missing and I stopped looking right away. I live in the most populous region in Canada east of Montreal, and Microsoft didn't think it was worthwhile to get detailed satellite photos here. In the words of my former drill sergeant, "GET IT TOGETHER, MICROSOFT!"

So, in the meantime, I will continue to ignore Live Local (or Half-Dead Local as I'm now calling it) and stick with good ol' Google Maps.



Team Foundation Server - First Thoughts

The latest project I'm working on is using Visual Studio 2005, and the license we have allows us to use Team Foundation Server (TFS) Workgroup Edition. The Workgroup Edition is limited to 5 users, but is otherwise fully functional. Taking some advice from the Microsoft Regional Director for my area, I decided to move our small team from Visual SourceSafe (VSS) to TFS. And, as you know if you've been reading my blog at all, I'm a sucker for new technology especially as it relates to my day-to-day job.

So, I get right to work installing this puppy. Instructions? Who reads instructions, right? I knew it needed SQL Server and Windows SharePoint Services (WSS), so off I went. Okay, I did read just a little bit of the instructions and knew that TFS wouldn't work with SQL Server 2005 Development Edition, so I dodged that bullet. I wasn't so lucky with the next bullet, however. I installed WSS and when the web-based admin screen came up, I proceeded to complete the WSS install. Oops. When I started to install TFS immediately afterwards, it chastised me for not reading the installation instructions. With TFS, you're not supposed to complete the WSS web-based admin portion, since TFS completes the WSS setup itself. The only way to fix it? Uninstall and reinstall WSS.

Honestly, this wasn't a big deal. It cost me maybe 10 minutes, and that's only because the crappy virtual machine we have for a dev environment is slow as a dog. After I corrected that boo-boo, TFS installed fine. Well, mostly. See, TFS needs two separate service accounts to run - one for the TFS system itself, and one for the Reporting Services piece. It says it can use a local or a domain account, but using a local account pops up a warning that "domain users won't be able to use it". That's no good - all our developers are using domain accounts. Luckily, I had our help desk folks set me up with a domain service account for another purpose a few weeks ago, so I was able to use that for the TFS service account. Upon attempting to use it for the reporting account, though, TFS squawked at me that I couldn't use the same account for both services. Seeing as how getting the first domain account set up had taken about 6 weeks, I decided to use a local account for this one and damn the "domain users can't use it" message. Who needs reports anyway?

Once TFS was completely installed, my next task was to connect to it. I quickly installed Team Explorer on my machine, not really knowing what it was exactly, just knowing that it was the client piece. After it installed, I went looking for it in my Start menu. Nothing there. Hmmm.....  Later, I discovered that if you install Team Explorer on a machine without Visual Studio 2005, it actually installs a small version of VS 2005 because Team Explorer can't work without it! Since I already had VS 2005 installed, I didn't see anything new in my Start menu.

So, I fired up VS 2005 and started scanning menu items. Sure enough, under the Tools menu was "Connect to Team Foundation Server...". So I did. And it worked!! Honestly, I was floored because I fully expected some permissions problems. Of course the only reason it worked was because I installed it on the server under my domain account. So, I create a new Team Project and then I marvel at all the cool stuff TFS has set up. Stuff like requirements logs, default work items, and a bunch of other goodies I can't remember. The SharePoint portal site it sets up for the project is also cool, with a bunch of graphs and reports that project manager types are sure to eat up.

Moving my project's source control from VSS to TFS was also mostly painless, though it cost me about 20 minutes of aggravation. I wasn't interested in migrating the VSS database to TFS (since we just started and there wasn't much of a version history anyway), I just wanted to drop the VSS bindings and then re-bind to TFS. Remember, I don't like reading instructions, mostly because they're not usually in the format of "this is how you do exactly what you want to do right now" and instead you have to read a dozen sections piecing together the bits that are relevant to your immediate problem. After missing one project in my solution when unbinding from VSS (and having VS 2005 whine when I tried to change source-control plug-ins) and fighting with that for a little while, I eventually got VS 2005 changed over to TFS source control mode. I checked in the entire solution without incident.

The only other problem I had was setting up other developers to use the system. Silly me - when the dialog asked me to select users OR groups, I selected my team's group instead of individual users. That got me nowhere. Other developers got the "you are not a licensed user" message when trying to connect. With the Workgroup edition (and maybe the full edition, I'm not sure), you have to specify individual accounts, not entire security groups. Once I figured that out (many thanks to the great oracle known as the Internet and MSDN blogs), we were off to the races!

All in all, it was a helluva lot of work just to get a better source control system set up. I'm not sure if using something like Subversion would have been easier, but TFS brings so much more to the table. At any rate, transactional check-ins (no more corrupted VSS databases!) and the whole "shelving" concept are worth the effort. I'm sure we'll dive into all the other cool features of TFS as the project progresses.


Vista and Administrative Shares

I am used to remotely connecting to the file system on my computers via administrative shares. On previous versions of Windows, the hard drive volumes were automatically exposed as administrative shares - C$, D$ and so forth. The "$" at the end of a share name makes it invisible to network browsing, but otherwise it is a regular share. As the name implies, administrative shares are only available to those in the Local Administrators security group.
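For example, grabbing a remote machine's entire C: drive from a command prompt is as simple as this (the machine and account names are made up, obviously):

net use X: \\MEDIABOX\C$ /user:MEDIABOX\andre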

As some of you are aware, I am fighting my way valiantly through a newly installed Vista RC1 system. In fact, the only reason I haven't put XP back on is because I'm too lazy. Several of my peripheral devices (sound card, web cam, video capture card, etc.) either don't work at all, or only work with partial functionality. Well, add administrative shares to that list.

Now, browsing around the system using the usual tools, it would appear as though Vista does indeed set up the normal set of administrative shares. Except you can't connect to them remotely. I checked the new Network Center settings. I checked the firewall settings. I enabled the Administrator account (which is disabled by default in Vista). I even stood on my head while balancing a rock on one foot. Nothing worked.

Back to the great oracle of the Internet. You know, the oracle is a font of knowledge, but it's not very forthcoming with its information. You have to poke it and prod it before it will reveal the answer that you seek. After going down several blind alleys, I found the answer: you have to make an addition to the Windows Registry. What an absolute scourge on the face of personal computing the Registry is. It is such a mess of settings, parameters, and configurations that it should just be summarily executed. I'm not sure it's any better than the multitude of INI files it was designed to replace.

Anyway, the Registry setting you have to add is a DWORD called "LocalAccountTokenFilterPolicy" in the following location:

HKLM\Software\Microsoft\Windows\CurrentVersion\Policies\System

The value of the new DWORD setting has to be 1.
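If you'd rather not poke around in regedit, the same change can be made from an elevated command prompt with one line:

reg add "HKLM\Software\Microsoft\Windows\CurrentVersion\Policies\System" /v LocalAccountTokenFilterPolicy /t REG_DWORD /d 1 /f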

Thanks to Jimmy Brush, Wugnet.com, and Google (for caching the page, 'cause the original search result pointed to gobbledygook) for supplying this answer: http://forums.wugnet.com/vista/Problem-hard-drive-ftopict11749.html#49308



Tracking A Shipment To The Dark Side Of The Moon

Recently, I bought a nice shiny piece of home audio gear on eBay. I bought it from a fellow in the United States, and the next day the item was shipped via UPS. Though I knew it would take a week to get here, I still like to track my packages on the Internet. You might say I'm rather obsessed with tracking packages, actually.

So, I get the tracking number and plug it into the UPS site over the next few days and I'm absolutely tickled with the amount of tracking data that UPS supplies. It looks something like this:

HARRISBURG, PA, US       11/09/2006   4:02    DEPARTURE SCAN
HARRISBURG, PA, US       09/09/2006   1:11    ARRIVAL SCAN
STAUNTON, VA, US         08/09/2006   20:35   DRIVER HAS BAD HEARTBURN, BUT THE PACKAGE MUST BE DELIVERED - BACK ON THE ROAD
STAUNTON, VA, US         08/09/2006   20:55   DRIVER STOPPED FOR LATE SUPPER
ROANOKE, VA, US          08/09/2006   20:16   DEPARTURE SCAN
ROANOKE, VA, US          08/09/2006   19:53   ARRIVAL SCAN
CHARLOTTE, NC, US        08/09/2006   17:53   DRIVER FEELING MUCH BETTER - BACK ON THE ROAD
CHARLOTTE, NC, US        08/09/2006   17:47   DRIVER STOPPED FOR BATHROOM BREAK
WEST COLUMBIA, SC, US    08/09/2006   15:07   DEPARTURE SCAN
WEST COLUMBIA, SC, US    08/09/2006   12:28   ARRIVAL SCAN
ATLANTA, GA, US          08/09/2006   8:20    DEPARTURE SCAN
ATLANTA, GA, US          08/09/2006   3:11    ARRIVAL SCAN

An amazing amount of detail, to be sure. But when the package finally crosses the border into Canada, the flow of information abruptly halts. On the evening of the 12th, the package is listed as IMPORT SCAN in Lachine, Quebec. There is absolutely NO activity on the 13th. Nothing. Not a peep. I get ready for work on the morning of the 14th and check again - still nothing. My package has been MIA for two days now. A delivery attempt is actually made around 4:00 PM on the 14th, though I'm not home. I'm able to pick it up later that evening, however.

So, what exactly is it about Canada that causes so much trouble for shipment tracking systems? It's like the package has gone to the dark side of the moon and all communications are blocked. It's not just UPS. When I order stuff from Vancouver (clear across the continent) using another shipping company, I get exactly TWO tracking notices - one that it was picked up, and one that it was delivered a week later. Some tracking system.

It would seem as though there is still some work to be done on shipment tracking north of the border.


Vista RC1 - Not Quite There Yet

Microsoft recently released the so-called first "Release Candidate" for the next version of Windows, called "Vista." The geek that I am, I had downloaded previous Vista betas with the intention of installing and using them, but somehow I never quite got around to it. However, with the RC1 release, I decided to give it a whirl. My home machine's OS was waaaaaay overdue for a reinstall anyway, so I thought "what the hell."

So, the sucker for technology punishment that I am, I spent this past weekend installing and playing with Microsoft's new wonder-child and all I can think is that Microsoft must not understand the term "Release Candidate" the same way I do. To me, this term means that the software is pretty much done and Microsoft just wants to see what other little bugs might be left lying around in the code. Apparently I'm either incorrect in my understanding of the term, or Microsoft truly does wish to inflict massive pain upon the computing population. Let's just say that I got intimately familiar with the install key over the weekend, as I tried maybe 10 times to install the damn thing. Like I said - a sucker for punishment. But dammit, I WAS going to get the fool thing installed.

Now, I've been following some tech news reports that claim RC1 is a GREAT improvement over past betas. All I can say is that past betas must have been REALLY crappy, 'cause RC1 is, in my opinion, a big steaming pile of digital mess.

It all started when I booted off the Vista RC1 DVD that I burned from the MSDN download. My machine is a fairly vanilla rig with an older but immensely popular ASUS P4P800 SE motherboard, complete with a 3.2GHz P4 and 2GB of RAM. No problem there, except that my two 200GB hard drives are configured in a RAID 0 array using Intel's ICH5R Matrix RAID chipset. And here is where my tale of woe begins.

Intel's Matrix RAID storage technology is now several years old, fairly mature, and fairly popular. But Vista has no idea what it is. Booting off the Vista DVD revealed no available hard drive partitions. Grrrr.....  Okay, no big deal, I have the stupid driver disk that WinXP needed when installing, so maybe that will work (I will say that Vista's storage driver installer is MUCH nicer than WinXP's 1985-vintage user interface for this kind of stuff). So, in goes the disk, Vista reads it, highlights the driver, I click "Continue," it reads the disk for a few seconds, then.....KA-BOOM!!!! ... a lovely STOP error (or BSOD if you prefer) greets me. I haven't had a STOP error in I don't know how many years. It's nice to see that Vista has brought them back - I was starting to miss them.

So, I thought my disk was bad. Vista now lets you load storage drivers off USB drives, so I downloaded the latest drivers from Intel and tried again. This time, no luck: the latest Intel drivers don't support ICH5, only ICH6 and higher. Took me a while to figure that one out. Back to Intel's site, download ICH5-compatible drivers and try again. KA-BOOM! STOP error.

Cursing like mad now, I consult the great oracle of wisdom better known as the Internet. Apparently Vista can't boot off the DVD and load these drivers. You have to start the install from an existing WinXP system, in which case the drivers will load fine, provided they are in the ROOT of the drive they reside on. Okay, this is just absolutely insane. This is 2006, PEOPLE, not 1986!!!

So, cursing even more now, the install starts now that Vista can see my hard drive. Wee!! It's finally installing! Vista claims it is copying files. Seems to be taking an awfully long time. Gets up to 82% complete....and stays there for about 45 minutes. At this point, I break some furniture. WTF!!!!! Now, I can't blame this all on Vista - I suspect the media was bad - but still, after 45 minutes the damn installer should know that something is not right. Good grief.

Next attempt, I mount the DVD image in XP, bypassing the media problem. I am really starting to hate the 25-character install key at this point. But this is the last time I have to enter it - Vista finally installs! Hallelujah! Now we're off to the races!!

Whoops! Not so fast. After Vista is installed, it immediately downloads an update to Windows Defender, the new anti-spyware tool. Cool, the network is at least working - or so I thought. I try to surf the web - no go. I go to Microsoft's site - it works fine! But it's the only site on the web that I can get to!!! This is insane - I've never seen this kind of behaviour in 15+ years of working with computers. Consulting the great Internet oracle, I discover that Microsoft FUBAR'd the drivers for my network card and I have to install the old XP drivers to get it to work properly. But once I do, it does work fine.

So I use it for a few minutes. Release Candidate, you say, hmmm? Actually, most of it works fairly well, sort of. Aero Glass is neat, but not a killer feature by any stretch. But there are some serious problems still lurking. For one, there is no driver support yet for the world's most popular sound card, the Sound Blaster Audigy 2. Apparently, if you pray to half a dozen tech gods and install some Creative drivers in just the right way while standing on your head, you can get 2.1 sound working. I just recently bought a nice set of Logitech Z-5500 5.1 surround speakers. I'm sorry, but I'm not settling for 2.1 sound. Then, I try to install the driver software for my new Logitech QuickCam Fusion. Sorry, that won't work either. Again, praying to the tech gods while standing on your head will get you SOME functionality, but not all. Next, I open up a folder containing some video files. Oops, I guess I should have known better. Windows Explorer crashes. I stare in disbelief. Release Candidate my ass!! Once I reboot, I try to play an AVI file. The new Windows Media Player gives me sound, but no video. Come on, guys, you have GOT to be kidding me, right?

The last problem I had could be a deal-breaker. You see, I have an ATI Radeon X800XL video card. When ATI was building these cards, apparently the only cooling fans available were those used for the air intake on jet fighter planes, so this card makes a lot of noise. In XP, this is no problem - you can download ATI Tray Tools and step down the RPM of the fan to a decent level. On Vista, ATI Tray Tools barfs. I can't control the fan speed, so now my machine sounds like it's getting ready for takeoff. I didn't even try to capture video off the card yet - that should be an entertaining walk through a digital minefield.

So, if you're thinking of trying out Vista now that it's got "Release Candidate" status, think again. Me, I think I'll wait for Vista SP1 before I test these waters again. Sorry, but I don't really need to experience Windows 95 growing pains again.