Andre's Blog

Personal blog of Andre Perusse

On The Edge (and falling off a cliff)

UPDATE (Feb. 19, 2008): If you're too busy to read the book, take a look at this 8-minute video - it gives a pretty decent summary!

Ars Technica recently began running a series on the history of the Amiga, my favourite computer of all time. From the comments posted to this series I discovered that a book was published in 2005 on the entire history of Commodore. Well, the entire history starting from 1974, anyway. Drawing on hours of interviews with various Commodore engineers and executives, On the Edge: The Spectacular Rise and Fall of Commodore by Brian Bagnall tells the story of Commodore's pioneering role in the microcomputer industry, its rise to glory in the early-to-mid 1980s, and its bankruptcy on April 29, 1994.

Commodore was actually founded in Toronto in 1958 by Jack Tramiel and Manfred Kapp as a typewriter manufacturer. After a stock scandal in the '60s, Commodore was bought in 1966 by Canadian financier Irving Gould, who kept Jack Tramiel on as CEO to move Commodore into the then-lucrative calculator sector. In the mid '70s, the calculator market was getting overcrowded, so Jack started looking for cheaper calculator parts. This led to Commodore's purchase of MOS Technology, a semiconductor manufacturer. At the time, MOS had recently hired Chuck Peddle, who soon developed the legendary 6502 microprocessor. (Interesting side note: Chuck Peddle's parents hailed from the Canadian Maritime provinces, though the book does not detail from where specifically. A quick look at Canada411 shows a high concentration of Peddles in Cape Breton, however.) The 6502 was extremely important to the nascent microcomputer industry because while Motorola's comparable 6800 (which Chuck Peddle had also worked on) cost $300, the 6502 cost only $25. The 6502 became the processor in Commodore's first microcomputer, the PET 2001. It was also used in the Apple I and Apple II computers and in Atari's home computer models (the famous Atari 2600 game console used a variant of the 6502). The 6510 used in the world's top-selling computer model of all time, the Commodore 64, was a direct descendant of the 6502.

While I found this book to be a long read (it is 557 pages), I was thoroughly enthralled with it. While it did not focus too much on the personal lives of those involved (an aspect I enjoyed in another computer history book I recently read, Showstopper by G. Pascal Zachary), it did cover a lot of detail, including technical detail, which naturally I loved. It also detailed a lot of business transactions and, for me, illuminated much of how the business world works, or at least the dysfunctional business world of Commodore. Did you know that Steve Jobs offered to sell Apple to Commodore in the late 1970s? Jack turned him down because he thought $200,000 was an outlandish price. Oops. Of course, if Commodore had bought Apple, they'd be dead now, too. And did you know that the most successful microcomputer in 1977 came from neither Apple nor Commodore, but was in fact Tandy's TRS-80? Perhaps the most satisfying fact in the book is that Bill Gates learned his lesson on software licensing when Jack Tramiel negotiated a deal to use Microsoft's BASIC in Commodore computers. Jack sold tens of MILLIONS of computers with the same Microsoft BASIC in them, and he paid Microsoft only a one-time fee of $10,000. Microsoft received absolutely no royalties on any Commodore computer sold from the late '70s to the mid '80s. Ha!

Much of the book pays special attention to Commodore's founder and CEO until 1984, Jack Tramiel. Jack had a couple of interesting battle cries: "Business is War", and "Computers for the masses, not the classes." The first refers to Jack's belief that you didn't succeed by competing with competitors; you succeeded by destroying them completely. The second was Jack's continual insistence on driving down the cost of computers. There are many stories in the book that illustrate this. For instance, the original case for the PET was to be a futuristic-looking molded plastic design. However, Commodore at the time owned an office-supply company in Toronto, so they instead used an angular sheet-metal case because it cost less. The case for the C-64 is the exact same one as the VIC-20's, because Jack didn't want to spend money designing a new case - a decision that cost the engineers weeks' worth of time trying to cram the C-64's guts into the VIC-20 case. Jack had some interesting characteristics, many of which I've noticed in other "successful" businessmen. However, what makes a successful businessman doesn't appear to be totally compatible with what makes a successful human being, in my humble opinion. One case in point: when Chuck Peddle left Commodore in the 1980s to begin his own start-up, Jack, out of spite, filed a lawsuit that needlessly and completely destroyed Chuck (the two had once been very good friends). A hero of the microcomputer era struck down by a mean-spirited, greedy CEO.

Still, I have to wonder whether Commodore would ever have succeeded as much without a guy like Jack at the helm. Another very interesting, but oddly secretive, character in the Commodore saga was its owner, Irving Gould. The book paints him as a mostly absent father figure who, whenever one of his CEOs became too successful with his company, would fire him out of fear he was gaining too much power. After Jack took Commodore to a billion-dollar company with the C-64, Irving fired him in 1984. After Commodore lost hundreds of millions over the next several quarters, ex-Pepsi executive Tom Rattigan was put in place as the company's CEO, and he turned it around in a matter of months! Once Commodore was successful again, Rattigan was let go by Gould, too. Commodore never recovered after this, and eventually failed in 1994 at the inept hands of CEO Mehdi Ali (nicknamed the "speed bump" by Commodore engineers, who burned an effigy of him in Dave Haynie's Deathbed Vigil video).

Commodore was the first company to put a microcomputer on the market, the first to sell over one million units, and the first with a multimedia computer (the Amiga, R.I.P.) before the word "multimedia" even existed. Despite all this, they self-destructed and have become little more than a footnote in the annals of the birth of the microcomputer industry. If you love computers as much as I do, and want to learn what it was like to be a computer engineer back in the heyday, you HAVE to read this book. If you're tired of the "revisionist history" that paints Apple as the founder of the microcomputer, you HAVE to read this book. If you ever look back nostalgically on that old Commodore computer you owned in the 1980s (and wonder why the 1541 disk drive was so slow), you HAVE to read this book. It will make you laugh, it will make you cry, and it will make you pound your fists in anger. What more can you ask from a book, especially a non-fiction one such as this?

Rules for Checking In Code to Source Control

The golden rule of checking in code is:

Don't check in code that will break the build.

Nobody likes code that won't compile. Also see Scott Hanselman's First Rule of Software Development.

In order to help verify that your pending check-in will result in a healthy build, you may find the following rules and procedures helpful.

  1. Try to work on small chunks of code and features at any given time. As Tim Stall points out, "It's easier to integrate 5 small things than 1 big thing."

  2. Before checking in anything, perform a "get latest" on your entire solution's code-base. Resolve any version conflicts before proceeding.

  3. If you have any web sites in your solution that have references to web services, make sure you update all web references.

  4. Perform a build on your entire solution. Obviously, fix any compilation errors.

  5. Database scripts. Ahhh, these are lovely, aren't they? Unless you're fortunate enough to be using some cool database tools (like Visual Studio for Database Professionals), you likely don't have any compile-time error checking for your database scripts. It is essential that you ensure all your database scripts for changing the schema, updating the programming (e.g., stored procs), and inserting default foundation data work properly. If they don't, you'll soon have a swarm of angry developers beating down your door. The best way to do this is to run the update scripts on your machine and test the software. Update scripts should be written so that they check whether a particular update has already been applied before trying to apply it again. This makes them much easier to test, and much easier for other developers to apply to their own database copies.

  6. Check in ALL files you have checked out. This is a tricky one, since perhaps you know that a particular file doesn't work properly, even though it will compile properly. Unfortunately, you may not know what dependencies exist on this file from some of the other files you're checking in. So, while you may think you're doing the rest of your development team a favour by not checking in a file you know doesn't work properly, you may actually be causing more grief by checking in code that won't build!
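As a sketch of rule 5, here is what a hypothetical idempotent update script might look like (the change itself is invented for illustration; the script checks the old-style syscolumns view so it works on both SQL 2000 and SQL 2005):

```sql
-- Hypothetical idempotent update script: it checks whether the change
-- has already been applied before applying it, so it can safely be run
-- any number of times against any developer's database copy.
IF NOT EXISTS (SELECT * FROM dbo.syscolumns
               WHERE id = OBJECT_ID(N'[dbo].[Order]')
                 AND name = N'shipperTrackingNumber')
BEGIN
    ALTER TABLE [dbo].[Order] ADD [shipperTrackingNumber] [varchar](50) NULL
END
GO
```

Running it a second time simply does nothing, which is exactly what you want when every developer applies the same script to their own copy.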

Even if you use a fancy-shmancy tool that performs delayed commit or source integration (such as TeamCity), as a professional developer you should really follow these guidelines to save yourself (and your team) pain and suffering.

XBox 360 Fun

I had been holding off buying an XBox 360 since I'm really not that much of a gamer and when I do play I totally suck anyway. For us non-gamer types, the XBox Live Marketplace sounds interesting since it offers high-definition movies, but alas as a Canadian I am deprived of such a useful feature (though Microsoft has said this will be available in Canada by the end of 2007). However, with last week's release of Halo 3, I really had no choice but to break down and get myself one. Halo is just about the only video game I've played in, oh, ten years so once again I bought an expensive piece of hardware just so I could play this damn intoxicating game.

I decided to get an XBox Elite, primarily for the larger hard drive, since outside of Halo I expect my primary use of the machine will be buying and watching high-def content. I ordered my machine through Dell Canada and got a deal where Halo 3 was included for free (I also got $30 off an extra controller with the play and charge kit). For a company that sells almost exclusively through an on-line web site, their on-line order tracking is really poor. I placed my order 2 weeks before Halo 3 was scheduled to ship. I got an email stating an "expected ship date" of September 21st (the Friday before Halo's release). However, when I clicked on the order number to go directly to Dell's site for an update, it said the order (or more precisely, a single line-item on the order) wouldn't ship until the 27th. The other line item had no ship date details at all, so did the first line item's ship date apply to the entire order?

Naturally, being the instant gratification freak that is typical of my generation, I checked the site for updates several times a day. I even called Dell in an attempt to gain clarification. I was told that this deal was extremely popular, but that it probably wouldn't ship until a day or two AFTER Halo's release. Grrrr.....  But then, on the 21st (this was the original "expected ship date") something went haywire on Dell's order tracking site and it now said an "expected DELIVERY date" of the 24th! Woo-hoo! Hours later, however, and it was back to a SHIP date of the 27th. Boo. I checked again on the morning of the 25th to find that it had actually shipped the day before (though no notification email was sent to me). They shipped it by air, and it actually arrived on the 25th! So, good marks for execution but an F for a rather useless order tracking system.

Anyway, I hooked up my new toy that evening and was relieved to find the cooling fan much quieter than my original XBox's. I used my original XBox with XBox Media Center to stream video from my PC to the television in the living room, and the fan was always distracting. The DVD drive is another story - it is quite noisy when it's in use, even though it's the vaunted BenQ drive, which is supposed to be the quietest DVD drive found in the 360s. I'd hate to hear what the noisy ones sound like.

I won't bore you with a review of Halo 3 since the entire gaming community has already given it plenty of thumbs-up. I haven't even played it much yet, though it looks like Halo, feels like Halo, and sounds like Halo. That's a good thing. Since I totally suck, I play on the "easy" setting but in Halo 3 it is way TOO easy - I haven't died yet! I think I'll restart on the normal difficulty setting and see how I fare. Oh, and the new hammer weapon totally rocks!

One thing that I wasn't looking forward to about the 360 was the fact that while it can play video from a PC, it only supports the WMV and MPEG video formats. My original XBox with XBMC plays just about every video format on the planet, and I like it that way. Thankfully, some clever programmers developed an ingenious (and free, let's not forget free) piece of software that allows you to play just about any video format on the 360. It's called TVersity, and what it basically does is "transcode" a video file on-the-fly to an XBox 360 supported format. Since I just upgraded my rig to a quad-core system, I can easily transcode high-def material without breaking too much of a sweat. This is great stuff.

So, I'm quite happy with my new Halo 3 Machine (let's call it what it really is). Hell, I might even try to do the XBox Live multi-player thing too if I can think up a decent gamer tag. My usual nicknames are all taken, so this might take a while. (UPDATE: I am now known as UnhingedBeaker.)

SQL Server 2005 Syntax Incompatible with SQL Server 2000

On a recent project we used SQL Server 2005 for development, but the product officially supports installation on both SQL Server 2005 and SQL Server 2000. During development we'll create tables in the database (using SQL 2005) using the GUI tools in either Visual Studio or Management Studio. When it comes time to create the installation scripts, we'll "Generate CREATE scripts" from these GUI tools. With SQL 2005 (well, at least the version we're using, which is SP2), the CREATE TABLE script now uses a SQL 2005-specific syntax that will not work on SQL 2000.
For example, here is an auto-generated script that runs on SQL 2005, but not on SQL 2000:
CREATE TABLE [dbo].[Order](
 [orderID] [int] NOT NULL,
 [customerID] [int] NOT NULL,
 [orderDate] [datetime] NOT NULL,
 [shipDate] [datetime] NOT NULL,
 [shipperID] [int] NOT NULL,
 [shipperTrackingNumber] [varchar](50) NULL,
 CONSTRAINT [PK_Order] PRIMARY KEY CLUSTERED
(
 [orderID] ASC
)WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
If you try to run this on SQL 2000, you'll get the following error:
Server: Msg 170, Level 15, State 1, Line 11
Line 11: Incorrect syntax near '('.
It would appear as though SQL 2000 does not like the syntax of the primary key constraint included in the CREATE TABLE statement. Alternatively, the following script works on both SQL 2000 and SQL 2005:
CREATE TABLE [dbo].[Order] (
 [orderID] [int] NOT NULL ,
 [customerID] [int] NOT NULL ,
 [orderDate] [datetime] NOT NULL ,
 [shipDate] [datetime] NOT NULL ,
 [shipperID] [int] NOT NULL ,
 [shipperTrackingNumber] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
) ON [PRIMARY]
GO

ALTER TABLE [dbo].[Order] ADD
 CONSTRAINT [PK_Order] PRIMARY KEY CLUSTERED
 (
  [orderID]
 ) ON [PRIMARY]
GO
Now, if that were the only problem I could probably live with it. But wait! There's more! When you create an object in SQL Server, it's generally good practice to first make sure the object doesn't already exist. In my day-to-day use, my scripts will often check for general objects or foreign-key constraints. SQL 2005 uses the following code for these operations:
IF  EXISTS (SELECT * FROM sys.foreign_keys WHERE object_id = OBJECT_ID(N'[dbo].[FK_Order_Customer]') AND parent_object_id = OBJECT_ID(N'[dbo].[Order]'))
ALTER TABLE [dbo].[Order] DROP CONSTRAINT [FK_Order_Customer]
Run this on SQL 2000 and you'll get:
Server: Msg 208, Level 16, State 1, Line 1
Invalid object name 'sys.foreign_keys'.
Also, the following code is used by SQL 2005:
IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[Order]') AND type in (N'U'))
DROP TABLE [dbo].[Order]
which will give you the following on SQL 2000:
Server: Msg 208, Level 16, State 1, Line 1
Invalid object name 'sys.objects'.
Microsoft changed the way that meta-data is stored in SQL 2005 to improve security, amongst other things, but this means that these scripts won't work on SQL 2000. Thankfully, however, Microsoft provided compatibility "views" in SQL Server 2005 that mimic the old SQL 2000 system tables. To avoid these errors, you can use the following syntax, which works on both SQL 2005 and SQL 2000:
IF  EXISTS (SELECT * FROM dbo.sysforeignkeys WHERE constid = OBJECT_ID(N'[dbo].[FK_Order_Customer]'))
ALTER TABLE [dbo].[Order] DROP CONSTRAINT [FK_Order_Customer]
IF  EXISTS (SELECT * FROM dbo.sysobjects WHERE id = OBJECT_ID(N'[dbo].[Order]') AND type in (N'U'))
DROP TABLE [dbo].[Order]
Better still, if you're using SQL Management Studio there is a way to avoid hand-editing these scripts entirely. Instead of right-clicking on a table to generate a CREATE script, right-click on the database and select Tasks -> Generate Scripts. This will open the Script Wizard dialog. On the Choose Script Options page, set the "Script for Server Version" property to "SQL Server 2000" and your CREATE scripts will now be fully compatible. A little more clicking is required, but at least your scripts will work on both server versions.

A Cool ASP.NET 2.0 Feature (Well, Almost)

Whenever I start a new web project, one of the first things I do is create a "Base" class for all my ASPX code-beside ("code-behind" is so .NET 1.1) class declarations to inherit from. This makes it easy to have useful properties such as CurrentUser available in all pages automatically. Such a practice is quite common these days, and has been covered in several articles including this one.

Recently, I have found it useful to have some boolean properties on the base class which control the rendering of certain elements. For example, a current project I'm working on has a requirement to restrict the user's ability to navigate backwards using the browser's "Back" button. This is accomplished using the JavaScript "history.forward()" hack. Now, some pages in the application require this while others do not. So the best way to handle it is to have a base-class property called DisableBackButton and set it to "true" when I want to prevent the user from going back to the previous page. And, in fact, this is what I did, and it works well.

However, setting such properties can only be achieved with programmatic code. For example, to turn this property "on" for a certain page, I would have to write "DisableBackButton = true;" in the Page_Load event handler. This is fine, but it feels "dirty" and unsophisticated to me. I would much prefer to set this property in a "declarative" fashion on the actual ASPX file rather than writing code to set it. Say, wouldn't it be cool if I could just add an attribute to the @Page directive that said DisableBackButton="True"? Well, it turns out that in ASP.NET 2.0, you can do exactly this!

Well, almost. If you read all the comments in that link, you'll see that it isn't quite that simple, unfortunately. First, if you're using a base class like I am, you also have to set the "CodeFileBaseClass" attribute in the @Page directive. Ick. I think this is something the compiler could determine for itself without me having to tell it explicitly. Requiring me to add it manually just means there's another possible point of failure and yet something else that has to be maintained. Still, it's not too terrible a price to pay for the cool declarative property-setting feature I'm trying to achieve. But that isn't the only problem. While the project will now compile, you'll still get ASP.NET validation errors in all your pages saying that "DisableBackButton" is not a valid attribute for the @Page directive. Grrrr. To stop this message from clogging your Visual Studio errors window, you have to open up Visual Studio's XML-based schema file and add the attribute. Now, this is really a problem, since this schema file is used for ALL Visual Studio web projects, not just the one you're working on right now. It also means that if you share the code with someone else, you have to tell them to go modify their schema file, too. This is completely unacceptable.

In the end I decided to go the old-fashioned route and set this property programmatically in each page that needed it. While the declarative route would have been uber-cool, there are just too many road-bumps down that path. Here's hoping Orcas (or Shamu as my wife calls it) does something to address this problem.
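For reference, a minimal sketch of the base-class approach described above might look like this (the class name and the script-key string are illustrative; DisableBackButton is the property discussed in the post):

```csharp
using System;
using System.Web.UI;

// Sketch of a base "code-beside" class with a DisableBackButton property.
// Pages inherit from this instead of System.Web.UI.Page and set the
// property programmatically (e.g., in Page_Load).
public class BasePage : Page
{
    private bool disableBackButton;

    public bool DisableBackButton
    {
        get { return disableBackButton; }
        set { disableBackButton = value; }
    }

    protected override void OnPreRender(EventArgs e)
    {
        base.OnPreRender(e);

        if (disableBackButton)
        {
            // The "history.forward()" hack: if the user navigates back to
            // this page, the script immediately bounces them forward again.
            ClientScript.RegisterStartupScript(
                typeof(BasePage), "DisableBack", "history.forward();", true);
        }
    }
}
```

A page that needs the behaviour then just writes "DisableBackButton = true;" in its Page_Load handler.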

Making Reporting Services Work On Vista (64-bit)

A couple of weeks ago I got a new laptop (that went along with my new job), and at the urging of some of my co-workers (and against my better judgement) I installed Vista Enterprise 64-bit on it. And that's another story altogether. I also installed SQL Server 2005 (64-bit, naturally) and, after fighting with the new IIS 7 on Vista (see this Microsoft Support Article for more info), I managed to get Reporting Services installed too.

However, I had never configured it until today. I brought up the Reporting Services Configuration Manager and focused my attention on the first "red X" in the list, which was the "Report Manager Virtual Directory." I was able to set up the virtual directory, but the damn red X wouldn't turn into a green checkmark no matter what I did. I searched the Internet and found all kinds of Reporting Services-on-Vista horror stories, but none that fit my exact problem. After looking at the various Reporting Services .config files for clues, I went back to the Configuration Manager window and clicked on the "Report Server Virtual Directory" item, which had a green checkmark. Aha! The damn thing WASN'T set up, even though it had a happy green checkmark. Once I set up both the Report Server and Report Manager virtual directories with the proper application pool settings outlined in the above-referenced Microsoft Support article, I was able to complete the Reporting Services configuration, and now it works great!

Moral of the story: don't trust green checkmarks - they're not always correct.

Microsoft Adds Powerful Tool For Database Developers

I've been dying to try out Microsoft's latest addition to the Visual Studio Team Suite set of products - Data Dude, more properly known as Visual Studio 2005 Team Edition for Database Professionals. Recently I finally took some time to watch a series of webcasts (at 1.5x speed - the only way to learn!) on MSDN and figure out how this thing works. After spending a couple of hours watching these webcasts, I am able to present you with the Coles Notes version of what Data Dude does and how it works.

Data Dude essentially represents your entire database as a series of text files. You can think of this as the "source code" for your database. Individual .sql files (which are just text files) containing DDL (data definition language - the SQL commands that create database objects) are stored within the new "Database" project type in Visual Studio. So, if you want to create a new table called "Customer", for example, Data Dude creates a new .sql file called "Customer.sql" which contains the DDL commands to create that table. In this respect, it's not a whole lot different from the old Visual Studio "Database" projects, where you might have manually stored .sql scripts to keep them under version control.
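As a sketch, a hypothetical Customer.sql in such a project might contain nothing more than the table's DDL (the columns here are invented for illustration):

```sql
-- Hypothetical contents of Customer.sql in a Database project:
-- one object per file, versioned like any other source code.
CREATE TABLE [dbo].[Customer] (
    [customerID]   [int]          NOT NULL,
    [customerName] [varchar](100) NOT NULL,
    CONSTRAINT [PK_Customer] PRIMARY KEY CLUSTERED ([customerID])
)
```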

What Data Dude brings to the table, however, is a lot more automation and validation for this entire process, as well as the ability to "build and deploy" your database to any SQL Server instance. So, you have this big collection of .sql scripts to create tables, views, indexes, procedures, and functions, and Data Dude does this wicked cool thing where it "parses" all the SQL in these files and validates it. So, if the .sql file for your Customer table contains a field called "customerID", and in the .sql file for a stored procedure that SELECTs from that table you have written the field as "custID", you'll get a build error! Nice stuff.

You can either start your database project from scratch, or you can reverse engineer it from an existing database. Once you have your database project in a state where you're ready to deploy it, you can perform a deployment operation against any SQL Server instance you can connect to. Data Dude will examine the differences between the source files it has and the existing database on the target, and deploy only those updates that are required. For a new database this would obviously be everything, but for a database that you reverse engineered and then changed only one stored procedure in, for example, only that stored procedure is changed on the target. Sweet!

Now that your database is represented as simple text files, you can easily place your database definition under source control. One of the problems I've experienced working with a team of developers is that while the procs and functions might be under source control, the schema (tables, views, etc.) generally isn't. So, one developer might make a schema change on their local SQL Server instance, update a proc too, check in the proc, but forget to tell the other developers about the schema change. Naturally, when another developer tries to apply the latest version of that proc against their development database, it blows up. Data Dude does away with that by making it oh-so-easy to put your entire schema under source control. Now another developer can get the latest version of the database project from source control, do a deploy against their local development SQL Server, and both the schema change AND the proc will get updated.

In addition to providing features for comparing schemas, Data Dude also allows you to perform data comparisons between tables in different databases. So during development, when you're adding new rows to lookup tables, for example, you can now easily deploy those updates to another database (such as a testing or production system) by performing a data compare.

But wait! There's more!

Data Dude also adds the ability to run unit tests against your database. Using Visual Studio 2005's existing unit testing framework, you can create tests that run against your tables, procs, or just about anything. To help you with this, you can create Data Generation Plans that automatically fill your tables with random data. You have a lot of control over how the data is generated, including the number of rows and minimum and maximum values. So, the ability to run unit tests, combined with automatic data population, makes for a very powerful combination to help you write quality database code.

If you can, I'd recommend that you spend a few hours with Data Dude learning how to take advantage of its capabilities. It may be a "release 1" product, and while it's not perfect, it's still a great tool for database development in a team environment. And when they add ERD diagramming features in the next release or two, it will be the only way to develop databases.

Bitten By Windows Genuine Advantage

Last weekend I was home doing some paperwork and I had to send a fax to my insurance agent. I hadn't sent a fax in several months and not since I had reimaged my computer with Vista and Office 2007. So I opened Word and looked in the templates for a fax cover page. Word ships with several templates as standard, but it also includes a link to an Internet-based gallery of templates. I selected one of the Internet-based templates and was informed that this feature required that Microsoft verify that my Office installation was "genuine." Sure, whatever, I thought to myself and I hit the "Continue" button. To my shock, Office reported that the validation had failed! My Office installation wasn't genuine!

That's odd, I mumbled. There's a link that takes you to a more detailed explanation of what is wrong with the validation. I clicked on it and the resulting web page told me my Office installation hadn't been activated yet. It said all I had to do to fix it was run any Office app and it would automatically start the activation process. I shook my head and cursed at Microsoft for inflicting this garbage on me. I double-checked Word and sure enough, it claimed that it was already activated. So I concluded that Microsoft's "genuine" validation routine was on crack. I get all my Microsoft software directly from MSDN, so there's no way my Office installation was phony.

I futzed around with the web site a bit more trying to figure out how I could convince the Great Genuine Validation Gods that I didn't steal this copy of Office. No go. I called the tech support number and was quickly transferred to another call center because I got my copy from MSDN. This new call center told me they couldn't help me on the weekend unless my "business" was experiencing a Severity One emergency. Call back Monday, he said. I was irked as hell about this but there was little I could do. So I hauled out my laptop which had Office installed from the exact same CD. It worked fine, I got my stupid template, and I sent my fax.

Due to my employer's current MSDN configuration, it was difficult for me to log a support incident with Microsoft about this so I didn't bother looking at it again until this evening. I suspected that something went awry with my Office install on Vista so I set about trying to "repair" it from the CD. Nope, no good. I was getting ready to uninstall and reinstall as a last resort when I remembered that I had installed Microsoft Project 2007 at the same time I had installed Office Professional, but I had never run Project at all. Hmmm, I thought to myself, I'll bet the Genuine Gods are pissed off because I've never run Project, so they're not going to let me into their special Internet club! So I ran Project (which installed using a different product key than Office and thus needs its own activation), it immediately wanted to be activated, I activated it and promptly closed it. I went back into Word, tried to open an Internet-based template, went through the "Genuine" validation voodoo, and it worked!

So, my question to Microsoft is: why the hell did I have to activate Microsoft Project so that I could download a Word template? In what twisted demon dimension does this bent logic make any sense? Good grief. (But I still think the Ribbon interface is way cool.)

Vista Revisited

As you may recall, several months ago I installed the so-called "Release Candidate 1" version of Microsoft's latest consumer operating system, Vista. You may also recall that it was an absolutely horrid experience and I re-installed Windows XP in a matter of days. Well, Vista was officially released recently, so I decided to give it another try.

During my RC1 trial, there were a few deal-breakers that made me install Win XP again. One of these was my inability to control the jet-engine fan speed on my ATI X800XL video card. With XP, I was using Ray Adams' ATI Tray Tools program, which allows the fan speed (and thus the resulting noise) to be controlled in software. Ray has pretty much declared that he won't be porting ATI Tray Tools to Vista anytime soon, so I had to find another solution. There are no other software solutions, so I tried the hardware route, which consists of replacing the stock cooling fan with a third-party assembly. Much recommended on the Internet was Arctic Cooling's ATI Silencer Rev 5; however, it has been discontinued and is no longer available. My local computer store had a Zalman VF-700 in stock, which reportedly also worked well with my card. After about 40 minutes of work, I had the replacement cooling fan installed, and in its 5-volt mode it is virtually silent. That's one down.

Next, I was distressed that RC1 didn't support my Creative SoundBlaster Audigy 2 sound card, which is connected via coax digital cable to my Logitech Z-5500 speaker system. Creative now has revised "beta" Vista drivers, so I decided to give it a go. As it turns out, the beta drivers work mostly fine and the coax digital out on the sound card works great. That's two down.

Last on my list of deal-breakers with Vista RC1 was the fact that my brand-new (then) Logitech QuickCam Fusion wouldn't work. Logitech released new drivers a day before Vista's official release and the QuickCam now works fine, too. That's three down and none left. So I now have a clean-install of Vista on my home PC, completely replacing XP.

The upgrade wasn't completely without aggravation, however. Vista still refused to recognize my motherboard's integrated Intel RAID array out of the box. I was lucky in that the USB thumb drive I had used to install the driver during my RC1 fiasco had remained untouched since then, so the RAID driver was still on it. The driver loaded without any fuss and Vista installed fine afterward. Next was getting my usual set of software installed, which consists of such things as Microsoft Office and Visual Studio 2005. These are on my hard drive as ISO images downloaded from MSDN. My usual ISO mounting software (Nero ImageDrive) doesn't yet work on Vista, so I had to try something else. MagicISO worked once, but gave me error messages on every restart after that. Virtual CloneDrive seemed to work at first, but threw read errors during an install. For Visual Studio 2005, I had to burn the image to a physical DVD in order to get it installed.

I have a couple of last complaints with my Vista experience so far. As I mentioned above, I have an ATI X800XL video card, and it has the ability to capture video from a composite or s-video source. Well, it HAD that ability before I upgraded to Vista. ATI's official Vista drivers have dropped support for the video-in feature of this card, and there doesn't appear to be any commitment to restore it in the near future. And, while I was going through the Windows Media Center setup procedure, my computer blue-screened during my second 5.1 audio sound test.

Despite these issues, Vista remains on my machine and I have no intention of putting XP back on. Most things work fine, and the supplied Microsoft printer drivers were even able to see and print to my networked Brother MFC-8840DN all-in-one unit (though I can't scan or fax from it yet). I'm confident that driver updates from vendors will fix most of my problems over the next couple of months. In the meantime, Vista is a pleasant enough upgrade from XP, if only in the paint job and visual appearance department. As a colleague of mine (who also owns a Mac) said, "It's like using a Mac, except it has applications."

Plextor DVD Writer - R.I.P.

A couple of years ago I picked up my first DVD writer - a Plextor PX-712A. I was kinda proud of it because Plextor is a premium brand, and most do-it-yourselfers in the build-your-own-PC world tend to use more mainstream and budget "OEM"-type brands such as LG, LiteOn, or Pioneer. Plextor markets itself as "The Leader in Reliable CD, DVD, and Digital Video Solutions," so I felt pretty smug that I had paid a little extra and bought quality gear. It wasn't dual-layer, but back then dual-layer writers were just hitting the market anyway, and even today dual-layer media is still prohibitively expensive.

I really enjoyed using the Plextor. The tray mechanism has a very nice glide motion where the tray slows down just before it reaches the full open or close state and gives the drive a sort-of "luxury" feel. The unit is heavy and full-length, and the status light has different modes for read vs. write vs. power on. Plextor's web site had several firmware updates, all of which were easily applied. Perhaps the feature I liked most was that it was extremely quiet. Overall, it was just a nice piece of kit to own - the kind of kit that gives you that nice pride-of-ownership feeling.

Unfortunately, this feeling was not meant to last. Over the last several months, things just haven't been working quite right. I don't use the drive all that much, so I never really connected the dots until now, but the unit is no longer reliable. When reading DVDs it almost always quits about half-way through with various read errors. Disks that it writes are burned without any errors, but the resulting disks don't play well in other DVD drives. My ancient Pioneer DVD-ROM drive works much better at reading disks, a fact that pisses me off to no small extent. Here I was being Mr. Smarty-Pants by getting a high-end drive that I was sure would last until the end of the earth. Instead, it gets beat by my bought-it-without-a-thought DVD-ROM unit that must be 4 years old by now.

What really irks me is that this is the second time "high-end" gear has failed me. Before this I had a nice Ricoh CD-writer I bought in 1999 that, again, I was sure would last forever. It also refused to read and write data after a couple of years of extremely light use. The difference with the Ricoh was that it came with a good warranty and I was able to get it repaired for free. The Plextor only had a 1-year warranty that is long expired.

Despite this experience, I was tempted to pick up a new Plextor to replace my busted drive. Their gear just feels so nice. However, a couple of factors changed my mind. First, my motherboard is getting a little long in the tooth and only supports two SATA connectors, which are already in use. If I'm plunking down more than $100 for a DVD drive, I ain't buying no old-school PATA garbage. Not that PATA drives are any worse; I just really like the smaller SATA cables and their tidy appearance. Secondly, the local computer store (which is only two minutes from where I work) doesn't stock Plextor drives, but they had plenty of $41 LG drives. And $41 isn't really an investment by any means - it's more of an impulse buy. So, I picked up an 18x LG drive and you know what? It works great, and it's not even that noisy. And I'll bet it'll last forever. Still, I'm going to miss that luxury tray motion.