A Look at Windows XP Service Pack 3 Part 1: Good Enough?

Rather than write a traditional review of Windows XP Service Pack 3 (SP3), I thought this might be an opportune time to reevaluate XP’s standing in the Windows world. After all, virtually every technology pundit on earth has described Windows Vista as operating system non grata, an upgrade to be avoided at all costs. Over at the tabloid-o-rific InfoWorld, a “Save XP” petition has garnered 100,000 signatures. Sure, that pales next to the 120+ million people who are using Windows Vista at this time, but what the heck. There must be something to this. Is XP really good enough to warrant saving?

To find out, I did a clean install of Windows XP with a near-final version of Service Pack 3, the final XP service pack. I then installed a number of applications and technologies from Microsoft that bring XP roughly up to speed with Windows Vista, including Internet Explorer 7, the various Windows Live suite tools, Windows Defender, Windows Media Player 11, and Office 2007. And then I installed my stock group of preferred applications on the system, including AVG Free Antivirus, Apple iTunes, Firefox 3 Beta, Adobe Photoshop Elements, WinRAR, and several others.

And you know what? I get it. I get why all those people are freaking out over the impending end of the mainstream availability of Windows XP, why so many are clamoring for Microsoft to give its previous generation OS another chance. And it has nothing to do with any faults in Vista, real or imagined. (Mostly imagined, actually.) No, it has everything to do with human nature. We’re creatures of habit, you and I. And even the most forward-looking of us, those who enjoy living on the edge, technology-wise, have a very natural need to be in the comfort zone sometimes. And XP is just that, comfortable, like that ratty old sweatshirt that we should have thrown out years ago but just can’t bear to replace.

Don’t get me wrong, though: It’s not like anything’s broken here. Windows XP has plenty of life left in it. This system will run comfortably on far less demanding hardware than any version of Windows Vista, and that alone means this system will be around for years to come. (Not surprisingly, XP also provides better performance in virtual machines.) After all, PCs last a lot longer than they used to, and while Microsoft and Intel wrestle with the fact that the clock-speed gains of the Moore’s Law era have given way to a generation of under-utilized multi-core CPUs, everyday users are noticing that they can get a lot more out of yesterday’s software than was possible a decade ago during the Megahertz wars. This is big news for the industry, and for the billions of people who use PCs every day.

No, clinging to Windows XP is entirely understandable. I typically advise against upgrading to Vista on current, XP-based hardware. After all, not only will XP likely run more quickly on your existing hardware, but you’ll probably also experience better compatibility, both with software and hardware devices. The old maxim is as true as ever: Unless you’re an expert, you should simply adopt the latest Windows version when you purchase a new PC. This discussion raises a new question, however: Should you opt for XP over Vista on a new PC? (Though to be fair that question will become academic this summer when XP is no longer available in this fashion.)

My answer to this question is no, you shouldn’t. Instead, you should choose Windows Vista, for the many reasons I’ve outlined in my Windows Vista review (check out the final part of that review for a quick list of reasons why I think Vista is a big deal and a huge improvement over XP).

But that’s not what this article is about. No, this is about those of you who have elected, bravely, foolishly, or otherwise, to stick with XP. So let’s take a look, a last look at Windows XP, at least on this site, which is, after all, dedicated to the future of Windows, and not the past. But I think it’s OK to take a step back and see whether what’s old can be new again. After all, that’s what Windows XP Service Pack 3 is all about.
Windows XP and the Vista conundrum

In delaying Windows Vista for over two years, as Microsoft did between 2004 and 2006, the software giant exacerbated the problem it always has getting customers to upgrade, and it did so once inadvertently and once on purpose. The inadvertent bit was that the longer Vista was delayed, the more comfortable XP became to users. Though XP suffered from the same performance, stability, and compatibility issues that dogged Vista in its first year (and, let’s not forget, please, that XP also suffered from a range of extremely high-profile security issues of a kind that has never plagued Vista, thank you very much), enough time passed that people simply forgot. Anyone buying a new PC during 2005 and 2006 discovered, perhaps to their amazement, that things actually worked pretty well most of the time. This kind of experience may be commonplace in the tightly controlled Mac OS X environment, but in the willy-nilly world of Windows, where any third-rate second-world company can and will ship a painfully bad device driver at the drop of a hat, this level of stability and reliability was a new phenomenon. Windows XP simply got more mature over time, in ways that were never possible with previous versions of Windows. For the first time ever, time effectively slowed in the computer industry. The upgrade cycle pretty much ended for a while there. (Further evidence of this evolutionary mindset can be seen in the number of XP-based OSes Microsoft shipped in this time frame, including various versions of XP Media Center and Tablet PC Editions.)

More purposefully, Microsoft also screwed over Windows Vista. As Vista was delayed again and again, Microsoft realized that it would be a mistake to tie the success of key new technologies, originally slated to be Vista-only, to Vista’s fate. So it back-ported a number of those technologies to XP. These include, among others, Windows Defender, Internet Explorer 7, Windows Presentation Foundation, Windows Communication Foundation, .NET 3.x, the Windows Security Center, Windows Media Player 11, and even Office 2007. (Remember, Office 2007 was originally going to be Vista-only, was then going to offer unique Vista-only functionality, and was finally changed so that it worked identically on Vista and XP.) Microsoft also dramatically detuned some key Vista features, like Instant Search, while cancelling related technologies such as WinFS. In short, Vista became less exciting over this time period whereas Windows XP became more and more capable. I understand why Microsoft made these decisions and I may even agree with most of them. But the net effect should have been predictable: By not drawing a clearer line between XP and Vista for much of its next-generation technologies, Microsoft in effect created a situation where XP didn’t become obsolete as quickly as did previous Windows versions. The goal was admirable and understandable: Those technologies would achieve greater success due to their exposure to a larger audience. But Vista suffered as a result.

Couple this strategy with the Vista delays and Microsoft’s inability to capitalize on multi-core hardware (another way in which Vista could have differentiated itself from XP), and suddenly XP becomes that comfortable old sweatshirt I discussed previously. I have no doubt these events will be closely studied by both Microsoft and various business schools in the future. To say that this was a lost opportunity is an understatement.

So here it is, in 2008, four long years after Microsoft shipped the last major update to Windows XP (Service Pack 2, which could and should have been marketed as a completely new Windows version). Microsoft may have originally wanted to ship Windows XP SP3 long, long ago, but the Windows division got all caught up in this little project called Windows Vista, so XP SP3 was sort of cast to the side and forgotten. Well, forgotten by Microsoft, that is: The company’s biggest and most important customers–big businesses–seemed poised to settle on XP for the next decade, and they were getting a bit prickly about all the post-SP2 hot-fixes that Microsoft had shipped over the previous three years. It seems these things are a bit time consuming to install, and they were interested in getting that promised next service pack, which would roll up all the previous fixes into a single, convenient update.

Microsoft was quiet about SP3 for a long time, but last year the company finally owned up to the fact that it would indeed develop SP3 and ship it sometime in 2008. And sure enough, SP3 nicely rolls up all of the previously released hot fixes, providing a more seamless (i.e. less complicated and time consuming) install experience. There are a few new features, but nothing significant unless, again, you’re one of those big businesses Microsoft is so concerned about (see my XP SP3 FAQ for details). As with Windows Vista SP1, XP SP3 is a traditional service pack, more about rolling up previous hot-fixes than about new functionality. And in XP’s case, specifically, that’s just fine because XP, as noted previously, has already gotten a new lease on life. Heck, practically anything that’s available in Vista is available on XP now too, right?

Well, not exactly. But this isn’t a matter of whether all of Vista’s useful features and functionality are being made available on XP. It’s a matter of whether enough of Vista’s useful features and functionality are being made available on XP. In other words: Is XP still good enough? No, XP with SP3 isn’t as “good” as Windows Vista, but remember that it doesn’t have to be. It only has to be good enough. And maybe it is. It’s certainly good enough to make people forget all about Linux on the desktop. It’s proven good enough to keep people from switching to the Mac in dangerous numbers. And it appears to be good enough to make customers look at Vista and say, eh, there’s not enough there there.

And that’s a problem, at least for Microsoft and its current and future platforms. Because in this case, I think the company has kowtowed a bit too much to those who would see XP live forever. It cut a bit too deep from Vista and gave a bit too much to XP. Microsoft will tell you that this doesn’t matter. A Windows license sold, after all, is a Windows license sold. But that’s absolute baloney. If customers are standing pat on the previous version, that means they’re not sold on the company’s technological vision, and they’re no longer lining up as Microsoft tries to lead them to the future. I mean, imagine a case in which customers were allowed to choose between a previous generation Toyota Camry and the all-new, designed-from-the-ground-up 2008 model, and the customers actually chose the old version by a roughly 2-to-1 margin, despite the fact that the price hadn’t changed at all. This would be devastating to any car maker. I believe it’s devastating to Microsoft for the same basic reasons.

But enough business theory. What I’m really concerned with here is how this affects you, the Windows user. And the question I put before you, again, is … Is Windows XP good enough?

Saying Goodbye To Old Technology

A reader recently made an interesting point: Windows XP, to his mind, was the tech story of the decade. He’s probably right. Microsoft has never made an OS of any kind with this lengthy a life cycle, and XP has lived on in the face of two major upgrades, Vista and 7, both of which were designed to obsolete it. But the success of XP has a dark side as well. And with most businesses still standardized on this Windows version, XP’s problems are starting to outweigh the benefits.

Part of the problem is that XP still ships with wildly outdated non-core technologies, many of which are becoming favorite targets of hackers. Key among these are Internet Explorer (IE) 6 and, less obviously, Adobe Flash 6.

I’d be surprised to discover that I needed to defend my contention that IE 6 is arguably the most dangerous software any business could have deployed throughout their environment today. But it bears repeating: The web is the number one vector of electronic attack, and IE 6 was built for a different decade and, more important, before Microsoft’s Trustworthy Computing initiative. Put simply, it’s just not safe to use.

The problem, of course, is that IE 6 is in fact still widely used. And this is despite two major IE upgrades, IE 7 and IE 8, both of which are dramatically more secure and dramatically more functional. (These two newer IE versions aren’t perfect, however. In the recent electronic attack on Google and other companies that emanated out of China, a vulnerability in IE 6, 7, and 8 was allegedly used. This raises a separate question: Does it make sense for any security-conscious business to use IE at all?)

So the possibilities of hacker attacks against IE aren’t all that surprising. But many admins may not even realize that Windows XP ships with a hugely outdated Flash version. In fact, it’s so old that Adobe has shipped four major updates to the software since XP first arrived. It’s now up to version 10.

Because multiple vulnerabilities in Flash 6 can be targeted by hacker attacks and result in remote code execution exploits, Microsoft recommends that XP users update to the current Flash version. Common sense, right? But in the upgrade-averse corporate world, I have no doubt that millions of machines will continue forward unprotected.
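The core of any inventory audit here is a simple staleness check: compare what’s installed against the current release. A minimal sketch, in Python, of that comparison; the helper names are hypothetical, and the dotted version strings are illustrative (a real admin script would pull the installed version from each machine’s registry):

```python
# Sketch: decide whether an installed component (e.g. Flash) is out of date.
# Version strings are illustrative; real ones would come from the registry.

def parse_version(version):
    """Turn a dotted version string like '6.0.79.0' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def needs_update(installed, current):
    """True if the installed version is older than the current release."""
    return parse_version(installed) < parse_version(current)

print(needs_update("6.0.79.0", "10.0.32.18"))    # Flash 6-era build: prints True
print(needs_update("10.0.32.18", "10.0.32.18"))  # already current: prints False
```

Tuple comparison does the right thing here because Python compares tuples element by element, so "6.0.79" correctly sorts below "10.0.32" even though a naive string comparison would not.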

A new level of vigilance is required here because as OS vendors like Microsoft have done increasingly good jobs of protecting their customers, hackers have moved on to other attack vectors, including application software like IE, Office, and Adobe Reader and Flash. The popularity of such attacks makes sense; each of these solutions is used by hundreds of millions of users every day.

But when businesses are only slowly updating the technologies installed on users’ PCs–or not updating them at all–the situation is exacerbated. And the attack surface of your environment grows ever bigger.

I mentioned earlier that XP’s benefits–compatibility, familiarity, performance, and, let’s face it, the fact that it’s often already paid for–will soon be outweighed by problems inherent to using an OS that’s almost a decade old. These problems become all the more dangerous when combined with hackers’ new emphasis on unpatched applications as well.

The obvious way to mitigate many of the resulting problems is to upgrade. But as you’re all too well aware, upgrading comes with its own problems, not the least of which are the financial, training, and support costs. But as we’ve discussed over the past few weeks, this is a unique moment in time, and the ideal time to not just change for change’s sake, but to upgrade in ways that make sense. And that means reevaluating what’s installed on users’ computers, which cloud computing services you can perhaps take advantage of, which systems can be virtualized and centrally controlled, and so on.

But at the very least–that is, working within the confines of the systems you currently use–please be sure to thoroughly evaluate the software solutions you have running within your environments and ensure that they are all at least updated with the latest security fixes. We can’t all handle electronic attacks as well as Google apparently did in the recent Chinese situation. But we can at least do the minimum.

Hands-On with Windows Small Business Server “Aurora”: Notes from the Field

I’ve written about Microsoft’s new Small Business Server offering, code-named Aurora, a lot over the past few months. (Most recently in Windows Small Business Server “Aurora” Release Candidate.) Since then, I’ve been using Aurora as the basis for a new domain in my test environment, and I’ve spoken with Microsoft further about the product. Here are some notes from my time using Aurora over the past several weeks.

It’s simple. Aurora is simple, maybe too simple. That’s because it’s designed for the smallest of small businesses, where there is not only no IT staff but perhaps not even someone who is particularly knowledgeable about computers. Aurora needs to work in these largely unmanaged environments, and my take on this is that it will do so just fine. But Aurora really comes into its own when it’s run by someone who understands Active Directory and Group Policy: Push a bit beyond the surface UI and the entire Windows Server management infrastructure is there, waiting to be unlocked. A bit more about this in a moment.

What’s confusing about Aurora, to me, is that it’s not clear what’s going on under the covers. Setup literally involves just a handful of steps, and the most taxing question you’ll answer is the name of the new domain you wish to set up. Unlike with traditional Windows Server versions, Aurora assumes you’re going to install it behind a home-like broadband router, like you’d get from a cable or DSL Internet provider, and it doesn’t assume that it will be providing DNS or DHCP services.

On the client end, Aurora also simplifies the process of connecting users (and their PCs) to the domain by providing a very basic web-based installer, which can be found at https://server-name/connect. By default, users will connect via this interface, provisioning a new domain user account and PC in the process. If you’re used to micromanaging this, it can be a bit off-putting.

Launchpad can be too chatty. Users can optionally use the Launchpad software on their PCs, which provides them with PC- or network-based health alerts. Even in my small environment, these alerts have proven way too chatty, in my opinion, and users who are tasked with keeping the environment in a steady, healthy state will quickly become overwhelmed by their sheer number. These alerts change the Launchpad’s notification icon to yellow when there’s an issue to resolve (a computer has important updates to install) or red for something very serious (a computer has a hard drive that is nearly full, or an out-of-date anti-malware solution).

Aurora is particularly prickly about getting the remote access and server backup capabilities online and working properly. If you haven’t configured both, you’ll be notified somewhat incessantly. For some reason, it’s easier to disable client backups than server backups.

Shares are confusing. Aurora, like the Windows Home Server products on which it is based, comes with a set of default shares that provide access to server-based storage over the network. However, this isn’t a home server, so the shares aren’t designed around media content like music, photos, and videos. Instead, you get shares called Users and Company. Fair enough, but I found the process of creating new shares–and worse still, removing unwanted shares (Company??)–too difficult, and I suspect the IT-less Aurora user base will as well.

What’s really different, however, is that Aurora (like “Vail,” the next version of Windows Home Server) also creates a unique drive letter in Explorer for each share. So the Company share can also be accessed, on the server, by the Y: drive, and the Users share is found at W: too. Why is this, I asked? According to Microsoft, these mapped drive letters make it easier for users to find the shares in the file system (they were previously somewhat hidden away in D:\Shares in WHS v1). And they’re easier to back up as well.

OK, fair enough. But there’s no UI for deleting shares and their accompanying drive letters simultaneously, at least not in the standard Aurora management UI, called the Dashboard.
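Under the hood these are just standard mapped network drives, so the same mappings could be recreated by hand with the standard Windows `net use` command. A minimal sketch that composes those commands; the server name `AURORA` is a placeholder, while the share-to-letter pairs are the defaults described above:

```python
# Compose the `net use` commands that would recreate Aurora's default
# share-to-drive-letter mappings by hand. Server name is a placeholder.
SERVER = "AURORA"

# Default shares and the drive letters Aurora assigns them on the server.
DEFAULT_SHARES = {"Company": "Y:", "Users": "W:"}

def map_command(share, letter, server=SERVER):
    """Build the Windows command that maps a share to a drive letter."""
    return f"net use {letter} \\\\{server}\\{share} /persistent:yes"

for share, letter in DEFAULT_SHARES.items():
    print(map_command(share, letter))
    # e.g. net use Y: \\AURORA\Company /persistent:yes
```

The `/persistent:yes` switch makes the mapping survive reboots, which is presumably the behavior Aurora’s built-in mappings provide automatically.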

Extensibility is the key. Where Aurora is really going to prove impressive is via its new add-on model. The problem, for now, is that there are no Aurora add-ons to test, though there is a hint in the UI that one is coming for Microsoft’s Business Productivity Online Suite, or BPOS, which provides hosted versions of Exchange, SharePoint, and other Microsoft servers. I’m eager to try such an add-on, but I’ll need to wait. So I asked about other add-ons that may be coming.

From the look of things, it’s going to be pretty extensive. Microsoft plans add-ons for on-premise servers, like Exchange 2010, as well as for hosted services, and there is a very interesting set of management add-ons coming that will simplify Group Policy and other management features. Also coming are add-ons that will negate the very real need, currently, to log on to the server via Remote Desktop Connection to perform many management tasks. Those will all be exposed through the Dashboard in the future, which will be a nice change.

Microsoft is also going to offer something called a Premium Add-on for Aurora that will provide a fully licensed copy of Windows Server 2008 R2 and SQL Server 2008 R2. This provides features not found in Aurora–like Hyper-V–and lets you extend your new domain with a second domain controller. And it is a second machine: Contrary to the upgrade possibility suggested by its name, the Premium Add-on is installed on a second server, not over the existing Aurora box.
Summing up…

Put simply, Aurora lives up to the needs of its stated mission and will be an excellent solution for very small businesses. But where this thing will really come into its own is in the hands of someone who really knows their way around AD and GP. Just implementing something like folder redirection, for example, could make a huge difference for users. These kinds of capabilities, I think, will be out of reach for typical Aurora users, however, unless they sign up with a forward-looking Microsoft partner that understands the product and the ways in which they can provide value on top of the base package. Yes, Aurora is in many ways a starter server. But lurking underneath the hood is (almost) the full power of Windows Server, just waiting to be unleashed.

Revolution, Not Evolution

That we’re in a time of great transition is, of course, obvious. That the future of computing is both mobile and connected, also obvious. Not so obvious is how painful this transition is proving to be. And that pain points to the fact that we’re undergoing a technology revolution, and not an evolution.

The two most obvious examples of this technology revolution are cloud computing, an ill-defined and rarely understood technology if there ever was one, and the rise of smart phones, devices that fit in our pockets yet are more powerful than the average business PC desktop from just a few years back.

IT has feared cloud computing from its inception. The reason is simple: Those in IT see the trend towards off-premise computing as a threat to their job security. This is completely understandable and, unfortunately, probably at least partially correct. But IT has always been about efficiency, and one might make the argument, as I have, that by definition anyone in this business should be constantly expecting to evolve their skills as their jobs, and the technology they use, evolves as well.

You’re not alone in this fear. Years ago, a former editor questioned me about cloud computing, which was at the time an emerging trend at best. “What happens to the magazine’s audience?” she asked. “Will their jobs simply disappear?” I told her that the jobs would basically shift from the many on-premise data centers that we traditionally associate with the enterprise to a smaller number of much larger data centers, maintained by giant corporations like Microsoft and Google. The need to maintain these servers would still be there, however, I said.

Turns out, this supposition wasn’t far-reaching enough. Actually, there is still much need for IT inside of the enterprise, and that will continue to be true even if cloud computing grips the world in a frenzy of cost-cutting and off-premise, services-based infrastructure. That’s because businesses, even those that fully embrace cloud computing, will need to maintain some managed resources onsite and connect them, or federate them, with the hosted services. Long term, that scenario will evolve as well. But the need for traditional IT skills isn’t going anywhere. It’s just changing, evolving as always.

I have bigger concerns around mobile computing, specifically the proliferation of unmanaged smart phones that are now sweeping the world. As consumers move to smart phones like the iPhone and Google’s Android in ever-greater numbers, they’re expecting to be able to mix and match their work and home needs, all on one device. And many enterprises–too many, in my opinion–are simply giving in and allowing these users to utilize their own phones, accessing crucial corporate data via Exchange Server and other means.

The rationale behind this change is simple: Businesses believe it’s cheaper to do this. After all, it’s expensive to purchase, maintain, provision, and deploy smart phones. And if employees are just going to buy these devices themselves and shoulder the roughly $100 a month costs associated with their calling, messaging, and data plans, why bother offering to do this for them? That’s money in the bank, right?

So all around the world, corporate blocks to internal data have come down to accommodate these devices. It’s a glaring hole in the defenses against data loss, and one that I think will come back to haunt many companies in the future as users lose their unprotected smart phones or find them to be the subject of outright theft. It’s a lot easier to leave a phone behind than a laptop, and laptops get lost on an alarming scale. I think we’re only at the very tip of this smart phone era and only just starting to understand the importance of managing devices properly.

The silly thing here, of course, is that the technology to properly manage smart phones has been around for a long time. But in some ways, these tools were a bit too forward-leaning, as they were tied to devices, like those based on Windows Mobile, that were archaic and not user friendly. I expect these tools to be more broadly deployed in the future as businesses understand they’ve unwittingly given away the keys to the kingdom, and as the technologies themselves evolve to manage more desirable devices.

I spent some time huddling with my compatriots at Windows IT Pro last week, debating these and other sea changes that will forever affect our businesses and the ways in which we manage technology. There’s a lot to think about here, but I was left with one overarching thought, and hopefully this will be at least slightly comforting to anyone who supports technology for a living. Yes, everything is changing, and yes, change can be frightening. But the need for IT–good IT–is stronger now than ever before. And as for this revolution into an era of connected and mobile computing, it won’t happen successfully without you.