Introduction
Chances are you've heard that some software package was released today; a new version of Windows or something like that. It's only the biggest software release in the last five or so years, and it has all the hardware and software vendors on their toes trying to prep their drivers and hardware to run on it.
Windows Vista is the biggest shift in the consumer software world since Windows 95. I won't bore you with all the details of what makes the new OS so dramatic a change, but things like indexed searching, security out the wazoo, a new user interface dubbed Aero Glass, integrated Media Center software and DirectX 10 start the feature list.
For our look at Vista and gaming performance, the new DirectX 10, with a completely new graphics driver subsystem, is the most notable change. DirectX 10 adds support for geometry shading (or rather, working on geometry the same way pixels have been worked on for years) and requires hardware developers to move to a unified shader architecture. You can get a LOT more detail on DX10, and the changes NVIDIA made to their GPU architecture because of it with the GeForce 8800 series of cards, by taking a look at my GeForce 8800 GPU review.
What We are Watching For
Because of these dramatic changes to the graphics system, both NVIDIA and AMD have had to spend significant time redeveloping their graphics drivers to work with the new Windows Vista operating system. Both NVIDIA and ATI (now AMD) have been working on Vista development for YEARS, and we have been hearing claims of the best "Vista support" from both camps for nearly as long. But now that the day of reckoning is actually here, who will come out on top?
I took the retail version of Windows Vista Ultimate, grabbed the latest drivers from both AMD and NVIDIA as of Friday, and spent my weekend testing. What is important to note here is that my intention is NOT to compare the ATI Radeon cards against the NVIDIA GeForce cards -- rather, it is to compare the gaming experiences provided by ATI and NVIDIA on both Windows XP and Windows Vista.
Should gamers worry about upgrading to Vista right away, or should they wait for driver stability and performance to catch up with the Microsoft vision? Read on to find out.
AMD's ATI Catalyst Vista Driver
Of AMD and NVIDIA, I was surprised to find that the ATI Catalyst driver team seemed more organized and upbeat about their Vista introduction right off the bat. Having a presentation prepared for the media discussing the latest control center changes, features, shortcomings and future plans gave me the impression that AMD/ATI has been preparing for this day for a LONG time.
Catalyst 7.1 Vista Driver
While Catalyst 7.1 is already available for Windows XP, the 7.1 release for Vista was officially announced just yesterday. It is no longer a beta driver and has full WHQL certification.
The Catalyst 7.1 Vista drivers introduce quite a few new features in addition to Vista support, starting with a new installer program that is easier to use and will eventually allow enthusiasts to download driver updates directly through it. The release also brings Blu-ray and HD DVD support to Vista on AMD GPUs for the first time, a topic of heated debate in many home theater discussion groups. ATI's CrossFire technology is added as well (it wasn't available beforehand), as are some performance and stability improvements over previous Vista revisions.
AMD was forthcoming with information on the current CrossFire implementation, including the lack of OpenGL CrossFire support for now. That means games like Prey and Quake 4 won't be able to use multi-GPU configurations, though all DX9 games should be nearly on par with Windows XP performance levels.
Another big improvement in the Catalyst driver is the speed-up in control panel load times, one of my biggest pet peeves about ATI's most recent drivers. While I didn't bother to time the loading, I can tell you with 100% assurance that the CCC is a much faster application under Windows Vista.
In addition to being faster, it is also functionally improved. A new 3D preview actually gives the user a scene that somewhat resembles a game, as opposed to a company logo or a box car on a flat polygon road.
As mentioned in the first slide, ATI is also bringing some new features to the Vista Catalyst drivers very soon, including support for multi-sampled adaptive anti-aliasing. Using multi-sampled AA instead of super-sampled AA should allow for a nice enhancement in image quality on alpha-blended textures.
Those of you that might dual-boot with Linux when not in gaming mode will appreciate the first Catalyst Control Center for the Linux OS.
Finally, as I also mentioned before, the new download manager in the works from AMD will allow gamers to update only the components of the software that have changed since the release they have installed. This should dramatically lower download sizes and annoying wait times, and with just a single installer application, users will no longer need to hunt through the ATI website to find what they need.
In Practice
In real-world use, gaming on it over a couple of days, the Catalyst 7.1 Vista driver was stable and reliable nearly 100% of the time. The one exception: I was unable to get Prey to run at all, though ATI said they could not reproduce the error. Some other driver issues are still holding this release back from prime-time readiness though:
First, this release is really aimed at the 32-bit version of Vista only for right now; features like HD DVD and HDMI output support aren't scheduled to hit the 64-bit version until Catalyst 7.3 sometime in March. As I mentioned, OpenGL support is a bit lacking, both without CrossFire support and in single-card performance. ATI admits that the "focus of this first release is to deliver a stable OpenGL driver" and that performance enhancements will come in later releases.
Also, as we expected with the Vista release today, even 32-bit gaming performance is a bit behind where it is on Windows XP. As driver developers spend more time with the OS in the wild (now that everyone else can get it and test it), I expect we should get closer and closer to a 1:1 performance ratio.
Of course, for more details on that, check out our benchmarks in the coming pages.
NVIDIA ForceWare Vista Driver and System Setup
Just like ATI has been doing for the past three or so years, every time we sat down with NVIDIA PR people at a meeting, it was reiterated to us how important the transition to Vista was going to be for the industry and how far ahead of the competition NVIDIA was. The time is now, so how do they stack up?
NVIDIA ForceWare 100.54 Vista Driver
Jumping right into the thick of things: NVIDIA is well behind AMD at this point in driver development for Windows Vista. AMD came to me with a Vista driver a couple of weeks ago wanting to know our Vista testing plans; I had to pry a driver from NVIDIA to get testing done in time (and even then I wanted to address more titles, but couldn't for this deadline). Eventually I was given a driver in the 100 series, 100.54, that finally met the promise of Vista gaming on NVIDIA's GPUs.
First the good: the driver was very stable, though not WHQL certified (just a simple prompt in Vista to bypass), and performance in most of the games we played was playable. NVIDIA had both 32-bit and 64-bit versions available for testing, though I only had the 32-bit OS as of this writing.
There are more than a few issues that I felt were glaring omissions from the driver though, the most notable of which was SLI support. As of 100.54, NVIDIA's most prominently marketed technology was not available with the GeForce 8 series, 7 series or anything else. NVIDIA promised me that when the OS went live on Tuesday, January 30th, there would be a publicly available driver that WOULD enable SLI in the "top titles" -- meaning only the top tier of games. I haven't seen it yet, so I'll be just as eager to try it out as you are.
Some significant faults can be found in the TV output capabilities of the driver as well; something that is actually pretty stunning considering Media Center is included with the Vista Home Premium and Ultimate editions by default. HDMI support is basically a wash, with some notable image quality issues, black-and-white-only output on the GeForce 8 series, and only stereo audio output supported. Those of you who read my review of NVIDIA's PureVideo HD technology will remember how pleased I was with the over/underscan correction the driver provided -- but that isn't here in the 100 series of ForceWare drivers for Vista.
The NVIDIA control panel itself remains unchanged from the newest version seen on Windows XP; you can decide for yourself if that's good or bad, though most readers' opinions tend to find it bland and difficult to use.
Quite a bit is missing from the control panel as well, including the wizards for setting up TV output (makes sense, huh?) and multi-display configurations, which are somewhat limited in 100.54 as well.
As for gaming, the driver actually does pretty well! Still, some stand-out "issues" raised in the release notes made my brow perk up a bit. Call of Duty 2 only shows solid colors when running 4xAA set in-game; 4xAA set in the control panel does work though. For the 8800 cards, the Apple 30" Cinema Display is a bust without support for high resolutions, and HDTV output is black and white. Say what?
After going through these lists, I was struck by the notion that this driver seemed incredibly hurried. While the Vista OS is new, and even the 8800 cards are new, there is really no excuse for this state of driver development. The OS has been in development for five or more years, and NVIDIA told us they had been working on G80 for over four years. That seems like plenty of time to get the driver in order, doesn't it?
Regardless, we still played the games we planned to (just without SLI testing) to see how Windows XP performance compares to Vista performance on ForceWare 100.54.
Testing System and Setup Configuration
The test setup was pretty straightforward -- run Windows XP and Windows Vista on the same system and see how performance varied in some popular titles. Our NVIDIA rig consisted of an EVGA nForce 680i motherboard and an Intel Core 2 Extreme X6800 CPU to test both the GeForce 8800 GTX and 7900 GTX cards. The AMD X1950 XTX and X1950 CrossFire tests were run on an Intel 975XBX2 motherboard with the same X6800 processor. Both boards sported 2GB of DDR2-800 memory and a PC Power & Cooling 1-kilowatt power supply.
Testing Methodology
Graphics card testing has recently become one of the most hotly debated issues in the hardware enthusiast community, and because of that, testing graphics cards is a much more complicated process than it once was. Where you might once have relied on the output of a few synthetic, automated benchmarks to make your video card purchase, that is no longer the case. Video cards now cost up to $500, and we want to give the reader as much information as we can to aid in a purchasing decision. We know we can't run every game or find every bug and error, but we try to do what we can to aid you, our reader, and the community as a whole.
With that in mind, all the benchmarks that you will see in this review are from games that we bought off the shelves just like you. Of these games, there are two different styles of benchmarks that need to be described.
The first is the "timedemo-style" benchmark. Many of you may be familiar with this style from games like Quake III: a "demo" is recorded in the game and a set number of frames are saved to a file for playback. When playing back the demo, the game engine renders those frames as quickly as possible, which is why you will often see timedemo-style benchmarks playing back the game much faster than you would ever actually play it. In our benchmarks, the FarCry tests were done in this manner: we recorded four custom demos and played them back on each card at each resolution and quality setting. Why does this matter? Because in the line graphs that show frame rate at each second, each card may not end at the same time -- a faster card plays the demo back in less time, so the FRAPS application records slightly fewer frame rate samples to plot. However, the peaks and valleys and overall performance of each card are still maintained, and we can make a judged comparison of frame rates and performance.
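To make the frame-rate bookkeeping concrete, here is a minimal sketch of how per-second FPS samples can be derived from a FRAPS-style log of frame timestamps. The file name, format and helper names here are my own illustration, not our actual tooling:

```python
# Hypothetical sketch: derive per-second FPS samples from a FRAPS-style
# log containing one millisecond timestamp per rendered frame.
# "frametimes.csv" and this exact format are assumptions for illustration.

def per_second_fps(timestamps_ms):
    """Count how many frames land in each one-second bucket."""
    buckets = {}
    for t in timestamps_ms:
        second = int(t // 1000)          # which second this frame fell in
        buckets[second] = buckets.get(second, 0) + 1
    # Return the FPS samples in chronological order for plotting.
    return [buckets[s] for s in sorted(buckets)]

with open("frametimes.csv") as f:
    stamps = [float(line) for line in f if line.strip()]

fps_series = per_second_fps(stamps)
print(f"{len(fps_series)} seconds recorded, "
      f"average {sum(fps_series) / len(fps_series):.1f} FPS")
```

Note that a faster card finishes the same timedemo in fewer seconds of wall-clock time, which is exactly why its line on our graphs ends earlier.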
The second type of benchmark you'll see in this article is the manual run-through of a portion of a game. This is where we sit at the game with a mouse in one hand, a keyboard under the other, and play the game to get a benchmark score. This method makes the graphs and data easy to read, but adds another level of difficulty for the reviewer: making the manual run-throughs repeatable and accurate. I think we've accomplished this by choosing a section of each game that provides a clear-cut path. We take three readings for each card and setting, average the scores, and present those to you. While this means the benchmarks are not exact to the most minute detail, they are damn close, and practicing with this method for many days has made it clear to me that while it is time consuming, it is definitely a viable option for games without timedemo support.
The second graph is a bar graph that shows the average, maximum and minimum frame rates. The minimum and average are the important numbers here, as we want the minimum to be high enough not to affect the gaming experience. While each individual gamer will decide the lowest frame rate they will tolerate, comparing the Min FPS to the line graph and seeing how often that minimum occurs should give you a good idea of what your gaming experience will be like in that game, with that video card, at that resolution.
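As a simple illustration of how those bar-graph numbers come together, here is a sketch (with invented sample data, not our real results) that averages the minimum, average and maximum FPS across three manual run-throughs:

```python
# Hypothetical sketch: reduce three manual run-throughs to the
# min/avg/max figures plotted in the bar graphs. Each run is a list
# of per-second FPS samples; the numbers below are invented.

def summarize(run):
    """Return (min, average, max) FPS for one run-through."""
    return min(run), sum(run) / len(run), max(run)

runs = [
    [55, 61, 48, 72, 66],   # run 1
    [53, 64, 50, 70, 68],   # run 2
    [57, 60, 47, 74, 65],   # run 3
]

mins, avgs, maxs = zip(*(summarize(r) for r in runs))
print(f"Min FPS: {sum(mins) / len(mins):.1f}   "
      f"Avg FPS: {sum(avgs) / len(avgs):.1f}   "
      f"Max FPS: {sum(maxs) / len(maxs):.1f}")
```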
Our tests are completely based around the second type of benchmark method mentioned above -- the manual run through.
System Setup and Benchmarks
AMD Vista Test System Setup
CPU | Intel Core 2 Extreme X6800 - Review
Motherboard | Intel 975XBX2
Memory | 2GB DDR2-800
Hard Drive | Western Digital Raptor 150 GB - Review
Sound Card |
Video Card | ATI Radeon X1950 XTX (single and CrossFire)
Video Drivers | Catalyst 7.1 (8.333) - Vista
Power Supply | PC Power & Cooling 1000W
DirectX Version | DX10 - Vista
Operating System | Windows Vista Ultimate (32-bit)
NVIDIA Vista Test System Setup
CPU | Intel Core 2 Extreme X6800 - Review
Motherboard | EVGA nForce 680i Motherboard - Review
Memory | 2GB DDR2-800
Hard Drive | Western Digital Raptor 150 GB - Review
Sound Card |
Video Card | NVIDIA 8800 GTX Reference - Review (7900 GTX also tested)
Video Drivers | ForceWare 100.54 - Vista
Power Supply | PC Power & Cooling 1000W
DirectX Version | DX10 - Vista
Operating System | Windows Vista Ultimate (32-bit)
Benchmarks
3DMark06
Battlefield 2
Call of Duty 2
FEAR
HL2: Lost Coast
Prey
I tested these games at 1600x1200, 2048x1536 and 2560x1600, all with 4xAA and 8xAF set in-game. High-end systems like the one we have here demand a minimum resolution of 1600x1200 to be worth your money!
Battlefield 2
Battlefield 2 (Direct X)
Battlefield 2 is one of the first games in quite a while that really pushes current and even upcoming generations of gaming hardware. Having the privilege of being the first game that might need 2GB of memory is either a positive or a negative, depending on your viewpoint. Here are the image quality settings we used:
Our map was Strike At Karkand, which turns out to be one of the most demanding in the retail package in terms of land layout, smoke and other shader effects.
Our first series of tests looks at the NVIDIA 8800 GTX and 7900 GTX experiences under Windows Vista with Battlefield 2. With the 7900 GTX, the gaming experience remains mostly unchanged between Windows XP and Windows Vista, as average and minimum frame rates stay within a close margin of each other. The 8800 GTX isn't so lucky: we see a big drop in performance under Vista at 2560x1600, where the 100 FPS frame cap can't help as much. Here we see a 28% drop in average frame rate, and even at 1600x1200 the minimum FPS drops by 23%, though the average remains very close to the Windows XP scores.
The AMD X1950 XTX scores from Windows XP to Vista remain nearly the same, with a modest 2% average frame rate change. Surprisingly, we saw a 10% INCREASE in the minimum FPS using dual cards in CrossFire mode at 2560x1600! Clearly, AMD's Vista driver performance is MUCH closer to the Windows XP experience than NVIDIA's.
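For reference, the percentage deltas quoted throughout these results are plain relative changes against the Windows XP baseline; a quick sketch with made-up numbers shows the arithmetic:

```python
# Illustrative arithmetic only: percent change of a Vista score
# relative to its Windows XP baseline. The FPS values are made up.

def percent_change(xp_fps, vista_fps):
    return (vista_fps - xp_fps) / xp_fps * 100.0

print(f"{percent_change(100.0, 72.0):+.0f}%")   # -28% -> a 28% drop
print(f"{percent_change(40.0, 44.0):+.0f}%")    # +10% -> a 10% increase
```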
Call of Duty 2
Call of Duty 2 (Direct X)
Call of Duty was the best-selling FPS of 2003, and Call of Duty 2 looks to continue that tradition with a great game style and superb graphics. Below are the graphical settings we used in our testing.
**Optimize for SLI is enabled in all of our testing!
The first mission was used in its entirety as the benchmark.
As mentioned earlier, running 4xAA in CoD2 through the in-game settings on the NVIDIA Vista driver presented us with all the sounds we expected but a big solid grey screen. Fantastic. Switching to 4xAA set in the control panel allowed us to run it, so bear in mind we are comparing control-panel-set 4xAA on Vista to in-game 4xAA on Windows XP.
Looking at the 8800 GTX, the performance of the Windows Vista driver is very uplifting! Scores are actually modestly higher under the new OS compared to our testing under Windows XP. The 7900 GTX isn't as lucky, seeing a 12% drop in average frame rate at 1600x1200.
Again AMD's performance under Windows Vista is a very close match to that of Windows XP both in single card and CrossFire modes.
Half-Life 2: Lost Coast
Half-Life 2: Lost Coast (Direct X)
Lost Coast is the free technology demo that Valve created to show off their HDR rendering technique. Because this method of doing HDR (high dynamic range) lighting uses integer math instead of floating point, both ATI and NVIDIA cards can do both HDR and AA at the same time.
The NVIDIA Vista driver fails to impress under Half-Life 2: Lost Coast. With the 7900 GTX, performance drops by 22% at 1600x1200, while the flagship 8800 GTX drops by 52%, mainly due to a much lower maximum FPS. At the 2560x1600 resolution that we had enjoyed so much with the 8800 cards under Windows XP, NVIDIA's Vista performance is 43% slower, the minimum FPS is 25% lower, and gameplay was noticeably choppier because of it.
AMD's X1950 XTX performance in Lost Coast under Windows Vista is nearly a mirror image of that under Windows XP.
Power Consumption and Conclusions
Power consumption was measured at idle on the desktop and under load while running 3DMark06 at 2560x1600. For these tests, I placed all the cards into the 975XBX2 motherboard to offer a common platform.
My power consumption testing confirms what I had been theorizing for some time about the introduction of the new Aero Glass user interface: idle power consumption has gone up slightly across the board. Since the new Aero Glass UI uses DX9 technology to draw everything you see on screen, the GPU is a lot less "idle" than it was before. Still, a modest 5-7 watts isn't much to worry about.
Load power consumption hasn't changed from XP to Vista, and that also makes sense, as the Aero interface isn't running during full-screen gaming.
Conclusions
Judging from what we have seen thus far, the gaming performance available to enthusiasts on Windows Vista is somewhat mixed. Both NVIDIA and AMD have areas where their cards perform well and others where they do not, and features haven't translated over from XP to Vista quickly either.
Performance
Looking first at the performance of the ForceWare 100.54 driver with the NVIDIA GeForce 8800 GTX and 7900 GTX, I was very obviously let down by the Windows Vista gaming experience. I didn't get to test as many games as I would have liked for this review, due to the tight time constraints imposed by driver availability, but of the games we did go through, only one was what I would call a "win" for NVIDIA: FEAR. In FEAR, performance under Windows Vista was nearly at the levels we had seen in Windows XP and didn't present any compatibility or stability issues. I suppose Call of Duty 2 performance was also acceptable, even though we had to use control panel AA settings instead of in-game AA settings.
The experience I had with AMD's first Vista driver was much more positive. Most of the games I tested proved to be very close in performance to what we expected from Windows XP; the exceptions were FEAR and Prey. Obviously, Prey not loading at all on our Vista test system presents a BIG problem, but AMD is confident they'll find the problem and fix it fast; I'll let you know when that happens. FEAR performance was a letdown as well, though with the maximum FPS at 46 across the board in our tests, chances are this can be fixed pretty easily too.
Overall, in terms of performance, NVIDIA is lucky that the GeForce 8800 series of cards is available while AMD's R600 is still behind closed doors. The raw power of the G80 core keeps the 8800 GTX as our performance leader in Windows Vista, even with the performance problems mentioned, when compared to AMD's flagship ATI X1950 XTX card.
Driver Features
As mentioned on the first pages of this article, the driver features comparison between ATI and NVIDIA is pretty dramatic. AMD got the ATI Catalyst 7.1 driver not only fully functional, but added new features like Blu-ray and HD DVD support and a new control center that loads faster and has better previewing capability. They fully admitted that OpenGL performance was going to be lower than what we saw in Windows XP (though not working at all wasn't in the books!), but with CrossFire working exactly as it did under Windows XP, the AMD ATI Catalyst driver seemed pretty refined and ready for the spotlight.
NVIDIA's ForceWare 100.54 driver, on the other hand, was more or less a mess. SLI support was not enabled, and as of 10am on the 30th, the day Vista was released, it still wasn't ready. That is a very big letdown for any enthusiast gamer who put their hard-earned dough into NVIDIA technology. TV output and HDMI support are pretty much a wash, and several bugs stand out, making this driver revision seem rushed and hacked together. As I complained about earlier, how can a driver for a product in development for four years (G80), on an OS in development for what seems like forever, NOT be ready on launch day?
Final Thoughts
It may seem like my Vista gaming performance testing all resulted in a feeling of doom (nope, no OpenGL CrossFire support!) and gloom, but don't let that scare you off just yet. I think we all expected some initial growing pains with the Vista operating system and PC gaming, simply because of the dramatic shift in driver technology that had to take place; I just don't think we expected it to be this bad. AMD's driver development team is definitely a leap ahead of NVIDIA's, as the initial ATI Catalyst release offered a gaming experience in the new OS much closer to that of Windows XP than NVIDIA's initial ForceWare release did. This may change as driver revisions come through the coming months, so we'll definitely be keeping an eye on both companies' progress.
For now, if you're a gamer interested in running off to get a copy of Windows Vista, I'd caution you to take a minute and contemplate. Gaming under Vista is definitely possible, and if you're comfortable with some slight performance drops for now in exchange for Vista's other new features, then a move sooner rather than later is worth considering. If gaming performance is the only metric for your PC, then I'd definitely hold out on upgrading until AMD and NVIDIA have their software perfected.