Get the latest in PC and console gaming hardware news and hands-on testing reports from GameSpot's Hardware Insider.
Apple's iPhone 4S brings a number of new features to the table, but only a few are relevant to gaming. Chiefly, the A5 processor and its GPU increase the capabilities of the phone considerably. The dual-core A5, the same chip found in the iPad 2, offers twice the computing power of the iPhone 4. But the new GPU, a PowerVR SGX543MP2, is where it's really at, offering a seven-fold increase in graphics capability compared to the iPhone 4. Of course, it's all about the developers who take advantage of more-powerful hardware. Until now, the A5 has appeared only in the iPad 2; its arrival in the iPhone 4S gives developers a greater impetus to create graphically complex games for a much broader audience. The upcoming Infinity Blade II already makes use of the A5's processing abilities.
iOS 5, the mobile operating system that powers Apple's mobile devices, will also bring Game Center updates with it. Game Center was initially released on iOS 4, and it acts as a social network for games within the Apple world. It records high scores, doles out achievements, and basically lets you compare and share your accomplishments with your friends. But as with Ping--Apple's social networking service for music--there was no way to import friends lists from third-party services, which turns out to be a big stumbling block for any new social network. The iOS 5 update for Game Center solves that problem. Game Center on iOS 5 will also let you add a profile picture, browse games without leaving the app, and give you an overall achievement score compiled from all the games you play.
iCloud Game Saves
iCloud, which is also bundled within iOS 5, brings some interesting changes for gaming to Apple's devices. Using Apple's iCloud service, developers can let gamers store gameplay data profiles in the cloud, along with saved games. Cloud-saved games should be a boon to gamers with multiple iOS devices (iPhone, iPad, iPod Touch). With any luck, it will solve the issue of having to start games anew on each device. Developers will have to individually build cloud support into their apps, so it won't be an instantaneous upgrade. Apple will give 5GB of free storage to all users, and additional storage can be purchased as needed: an extra 10GB for $20 a year, 20GB for $40 per year, and 50GB for $100 per year. The service will also sync contacts, calendars, documents, photos, and data from various applications.
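The paid tiers scale linearly, as a quick sketch of the math shows (tier sizes and prices from above; the helper names are ours, not part of any Apple API):

```python
# Apple's announced iCloud storage tiers; the functions are our own
# illustration, not an Apple interface.
FREE_GB = 5
TIERS = {10: 20, 20: 40, 50: 100}  # extra GB -> USD per year

def cost_per_gb(extra_gb):
    """Yearly price per additional gigabyte for a given tier."""
    return TIERS[extra_gb] / extra_gb

def total_storage(extra_gb):
    """Free allowance plus the purchased tier."""
    return FREE_GB + extra_gb
```

Every tier works out to a flat $2 per GB per year on top of the free 5GB.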
Additional iPhone 4S and iOS 5 Updates
The iPhone 4S comes with an 8MP camera that has quick shutter speeds and can record 1080p video. Two antennas improve the phone's reception and bandwidth. Apple also demonstrated Siri, a virtual assistant you can talk to for basic answers, like stock quotes, weather updates, contact information, and more. iOS 5 also brings with it upgrades to the notification system, a new messaging service, reminders, Twitter integration, and an easier-to-use camera.
The iPhone 4S 16GB will cost $199 with two-year contracts. 32GB and 64GB variants will cost $299 and $399, respectively.
The iPhone 4S will be available on October 14, and iOS 5 will make its way to the public on October 12. The phone will be available on Sprint, AT&T, and Verizon. For more iPhone 4S coverage, check out CNET. (Images courtesy of CNET.)
The process of capturing video game footage has come a long way. Before there were reasonably priced options to play with, the solution was simply to mount a camcorder in front of the TV. It got the job done, but the grainy recordings and quiet tapping of buttons in the background weren't exactly desirable. The initiated could cobble together a good capture station, but the costs would still be quite high. More recently, quite a few products have popped onto the market at rather low prices. We've zoomed in on two products that get the job done for around $100.
Roxio's $100 GameCap is simplicity in a box the size of a deck of cards. The device plugs into your USB port and lets you connect a single console via component cables. The GameCap doesn't record at HD resolutions, but the easy-to-use controls and interface speak for themselves.
Getting a little fancier, the $130 Hauppauge Colossus gives you quite a bit more flexibility, but with an added dash of complexity. It can capture footage up to 1080i over HDMI and component video. As an add-in card for a desktop computer, it's also not very portable.
The Roxio GameCap is about as difficult to use as a bag of microwave popcorn. You simply plug the USB end into a computer and drop in the component video cables from your console. Installing the software is just as trouble-free. Our only gripe is that you have to save the CD installation sleeve: if you lose it, you can't install future downloadable updates or the files off the original driver disk. This seems rather excessive considering that you need the hardware in the first place.
Getting the whole kit working didn't take us more than a few minutes. After flicking a few power switches and firing up the Roxio GameCap software, we had stutter-free video and audio on our computer monitor. No fiddling, no settings, nothing. It's pretty hard to get lost using the included software. The software doesn't let you go full screen if you want to use your monitor to play on, but you can maximize the window to get a larger viewable area. The GameCap box also has component video-out for use on an external monitor or HDTV.
Inputs: 1x component video/analog audio
Outputs: 1x component video/analog audio
There's a big green button in the Roxio GameCap program that says "Start Capture." You won't find any resolution or encoder settings to fiddle with beyond choosing whether you want a WMV, DivX, or AVI file as the output. This makes the GameCap very easy to use but also limits you to the quality settings built into the program. Videos come out at 848x480 regardless of what resolution you set on the console, and screenshots are pulled at 720x480. Audio is captured in stereo. The GameCap software uses a high level of compression that results in small files but adds a considerable amount of blocking and artifacting to the visuals. In our test video, the sky gets completely blown out using appropriate brightness settings. The screenshots are also taken with an incorrect aspect ratio, which results in a slightly stretched image.
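The aspect ratio problem is simple arithmetic: the 848x480 video is roughly 16:9, while a 720x480 screenshot is 3:2, so grabs get distorted when shown at the widescreen ratio. A quick sketch (function names are ours):

```python
def aspect(width, height):
    """Width-to-height ratio of a frame."""
    return width / height

def corrected_width(height, target=16 / 9):
    """Width a screenshot would need at the given height to
    match a target aspect ratio."""
    return round(height * target)

video_ratio = aspect(848, 480)  # ~1.77, close to 16:9
shot_ratio = aspect(720, 480)   # 1.5, noticeably narrower
```

`corrected_width(480)` works out to 853, so a 720-pixel-wide grab has to be stretched roughly 18 percent horizontally to look right.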
Roxio's GameCap video editing software lets you trim videos, combine them, replace music tracks, add narration, and quite a bit more. The interface is clearly laid out, and the drag-and-drop functionality makes it intuitive. Once you're done editing, the software will render the file; how long that takes is a function of how powerful your CPU is and how complex the file is. Built-in export functions let you make stand-alone videos and post them to YouTube, Facebook, and WeGame with a few clicks. Overall video quality is nothing to write home about, and black levels are pretty bad due to compression.
The GameCap will come in handy for those folks limited to laptops. It's cheap, easy to use, highly portable, and it provides passable video quality. For better video quality, there are products like the Hauppauge HD PVR, but they also cost twice as much.
Priced at around $130, the Hauppauge Colossus costs a bit more than the Roxio GameCap, but it also does a heck of a lot more if you have a desktop that you can plug it into.
The Hauppauge Colossus requires an open 1x PCI Express slot and fits only in full-height desktop cases; slimline computers need not apply. If you have any experience installing video cards, the Colossus won't prove any more difficult.
Inputs:
1x component video/analog audio
1x optical audio
Optional daughter card - 1x S-Video and 1x composite video
1x RF receiver
Outputs:
1x component video/analog audio
1x optical audio
Hauppauge includes connecting dongles for component video and everything to get you going using them; you'll have to supply your own HDMI cables if you want to go down that route. Some models of the Colossus include an HDMI splitter. The Colossus supports S-Video and composite inputs, but the necessary daughter card isn't included in the box. You'll have to purchase it separately for $15, which isn't a terrible loss since most of us won't be capturing video from devices limited to those connections. Another benefit of the Colossus is its ability to record multichannel audio from sources over optical and HDMI inputs. The Colossus will not record over HDMI from devices that have HDCP (copy protection) enabled. In practical terms, this means you won't be able to record PlayStation 3 footage over HDMI, but component video will work fine.
The Colossus comes with two software titles that you can use to record video with. WinTV v7 can be used to record games, but its primary use is to record TV shows. The ArcSoft ShowBiz software is better suited to gameplay recording because it also has a built-in video editor and automatic upload to YouTube. Both programs let you take screenshots at whatever resolution you've set the console to (1280x720, 1920x1080).
The Colossus has a slightly longer learning curve, but it also delivers high-quality recordings and screenshots, and it offers numerous input options for the effort. In addition to being able to capture over various inputs, the card works well as a personal video recorder, like a TiVo, provided you have a cable box or a PC tuner card.
Video quality from the Colossus is excellent, especially considering its price. Our only gripe with the Colossus is that it's impossible to play a lag-free game using only the computer. You will need to use the video outputs to properly play and record. Component video users are covered, but certain models of the Colossus do not include an HDMI splitter, so you'll have to fork out an extra $10 depending on the model.
Even though the two products have similar prices, deciding between them isn't terribly difficult. Gamers with laptops are clearly limited to USB-style options like the Roxio GameCap. As long as you're not looking for HD capture or particularly discerning about video quality, the GameCap should serve just fine. Get the Hauppauge Colossus if you have a desktop. The capture quality is excellent, it has a low price point, and it offers great connectivity options.
Mentioning the term "onboard graphics" to a gamer is like yelling four-letter words in holy places. You're free to do both, but the results are less than optimal. Until recently, onboard graphics were useful for games like Scrabble or perhaps Solitaire, if you really wanted to push the envelope. Intel's Sandy Bridge processors changed that a bit by pairing the beefy Core i3/i5/i7 CPUs with Intel HD 3000 and HD 2000 graphics. The result let you fire up quite a few newer games and actually play older titles; Crysis was still a stretch, but Counter-Strike was rather doable. AMD is also hopping onto the let's-make-our-onboard-graphics-worth-looking-at bandwagon, albeit half a year later. The company recently released Brazos, meant for laptops, and is now following up with Lynx, AMD's newest desktop offering.
Lynx-based chips are essentially quad-core Phenom II X4 processors that have had a few changes made to them and have been paired with more than a few Radeon cores. The changes are enough to warrant an entirely new socket: Lynx chips will use motherboards with FM1 sockets and A55/A75 chipsets. The company will release four Lynx-based desktop parts: the A8-3850, A8-3800, A6-3650, and A6-3600. The A8-3850 will cost $135, while the A6-3650 will be $115. AMD doesn't have pricing for the other two, but we're going to guess they'll land within a handful of bucks either way. The two with pricing will be available on July 3.
Radeon HD 6550D GPU
4MB L2 Cache
2.4GHz Quad-Core (2.7GHz Turbo)
Radeon HD 6550D GPU
4MB L2 Cache
Radeon HD 6530D GPU
4MB L2 Cache
2.1GHz Quad-Core (2.4GHz Turbo)
Radeon HD 6530D GPU
4MB L2 Cache
With such a low price point, the fastest Lynx quad-core A8-3850 lines up against Intel's dual-core Core i3-based processors. It might seem like an unfair fight, but AMD hasn't exactly been getting hearts racing with its processors during the past few years. As gamers, our primary preoccupation is with the GPU, though, and AMD's rather modestly priced CPUs get the job done in that respect.
Radeon HD 6550D
600MHz GPU Clock
480GFLOPS Peak Compute
Radeon HD 6530D
443MHz GPU Clock
284GFLOPS Peak Compute
Lynx's DirectX11 built-in GPUs aren't anything that's normally worth gushing over, but they are impressive for onboard replacements. The Radeon HD 6550D features 400 Radeon cores running at 600MHz, and the Radeon HD 6530D has 320 of them at 443MHz. The A6's Radeon HD 6530D strikes us as rather neutered, especially considering the minuscule price differences.
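Those core counts and clocks line up with the peak compute figures above if you assume each Radeon core retires one multiply-add (two FLOPs) per cycle--a common way of quoting peak throughput, though not something AMD's spec sheet spells out:

```python
def peak_gflops(cores, clock_mhz, flops_per_cycle=2):
    """Peak throughput, assuming one multiply-add (2 FLOPs)
    per Radeon core per cycle."""
    return cores * clock_mhz / 1000 * flops_per_cycle

hd6550d = peak_gflops(400, 600)  # 480.0 GFLOPS, as quoted
hd6530d = peak_gflops(320, 443)  # ~283.5 GFLOPS, quoted as 284
```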
Lynx features hybrid CrossFire, but the catch is that you have to purchase another Radeon GPU within a similar performance bracket. Hybrid CrossFire lets you run the onboard and discrete Radeon GPUs together to get a bit of a boost. The big problem is that you have to spend $50 for a Radeon HD 6450 at the bottom or about $100 for a Radeon HD 6670 at the top. Additionally, the gains will only be evident in DirectX10 and DirectX11 titles. We really don't recommend spending $50 on a GPU; the results simply aren't worth the effort. If you opt to spend $100, you might as well drop an extra $20 on a Radeon HD 6770 to get better performance and none of the issues that generally come with dual-GPU setups.
We had to pit the A8-3850 against a substantially more expensive and powerful Core i5 2500K because we didn't have a Core i3 with HD 3000 graphics on hand for testing. The A8-3850's onboard GPU does a phenomenal job of trouncing the Core i5's HD 3000, to the tune of fifty percent on some games. If we upgrade to a Radeon HD 6850, the A8-3850 still does well compared to a vastly more expensive Core i5 2500K, which also has a 600MHz Turbo Boost advantage over the Core i3-2105.
The A8-3850 measures up well if all you're looking for is a simple box to serve as a home theater PC that will perform light gaming duties. However, if you're prepping the system for upgrades and heavy gaming from the start, you're better off taking a different path. We'd opt for the cheaper Phenom II X4 840 (or a Core i3) and pour the savings into a faster GPU for better performance out the door.
Microsoft released a video giving a glimpse of what's to come in Windows 8. The presentation shows off an extraordinary amount of touch screen integration and a user interface that's a blend of Windows Phone 7 and Xbox Live.
It seems like only yesterday that Microsoft was sending out betas for Windows 7, but here we are with a sneak peek at Windows 8, which, according to Microsoft's first "Building Windows 8" video, is going to go tile-based rather than icon-based, at least for tablets and mobile devices. This will, of course, go hand in hand with touch-screen functionality that mirrors what we've come to expect from modern smartphones like the iPhone 4, the Samsung Galaxy, and others. According to the video, the proposed start screen won't be the icon-based desktop we all know and use, but rather, a big clump of tiles in the middle of the screen, which are presumably intended to work best with touch-screen setups. I understand that "mobile computing" (if you guys will let me use that term) is something that has increased exponentially in recent years and that Apple's iPhone hardware has basically been a bottomless pot of gold for that company. What I'm less excited about is the seeming possibility that Microsoft is more interested in touch-screen interfaces and less so in keyboard-and-mouse interfaces.
For those who missed it, I recently traveled to id Software to get GameSpot's E3 2011 exclusive preview of Rage, and for the record, I embarrassed myself pretty badly in front of the company's creative director and its president on a big-screen theatrical setup. Why? My official answer is jet lag (and fleeing from Oklahoma tornadoes), but the real reason is that I was playing the Xbox 360 version with a standard 360 controller, and when it comes down to it, I'm a dyed-in-the-wool keyboard-and-mouse man and still believe, like most right-thinking human beings on the planet, that keyboard-and-mouse is a flat-out superior control setup for first-person shooters (not to mention strategy games, role-playing games, and pretty much any game with nested menus).
I realize it's way too early to start freaking out here, but as someone who has preferred to play with a keyboard and mouse for many, many years, I have to say I'm a bit concerned if this early demonstration means that Microsoft will try to distance traditional mice and keyboards from future iterations of Windows. I literally cringed while watching the video when the touch-screen keyboard was used--I play my games on the PC to avoid having to hunt and peck like that. Aside from the obvious implications for hardware manufacturers that make gaming mice and keyboards like Razer and Logitech, I'm just not liking the (admittedly implausible) possibility that mice and keyboards might become second-class citizens for Windows PCs. Of course, mice and keyboards aren't going anywhere in the near future--everyone on the planet uses them for important work-related applications--but longer-term, I have to say, I don't like the idea of giving up on my mice and keyboards for PC games just yet…at least, not until a much better alternative (that ideally isn't a touch screen) arises. That's my two cents, anyway.
Andrew's fear of change in this case might be misplaced. It's not so much that Microsoft is giving the proverbial finger to the keyboard and mouse as it is enhancing them. Those two devices are incredibly powerful in terms of accuracy and speed, but they're inappropriate for tablets and smartphones.
If we look just a little bit into the future, the Windows 8 demo signals a step in the right direction for Microsoft. The world is marching toward operating systems that need to be accessible through a variety of input devices, be they fingers, mice, or keyboards. At the moment, all of these methods reside in discrete areas. Fingers get used on smartphones, and mice and keyboards stick to laptops and desktops. This works fine if all of these devices never need to interact and stick to their specialties. But we're getting close to an inflection point, one at which these disparate devices might just merge completely.
If you haven't seen the Motorola Atrix 4G, do yourself a favor and check it out. It's a smartphone and portable laptop slapped into one. The phone itself is powerful, but when plugged into the dock (keyboard/touchpad and monitor, a faux-top if you will), it converts into a rather robust platform. The execution and price points are questionable, but the direction is undeniably sound.
Smartphone processors are set to increase in speed tremendously over the next five years--think orders of magnitude (100x) more powerful. Their persistent data connections also remove the limit of paltry onboard storage when connected to the cloud. There's also nothing stopping manufacturers from popping a 500GB drive into a dock. The seamless merger of desktop and smartphone operating systems becomes a powerful selling point. Why buy two computing devices when one does the job? External monitors, keyboards, and mice fill in the gaps when you need accuracy and brute typing speed. On the go, the smartphone lets you carry everything you need without adding an extra pound (or five) of weight.
All of this convergence gets pretty cool when it comes to gaming as well. A single computing device allows for desktop and on-the-go gaming at the same time, all without losing where you are in the game. This all depends on how developers take advantage of the platform, and some genres (real-time strategy, first-person shooter) definitely won't transfer well. But imagine starting your gaming session on a game like Osmos or Castle Crashers at home, only to continue on a bus a moment later. Interesting times, they are a-coming.
Calibur 11 has partnered with Major League Gaming to create a unique case for the Xbox 360, using the league's typically eccentric tastes as a guide. The result is the MLG Vault, a case that features a custom camouflage chassis with its very own light-up apocalypse head.
We attached the case to our unsuspecting Xbox 360 to see how it looks. Check out our image gallery for the result.
The word "enough" doesn't exist in a GPU manufacturer's vocabulary. They will always find a way to do more. And now, after advancing from the GeForce GTX 480 to the GeForce GTX 580, Nvidia finds itself releasing the GeForce GTX 590. It's a behemoth of a card, weighing in with not one, but two GeForce GTX 580 GPUs and a mind-boggling $700 price tag.
Nvidia has jammed two powerful GeForce GTX 580-class processors onto a single board. The company couldn't pull off the same feat last year with the hotter-running GeForce GTX 480. As in dual-GPU cards before it, the chips aren't clocked as high as their single-GPU brethren. Squeezing a pair of power-hungry GPUs onto the same board brings up issues of heat and power--not to mention the fact that you have to double almost everything else and still keep it within nearly the same physical space.
In terms of internals, the GeForce GTX 590 has a pair of fully functional 512 core GeForce GTX 580s and 3072MB of GDDR5 memory (1536MB per GPU). Nvidia clocked the chips at 607MHz, and the RAM speed is set at 3414MHz. By comparison, a single GPU GeForce GTX 580 has a core that runs at 772MHz and GDDR5 memory clocked at 4000MHz.
A pair of gigantic vapor chamber coolers keeps the GeForce GTX 590 cool. A single fan pushes air across both chips. On the back side of the card you'll find a set of aluminum plates that help to cool off the RAM. Nvidia also made it easy to remove the fan shroud to clean out dusty internals. In use, the GeForce GTX 590 doesn't emit an excessive amount of noise. The card is substantially quieter than the raucous GeForce GTX 480.
On the connector side of things, the GeForce GTX 590 breaks from Nvidia's dual-monitor limitation. The GeForce GTX 590 has three dual-link DVI connectors and is the first Nvidia card to also feature a Mini DisplayPort output. You can connect up to three monitors using a single GeForce GTX 590, making it possible to have 3D Vision surround from a single card.
The GeForce GTX 590 uses two eight-pin power connectors to get the juice it needs. And it needs a lot. Nvidia recommends at least a 700W power supply. For those feeling a bit frisky and contemplating a quad-SLI setup, Nvidia recommends a power supply in the 1200W range.
We don't have the recently released Radeon HD 6990, the GeForce GTX 590's direct competitor, on hand for testing. We've produced some performance metrics of the GeForce GTX 590 here; for a full rundown, we're going to defer to Anandtech.
No doubt about it, the GeForce GTX 590 is a powerful card. And it's definitely not for the cost-conscious among us. $700 buys you a lot of performance rolled into a single card. But if all you're looking for is brute speed, two $350 GPUs like the GeForce GTX 570 or the Radeon HD 6970 running in SLI or CrossFire mode will get you better frame rates. If you factor in rebates, they'll get you there for even less than the price of a single GeForce GTX 590. However, if noise and space constraints are paramount issues, the GeForce GTX 590 will get you those frames with a bit more style. Outside of that, dual-GPU cards like the GeForce GTX 590 and the Radeon HD 6990 really serve one purpose: bragging rights for the companies involved.
Nintendo released the 3DS in Japan on February 26, and we promptly flew one back to the States for some tinkering. Nintendo traditionally makes its handheld consoles backward compatible with the previous generation. The original Nintendo DS was backward compatible with Game Boy Advance games, and the pattern continues with the Nintendo 3DS's ability to play Nintendo DS/DSi games.
GameSpot tested load times on the Nintendo 3DS and the Nintendo DS Lite with five DS titles. Time measurements began from the moment we launched the game in the console's menu until we could interact with the game menu. All tests were repeated three times and then averaged to the nearest whole number.
Surprisingly, the Nintendo 3DS loads games considerably more slowly than the Nintendo DS Lite. On average, the Nintendo 3DS took 35 percent longer than the Nintendo DS Lite to get to the in-game menu. Performance didn't seem to suffer once we started to play the games, though. Frame rates seemed normal across all the games we played.
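The methodology boils down to two bits of arithmetic: averaging the three runs to the nearest second and computing how much longer the 3DS took. A sketch with made-up sample times (illustrative, not our measured figures):

```python
def average_load(times_sec):
    """Average repeated load-time runs to the nearest whole second."""
    return round(sum(times_sec) / len(times_sec))

def percent_slower(slow, fast):
    """How much longer, in percent, the slower console took."""
    return (slow - fast) / fast * 100

# Illustrative numbers only: a 20-second DS Lite load versus a
# 27-second 3DS load works out to the 35 percent average gap we saw.
gap = percent_slower(27, 20)  # 35.0
```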
The Nintendo 3DS has internals that outclass those of the original dual-screen handheld. However, outside of stating that the console contains a 200MHz PICA200 GPU, Nintendo has been silent on the details surrounding the CPU. It's clear from the resolution jump on both screens, though, that the 3DS needs more processing power. By comparison, the original Nintendo DS is powered by a 133MHz ARM9 and a 33MHz ARM7. We're guessing that the software emulation hoops the Nintendo 3DS needs to jump through to run DS games bog it down a bit. Nintendo hasn't yet responded to inquiries regarding the results.
Sony's recently announced NGP contains a quad-core ARM Cortex A9 CPU and a PowerVR SGX543MP4+ GPU. Both chips are fairly new on the block, and Sony didn't elaborate too much on the exact configuration or clock speed of the chips involved. We dug up the base specs on both chips, but we don't know exactly how they're going to be configured inside of the NGP.
Quad-Core Cortex A9
Based on ARM's publicly provided Cortex A9 CPU specs, we know the chip can be configured in single, dual, and quad-core varieties. We've reached out to ARM for specifics, but it has yet to respond. Based on its spec sheets, the dual-core variant comes in two flavors, optimized for either power or performance. The power-optimized chip lowers the clock speed to 800MHz, which results in 4,000 DMIPS of computational power and .5W of power consumption. The performance-optimized version bumps the speed up to 2GHz, and the computational power jumps to 10,000 DMIPS with 1.9W of power consumption. Assuming a linear scaling of performance, the ARM Cortex A9 with four cores could perform anywhere between 8,000 and 20,000 DMIPS. By comparison, the iPad, which is based on the 1GHz Cortex A8, has roughly 2,000 DMIPS of computational power.
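ARM's dual-core figures imply a rate of 2.5 DMIPS per MHz per core (4,000 DMIPS / 800MHz / 2 cores), and the linear-scaling estimate above follows directly from it:

```python
DMIPS_PER_MHZ_PER_CORE = 2.5  # implied by ARM's dual-core figures

def dmips(cores, clock_mhz):
    """Linear-scaling estimate of Dhrystone throughput; real
    chips won't scale perfectly with core count."""
    return cores * clock_mhz * DMIPS_PER_MHZ_PER_CORE

low = dmips(4, 800)    # 8,000 DMIPS at the power-optimized clock
high = dmips(4, 2000)  # 20,000 DMIPS at the performance clock
```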
Imagination Technologies makes the PowerVR SGX543MP4+. However, we can find information only on the PowerVR SGX543MP at the moment. We've reached out to Imagination Technologies for comment, but it has yet to respond. Apple is also rumored to be using a variant of the PowerVR SGX543 chip for the upcoming revision of the iPad.
The normal nomenclature model for the PowerVR SGX543MP appends additional numbers after the MP to indicate the number of cores onboard. Imagination Technologies can configure the GPU with 1, 2, 4, 8, or 16 cores. Sony opted for the SGX543MP4+, which indicates that the chip has four cores. We have no details on what the "+" indicates.
Imagination Technologies states: "At 200MHz core frequency an SGX543MP4 (four cores) will deliver 133 million polygons per second and fill rates in excess of 4Gpixels/sec. Higher frequencies or a larger number of cores each deliver more performance. At 400MHz core frequency an SGX543MP8 (eight cores) will deliver 532 million polygons per second and fill rates in excess of 16Gpixels/sec."
Assuming the chip hasn't been highly modified from the PowerVR SGX543MP4, a 200MHz PowerVR SGX543MP4+ could live up to the PlayStation 3's 4.4Gpixel/s fill rate spec. We also have to add that raw fill rate numbers rarely translate into real-world performance.
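Imagination's numbers imply five pixels per clock per core (4Gpixels/sec / 200MHz / 4 cores), and both quoted configurations fall out of that rate. This is a back-of-the-envelope model on our part, not an official spec:

```python
PIXELS_PER_CLOCK_PER_CORE = 5  # implied by the 4Gpixels/sec figure

def fill_rate_gpixels(cores, clock_mhz):
    """Peak fill rate in Gpixels/sec, scaling linearly with
    core count and clock speed."""
    return cores * clock_mhz * 1e6 * PIXELS_PER_CLOCK_PER_CORE / 1e9

mp4 = fill_rate_gpixels(4, 200)  # 4.0, "in excess of 4Gpixels/sec"
mp8 = fill_rate_gpixels(8, 400)  # 16.0, matching the MP8 figure
```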
Wireless routers are something of a commodity nowadays, given away freely with broadband packages or available at knockdown prices. This hasn't stopped manufacturers from making high-end models though, with many promising higher speeds, enhanced security, and better media streaming to entice a purchase from users. Some even claim to offer better speeds when gaming by prioritising traffic on your network, resulting in lower pings and less lag.
We've rounded up three top-of-the-range routers, each of which promises to improve gaming performance on your network: the Linksys E3000, Belkin Play Max, and D-Link DIR-815. Each was tested on a 50-megabit Internet connection, and for comparison's sake, we've included the results from the free low-end D-Link DIR-615 wireless N router that Virgin Media provides with its UK home broadband packages.
We ran three tests on the routers looking at the ping times in World of Warcraft, the results from pingtest.net, and file transfers between two computers on the network. WOW's built-in tools make it easy to monitor latency, and it should be noted that we opened up all the relevant ports before noting down the results. We also ran the test while downloading a large file and streaming a movie to see how well the routers prioritised network traffic.
Pingtest.net looked at two factors: ping and jitter. Ping measures how long it takes a packet of data to travel from your computer to a server on the Internet and back. The higher the number, the longer it takes information to get from your computer to the Internet. Jitter measures the variance in successive ping tests--that is, if there are any corruptions in the data sent back and forth to a server. The lower the number, the more stable your connection.
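Jitter can be computed a few different ways; one common definition is the average absolute difference between successive pings, which is what this sketch assumes:

```python
def ping_stats(pings_ms):
    """Return (mean ping, jitter) for a series of ping samples,
    with jitter taken as the average absolute difference
    between successive pings."""
    mean = sum(pings_ms) / len(pings_ms)
    diffs = [abs(b - a) for a, b in zip(pings_ms, pings_ms[1:])]
    jitter = sum(diffs) / len(diffs)
    return mean, jitter
```

For example, `ping_stats([20, 22, 21, 25])` gives a 22ms mean with about 2.3ms of jitter; the lower the jitter, the more stable the connection.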
Our final test looked at file transfers across the network. We ran two tests: one for a large 2GB file and one for a set of small files totaling 200MB. The quicker the time taken, the faster the transfer speed over wireless. As well as wireless, we ran tests over Ethernet, but the variance in results proved to be so negligible that their inclusion was not worthwhile. Before we delve into the results, let's take a look at what the routers have to offer.
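Converting a transfer time into a throughput figure is a one-liner; the sample numbers below are illustrative, not our measured results:

```python
def transfer_speed_mbs(size_mb, seconds):
    """Effective throughput in megabytes per second."""
    return size_mb / seconds

# e.g. a 2GB (2048MB) file copied in four minutes:
speed = transfer_speed_mbs(2048, 240)  # ~8.5 MB/s
```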
Each router has a number of features not available in lesser models. They all feature dual-band support for wireless N across the 5GHz and 2.4GHz bands, which is useful if you've got other devices in your home, such as cordless telephones, that can interfere with the 2.4GHz spectrum. Plus, you get four gigabit Ethernet ports for wired devices. The Play Max and E3000 can also create a guest network that can be limited to just Internet access, so you can let visitors use your wireless connection without a password and not worry that they will access files on your network. All support the usual security standards, including WEP, WPA, and WPA2 at up to 128-bit encryption.
The Play Max and E3000's special feature is a USB port for hooking up a hard drive to use as network-attached storage for SMB or DLNA file sharing. Sadly, NAS performance isn't great on either. While photos and MP3 files stream fine, videos tend to stutter, and file transfers take an age, making them no substitute for a dedicated NAS unit. This is made worse by the confusing configuration options on both. On the E3000, the NAS functionality can be configured only via the Web interface, while the guest network can be configured only by the Cisco Connect software. On the Play Max, NAS features can be configured and accessed only via the desktop software, meaning that you have to install it on each computer.
Fortunately, most other features can be accessed via the Web interface, including the all-important port forwarding for games. The Web interface is speedy on all the routers, with the DIR-815 being the easiest to use, partly due to its smaller feature set. Despite the gaming credentials they all boast, none come with port forwarding presets for common titles such as World of Warcraft or Counter-Strike, which is disappointing. However, there is Universal Plug and Play (UPnP) support, so if your game supports it, configuration can be done over the air.
Performance (lower numbers are better)
All routers performed admirably, with no signal dropouts or stability issues. Compared to the stock router, though, performance gains in gaming are negligible. The biggest gain was eight milliseconds with the DIR-815, a figure so low as to make no difference.
Interestingly, the E3000 had the worst performance in WOW and in the pingtest.net tests, but again, only by a few milliseconds. Those figures actually improved when using the more crowded 2.4GHz frequency, suggesting problems with 5GHz optimisation. That said, the E3000 had the best file transfer performance on test, coming in over a minute quicker than the DIR-815 when copying large files.
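For the curious, scoring a test like this is just size over time; a minimal sketch (shown with a local scratch copy for illustration, whereas the real test copies across the wireless network):

```python
import os
import shutil
import tempfile
import time

def timed_copy_mb_s(src, dst):
    """Copy a file and return the measured throughput in MB/s."""
    start = time.perf_counter()
    shutil.copyfile(src, dst)
    elapsed = time.perf_counter() - start
    return os.path.getsize(src) / (1024 * 1024) / elapsed

# Demo with a 1MB scratch file; point src at a network share for a real run.
with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "payload.bin")
    with open(src, "wb") as f:
        f.write(os.urandom(1024 * 1024))
    speed = timed_copy_mb_s(src, os.path.join(tmp, "copy.bin"))
```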
What's most telling about the results is how little is actually gained from switching to a higher-priced router. While additional features such as dual-band support, guest networks, and media streaming are useful, those looking only for a boost in gaming performance will have little to gain aside from a few milliseconds less ping and an empty wallet.
Have you been to a party where you met that guy? You could be talking about anything, driving on the freeway, or eating a hot dog, and he'd one-up you. It's guaranteed that guy drove 100mph into oncoming traffic and threw hot dogs (with mustard on them) at the wildly swerving cars. AMD and Nvidia are both that guy. And in this case, I love him. This time around, Nvidia was set to rain all over AMD with the newly released GeForce GTX 560 Ti. Priced at $249, it would replace the immensely popular GeForce GTX 460 and provide considerably more performance. Then, AMD came along and added a lightning storm to the mix with the release of the Radeon HD 6950 1GB priced at $259.
The GeForce 560 Ti features the same benefits that the GeForce 500 and 400 series offer, but the chipset has been considerably refined to provide greater performance with reduced power consumption. To sum up recent Nvidia GPU history: the GF100 was Nvidia's first Fermi chip, powering the GeForce GTX 480 and 470. Fermi was quick, but it gulped down power and boiled everything within inches of it. Given some time, Nvidia revamped the GF100 into the GF104 (aka GeForce GTX 460) and GF110 (aka GeForce GTX 580). Now, we have the GF114, which gives us the GeForce GTX 560 Ti. Basically, we get all of the goodies (DirectX 11, PhysX, 3D support) but none of the messy heat concerns.
GeForce GTX 560 Ti
822MHz Core Clock
384 CUDA Cores
1GHz Memory Clock
By comparison, the GeForce GTX 460 came with 336 CUDA cores clocked at 675MHz and 1GB of GDDR5 memory at 900MHz.
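As a crude yardstick, raw shader throughput scales with core count times clock, so the on-paper gap works out like this (a back-of-envelope sketch that ignores architectural differences):

```python
# Crude proxy for shader throughput: CUDA cores x core clock (MHz).
gtx_560_ti = 384 * 822  # 315,648
gtx_460 = 336 * 675     # 226,800

gain_pct = (gtx_560_ti / gtx_460 - 1) * 100  # ~39% more raw throughput on paper
```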
The GeForce GTX 560 Ti uses a modest 170W and requires the use of two 6-pin PCI Express power connectors. It comes in at 9 inches in length, which is slightly longer than the GeForce GTX 460. Nvidia representatives mentioned that the GeForce GTX 560 Ti is an exceptional overclocker and that many board partners would roll out video cards with significant clock bumps. We've already seen overclocked cards (900MHz core) at retailers with no price premium at all.
Radeon HD 6950 1GB
AMD's Radeon HD 6950 1GB is almost identical to the original Radeon HD 6950 2GB part with one exception: the amount of RAM. Reducing the available amount of RAM allows the company to drop the price of the card without affecting performance in unconstrained scenarios. Unless you're running exceedingly high resolutions (like those found in Eyefinity setups) and massive amounts of antialiasing, there's little chance you'd notice the difference.
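A rough back-of-envelope estimate shows why the missing gigabyte only matters at the extremes. The formula below is a simplification (colour plus depth buffers only, ignoring textures and driver overhead), with made-up settings for illustration:

```python
def render_target_mb(width, height, msaa_samples, bytes_per_pixel=4):
    """Rough size of an MSAA colour + depth render-target pair, in MB."""
    samples = width * height * msaa_samples
    return samples * bytes_per_pixel * 2 / (1024 * 1024)  # colour + depth buffers

at_1080p = render_target_mb(1920, 1080, 4)   # ~63 MB: trivial for a 1GB card
eyefinity = render_target_mb(5760, 1080, 8)  # ~380 MB: suddenly the RAM matters
```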
The Competitive Landscape
$289 - Radeon HD 6950 2GB
$269 - Radeon HD 6950 1GB
$259 - GeForce GTX 470
$249 - GeForce GTX 560 Ti
$239 - Radeon HD 6870
We tested the GeForce GTX 560 Ti, pitting it against the Radeon HD 6950 2GB, as we didn't have a 1GB card on hand. We also tested it against the Radeon HD 6870 and GeForce GTX 470, and then we tossed in the GeForce GTX 460 to see how far Nvidia has come.
When compared to the GeForce GTX 460, the GeForce GTX 560 Ti kicks it to the curb in every test. In Lost Planet 2, the GeForce GTX 560 Ti pulled away with a large 25% performance increase. It comes as no surprise that the GeForce GTX 560 Ti wipes the floor with the Radeon HD 6870. The GeForce GTX 470 provides stiff competition, but we'd side with the GeForce GTX 560 Ti because it has none of the noise or heat concerns and performs better in most of the tests. The only real competition comes from the Radeon HD 6950 2GB and, by relation, the Radeon HD 6950 1GB. They trade blows and perform similarly, which makes it a draw in our books. AMD countered Nvidia quite well with the release of the Radeon HD 6950 1GB card, essentially denying either company outright dominance of this highly contested price point.
Test System: Core i7 980x, Asus Rampage III Extreme, 6GB OCZ DDR3, Seagate 750GB 7200.11, Windows 7 64-bit. Video card drivers - Catalyst 11.1a beta, Forceware 266.58.
High-end PC component manufacturer Corsair has been tentatively dipping its toes into consumer audio products of late. Its first product, the Dolby-powered HS1 headset, was very impressive. The design, build, and sound quality were head and shoulders above many of its competitors, despite a few bass response niggles.
Spurred on by the success of the HS1, Corsair is branching out further, this time into desktop speaker systems--a market heavily dominated by companies such as Creative and Logitech. It's kicking things off with the top-of-the-range SP2500, a gaming-focused 232-watt 2.1 system complete with bi-amplified satellites, a 4th order band-pass subwoofer, and a nifty LCD-equipped desktop controller, all costing a considerable $249.
The satellites and subwoofer have an understated look about them. You won't find any glossy plastic, shiny chrome, or oddly shaped housings to speak of. Instead, the look is functional, with the small satellites sporting mesh-covered 3-inch drivers and a silk domed tweeter in a plain black enclosure. They're reassuringly sturdy and weighty, though, and look and feel like they'll stand up to some abuse. The 120-watt subwoofer is similarly weighty and clad in black, with its size giving it a monolithic stature.
It's significantly larger than subwoofers from similar systems, almost reaching the size found in most home theatre setups, so you'll need plenty of room under your desk to house it. This is because it uses 4th order band-pass technology, which Corsair claims provides deeper and more accurate bass compared to the ported bass-reflex designs commonly offered by rivals. It also houses the power supply, a 3.5mm jack input, a standard phono line-in, a two-stage amplifier for the sub itself, and four Class D digital amplification circuits for the bi-amplified satellites.
Bi-amplification isn't something normally seen in desktop 2.1 systems and is instead featured in much more expensive hi-fi, home theatre, and studio monitor setups. Bi-amplification provides separate amplification to the mid-range driver and tweeter in each satellite through individual audio cables, rather than driving the entire satellite through one amplifier via a single cable. In theory, this should result in much better sound quality because each driver receives only the frequencies it's supposed to be reproducing, so there's much less chance of distortion. This does of course depend on how well the amplifier separates the low, mid, and high frequencies--a job handled by the crossovers.
It's fortunate then that the SP2500 has some top-notch digital crossovers at its disposal. They're implemented using a digital signal processor (DSP), which means the crossover can be dynamically adjusted, so higher or lower frequencies can be sent to the sub and satellites on the fly. This can dramatically change the sound of the system and in conjunction with the DSP allows for special modes and environmental effects.
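The basic idea of a crossover can be sketched with a first-order filter pair: low frequencies go to the sub, and the complement goes to the satellites. This is a simplified illustration of the concept, not Corsair's actual DSP code:

```python
import math

def crossover(samples, cutoff_hz, sample_rate=48000):
    """Split a signal into low and high bands with a one-pole low-pass filter."""
    dt = 1.0 / sample_rate
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)
    low, high, state = [], [], 0.0
    for x in samples:
        state += alpha * (x - state)  # low-passed signal for the subwoofer
        low.append(state)
        high.append(x - state)        # the remainder feeds the satellites
    return low, high

# A 100Hz tone split at a 200Hz crossover point.
sig = [math.sin(2 * math.pi * 100 * n / 48000) for n in range(256)]
lo, hi = crossover(sig, 200)
```

Because the high band is simply the input minus the low band, the two bands always sum back to the original signal, which is the basic requirement of any crossover.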
They're controlled using a supplied desktop controller, which features a colour LCD, a control dial, another 3.5mm jack input, a headphone output, and buttons for power, menu, volume, and subwoofer volume. The controller is very easy to use, with the control dial acting much like the iPod's click wheel. You can scroll through lists of different EQ and environmental presets, adjust the volume, and change inputs with ease, and the bright LCD makes things easy to see late at night.
Gaming Headset Group Test - Plantronics, Audio-Technica, Corsair, Sennheiser, Astro, Creative, SteelSeries
While graphics usually hog the spotlight when it comes to gaming, hearing great audio is just as important. Being able to accurately hear where your enemies are shooting from can mean the difference between owning a match and becoming cannon fodder.
We've rounded up seven top PC gaming headsets, which you can check out in our photo gallery.
AMD released the mid-range Radeon HD 6800 GPUs in October. The high-end Radeon HD 6900 series was supposed to follow shortly after, but plans sometimes get derailed. The company barely made it for the holiday season, but it's finally ready to unveil the Radeon HD 6970 and Radeon HD 6950. The new GPUs offer a few new features and a handy dose of nomenclature ambiguity to boot.
Radeon HD 6970
880MHz Core Clock
1536 Stream Processors
2GB GDDR5 RAM
1375 MHz Memory Clock
Radeon HD 6950
800MHz Core Clock
1408 Stream Processors
1250MHz Memory Clock
AMD's Radeon HD 6970 will go toe-to-toe with the GeForce GTX 570, and the Radeon HD 6950 spars with the GeForce GTX 470 while also replacing the Radeon HD 5870. The mid-range Radeon HD 6870 replaces the Radeon HD 5850, and the HD 6850 boots out the Radeon HD 5830. As far as AMD is concerned, the Radeon HD 5800 series has now been discontinued. The HD 5700-based parts will continue to be manufactured, as will the dual-GPU Radeon HD 5970. The Radeon HD 6970 doesn't come close to displacing the Radeon HD 5970, so for that, we'll have to wait a little bit longer for the Radeon HD 6990.
The recently released 6800 GPUs were basically architecturally refined 5000 series GPUs. The 6900 series takes all those refinements and adds onto them. They feature dual graphics engines that provide triple the tessellation performance of the Radeon HD 5870. The new GPUs also contain second-generation DirectX 11 optimizations, accelerate Blu-ray 3D, and have new antialiasing and anisotropic filtering modes to improve image quality.
Morphological antialiasing provides full-scene antialiasing via a post-processing technique. Traditional AA modes are applied when the scene is being rendered; morphological AA is done after the frame has been generated. The new AA mode is compatible with any DirectX 9/10/11 game.
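In miniature, the post-process approach looks like this: scan the finished frame for hard edges and blend across them. This is a toy sketch of the general idea, not AMD's actual morphological AA algorithm:

```python
def postprocess_aa(row, threshold=64):
    """Blend 1D scanline pixels (0-255 values) wherever a hard edge is found."""
    out = list(row)
    for i in range(1, len(row) - 1):
        left_jump = abs(row[i] - row[i - 1])
        right_jump = abs(row[i + 1] - row[i])
        if left_jump > threshold or right_jump > threshold:
            out[i] = (row[i - 1] + row[i] + row[i + 1]) // 3  # average the edge away
    return out

scanline = [0, 0, 0, 255, 255, 255]  # a hard black-to-white edge
smoothed = postprocess_aa(scanline)  # [0, 0, 85, 170, 255, 255]
```

Because it works on the finished frame, a filter like this doesn't care how the scene was rendered, which is why the technique can be applied to any DirectX 9/10/11 game.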
EQAA, or enhanced quality antialiasing, uses up to 16 subsamples to improve image quality. It can be enabled on top of existing antialiasing modes, although it does take a little bite out of performance.
On the back end of the Radeon HD 6970 and HD 6950, you can find two dual-link DVI ports, an HDMI 1.4a port, and two mini-DisplayPort 1.2 connectors. HDMI 1.4a allows the video card to output 3D Blu-ray. DisplayPort 1.2 makes a whole new set of features available. The connector now allows you to daisy chain displays together, and with the use of a splitter/hub, you can also support up to six DisplayPort 1.2 displays through two connectors. We don't have pricing information on these hubs, but AMD indicates that they should be fairly inexpensive. Active DisplayPort 1.1 adapters used to cost $100, but now they're about $20, so we're inclined to think that the hubs won't be all that pricey.
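A quick sanity check on the daisy-chaining claim, using approximate figures (17.28Gbps effective link rate for DisplayPort 1.2, and roughly 10% blanking overhead per stream; both numbers are assumptions for illustration):

```python
def displays_per_link(link_gbps=17.28, width=1920, height=1080, refresh_hz=60,
                      bits_per_pixel=24, blanking_overhead=1.10):
    """How many video streams fit in one DisplayPort link's effective bandwidth."""
    stream_gbps = width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9
    return int(link_gbps // stream_gbps)

per_connector = displays_per_link()  # ~5 x 1080p60 streams per DP 1.2 connector
```

Two connectors therefore comfortably cover the six-display claim at 1080p.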
The Radeon HD 6970 consumes 250W at peak power, and the Radeon HD 6950 consumes 200W. Both idle at 20W. On the power connector front, the Radeon HD 6970 requires 1x8-pin and 1x6-pin plug, and the Radeon HD 6950 uses 2x6-pin plugs.
The Competitive Landscape
$260 - GeForce GTX 470
$290 - Radeon HD 5870
$300 - Radeon HD 6950
$350 - GeForce GTX 570
$370 - Radeon HD 6970
$450 - GeForce GTX 480
$530 - GeForce GTX 580
The Radeon HD 6950 proves to be a solid value at its $300 spot. It bests the slightly cheaper GeForce GTX 470 handily in everything but Lost Planet 2. The Radeon HD 6950 also outguns the Radeon HD 5870 quite easily and gets very close to catching up with the more expensive GeForce GTX 570.
AMD's new single-GPU flagship Radeon HD 6970 also performs well. It takes out the GeForce GTX 570 in everything but Lost Planet 2, but it likely won't catch up to the GeForce GTX 580. (We don't have one for testing--it's hardly in stock anywhere, which probably explains why.)
AMD's new GPUs didn't win the single-GPU crown, but the combination of performance and pricing makes them more than competitive. The 6900 feature set is also quite future-proof. DisplayPort 1.2 isn't all that common yet, but with the low price of active DisplayPort adapters, rolling with three monitors (and even six) has never been easier. Also, be on the lookout for cheaper 1GB variants of the Radeon HD 6970 and Radeon HD 6950.
Test System: Core i7 980x, Asus Rampage III Extreme, 6GB OCZ DDR3, Seagate 750GB 7200.11, Windows 7 64-bit. Video card drivers - Catalyst 10.12beta, Forceware 263.09.
Nvidia has officially cranked it up to 11. After botching the launch of the GeForce 400 series, the company had to demonstrate that it was still capable of delivering. And it has. A scant eight months after the launch of the GeForce 400 series, Nvidia popped out the GeForce GTX 580 in late November, and now it's ready to release the GeForce GTX 570.
Rolling in with an MSRP of $350, the GeForce GTX 570 is considerably more affordable than the $500 GeForce GTX 580 and comes packed with 480 CUDA cores running at 1464MHz and 1280MB of GDDR5 RAM running at 3800MHz. The core clock is pegged at 732MHz. We're essentially talking about a GeForce GTX 580 with a slight haircut. Even more stunning, the GeForce GTX 570 beats the GeForce GTX 480 in just about every single stat and costs significantly less!
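Those memory numbers translate directly into bandwidth. The sketch below assumes the GTX 570's 320-bit memory bus and treats the 3800MHz figure as the effective data rate:

```python
def memory_bandwidth_gb_s(effective_mhz, bus_width_bits):
    """Effective data rate (MHz) x bus width -> memory bandwidth in GB/s."""
    bytes_per_transfer = bus_width_bits / 8
    return effective_mhz * 1e6 * bytes_per_transfer / 1e9

gtx_570 = memory_bandwidth_gb_s(3800, 320)  # 152.0 GB/s
```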
The GeForce 500 series is based on Nvidia's GF110 architecture, which is itself an extension of the GF100 chipset that the 400 series used. One would expect that the new 500-series parts would be substantially changed from the 400 series, but then one would just be wrong. All the bullet points from the 400 series still stand: DX11 support, 3D Vision, Nvidia Surround, and PhysX. The key benefits of the 500 series are an improved cooler, quieter running speeds, and architectural improvements that basically yield a faster-running 400-series part sans the mind-blowing temperatures.
The dual-slot, 10.5-inch GeForce GTX 570 is just as long as a GeForce GTX 480. Its new copper cooler gives it a little more heft, and a rejiggered fan design lowers operating noise. You'll also need only two six-pin PCI Express power connectors to power it--no fancy eight-pin connectors required. When running at full tilt, the GeForce GTX 570 is nowhere near as noisy as the GeForce GTX 480.
Much like Rowdy Roddy Piper in They Live, Nvidia is out of bubble gum and clearly ready to kick something. At $350, the GeForce GTX 570 is a phenomenally strong GPU. ATI has no single GPU solution that can even hope to compete with it at the moment, either on price or performance (although that may change very soon). The $450 GeForce GTX 480 gets either beat or matched by the GeForce GTX 570 in our tests. Let's just go over that again: overall better performance than the GeForce GTX 480, lower power consumption, lower heat output, reduced sound levels, and all at a substantially lower price point. There's really nothing to dislike about this turn of events.
Test System: Core i7 980x, Asus Rampage III Extreme, 6GB OCZ DDR3, Seagate 750GB 7200.11, Windows 7 64-bit. Video card drivers: Catalyst 10.11 and Forceware 263.09.
Released back in 2008, PlayTV allowed PlayStation owners to turn their consoles into a free-to-air PVR. Sony's latest update for the software, available for download through the PlayStation Network, promises to make watching television a social activity.
Building on PlayTV’s current features--which allow owners of the little black box to watch, pause, rewind and record live television--the new update will add a live chat, community recommendations, a premium programme guide, and series link recording.
Updating the software is simple; once PlayTV is on your PlayStation 3, you need to go to the PlayStation Network Store and download the key that unlocks the update. The key costs €7.99 (£6.29) for regular PlayStation users and €5.57 (£4.72) for PlayStation Plus subscribers. Then, to activate it, you go back into the PlayTV interface and perform a software update. Our download and installation took about five minutes. The update worked pretty much immediately; only the recommendation feature required further setup.
Recommending shows to friends on PSN was easy. All we had to do was go into the channel listings, go to the show we wanted to recommend, and select "recommend" from the options menu. In order to post recommendations onto Facebook walls, you first have to link your Facebook account to your PSN account. Once done, recommending shows on Facebook was just as simple as for the PSN.
The new live chat feature let us create or join a chat room on each channel. Each chat room can reportedly host up to 64 users--though we rarely saw more than 40 at once, even in the public rooms. Outside of the fact that writing with the DualShock 3 is laborious (remedied by plugging in a USB keyboard), the only problem we encountered was when we used the chat feature in its full-screen setting; the text and usernames could occasionally get muddled into what was playing onscreen behind them. With a quick press of the select button, though, the view can be changed into the chat window mode, placing the text on a uniform, flat background and fixing the problem.
The Community Favourites feature allowed us to see what our PSN friends and the general public had been watching and choose from a list of recommendations. Those nervous about other people seeing what they’ve been watching will be pleased to hear it is possible to block the Community Favourites section from keeping tabs on you. All you have to do to ensure your privacy is turn off the tracking mode.
As promised, the new programme guide allowed us to search programmes and plan our viewing seven days in advance, and the new series record option allowed us to set PlayTV to record every episode within a series.
PlayTV’s latest update retains the intuitive interface of its original release, adding a whole host of options to what was originally a digital tuner and recorder designed for the PS3. Though you have to pay a premium for these new features, they just about warrant the price of upgrade.
To celebrate the launch of World of Warcraft: Cataclysm, SteelSeries has created a brand new mouse, specifically designed for the game. Featuring 14 programmable buttons, a 150 fps sensor, and a size that puts other mice to shame, it certainly creates an impression. That's not to mention its steampunk-like aesthetics and orange LEDs that glow through the top of the case. We've got our hands on one ahead of its launch later this year, so check out our photo gallery for more pictures and our initial impressions.
So far on Greatest Gaming Rig we've been running our machine naked on a test bench, and while that's all well and good for some benchmarking, it's not practical for use as an everyday PC. This week, we set ourselves the task of sticking the entire rig into a case, which, as we discovered, was not an easy task. Not only did we have to try to track down a case big enough to house the gargantuan SR-2 motherboard, but we also needed something that could easily house two power supplies, along with four graphics cards.
Thankfully, we managed to find such a case with the DangerDen Tower 29. Standing at a massive 29 inches tall, 20 inches deep, and 8 inches wide, the case is large enough to house all our hardware, as well as a load of additional cooling fans. However, in its stock state, it has space for only one PSU. Fortunately, most cases at DangerDen can be custom ordered, and once we'd sent our requirements across, they customised the tower for us.
The case arrived at GameSpot UK HQ flat-packed requiring assembly. All those years of assembling IKEA furniture were about to pay off--or so we thought. Opening the box revealed a large pile of acrylic pieces and a bag containing an astonishing 275 screws. On top of that, the instructions were very long and differed slightly from our case because of its customisation. Thankfully though, all the screws were in individually numbered bags, corresponding to codes in the instructions. It took us the better part of a day to build the case, including having to rebuild some sections due to misinterpreting the instructions.
While the assembly wasn't easy, we were extremely impressed with the finished product. Each of the acrylic pieces came covered in brown paper to protect it during transit, so once assembled, they were all incredibly glossy and shiny. Putting in the components was also a tricky process, as the instructions detail only case assembly. If you're planning to use one of these cases, you'll need some prior experience with PC building. We also found some of the screws were very difficult to get into place, as none of the screw holes were threaded, so it took quite a lot of force to get them in. Others were at weird angles, which made the process even trickier.
Once we'd installed our core components, we set about putting in cooling, where we encountered a number of issues. On the front of our case there are mounts for four 120mm fans. We planned to use one of these to install our Corsair H70 cooler, along with three fans. However, because of the acrylic construction, we found the walls of the case were much thicker than standard metal ones, so the stock screws weren't long enough. To counter this we bought an additional set of longer screws with bolts, which allowed us to mount the fans and cooler securely.
Unlike the stock Tower 29 case where the PSU mounts at the top, our case had the PSU at the bottom, mounted sideways. This allows it to hold two PSUs adjacent to each other. While this is a good utilisation of space, it did present us with a few cable management issues. The SATA power cables wouldn't reach to the top of the case, at least not without running the cables over the top of the motherboard, so we had to buy extenders. The same applied to the SATA data cables, and we had to buy some extra one-metre-long leads. One nice feature of the case is that it gives you room to route cables behind the motherboard tray, which we found extremely useful for getting rid of cable clutter. However, it's not quite enough space to route power cables, unless your PSU uses particularly thin ones.
With everything assembled, we were extremely pleased with the look of the case. Thanks to the space inside, we were able to keep the build relatively clean, though we will certainly do a little bit more cable management to get things looking even cleaner. The service from DangerDen was second to none, and they happily customised a case to our exact specifications. Building it isn't for the fainthearted, but for the enthusiasts who are likely to buy this case, it won't be a problem. We were also impressed with Akasa's white LED fans, which were noticeably quieter than the stock Corsair fans, and a tad prettier too. The rig is actually quiet in operation, at least until the graphics card fans kick in. We'll be doing some accurate noise measurements in the next part of the feature, as well as overclocking and messing around with 3D vision surround. Don't forget that we're giving you the chance to win the entire rig*, complete with accessories! Just head over here to find out more.
* To enter you have to be over 13 and a UK resident; the competition is subject to our normal terms and conditions.
Installing applications sucks. Loading up the download page, waiting for the download, clicking through pointless "next" buttons. Now repeat the process 20 times if you’re starting from a clean install. Enter Ninite--a website that turns a tedious chore into a pleasant walk in the park. We've been using Ninite for almost a year (it just turned one) and absolutely love it.
Using Ninite is like going shopping. Grab a video player, a music player, a compression program, and maybe even an IM client. Heck, select everything. Click on Get Installer, and you're 99 percent done. Sit back and watch the magic happen, or don't. Go do whatever you want to.
Instead of making you go through the tedium of babysitting every single download, next button, and blah, blah, blah, Ninite takes care of it all. The tiny file you get after you press Get Installer is even reusable. Click it a month from now, and it will grab all the updated versions of those programs. E-mail it to a friend or relative who has no clue what they’re doing on a computer. As long as they can double-click and they have a working Internet connection, they're in business. Ninite even automatically says no to all those pesky toolbar add-ons.
Give it a whirl. We're pretty sure you'll be happy.
This week, it's the moment we've all been waiting for on Greatest Gaming Rig as we benchmark our system for the first time. On the motherboard, we've got all our amazing components plugged in and wired up to our two Corsair power supplies, and we've borrowed a watt-measuring multimeter from our chums at ZDNet UK to measure power consumption. For testing, we used a number of synthetic tests, including 3DMark Vantage, Cinebench, and Unigine Heaven, as well as more real-world results in games like Crysis, Batman: Arkham Asylum, and Modern Warfare 2. All tests are run at stock speeds and at a 1920x1080 resolution.
The rig was set up as follows:
2 x Intel Xeon X5680 @ 3.33GHz
3 x GTX 480 1.5GB 3-way SLI
1 x GTX 460 for PhysX
24GB of DDR3 RAM
Corsair Force 120GB SSD
1 x Corsair AX1200 PSU
1 x Corsair AX850 PSU
First up are 3DMark Vantage's CPU tests. The first runs an AI test, which features a number of high-intensity cooperative manoeuvring and pathfinding artificial intelligence calculations. The second runs a physics test, which features smoke collision and various cloth and soft-body obstacles. The Xeon X5680 is the fastest CPU that Intel currently produces, and with its six cores and workstation credentials, we expected some fantastic results. As the graph shows, our system easily came out on top, almost doubling the score of the Core i7-980x.
Next we fired up Cinebench. It's based on Maxon's Cinema 4D, which is a piece of software used by production houses to create 3D content for movies. The benchmark renders a 3D scene with approximately 2,000 objects, more than 300,000 polygons, reflections, area lights, shadows, procedural shaders, and antialiasing. Cinebench is a great test for the X5680 because it's fully multi-threaded. This means it uses all 24 cores, taxing the CPU to the extreme. With all those cores, our CPU easily bested the i7s and made an absolute mockery of the aging Core 2 Duo.
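How much all those hardware threads actually help depends on how much of the workload parallelises; Amdahl's law gives the ceiling. The 95% figure below is an illustrative assumption, not a measured number for Cinebench:

```python
def amdahl_speedup(parallel_fraction, n_threads):
    """Upper bound on speedup when only part of the work runs in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_threads)

bound = amdahl_speedup(0.95, 24)  # ~11.2x: even 95%-parallel code is far from 24x
```

This is why a fully multi-threaded renderer is the ideal showcase for a 24-thread setup: anything less parallel leaves most of those cores idle.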
Kicking off the graphics benchmarks is Unigine's Heaven. It uses a number of graphical techniques designed to tax the most hardened of GPUs, including dynamic lighting, physics calculations, and hardware tessellation, a feature unique to DirectX 11. Scores were taken with all settings maxed out and hardware tessellation enabled. Interestingly, though the three GTX 480s easily beat out the single cards, it's not by as large a margin as we predicted. With some tweaking and overclocking, we should be able to eke much better performance out of them.
Synthetic benchmarks are one thing, but what's really important is how the system handles actual games. We tested Crysis, Batman: Arkham Asylum, and Modern Warfare 2 all on maximum settings, getting great results. Of note was Batman: Arkham Asylum, which didn't run all that well in the video. After we restarted the machine, though, the game ran flawlessly, and we haven't been able to replicate that slowdown since.
During our tests we monitored the power consumption of the rig; the results were not exactly eco-friendly. Power usage peaked at a massive 950 watts during the 3DMark Vantage tests and averaged around the 650-850-watt mark when playing games. This is likely to go much higher once we start overclocking the system, meaning a higher electricity bill for us, and sadness for baby seals the world over.
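Turning that draw into a number on the bill is simple arithmetic. The usage pattern and the 12p/kWh unit rate below are assumptions for illustration:

```python
def annual_cost_gbp(watts, hours_per_day, pence_per_kwh=12.0):
    """Yearly electricity cost in pounds for a device at a constant draw."""
    kwh_per_year = watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * pence_per_kwh / 100.0

gaming_cost = annual_cost_gbp(750, 4)  # ~131 GBP/year at a 750W average draw
```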
Check back next week when we'll be building the rig into a custom-made acrylic DangerDen case. Don't forget that we're giving you the chance to win the entire rig*, complete with accessories! Just head over here to find out more.
* To enter you have to be over 13 and a UK resident; the competition is subject to our normal terms and conditions.
Flagship products are fun to talk about, but they're not particularly practical for more than a handful of us. The vast majority of PC gamers gravitate toward the $200 sweet spot for GPUs. So instead of trotting out a $600 part to premiere its new lineup of GPUs, AMD is rolling out a pair of more accessible GPUs--the Radeon HD 6870 and HD 6850.
Radeon HD 6870
1120 stream processors
1GB GDDR5 RAM
Radeon HD 6850
960 stream processors
The new 6000 series parts are basically architecturally refined 5000 series parts with a few new tricks. These new GPUs are more power efficient, have second-generation DirectX 11 optimizations, accelerate Blu-ray 3D, and feature new antialiasing and anisotropic filtering techniques to improve image quality.
Morphological antialiasing, new to the Radeon HD 6000 series, provides full-scene antialiasing via a post-processing technique. Traditional AA modes are applied when the scene is being rendered; morphological AA is done after the frame has been generated. The new AA mode is compatible with any DirectX 9/10/11 game. Drivers for the new modes were delivered too late for us to perform a thorough testing of this feature.
Architecture improvements allowed AMD to reduce silicon size while keeping performance high and power consumption low. The Radeon HD 6870 consumes 150W at peak power, and the Radeon HD 6850 consumes 127W. Both GPUs idle at 19W. Compared to the parts they are replacing, both GPUs have fewer stream processors but feature much higher clock rates. By comparison, the Radeon HD 5850 has 1440 stream processors, while the HD 6870 has 1120.
The pricing and nomenclature of these new parts cause a bit of confusion when they are compared to AMD's current-generation GPUs. The Radeon HD 6870 actually replaces the Radeon HD 5850, and the HD 6850 boots out the Radeon HD 5830. The $300-plus Radeon HD 5870 will continue to rule the AMD roost for a bit longer. AMD will discontinue manufacturing Radeon HD 5800 and 5900 series parts going forward from the third quarter of 2010, but it will keep HD 5700-based parts in the product mix.
When lining up the potential competitors, nothing fits perfectly, but a few bucks here and there gets us close enough.
$170 - GeForce GTX 460 768MB
$180 - Radeon HD 6850
$190 - GeForce GTX 460
$200 - GeForce GTX 460 OC
$240 - Radeon HD 6870
We leaned on the GeForce GTX 460 1GB at two speeds. We didn't have the slightly cheaper 768MB version of the GeForce GTX 460 on hand for testing.
It's clear that the new Radeon HD 6000 series completely replaces the older 5000 series parts. Its performance is superior in every test across the board. When compared to the competition, the Radeon HD 6870 trades blows with the overclocked GTX 460. The Radeon HD 6850 trails the slightly more expensive stock GeForce GTX 460 1GB.
With the future of such lucrative GPU spots in flux, both AMD and Nvidia sent out flurries of e-mail adjusting prices, firing accusations, and in general muddying the waters. Nvidia's reduced pricing on the GeForce GTX 460 1GB makes it a difficult part to beat. For $220, you can find 800MHz GeForce GTX 460s, which makes for quite a compelling argument when the $200 725MHz version already gives the Radeon HD 6870 such a good run. Nvidia also dropped the price of the GeForce GTX 470 down to the $260 range, which should offer the allure of slightly better performance for slightly more money. The Radeon HD 6850 competes well with the more expensive stock GeForce GTX 460 1GB, which means it's definitely competitive with the 768MB version of the GTX 460.
The battle for these important GPU price ranges isn't over yet. Both HD 6000 series parts have had to deal with heavily discounted and heavily overclocked parts on launch day. AMD has already mentioned that its partners are in the process of unveiling overclocked 6000 series GPUs as well. Pricing, as we know, is never final.
Determining a victor here is moot because even minor price movements and overclock bumps would overturn any verdict as fast as we could proclaim it. One thing is for certain, though: The Radeon HD 6870 and Radeon HD 6850 provide enough competition to force Nvidia to slash prices overnight, which honestly makes the consumer the real winner here.
Test System: Core i7 980x @4.2GHz, Asus Rampage III Extreme, 6GB OCZ DDR3, Seagate 750GB 7200.11, Windows 7 64-bit. Video card drivers - Catalyst beta, Forceware 260.89.