ASRock FM2A88X Extreme6+ Review
by Ian Cutress on March 19, 2014 11:59 AM EST

F1 2013
First up is F1 2013 by Codemasters. I am a big Formula 1 fan in my spare time, and nothing makes me happier than carving up the field in a Caterham, waving to the Red Bulls as I drive by (because I play on easy and take shortcuts). F1 2013 uses the EGO Engine, and like other Codemasters titles it ends up being very playable even on older hardware. In order to beef up the benchmark a bit, we devised the following scenario for the benchmark mode: one lap of Spa-Francorchamps in the heavy wet. The benchmark follows Jenson Button in the McLaren, who starts on the grid in 22nd place, with the rest of the field made up of 11 Williams cars, 5 Marussias and 5 Caterhams in that order. This puts the emphasis on the CPU to handle the AI in the wet, and allows for a good amount of overtaking during the automated benchmark. We test at 1920x1080 on Ultra graphical settings for a single GPU, as using multiple GPUs seems to have no scaling effect.
[Chart: F1 2013, 1080p Max: average and minimum frame rates for NVIDIA and AMD GPUs]
Compared to the Intel platforms we have put through our 2014 gaming tests so far, the A10-7850K delivers reasonable 60+ FPS numbers in single-GPU F1 2013, but the high-end Intel parts offer almost 50% more performance. Adding more GPUs only compounds the issue. We are testing other FM2+ motherboards to see whether this range of results is consistent.
Bioshock Infinite
Bioshock Infinite was Zero Punctuation’s Game of the Year for 2013, uses the Unreal Engine 3, and is designed to scale with both cores and graphical prowess. We test the benchmark using the Adrenaline benchmark tool and the Xtreme (1920x1080, Maximum) performance setting, noting down the average frame rates and the minimum frame rates.
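As a quick illustration of how these two metrics relate, the sketch below (Python) derives an average and a minimum frame rate from a per-frame render-time log. This is only a sketch with invented frame times, not the Adrenaline tool's actual method.

    # Derive average and minimum FPS from per-frame render times in ms.
    # Illustrative only; the frame times below are invented.
    def fps_metrics(frame_times_ms):
        total_seconds = sum(frame_times_ms) / 1000.0
        average_fps = len(frame_times_ms) / total_seconds  # frames rendered per second overall
        minimum_fps = 1000.0 / max(frame_times_ms)         # rate implied by the single slowest frame
        return average_fps, minimum_fps

    # Example: five frames at ~60 FPS pacing with one 40 ms stutter
    avg, minimum = fps_metrics([16.7, 16.7, 40.0, 16.7, 16.7])
    print("Average: %.1f FPS, Minimum: %.1f FPS" % (avg, minimum))  # ~46.8 and 25.0

A single slow frame drags the minimum far below the average, which is why we note down both numbers.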
[Chart: Bioshock Infinite, 1080p Max: average and minimum frame rates for NVIDIA and AMD GPUs]
With Bioshock Infinite the difference is not as large as it was in F1 2013; however, beyond a single GPU there is still a deficit.
Tomb Raider
The next benchmark in our test suite is Tomb Raider, an AMD-optimized game lauded for its use of TressFX to create dynamic hair and increase in-game immersion. Tomb Raider uses a modified version of the Crystal Engine and enjoys raw horsepower. We test the benchmark using the Adrenaline benchmark tool and the Xtreme (1920x1080, Maximum) performance setting, noting down the average frame rates and the minimum frame rates.
[Chart: Tomb Raider, 1080p Max: average and minimum frame rates for NVIDIA and AMD GPUs]
Tomb Raider runs well on the AMD platform, as the game is essentially CPU agnostic.
44 Comments
niva - Wednesday, March 19, 2014 - link
These benchmarks are making me depressed for AMD CPUs. I guess it's time to switch to Intel after not having purchased an Intel chip since 1996.

nathanddrews - Wednesday, March 19, 2014 - link
Come on in, the water's fine.

Malorcus - Wednesday, March 19, 2014 - link
I hear you man, I did the same with my current Ivy Bridge CPU. I am looking to build a media computer using an AMD APU though. They still have their niche, but it is not in high-end computing/gaming.

ddriver - Wednesday, March 19, 2014 - link
You'd be surprised by the number of needed and commercially viable tasks for which those poor CPUs are more than fast enough. It is a sad thing to see AMD struggling to compete with Intel's value products.

Fallen Kell - Thursday, March 20, 2014 - link
AMD has sadly not had a real competing product on the high-end side for 6 or 7 years now. On the low end, AMD was competing, until Intel decided to compete in this market segment. The last two updates Intel has made were more focused on the lower end than on the high end. This is finally cutting into the one thing keeping AMD alive. I really hope AMD survives, because those of us who are old enough to remember know that Intel hates the consumer and only really pushes technology when it is competing. Otherwise we would have CPUs that are only soldered directly into motherboards with no ability to upgrade, completely locked-down CPUs with no ability to overclock, locked-in memory bus speeds tiered based on the CPU/motherboard you purchased, with higher-performance memory compatibility costing you extra, etc., etc.

But I really don't see a way that AMD can compete at this point. They are still hemorrhaging money (not nearly as badly as a year ago, when they lost $1.2 BILLION, but even after restructuring to cut 31% of their operating costs, they still lost $162 million last year). While I understood the reason for acquiring ATI, I believe ATI is worse off due to that acquisition. ATI went from being a profitable company competing well in its market to one that is losing money and is seemingly almost a generation behind Nvidia in its offerings. I say this based on the fact that the brand-new top-of-the-line AMD graphics cards can barely beat the last generation of cards from Nvidia in performance and cannot come anywhere near the Nvidia offerings in power/performance or heat/performance, and Nvidia is getting ready to release its true next generation of cards, having simply released the last generation at its full potential to beat AMD's current cards once AMD finally had a competing product. The major losses that AMD as a whole has taken are taking their toll on the R&D it can afford to do toward increasing the efficiency of its designs with respect to power and cooling requirements while still pushing the performance of its cards.
Demiurge - Friday, March 21, 2014 - link
Funny, I thought the same thing about Intel CPUs when GPUs started to encroach on high-performance features such as physics, ray-tracing, signal processing, and other high-intensity applications. There's a bigger picture that I think a lot of people miss. The market has already shifted away from CPUs being the centerpiece of high-performance applications. AMD has the right strategy with buying ATI and the paradigm of Heterogeneous Computing, but like Intel with the P4, it's too little too late. If they had the software, they might have been able to pull this off, but that is exactly what they are trying to do with Mantle. I think only unanimous adoption would have guaranteed a win. It was a big risk, and it would have been an amazing upstart, but I don't think it will pay off as much as they need it to. Just a modest opinion... and so a long rant ends... ;-)

pandemonium - Thursday, March 20, 2014 - link
AMD hasn't been competitive with Intel for the consumer since 1999 or so? They've always been cheaper, and always been far behind in gaming and general desktop usage results as well. You are very, very, very late to the party, lol.

mr_tawan - Thursday, March 20, 2014 - link
I believe it was the Intel Core series (2006) that started to pull ahead of AMD's CPUs. Before that, AMD CPUs both performed better and cost less; Intel CPUs were power hungry and expensive while not delivering excellent performance.

The raw ALU performance of the current line of AMD CPUs is quite low because of the design decision to reduce the space occupied by the CPU cores while putting even more GPU on the die, and then making the two work together more nicely. That is the direction AMD is heading.
I believe that one day even the FX line will be APUs, just like the A-series.
Vinny DePaul - Thursday, March 20, 2014 - link
I feel your pain. I am an AMD fan. I want to keep using AMD, but the CPU runs too hot. It heats up the room! I switched to Intel. It is just easier... I hope AMD's involvement in the PS4 and Xbox One will shape the future of games and software.

Xpl1c1t - Thursday, March 20, 2014 - link
This is exactly what I did, going from an Athlon XP Palomino, to an Athlon 64 Venice, to a Core i5 Lynnfield.

I'd consider purchasing an AMD processor again if the whole APU thing becomes truly competent and powerful at a smaller and more efficient node. I'd promptly and gladly buy an APU with the equivalent of 2 IVB cores and a 7870 onboard if it could be mounted on a Pico-ITX board...