Benchmarked - Metro: Last Light Redux
by Jarred Walton on October 2, 2014 3:20 AM EST
Last month 4A Games released updated versions of the two earlier games in the Metro series, Metro 2033 Redux and Metro: Last Light Redux. The games have both been remastered using the latest version of the 4A Engine, with updates for the latest generation of console hardware among other things. Fundamentally, that means less for Metro: Last Light than it does for Metro 2033, but there are still some visual changes, and that potentially means performance changes as well. We've been using Metro: Last Light as one of our gaming performance benchmarks almost since it first came out in May 2013, and it's still one of the most demanding games around. Of course part of that stems from the use of super-sampling anti-aliasing at the highest quality settings, but even without SSAA Metro: Last Light can be a beast.
Something we've wanted to do more of in the past is to provide smaller updates looking at the performance of recent game releases. Our GPU reviews do a good job of giving a broad overview of performance from the latest graphics cards on a smaller subset of games, and it's basically impossible to test every new GPU on every game at the time of launch. But if you're in the market for a new GPU, you probably want to use it for playing games, which means seeing how new games perform on a selection of hardware is useful. To be clear, we're not replacing our GPU reviews, but we hope to augment our other coverage with increased coverage of recent game releases.
It's worth noting that testing gaming performance at the time of launch may also tell an interesting story about the state of drivers from the various GPU companies. AMD and NVIDIA are the two obvious participants, but with Intel continuing to increase the performance of their Processor Graphics solutions it's also important to see how they fare with new releases. In some cases we may see serious performance issues or rendering errors early on, and if/when that happens we may elect to revisit the performance of certain games a month or two after launch to see what has changed. We've encountered instances in the past where drivers tended to target and fix issues with the most commonly benchmarked games, and while things are certainly better these days it's always good to look at empirical data showing how the various companies stack up.
With that out of the way, let's see what has changed with Metro: Last Light Redux, both in terms of graphics as well as performance. Starting with the former, in most areas you'll be hard pressed to see substantial differences. The most noteworthy exception is the use of red lights and smoke in place of white lights/smoke in some areas; this is particularly apparent in the built-in benchmark. There also appears to be more tessellation in some areas, and at the end (when the "train" gets blown up), you can see in Redux that there's more deformation/destruction of the concrete barrier. I've created a split-screen video showing the original Metro: Last Light on the left and Metro: Last Light Redux on the right. The games were both run at 1080p maximum quality settings, with Advanced PhysX disabled. (Note that with video recording I limited the frame rate to 30 FPS, so disregard the performance shown in that clip.)
Other than the aforementioned changes in lighting color for the smoke, it's difficult to say how much the graphics have improved versus simply being different from the initial release. I've benchmarked Metro: Last Light hundreds of times over the past year (perhaps even thousands), but I have to admit that I haven't actually taken the time to play the game that much, so many of the more subtle changes might go unnoticed.
The list of updates notes that there are graphical upgrades, lighting enhancements, improvements to the gameplay and gunplay, and Redux also includes all of the DLC released for the original game. There have been some updates to certain maps/areas as well, all the weapons that were added via DLC are integrated into the game, and there are some minor UI tweaks (e.g. you can check your watch and inventory as in the original Metro 2033). Finally, there are new achievements/trophies along with two new modes – Spartan and Survival – in Redux. Spartan is basically the way the original Last Light worked (more run-and-gun gameplay, more ammo, not as "hard") while Survival mode is more like the original Metro 2033 (less ammo and health, more difficult enemies). From what I can tell, though, having more (or less) ammo in either game doesn't really change things too much.
But what about performance – is Metro: Last Light Redux any faster (or slower) at rendering its updated graphics compared to the original? To answer that, I've got a rather different set of hardware than what Ryan uses for our GPU reviews, as all of the hardware has been purchased at retail over the past year or so. For now I'm going to focus on single GPU performance, and while I do have a moderate collection of both AMD and NVIDIA GPUs, for the time being my hardware is slanted more towards high-end offerings than lower-tier parts. On the laptop side, we'd also like to thank MSI for letting us use three of their latest notebooks, the GT70 Dominator Pro with GTX 880M, the GS60 Ghost Pro 3K with GTX 870M, and the GE60 Apache Pro with GTX 860M. Here's the short list of hardware that I've used for testing:
|Gaming Benchmarks Test Systems|
|CPU||Intel Core i7-4770K (4x 3.5-3.9GHz, 8MB L3), overclocked to 4.1GHz|
|Motherboard||Gigabyte G1.Sniper M5 Z87|
|Memory||2x8GB Corsair Vengeance Pro DDR3-1866 CL9|
|GPUs||Gigabyte Radeon HD 6970|
|||Sapphire Radeon R9 280|
|||Sapphire Radeon R9 280X|
|||Gigabyte Radeon R9 290X|
|||EVGA GeForce GTX 770|
|||EVGA GeForce GTX 780|
|||Zotac GeForce GTX 970|
|||GeForce GTX 880M (MSI GT70 Dominator Pro)|
|||GeForce GTX 870M (MSI GS60 Ghost Pro 3K)|
|||GeForce GTX 860M (MSI GE60 Apache Pro)|
|Storage||Corsair Neutron GTX 480GB|
|Power Supply||Rosewill Capstone 1000M|
|Case||Corsair Obsidian 350D|
|Operating System||Windows 7 64-bit|
The obvious omission here is the new GeForce GTX 980, though we're also missing GTX 780 Ti, R9 290, not to mention all of the mainstream GPUs like the GTX 750/750 Ti, the whole AMD R7 series, etc. The good news is that the laptops at least give us some idea of what to expect from such cards – the GTX 860M for instance is clocked very similarly to the GTX 750 Ti, and GTX 870M is similar to the OEM GTX 760 192-bit. Again, we'll work on improving the selection of cards tested and try to cover a broader range in the future, but for now let's see how performance differs between the two releases of Metro: Last Light.
We've tested at 1080p with maximum quality (Very High Quality) and we also ran a second test at 1080p with High Quality and without SSAA. In both cases we're testing without enabling Advanced PhysX. While PhysX can make a noticeable difference at times (the Batman games being a prime example), I can't say I've noticed anything but lower frame rates from the feature in the Last Light benchmark – it basically drops performance about 10-15% on NVIDIA cards, and minimum frame rates in particular can be very poor. Advanced PhysX also seems to cause issues with some NVIDIA cards (see below). Our settings then are essentially "Ultra" quality and "High" quality; here's what performance looks like for the two releases on our selected hardware:
So this is where things get interesting. At our maximum quality settings, performance with Metro: Last Light Redux is lower across almost the entire range of hardware. The R9 280 and MSI GE60 are the two exceptions, where performance basically stays the same; everywhere else we see anywhere from a 2% to an 11% drop. When we drop the quality settings a notch and disable SSAA, on the other hand, Redux performance is slightly lower (essentially the same) only on the HD 6970; all of the newer GPUs are anywhere from 10% to 19% faster. That suggests optimizations have been made for the modern GPUs that simply don't translate as well when SSAA is enabled.
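The percentage deltas quoted above are straightforward to compute from average frame rates. Here's a minimal sketch; the FPS pairs in the example are hypothetical values for illustration, not figures from our charts:

```python
def percent_change(original_fps, redux_fps):
    """Percent change from the original release to Redux; negative = slower."""
    return (redux_fps - original_fps) / original_fps * 100.0

# Hypothetical illustration: a card that drops from 50 FPS to 45.5 FPS
# at Ultra (with SSAA) but climbs from 80 FPS to 92 FPS at High (no SSAA).
print(round(percent_change(50.0, 45.5), 1))  # -9.0 (a ~9% drop)
print(round(percent_change(80.0, 92.0), 1))  # 15.0 (a 15% gain)
```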
As for AMD vs. NVIDIA, similar to what we saw in our recent GTX 970 review, NVIDIA's new "budget friendly high-end GPU" basically offers performance on par with AMD's top-of-the-line R9 290X at a much lower price. The GTX 970 also tends to perform at roughly the same level as the GTX 780, with the 780 now being cleared out at lower prices. The GTX 770 meanwhile offers roughly the same performance as the R9 280X, though in this case the AMD GPU has the lower price; of course, the GTX 770 is being phased out in favor of the GTX 970 as well.
One other item worth mentioning is that I noticed my Zotac GTX 970 GPU was a bit flaky with Redux, particularly at even higher settings (e.g. 2560x1440 maximum or 1080p high quality, with Advanced PhysX). I was running at the card's stock settings initially (which have a mild 26MHz bump on the base GPU clock), and I thought perhaps temperatures were getting too hot on some components. It turns out the real culprit is Advanced PhysX, which tends to crash Redux every few minutes on the GTX 970.
I haven't tested PhysX extensively, but some additional testing of the GTX 780 also showed crashes with PhysX enabled (though it takes about twice as long as the GTX 970 to crash to the desktop, so 10 minutes instead of five). Either Metro: Last Light Redux has some poorly implemented PhysX code, or NVIDIA needs to tweak their drivers for Redux to achieve stability at certain settings with Advanced PhysX enabled. This is definitely a fringe case, however, so it's not likely to affect many users either way.
Overall, the Redux release of Metro: Last Light won't be any more – or less – playable on most systems than the original game. Metro 2033 Redux, of course, saw a much greater overhaul in terms of graphics and gameplay, but that also means its system requirements are higher than the original game's, likely on par with Last Light Redux. In other words, if you're looking for the poster child of why gamers might want SLI or CrossFire builds, the Metro Redux games are right up there with other punishing titles like the Crysis series, at least if you want to crank up every quality setting. SSAA is as usual a major hit to performance, so turning it off can boost frame rates by almost 100% at the cost of having jaggies.
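The near-100% gain from disabling SSAA follows from the sample count: supersampling shades a multiple of the output resolution's pixels every frame, so a GPU that is shader or fill-rate bound loses throughput roughly in proportion. A rough sketch of the arithmetic; the 2x factor here is an assumption for illustration, as the exact multiplier depends on the SSAA setting used:

```python
def shaded_samples(width, height, ssaa_factor):
    """Total samples shaded per frame; the SSAA factor multiplies pixel count."""
    return width * height * ssaa_factor

base = shaded_samples(1920, 1080, 1.0)  # no SSAA: ~2.07M samples per frame
ssaa = shaded_samples(1920, 1080, 2.0)  # assumed 2x SSAA: ~4.15M samples

# Double the shading work per frame implies roughly half the frame rate
# when the GPU is shading-limited, hence the ~100% boost from disabling it.
print(ssaa / base)  # 2.0
```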
And on a final note, there's a huge onslaught of games coming, and we're hoping to test many of them in a format similar to this. Your feedback is welcome as always, and if you have any requests for games that are already available or coming soon that you'd like to see benchmarked, let us know. Also let us know if you'd like to see additional settings tested; I confined the results reported to 1080p at High and Ultra quality, but we could certainly run other settings. Since these are all single GPU configurations, 2560x1440 with Redux proves to be too much in most cases, unless we drop SSAA; the laptops meanwhile might benefit from 1920x1080 and medium quality settings, though that's a bit too light on the faster desktop GPUs. Anyway, let us know what you'd like to see.
JarredWalton - Friday, October 3, 2014 - link
Fixed! No double final notes -- that's what I get for editing and adding content after posting. Hahaha.
xTRICKYxx - Thursday, October 2, 2014 - link
Could you do different CPU + GPU combinations for these kinds of articles? I know it would be time consuming...
JarredWalton - Friday, October 3, 2014 - link
I thought about that before, but basically it would double the time, and that's a lot of work for a small payoff in information. Plus, what CPU should I use as a second option? The i3-3225 I have sitting around, or I could get an FX-8320. I think most CPUs used for gaming will be at least at the i5 level, and outside of CrossFire and SLI rigs the performance will generally be GPU limited, regardless of CPU. If there's enough demand for it, I'll reconsider, but for now I'm sticking to one CPU. :)
Impulses - Friday, October 3, 2014 - link
I'd love to see testing at higher res and/or SLI/CF configs; I know that adds to the variables and would require a different test group besides the laptops, and I realize those of us running larger displays and multiple cards are the minority, but still...
A lot of these tests are just gonna boil down to a single fact otherwise: a large percentage of recent cards can run everything just fine at 1080p and a smaller percentage can't. Pushing setups to the limit is usually a more interesting read, and more revealing of the relative performance differences.
JarredWalton - Friday, October 3, 2014 - link
Unfortunately I have no SLI configurations right now, though I can run 280, 290, and 290X CF. I have a lot more AMD GPUs at present than NVIDIA, though I'll see if we can fix that. Just one more 970 and two 980 cards and I'd be set. Hahaha.
tential - Friday, October 3, 2014 - link
I think this is a good idea, but I don't like the day 1 testing; or rather, I hope you do follow-up testing for the more popular games.
DPOverLord - Friday, October 3, 2014 - link
Metro Last Light was one of those games that's remarkable but unremarkable when it came to SLI.
We did extensive benchmarks with Titans on Surround 1600p & 1440p (portrait and Surround, meaning 7680x1600 & 4800x2560). Multi-GPU is similar to Titanfall... not optimized.
It would be interesting to see if they fixed this issue in the Redux version. Are there any plans to check this at resolutions other than 1080p? We of course appreciate your article, but the industry is moving away from 1080p slowly but surely, and it would be good to see benchmarks from you guys in this regard.
Surround / higher resolutions are more demanding and more accurately portray whether the developer is taking advantage of higher-resolution textures. I am not knocking 1080p; as developers, they look for what "70+%" of their clientele (us) use. Hence why SLI / Surround is never optimized proficiently. However, with higher resolutions, SLI usage should be increasing.
Hope that makes sense in a dumbed-down version, but it would be nice to see benchmarks that are not just 1080p; otherwise most of these articles are not as beneficial. AnandTech of all people should be able to switch out a monitor or two.
JarredWalton - Saturday, October 4, 2014 - link
At this time, I only have CrossFire setups for the 290X and 280 (and 6970, though that's sort of not useful now). I only have single NVIDIA GPUs for the time being, and of course more configurations means more time to test. Assuming I can get a second GTX 970 (780 and 770 optional), I could at least run a few comparisons for surround gaming as I do have multiple monitors available.
However, let me just say that I don't think developers are targeting the 70+% when they ignore multiple monitors but more like the 95%. Yes, multiple displays are a good way to bring GPUs to their knees, but so are 2560x1440 and 3840x2160. I'm more inclined to add those than surround gaming.
I actually have 2560x1440 numbers available, but at least for Metro Redux (with SSAA enabled) it's not particularly useful data without SLI/CF results. The GTX 780 hits 26.6 FPS average, the 970 is 25.6, and the R9 290X is also 25.6. A single R9 280 meanwhile is down at 15.6 FPS and the R9 280X is 19.2 FPS. In other words, not one of the single GPU configurations is able to reach 30+ FPS in Metro Redux. (Note: GTX 980 probably gets there, but not with much room to spare.)
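To put those averages in frame-time terms, here's a quick sketch converting each card's FPS (the figures quoted above) to milliseconds per frame and its shortfall relative to a 30 FPS target:

```python
# Average FPS at 2560x1440 with SSAA, from the figures quoted above.
results = {"GTX 780": 26.6, "GTX 970": 25.6, "R9 290X": 25.6,
           "R9 280X": 19.2, "R9 280": 15.6}

for card, fps in results.items():
    frame_time_ms = 1000.0 / fps           # 30 FPS corresponds to ~33.3ms
    shortfall = (30.0 - fps) / 30.0 * 100.0  # percent below the 30 FPS target
    print(f"{card}: {frame_time_ms:.1f} ms/frame, {shortfall:.0f}% short of 30 FPS")
```

Even the fastest single card here needs its frame times cut by more than 10% to hit a steady 30 FPS, which is why SLI/CF results would be the interesting comparison at this resolution.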
Hope that helps; if this becomes a regular section on AnandTech (which is what I'm hoping to do), we'll almost certainly add additional GPUs in the future. Consider this the beta release. Hahaha. :-)
DPOverLord - Sunday, October 5, 2014 - link
Thanks for writing back!
I hear what you are saying. I do feel though that in today's day and age multiple GPU configurations will become more and more mainstream. From 4k to multiple monitors it would be great to see benchmarks more reflective of that.
As I said, the reason programs/games are not always optimized is due to the developer's budget. NVIDIA and AMD claim it's a simple driver update, but that's not the case; see Titanfall and a few other games that needed code edited.