
Preparing your PC for Star Wars: Battlefront

Digital Foundry on PC performance and recommended hardware across all budgets.

The Star Wars: Battlefront beta concluded a few days ago and DICE can seemingly rest easy - the nightmares of the Battlefield 4 era look to be over. Server stability held up despite the immense load caused by a cumulative 9m users, and what was presented was a remarkably solid piece of code - good news, bearing in mind that the game releases in just a few short weeks. The beta also allowed us to profile the revised Frostbite 3 engine across a range of graphics hardware, allowing us to put together this performance preview with settings and hardware recommendations.

We can assume that DICE will continue to optimise the game up until release, and we strongly suspect that at least one of the major GPU vendors will be looking to tighten up performance via driver updates, but so close to launch, we aren't expecting massive changes between now and the full November 17th release.

In putting together this performance preview, we used the console versions as a base on which to define the core experience, attempting to match PC presets with the PS4 presentation. Why use console-equivalent settings as our foundation? Well, we can safely assume that this offers the best 'bang for the buck' in terms of what can be done with limited graphics resources. On top of that, dropping down too low in terms of PC's global presets kills off key effects present on PS4 and Xbox One - the immense tessellated geometry effect in the environments, for example.

Star Wars: Battlefront is predominantly driven by the GPU in your PC, but before we begin we must sound a note of caution about CPU utilisation. We tested the beta with two processors - the Core i3 4130 and the Core i7 4790K, dual-core and quad-core chips respectively. Even in the 40-player Hoth battle, the 4130 performed admirably when paired with Nvidia cards, to the point where we actually played the game at a locked 1080p60 using Titan X - a hilarious mismatch of budget CPU and $1000 graphics hardware.

However, the same CPU had trouble even powering a low-end Radeon - the R7 360 - with intrusive stutter. We'll look at CPU performance in more depth when Battlefront launches (specifically with a view to seeing how AMD processors cope), but in the here and now, we can't recommend Battlefront for AMD GPUs when paired with less capable processors - check out the first video to see the impact this has.

Star Wars Battlefront as compared between PS4 and PC. Note the use of parallax occlusion mapping across the ground - plus texture filtering that matches PC's high setting.
The depth of field filter here is a match for the high setting, and the lens flares effect is also identical.
Texture quality is a match for the high PC preset as well, with less detail resolved on rocks when compared to ultra.
Ambient occlusion on PS4 sits above the medium quality setting. Curiously, the PS4's implementation of shadows is unique on console, with slightly blurrier edges than the PC's presets - though without the aliasing seen on the low preset.

The best graphics card for Star Wars: Battlefront under £100/$130: GTX 750 Ti

Recommended Settings: 1080p, 85 per cent resolution scale, high presets with ultra textures, low effects, medium shadows, medium ambient occlusion, TAA.

Let's begin by looking at what we like to term entry-level enthusiast graphics cards - GPUs that, with careful tweaking, can usually match the quality and performance of the Xbox One and PlayStation 4. In this case, that'll be Nvidia's GTX 750 Ti and AMD's R7 360. Our settings are a very close match to what we've nailed down as console-equivalent presets, though what we gain slightly in resolution we lose by dropping ambient occlusion from high to medium (in practice, it's hard to see too much of a visual downgrade, but the performance increase is palpable).

Now, here's the thing: DICE appears to have strongly optimised Battlefront for console, and this translates into a distinct AMD advantage across a majority of the PC graphics market, starting with the sub-£100 enthusiast category. In like-for-like testing on Battlefront's pre-scripted cut-scenes, that amounts to a five per cent boost in favour of AMD (and the outgoing 260X will be faster still). We also overclocked both cards, with a 1200MHz core/1800MHz RAM set-up on the R7 360 and a +200MHz core/+400MHz boost on the GTX 750 Ti. The R7 360 is still faster here, but only by around two per cent. We've attempted to match console settings with our chosen presets above, but what's clear is that getting the same level of performance on these cards is challenging. You'll need to overclock to get the job done, and even then, the GTX 750 Ti falls a little short. Further resolution scaling may be required to get an absolute lock at 60fps.

In actual gameplay with heavy effects work, the R7 360 clearly feels better. There's just one problem, though - the advantage only manifests when paired with a quad-core processor, while a dual-core Intel processor stutters badly - something that doesn't happen with Nvidia cards. Clearly, AMD's driver overhead is an issue here. Until we can test with AMD's FX-6300 CPU, this situation makes it a default win for the weaker card, the GTX 750 Ti. To be clear, the CPU issue here isn't anything DICE can fix - it's down to AMD to address long-standing issues over its driver overhead.

We check out the GTX 750 Ti and R7 360 running with Core i3 and i7 processors, and provide performance insights on the GTX 950 and the R7 370.


The best graphics card for Star Wars: Battlefront under £130/$160: GTX 950

Recommended Settings: 1080p, 90 per cent resolution scale, high presets with ultra textures, medium effects, medium shadows, high ambient occlusion, TAA.

This is one of the few categories where Nvidia posts a notable improvement in results over AMD as we compare the GeForce GTX 950 and AMD's Radeon R7 370 - basically an overclocked version of the classic Radeon HD 7850. It's not exactly a huge boost on paper - around 9.5 per cent - but in our gameplay stress test in an effects-heavy shoot-out on Tatooine, the GTX 950 sticks well north of the target 60fps, presenting us with a decent degree of headroom, while the R7 370 can momentarily dip beneath the target threshold.

Settings chosen here should actually match or exceed the standard set by PlayStation 4, particularly in terms of effects, while the internal rendering resolution on our settings is effectively 1728x972, up from 1600x900 on the console. It should be stated that DICE's resolution scaler is really rather good, and seems to exhibit more refinement than the similar tech implemented in the console versions.
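DICE hasn't documented exactly how its scaler works, but the numbers quoted here are consistent with a simple per-axis multiplier. As a rough sketch in Python (the function name is our own):

```python
def internal_resolution(width, height, scale_pct):
    """Return the internal render resolution for a given per-axis
    resolution scale percentage, truncating to whole pixels."""
    factor = scale_pct / 100
    return int(width * factor), int(height * factor)

print(internal_resolution(1920, 1080, 90))  # (1728, 972) - our GTX 950 settings
print(internal_resolution(1920, 1080, 85))  # (1632, 918) - our GTX 750 Ti settings
```

The 90 per cent figure at 1080p reproduces the 1728x972 internal resolution quoted above, comfortably north of the console's 1600x900.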

The GTX 950 is easy to recommend here - the base performance is better than its competitor's, there is more overclocking headroom should you need it, and of course, there are no issues pairing it with the Core i3. Combining the GTX 950 with a Core i3 won't see you get the best out of the GPU, but it's certainly preferable to doing so with the R7 370, where the stutter is highly intrusive to the experience.

The Tatooine survival mode shoot-out is a pretty good stress test for the GPU - the effects preset in particular causes performance issues on higher settings and needs to be handled carefully on lower level cards.


The best graphics card for Star Wars: Battlefront under £200/$250: R9 380

Recommended Settings: 1080p, 100 per cent resolution scale, ultra presets with high effects, high shadows, high ambient occlusion, TAA.

The Radeon R9 380 is an upclocked version of the old R9 285, with some firmware refinements that have turned it from an also-ran into a winner in its market sector. Coming off the back of our initial testing of Battlefront running on a more powerful R9 290X, we were genuinely surprised at how well it runs the game at 1080p.

It can't sustain 1080p60 at ultra settings, but dropping effects, shadows and ambient occlusion down a notch gives you more than enough headroom to get that lock - and the game looks quite beautiful. Using the pre-scripted cut-scenes for benchmark comparison purposes against the GTX 960, it's 11 per cent faster - and could be quicker still bearing in mind that our card is a highly clocked MSI Gaming 2G model.

In our gameplay test area, the R9 380 also kept us ahead of the 60fps target, while the GTX 960 could slip beneath, requiring further settings compromises (dropping shadows to medium is an easy win and doesn't impact image quality unduly). In choosing the AMD card, we're going to assume that you're running with an Intel quad or better.

Four GPUs examined in-depth here - the GTX 960, GTX 970, R9 380 and the R9 290X. In these market sectors, AMD offers the highest performance.


The best graphics card for Star Wars: Battlefront under £250/$330: R9 290X/R9 390

Recommended Settings: 1440p, 100 per cent resolution scale, ultra presets with high effects, high shadows, high ambient occlusion, TAA.

Once we cross the £200 threshold, serious options start to open up in terms of visual quality. At 1080p, Nvidia's GTX 970 and AMD's R9 390 produce stunning results, with the ability to run the game at ultra settings with plenty of headroom left over. Owing to a lack of GPU availability, we had to test with an older R9 290X, but in prior tests, the R9 390 has proved faster and at the very least, the 290X should produce a ballpark experience.

With the ultra threshold so easily attained at 1080p, the question is how much further you can push the GPU technology available here. The obvious route forward - for Radeon owners at least - is to push resolution by targeting 1440p. It's not possible to maintain ultra settings across the board here, but using the same compromises we chose for the R9 380, you can get the locked 60fps you need. For those with 1080p screens, upping the resolution scaler to 133 per cent does the same job, and the super-sampling effect is - to our eyes - preferable to the slightly better shadows, effects and AO of the fully enabled ultra experience.
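That 133 per cent figure isn't arbitrary - a quick pixel-count check (a sketch using our own helper, assuming the scaler multiplies each axis independently) shows it lands almost exactly on the GPU load of native 1440p:

```python
def pixel_count(width, height, scale_pct=100):
    """Total pixels rendered at a given per-axis resolution scale."""
    factor = scale_pct / 100
    return int(width * factor) * int(height * factor)

native_1440p = pixel_count(2560, 1440)       # 3,686,400 pixels
scaled_1080p = pixel_count(1920, 1080, 133)  # 2553x1436 = 3,666,108 pixels
print(scaled_1080p / native_1440p)           # ~0.994 - near-identical load
```

In other words, a 1080p screen at 133 per cent scale asks the GPU to shade roughly the same number of pixels as a native 1440p display, which is why the same settings compromises apply.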

It's in this sector of the GPU market where AMD's dominance is most remarkable. At 1080p, the 290X weighed in with a 10 per cent advantage, but this increased to 23 per cent at 1440p. We've never been hugely enthusiastic about AMD's Hawaii-based products, but they seem to love Battlefront.

Extended analysis of how Battlefront runs with the Radeon R9 290X - you should see ballpark performance from the R9 390.


Top-tier graphics: R9 Fury X vs GeForce GTX 980 Ti

Recommended Settings: 3840x2160, 85 per cent resolution scale, high presets with ultra textures, medium effects, medium shadows, TAA.

1440p resolution on ultra settings is no problem at all for the top-tier GPUs on the market - the R9 Fury X and the GTX 980 Ti. Indeed, the Nvidia card - once overclocked - can even sustain 60fps with a resolution scale taking us up to an internal rendering resolution of 2880x1620. AMD and Nvidia tell us that these cards are 4K ready - but what's clear is that we're still some way off being able to run top-tier titles at ultra HD without compromising on settings, which is exactly what we had to do with Star Wars: Battlefront.

Yes, native 4K is possible and perfectly playable, but this game was designed with 60fps in mind and to hit that target, we found ourselves drawing upon the compromises we made in order to get our budget GPUs up to speed. The high settings preset is our template, with effects and shadows pared back to provide the boost to frame-rate we need. However, even then, native 4K still isn't possible - instead, we have the resolution scaler providing us with a very clean image at 85 per cent. This means an internal res of 3264x1836 - still a mammoth pixel count.

Fury X commands a small four per cent lead in like-for-like scripted sequences, but its advantage in gameplay is palpable - it sticks above the 60fps threshold, while GTX 980 Ti requires us to draw upon its overclocking prowess to get the job done. But what we should say is that the experience is quite remarkable. The huge increase to pixel count highlights the stunning attention to detail throughout the game, especially in terms of materials and their interaction with the lighting model. Effectively we are running the PS4 console experience at over 4x the resolution, and the result is beautiful.
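That 'over 4x' figure is easy to sanity-check against the 1600x900 internal resolution we measured for PS4 earlier - a quick back-of-the-envelope calculation:

```python
ps4_pixels = 1600 * 900                          # PS4's 900p internal res
pc_pixels = int(3840 * 0.85) * int(2160 * 0.85)  # 3264x1836 at 85 per cent
print(pc_pixels, pc_pixels / ps4_pixels)         # 5992704, ~4.16x the PS4 count
```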

In terms of which card is preferable, the Fury X is faster at stock speeds, but the 980 Ti's relatively enormous overclocking headroom can't be ignored. It's also worth noting that we tested a reference Nvidia card, while most of the 980 Ti examples on the market feature a factory overclock built-in, which would nullify most of the performance differential.

R9 Fury X and GTX 980 Ti can both max out Battlefront at 1440p - it's at 4K where things become more interesting.

Star Wars: Battlefront - PC performance impressions

In our initial Battlefront piece, we noted that the game was eminently scalable, based on our impressions running it with a Core i7 4790 and R9 290X. Extended testing across a range of graphics hardware bears this out, but what is apparent is that our 'go to' hardware for console-equivalent performance - the GTX 750 Ti - needs to be pushed to its limits to match the PS4 experience, and even then it seems to fall a little short.

The only real unknown right now is how representative the beta levels we tested are of the game's full content. Impressions seem positive, though: performance is pretty consistent across all three levels we tested in the beta, despite the utilisation of very different content, suggesting that DICE has stuck to a particular performance profile that should hopefully hold across the game (we do have concerns based on how detailed the Endor level looks, though!).

What is intriguing is just how dominant AMD hardware is here. Until Nvidia's driver team dives deeper into what makes Battlefront tick and optimises accordingly, AMD enjoys an advantage in key sectors of the graphics market. The Nvidia driver we used already had some optimisation work carried out, but we suspect that Nvidia will continue its efforts to make GeForce more competitive in what is going to be one of the biggest PC titles of the year.

In terms of the beta itself, overall stability was impressive - there's the feeling that lessons have been learned at DICE. Aside from the woeful lack of servers when the beta launched, our experience from an online stability angle was almost flawless. Fingers crossed this will continue when the game launches on November 17th.
