Similar to the last game we looked at, Lords of the Fallen, Assassin's Creed: Unity has had a bit of a rocky start with bugs and other issues needing to be ironed out. It also happens to be a very demanding game to run – at maximum quality, it will basically chew up any GPU you throw at it and spit out crispy bits of silicon. And it's not just GPUs that get eaten, as CPU power can have a substantial impact as well. Finally, and this is not necessarily correlated with the other items in this list, Assassin's Creed: Unity (ACU) is an NVIDIA "The Way It's Meant To Be Played" title, and it's also one of the notable games for NVIDIA's GameWorks toolset – ACU includes support for HBAO+, TXAA, PCSS, Tessellation (coming in a future patch), and now MFAA (which we looked at yesterday).

There's an interesting corollary to the above items that's worth getting out of the way: reviews of Assassin's Creed: Unity have so far been rather lackluster, with an overall average Metacritic score currently sitting at 70%. That's not particularly good for a series that has otherwise had good reviews – e.g. the last game, Black Flag, has an average score of 84%. Perhaps more telling is that the current average user review at Metacritic is an abysmal 2.1. Looking at the comments and reviews makes it abundantly clear that ACU tends to run like a slug on a lot of systems.

I think part of the problem is the mistaken idea that many gamers have that they should be able to max out most settings on games. Assassin's Creed has never been a particularly light series in terms of requirements, though at lower detail settings it was usually playable on a wide selection of hardware. With ACU, the requirements have basically shot up, especially for higher quality settings; at the same time, the rendering quality even at Low is still quite good, and Medium is enough that most users should be content with the way it looks. But if you want to run at High, Very High, or Ultra quality, you'd better be packing some serious GPU heat. The other part of the problem is that the game was likely pushed out the door for the Christmas shopping season before it was fully baked, but that happens every year it seems.

There's another element to the Assassin's Creed: Unity launch worth pointing out; this is a multi-platform release, coming out simultaneously on PC, PS4, and Xbox One. By dropping support for the PS3 and Xbox 360, Ubisoft has opened the doors to much higher quality settings, but the requirements may also be too high for a lot of PCs. With the new generation of consoles now sporting 8GB RAM, we've seen a large jump in resource requirements for textures in particular. I mentioned in the Lords of the Fallen article that GPUs with less than 4GB VRAM may need to opt for lower quality settings; with ACU (at least in the current state of patch 1.2), you can drop the "may" from that statement and just go in knowing full well that GPUs with 2GB RAM are going to struggle at times.

Test System and Benchmarks



  • silverblue - Thursday, November 20, 2014

    There's been a rather intense furore following AC:U's launch. Essentially, Ubisoft have blamed AMD for poor performance, and then partially retracted said statement. Considering how closely Ubisoft works with NVIDIA, it would seem they've only developed for GeForce cards... but some NVIDIA users are having issues as well. What's more, Far Cry 4 is also causing issues with AMD hardware. Both were developed by the same software house.

    All in all, it looks more likely that Ubisoft Montreal's internal testing and QA is not up to scratch. You can't simply blame one vendor's CPUs and GPUs for poor performance when you've not bothered to optimise your code for anybody barring NVIDIA. I've even heard that playing offline results in a HUGE performance boost across the board...
  • Friendly0Fire - Thursday, November 20, 2014

    More like a yearly release schedule is untenable for a game of this scale. Corners had to be cut somewhere.
  • silverblue - Thursday, November 20, 2014

    Logical, but even then, it's poor form for Ubisoft to slate AMD for what is most likely their own fault as opposed to poor drivers.
  • FlushedBubblyJock - Thursday, November 20, 2014

    It's amazing how it's never AMD's fault, no matter what. No matter how poorly they do. No matter how many features they do not have, or only have as a ridiculed braggart's vaporware. No matter how long it takes them to continue failing and not delivering, it's not their fault.
    "Never AMD's Fault" should be their slogan, since "Never Settle" is the opposite of the truth with AMD.
  • ppi - Thursday, November 20, 2014

    While I would agree with you that AMD has been relegated to an ultra-low-budget, inconsequential player on the CPU front, with respect to GPUs I am not certain where you have been living the last couple of years, whether on Mars or under some rock.

    Since the HD 4000 series, AMD has been running neck-and-neck with nVidia, sometimes kicking it soundly in the rear (e.g. the Radeon 5870 vs. the rebadged GeForce 8800), sometimes being a bit behind, until the Maxwell 980 and 970 parts came out a couple of months ago. But even now, the fastest AMD offering is still at least on par with the second-fastest nVidia offering performance-wise (the issue is rather power consumption). And drivers-wise, there are lots of games coming out with very good graphical fidelity that have no issues on AMD cards.

    Who failed big time here are Ubisoft's managers, who (probably wishing to please the shareholders) wanted to rush the games out before the December holiday season to get extra bucks, and allowed proper QA to be skipped. There is absolutely no excuse whatsoever for neglecting GPUs that still make up a third of the market (and mind you, nVidia performance is reportedly far from perfect as well). If the AMD cards did not work, they either should not have released the game at all, or released it nVidia-only/AMD-beta-only.

    I do hope this backfires on Ubisoft in such a way that, instead of buying these games now, people buy them a year later, in the 2015 Steam sale season.
  • D. Lister - Friday, November 21, 2014

    (the issue is rather power consumption)

    Imagine what the nvidia hardware could do with the same power budget. And it isn't just power, but also temps and noise. How come AMD's default coolers are the worst on the market, yet nvidia's default coolers, especially for the higher-end models, are some of the best? How come it took AMD more than a decade to address the multi-GPU micro-stutter issue in their drivers? And how about the alleged boost in CPU performance that AMD promised with Win 8, which never quite materialised?

    AMD hires from the same talent pool as their competition, but ultimately, it is their consistent corner-cutting and false promises that hurt their business and relegate them to a lower tier.

    I apologise if I offended any AMD fans, but please understand this: you aren't on AMD's side and I'm not on nvidia/intel's side... it is us, the consumers, who are all on the same side, and unless we demand the quality that we are paying for, every now and then someone will try to get away with BSing us out of our hard-earned cash.
  • FragAU - Friday, November 21, 2014

    You are kidding right? I have been fortunate enough to essentially own every top-end GPU since the days of 3DFX Voodoo (and before!). AMD has certainly released some absolute monster cards and has been responsible for keeping Nvidia in check since all other competition ceased to exist. Both companies have had their fair share of the performance crown.

    Currently I own 2x 290X and have since their launch - I run every single game (aside from the topic of this one) at Ultra settings without issue (both watercooled, so nice and silent too). Ubisoft is just plain rubbish these days; heck, look at the state of their cruddy GTA wannabe, Watch Dogs - that game had issues on any PC. Tell me how Black Flag can run flawlessly and then this game runs like absolute crud? Sure, a 980 should be in front, but the 780 Ti/290X shouldn't be that far behind.

    I will freely admit that Nvidia usually do have more solid drivers in early releases, but nothing that has been a deal breaker. Having run SLI and CF since the early days, I can tell you that both have their share of issues... Anyway, all I can say is you'd better hope that AMD keeps on the heels of Nvidia, or you will be paying $700 for the same GPU for 3 generations.
  • silverblue - Friday, November 21, 2014

    CrossFireX was only introduced in September 2005. Granted, the time from then to a viable fix was about 8 years (which is still a very long time), but there are two factors to consider here - how long has it been a problem, and how long did it take AMD to acknowledge it themselves? The discrepancy between the two is what they really need to be judged upon, not how long it took for a solution to reach the user base. Promising fixes is one thing; burying your head in the sand and flat out ignoring your customers is quite another.

    FlushedBubblyJock mentioned it never being AMD's fault for this, that and the other. You'd have to be blinkered to subscribe to that particular theory. AMD's biggest problem is delivery - VCE support was a joke for a long time; some might say their DirectX drivers are in need of serious work; TrueAudio doesn't appear to be having any impact... to name a few. Team Green is in the ascendancy right now, and AMD can't release a counter quickly enough, so they look to have no answer for Maxwell. It's almost all down to delivery, and we can only hope they improve in this area. It's not about being a fanboy, but about bringing some objectivity to the discussion.
  • ppi - Friday, November 21, 2014

    Yes, right. But my point was mainly that graphical glitches and poor performance in ONE PARTICULAR GAME, sponsored by AMD's competitor, should be blamed on Ubisoft's QA and their rush to get the game out for Christmas, rather than on AMD.

    AMD do disappoint me, though. Case in point: when LCDs came out, I thought - great, now we will be able to get variable refresh rates. But lo and behold, 10 years pass and nothing, until nVidia comes along with G-Sync. And then we learn AMD had already done it, they had it RIGHT IN FRONT OF THEIR EYES, and they did not see the benefits, but instead tried to sell it as some crappy energy-saving thingy. *facepalm* It is clear their development lacks people who focus on improving the *game experience*.

    (btw, of my last 6 gfx cards, 3 were nVidia, 3 AMD/ATI)
  • D. Lister - Saturday, November 22, 2014

    @ silverblue

    "CrossfireX was only introduced in September 2005..."

    I'm sorry, -almost- a decade then. Because it is really inconsequential how long a particular phase takes in the entire process of solving a problem - what matters is how long it took the end-users to get their money's worth.

    Secondly, the defence that they just didn't know any better, while the competition apparently did (to the point that the competition had to develop a tool, FCAT, for AMD to actually see or recognise the problem), merely suggests that if they weren't being deliberately callous, they were just outright incompetent. Either way, the point is that they need to step up their game, because their customers and fans deserve better than what they have been bringing forth, and because the free market needs good competition.
