INSOMNIA

Why Visual Downgrades Happen

By Nick Foo / January 26, 2017


One common theory is that the downgrades are made to be "fair" to console gamers, but the reality is that it's about keeping development costs down. When you have two versions of a game, even if they differ only in the graphics, the time spent on assets nearly doubles, not to mention QA costs, since different assets can produce entirely different bugs. The first trailer people saw for Ghost Recon Wildlands, the latest Ubisoft downgrade offender, probably only had assets for the scenes shown in that trailer. So even if they had kept those assets, they would have had to develop the rest of the game and maintain two different pipelines, and the costs would have been astronomical by comparison. I used to think just like the typical forumroid on this matter, but then I found out the truth was much more mundane.
   People say that Ubisoft has millions and millions of dollars to polish each game they release, and that costs shouldn't be an issue, but the reason they have millions is precisely because they make sound business decisions like not maintaining a separate asset and graphics pipeline for the PC platform, which accounts for only a tiny fraction of the market. The game also probably started development before they got their hands on the consoles, which is most likely why they released the original footage. Believe me, this pisses me off as well, but it's much more reasonable than the conspiracy theory going around, which I used to believe in too. I just wish developers at least acknowledged this the way CD Projekt Red did.
   They might have made several bad business decisions, and continue to make them, but that doesn't mean ALL their business decisions have been mistakes. Sticking to a single asset pipeline after realising how under-powered the consoles were was a sound, though unfortunate, decision.
   Again, I'm not ignoring their other major screw-ups, like rushing buggy games out the door (AC Unity), but this in particular is a no-brainer for any company.
   Could they have better estimated the power of consoles before developing their prototypes? I don't know. But I wish they had.
   Oh, and yeah, it's a really tough question whether a company should make a prototype years in advance to build hype. I can't really tell whether that's a successful business decision outside of hindsight, because sometimes you'll make a correct prediction and other times you won't.
   You could say that in big-budget games, though, where companies regularly have to push the boundaries of technology as a selling point, it's almost guaranteed that you'll run into problems down the line, so in those cases I would lean towards not making uncertain promises. But in the case of expansions or sequels that are pretty much more of the same, it's more reasonable to promote your game ahead of time. But then again, who wants to buy a game that doesn't push the boundaries, outside of the "indie"/text-adventure crowd?
   So they had one high-detail asset pipeline, and then they switched to another pipeline with lower-detail assets. That looks to me like sticking to a single pipeline.
   If you insist on thinking they already knew how powerful the consoles were and had an original pipeline that stuck to the limitations of the hardware and then decided to have a second one to fool the public, then I think you are dreaming up conspiracy theories.

   People argue that first they had the final pipeline with knowledge of the hardware and at some point they decided to make another one just for show.
   My theory is that they started developing the game before they even knew the specs, probably around 2011, with only an estimate to go on. (I'm guessing they took this risk to keep others from beating them to market.) They then got the hardware around 2012. I don't know if people think you can immediately start development on a new platform as soon as you get the hardware, but it's not that simple. You get some highly technical manuals, which are sometimes incomplete, and some very rough early compilers. It's not like Unity, where you have a "porting wizard". It takes time for developers to get accustomed to the tools, let alone create their own tools for debugging and profiling to make sure they're getting the most out of the hardware.
   During that time production on the game would still be underway on desktop computers, and even though they know the specs are different, it's not as simple as "our desktop has 2.5 GHz and the console has 1.7 GHz"; it's not a matter of comparing a couple of numbers. These machines have custom hardware that differs a lot from the typical desktop, and the differences aren't linear. They have to actually run the thing on the device, and then estimate whether the gap is something optimization can close, and integrating all of that into a prototype takes time. There might have been some push from the marketing side, overestimating how much the game could be optimized even if the engineering team advised against it, but at that point nobody's sure, since even the PC version would not have been completely optimized, on top of god knows how many other factors. Developing an engine is a complicated mess. I'm guessing that by mid-2015 they still didn't know how much they could push the hardware.
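   To make the clock-speed point concrete, here's a toy sketch, entirely my own with made-up numbers and nothing to do with Ubisoft's actual tooling: the "naive" line just scales a desktop frame time by the clock ratio, while the number that actually matters comes from timing a representative workload on the device itself.

      #include <chrono>
      #include <cstdio>

      // Hypothetical workload standing in for one frame of game logic.
      static double simulate_frame() {
          double acc = 0.0;
          for (int i = 0; i < 5000000; ++i) acc += i * 0.5;
          return acc;
      }

      int main() {
          // Naive estimate: scale the desktop frame time by the clock-speed ratio.
          const double desktop_ms  = 10.0;  // made-up frame time from the dev PC
          const double desktop_ghz = 2.5;
          const double console_ghz = 1.7;
          const double naive_estimate_ms = desktop_ms * (desktop_ghz / console_ghz);
          std::printf("naive clock-ratio estimate: %.1f ms\n", naive_estimate_ms);

          // What actually matters: timing a real workload on the target hardware,
          // where cache sizes, memory bandwidth and custom units change the picture
          // in ways a clock-speed ratio can't predict.
          const auto t0 = std::chrono::steady_clock::now();
          volatile double sink = simulate_frame();
          const auto t1 = std::chrono::steady_clock::now();
          (void)sink;
          const double measured_ms =
              std::chrono::duration<double, std::milli>(t1 - t0).count();
          std::printf("measured on this machine:   %.2f ms\n", measured_ms);
          return 0;
      }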
   Again, this is only speculation, and the whole truth is impossible to know for sure, but these are estimates based on what I know happens to other developers.
   I'm angry as well, though, that they would just take those leaps of faith instead of being more patient, but these people are under constant pressure to deliver the best product as early as possible, which is why they end up doing things like this.
   One thing I'm 100% sure of though: There is no conspiracy. These guys didn't purposefully manipulate gamers, they just made some huge mistakes, the way people always do when they're under pressure.
   Disagree with me if you will, because even I don't have full knowledge of the process; I don't know all the facts. But think about what you're implying when you go down the conspiracy road: that the developers/executives pretty much knew the outcome. It's almost like assuming they sat around the conference table saying to themselves, "Fuck it, let's whet their appetite with some unrealistic expectations so they get hooked; they won't know the difference this time, unlike with Watch Dogs!" To me the conversation would have gone more like, "We acknowledge what happened with Watch Dogs, but the graphics in this game aren't promising as much, we're confident in Mr lead developer here to squeeze the polygons out, and the engine isn't even halfway optimized yet, so in the worst-case scenario we'll just strip out a few objects here and there and maybe remove a few effects, and we can make it pass without many people noticing." Perhaps Mr lead developer was lying to himself due to pressure, who knows, but he was overall hopeful that it would work out. Then in the months that followed they realized their mistake and facepalmed all over again.
   As for how the showcase was developed, it's perfectly possible to pull off. You can spend a short time creating very highly detailed assets for the limited area you're going to show off, and there are plenty of ways to fake the rest. You can use a very powerful PC to compensate for the lack of optimisation early on, or you can create a "fake" engine that only renders what you're going to show. It can go as far as presenting scripted AI as though it represented the final vision of dynamic behaviour, or just blatantly creating the whole thing in 3DS Max or Maya if you're confident enough that you'll be able to pull it off in production with a working engine that's fully optimised for the target hardware (though I'd guess this is a rare case and rather foolish).
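   To make the scripted-AI trick concrete, here's a minimal hypothetical sketch, my own invention and not anyone's actual demo code: the showcase "AI" just replays a hand-authored timeline of actions rehearsed for the demo's camera path, while the shipped game would need a real behaviour system behind the same interface.

      #include <cstdio>
      #include <string>
      #include <utility>
      #include <vector>

      // One hand-authored event on the demo timeline: at `time` seconds, `actor`
      // performs `action`. The whole "AI" of the showcase is just this list.
      struct ScriptedEvent {
          double      time;
          std::string actor;
          std::string action;
      };

      // Playback of a canned sequence that was rehearsed to look good from the
      // demo's camera path. No sensing, no decisions, no pathfinding.
      class ShowcaseAI {
      public:
          explicit ShowcaseAI(std::vector<ScriptedEvent> timeline)
              : timeline_(std::move(timeline)) {}

          void update(double demo_time) {
              while (next_ < timeline_.size() && timeline_[next_].time <= demo_time) {
                  const ScriptedEvent& e = timeline_[next_++];
                  std::printf("[%5.2fs] %s: %s\n", e.time, e.actor.c_str(), e.action.c_str());
              }
          }

      private:
          std::vector<ScriptedEvent> timeline_;
          std::size_t next_ = 0;
      };

      int main() {
          // The demo only ever shows this exact sequence, so it always looks perfect.
          ShowcaseAI ai({
              {1.0, "guard_01", "lean against jeep"},
              {3.5, "guard_02", "flick cigarette and walk to the gate"},
              {6.0, "guard_01", "react to distant gunfire"},
          });
          for (double t = 0.0; t <= 7.0; t += 0.5) ai.update(t);
          // In the shipped game, the same update() call would have to be driven by
          // a real behaviour system that holds up anywhere in the open world, which
          // is where the hard, unpredictable work lives.
          return 0;
      }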
   Given a very narrow and scripted play showcase, you can produce highly detailed set-pieces in a short amount of time. The brunt of the work lies not so much in the painstaking effort of artists to create something that looks good, but rather in making it look good on the target hardware, with all the promised features, which means tweaking the engine and the assets until every frame renders in under 33 or 16 milliseconds (the budgets for 30 and 60 frames per second) on Xbox One and PS4 as a priority. If during production they realise they overshot their estimates, they're at least early enough in development that they haven't wasted ages creating insanely detailed assets, so they lock their pipeline to parameters they know will work and build the rest of the game around them.
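   A quick aside on those numbers, since they're just arithmetic: the per-frame budget at a target frame rate is 1000 ms divided by the frame rate, which gives roughly 33.3 ms at 30 fps and 16.7 ms at 60 fps. A trivial sketch, with made-up per-frame costs, and treating the frame as bound by the slower of CPU and GPU is a deliberate simplification:

      #include <algorithm>
      #include <cstdio>

      // Frame budget in milliseconds for a given target frame rate.
      constexpr double frame_budget_ms(double target_fps) {
          return 1000.0 / target_fps;
      }

      int main() {
          const double budget_30 = frame_budget_ms(30.0);  // ~33.3 ms
          const double budget_60 = frame_budget_ms(60.0);  // ~16.7 ms
          std::printf("30 fps budget: %.1f ms, 60 fps budget: %.1f ms\n",
                      budget_30, budget_60);

          // Made-up per-frame costs as they might be measured on the target console.
          // In a pipelined renderer the frame is roughly bound by whichever of the
          // CPU or GPU is slower that frame (a simplification, as noted above).
          const double cpu_ms = 12.0, gpu_ms = 24.0;
          const double frame_ms = std::max(cpu_ms, gpu_ms);
          if (frame_ms > budget_30) {
              // This is the point where "strip out a few objects, remove a few
              // effects" stops being hypothetical: something has to give.
              std::printf("over the 30 fps budget by %.1f ms\n", frame_ms - budget_30);
          }
          return 0;
      }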
   I know that this might feel like they're doing this maliciously. But remember that every time that they show off a game for the first time, they're always forced to push the envelope a bit further to gather hype, which means they create a lot of unknowns with every new game, unlike with DLC, where the technology pipeline has been finalised and they can thus more confidently estimate the time it will take to ship the product. They might tell themselves, "Oh look, there's this new paper that came out showing off a way to improve rendering at a lower cost to the CPU/GPU, we think we might be able to pull this off on this hardware". They won't then create a fully functional and optimised version for their E3 trailer, because there won't be enough time. So they'll just create the closest proof of concept they can make which matches the look they envision, and then proceed to spend the next two years making it work for real. Again, there are varying degrees of completion for the promised tech once it's shown off for the first time, so some games might be closer to the truth than others.
   Remember though that the technical folk are not usually the ones who want to take these risks; as far as they are concerned they would rather take their sweet time making everything work as promised. The pressure comes from high level executives and marketing who want to get the next best thing done on budget, and on time. It's a bet that they're forced to make every single time to stay competitive. Some companies are better at striking this balance than others, and this depends on a lot of factors, such as labour costs and developer experience.
   So shelve your Illuminati conspiracies; the reality is much more mundane and far less sinister than that. Or believe them if you wish: you're certainly free to make up your own mind. I've done my part here.