Resolution Bumps on X/Pro are Disappointing

When the PlayStation 4 Pro launched, I was stoked. Finally, console gaming was moving to the iterative model of hardware development that had served PCs and phones so well for so long. It was time to let developers breathe a collective sigh of relief and actually carry forward their experience and their toolsets from console generation to console generation.

The Xbox One X represented an even bigger iterative leap, and I bought one for Dynasty Warriors 9.

(One of you is sitting in the back row screaming at me that Nintendo did this first with the New 3DS, and that one of you is absolutely right. Unfortunately, they only made like 5 exclusive games that actually took advantage of that hardware. The New 3DS is a surprisingly powerful little machine that almost everyone in the development community ignored.)


A few games have really leaned into the power available in these new consoles. They give the user a bunch of different graphics options, and say go nuts. They offer additional effects, higher resolutions, and the option to just lower everything down to standard settings and get a higher framerate.

I don't have a problem with those games...but I could count them on one hand. The only ones I can think of off the top of my head are Rise of the Tomb Raider, the Middle Earth games, Horizon, and uh...kind of Dynasty Warriors 9? Though that game doesn't tweak the settings so much as the resolution.

And therein lies the problem.

It became clear to me pretty quickly after buying these boosted consoles that most game developers are creating their games for the original PS4 spec, and then gently nudging them up or down depending on what hardware you're using. I am not a game developer, but I've been playing games for thirty years, and time and again, when testing out different games, this is what I've noticed.

The PS4 version targets 1080p. They try to get it there on base Xbox One, but if they can't they go down to 900p. The Pro gets a boosted res at or about 1620p, or some kind of fancy scaling. Ditto for the Xbox One X, but with a higher resolution.

That's it. They just change the resolution. No more effects. No cool little details. No completely unnecessary GPU-driven physics particles that I still like. You're getting the exact same core game, just at a higher or lower rendering resolution.
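The pattern I'm describing can be boiled down to something like this little sketch. To be clear, this is my outsider's guess at the logic, not real engine code, and every number and platform name in it is just illustrative.

```python
# Hypothetical sketch of the pattern described above: one base game,
# with only the render resolution swapped per platform.
# All values are illustrative guesses, not from any real engine.
RENDER_TARGETS = {
    "xbox_one":   (1600, 900),   # drops to 900p when 1080p doesn't hold
    "ps4":        (1920, 1080),  # the baseline spec the game is built for
    "ps4_pro":    (2880, 1620),  # boosted res, often with fancy upscaling
    "xbox_one_x": (3840, 2160),  # highest target, same effects as base PS4
}

def render_resolution(platform: str) -> tuple:
    """Pick a render target per platform; everything else stays identical."""
    return RENDER_TARGETS.get(platform, RENDER_TARGETS["ps4"])

print(render_resolution("ps4_pro"))  # (2880, 1620)
```

Notice that nothing but the resolution tuple changes between platforms, which is exactly my complaint: no extra effects, no extra details, just a different number fed into the same renderer.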

While that's good for people that own the base consoles, because they're not getting left out...it's also a little dull for people that own the beefier ones. And people that own those and a 4K TV often aren't even getting a 4K resolution. But that's fine. Because 4K requires a ludicrous amount of processing power.

And here's a picture I took of The Crew. I like The Crew. I wrote this article without any pictures in mind and now I'm putting some pictures into it.

What about PC gamers? Are they getting those extra effects that were once prominent on the platform? Sort...sort of?

Usually though, multi-platform games will have a setting called High or Very High that mimics the console look, and then an "Ultra" quality that's mostly there for people that bought stupidly expensive high-end cards. Sometimes those ultra settings shove in some of Nvidia's proprietary hair or shadow effects...but most of the time they just crank up the internal rendering resolution of shadows or ambient occlusion or some other effect to a point where only the beefiest cards can still run the game.

It's kind of a joke.

You might think, well, I want the highest resolution. But in some games, thanks to new anti-aliasing techniques...it's hard to even tell the benefits you're getting. Assassin's Creed Origins looks almost as good at 1080p as it does at the fake 4K resolutions on the boosted consoles, because its anti-aliasing is ludicrously good. Similarly, I can't tell the difference between its very high and ultra settings on my PC...other than my framerate dying a little on my GTX 1070.

I wish that companies were either chasing framerate or effects.

Far Cry 5 has really detailed cockpits in it that you barely ever see, because you're too busy playing the video game and not looking down and pressing the screenshot button like me.

Far Cry 5 was supposed to have a high framerate mode on the beefy consoles...but it was cut right before release, with no real explanation why. Instead, they just cranked the resolution as high as they could while maintaining 30 FPS. It's possible to push the game to 60 FPS on PC without too much trouble, and that results in a smoother gameplay experience.

The new God of War is being lauded for its visuals, and to its credit it does offer a higher framerate mode for Pro owners. But the base visual quality is identical outside of resolution. And the base PS4 runs the game totally fine at 1080p.

I can't help but feel that all of this extra performance is being largely wasted. I understand the economic realities of developing games. It wouldn't make a lot of sense to invest tons of dev resources into effects that only boosted console owners and a handful of PC gamers would see.

But it'd be nice to see a few more options for gamers, and it'd be nice to see one game really go for it, to show people why they should at least consider higher-end hardware.

I'm not sure why resolution became such a big deal. Beyond 1080p, it's very difficult to tell a massive difference unless you have a huge screen and you're sitting right up in its business. I'm not saying there's no difference, but it's a difference that's much smaller than new graphics effects or a smoother gameplay experience.

I own a base PS4, an Xbox One X, and a decent PC...and games seem largely the same across all three. That's both a testament to how comfy developers are with the current generation of tech, and a little disappointing. I make decisions on where to buy games based on which controller I want to use, whether I think mouse and keyboard will benefit the game, and whether or not I think my PC can brute force the game to a higher framerate.

I understand that games have to make money above all else. But it's been a while since I felt like a game fell out of the future. A lot of games from 2013/2014 look just as good as today's, on a technical level. Maybe we've hit that magical era where art talent is finally just as important or more important than technical proficiency. Maybe! That would be great!

I can't help but think of the cancelled Terminator Salvation PhysX DLC. I mean, I think about it all the time anyway, but I extra think about it when thinking about technical stuff in video games. I loved that stupid game. Maybe they could test the waters by making cheap DLC that required higher-end hardware. Something. Give me something. Anything that's not just a bump to 4k.
