These aren't the PC ports you're looking for - The current state of AAA PC gaming
We recently wrote about how the developer of Star Wars Jedi: Survivor admitted that the game sucks for a number of PC players, and how the PC release is getting hammered in reviews and player feedback.
Now it's time to open up some discussion about the state of PC gaming in 2023. Many of us thought we would see the end of poor-quality PC ports hitting the shelves. Yet following The Last of Us: Part 1, which launched last month with an array of performance-impacting issues, I've come to the conclusion that many developers simply don't seem to care about the quality of their PC ports, resorting instead to Twitter statements about how they strive for better and will release patches in the coming weeks and months to improve the situation.
But why is this? Why are many big releases being delayed from their original launch date (The Last of Us: Part 1) with the promise of a game in its "best possible shape", only to then release with lots of issues anyway?
It also feels as though a certain section of the online community is actively normalizing these problem-laden releases by advocating that gamers upgrade to the latest and greatest hardware so they can brute-force their way through poorly optimized games. It does not help that a fair number of gaming outlets run headlines about memory requirements getting higher and higher, when the reality is that these requirements are often steep because of poor optimization at the engine level.
Take, for example, the above screenshot of Cyberpunk 2077, where the engine is well optimized for a broad range of PC configurations. On my RTX 4090 with a 12700KF CPU and 64GB of DDR4 RAM, I see under 5GB of system RAM usage, with VRAM usage under 12GB. Consider that this is using the 'Ultra Path Tracing' graphics preset with DLSS Quality and Frame Generation, and SSR set to the highest Psycho option. The fact that a large open-world game, with a bustling city and thousands of NPCs, can achieve this sort of memory utilization while balancing a healthy workload across all CPU cores and making effective use of the GPU proves that it is entirely possible for a modern game engine to run well across the board.
That's not to say that Cyberpunk 2077 is a perfect benchmark for what modern games should be. It too launched with countless technical issues, which took the better part of two years to resolve.
On the one hand, I can see where part of the blame for poor PC ports lies, and that's with us gamers. If we keep buying and pre-ordering new games without sampling a demo or being able to read reviews from recognized outlets before launch, then many developers will keep launching shoddy ports that require months of patching, regardless of the negative publicity.
On the other hand, if we stop buying new PC titles for the reasons above, then publishers could deprioritize PC as a launch platform because the sales aren't there. It ends up being a catch-22, because that outcome would not satisfy shareholders with a vested interest in sales and meeting release schedules either.
I have been in the lucky position of owning three generations of higher-end GPUs since Cyberpunk 2077 launched in 2020: an RTX 2070 Super, an RTX 3080 Ti, and now the RTX 4090. I have seen first-hand how games like Cyberpunk 2077 launch in a bit of a mess, where only those with brute-force hardware can power their way to passable framerates, and that's only if the traversal and shader stuttering can be tolerated.
With Star Wars Jedi: Survivor, it is clear that the story and combat mechanics are very good, but the technical issues are simply too jarring and unacceptable. You would think the negative publicity these big games receive would ensure that upcoming releases undergo the strict testing befitting a polished and optimized title, but here we are.
The major issue surrounding Jedi: Survivor is that the engine simply is not optimized for acceptable GPU and CPU utilization. On my i7 12700KF and RTX 4090, the first three hours of the game sat at a nominal 60-75 fps at 3440x1440, with or without ray tracing enabled. Enabling FSR does absolutely nothing for framerates, yet it degrades the image quality of moving assets, as shown by Alex at Digital Foundry.
Then, after I reached Koboh, the frame rate rose to 90-120 fps on average, with GPU utilization sitting comfortably at 90-99%. My first thought was that the performance issues might be confined to the first three hours of the game, but that hope was soon squashed. As soon as I reached the town at the bottom of Koboh, things took a downward turn, with GPU utilization never exceeding 65% and the frame rate falling to the low 30s on what is currently the world's most powerful gaming graphics card... Ouch. And that's before I even mention the traversal stuttering during gameplay and the frame time spikes that occur simply from moving the mouse around the settings menu.
What's even more puzzling is that in 2019, Jedi: Fallen Order was also released with performance issues. It seems that no lessons have been learned.
EA's statement about going through extensive testing to ensure that patches cause no adverse issues is also laughable, because it contradicts the very situation they currently find themselves in with the game. If extensive testing is so important, then why are we having this discussion about a game that was in an unfit state for launch in the first place?
It would be interesting to hear our readers' thoughts on this. Holding developers to account through online outcry seems not to be working, and time and time again we find ourselves sitting in the same seat talking about it. No fewer than six big titles have launched in the last 12 months with major performance issues in their PC versions, with developers then promising day 1 patches and beyond to iron things out.
In the case of Cyberpunk 2077, CDPR made things right, but it took a couple of years to get there. Other games, like Uncharted, still have bugs such as mouse camera panning judder, the same bug that recently plagued The Last of Us: Part 1, and which Naughty Dog managed to fix within weeks after a heavy public slaughtering.
Uncharted, however, remains forgotten.
What I think would please many of us are game demos or early access through Steam or other storefronts. Heck, if EA wants to get more people onto their EA Play/Origin platform, then early access where gamers can submit issues long before launch would be hugely popular, and it would make us feel like we were part of the process. A demo (or early access) gives ample time to gauge what performance is like and, if released well before launch, enough time to apply pre-launch patches should bugs be reported.
A demo does mean extra work for developers, but they are doing extra work now anyway, weeks, months, or even years after launching a game, so surely it's wiser to go the demo route.
Alternatively, they could test games on the most popular hardware configurations in circulation. The Steam Hardware Survey is updated regularly and outlines what PCs the vast majority of gamers are playing on. It's an invaluable resource for developers, yet it feels as though no one other than gaming websites makes use of it.
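For illustration only, here's a rough Python sketch of how a QA team might turn survey-style share data into a short pre-launch test matrix. The GPU names, memory figures, share percentages, and the coverage threshold are placeholders I've made up, not actual Steam survey values or any developer's real process:

```python
# Illustrative sketch only: turning hypothetical survey-style hardware share data
# into a short list of configurations worth testing on before launch.
# All names and percentages below are made-up placeholders, not real Steam survey figures.

from typing import List, Tuple

# (GPU, system RAM in GB, approximate share of surveyed players)
survey_rows: List[Tuple[str, int, float]] = [
    ("GeForce GTX 1650", 16, 0.05),
    ("GeForce RTX 3060", 16, 0.05),
    ("GeForce GTX 1060",  8, 0.04),
    ("GeForce RTX 2060", 16, 0.04),
    ("GeForce RTX 3070", 32, 0.03),
    ("Radeon RX 580",    16, 0.02),
]

def build_test_matrix(rows, coverage_target=0.15):
    """Pick the most common configs until the target share of players is covered."""
    matrix, covered = [], 0.0
    for gpu, ram_gb, share in sorted(rows, key=lambda r: r[2], reverse=True):
        matrix.append((gpu, ram_gb))
        covered += share
        if covered >= coverage_target:
            break
    return matrix, covered

matrix, covered = build_test_matrix(survey_rows)
print(f"{len(matrix)} configs covering roughly {covered:.0%} of surveyed players:")
for gpu, ram_gb in matrix:
    print(f"  {gpu} with {ram_gb} GB of system RAM")
```

Even a simple prioritization like that would mean the hardware most players actually own gets hands-on testing before launch day, rather than after the review scores land.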
Plus, not all review outlets can be relied upon either, so the demo/early access route genuinely seems to me to be the best step to take.
Thoughts?