If you’re the kind of person who spends any time reading about the gaming industry, you’ve come into contact with one of those gamers. You know the kind – they insist that their opinion is factual evidence that one thing is better than another, and if you disagree, they feel pretty certain you must be some kind of uncultured ape. One such debate – between the importance of better visuals and higher framerates – has raged on for decades, and with next-generation consoles launching at the end of the year, the dispute has only grown louder and more divisive.
Sony and Microsoft’s mid-generation console updates – PS4 Pro and Xbox One X – finally gave console gamers a taste of 4K resolutions, but many developers made wise use of the increased power to include an in-game choice between upgraded visuals (Quality Mode) or upgraded framerates (Performance Mode). These options have quickly become an expectation for AAA titles and even show up in some indie games, marking the first time console gamers have been able to make meaningful choices about what kind of experience they value most. Despite these new freedoms, the discussion of which is better seems interminable.
The viewpoint that framerate is king tends to be most common, with a vocal majority of players – especially those who game on their PCs – feeling that anything below 60fps is entirely unacceptable. Others aim for framerates exceeding 90-120fps, occasionally even complaining of headaches or sore eyes when playing games where the frames dip below their preferred cap. Some of these players spend hundreds or thousands of dollars on hardware to ensure that these expectations are met or surpassed so that they have the best experience possible with their games, and rightly so.
Even the vast majority of developers and industry leaders have been known to prefer improved framerates over higher resolutions. In an interview with Stevivor, Xbox head Phil Spencer echoed the sentiments of millions of fellow framerate enthusiasts when he said, “As we were looking at the future, the feel of the games was definitely something that we wanted to have more focus on, not just throwing more pixels up on the screen.” Fair enough, Phil.
On the other end of the spectrum are gamers like me who are fairly indifferent on the subject of framerates. I’ve been gaming for over thirty years, and most of the games I grew up loving ran at 20-30fps. However, I only know that because I looked it up. In other words, I generally don’t think about framerates at all because, unless a game’s performance is downright abysmal, I typically don’t notice frame drops or imperfections that crop up along the way.
I’ll certainly concede that first-person shooters and racing games show noticeable improvements when running at 60fps, but lower framerates suit me just fine in every other genre. If anything, when I do mentally register 60fps gameplay in other titles, it sometimes comes across as too buttery and surreal for my tastes – a personal perception that framerate enthusiasts are likely to scoff at.
Meanwhile, I’m enamored with the increase in visual quality that comes from new hardware and technology. When I heard that HDR was coming to consoles, I immediately rushed out and dropped thousands of dollars on a 4K OLED television with HDR10 compatibility in anticipation. Since I’m always quick to permanently switch my games to Quality Mode and marvel at the sharper visuals, I’m immensely thankful that developers have taken the time to ensure I have this option. I would assume those who prefer higher framerates feel the same way each time they visit the settings in a new game and realize they have the freedom of choice.
While I may not be entirely alone in my predilections, I freely admit to being in the minority of gamers who actively value visuals over framerates. But it’s still simply my personal opinion – one which really isn’t up for serious debate because it isn’t a situation with a right or wrong answer to begin with. Like so many other things in life, entertainment is almost entirely subjective. Our preferences for enhanced visuals or higher framerates shouldn’t really matter to anyone but us.
It’s perfectly acceptable to have an opinion or even an allegiance to a concept or brand – like how I’m primarily a PlayStation fan and have no problem politely expressing the reasons why. It’s even perfectly acceptable to discuss why you feel the way that you do. It’s not reasonable, however, to demand that your opinion be accepted as universal fact, nor is it acceptable to discount another person’s positive experience with subjective art. The less time we spend insisting that others agree with our preferences, the more time we have to celebrate the gaming experience together.
Sure, a friendly debate over the importance of these two things might have held some significance to console gamers years ago when every game developer had to sacrifice one or the other. But as Sony and Microsoft’s new consoles launch this holiday season, we’re entering a generation where 4K/60fps looks to be considered a minimum target for the majority of major releases, hopefully meeting the needs of most gamers right out of the box.
For those who still want even more, developers are likely to offer us options for increasing framerates past 60fps at the expense of texture quality and vice versa. Even better, this new hardware is poised to usher in a whole new era of gamer choice and freedom that may extend even beyond these simple toggles. So, whether you prefer to make the most out of every pixel or see how many frames you can push out of a system, PCs and modern consoles let you get what you want out of your experience with a game.
There’s hardly a debate when everybody wins.
Some of the coverage you find on Cultured Vultures contains affiliate links, which provide us with small commissions based on purchases made from visiting our site.