Not that console gamers were ever given a choice, but PC gamers kept asking for PC ports that pushed past the 30 fps standard. Graphics were already good during the PS4 era, and Sony still leaned heavily on PS4 games during the PS5 Pro showcase. Now console users want the same thing after finally getting the option over a decade later, which I think shows they aren't too different from PC gamers in loving frames.
That sends me back to when people in online discussions regularly claimed anything above 60 fps is pointless because the human eye can't see more than that anyway.
That claim is such a pet peeve of mine. That’s not even how our eyes work, and it’s demonstrably untrue.
You can even prove it false by flicking the mouse cursor quickly across the screen: instead of a smooth blur, you see a trail of distinct cursor positions, and the gaps between them shrink at higher refresh rates.
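To put rough numbers on that, here's a quick sketch; the 1920 px screen width and 0.25 s sweep time are just assumed example values, not anything measured.

```python
# Rough gap between successive cursor positions during a fast mouse sweep,
# at different refresh rates. Assumed: a 1920 px wide screen crossed in 0.25 s.
screen_width_px = 1920
sweep_time_s = 0.25

for refresh_hz in (60, 120, 240):
    frames_during_sweep = refresh_hz * sweep_time_s
    gap_px = screen_width_px / frames_during_sweep
    print(f"{refresh_hz:>3} Hz: ~{frames_during_sweep:.0f} cursor positions, ~{gap_px:.0f} px apart")
```

For that sweep the cursor jumps roughly 128 px between frames at 60 Hz and about 64 px at 120 Hz, which is why the difference is visible even on a desktop.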
Even if the eye notices it, it's not really a big deal most of the time unless you're playing some real-time multiplayer game, and going from 60 to 120 literally doubles the number of frames the GPU needs to process, raising the GPU requirement for no fucking reason 99% of the time.
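Another way to look at that doubling is the per-frame time budget, which is simple arithmetic and not tied to any particular GPU:

```python
# Per-frame time budget at common target frame rates. Doubling the frame rate
# halves the time the GPU has to render each frame.
for fps in (30, 60, 120):
    budget_ms = 1000 / fps
    print(f"{fps:>3} fps -> {budget_ms:.1f} ms per frame")
```

So 30 fps gives the GPU about 33.3 ms per frame, 60 fps about 16.7 ms, and 120 fps only about 8.3 ms.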
I think console players are catching on to the massive difference between 30 FPS and 60+ FPS in first-person games where the camera can move quickly. As TVs have improved along with the consoles, and some titles can be played at 60+ FPS, people are noticing the difference compared to newer titles that target 30 FPS as a trade-off for detailed graphics and motion blur.
Plus, performance mode reduces how often a game stutters or has short stretches where the frame rate drops massively below its normal level.