• misk@sopuli.xyz (OP)
      edited 7 months ago

      Seems so. It’s a shame 40 fps (for VRR/120hz displays) is not more common if 60 is not achievable.
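      The reason 40 fps wants a 120 Hz (or VRR) display comes down to simple division: pacing is smooth when each rendered frame is held for a whole number of refresh cycles. A quick sketch of that arithmetic (the helper name is just for illustration):

      ```python
      # Sketch: why 40 fps paces evenly on 120 Hz but not on 60 Hz.
      # A frame rate fits a fixed-refresh display cleanly when each frame
      # occupies a whole number of refresh cycles.

      def refreshes_per_frame(fps: float, refresh_hz: float) -> float:
          """How many display refresh cycles each rendered frame occupies."""
          return refresh_hz / fps

      for hz in (60, 120):
          for fps in (24, 30, 40, 60):
              cycles = refreshes_per_frame(fps, hz)
              pacing = "even" if cycles == int(cycles) else "uneven (judder)"
              print(f"{fps} fps on {hz} Hz: {cycles:.2f} cycles/frame, {pacing}")
      ```

      On 120 Hz, 40 fps lands on exactly 3 cycles per frame (a steady 25 ms); on a 60 Hz panel it would be 1.5 cycles, which without VRR means alternating 1- and 2-cycle holds and visible judder.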

    • fibojoly@sh.itjust.works
      edited 7 months ago

      As long as your eyes are only 24 fps, it appears.
      Edit: I was making a reference, people! It’s an excuse that has been used before!

      • Ranvier@sopuli.xyz
        edited 7 months ago

        That the eye can only perceive 24 fps is a myth. Visual perception is a complicated set of processes, and your eyes and brain don’t strictly perceive things in frames per second. 24 fps is a relatively arbitrary number picked by the early movie industry: it stays a good amount above 16 fps (below which you lose the persistence-of-motion illusion) without wasting too much more film, and it’s a nice, easily divisible number.

        The difference between higher frame rates is quite obvious. Just grab any older PC game where you can get a high frame rate, then cap it so it can’t go higher than 24, and the difference is night and day. The tons of people complaining about how much they hated the look of the Hobbit movies at 48 fps can attest to this as well. You certainly do start to get diminishing returns the higher the fps, though. Movies can be shot to deliberately avoid quick camera movements and other things that wouldn’t do well at 24 fps, but video games don’t always have that luxury. For an RPG or something, sure, 30 fps is probably fine. But fighting, action, racing, anything with a lot of movement or especially quick camera movement starts to feel pretty bad at 30 compared to 60.
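        If you want to try the cap experiment outside of a game’s own settings, the idea behind a frame limiter is just to spend the leftover frame budget sleeping. A minimal sketch (the function and the `render_frame` callback are hypothetical, and `time.sleep` granularity makes the cap approximate):

        ```python
        import time

        def run_capped(render_frame, cap_fps: float, duration_s: float) -> int:
            """Run a render loop capped at cap_fps for duration_s seconds.

            Returns the number of frames rendered. At 24 fps the frame
            budget is ~41.7 ms; at 60 fps it is ~16.7 ms.
            """
            frame_budget = 1.0 / cap_fps
            frames = 0
            start = time.perf_counter()
            while time.perf_counter() - start < duration_s:
                t0 = time.perf_counter()
                render_frame()  # hypothetical per-frame work
                frames += 1
                # Sleep off the rest of the budget so we never exceed the cap.
                elapsed = time.perf_counter() - t0
                if elapsed < frame_budget:
                    time.sleep(frame_budget - elapsed)
            return frames
        ```

        Real limiters (and VRR) do finer-grained pacing than plain sleeps, but this is enough to feel the difference between a ~42 ms and a ~17 ms frame cadence.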