• RightHandOfIkaros@lemmy.world

    The test apparently only covered 30 to 60 Hz. It doesn't seem well researched, and there's no clear reason this ever needed to be a study, other than high-refresh-rate display manufacturers wanting a new special label so they can upcharge for basic features that were previously, and still are, included in the base price of a display.

    • NightAuthor@lemmy.world

      Knowing how well human eyes actually work could be useful for a ton of reasons, including focusing display tech improvements on the aspects that matter.

      As for fps, previous research has shown that many people can identify a person from an image flashed on screen for a single frame at over 200 fps.

      • RightHandOfIkaros@lemmy.world

        That last part is kinda my point. Only testing 30 Hz and 60 Hz seems like really poor methodology when they could have just kept raising the rate until people said they couldn't see the light flashing anymore. Why only test those two numbers?

        • exocrinous@startrek.website

          Because the hypothesis that some gamers' eyes perceive at different speeds only takes two data points to prove.

  • Dave.@aussie.zone

    Is this news? In the early 2000s I couldn't stand working with 60 Hz monitors; the flicker was noticeable. Setting them to 72 Hz was a definite improvement.

    About 90 percent of my coworkers were like, “Why are you fiddling with the display settings? Flicker? Wot flicker?”