• RainfallSonata@lemmy.world · 9 months ago

    I never understood how they were useful in the first place. But that’s kind of beside the point. I assume this is referencing AI, but due to the fact that you’ve only posted one photo out of apparently four, I don’t really have any idea what you’re posting about.

    • Hildegarde@lemmy.world · 9 months ago

      The point of verification photos is to ensure that NSFW subreddits only host content posted with consent. Many posts were just random nudes someone had found, where the subject never agreed to have them posted.

      A verification photo shows an intention to upload to that specific sub. A former partner wanting to post revenge porn would not have access to one. Subs often require the paper to be crumpled, which makes it infeasible to photoshop.

      If an AI can generate a photorealistic verification picture, it cannot be used to verify anything.

      • RainfallSonata@lemmy.world · 9 months ago

        I didn’t realize they originated with verifying nsfw content. I’d only ever seen them in otherwise text-based contexts. It seemed to me the person in the photo didn’t necessarily represent the account owner just because they were holding up a piece of paper showing the username. But if you’re matching the verification against other photos, that makes more sense.

        • RedditWanderer@lemmy.world · 9 months ago

          It was used well before the NSFW stuff and the advent of AI.

          Back in the day, if you were doing an AMA with a celeb, the picture was proof from the celeb that this was the account they were using. It didn't need to be their own account, and it was only useful for people with an identifiable face. If you were doing an AMA as some specialist or professional, showing your face and username proved nothing; you had to provide paperwork to the mods instead.

          This is a poor way to police fake nudes though, I wouldn’t have trusted it even before AI.

      • oce 🐆@jlai.lu · 9 months ago

        Was it really that hard to photoshop well enough to get past mods who aren't experts in photo forensics?

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net · 9 months ago

    My discord friends had some easy ways to defeat this.

    You could require multiple photos; it's pretty hard to get AI to consistently generate photos that are 100% perfect. Trying to get AI to generate multiple photos of the same (non-celeb) person is bound to produce inconsistencies that make it obvious they're fake.

    Another idea was to make it a short video instead of a still photo. For now, at least, AI absolutely sucks balls at making video.
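
    The multiple-photo idea above could be roughed out like this: embed each submitted photo with some face-embedding model (the embeddings below are stand-ins; which model produces them is an assumption, not part of this sketch) and require every pair of photos to be close. AI generators tend to drift between renders of the same fictional person, which would drag the pairwise similarity down.

    ```python
    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Cosine similarity between two face-embedding vectors."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def same_person(embeddings: list, threshold: float = 0.8) -> bool:
        """True only if every pair of photo embeddings clears the threshold.

        The embeddings are assumed to come from some face-embedding model;
        the 0.8 threshold is illustrative, not tuned for any real model.
        """
        return all(
            cosine_similarity(embeddings[i], embeddings[j]) >= threshold
            for i in range(len(embeddings))
            for j in range(i + 1, len(embeddings))
        )

    # Toy vectors: two near-identical "photos" pass, a mismatched one fails.
    a = np.array([1.0, 0.0])
    b = np.array([0.9, 0.1])
    c = np.array([0.0, 1.0])
    print(same_person([a, b]))  # True
    print(same_person([a, c]))  # False
    ```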

  • wick@lemm.ee · 9 months ago

    I can finally realise my dream of commenting on r/blackpeopletwitter

  • HiddenLayer5@lemmy.ml · 9 months ago

    At some point the only way to verify someone will be to do what the Klingons did to rule out changelings: Cut them and see if they bleed.

  • CheeseNoodle@lemmy.world · 9 months ago

    Thank goodness we can now use AI to do something that could already easily be done by taking a picture off someone's social media.

  • Nova Ayashi@reddthat.com · 9 months ago

    I don't know if I just have really good eyes for a 38-year-old, but I can tell at first glance, within seconds, that this photo is AI generated. It's all about the lack of humanity in the subject's eyes.

    • circuscritic@lemmy.ca · 9 months ago

      Or, and this is just a long shot, maybe you viewed the photo already knowing it was AI generated, then worked backwards to build your own internal justification for why you're uniquely gifted at detecting "humanity" in the eyes of webcam selfies.

    • MxM111@kbin.social · 9 months ago

      If by lack of humanity you mean a single light source reflected in her left eye and a double source in her right, I agree.

    • WHYAREWEALLCAPS@kbin.social · 9 months ago

      It's far more than her eyes: she is bilaterally asymmetrical. With real people, you can generally mirror one side of the face and it will look fairly close to the other. This woman has so much asymmetry it is off-putting. Her eyes are different heights and shapes, her cheekbones are different, the outer parts of her nostrils are at different heights, the sides of her lips are shaped differently, her jawlines are different, and her suprasternal notch (the divot at the base of the neck) is WILDLY different. The easiest thing to spot is her uneven skin tone. At first you'll want to chalk it up to shading, but the light source isn't to her side; it's in front and to the upper right, which doesn't allow for such a radical change if you look at her forehead.
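
      The mirror test described above can be sketched in code: flip the image left-right and measure how much it differs from itself. This is only a toy sketch (a real check would need the face detected, centered, and aligned first, which this assumes away).

      ```python
      import numpy as np

      def asymmetry_score(face: np.ndarray) -> float:
          """Rough bilateral-asymmetry score: mirror the image left-right and
          take the mean absolute pixel difference. 0 means perfectly
          symmetric; larger values mean more asymmetry. Assumes the face is
          already centered and aligned in the frame."""
          mirrored = face[:, ::-1]
          return float(np.mean(np.abs(face.astype(float) - mirrored.astype(float))))

      # A perfectly symmetric toy "face" scores 0; breaking one side raises it.
      symmetric = np.array([[1, 2, 1],
                            [3, 4, 3]])
      lopsided = np.array([[1, 2, 9],
                           [3, 4, 3]])
      print(asymmetry_score(symmetric))  # 0.0
      print(asymmetry_score(lopsided))
      ```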

      • Rhaedas@kbin.social · 9 months ago

        AI notes: make face and body images more symmetrical, but not 100%. Got it.

        The only reason it hasn't done that yet is that it isn't really AI, but a large probability model with training feedback, and so far that feedback has enforced the "close enough" aspect. The next versions will cross the lines that still let us sense something isn't quite right.

  • yamanii@lemmy.world · 9 months ago

    Can confirm. I made some random Korean dude with DALL-E to send to Instagram after it threatened to close my fake account, and it passed.

  • Margot Robbie@lemmy.world · 9 months ago

    Due to having so many people trying to impersonate me on the internet, I've become somewhat of an expert on verification pictures.

    You can still easily tell that this is fake because, if you look closely, the details, especially the background clutter, are utterly nonsensical.

    1. The object over her right shoulder (your left), for example, looks like someone blended a webcam, a TV, and a nightstand.
    2. Over her left shoulder (your right), her chair exists only on that one side and blends into the counter in the background.
    3. Is it a table lamp or a wall-mounted light?
    4. The door frame behind her head isn't even aligned.
    5. Her clavicles are asymmetrical; I've never seen that on a real person.
    6. Her wispy hair strands: real hair doesn't appear out of thin air in loops.

    • Honytawk@lemmy.zip · 9 months ago

      The point isn’t that you can spot it.

      The point is that the automated system can’t spot it.

      Or are you telling me there's a person looking at every verification photo? And even if there were, would they really scan each one that thoroughly for imperfections?

      • Margot Robbie@lemmy.world · 9 months ago

        The idea of using a picture upload for automated verification is completely unviable. A much more common system is to ask you to perform a random gesture on camera on the spot, like "turn your head slowly" or "open your mouth slowly", which is trivial for a human but near impossible for AI generators.
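
        A minimal sketch of that kind of challenge flow, assuming a server that picks the gesture and enforces a short response window (all names here are illustrative, not any real verification API):

        ```python
        import secrets
        import time

        # Hypothetical gesture-challenge liveness check: the server picks a
        # random gesture the user must perform on camera within a short
        # window, so a pre-generated image or video can't be replayed.
        GESTURES = [
            "turn your head slowly",
            "open your mouth slowly",
            "raise your left hand",
            "blink twice",
        ]

        def issue_challenge(ttl_seconds: int = 60) -> dict:
            return {
                "nonce": secrets.token_hex(8),        # ties the response to this session
                "gesture": secrets.choice(GESTURES),  # unpredictable until issued
                "expires_at": time.time() + ttl_seconds,
            }

        def is_expired(challenge: dict) -> bool:
            return time.time() > challenge["expires_at"]
        ```

        The point of the nonce and the expiry is that even a very good generator can't prepare the response in advance; it has to produce the right gesture, live, inside the window.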

        • curiousPJ@lemmy.world · 9 months ago

          but near impossible for AI generators.

          …I feel like this isn’t the first time I heard that statement before.

    • phoenixz@lemmy.ca · 9 months ago

      Due to so many people trying to impersonate me on the Internet

      Yeah, see, now I'm not really sure you're the real Margot Robbie.

      Could you send me a verification picture?

      • Margot Robbie@lemmy.world · 9 months ago

        But then how will I astroturf (I mean, organically market) my current and future movies here, like the Golden Globe-winning summer blockbuster Barbie, now available on Blu-ray and select streaming services, if I get verified?

    • Nora@lemmy.ml · 9 months ago

      Micro-communities based on pre-post-truth connections? Only letting people into the community who can be vouched for by existing members.

      I've been thinking of starting a Matrix community to get away from Discord and its inevitable botting.