• BrikoX@lemmy.zip
    5 months ago

    How about the false positives? You want your name permanently associated with child porn because someone fucked up and ruined your life? https://www.eff.org/deeplinks/2022/08/googles-scans-private-photos-led-false-accusations-child-abuse

The whole system is so flawed that it has something like a 20-25% success rate.

    Or how about this system being adopted for anything else? Guns? Abortion? LGBT related issues? Once something gets implemented, it’s there forever and expansion is inevitable. And each subsequent government will use it for their personal agenda.

    • eveninghere@beehaw.org
      5 months ago

      They say the images are merely matched against a pre-determined set of images found on the web. You’re talking about a different scenario, where AI detects inappropriate content in an image.

      • vrighter@discuss.tchncs.de
        5 months ago

        Change one pixel and suddenly it doesn’t match. Do the comparison based on similarity instead, and now you’re back to false positives.
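
        The fragility of exact matching is easy to sketch. A cryptographic hash such as SHA-256 changes completely when a single bit of the input changes, so a near-identical copy of a flagged image would not match (the 4x4 "image" here is a hypothetical stand-in for real image bytes):

        ```python
        import hashlib

        # Toy "image": raw bytes of a hypothetical 4x4 grayscale bitmap.
        original = bytes([200] * 16)
        tampered = bytearray(original)
        tampered[0] ^= 1  # flip the low bit of a single pixel

        h1 = hashlib.sha256(original).hexdigest()
        h2 = hashlib.sha256(bytes(tampered)).hexdigest()

        print(h1 == h2)  # False: exact hashing misses the near-identical copy
        ```

        That is why exact-match schemes are trivially evaded, and why deployed systems lean on similarity-based matching instead, with the false-positive risk that brings.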

        • eveninghere@beehaw.org
          5 months ago

          My guess was that this law was going to permit something as simple as pixel matching. Honestly, I don’t imagine they can codify anything more sophisticated in the law. Companies don’t want false positives either, if only for the sake of profits.

          • Inductor@feddit.de
            5 months ago

            Unfortunately, I couldn’t find a source stating it would be required. AFAIK it’s been assumed that they would use perceptual hashes, since that’s what various companies have been suggesting/presenting. Like Apple’s NeuralHash, which was reverse engineered. It’s also the only somewhat practical solution, since exact matches would easily be circumvented by changing one pixel or mirroring the image.
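
            For contrast with exact matching, here is a minimal sketch of a perceptual hash in the style of "average hash" (aHash) — not NeuralHash, which is a neural-network-based scheme, just the simplest illustration of the idea. It assumes the image has already been downscaled to an 8x8 grayscale grid (64 values):

            ```python
            # Toy average hash (aHash): each bit says whether a pixel is
            # brighter than the image's mean brightness.
            def ahash(pixels):
                mean = sum(pixels) / len(pixels)
                return [1 if p > mean else 0 for p in pixels]

            def hamming(a, b):
                # Number of differing bits; small distance = "similar" images.
                return sum(x != y for x, y in zip(a, b))

            bright = [220] * 32 + [30] * 32       # hypothetical image: half light, half dark
            tweaked = bright[:]
            tweaked[0] -= 5                       # "change one pixel"
            inverted = [255 - p for p in bright]  # a genuinely different image

            print(hamming(ahash(bright), ahash(tweaked)))   # 0: near-duplicate still matches
            print(hamming(ahash(bright), ahash(inverted)))  # 64: different image is far apart
            ```

            Because matching is by distance threshold rather than equality, small edits no longer evade detection — but unrelated images that happen to land within the threshold become false positives, which is the trade-off discussed above.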

            Patrick Breyer’s page on Chat Control has a lot of general information about the EU’s proposal.

            • eveninghere@beehaw.org
              5 months ago

              Stupid regulation, honestly. Exact matches are implementable, but anything beyond that… Aren’t they basically banning E2EE at this point?

              Now I see why Signal would shut down in the EU.