themachinestops@lemmy.dbzer0.com to Technology@lemmy.world · English · 1 month ago

A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It

www.404media.co · 582 points · 104 comments
  • abbiistabbii@lemmy.blahaj.zone · 5 points · 1 month ago

    Literally the only reason I can guess is that it's to teach AI to recognise child porn. But if that's the case, why is Google doing it instead of, like, the FBI?

    • gustofwind@lemmy.world · 7 points · 1 month ago

      Who do you think the FBI would contract to do the work anyway 😬

      Maybe not Google, but it would sure be some private company. Our government almost never does the work itself; it hires the private sector.

      • alias_qr_rainmaker@lemmy.world (banned) · 1 point · edited 20 days ago

        Removed by mod

    • alias_qr_rainmaker@lemmy.world (banned) · 2 points · edited 20 days ago

      Removed by mod

      • vimmiewimmie@slrpnk.net · 1 point · 1 month ago

        What’s the ‘applescript’?

        • alias_qr_rainmaker@lemmy.world (banned) · 2 points · edited 20 days ago

          Removed by mod

          • alias_qr_rainmaker@lemmy.world (banned) · 1 point · edited 20 days ago

            Removed by mod

    • forkDestroyer@infosec.pub · 1 point · 1 month ago

      Google isn't the only service checking for CSAM. Microsoft (and likely other file-hosting services) has methods to do this too. That doesn't mean they host CSAM in order to detect it: the checks compare hash values to determine whether a picture has already been flagged as being in that category.

      This technology (PhotoDNA) has existed since 2009 and is used to detect all sorts of abusive images; this link provides good insight into the topic:

      https://technologycoalition.org/news/the-tech-coalition-empowers-industry-to-combat-online-child-sexual-abuse-with-expanded-photodna-licensing/
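      The hash-matching idea above can be sketched in a few lines. PhotoDNA itself is a proprietary *perceptual* hash (it matches resized or re-encoded copies, not just identical bytes), so this is only a simplified illustration using exact SHA-256 hashes against a hypothetical blocklist:

      ```python
      import hashlib

      # Hypothetical blocklist of hashes of known-flagged files. Real systems
      # like PhotoDNA use perceptual hashes that survive re-encoding and
      # resizing; an exact SHA-256 match only catches byte-identical copies.
      KNOWN_HASHES = {
          # sha256 of the bytes b"test", standing in for a real database entry
          "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
      }

      def file_flagged(data: bytes) -> bool:
          """Return True if this file's hash appears in the known-hash list."""
          return hashlib.sha256(data).hexdigest() in KNOWN_HASHES
      ```

      The key point the comment makes holds either way: the service only needs to store *hashes* of flagged material, never the material itself.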

    • frongt@lemmy.zip · 1 point · 1 month ago

      Google wants to be able to recognize and remove it. They don’t want the FBI all up in their business.

      • Allero@lemmy.today · 1 point · 1 month ago

        So, Google could be allowed to have the tools to collect, store, and process CSAM all over the Web without oversight?

        Pretty much everyone else would get straight to jail for attempting that.

Technology@lemmy.world

You are not logged in. However you can subscribe from another Fediverse account, for example Lemmy or Mastodon. To do this, paste the following into the search field of your instance: [email protected]

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting; duplicates may be removed.
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


  • @[email protected]
  • @[email protected]
  • @[email protected]
  • @[email protected]
Visibility: Public

This community can be federated to other instances and be posted/commented in by their users.

  • 3.61K users / day
  • 9.03K users / week
  • 15K users / month
  • 29.7K users / 6 months
  • 2 local subscribers
  • 78.5K subscribers
  • 4.95K Posts
  • 145K Comments
  • Modlog
  • mods:
  • L3s@lemmy.world
  • enu@lemmy.world
  • Technopagan@lemmy.world
  • L4sBot@lemmy.world
  • L3s@hackingne.ws