An in-depth police report obtained by 404 Media shows how a school, and then the police, investigated a wave of AI-powered “nudify” apps in a high school.

  • redcalcium@lemmy.institute · 2 years ago

    I kinda doubt porn would be a problem with Sora, just like it’s not a problem with DALL-E. The model is locked down on OpenAI’s servers, and open source models are nowhere near as good yet. Even if a comparable downloadable model existed, it would be computationally expensive; I doubt teens could freely run it.

    • just another dev@lemmy.my-box.dev · 2 years ago

      open source models are nowhere near as good yet

      By the time a law would be adopted, it probably will be. I wouldn’t want to rely on the “kindness” of commercial entities as the sole protector of consumer welfare. We’ve seen how well that works with Google and Facebook.

    • cm0002@lemmy.world · 2 years ago

      Even if a comparable downloadable model existed, it would be computationally expensive; I doubt teens could freely run it.

      For now, but with every new tech, hardware and efficiency optimizations are never far behind. Especially since the performance required for training != the performance required for inference.

      Considering the glacial speed at which our government moves, I’d bet on those hardware efficiency optimizations making it out before any significant law gets implemented.

    • abhibeckert@lemmy.world · 2 years ago

      Um… the Taylor Swift porn deepfakes were DALL-E.

      Sure, they try to prevent that stuff, but it’s hardly perfect. And not all bullying is easily spotted. Imagine a deepfake of a kid sending a text message, but the bubbles are green. Or maybe they’re smiling at someone they hate.

      Also, Stable Diffusion is more than good enough for this stuff. It’s free, and any decent gaming laptop can run it. Mine takes 20 seconds to produce a decent deepfake… I’ve used it to touch up my own photos.