• Admiral Patrick@dubvee.org · 1 year ago

    My takeaway from this is:

    1. Get a bunch of AI-generated slop and put it in a bunch of individual .htm files on my webserver.
    2. When my bot user-agent filter is invoked in Nginx, instead of returning 444 and closing the connection, return a random .htm file of AI-generated slop (instead of serving the real content)
    3. Laugh as the LLMs eat their own shit
    4. ???
    5. Profit
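
    A minimal sketch of step 2 as an Nginx config fragment. The `map` and `split_clients` directives are real Nginx; the bot user-agent regexes, file paths, and slop file names are illustrative assumptions, and a real setup would list many more crawler agents:

    ```nginx
    # Flag known AI-crawler user agents (example patterns only)
    map $http_user_agent $is_bot {
        default   0;
        ~*GPTBot  1;
        ~*CCBot   1;
    }

    # Pseudo-randomly spread bot traffic across pre-generated slop pages
    split_clients "${remote_addr}${request_uri}" $slop_page {
        25%  slop1.htm;
        25%  slop2.htm;
        25%  slop3.htm;
        *    slop4.htm;
    }

    server {
        listen 80;
        root /var/www/site;

        location / {
            # Step 2: instead of "return 444;", serve slop to bots
            if ($is_bot) {
                rewrite ^ /slop/$slop_page last;
            }
        }

        location /slop/ {
            root /var/www;   # slop files assumed to live in /var/www/slop/
        }
    }
    ```

    `split_clients` hashes its input string, so the same bot request maps to the same page, but across IPs and URIs the selection is effectively random.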
    • mesa@lemmy.world · 1 year ago

      I might just do this. It would be fun to write a quick Python script to automate it so that it keeps going forever: each page has a link that regenerates junk, pointing to yet another junk HTML file, forever.
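
      A hedged sketch of that endless-junk idea, using only the Python standard library: a tiny server that answers every path with random filler text plus a link to another random junk page, so a crawler that follows links never runs out. The word list, port, and all names are illustrative assumptions, not anyone's actual setup:

      ```python
      # Endless junk-page server: every GET returns buzzword soup and a link
      # to a freshly invented path, so link-following crawlers loop forever.
      import random
      import string
      from http.server import BaseHTTPRequestHandler, HTTPServer

      WORDS = ["synergy", "blockchain", "paradigm", "quantum", "disrupt", "leverage"]

      def junk_paragraph(n_words=80):
          """Assemble a paragraph of grammatically empty buzzword soup."""
          return " ".join(random.choice(WORDS) for _ in range(n_words))

      def random_path():
          """A fresh pseudo-random URL for the 'next' junk page."""
          return "/" + "".join(random.choices(string.ascii_lowercase, k=12)) + ".htm"

      class JunkHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              body = (
                  "<html><body><p>{}</p>"
                  '<a href="{}">read more</a>'
                  "</body></html>"
              ).format(junk_paragraph(), random_path()).encode()
              self.send_response(200)
              self.send_header("Content-Type", "text/html")
              self.send_header("Content-Length", str(len(body)))
              self.end_headers()
              self.wfile.write(body)

      if __name__ == "__main__":
          HTTPServer(("127.0.0.1", 8080), JunkHandler).serve_forever()
      ```

      Generating pages on the fly instead of pre-writing .htm files means zero disk use and no repetition for the crawler to detect.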

  • Ekky@sopuli.xyz · 1 year ago

    So now LLM makers actually have to sanitize their datasets? The horror

  • gravitas_deficiency@sh.itjust.works · 1 year ago

    Imo this is not a bad thing.

    All the big LLM players are staunchly against regulation; this is one of the outcomes of that. So, by all means, please continue building an ouroboros of nonsense. It’ll only make the regulations that eventually get applied to ML stricter and more incisive.

  • paddirn@lemmy.world · 1 year ago

    So kinda like the human centipede, but with LLMs? The LLMillipede? The AI Centipede? The Enshittipede?

  • BetaDoggo_@lemmy.world · 1 year ago

    How many times is this same article going to be written? Model collapse from synthetic data is not a concern at any scale when human data is in the mix. We have entire series of models now trained mostly on synthetic data: https://huggingface.co/docs/transformers/main/model_doc/phi3. When training entirely on unassisted outputs, errors do accumulate with each generation, but that isn't a concern in any realistic scenario.

  • TriflingToad@lemmy.world · 1 year ago

    We already have open-source AI. This will only affect people trying to make it better than what Stable Diffusion can do, make an entirely new type of AI (like music generation, though that's not a very AI-saturated market yet), or update existing AI with new information, like skibidi toilet.

  • leftzero@lemmynsfw.com · 1 year ago

    Anyone old enough to have played with a photocopier as a kid could have told you this was going to happen.