My takeaway from this is:
- Get a bunch of AI-generated slop and put it in a bunch of individual .htm files on my webserver (a sketch in Python follows this list).
- When my bot user agent filter is invoked in Nginx, instead of returning 444 and closing the connection, return a random .htm of AI-generated slop (instead of serving the real content).
- Laugh as the LLMs eat their own shit
- ???
- Profit
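A minimal sketch of the pre-generation step in Python, assuming a hypothetical output directory and a placeholder word list standing in for actual AI-generated text; the Nginx side would then rewrite filtered bot requests to one of these files instead of returning 444:

```python
# Pre-generate a pile of interlinked junk pages for scrapers.
# /var/www/slop and the word list are placeholders, not real values.
import pathlib
import random

OUT = pathlib.Path("/var/www/slop")   # hypothetical directory
WORDS = ["synergy", "quantum", "paradigm", "blockchain", "holistic",
         "disrupt", "leverage", "ideate", "scalable", "frictionless"]
N_PAGES = 100

def junk_text(n_words=200):
    """Cheap stand-in for AI-generated slop."""
    return " ".join(random.choices(WORDS, k=n_words))

OUT.mkdir(parents=True, exist_ok=True)
for i in range(N_PAGES):
    nxt = random.randrange(N_PAGES)   # every page links to another one
    html = (f"<html><body><p>{junk_text()}</p>"
            f'<a href="/slop/slop-{nxt}.htm">read more</a></body></html>')
    (OUT / f"slop-{i}.htm").write_text(html)
```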
I might just do this. It would be fun to write a quick Python script to automate it so it keeps going forever: each page links to another freshly generated junk HTML page, forever.
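A rough sketch of that script using only the standard library; every request returns fresh gibberish plus a link to another junk page, so a crawler that follows links never runs out (handler name, port, and word list are all made up):

```python
# Endless junk-page server: each response links to another junk URL.
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

WORDS = ["synergy", "quantum", "paradigm", "blockchain", "holistic",
         "disrupt", "leverage", "ideate", "scalable", "frictionless"]

class JunkHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Random next hop keeps the crawl going forever.
        next_page = f"/junk/{random.getrandbits(32):08x}.htm"
        body = ("<html><body>"
                f'<p>{" ".join(random.choices(WORDS, k=150))}</p>'
                f'<a href="{next_page}">continue</a>'
                "</body></html>").encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), JunkHandler).serve_forever()
```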
So now LLM makers actually have to sanitize their datasets? The horror…
I don’t think that’s tractable.
Inbreeding
What are you doing step-AI?
Imo this is not a bad thing.
All the big LLM players are staunchly against regulation; this is one of the outcomes of that. So, by all means, please continue building an ouroboros of nonsense. It’ll only make the regulations that eventually get applied to ML stricter and more incisive.
This reminds me of the low-background steel problem: https://en.m.wikipedia.org/wiki/Low-background_steel
So kinda like the human centipede, but with LLMs? The LLMillipede? The AI Centipede? The Enshittipede?
Except it just goes in a circle.
))<>((
How many times is this same article going to be written? Model collapse from synthetic data is not a concern at any scale when human data is in the mix. We now have entire series of models trained mostly on synthetic data: https://huggingface.co/docs/transformers/main/model_doc/phi3. When training entirely on unassisted model outputs, error accumulates with each generation, but that isn't a concern in any real scenario.
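The generational-error point is easy to see with a toy experiment (an illustration only, nothing to do with Phi-3's actual pipeline): fit a Gaussian, sample from the fit, refit, and repeat. With no real data in the mix the parameters drift generation over generation; keep some real data in each batch and they stay put.

```python
# Toy model-collapse demo: each "generation" fits a Gaussian to the
# previous generation's samples. MIX_REAL = 0.0 lets the fit drift;
# MIX_REAL = 0.5 keeps it pinned near the true (0, 1).
import random
import statistics

MIX_REAL = 0.5        # fraction of genuine "human" data per generation
N = 1000

real = [random.gauss(0.0, 1.0) for _ in range(N)]
mu, sigma = statistics.fmean(real), statistics.stdev(real)

for gen in range(1, 11):
    synthetic = [random.gauss(mu, sigma) for _ in range(N)]
    k = int(MIX_REAL * N)
    batch = random.sample(real, k) + synthetic[: N - k]
    mu, sigma = statistics.fmean(batch), statistics.stdev(batch)
    print(f"gen {gen:2d}: mu={mu:+.3f} sigma={sigma:.3f}")
```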
Good. Let the monster eat itself.
We already have open-source AI. This will only affect people trying to make it better than what Stable Diffusion can do, make a new type of AI entirely (like music, but that's not a very AI-saturated market yet), or update existing AI with new information like skibidi toilet.
Anyone old enough to have played with a photocopier as a kid could have told you this was going to happen.