It sounds a lot like this quote from Andrej Karpathy:
Turns out that LLMs learn a lot better and faster from educational
content as well. This is partly because the average Common Crawl article
(internet pages) is not of very high value and distracts the training,
packing in too much irrelevant information. The average webpage on the
internet is so random and terrible it’s not even clear how prior LLMs
learn anything at all.
So it will end in a downward spiral: the AI starts referencing AI-written articles, new articles get written from its output, the AI then trains on those articles, which produce more articles, and so on…
As long as there's supervision during training, which there always will be, this isn't really a problem. It just shows how bad it can get if you train only on generated output.
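For anyone curious what that feedback loop looks like in the abstract, here's a toy sketch (purely illustrative, not how any real training pipeline works): each "generation" re-fits on the previous model's output, which over-weights already-likely tokens, so the distribution's entropy drops every round until it collapses onto a single token.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def retrain_on_own_output(p, alpha=1.2):
    # Toy stand-in for one generation of training on generated text:
    # likely tokens get over-represented (alpha > 1 sharpens the
    # distribution), then we renormalize back to a probability vector.
    q = [x ** alpha for x in p]
    z = sum(q)
    return [x / z for x in q]

p = [0.4, 0.3, 0.2, 0.1]  # toy token distribution
history = [entropy(p)]
for _ in range(30):
    p = retrain_on_own_output(p)
    history.append(entropy(p))

# Diversity shrinks every generation; the model ends up
# saying essentially one thing.
print(f"entropy: {history[0]:.3f} bits -> {history[-1]:.3g} bits")
print(f"mass on the most likely token: {p[0]:.6f}")
```

The `alpha` knob and the four-token "vocabulary" are made up for the demo; the only point is that without fresh human-written data mixed in, the self-referential loop loses information at every step.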
How? We just learned that they train on social media.
Short-term profit is all they care about, until this platform crashes down completely.
Well, it learned to put glue on pizza, eat rocks, and smoke while pregnant.
You forgot jumping off the Golden Gate bridge