Another AI fail: letting AI write code and modify your file system without sandboxing or backups. What could go wrong?
None of this would happen if people recognized that, at best, AI has the intelligence of a child. It has a lot of knowledge (some of it hallucinated, but that's beside the point) and none of the responsibility you'd hope an adult would have. It's also incapable of learning from its own mistakes or of being careful.
There's a whole market for child-safety gear: corner foam, child-proof cabinet locks, power plug covers, and so on. You want all of that in your system if you let the AI run loose.
A child, on acid and meth. You should never let it run loose, no matter how many safeguards. Not if your code is business-critical.
I am never quite sure if the I in AI stands for intelligence or ignorance.
If you need an LLM to move files in a folder, maybe you should not be a PM.
Gemini-cli has a sandbox environment and a revert method you can enable.
This is more of a FAFO article than anything. The tools are there for you not to fuck up; you choose not to use them.
Play stupid games, win stupid prizes, etc. etc.
Indeed, there were a lot of "buck ups" [sic] here.