Another AI fail. Letting AI write code and modify your file system without sandboxing and backups. What could go wrong?

  • NeatNit@discuss.tchncs.de
    5 months ago

    None of this would happen if people recognized that, at best, AI has the intelligence level of a child. It has a lot of knowledge (some of which is hallucinated, but that’s beside the point) but none of the responsibility that you’d hope an adult would have. It’s also not capable of learning from its own mistakes or of being careful.

    There’s a whole market for child safety stuff: corner foam, child-proof cabinet locks, power plug covers, etc… You want all of that in your system if you let the AI run loose.

    • Jo Miran@lemmy.ml
      5 months ago

      A child, on acid and meth. You should never let it run loose, no matter how many safeguards. Not if your code is business critical.

  • aramova@infosec.pub
    5 months ago

    Gemini-cli has a sandbox environment and a revert method you can enable.
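    For reference, a rough sketch of what enabling those looks like. Flag names are from my recollection of the Gemini CLI docs and may differ by version, so check `gemini --help` before relying on them:

    ```shell
    # Sketch only; verify flag names against `gemini --help` for your version.

    # Start the CLI with its sandbox enabled, so shell commands and
    # file operations run in an isolated environment:
    gemini --sandbox

    # Start with checkpointing enabled, so file edits are snapshotted
    # and can be rolled back (e.g. via the /restore command):
    gemini --checkpointing
    ```

    Either one would have prevented the disaster in the article; using both costs almost nothing.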

    This is more of a FAFO article than anything. The tools are there to keep you from fucking up; you chose not to use them.

    Play stupid games, win stupid prizes, etc. etc.