cross-posted from: https://quokk.au/c/programmer_humor/p/475451/don-t-do-ai-and-code-kids
Original Reddit thread: https://reddit.com/r/google_antigravity/comments/1p82or6/google_antigravity_just_deleted_the_contents_of/
No sane IDE implementation lets LLMs run commands without a sandbox.
WTF is this?
An insane IDE implementation that lets LLMs run commands without a sandbox.
Unreal.
It’s like they’re trying to get the public to despise LLMs. It’s certainly working.
My thoughts exactly. Why the heck does a brain-dead LLM have access to your file system, or to anything except interacting with you? lol
I mean, it makes sense inside like a docker container or VM.
…Not like this.
Which is also why I don’t want tight AI integration in an OS. It’s fine as a chatbot, but not as an administrator, and I suspect MS will one day hand it more control over people’s PCs than they allow the users themselves.
It’s funny until you realize Google dumped $93M into convincing the general public that AI is the future before thrusting a half-baked technology into our daily lives.
Well, there’s the problem. In AI development, $93M is nothing. It’s like they threw pocket change at a child and demanded an industry-leading AI.
If they threw $93m at me, I’d start saying their large lying machine was the solution to all my problems.
I think this is just the cost of its adverts.
The copium in the reddit thread is hilarious.
“The issue is that you had a space in your path name”
No, the issue is that the AI wiped an entire drive! 🤣
I can’t stand those types of people.
“I’m blaming you because my 401k can’t accept this information or it will collapse.”
I mean, a lot of the people pointing that out are actually doing so to indicate the dangers of relying on AI in the first place.
If you read some of OP’s replies, it becomes clear what happened here: they asked the bot how to fix something, didn’t understand the instructions it replied with, and then just said “Hey, I don’t get it, so you do it for me.”
Anyone who knew what they were doing would have noticed the bad delete command the bot presented (improperly formatted, and with no safety checks), but because OP figured “Hey, knowing stuff is for suckers”, they ended up losing all their stuff.
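For what it’s worth, the “safety checks” a sane agent harness could run before any delete aren’t complicated. Here’s a minimal, hypothetical Python sketch — the function name and workspace layout are illustrative, not anything Antigravity actually does — and note that `pathlib` handles paths with spaces just fine:

```python
# Hypothetical guard an agent harness could run before any delete.
# safe_to_delete and the workspace convention are made up for
# illustration; no real IDE is claimed to work this way.
from pathlib import Path

def safe_to_delete(target: str, workspace: str) -> bool:
    """Allow deletes only strictly inside the agreed workspace,
    never at a drive/filesystem root and never the workspace itself."""
    t = Path(target).resolve()
    ws = Path(workspace).resolve()
    if t == Path(t.anchor):   # refuse "/" or "D:\" outright
        return False
    return ws in t.parents    # must be strictly inside the workspace
```

A path containing a space passes or fails on exactly the same rules as any other path, which is why “you had a space in your path name” is such a weak excuse.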
How would you feel if you bought pharmaceuticals that promised to heal you and they made you sick? What about a ride at a theme park with no warning signs that failed and hurt you?
The whole marketing thing of these garbage devices is based around abusing trust. There are no warnings that you need to be an expert, in fact they claim the opposite.
The person is a rube, but only evil people abuse the trust of others and only evil people blame people for having their trust abused. Being able to trust people is good actually, and we should viciously beat to death everyone that violates social trust.
How would you feel if you bought a hammer and then it broke your hand?
You wouldn’t feel anything at all, because it’s an inane scenario that can’t actually happen without you misusing the tool.
If someone told me the hammer was safe, and when I hit something with it the temper was bad and it shattered and cut me, and it was established to be deliberate deception beyond even negligence, then yeah, I’d want my pound of flesh.
Hell yeah
IF (and this is a big if) you are going to allow an AI direct access to your files and your command line, for the love of Gabe, sandbox that shit and keep a backup of the folders you give it access to. We know AI makes mistakes like this. Just act as if you were giving your little brother access to your drives and your command line on his first day. I get that we’re all still learning about this stuff, but allowing an AI agent command-line access and full access to a local drive you have no backup of is just leaving Little Timmy at home alone with a loaded shotgun and an open bottle of pills levels of irresponsibility.
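If you do let an agent near a folder, the bare-minimum backup is a few lines. A hypothetical Python sketch (a stand-in for real versioned backups, not a replacement for them):

```python
# Minimal stand-in for a real backup: copy the agent-visible folder
# aside before every agent run. Function name is illustrative.
import shutil
import tempfile
from pathlib import Path

def snapshot(folder: str) -> Path:
    """Copy `folder` into a fresh temp dir; return the backup path."""
    dest = Path(tempfile.mkdtemp(prefix="agent-backup-")) / Path(folder).name
    shutil.copytree(folder, dest)
    return dest
```

Run it before the agent touches anything; if the agent wipes the folder, you restore from the snapshot instead of begging a chatbot that just hit its quota limit.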
D:
:ᗡ
Perfection
Live by the slop, die by the slop. Also: no backup, no pity
Watching vibe coders get blown up by their own ignorance and stupidity is such a great pastime. Fuck AI.
Hilarious! I knew this was going to happen! Just not that fast!
The irony of having run out of tokens on that last message… “Your problem now, peace out. Or pony up”.
Fuck Reddit and Fuck Spez.
So brave.
STFU
i mean does anyone disagree?
I still announce it anytime I see the other site mentioned or in the meme/screenshot.
LLMs are like a power saw whose blade is attached by a chain. You don’t know what it’s going to cut.
Why would you give AI access to the whole drive? Why would you allow AI to run destructive commands on its own without reviewing them?
The guy was asking for it. I really enjoy seeing these vibe coders imagine they are software engineers and fail miserably with their drives and databases wiped.
Why would you ask AI to do any operation on even a single file, let alone an entire local drive, that wasn’t backed up? I’ve been using and misusing computers long enough that I’ve blown up my own shit many times in many stupid ways, though, so I can’t honestly say that 20 years ago this wouldn’t have been me lol.
If he knew what he was doing, would he need to be vibe coding? The target audience are exactly the people most susceptible to collateral damage.
I’ll probably get eaten here, but here goes: I do use LLMs when coding. But they should NEVER be used in unknown waters. To quickly get the 50 lines of boilerplate and fill out the important 12 - sure. To see how some nested thing can be written in a syntax I’ve forgotten - yes. To get an example so I know where to start searching the documentation - ok. But “I asked it to do X, I don’t understand what it spewed out, let’s roll”? Hell no, that’s a ticking bomb with a very short fuse. Unfortunately, the marketing has pushed LLMs as things one can trust. I already feel like I’m being treated as a zealot dev afraid for his job when I warn people around me not to trust an LLM’s output any more than a search-engine result.
I don’t know how this one works but many of them can get access through the IDE because the IDE has full disk access, due to being an IDE.
LLMs sometimes use an MCP server to access tools, and those tools are usually coded to require consent before each step - but that should really be mandatory, always.
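Roughly what consent-before-each-step looks like, as a hypothetical Python sketch - `TOOLS` and `run_tool` are made up for illustration, not the actual MCP API:

```python
# Sketch of a consent gate: every tool call must be explicitly
# approved before it executes. The tool registry and call shape
# are hypothetical, not a real MCP implementation.
TOOLS = {
    "read_file": lambda path: f"contents of {path}",
    "delete_file": lambda path: f"deleted {path}",
}

def run_tool(name, args, approve):
    """Refuse to execute any tool call the user has not approved."""
    if not approve(name, args):
        raise PermissionError(f"user declined tool call: {name}")
    return TOOLS[name](*args)
```

With `approve=lambda name, args: False` nothing ever runs; a real harness would prompt the user per call instead of defaulting to “YOLO mode.”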
I hate these stupid things, but I am forced to use them. I think there should be a suggested-patch workflow instead of just letting them run roughshod all over your computer, but Google and Microsoft are pursuing “YOLO mode” for everything anyway, even though it’s alarmingly obvious how terrible an idea that is.
We have containers and VMs, these fucking things should be isolated and it should be impossible for them to alter files without consent.
I hate to say it, but this is the future. As OS devs cram AI into everything so that users don’t have to understand technology (because they’d rather take selfies and scroll TikTok), the surrender of your control over your own devices, and the information on them, will only become more complete.
files are overrated anyway when you got antigravity
Just ask the AI to generate you new, better files
2017: we lock your data behind a paywall without your authorisation; pay us 2 bitcoin to unlock it.
2025: whoops, I deleted your D: drive, do check it for the extent of the damage.
Will people ever learn?
Edit: omfg, quota limit hit right after the drive is empty. Seriously, why would people even allow AI to hold their egg basket? This is all on OOP.