• Gaja0@lemmy.zip · 3 days ago

    It’s funny until you realize Google dumped $93M into convincing the general public that AI is the future before thrusting a half-baked technology into our daily lives.

    • Voroxpete@sh.itjust.works · 4 days ago

      I mean, a lot of the people pointing that out are actually doing so to indicate the dangers of relying on AI in the first place.

      If you read some of OP’s replies it becomes clear what happened here: they asked the bot how to fix something, didn’t understand the instructions it replied with, and then just went and said “Hey, I don’t get it, so you do it for me.”

      Anyone who knew what they were doing would have noticed the bad delete command the bot presented (improperly formatted, and with no safety checks), but because OP figured “Hey, knowing stuff is for suckers”, they ended up losing all their stuff.
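
For illustration only - the actual command from the thread isn’t quoted anywhere, so this is a hypothetical sketch of the safety checks a careful user would look for before running any AI-suggested delete:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: the guards a delete command should have.
# Not the command from the thread (which isn't quoted anywhere).
set -euo pipefail

safe_rm() {
  local target="$1"
  # An empty variable turns 'rm -rf "$target"' into a disaster.
  [[ -n "$target" ]] || { echo "refusing: empty path" >&2; return 1; }
  # Refuse obviously catastrophic targets.
  [[ "$target" != "/" && "$target" != "$HOME" ]] \
    || { echo "refusing: $target" >&2; return 1; }
  # Only delete things that actually exist.
  [[ -e "$target" ]] || { echo "refusing: no such path: $target" >&2; return 1; }
  # '--' stops rm from parsing a weird filename as a flag.
  rm -rf -- "$target"
}
```

A bot-generated one-liner with none of these checks is exactly the kind of thing worth reading twice before pressing enter.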

      • naevaTheRat@lemmy.dbzer0.com · 3 days ago

        How would you feel if you bought pharmaceuticals that promised to heal you and they made you sick? What about a ride at a theme park with no warning signs that failed and hurt you?

        The whole marketing thing of these garbage devices is based around abusing trust. There are no warnings that you need to be an expert, in fact they claim the opposite.

        The person is a rube, but only evil people abuse the trust of others, and only evil people blame someone for having their trust abused. Being able to trust people is good, actually, and we should viciously beat to death everyone who violates social trust.

        • Ledivin@lemmy.world · 3 days ago

          How would you feel if you bought a hammer and then it broke your hand?

          You wouldn’t feel anything at all, because it’s an inane scenario that can’t actually happen without you misusing the tool.

          • naevaTheRat@lemmy.dbzer0.com · 3 days ago

            If someone told me the hammer was safe, I hit something with it, the temper was bad, it shattered and cut me, and it was established to be deliberate deception beyond even negligence, then yeah, I’d want my pound of flesh.

  • khepri@lemmy.world · 3 days ago

    IF (and this is a big if) you are going to allow an AI direct access to your files and your command line, for the love of Gabe, sandbox that shit and run a backup of the folders you give it access to. We know AI makes mistakes like this. Just act as if you were giving your little brother access to your drives and your command line on his first day. I get we’re all still learning about this stuff, but giving an AI agent command-line access and full access to a local drive you have no backup of is leaving-Little-Timmy-home-alone-with-a-loaded-shotgun-and-an-open-bottle-of-pills levels of irresponsibility.
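
That advice can be sketched in a few lines of shell. Everything here is a placeholder (“my-agent-image”, the paths) - the point is the shape: snapshot first, then give the agent a container that can only see one folder:

```shell
#!/usr/bin/env bash
# Sketch of "backup, then sandbox". The Docker image name and the
# paths are placeholders, not a real product.
set -euo pipefail

backup_workspace() {
  local src="$1" dest="$2"
  # Snapshot the one folder the agent will be allowed to touch
  # (rsync -a works just as well if you have it).
  mkdir -p "$dest"
  cp -a "$src/." "$dest/"
}

run_sandboxed_agent() {
  local workspace="$1"
  # The container sees *only* the workspace: no other drives,
  # no host shell, no network, capped memory and process count.
  docker run --rm -it \
    --network none \
    --memory 2g --pids-limit 256 \
    -v "$workspace":/workspace \
    my-agent-image
}

# Usage (commented out): back up first, then let the agent loose.
# backup_workspace "$HOME/projects/demo" "$HOME/backups/demo-$(date +%F)"
# run_sandboxed_agent "$HOME/projects/demo"
```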

  • BassTurd@lemmy.world · 4 days ago

    Watching vibe coders get blown up by their own ignorance and stupidity is such a great pastime. Fuck AI.

  • Ex Nummis@lemmy.world · 4 days ago

    The irony of having run out of tokens on that last message… “Your problem now, peace out. Or pony up”.

  • falseWhite@lemmy.world · 4 days ago

    Why would you give AI access to the whole drive? Why would you allow AI to run destructive commands on its own without reviewing them?

    The guy was asking for it. I really enjoy seeing these vibe coders imagine they are software engineers and fail miserably with their drives and databases wiped.

    • khepri@lemmy.world · 3 days ago

      Why would you ask AI to do any operation on even a single file, let alone an entire local drive, that wasn’t backed up? Then again, I’ve been using and misusing computers long enough that I’ve blown up my own shit many times in many stupid ways, so I can’t honestly say that 20 years ago this wouldn’t have been me lol.

    • tburkhol@lemmy.world · 4 days ago

      If he knew what he was doing, would he need to be vibe coding? The target audience is exactly the people most susceptible to collateral damage.

      • INeedMana@piefed.zip · 3 days ago

        I’ll probably get eaten here, but here goes: I do use LLMs when coding. But they should NEVER be used in unknown waters. To quickly get 50 lines of boilerplate and fill out the important 12 - sure. To see how some nested thing can be written in a syntax I’ve forgotten - yes. To get an example that tells me where to start searching the documentation - OK. But “I asked it to do X, I don’t understand what it spewed out, let’s roll”? Hell no, that’s a ticking bomb with a very short fuse. Unfortunately, the marketing has pushed LLMs as something you can trust. I already feel like I’m being treated as a zealot dev afraid for his job whenever I warn people around me not to trust an LLM’s output any more than a search-engine result.

    • aesthelete@lemmy.world · 4 days ago

      I don’t know how this one works, but many of them get access through the IDE, because the IDE has full disk access - due to being an IDE.

      LLMs sometimes use an MCP server to access tools, which are usually coded to require consent before each step - but honestly that should always be required.

      I hate these stupid things, but I am forced to use them. I think there should be a suggested-patch type of workflow instead of just letting them run roughshod all over your computer, but Google and Microsoft are pursuing “YOLO mode” for everything anyway, even though it’s alarmingly obvious how terrible an idea that is.

      We have containers and VMs; these fucking things should be isolated, and it should be impossible for them to alter files without consent.
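
The “consent before each step” idea is simple enough to sketch: instead of a raw shell, the agent gets a wrapper that shows the exact command and waits for an explicit yes. This is a hypothetical illustration, not a real MCP server:

```shell
#!/usr/bin/env bash
# Hypothetical consent gate: the agent goes through this instead of
# getting a shell. Not an actual MCP implementation.
set -euo pipefail

confirm_and_run() {
  # Show the exact command and require an explicit 'y' to run it.
  printf 'Agent wants to run:\n  %s\nAllow? [y/N] ' "$*" >&2
  local answer
  read -r answer
  [[ "$answer" == "y" ]] || { echo "denied" >&2; return 1; }
  "$@"
}
```

Anything short of deny-by-default ends up exactly where this thread started.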

  • RememberTheApollo_@lemmy.world · 4 days ago

    I hate to say it, but this is the future. As OS devs cram AI into everything so that users don’t have to understand technology (because they’d rather take selfies and scroll TikTok), the surrender of control over your own devices, and the information contained on them, will only become more complete.

  • psx_crab@lemmy.zip · 4 days ago

    2017: we lock your data behind paywall without your authorisation, pay us 2 bitcoin to unlock it.

    2025: whoops, I deleted your D: drive, do check it for the extent of the damage.

    Will people ever learn?

    Edit: omfg, quota limit hit right after the drive is emptied. Seriously, why would people even let an AI hold their whole egg basket? This is all on OOP.