• snaggen@programming.dev · 9 months ago

      I actually asked ChatGPT about a specific issue I had solved a while back. It was one of those issues where it looks like a simple naive solution would be sufficient, but various conditions make that fail, so you have to go with a more complex solution. I asked about it to see what it would answer, and it went with the simpler solution, with some adjustments. The code also didn’t compile. But it looked interesting enough to make me question myself: maybe it was just me who had failed at the simpler solution. So I actually tried to fix the compile errors to see if I could get it working. The more I tried to fix its code, the more obvious it got that it didn’t have a clue what it was doing. But with its confidence and its ability to make things look plausible, it sent me on a wild goose chase. And this is why I am not using LLMs for programming. They are basically overconfident junior devs who like mansplaining.

      • marx2k@lemmy.world · 9 months ago

        I don’t do it enough, but I do enjoy using it (it being perplexity.ai) for getting code examples of the stuff I’m always looking up over and over: a YAML sample of a CloudFormation or CDL snippet for a very specifically configured resource, a YAML sample of an Ansible module that does a thing, a Python sample of a specific lambda method, a regex for email addresses.
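An email regex is a good example of the kind of look-it-up-again snippet being described. A minimal sketch in Python, using a pragmatic pattern (deliberately not RFC 5322-complete; the helper name is just for illustration):

```python
import re

# A pragmatic email pattern: local part, "@", domain labels, and a
# TLD of at least two letters. Good enough for quick validation,
# not a full RFC 5322 implementation.
EMAIL_RE = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def looks_like_email(s: str) -> bool:
    """Return True if the whole string matches the simple pattern."""
    return EMAIL_RE.fullmatch(s) is not None

print(looks_like_email("user@example.com"))  # True
print(looks_like_email("not-an-email"))      # False
```

Patterns like this trade strictness for readability, which is usually the right call for quick one-off validation.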

      • Aurenkin@sh.itjust.works · 9 months ago

        It’s not always right but it saves me tonnes of time at work, usually when I want to do something simple in a language or environment I’m not totally familiar with.

        • ulterno@lemmy.kde.social · 9 months ago

          It has encouraged my colleagues to ask it for answers that would be easily available with a Google search (or by asking me - my fault for acting like they’re a pain for not extrapolating from previous explanations). Resulting in:

          1. Trying to sudo apt install on a RHEL system
            • Looking for an apt RPM package on the internet
          2. Looking for RPM packages of almost every unavailable thingy on the internet.
          3. In general, completing 90% of a task (not 90% of the time, but the first 90% of it) and going with it, only to later realise the remaining 10% invalidated all their effort.
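For the apt-on-RHEL mix-up above, the common apt subcommands do have direct dnf equivalents. A tiny illustrative helper (the function and its mapping are hypothetical, just to show the correspondence; it only prints the equivalent command rather than running it):

```shell
# Map a common apt invocation to its dnf equivalent on RHEL/Fedora.
# Illustrative and not exhaustive.
apt_to_dnf() {
  case "$1" in
    install) echo "dnf install $2" ;;      # apt install pkg  -> dnf install pkg
    remove)  echo "dnf remove $2" ;;       # apt remove pkg   -> dnf remove pkg
    update)  echo "dnf check-update" ;;    # apt update       -> dnf check-update
    search)  echo "dnf search $2" ;;       # apt search term  -> dnf search term
    *)       echo "no direct dnf equivalent for 'apt $1'" ;;
  esac
}

apt_to_dnf install htop   # prints: dnf install htop
apt_to_dnf update         # prints: dnf check-update
```

The broader point stands: knowing which package manager your distro uses is exactly the kind of context an LLM answer may silently get wrong.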
        • Barbarian@sh.itjust.works · 9 months ago

          It can reliably copy the simple things in its training data from Stack Overflow.

          But at that point, why not just go to Stack Overflow instead?

          • Aurenkin@sh.itjust.works · 9 months ago

            I’m not saying it’s going to take anyone’s job anytime soon, but it’s a lot quicker to get something tailor-made for your problem than going to Stack Overflow. Everyone should use the tools that work for them, but don’t sleep on this stuff; like any tool, it’s really helpful once you know how to use it.

            • Holzkohlen@feddit.de · 9 months ago

              Agreed. But I think it is utterly useless if you aren’t experienced enough to tell if it is bullshitting. Almost every time I have asked for a little adjustment, it just makes something up that looks good on first glance. My favorite is when it invents python libraries that magically handle all the difficult stuff. But man is it useful for my crappy little bash scripts or regex.

          • DrCake@lemmy.world · 9 months ago

            I find that sometimes I can’t quite describe a problem well enough for Google to find results. The conversational nature of ChatGPT means I can usually get a good enough answer from it.

          • TheUnamusedFox@lemmy.dbzer0.com · 9 months ago

            GPT-4 is pretty awesome for simple stuff. I’ve just started learning Python (knowing no other language) and made my first project a PyQt GUI for editing the config of a FOSS project. Its reasoning ability is not great, but when you clearly lay out what you want to do and how you want to do it, it becomes a fantastic natural-language-to-code interpreter. All the fiddly bits I dread typing out I just pop into GPT-3.5, and the more complicated stuff into GPT-4.

            I have learned a lot from debugging whenever it gets stuck, and being able to create an actual usable program right from the start is awesome.

            Even better is slowly realizing you are understanding what’s going on, and the dread of actually studying to learn the language becomes a genuine desire to learn more.