• reddig33@lemmy.world
    2 months ago

    “Do you want me to use ChatGPT to do that?”

    No. I don’t. I really really don’t.

    • Draupnir@lemmy.world
      2 months ago

You can turn off its ability to hand a request off to ChatGPT when it can’t resolve the request on its own.

• Ghostalmedia@lemmy.world (OP)
      2 months ago

      IMHO, this is how integrations with ChatGPT should work: default to a local model or a private cloud model, and if that doesn’t work, ask the user whether they want the query to go to another LLM. And let people turn off the external LLM prompts entirely.

      Make it opt-in. And make opting out very prominent.
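The flow described in that comment can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's actual implementation; every name here (`local_model`, `private_cloud_model`, `ExternalPolicy`, and the stubbed answers) is hypothetical.

```python
from enum import Enum

class ExternalPolicy(Enum):
    ASK = "ask"      # prompt the user before each external hand-off (the default)
    NEVER = "never"  # external LLM prompts turned off entirely

# Hypothetical stand-ins for the real handlers.
def local_model(query):
    # Pretend the on-device model only handles simple requests.
    return "local answer" if "timer" in query else None

def private_cloud_model(query):
    # Pretend the private cloud model can't help either.
    return None

def answer(query, policy, user_confirms):
    """Try local first, then private cloud; go external only with consent."""
    for handler in (local_model, private_cloud_model):
        result = handler(query)
        if result is not None:
            return result
    if policy is ExternalPolicy.NEVER or not user_confirms(query):
        return "Sorry, I can't help with that on-device."
    return f"external answer for {query!r}"
```

With `ExternalPolicy.NEVER`, or when the user declines the prompt, the query never leaves the device or private cloud; only an explicit yes under `ASK` sends it out.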