• Hnery@feddit.org
    2 months ago

    llama3 is not bad and you can easily run the smaller ones on an average desktop computer

    • PolarisFx@lemmy.dbzer0.com
      2 months ago

      But slowly. I filled my home server with whatever CUDA-capable cards I had, and it’s fine for SD, but I found llama way too slow. I rented a dual A2000 instance for a couple of weeks and it was bearable, but still not great.