You misspelled Austrians.
The main limitation is VRAM, but I doubt any model is going to be particularly fast.
I think phi3:mini on Ollama might be an OK-ish fit for Python: it's a small model, but it was trained on Python codebases.
The bonus panel explains why it doesn’t.
If you want to know more about how unhinged the dude is, I recommend reading this email thread on the DNG mailing list.
I’ve been working for longer than I was in school, and I don’t have regular nightmares about being stuck in an office building. Just saying.