

You’d probably be better off setting up your own domain server and trying to get that working.
That’s the point though - it wasn’t a good thing
That’s technically a subdomain, and it’s the same reason email went with @.
Donated with renewal! Thank you all for the hard work 😁
That’s TrueNAS. It can run Docker Compose files, so I’m abusing the crap out of what it’s supposed to do, haha.
Yeah, just me and my family for now. I’ve learned a lot setting everything up, and I hope to eventually spin up some VPSs for public Lemmy, Pixelfed, and maybe Mastodon instances for digital nomads and expats.
Eh, it’s a document viewer. I figured they were referring to Plex and Jellyfin when they said media.
All of these. I just added Calibre-Web and may phase out Kavita.
Elena Rossini (@_elena@mastodon.social) is a journalist who’s gotten into the fediverse and self-hosting with YunoHost. She’s documented it on her blog, and it’s worked out really well for her.
This is super cool! You really get a lot just by building a frontend for the Lemmy API. Good work! 👍
Mine aren’t totally great, as they’re both kinda negative. My top comment is https://lemm.ee/comment/16106836, which points out that new jokes are old.
My top post is https://lemm.ee/post/5345101. The image associated with it is gone, but it was a screenshot of an old friend reaching out over text, acting friendly, and then trying to sell me on a crypto scheme. It really bummed me out when he did that.
I don’t even know what voice to imagine in my head for this. A raspy Pikachu worn out by life? What does that sound like?
Fucking the working class
I haven’t tried those, so not really, but with Open WebUI you can download and run anything; just make sure it fits in your VRAM so it doesn’t fall back to the CPU. The DeepSeek one is decent. I find I like ChatGPT 4o better, but it’s still good.
The coder model only comes in that one size. The ones bigger than that are 20 GB+, and my GPU has 16 GB. I’ve only tried two models, but it looked like the sizes balloon after that, so those may be the biggest models I can run.
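For a rough sense of whether a model will fit, here’s a back-of-the-envelope sketch. The parameter counts and quantization widths below are illustrative assumptions, not exact file sizes, and real usage runs higher once the KV cache and runtime overhead are counted:

```python
# Rough VRAM estimate for a quantized LLM.
# Illustrative only: actual GGUF/quant file sizes and runtime overhead vary.

def estimate_vram_gb(params_billions: float, bits_per_weight: float, overhead_gb: float = 1.5) -> float:
    """Approximate VRAM to hold the weights, plus a small fudge factor for overhead."""
    weight_gb = params_billions * 1e9 * bits_per_weight / 8 / 1e9
    return weight_gb + overhead_gb

gpu_vram_gb = 16  # e.g. a 16 GB card like an RX 6800 XT

for name, params_b, bits in [
    ("~14B model, 4-bit quant", 14, 4),  # roughly the size class of a 14B quant
    ("~32B model, 4-bit quant", 32, 4),  # next size class up, already too big
]:
    need = estimate_vram_gb(params_b, bits)
    verdict = "fits" if need <= gpu_vram_gb else "does not fit"
    print(f"{name}: ~{need:.1f} GB -> {verdict} in {gpu_vram_gb} GB of VRAM")
```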
I got it working with my 6800 XT. I’m running DeepSeek R1 14B (somewhere around there) and DeepSeek Coder V2. I have a link to a blog post with the instructions:
https://gotosocial.michaeldileo.org/@mdileo/statuses/01JQA4M4Q33PMCADH9M2AWQSS8
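If you want to script against it once it’s running, here’s a minimal sketch assuming the Ollama backend that Open WebUI usually sits on top of. The model tag is just an example; check `ollama list` for what you’ve actually pulled:

```python
# Minimal sketch: query a locally running Ollama server from Python.
# Assumes `pip install ollama` and that a model has already been pulled,
# e.g. `ollama pull deepseek-r1:14b` (model tag is an example).
import ollama

response = ollama.chat(
    model="deepseek-r1:14b",
    messages=[{"role": "user", "content": "Summarize what a Docker Compose file does."}],
)
print(response["message"]["content"])
```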
For me, federation to Mastodon is working, but I don’t think comment federation is really out yet. I use Cactus Comments for that, but it doesn’t federate to Mastodon; it federates to Matrix. It also requires a Matrix server, which was a total PITA to set up.
Oh, I thought that was a Mastodon thing or something. Thank you for the clarification :)
How does your skin do in the sun?