  • If you want to start with the most effective upgrade, move your router or primary switch to 2.5G or 10G. That way there's a low likelihood of a bottleneck when your devices are communicating internally with each other, and you'll have headroom downstream. Then, if you have multiple switches, prioritize the highest-bandwidth links between them over upgrading your devices beyond 1Gb NICs.

    I use an OPNsense router with 2.5G NICs, and then I have a 2.5G switch and a 1Gb switch that are connected via a 10Gb fiber link. (This is all enterprise Ubiquiti-level stuff.) But all my downstream devices and switches are 1Gb, and I intentionally have no plans to upgrade them. Internally, I won't see bottlenecks often, since the links between the switches and the router are enough to support multiple devices spamming 1 Gb/s file transfers simultaneously (not that it'll happen often, lol); there's a quick sketch of the math below.

    So my WiFi access points, primary NAS, and my most-used PC are all on 2.5G connections, since they could benefit. But everything else is on 1Gb, since that switch has way more ports and was way cheaper.

    I'm not against buying 10G switches for future-proofing, but they're still too costly for my needs, and it's unlikely I'll wish I had 10G any time soon, especially when it comes to internet. Even if I upgrade beyond 1Gb fiber service, it'd be so that multiple devices can fully saturate a 1Gb NIC at the same time, not so one computer can speed-test at 3Gb+.

    That said, what I have is overkill, but I enjoy some homelab tinkering.
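
    To make that bottleneck reasoning concrete, here's a rough Python sketch of the math (the speeds mirror the setup above; the function names are just illustrative, not any real library):

    ```python
    # Rough bottleneck math for a home network: a transfer can only go as
    # fast as the slowest link along its path, and a trunk link between
    # switches can carry trunk // per_device full-speed transfers at once.

    def path_bottleneck(link_speeds_gbps: list[float]) -> float:
        """Effective speed of one transfer crossing all of these links."""
        return min(link_speeds_gbps)

    def concurrent_full_speed_transfers(trunk_gbps: float, nic_gbps: float) -> int:
        """How many devices can saturate their NICs across one trunk link."""
        return int(trunk_gbps // nic_gbps)

    # NAS on 2.5G -> 10G fiber trunk -> PC on 1G: the 1G NIC is the limit.
    print(path_bottleneck([2.5, 10.0, 1.0]))            # 1.0
    # A 10G trunk between switches supports ten simultaneous 1 Gb/s streams.
    print(concurrent_full_speed_transfers(10.0, 1.0))   # 10
    ```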


  • Most likely fiber. Around here the ADSL provider (CenturyLink) was the first to start deploying fiber to compete with cable plans able to do 1Gb (which is, of course, highly variable and full of asterisks: coax quality, load from neighbors' modems on the shared segment, possible MoCA interference, etc.).

    More recently they rebranded the fiber service as a different company… probably to get rid of the stigma of the DSL name.






  • I mean, the issues were present and widely reported for several months before Intel even acknowledged the problems. And it wasn't just media reporting this; it was also game server hosts who were seeing massive deployments failing at unprecedented rates. Even those customers, who get way better support than the average home user, were largely dismissed by Intel for a long time. It then took several more months to ship a fix. The widespread nature of the issues points to a major failure on the company's part to properly QA and to ensure their partners were given accurate guidance for motherboard specs. Even so, the patches only prevent further harm to the processor; they don't fix any damage that has already been incurred, which could amount to years off of its lifespan. Sure, they're doing an extended warranty, but that's still a band-aid.

    I agree it doesn't mean one should completely dismiss the possibility of buying an Intel chip, but it certainly doesn't inspire confidence.

    Even if this was all an oversight or process failure, it still looks a lot like Intel as a whole deciding to ship chips with a nice-looking set of numbers, despite those numbers being achieved at the cost of a degraded lifespan.


  • I'm in a swing state with an abortion measure on the ballot, and while all the polls claim it's close, I'm not really sure they're properly accounting for the number of voters who have been activated by the possibility of enshrining pro-choice protections in the state constitution.

    These polling strategies are complex and a lot of thought goes into them, but they rarely can account for uncommon circumstances that increase voter turnout in local or state elections, and for how that will affect the national election.

    While this is entirely personal-experience bias, I also wonder how effective these polls are at reaching a representative survey group. I know that on my phone, at least, basically all survey calls and texts go to spam, and I wonder if older, more conservative voters are getting overrepresented due to their likelihood of not having those kinds of spam filters in place.



  • As a side note, if you work somewhere that uses 1Password, you can usually get your personal subscription comped as an individual. You only need to pay for it if you leave your company or they drop 1Password.

    I don't know that I'll stay on 1Password forever, but on the scale of things I'm most concerned about self-hosting vs. using a reasonably private SaaS, 1Password is nowhere near the top of my list to ditch. Otherwise, it's a solid recommendation for non-self-hosters who want to make some progress.




  • I've also had struggles with printing on Arch, more so than on Debian-based distros. EndeavourOS is where I did the most troubleshooting, but it's also a problem on my Manjaro install (which I'll move to Endeavour… someday). But learning how to use CUPS directly was worth it.

    Currently, printing via GUI is like 5 ppm and very low DPI, so… not great. But at least I can print for the casual use cases out of the box, and could work out a terminal solution if I needed to in the meantime (a rough sketch of that is below).

    I don't print much, so I haven't put time into getting things working better for bigger jobs, but printing is definitely going to be a more hit-or-miss experience with Arch. It's looking like a better GUI experience for my specific model will require a driver from the AUR, or scripting the Debian install from Brother's driver site. But my model is apparently not as widely used and just hasn't gotten as much community support, I guess.
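
    For the terminal fallback mentioned above, here's a minimal sketch using the pycups bindings for CUPS (the queue name, file path, and option value are made up for illustration); the same thing can be done with the `lp` and `lpstat` CLI tools:

    ```python
    # Minimal CUPS fallback for when the GUI print dialog misbehaves.
    # Uses pycups, the Python bindings for libcups.
    import cups

    conn = cups.Connection()

    # List the queues CUPS knows about, with their current state message.
    for name, attrs in conn.getPrinters().items():
        print(name, "-", attrs.get("printer-state-message", ""))

    # Send a file straight to a queue; options mirror `lp -o key=value`.
    conn.printFile(
        "Brother_HL_L2350DW",          # hypothetical queue name
        "/home/me/docs/report.pdf",    # hypothetical file to print
        "report",                      # job title
        {"print-quality": "5"},        # IPP "high"; like `lp -o print-quality=5`
    )
    ```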




  • Until recently, Wayland development was rather slow, especially in the areas where more specialized software runs into issues that force it to stick with X11. Since Wayland does a lot less than X11 and is more componentized across multiple libraries designed to be swappable, some of these areas simply do not have solutions. Yet.

    And, as always with FOSS, funding is a big part of the problem. The recent funding boosts the GNOME Foundation received have also led to some increased funding for work on Wayland and friends. In particular, accessibility has been almost nonexistent on Wayland, which means that if an app wants to guarantee certain levels of accessibility, it can't switch to Wayland. GNOME's Newton effort is still very alpha, but promising.

    While big apps like Blender and Krita get good funding, they can't necessarily solve the problem themselves by throwing money at it, either. But the more funding Wayland gets to fill in the feature gaps and ease adoption, the sooner we'll be able to move away from XWayland as a fallback (there's a quick detection sketch below).

    Wayland and its whole implementation process certainly aren't without fault. There's a lot of really justified anger and frustration all around. Even so, staying on X11 isn't a solution.
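
    As a small aside on the XWayland-fallback point: a script or app can get a rough idea of which display protocol it's running under from standard environment variables. A minimal, heuristic-only sketch:

    ```python
    # Rough session-type detection from standard environment variables.
    # WAYLAND_DISPLAY and DISPLAY are set by the compositor/X server, and
    # XDG_SESSION_TYPE by the session manager. Heuristic only: an X11 app
    # running under XWayland still sees DISPLAY set, so both variables
    # being present usually means Wayland with XWayland available.
    import os

    wayland = os.environ.get("WAYLAND_DISPLAY")   # e.g. "wayland-0"
    x11 = os.environ.get("DISPLAY")               # e.g. ":0"
    session = os.environ.get("XDG_SESSION_TYPE")  # "wayland" or "x11"

    if wayland and x11:
        print(f"Wayland session ({session}) with XWayland available")
    elif wayland:
        print("Wayland session, no X compatibility layer")
    elif x11:
        print("Plain X11 session")
    else:
        print("No graphical session detected")
    ```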


  • While I found Ubuntu's business practices (all the upsells, mostly) the most grating, the thing that really pushed me off of Ubuntu was packages being inexplicably behind, plus all the forking and modifying they did to GNOME. They were always like one or two major versions behind, which stung especially because GNOME has been shipping tons of features the last few years and Ubuntu wouldn't get them for ages.

    Outside of the snaps, which Ubuntu seems to force you back into even if you purposely turn them off, it's not the worst thing to avoid, or to just deal with for a few apps.

    If they want the Ubuntu stack of tooling, suggest Debian. If they feel intimidated by Debian, Ubuntu is fine. Debian is really solid out of the box for a primary device nowadays; there's no need to wait for Ubuntu to bless packages, since the Debian repos are usually much faster to update. And as long as they aren't doing really weird stuff, they can always move off of Ubuntu to Debian or any other Debian descendant later if they want a smooth transition, since it's the same package manager (a quick way to check for that is sketched after this comment).

    As long as the immutable-distro paradigm isn't a turn-off for them, Vanilla OS is also really neat, including its cross-package-manager installs. V1 is Ubuntu-based; v2 will be Debian-based (if it isn't already GA'd… I know that's soonish).

    I've mostly switched to using Debian for dev containers and servers, and 99% of the time any Ubuntu-specific guides are still perfectly helpful. I moved to Arch for main devices.

    (Side note: I abandoned Manjaro for similar reasons as I abandoned Ubuntu: too much customization forced upon me, a package repo that was always behind or even had some broken packages vs. the Arch repos, and some odd decisions by the maintainers about all sorts of things. EndeavourOS has been just way better for someone who likes a less-dictated setup that is closer to the distro base and faster to get package updates.)

    Edit: I guess my tl;dr is… if one thinks “Ubuntu”, first ask “why not Debian?”, and then proceed to Ubuntu if there are some solid reasons to do so for the situation.
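
    That “same package manager” compatibility is easy to check programmatically, too. Here's a minimal sketch reading /etc/os-release (a standard file on systemd-based distros; Ubuntu, for example, reports ID=ubuntu and ID_LIKE=debian):

    ```python
    # Check whether the current distro is Debian or a Debian descendant
    # by reading /etc/os-release. Debian derivatives advertise their base
    # via the ID_LIKE field, which is why apt/dpkg workflows and most
    # guides carry over between them.

    def parse_os_release(path: str = "/etc/os-release") -> dict[str, str]:
        fields = {}
        with open(path) as f:
            for line in f:
                line = line.strip()
                if "=" in line and not line.startswith("#"):
                    key, _, value = line.partition("=")
                    fields[key] = value.strip('"')
        return fields

    info = parse_os_release()
    ids = {info.get("ID", "")} | set(info.get("ID_LIKE", "").split())
    print("Debian-based" if "debian" in ids else "Not Debian-based")
    ```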



  • Exactly. I'm not a huge fan of notes apps storing the data in a DB; otherwise there is a lot to like about Joplin. With Obsidian, I open my notes in Codium all the time to make mass edits or fill gaps that Obsidian's UI can't meet (an example of that kind of edit is below), which is not possible with Joplin.

    Fortunately, with Obsidian, as long as you keep the plugins on the lighter side and keep any non-Markdown content in separate files via linking, I'm not too worried about having to jump ship if it ever goes bad. Worst case, if a plugin dies or I have to migrate, the actual loss of data is that some plugin used JSON or whatever and it'd have to be converted or replaced.

    I do have hope, at least, that if the company folds they'll open-source it, or turn a blind eye to a community reengineering effort. And whatever is unique about Obsidian's Markdown and metadata will probably get community-built migration tools quickly if enough people jump ship en masse.

    But for the time being, Obsidian is the best option for me, and I don't feel that bad about it.
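
    As an example of the kind of mass edit that plain-Markdown storage makes trivial, here's a minimal sketch (the vault path and tag names are hypothetical) that renames a tag across every note in a vault:

    ```python
    # Mass edit across an Obsidian vault: because notes are plain Markdown
    # files on disk, a tag rename is just a recursive find-and-replace.
    # The vault path and tag names below are hypothetical.
    import re
    from pathlib import Path

    VAULT = Path.home() / "notes"      # hypothetical vault location
    OLD, NEW = "#todo", "#task"        # tag to rename

    # Match the old tag only when not followed by more tag characters,
    # so "#todo" is rewritten but "#todoist" is left alone.
    pattern = re.compile(re.escape(OLD) + r"(?![\w/-])")

    for note in VAULT.rglob("*.md"):
        text = note.read_text(encoding="utf-8")
        updated = pattern.sub(NEW, text)
        if updated != text:
            note.write_text(updated, encoding="utf-8")
            print(f"updated {note}")
    ```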