• bitwaba@lemmy.world · 4 months ago

    The sweet spot is the 40-60% load.

    But it doesn’t make that much of a difference. The efficiency swing is maybe 10%. An 80 Plus Bronze rated PSU has a minimum efficiency of around 82%, and even at the 50% load mark it won’t be much over 90% efficient.

    The main point (to me anyway) is that it’s dumb to pay more for a power supply just so you can pay *more* on your power bill. If your idle load is 100W and your gaming load is 300W, you’ve got no reason to run more than a 600W PSU.
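    A quick back-of-envelope sketch of what that efficiency swing actually costs at the wall. The numbers here are illustrative assumptions (300W DC load, 4 hours of gaming a day, $0.15/kWh), not measurements; plug in your own:

```python
# Compare wall draw and yearly cost for the same DC load at two PSU
# efficiency levels. All inputs below are illustrative assumptions.

def wall_draw(dc_load_w, efficiency):
    """Power pulled from the wall for a given DC-side load and PSU efficiency."""
    return dc_load_w / efficiency

def annual_cost(dc_load_w, efficiency, hours_per_day=4, price_per_kwh=0.15):
    """Yearly electricity cost for that load at the given efficiency."""
    kwh_per_year = wall_draw(dc_load_w, efficiency) * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

load = 300  # W on the DC side (assumed gaming load)
for eff in (0.80, 0.90):
    print(f"{eff:.0%} efficient: {wall_draw(load, eff):.0f} W at the wall, "
          f"${annual_cost(load, eff):.2f}/year")
```

    At these assumed numbers the 80%-vs-90% gap works out to roughly $10 a year, which is why chasing a higher efficiency tier rarely pays for itself.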

    • Naz@sh.itjust.works · 4 months ago

      I’ve got an 850W power supply, which I bought 2-3 years ago in anticipation of the RTX 4000 series. My usual load with a GTX 1080 was 150W, and now my entire system uses 520W fully loaded. Do I count? :)

      • Psythik@lemmy.world · 4 months ago

        I have a 4090 in my Ryzen 7700X system and a power meter; 850W is overkill for a 4090. My system never uses more than 650W. What’s more important than the wattage rating is buying a high-tier PSU with good overcurrent protection, because the 4090 tends to have transient power spikes, which even a good 750W PSU should be able to handle.

        If you bought a PSU certified for PCIe 5, you’re most likely fine. If you didn’t have to use a squid adapter to plug in your GPU, you’re good to go, so long as you didn’t buy a shit-tier PSU.