“Jensen sir, 50 series is too hot”

“Easy fix with my massive unparalleled intellect. Just turn off the sensor”

If you needed any more proof that Nvidia is continuing to enshittify its monopoly and milk consumers: hey, let's remove one of the critical readouts that lets you diagnose a bad card and catch situations that might result in GPU death! Don't need that shit, just buy new ones every 2 years, you poors!

If you buy a Nvidia GPU, you are part of the problem here.

  • Vinstaal0@lemmy.world · 2 months ago

    Yeah, NVIDIA is a bullshit company and has been for a while. AMD and Intel need to get their ray tracing game up so they become real competitors to NVIDIA, especially now that there are more games that require ray tracing.

    • potustheplant@feddit.nl · 2 months ago

      No games “require” ray tracing, and the ones that support it may look better when you turn it on, but it’s not worth the performance cost.

      • MrPoopbutt@lemmy.world · 2 months ago

        Not true anymore.

        The Indiana Jones game that just came out does require ray tracing. There are a few others coming out that do as well.

        • TheObviousSolution@kbin.melroy.org · 2 months ago

          Fortunately, they will be crippling themselves in the console market doing shit like this, so there should still be plenty of choice left. I wouldn’t be surprised if Bethesda eventually released a non-ray tracing version of the game after whatever deal they signed with NVIDIA expires and they consider porting the game to platforms without RT support.

      • MeaanBeaan@lemmy.world · edited · 2 months ago

        This is incorrect. The new Indiana Jones game requires ray tracing, as does the upcoming Doom game. As much as you may or may not like it, traditional rasterized graphics are starting to be phased out, at least across the AAA gaming space. The theoretical workload benefits for developers make it pretty much an inevitability at this point, once workflows and optimizations are figured out. Though I doubt rasterized graphics will ever completely go away, much like how pixel art games are very much still a thing decades after becoming obsolete.

  • A_Random_Idiot@lemmy.world · edited · 2 months ago

    The only good thing about Nvidia this generation is that their prices on the low-end cards were lower than expected, forcing AMD to cut the lunacy of its 9070 (and other) prices in half.

  • Rav Sha'ul@discuss.tchncs.de · 2 months ago

    Can AIBs add extra sensors for the OS to read, or will the NVIDIA driver not provide that level of information?

    • empireOfLove2@lemmy.dbzer0.com (OP) · edited · 2 months ago

      Unlikely, as the hotspot sensors and detection logic are baked into the chip silicon and its microcode. AIBs can only change the PCB around the die. I’d almost guarantee the thermal sensors are still present to avoid fires, but if Nvidia has turned off external reporting outside the chip itself (beyond telling the driver that the thermal limit has been reached), I doubt AIBs are going to be able to crack it either.

      Also, the way Nvidia operates, if an AIB deviates from Nvidia’s mandatory process, they’ll get blackballed and put out of business. So they won’t. Daddy Jensen knows best!
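      For what it's worth, the standard GPU temperature is still queryable through nvidia-smi; the hotspot reading just no longer shows up alongside it. A minimal sketch of pulling what's left (the query flags are real; the sample readings below are made up):

```python
def parse_gpu_temps(raw: str) -> list[int]:
    """Parse the plain-CSV output of:
        nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader
    which prints one integer (degrees C) per GPU, one per line."""
    return [int(line.strip()) for line in raw.splitlines() if line.strip()]

# Made-up sample output for a hypothetical two-GPU system:
sample = "64\n71\n"
print(parse_gpu_temps(sample))  # [64, 71]
```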

      • Rav Sha'ul@discuss.tchncs.de · 2 months ago

        We’ll find out how the 5080 is on Thursday, but I expect that the 5070 Ti should have cool temperatures.

        • empireOfLove2@lemmy.dbzer0.com (OP) · 2 months ago

          Oh, I’m sure the lower cards will run cool and fine in terms of average die temps. The 5090 is very much a halo product with that ridiculous 600 W TBP. But as with any physical product, things decay over time or get assembled incorrectly, and that’s exactly what hotspot temp reporting helps diagnose.
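          A toy illustration of that last point (all numbers invented): a die with dried-out paste or a bad cooler mount can show a perfectly healthy *average* temperature while one region cooks, so only the hotspot reading catches the fault.

```python
def diagnose(sensor_temps, avg_limit=90, hotspot_limit=105):
    """Check a set of per-region die temperatures (degrees C) against
    an average limit and a hotspot limit. Limits are illustrative."""
    avg = sum(sensor_temps) / len(sensor_temps)
    hotspot = max(sensor_temps)
    return {"avg_ok": avg <= avg_limit, "hotspot_ok": hotspot <= hotspot_limit}

# Evenly cooled die: both checks pass.
good = diagnose([70, 72, 71, 69])
# One overheating corner: the average still looks fine (80.75 C),
# so only the hotspot check flags the problem.
bad = diagnose([70, 72, 71, 110])
print(good)  # {'avg_ok': True, 'hotspot_ok': True}
print(bad)   # {'avg_ok': True, 'hotspot_ok': False}
```

Lose the hotspot readout and the second card looks identical to the first until it dies.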

          • Rav Sha'ul@discuss.tchncs.de · 2 months ago

            Isn’t a GPU that pulls 600 watts in whackjob territory?

            The engineers need to get the 6090 down to 400 watts. That would be a huge PR win that wouldn’t need any marketing spin to sell.