• Joker@piefed.social
    1 day ago

    FYI this is the nouveau driver no one uses. There is absolutely no reason to use an Nvidia card with this driver.

    • thingsiplay@beehaw.org
      21 hours ago

      My previous computer has a GTX 1070 and I don’t use it because I avoid the proprietary driver. If the nouveau driver becomes good, I can use that older secondary computer for something else. Hell, it can even game, but that wouldn’t be my main usage anyway.

      So yes, there are people who care about the nouveau driver.

      • entwine@programming.dev
        6 hours ago

        With old hardware, beggars can’t be choosers. I get the appeal of the nouveau driver, but if your goal is to save a machine from the landfill, it’s probably the better compromise to use the proprietary driver and keep it actually competitive for as long as possible. Those 900/1000 series cards are still plenty powerful today, even if they can’t quite do AAA gaming anymore.

    • Brickfrog@lemmy.dbzer0.com
      23 hours ago

      Nouveau seems to work pretty well on Debian 13 for me, at least for standard web browsing / streaming / video playback with 2160p HDR tonemapping. Back when I was using Debian 12, Nouveau would lag badly during 2160p playback, so I was forced to use the Nvidia binary driver at the time. But so far it’s been alright; granted, I’ve not tested any gaming, and perhaps that’s where Nouveau won’t do as well.

      • thingsiplay@beehaw.org
        21 hours ago

        Not exactly. There are two alternatives, depending on which card you have: a) the proprietary driver, b) the new Open Source driver that supports RTX 20xx series and upwards.
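
        A quick way to check which of these drivers is actually bound to your card (a sketch using `lspci` from pciutils; the output is obviously hardware-dependent):

        ```shell
        # Show GPUs and the kernel driver currently in use for each
        lspci -nnk | grep -A3 -E 'VGA|3D'
        # The "Kernel driver in use:" line will read nouveau, nvidia, or amdgpu
        ```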

    • malloc@programming.dev
      1 day ago

      Maybe best to avoid Nvidia entirely if using Linux.

      My next build is going to be an AMD GPU and CPU with NixOS. I heard GPU support on Linux is better with AMD cards, but honestly I haven’t delved into whether that holds any truth.

      • thingsiplay@beehaw.org
        21 hours ago

        I also switched to AMD+AMD. The GPU support being better on AMD comes from the Open Source driver that is integrated into Linux. But there are caveats. For example, if you need OpenCL or other features, it can be problematic with AMD. Plus, if you have an Nvidia card from the 20xx series or newer, you can use the new Open Source driver too. And Nvidia support for Wayland and other stuff has gotten better nowadays (just from reading about it, no personal experience with current cards; my last Nvidia card was a 1070).

        While I prefer AMD GPUs now, the “better support” is not really black and white.

      • Joker@piefed.social
        1 day ago

        It’s generally easier because the drivers are built in. Nvidia is perfectly usable, but it’s more susceptible to breaking during kernel updates. It’s not as bad as everyone makes it sound though. That said, AMD is usually the way to go on Linux unless your use case requires Nvidia.

        • Jumuta@sh.itjust.works
          23 hours ago

          “Perfectly usable” as in you have to install a third-party translation layer to make hardware video decoding work in Firefox.
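
          For reference, the usual setup looks something like this (a sketch, not exact instructions: the package name varies by distro, and this assumes the proprietary driver plus the nvidia-vaapi-driver translation layer):

          ```shell
          # Sketch: VA-API hardware decode in Firefox on the proprietary
          # Nvidia driver, via the nvidia-vaapi-driver translation layer.
          # Package name shown is Arch's; substitute your distro's equivalent.
          sudo pacman -S libva-nvidia-driver libva-utils

          # Environment for Firefox (e.g. ~/.config/environment.d/ or its .desktop file)
          export LIBVA_DRIVER_NAME=nvidia
          export MOZ_DISABLE_RDD_SANDBOX=1

          # In about:config, set media.ffmpeg.vaapi.enabled = true,
          # then check that a decoder is actually found with: vainfo
          ```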

        • NeilBrü@lemmy.world
          1 day ago

          The use case is precision CAD and DNN development.

          cuDNN+CUDA+TensorCores have the best TOPS/$/kWh performance (for now). Plus, I need ECC VRAM for professional CAD calculations.

          There are plenty of reasons to use an NVIDIA stack.

          It’s just weird when people say there’s no reason to use their products.

              • thingsiplay@beehaw.org
                7 hours ago

                Ah right, I had read about that, just forgot. Man, HDMI is such a mess. Use DisplayPort whenever you can, and don’t buy a monitor without one ever again.

                • frozen@lemmy.frozeninferno.xyz
                  6 hours ago

                  The 9070 XT supports HDMI 2.1b, and unfortunately my Sapphire NITRO+ has two of them and two DisplayPorts. None of my three monitors support HDMI 2.0 or 2.1, so one of them is stuck at 60 Hz right now, and I’m pretty annoyed about it.

                  • thingsiplay@beehaw.org
                    6 hours ago

                    Did you make sure it’s not an issue with the cable? Because the cables need to support the “correct” version and features of HDMI too, not just the GPU and monitor connections and the driver. Man, typing that out makes me dizzy.
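
                    One quick sanity check (on X11; a sketch, and the exact outputs depend on your setup):

                    ```shell
                    # List each output's advertised modes and refresh rates
                    xrandr --query
                    # If the high-refresh mode isn't listed at all, the driver or the
                    # negotiated link doesn't expose it; if it's listed but drops out
                    # when selected, the cable is the usual suspect.
                    ```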

            • davidgro@lemmy.world
              23 hours ago

              Something about AMD not being able to license the HDMI protocol in a way that allows open source code.

              The main Nvidia driver that people use is proprietary, so it doesn’t have that problem.