• malloc@programming.dev · 1 day ago

    Maybe best to avoid NVIDIA entirely if using Linux.

    My next build is going to be an AMD GPU and CPU with NixOS. I’ve heard GPU support on Linux is better with AMD cards, but honestly I haven’t delved into whether that holds any truth.

    • thingsiplay@beehaw.org · 21 hours ago

      I also switched to AMD+AMD. The GPU support being better on AMD comes from the open source driver that is integrated into Linux. But there are caveats. For example, if you need OpenCL or other features, it can be problematic with AMD. Plus, if you have an Nvidia card from the 20xx series or newer, you can use the new open source driver too. And Nvidia support for Wayland and other things has gotten better nowadays (I’m just going by what I read, no personal experience with current cards; my last Nvidia card is a 1070).
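      (If you want to check what your OpenCL setup actually exposes, a quick sanity check looks roughly like the sketch below; it assumes the third-party pyopencl package is installed, which is separate from the GPU driver itself.)

          # Lists the OpenCL platforms and devices the installed runtime exposes.
          # An empty or missing platform list is the "problematic" case, e.g. when
          # no ROCm or Mesa OpenCL runtime is installed alongside the kernel driver.
          import pyopencl as cl

          for platform in cl.get_platforms():
              print(f"Platform: {platform.name} ({platform.version})")
              for device in platform.get_devices():
                  print(f"  Device: {device.name}")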

      While I prefer an AMD GPU now, the “better support” is not really black and white.

    • Joker@piefed.social · 1 day ago

      It’s generally easier because the drivers are built in. Nvidia is perfectly usable, but it’s more susceptible to breaking during kernel updates. It’s not as bad as everyone makes it sound though. That said, AMD is usually the way to go on Linux unless your use case requires Nvidia.

      • Jumuta@sh.itjust.works · 23 hours ago

        “Perfectly usable” as in you have to install a third-party translation layer to make hardware video decoding work in Firefox.

      • NeilBrü@lemmy.world · 1 day ago

        The use case is precision CAD and DNN development.

        cuDNN + CUDA + Tensor Cores have the best TOPS/$/kWh performance (for now). Plus, I need ECC VRAM for professional CAD calculations.

        There are plenty of reasons to use an NVIDIA stack.

        It’s just weird when people say there’s no reason to use their products.
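        (To make that concrete, here is a rough sketch of the kind of DNN workload meant here, assuming a PyTorch build with CUDA/cuDNN; the sizes and settings are arbitrary illustrations, not a recommendation.)

            # Rough sketch: a matmul that exercises the NVIDIA stack mentioned above
            # (cuBLAS/cuDNN kernels, Tensor Cores via TF32 and fp16 autocast).
            # Assumes PyTorch built with CUDA support.
            import torch

            assert torch.cuda.is_available(), "needs a CUDA-capable GPU"
            print(torch.cuda.get_device_name(0), "| cuDNN:", torch.backends.cudnn.version())

            # Let fp32 matmuls run on Tensor Cores via TF32 (Ampere cards or newer).
            torch.backends.cuda.matmul.allow_tf32 = True
            torch.backends.cudnn.allow_tf32 = True

            x = torch.randn(4096, 4096, device="cuda")
            w = torch.randn(4096, 4096, device="cuda")

            # Mixed precision: fp16 matmuls are dispatched to Tensor Core kernels
            # on RTX 20xx (Turing) and newer cards.
            with torch.autocast(device_type="cuda", dtype=torch.float16):
                y = x @ w

            print(y.dtype, y.shape)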

            • thingsiplay@beehaw.org · 7 hours ago

              Ah right, I’ve read about that, just forgot. Man, HDMI is such a mess. Use DisplayPort whenever you can and don’t buy a monitor without one ever again.

              • frozen@lemmy.frozeninferno.xyz · 6 hours ago

                The 9070 XT supports HDMI 2.1b, and unfortunately my Sapphire NITRO+ has two of them and two DisplayPorts. None of my three monitors support HDMI 2.0 or 2.1, so one of them is stuck at 60 Hz right now, and I’m pretty annoyed about it.

                • thingsiplay@beehaw.org · 6 hours ago

                  Did you make sure it’s not an issue with the cable? Because the cables need to support the “correct” version and features of HDMI too, not just the GPU and monitor connections and the driver. Man, typing that out makes me dizzy.

                    • frozen@lemmy.frozeninferno.xyz · 5 hours ago

                      I checked it just to be sure; it’s definitely a 2.1 cable, but unfortunately the cable doesn’t matter in this case. My monitors are good, but they’re older, and HDMI 2.0/2.1 wasn’t around back then. I get good refresh rates over DisplayPort (I believe they have DP 1.4), and my RX 6800 XT had three of those, so I just naively assumed a 9070 XT would as well.

          • davidgro@lemmy.world · 23 hours ago

            Something about AMD not being able to license the HDMI protocol in a way that allows open source code.

            The main Nvidia driver that people use is proprietary, so it doesn’t have that problem.