Hacker News

Wayland didn't save the Linux Desktop for me, but I'd say that dropping nvidia + Wayland in favour of amdgpu + Wayland did.

Wayland is simply unusable with Nvidia's junky blobs. It frustrated me to the point of selling my Founders Edition 3090 (that I got from a friend at Nvidia at launch) and replacing it with a 7900 XT. It was like night and day. Not a single dropped frame (at 120 Hz), no tearing, not a single crash in 12 months. Just some minor Electron issues to do with fractional scaling varying across monitors. My 7840U 90 Hz OLED ThinkPad is the same - not a single crash, not a single dropped frame. An absolute joy to use.

Wayland isn't perfect, but it certainly isn't the problem. At least it wasn't for me.
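For the Electron fractional-scaling issue mentioned above, a common workaround is to ask the app to run natively on Wayland instead of through XWayland. A minimal sketch, assuming an Electron 28+ build (older builds ignore this variable and need the Chromium flag instead):

```shell
# Electron 28+ reads this variable and picks the Wayland backend when it
# is set to "auto", avoiding blurry XWayland fractional scaling.
export ELECTRON_OZONE_PLATFORM_HINT=auto

# Older builds may need the Chromium flag passed explicitly instead,
# e.g.:  <app> --ozone-platform-hint=auto
```

Putting the export in your shell profile or in an environment.d file applies it to every Electron app at once.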



To my eye, Nvidia not playing ball seems to be at the root of some of the most significant issues with day-to-day Wayland usage. Unfortunately, with the degree of lock-in they’ve achieved with CUDA they don’t have to care a whole lot.


At this point Nvidia is an AI and HPC accelerator company that happens to attach video outputs to the accelerator cards for that niche use case.

They are still good video cards but that’s a side effect of their main focus.


That's true today but it's a really recent development that doesn't justify the current situation with their proprietary blobs.

I vaguely remember they made some announcement about planning to push things to the mainline kernel but I've stopped following most of the drama.


CUDA is their main moat, so presumably they don't want to risk having open source drivers provide outsiders an insight to how their hardware works...


Before that was the case, you could just say "gaming" instead. There is nothing new here from the perspective of Linux users.


Yet said company has a 90%+ market share in its 'side hustle' of gaming GPUs, and makes the SoC for the world's best selling console (the Switch).


They sell their non-consumer hardware at multiple times the profit. As LLM usage keeps increasing, I would not be shocked if Nvidia exited the consumer GPU market as it becomes a smaller and smaller part of its business.


I feel like this is a perilous outlook - LLMs and other AI might not turn out to be the success stories we want them to be - we are still banking on their future potential instead of their present capabilities.

With AGI we're at a similar point to where we were with self-driving cars circa 2016(?). Back then it was predicted that every car would need a super powerful AI brain to be able to drive itself. We didn't get self-driving, and I'd bet the fancy driving assistants we have in cars today don't justify the technological expense that went into them.

Gaming is a solid, established business and has been the backbone of Nvidia's profits. The AI hype is very high now, and it's not clear where it's going to go in the future, but it's not going to be stable.


Same arguments apply to crypto it seems, but that didn’t stop nvidia basically leaving the gaming market high and dry while the crypto market vacuumed up all of their cards.

I agree that’s a perilous approach for nvidia… but based on past behaviour that doesn’t seem to bother them.


To such a point that people will buy an Nvidia GPU for compute and an AMD CPU for the integrated graphics.


This criticism of Wayland with NVIDIA drivers surprises me every time I see it, because I've been on Wayland a few years with NVIDIA hardware (both very old and very new) and it's all cool.

But I'm on Fedora. Are you on a different distribution? Ubuntu?


Last time I tried the 535 driver, GPU copies from XWayland and Wayland were broken, as well as Vulkan in Wayland. Pretty much the only thing that works is GL applications in Wayland. I found that both of these had already been reported in other issues: https://gitlab.freedesktop.org/xorg/xserver/-/issues/1444 https://github.com/NVIDIA/egl-wayland/issues/72

I'm curious what kind of workloads you were running that you have had no issues. I wouldn't expect XWayland to have worked at all before Nvidia added GBM support.
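When comparing experiences like this, it helps to confirm what's actually running first. A quick sketch (assuming the proprietary driver is in use, in which case it exposes its version via nvidia-smi or /sys/module/nvidia/version):

```shell
# Is the desktop session Wayland or X11?
echo "session: ${XDG_SESSION_TYPE:-unknown}"

# Which NVIDIA driver branch is loaded? Try nvidia-smi first, fall back
# to sysfs, and degrade gracefully on non-NVIDIA machines.
driver=$(nvidia-smi --query-gpu=driver_version --format=csv,noheader 2>/dev/null \
  || cat /sys/module/nvidia/version 2>/dev/null \
  || echo "nvidia module not loaded")
echo "driver: $driver"
```

Two people reporting opposite experiences are often on different driver branches, or one of them is unknowingly in an X11 session.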


> Wayland is simply unusable with Nvidia's junky blobs.

That's a lot less true as of their 545 branch. It still has a rather serious VRR sync issue, but otherwise they're almost there. And aside from that sync issue, it feels better than X does. It's close.


545 made my Steam games unplayable. Had to roll back to 535


> dropping nvidia + Wayland in favour of amdgpu + Wayland did.

This also dropped my laptop's power usage by 2.5W.
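A delta like that is easy to check yourself by reading the battery's discharge rate from sysfs. A minimal sketch, assuming a typical laptop (the path varies, e.g. BAT0 vs BAT1, and some platforms expose current_now instead of power_now):

```shell
# Battery discharge rate; power_now reports microwatts on most laptops.
f=/sys/class/power_supply/BAT0/power_now
if [ -r "$f" ]; then
  awk '{printf "draw: %.1f W\n", $1 / 1e6}' "$f"
else
  echo "no power_now reading on this machine"
fi
```

For a fair before/after comparison, run it on battery with the same screen brightness and let the reading settle for a minute or two.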



