> Regressions are introduced all the time because Linux developers spend very little to no time checking that their code changes don't cause regressions or breakages outside of the problems they're trying to fix or features they're implementing.
Wow. Such sweeping generalizations make me not want to pay attention to anything this article has to say.
But let's entertain the possibility that parts of that statement are true. It's practically impossible for "Linux developers"—whatever that means—to ensure that a change they make doesn't break something downstream. Unlike proprietary operating systems, the Linux ecosystem is not maintained by a single company, so it's impossible to know and account for all the environments a piece of software will run in. Distro maintainers work tirelessly to ensure bugs don't make it to users, but it's simply impractical to test every single combination of hardware and software that exists. This is extremely difficult even for proprietary operating systems where development is centralized, let alone for a distributed environment like most software for Linux, and Linux itself, is developed in.
When you think about it, it's a miracle of modern software engineering and a testament of the power of collaboration and the free software movement that the Linux ecosystem works as well as it does, and in many areas matches or exceeds the functionality of other operating systems, all the while preserving important user freedoms. GNU/Linux is a stronghold for user-friendly computing among alternatives from companies that want to control how you use your computers, or profit from your data or attention. It's not perfect, but it's one of the few options we have (besides the BSDs, but I wouldn't consider them to be as user-friendly).
> It's practically impossible for "Linux developers"—whatever that means—to ensure that a change they make doesn't break something downstream. Unlike proprietary operating systems...
That's exactly the point, though. Linux will "never be ready for the desktop" because of the problems that this causes. Developers of software for Linux, not to mention kernel developers themselves, do not have the kind of funding that Microsoft has to test on a truly staggering number of hardware configurations. And on the opposite side of the spectrum, they don't have the luxury of supporting a comparatively small set of hardware combinations like Apple has.
It's far easier for Microsoft to test on different devices than it is for Linux volunteers. I have seen countless cases where KDE and GNOME developers couldn't reproduce an issue because they didn't have the hardware on which the problem was reported.
This doesn't mean Windows doesn't have hardware dependent bugs of course.
While it is not out of the question, Microsoft’s employees are disincentivized from testing on varied hardware, since time spent testing does not contribute to their performance evaluations.
The most one can say is that no existing version of Linux can be a successful desktop OS.
However, it’s very possible for a company to build an Apple-like business by creating its own Linux distro that is well tested and guaranteed to run at least as well on its hardware as Apple’s software does on Apple’s hardware.
This is the System76 approach.
Alternatively, you could create a distro that is restricted and sell it to hardware vendors who then have the responsibility to test your distro against their hardware. This is the MS approach. There’s no reason such an approach would necessarily have any more issues than MS does.
In fact, because of the open source nature of Linux development, it’s likely that the Linux versions should do better than their proprietary counterparts, because downstream companies, teams and users can test before the product is released and point out problems before the product has even hit alpha. Heck, they can even share patches.
The real problem with Linux on the desktop is that it doesn’t have enough market power to make the kind of testing MS and Apple and their downstream clients do financially worth it.
There is no reason why the server space shouldn’t have the same issues. But the Linux server space has enough money that downstream vendors, clients, and users are able to invest in testing, which has meant the advantages of open source have pretty much wiped out all the competitors in the space.
> Linux will "never be ready for the desktop" because of the problems that this causes.
These problems can be mitigated, though. All it takes is a company willing to lead the effort and maintain a stable, limited and tested ecosystem of software built on top of a stable, limited and tested version of Linux. And/or maintain a secure runtime environment like Google has done with Android.
The problem with this is that it invariably takes away some of the freedoms Linux users expect. Personally, if I'm not allowed to run any software I want, how I want to, and terms like "sideloading" and "jailbreaking" are thrown around, it's not Linux and I don't want it. At that point you're no longer using a general purpose machine, but a machine that someone else decided how it must be used.
Besides, as a sibling comment mentioned, bugs do happen even on Windows and macOS. This is an inescapable fact of software development. I was simply pointing out how much harder this is to manage on Linux, and yet it remains usable for many people despite it.
> they don't have the luxury of supporting a comparatively small set of hardware combinations like Apple has.
And even then, Apple has never been able to promise that you can keep the software features your computer shipped with across system updates. If a feature is removed, your only path of recourse is to treat your current patch like an LTS release.
I don't see a major problem here. Hardware manufacturers can take up the burden of testing their products on Linux, and some already do, like Lenovo's Ubuntu certification (in practice their laptops work perfectly well on other Linux distributions).
And if the assertion were true, I'd be feeling those regressions at least monthly on my leading-edge, unstable (~amd64) Gentoo box, which is the daily-driver desktop I use at $DayJob.
This box has been stable enough to make money. But not as stable as my backup (also Gentoo) which runs stable and updates weekly.
Edit: my biggest source of stability issues are Nvidia drivers mis-matched from kernel. Takes a whole reboot to fix.
No coder should ever, ever be shocked that broken code mostly kind of seems to work, especially if you don't notice things like file corruption because you're not looking, or memory corruption that doesn't happen to matter at the moment. A lot of the failures and bad code are in the unexpected path.
When this box does have issues (rare), it's right after I've upgraded packages that the distribution has told me aren't ready. I know I'm standing on the edge (of glory!).
Also, I use ext4 since like forever, ext3 before that and have never had issues.
To be clear, my point is that these open source devs (I'm one too) are very careful to not regress. I think the software quality is very high.
When I used ext4, I had unexplainable ccache corruption about once per year when compiling updates for my Gentoo Linux machines. Since switching to ZFS about 12 years ago, ccache has not become corrupted once.
You've lost context and appear to be going after a weak interpretation with the implication I don't know how to sysadmin.
The context was the sweeping generalisation that Linux devs don't care about regressions; see up thread by @imiric. In that context I've not had regressions. More explicitly, for this narrower context, I've not had regression bugs in ext3/ext4.
I'm not claiming that I've never had ANY issues or never have to fsck my drives; nor am I claiming that anything has never appeared in 'lost+found', nor that ext3/ext4 are perfect. Nor that Linux is perfect.
Just claiming that the software quality in the Linux ecosystem is very high. I'd say better than many closed-source solutions -- which couldn't be proved anyway.
I've only been at this since 1994 and still have a lot to learn. However, I don't learn much from the "you're wrong and not good" type of conversation.
I think you’re missing the point. Unless you were actually measuring the content of your file system with a reasonable-quality hash function, you likely had actual data corruption that fsck will never see. fsck can’t check data on ext*.
Unless you were routinely running mtree or something similar, you’d honestly have no idea.
My point here is that software can appear to be fine when it’s actually shit because most people won’t notice any kind of corner case. All you’re saying is that you’ve very likely missed the corner cases.
I use ZFS to protect against file corruption and ECC RAM to protect against memory corruption. The combination works well.
The combination is not perfect unfortunately. The other day, I had a CPU cache parity check failure hang vim when I was trying to close it. I had to reboot to get rid of the hung kernel thread. It would be really nice if all CPU cache levels had full ECC protection, but only some of them do.
Just check this bug tracker. Just this one. Bug report on top of bug report caused by a fix which was caused by another fix.
Sadly it's not just one bug tracker, I keep an eye on many of them.
Of course in the magical Linux land if "It works for me", then the author is lying.
Never mind that the author of this "click-bait" article is a ... Linux kernel bugzilla maintainer. Yeah. I can vouch for every letter in the article. It's written with my blood.
Does that guy seriously maintain bugzilla for the Linux kernel and yet cannot decide whether Linux is an operating system or not:
> Linux is not an operating system
> While Linux is unrivaled on servers and has been the world's most popular operating system for over two decades, the situation on the desktop is quite bleak.
It either is or is not. It is illogical to claim both true and false to be true at the same time, yet that is what he did.
I suspect that the unspoken bit here is that hardware qualification is extremely important.
On the desktop, Linux is an afterthought for the vast majority of consumer hardware vendors. Many Linux desktop users run configurations which could generously be called “alpha”.
With MS windows, this burden is shouldered by every hardware manufacturer. It’s nonsensical to release a peripheral which doesn’t work on windows. Laptop/desktop makers then qualify common applications to make sure everything works as expected.
If any of the above doesn’t work out, then MS, or the hardware vendor, will get a call.
>Sadly it's not just one bug tracker, I keep an eye on many of them.
How many Microsoft or Apple bug trackers do you have access to? The fact is that we have to consider that open-source work is largely done by volunteers and commercial developers with their own use cases in mind, and we only know about the drama because it's out in public. The drama inside big tech firms could be worse and on a larger scale and we would likely not know about it. Even if an insider told you something, it would be only a small part of a bigger picture you can't see.
Working for a vendor that ships Linux code, we find regressions all the time. Because we actually test things. We do try to be good citizens, and Linux is the right answer, but it isn't a false assertion on the part of the author at all.
In a previous job, 20 years ago, we found that the Linux USB stack was so bad that it was full of deadlocks and unprotected shared memory on 2.6. Just atrocious. And subsequent releases broke it again and again.
About 5 years after that, working for a different company, I had a well-known kernel developer tell me point blank that as long as we release our code to the kernel, they will "keep it working." When I probed what that meant, he genuinely saw no difference between "compiling" and "working."
There are various efforts to add testing/tests. Those are good. But we lost a lot when Solaris, Irix, etc. all died, and Linux has never quite gotten there, and I doubt it ever will.
> [...] it isn't a false assertion on the part of the author at all.
I wasn't arguing that it was entirely false, but that the generalizations of "regressions are introduced all the time" and "Linux developers spend little to no time" can't be applied to the entire Linux ecosystem.
I wouldn't discredit your experience, but speaking of Linux kernel development specifically, my understanding as a user is that it's held to a very high standard of scrutiny. Rants from Linus are famous for drilling into people that "WE DON'T BREAK USERSPACE". If you consider how many thousands of lines of code are changed on a regular basis, it's an engineering wonder that it works as well as it does. In many ways Linux is a victim of its success.
Yes, they are very careful about not breaking userspace. That has nothing to do with holding the _user_ in high regard, though.
The sweeping changes in Linux are amazing, but they also break a lot, frequently, and the testing approach leaves a lot to be desired. I am old and a former embedded coder; I understand how the 90s developers feel about unit testing, and I am sympathetic to the idea that test-driven development and heavy UT in modern teams is often completely out of control, but Linux is distinctly lacking in real testing of independent units in isolation. I am aware of kunit, kselftest, etc.
I am very familiar with reporting bugs on the Linux kernel, as you may imagine from the previous comments.
But you’re just reiterating that the kernel development itself lacks sufficient testing which necessitates this. Everyone is a volunteer; the result is as you’d expect.
Look, there’s nothing wrong with this. In reality, the actual problem is all the people electing to run Linux without it having sufficient quality for their purposes, among which are hundreds of companies that just don’t care. We created this situation by undervaluing quality vs cost.
But it’s also the reason it won’t be a success on the desktop. The only way that happens is if windows takes an even more substantial nosedive. Things are going to have to get pretty bad for that to happen.
I am going to assume that Lenovo, Dell, Nvidia, AMD, Intel, etc. actually test Linux on the hardware that they support (or farm that out to Red Hat etc.). But I could be wrong of course.
I am sure what he means is that we lost a lot when it became unprofitable for companies to offer solid UNIX offerings supported in the way he would like. Other than the fact that it evolves glacially, Illumos is in an even worse position than Linux when it comes to offering support.
If we are being honest though, in the terms that this article is arguing, "Linux" as an ecosystem is the wrong label. When it comes to stability and support, we need to think of each Linux distro as an Operating System. In that sense, things like RHEL are certainly well supported--including on the desktop. If your app targets RHEL, you pretty much have none of the problems discussed in this article. You still have a smaller universe of applications than Windows of course which is to be expected.
So, the problem boils down to "nobody is making sufficient money off desktop Linux to steward a desktop Linux distro". That is not careless devs or bad design decisions though--no matter how chicken-and-egg you try to paint it. It is simply the network effect.
Decent desktop operating systems exist and they already have large application universes and large, monetizable user bases. They emerged into a vacuum that lacked such operating systems, but that vacuum no longer exists. That makes it almost impossible to economically introduce a new desktop platform. Certainly none of the commercial attempts have succeeded in the last few decades. In fact, the system that has been the MOST successful is Linux. By that metric, saying that Linux is doing it all wrong does not ring very true.
Valve is doing a lot to make Linux gaming a reality because, when it comes to the network effect, it is largely true that Steam ( not just Windows ) is the true platform. So, if you can bring Steam to Linux, you can move the users there too. It is still early days for that but it appears to be working.
On the mobile side, the network effect is very much alive as well. This is why Windows, using of course all of Microsoft's tricks, strengths, and knowledge, completely failed while the Linux kernel ( in the form of Android ) has totally dominated. And that is not for any of the reasons cited in the article. Do you think an Android app from 2008 will run on Android today unmodified? An iOS app from 2008 on iOS today? I don't think so.
illumos is the living and thriving fork of OpenSolaris. I run some customers' private clouds on illumos. There are some contributions we may be able to upstream before 2025Q3.
Looking back on how things panned out, going all in on illumos a bit over ten years ago is something I wish I had done.
addendum: by "all in" I mean as an operating systems developer.
“it's a miracle of modern software engineering and a testament of the power of collaboration and the free software movement that the Linux ecosystem works as well as it does”
It can be a miracle, and yet still not be good enough. Even as a business user, knowing that if I build an app it probably won’t work in 5 years is all I need to know about this hobbyist toy. Linux is fundamentally not ready for anyone whose needs go beyond a Chromebook (which also runs Steam, by the way).
It’s like pointing to a skyscraper made out of sand. Sure, it collapses every time the tide comes in, but your robots can rebuild it in 5 minutes flat, so it’s clearly ready for tenants.
> It can be a miracle, and yet still not be good enough.
"Good enough" is highly subjective. You're entitled to your opinion, of course, but millions of people also think otherwise.
> Even as a business user, knowing that if I build an app and it probably won’t work in 5 years, is all I need to know about this hobbyist toy.
That ultimately depends on your tech stack and app features, much more than your OS. If you build a statically linked ELF binary that runs on the command line, chances are it will continue to work decades from now. If, OTOH, your app depends on a specific version of a GUI toolkit or some esoteric libraries, you might run into issues in the future. This is the same on any other OS. Microsoft does a commendable job at maintaining backwards compatibility for apps written in their supported stacks, but step outside of this and you'll experience problems. Same for macOS to a lesser extent (Carbon -> Cocoa, etc.).
> “Good enough" is highly subjective. You're entitled to your opinion, of course, but millions of people also think otherwise.
Ironically, I might as well point out that billions of people think otherwise of Linux. In some countries, a few trees is “good enough” as a bridge. The IRS, of course, says COBOL is “good enough” to use as a web backend for the moment.
The weirdest thing is how people complain that engineering is a race to the bottom. Scarcely a thread goes by here without a tragedy-of-the-commons reference. Yet inexplicably, business executives are terrified cowards when it involves free software. This of course makes no sense; you can’t have both. Either the complaints are wrong, or Linux is in fact unsuitable for purpose.
> Microsoft does a commendable job at maintaining backwards compatibility for apps written in their supported stacks, but step outside of this and you'll experience problems
Oregon Trail, a game made for Windows 95, still runs on Windows 11. (Edit for below complaint, as I’m posting too fast: You need all the DLLs that came with it on the original CD in the same folder as the EXE.)
Try installing and running any Linux app, be it even GNOME Chess, from four years ago. See how it goes. Consider my allegory of the sand skyscraper.
On that note, the Carbon to Cocoa transition mostly wrapped up a decade ago, and was completed by the majority of apps two decades ago. Try installing a Linux program from 2005 with a UI made in any toolkit of your choice.
The Linux community chases trends and kills APIs more often than Google.
> Oregon Trail, a game made for Windows 95, still runs on Windows 11
Fatal error: MSVCRT70.DLL could not be found.
But that's easy enough to fix, right? Just dig through Windows's package manager (sorry, I meant MSDN) trying to guess the right vcredist, and try a few and hope for the best.
The Visual C++ .NET 2002 runtime shipped with Windows 95?
To be fair, I've seen plenty of commercial Windows software ship over the years without runtime components required on the original supported OS (or with required redistributable components that had to be separately installed from the distribution CD-ROM for no well-explained reason).
Alternatively, look it up by reading winetricks’ source code. Wait, would that be cheating considering that winetricks is a UNIX tool? Linux and UNIX systems in general are supposed to be harder, not easier, in comparison to Windows, right?
For those unable to tell, I am using sarcasm in that last part.
> I might as well point out that billions of people think otherwise of Linux.
These same billions, sans a very small fraction, don't even have "operating system" in their vocabulary. And as far as they are concerned, Chrome browser on one operating system is just as good as Chrome/Chromium on another operating system.
> Try installing a Linux program from 2005 with a UI made in any toolkit of your choice.
Personally I would not have difficulty doing this with containers. But there is no reason to do this at all. No sane and sober person wants this.
> knowing that if I build an app and it probably won’t work in 5 years
Would you say macOS, Android, and iPhone are also hobbyist toys then?
And by the way, if you statically compile your app for Linux it will basically work forever. Linus's mantra: We Do Not Break Userspace. These are userland problems and can all be avoided with a linker flag if it's a priority for you.
You cannot statically compile a lot of stuff. Too many Linux libraries are GPLed, and that only works for open source applications.
And then one day the underlying libraries will have security problems. Your statically linked application has just become a gaping hole.
And finally, you don't want your statically compiled binary to be hundreds of megabytes and take forever to launch, do you? But seeing how many people here are advocating Snap/Flatpak/AppImage, I guess it's not a problem? But why would you need a Linux userspace or OS in the first place when all you are doing is running virtual containers? You can run them on Windows or macOS as well.
Not many libraries use the GPL because “that only works for open source applications”.
That said, you can just copy the .so files if you want to run the old binary.
Furthermore, binaries are mmapped, so even if it is very large, it will still launch relatively quickly since only the parts needed to launch are read into memory.
Can you stop for a second and realize your "solutions" are absolute unmitigated crap and that 99.999% of people in the world wouldn't bother even if they could?
In Windows you don't have to do that. You can compile an application in Windows 11 and run it in Windows XP for Christ's sake (if you don't use new Windows APIs).
Show me a way to compile anything in Ubuntu 24 using only its own libraries and have it run in Ubuntu 6, huh? Without virtualizing/chrooting/crap like that. No, you can't? Fine.
> Not many libraries use the GPL
Let me start: Qt. Before you spew "GTK", who is going to pay for porting the code to GTK/GTKmm (since GTK is C only)?
End of your wonderful story. You really want to bend reality to make Linux look "OK". It's not. It's bad.
First, LD_LIBRARY_PATH works well (or has for me). If for some reason the ELF interpreter is unable to use the older libc, just use the old ELF interpreter. Second, do static linking to achieve what you want. Then it will run on the older Ubuntu trivially.
That of course requires updating the application when a vulnerability is found in a library, but this is no different from Windows, where you have to update many applications to fix vulnerabilities in third-party libraries. The number of vendor updates to fix the same libjpeg or libpng issues on Windows is truly staggering.
The Qt libraries do not use the GPL. They use the same LGPL that GTK does. You are making stuff up to bash open source UNIX machines (which is what Linux machines are).
For 99.99999% average people out there none of this utter cryptic crap works even remotely which basically means it doesn't exist.
Can you stop with anecdotal evidence? I don't care that you can make old software work. I can as well, you know. I was there long before you even knew Linux existed. Had a nice little trip with GCC 2.96 from Red Hat, if you even remember that fiasco of epic proportions.
Now you say it is possible, but are fabricating figures for how many people can do it. Thus, you acknowledge what you said was untrue. Both things cannot be true simultaneously.
This is a feature of Linux. Linus Torvalds is very adamant about supporting old binaries with new kernels, even if it is not commonly done.
This is a broken mindset, that you should be able to take an app and use it for 5+ years. When computers were in their infancy, and systems were only updated every 4-6 years, it might have made sense, but this doesn't in our modern software environment.
Nowadays all software is built on libraries and frameworks, and they have security issues and even just bugs, and you want to get those fixes.
If you want to run 5+ year old software, you can now do it natively in a VM in almost any computer; so why does my shiny new OS have to run ancient binaries again?
Without intending to be inflammatory, I think that the mindset that you are espousing is broken.
Security is a real issue for a subset of computing tasks. To further your point, for those tasks you can argue that constant vigilance and patching are a necessity of the modern world (an alternative and arguably better approach would be formal verification and no updating, as often applied in safety-critical control systems). However, security is often used as a pernicious ruse for forcing obsolescence: want the latest security patches? Update to the latest OS version. Oh look, the latest OS version no longer runs on your perfectly good hardware. Or similarly, oh look, your perfectly good software no longer runs on the latest OS version.
But now consider the subset of computing that does not need to involve security, either because it has literally no security implications or because it can be sandboxed by the OS (e.g. games, music and video production, architectural design, scientific simulation, mathematical research, ...). There is a large body of this kind of software that works perfectly well for any number of years (modulo forced-obsolescence initiatives like "modernising" the UI or moving to the cloud). I would argue that the primary function of the OS should be to provide a stable platform for running such software securely. Yes, the user could learn how to run it under emulation, in a container, or in a VM, but then what is the purpose of the OS?
The alternative is a high software maintenance burden/cost for everyone (for applications just to keep the lights on, or users to stay current in a churning software landscape) and/or the destruction of a massive amount of value in software that was developed but can no longer easily be run. The value destruction here is twofold: (1) the licensee can no longer run the software that they pay for, and (2) the effort expended to develop said software is discarded.
Okay, I understand what you are trying to say, but some of what you are using as examples doesn't support your argument. Games, unless they are simple single player games, need the network. Scientific sims and math research are best served on the latest hardware, which often needs the latest OS/software to get the most out of it.
I think you're arguing that MOST software should run without needing to be updated, and when we lived in a dial-up world and before, that was a very viable position. But with all of our machines on an always-on network, the OS has to be kept up to date.
Most businesses just want things to work; their software/hardware costs are often rounding errors when amortized over their lifetime. I'm sympathetic to users who have paid for programs wanting to run them forever, but software businesses have to make money and sell new versions, so they add new features and follow the OS upgrades.
It's hard for businesses to support older software or a big diversity of versions; it's why companies mandate a standard and try to enforce it.
With Microsoft, they are making millions on the OS and related basic programs, and so can afford to support things for a long time. With OSS like Linux, there is less funding and there are fewer people interested in running old versions.
As someone who has had to keep some software up to date on Linux, it can sometimes be more of a pain to update a package (because of dependencies) than to just recompile the thing from source.
The 5 years that LTS releases get are good for 2 average commercial update cycles, which I think is reasonable. Beyond that, people's skills are going to be out of date.
A statically compiled app that used Xorg won’t run on many Linux distributions at the moment.
A statically compiled app that used Jack for audio might have problems running on PipeWire.
A statically compiled app from 2012 that used PolicyKit for authorization might not work today when asking for sudo. Everyone’s moved on to polkit.
The meme is wrong, don’t regurgitate nonsense. I can guarantee you my Windows app compiled in 2007 still has the UAC prompt and the audio system working correctly.
> A statically compiled app that used Xorg won’t run on many Linux distributions at the moment.
Every major distribution ships Xwayland by default. The entire gaming community relies on X11 continuing to work, just like they rely on x86 libraries to stick around.
> A statically compiled app that used Jack for audio might have problems running on PipeWire.
A statically compiled app that uses Jack for audio might have problems running on Pulseaudio/ALSA/OSS. Jack's never a thing that's generally available, so claiming it's a backwards compatibility issue is nonsense. If your application requires Jack, then you need to ship it.
Note that you'll also have issues on Windows and Mac because Jack - unsurprisingly - is not installed by default on those platforms either.
> A statically compiled app from 2012 that used PolicyKit for authorization might not work today when asking for sudo. Everyone’s moved on to polkit.
If you used policykit correctly you should have either used the dbus protocol, or the command line utilities. These are the APIs for userspace and they haven't changed since 2012.
What distro are you using that doesn't package Xwayland or pipewire-jack? Sure those are compatibility layers, but so is WoW64. The Linux and Windows backcompat strategies aren't all that different here.
> It can be a miracle, and yet still not be good enough. Even as a business user, knowing that if I build an app and it probably won’t work in 5 years, is all I need to know about this hobbyist toy
Build a Flatpak, you're done. Your application can depend on a stable runtime and work on any Linux distribution.
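For illustration, a minimal flatpak-builder manifest might look like this (the app id, runtime version, and binary name are all hypothetical):

```yaml
# Hypothetical manifest: org.example.MyApp.yaml
app-id: org.example.MyApp
runtime: org.freedesktop.Platform
runtime-version: '23.08'   # the app keeps building against this pinned runtime
sdk: org.freedesktop.Sdk
command: myapp
modules:
  - name: myapp
    buildsystem: simple
    build-commands:
      - install -Dm755 myapp /app/bin/myapp
    sources:
      - type: dir
        path: .
```

Because the runtime is versioned and shipped alongside the app, the host distribution mostly only needs Flatpak itself installed.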
> Linux is fundamentally not ready for anyone who has needs more than a Chromebook (which also runs Steam, by the way).
Chromebooks now cover 85% of the U.S. education market, and that covers most workloads. The Steam Deck is also Linux (SteamOS).
What's left in the world of Win32 apps? Specialty commercial applications like Photoshop, and first-person shooter games that require a kernel-level anticheat driver. If you don't need those, a system like Bazzite is much less user-hostile than what Windows has become in the era of Windows 11.
>Build a Flatpak, you're done. Your application can depend on a stable runtime and work on any Linux distribution.
Flatpak weakens but does not entirely remove the dependence on the underlying Linux distribution. Flatpak relies on a system service called xdg-desktop-portal to bridge apps inside the Flatpak sandbox with the host OS. Applications distributed using Flatpak may not work as expected if the distribution ships too old a version of xdg-desktop-portal.
At one time I was a big advocate for Unix/Linux on the desktop, but I think that ship has mostly sailed.
But it seems that FUD about what it is capable of is still alive and well, which is good to know. With frameworks and libraries, there is a 0% chance you don't have SOME update for your Windows app in a couple of years, forget 5.
Now, there are lots of headwinds against Linux -- Windows is a known quantity, everyone knows the MS Office suite, people hate change and don't want to learn new stuff.
But to pretend that Linux is a house of cards because there are sometimes issues that cause troubles is not being honest. Even Windows can have big issues, or have we forgotten the CrowdStrike outage earlier this year?
All OSes have always had issues; that's why all of them have patches and updates.
I would blame Microsoft for that one, since they had the ability to ban CrowdStrike and others who have no business being in the NT kernel from the NT kernel, but instead gave them access by giving them code signing certificates.
Apple on the other hand got them to switch to userspace:
How many people are running a locally-installed office suite at this point? Mailing files around seems archaic for most purposes. At my prior (open source) employer, even we had mostly given up on running local office productivity apps.
I’ve used Linux since 2011; I know all about it; and I would recommend it to nobody because the community can hear no criticism, bear no criticism, and self-rationalizes everything. Sometimes it takes a bold stance to make a point against one of the most toxic, delusional, and self-righteous communities online.
For my part, I’m back on Windows and Mac. Screw the community’s arrogance more than the project. The community, as even the comments here show, does not deserve a victory.
Which community can you walk into telling everyone that everything they care about is shit and they are shit and will never make it, and they will take it productively?
I certainly admit that some Linux users are heavily in denial about Linux's flaws and don't take criticism healthily, but this is a trait of humanity, not just of Linux users.
If your version of diplomacy is starting off a meeting with urinating on their flag, don't say diplomacy never works because no other countries can take criticism.
I think it’s clear that today’s “desktop Linux” distributions are not what you want. That’s ok. Your criticisms come from the point of view of a usage that is poorly supported.
Android is perhaps the closest to what you seem to want, although even it is much more aggressive about deprecating APIs, ABIs, and features than Windows.
There is no Linux distribution today that even attempts to deliver the interface stability you appear to want. And given the development model and business structures involved, I think it's unlikely there ever will be.
You, and the many computer users like you, will probably never have a suitable Linux distribution. That’s an unfortunate reality. It’s not your fault: what you want is reasonable, but Linux isn’t set up to deliver it.
> …the community can hear no criticism, bear no criticism, and self-rationalizes everything. Sometimes it takes a bold stance to make a point against one of the most toxic, delusional, and self-righteous communities online.
if one’s “criticism” is snarky or just plain needlessly rude, one shouldn’t be surprised to receive the same in return; this is extremely basic human behavior. very very basic.
we all learn this in like elementary school: if you’re rude, people will treat you like a rude person. it’s not shocking and it’s not unexpected. it’s extremely basic social shit learned by even small children.
- "The vast majority of applications exist only for Windows, and speaking of games, Linux hasn't seen any AAA titles for many years now,"
The vast majority of AAA titles run seamlessly on Linux. Per protondb[0], ≥80% of the top 10, top 100, and top 1000 are rated "platinum" or "gold" on Linux by users.
> Also, despite millions of players, it's hard to call CS2 an AAA title because it's based on Direct3D 11 (which is now more than a decade old) and lacks modern lighting techniques like ray tracing.
Some of the quotes from the OP's page border on parody. So, CS2 doesn't count as a AAA title because it's based on DX11 and doesn't have realtime ray tracing. But apparently this same person is oblivious to the fact that both real-time RT and DX12 translation have been completely usable on Linux for years now. Nvidia helped ship native RT to Linux clients in Wolfenstein: Youngblood, and Proton has supported the majority of DX12 games with RT since before the Steam Deck existed.
Kinda makes you glad that this is the final edition, if that's the sort of due diligence being exercised.
I don't think that will convince many people. Appealing to authority is one third of rhetoric, Aristotle would say that you need emotion or logic to convince me completely.
Speaking for myself, I'm mostly surprised that someone with such an intimate knowledge of Linux has ignored its developments to this extent. This entire thread has mostly reached the consensus that you have tortured many of the talking points in this article beyond the point where they can be taken seriously.
Many in this thread have also identified that MacOS and Windows are deficient in several of the criteria you listed. So what even is a desktop, really? An ideal, or a thing people use?
Not a single Windows MacOS iOS Android user knows what their display server is and how it works.
Xorg is being deprecated and replaced with semi-broken something where tons of functions only work in KDE and Gnome. That's all you need to know about Linux.
You are calling the Xorg X11 server Xorg, which is wrong. As someone who has “contributed” to so many projects, you should know that. In any case, it is odd you are whining about this considering that the article’s author is a well known X11 hater and you seem to agree with everything he says.
By the way, “Not a single Windows MacOS iOS Android user knows what their display server is and how it works” is both a non-sequitur and untrue. The guys who wrote the display servers used on those are users of those systems.
> You are calling the Xorg X11 Xorg, which is wrong
Xorg has been the de-facto X11 implementation on Linux for the past 25 years.
First XFree86, then it was superseded by Xorg.
You're being nitpicky, but that's not a counterargument. That's ad hominem.
> the article’s author is a well known X11 hater and you seem to agree with everything he says.
I'm the author of the article and I'm not a hater. Understanding the limitations of something is not "hatred". I guess the entire world is black or white to you, either I have to be deeply in love with Linux or "You're a hater" which is another ad hominem.
See, you've stopped arguing and have now switched to discussing me personally. I knew it would happen, given that you never had any counterarguments in the first place. The problem is I've been here for over 25 years and know Linux inside out, and EVERYTHING that is "claimed" in the article is common truth.
> By the way, “Not a single Windows MacOS iOS Android user knows what their display server is and how it works” is both a non-sequitur and untrue.
Yet another "counter argument" that lacks any validity because it's empty.
> The guys who wrote the display servers used on those are users of those systems.
This is an article about Linux _USERS_.
You still don't bloody understand that. I don't care about software engineers that can make anything work given enough time. I don't care who wrote Quartz or Surface Flinger. They are not end users.
End users use Operating Systems, buy them and buy software for them. Their developers are absolutely IRRELEVANT. There have been dozens of forgotten OSes that no one uses.
There is nothing “ad hominem” in what I said. I was discussing your remarks, rather than you. However, I will now discuss you. You exhibit severe deficits in logical reasoning skills. Some of your comments exhibit signs of disorganized thoughts, which are linked to schizophrenia.
To give an example, the sudden interjection of “Not a single Windows MacOS iOS Android user knows what their display server is and how it works.” appears to be an example of disorganized thinking, since its connection to what was being discussed is superficial to an extreme. There are other comments by you that exhibit signs of disorganized thinking even more clearly.
You likely could benefit from seeing a psychiatrist.
I have commits in half the projects you mentioned and I think the article is FUD. There are technically wrong points in it, such as the claim that you cannot run old Linux binaries on modern Linux. People are able to run old binaries on Linux that are 30 years old, although it requires copies of the old libraries those binaries need. If the guy actually is involved with so many projects, he would know this.
The guy even contradicted himself:
> Linux is not an operating system
> While Linux is unrivaled on servers and has been the world's most popular operating system for over two decades, the situation on the desktop is quite bleak.
You've not read the article at all and it explains full well what constitutes an OS.
The Linux kernel is NOT an OS.
Any given Linux distro which is not compatible with any other Linux distro or even its own earlier or later versions couldn't be called an OS. It's a software compilation for a certain time period.
An OS implies everyone runs it, anyone can compile software for it and have it run on all devices with this OS. This is impossible with Linux outside of Snap/FlatPak/AppImage which themselves are lightweight virtualization solutions.
This is what Windows, MacOS, iOS and Android are. Linux is not.
Also, please don't throw FUD around without explaining yourself. You probably don't even know/remember what it means any more.
I read the article. It did correctly say that Linux is not an OS and that the distribution is the OS. Then it said Linux is an OS. The article author and you are the ones spreading FUD here. Much of what the two of you say is factually incorrect.
None of that fits the definition of an OS. As per Wikipedia, “An operating system (OS) is system software that manages computer hardware and software resources, and provides common services for computer programs.”:
2. Does Wikipedia's page buy operating systems and software for them? No? That's what I thought. Where are the native commercial applications for Linux, aside from a very few specialized ones? Where are native games for Linux? Oh, wait, even indie developers have abandoned Linux, because why bother when there's Wine + DXVK.
3. Who cares about obscure something that pretends to provide APIs? How is it relevant for adoption and usage? Oh, wait, it's not. We've had dozens of dead wanna-be OSes that provide APIs. All dead. Linux on desktop is one of them as well.
Yeah, "I'm wrong".
Linux desktop market share proves it beyond reasonable doubt. LMAO.
The fact that absolute most people shy away from Linux is another proof that it's a "great OS" and "better than Windows".
Keep on living in your fantasy world. Sadly people around you don't share your ideals. They want shit to be done. They install Windows, use it for a decade without any issues, buy a new device. All good.
Your response has almost nothing to do with the comment to which you replied. There is no sense in trying to address the moving target that is your replies, since you are unable to stay on topic. I will, however, address the one on-topic thing that you said. Wikipedia's definition of an operating system is roughly what was taught at my university when I got my CS degree, and I am in full agreement with it; so regardless of whether Wikipedia is an authority, the definition Wikipedia provides is correct.
CS2 is at most AA and mostly just facilitates skin gambling. The way CS2 was clumsily dropped, replacing a game with another game in an unpolished state, also doesn't suggest that it could even be "AAA". It's low-key still a public playtest. Early access. But that's beside the point.
Can't argue with that. I'm still not super worried for Valve though; between Alyx, Deadlock, the Deck, and Proton, it's pretty clear that CS2 is less of a new norm and more of an outlier too popular to be announced dead and buried yet. Counter Strike has always felt like the drunk uncle of Valve's IP, it would be nice to see him clean up for once but I'm also not upset having him at a holiday party or out at the bar with friends.
I would be worried for Valve if, even with their infinite resources, they cannot manage to do right by their most popular game on Steam. There are enough neglected multiplayer shooters from Valve (TF2 and L4D both have huge problems with bots and hackers) to suggest that it is the norm for them, be it due to disinterest, incompetence, or both.
I'm seeing a lot of endorsement of Wine ITT, which is strange to me because the Linux Mint forums constantly seem to trash it whenever it comes up. (I'm not particularly interested in any programs that would need it.)
You are probably seeing endorsements of Proton rather than vanilla Wine. Proton is more than just Wine (although Wine is the most important part), and it works much better than vanilla Wine.
Because those ratings are consolidated from user reports, some of them are outdated and still marked Gold. Grand Theft Auto V is Gold, but Online doesn't work anymore; users are still giving it a thumbs up because singleplayer works. Apex Legends is also still Gold, and no longer works at all, since there's no singleplayer.
I've only accidentally (via a GeForce Now trial before I gave Steam Remote Play another try) played a video game on a Windows computer in the past two years.
The only games that routinely cause problems are those with ridiculous kernel-level anticheat, and I am not interested in those games.
Anecdotal, but I've had maybe a 60% success rate running platinum-rated games under Proton, with an unremarkable Ryzen 3900/RTX 2070 desktop. I suspect the further you are from Steam Deck hardware the more trouble you're likely to have.
The M4 Mac mini finally convinced me to give up on a Linux desktop for now; the combination of native Mac games, Whisky, and Windows via VMWare works at least as well as Proton for me, and everything else including development is much more convenient, including running Linux VMs when useful.
Steam also runs on Chromebooks, which shows what a shallow victory this is for Linux on the desktop as a platform.
If you can survive with Proton, LibreOffice, and GIMP, you can much more comfortably survive with Proton on ChromeOS, Google Docs, and Photopea. Just hating Windows is not enough of a reason to be a Linux user, and as long as that is the case, Linux will remain niche.
Are you saying Linux has "games" because it excels at ... emulating them?
What's the point of running Linux, then, when you have a much higher chance of running games successfully natively on Windows? What about the dozens if not hundreds of games that use kernel-level anticheat? Pretty much none of them work on Linux.
What's the point of native Linux APIs then?
That's weird.
Wine + DXVK are so good, Linux is GREAT.
No, it shows how horrible Linux is when not a single vendor outside of Valve wants to touch it. And everyone who tried to port games to Linux went bankrupt.
Proton does such a good job that it is difficult to justify the cost of porting games when they already run on Linux in Proton. That said, Feral Interactive is still porting games to Linux and they have not gone bankrupt:
Proton often runs games better than the native ports, so there is not much of a loss if there are fewer native ports. Furthermore, in a number of cases, Windows games actually run better on Linux in Proton than on Windows.
Elden Ring is a fantastic example, as on day 1, it performed about 15% better on Linux in average FPS, and had substantially less stutter if I recall correctly.
World of Warcraft is another great example. It is well known to get about 20% better performance on Linux than on Windows:
If you use AMD GPUs, it is more often the case that Linux will outperform Windows in games, since Valve wrote the shader compiler used for them on Linux and it is fairly amazing:
I ran Quake II RTX on Linux on a GTX 1080 Ti with RTX on back when I still had one. Of course, the performance was bad, but it ran. Some sites have benchmarks:
I also no longer have one to try. It's pretty easy to find a video on YouTube of someone trying to start the game and failing with a GTX GPU. The specific Vulkan extension it's asking for is VK_KHR_ray_query.
You can't claim that Linux games running on Windows counts against Linux, but ignore that Windows games also run on Linux. You can't have it both ways.
I mean it's maybe "seamless" if you use a system provided by the top DRM platform. Using wine directly it's unfortunately not very seamless. I don't really play games much but my recent experience hasn't been great.
Last week I tried to run The Witcher 3 using Wine, and it took a while to debug. The installer runs immediately and seamlessly, but then it got messy.
It involved installing `winetricks`, which seems to be some utility that installs a multitude of drivers, libs, and software through a clunky, unresponsive UI (not clear where things are installed from, but #yolo). The necessary libraries I had to deduce from reading forum posts and the errors I got on launch. You then had to go in and make sure to launch the DirectX 11 version of the game (i.e., the right .exe) buried in a hidden directory (~/.wine/).
Is this what "Platinum" means? I don't want to imagine what Gold entails.
You seem to be kind of confused. ProtonDB does not reflect how well Wine runs a program; it reflects how well Proton runs it. If you try to create a Wine environment with the same feature set and patches that Proton has by default, you will waste a lot of time. A lot of things (PowerShell, d3d/DXVK, .NET, C++ redistributables, etc.) come pre-packaged with Proton and therefore require no extra configuration to run a compatible game.
What you want is to use a program like Lutris or Bottles to manage your runtime for you, if you're not installing it through Steam. I don't know who put you up to installing Wild Hunt on a vanilla Wine prefix but they sound like someone with a good sense of humor.
Okay, that still leaves an insane number of applications. Games aren't the only thing people run on their PCs. Plus, I think they meant that no native ports of AAA games have been published on Linux recently, not that Windows games can't run on Linux with additional tooling, though I'm not sure how true that is.
Outside of some games and relatively high-end multimedia apps and maybe some dev tools, basically no one cares about the OS. I probably spend 95% in a MacBook using it as a browser.
I'm a big linux fan. Use it every day. I even use linux on the desktop to develop Windows software that isn't even distributed on linux.
But am I really that alone in recognizing a lot of truth in this article? It's sad to see so much hostility in some of the other comments, even here on HN, quickly coming to Linux's defence rather than showing at least some degree of reflection and acceptance towards what are almost undeniably the very common experiences of many Linux users.
It somewhat proves point 7: while the replies here don't match the examples given to the letter, they support the point that until there is wider acceptance of the problem, we are far, far from a solution.
It's possible to think very positively of linux yet still recognize so many of the challenges that make linux hard to use on the desktop. This very day I can't suspend/resume my desktop because the kernel crashes on resume, a pretty basic feature of a computer in 2024. I put up with it, because Linux is still a great choice for a technically-minded user. But there are other software orgs out there that simply wouldn't tolerate this level of lived experience of any end user. But linux does, and we are where we are.
There’s truth in broad strokes (e.g., the lack of a stable ABI is a genuine problem) that I wish people would acknowledge. I think your attitude is a healthy one! However, I don’t think this article is written very charitably. It seems unfair to Linux developers to suggest that they spend “very little to no time” testing their changes for regressions, for example. And while AppImage, snaps, and Flatpaks are imperfect, I’m not sure they deserve to be dismissed as just “lightweight virtual machines.” (I’m not sure how accurate that even is for AppImage.)
Linux users can be defensive or even toxic about their operating system so I hardly blame the author for being impatient or frustrated. Still, I don’t blame people for being frustrated in turn at that tone.
Saying that Flatpak is "virtual lightweight machines" is just plain wrong.
There is no hardware emulation in a Flatpak. There are no "machines," virtual or otherwise, other than the computer and host kernel you are running. What Flatpaks provide are the support libraries (the specific versions of those libraries) that an application requires to run.

That is exactly the strategy Windows uses as well. Windows applications, usually distributed as binaries from very early in the history of the OS, have learned to bundle most of those libraries themselves. For the low-level stuff like the C and C++ standard libraries, Windows bundles multiple generations of them that take up space (and provide attack surface) whether you are using them or not.

The biggest difference with Flatpaks is that these libraries are not installed until you install an app that needs them. This is why the author complains that Flatpaks "waste space." Windows takes up tens of gigabytes of storage for a basic install, but then does not require as much to be installed later to support legacy apps. Somehow the author thinks this is better.
> I don’t think this article is written very charitably
It definitely isn't, but honestly Linux could use more of a drill-sergeant treatment. It's either that or we keep doting on our little pet OS and give up on the idea of desktop Linux being at all a viable competitor to the goliaths of Microsoft and Apple. Maybe we never stood a chance?
Keep in mind, I'm pretty much committed to Linux myself, as I'm not willing to compromise with Microsoft and Apple's user-disrespecting, locked-down, enshittified software. But I have no illusions about the inconveniences and sacrifices of that decision.
I tend to think it’s not really possible for volunteer open source to directly compete with tech corporations when it comes to polish and cohesion of a large end-user ecosystem—it’s just too much cat herding and uphill battle even to agree on a vision.
That said, Linux has other advantages that corporations will never touch because it doesn’t make business sense. I’m really glad it’s still there as a bulwark against enshittification and all computing freedom being taken away from individuals.
The vast majority of Linux is not volunteer open source. It's corporate developers. It's also focused on the server. Tech companies seem to use a lot of Macs these days although I assume Windows predominates overall.
Yep. A good chunk of desktop Linux is thanks to paid Red Hat developers contributing to projects like GNOME. Valve pays for a good deal of Linux's graphics-related userspace code. And another chunk of paid work is handled by non-profits like the KDE organization. There aren't as many weekend hackers keeping your Debian workstation and Pop!_OS laptop running as you might expect.
Red Hat almost certainly does less on-the-clock desktop-specific work than they used to, though I expect there are still a lot of GNOME contributions from employed developers. I'm not sure there is a good survey accounting for paid contributors at this point, but development is still mostly paid, especially outside of the desktop world.
> a bulwark against enshittification and all computing freedom being taken away from individuals
I'd like to believe that, but I can't help but think that if Microsoft really wanted to lock all PC hardware so that it could only run Windows, or if Google really wanted to add Attestation features to Chrome so major websites only worked on proprietary OSes, "the Linux guys would complain" wouldn't make their executives hesitate even for a moment. (In some cases EU legislators might give them pause, however.)
They managed to add DRM to HTML without too much trouble, after all. And now that Mozilla is an advertising company, that kind of thing will only get easier. And even with the Linux-based Steam Deck selling a million units or so, game companies are still fine locking Linux out if they feel like it.
Desktop Linux is pretty much totally ignored by the money people, the big decision-makers. Linux users are a very tiny minority. Your average person on the street barely knows what it is, if they've heard of it at all. So I'm not sure it qualifies as a "bulwark", I'm sorry to say.
> I don’t think this article is written very charitably
Why would it be? It's about Linux on the _desktop_ where it sucks a$$.
Open Reddit, open any distro related forum, open unix stackexchange - hundreds of support messages daily from a tiny Linux community. That should speak volumes about how ready it is and I didn't even touch on individual issues. The article is called "Final", I will not be actively updating it. The previous one existed for over a decade and I'm done with it.
Well, yeah, and open answers.microsoft.com and it's hundreds of support messages daily from the Windows community. Where else are people going to for Linux problems other than the Linux problem support sites?
This is a good example of a positive, necessary conversation that was almost thwarted by an engagement-optimized article. It only happened (three quarters of the way down the HN comment section) because a few users were mature enough to not take the bait and actually discuss the issues at hand. Good stuff from both you and the parent poster.
Thanks for your kind words! I do think there are valid criticisms in the article, even if the overall framing isn’t productive.
While the tone and framing may make it especially good at attracting responses, I do want to clarify that I don’t think it was deliberately optimized for the engagement.
> towards what are almost undeniably the very common experiences of many linux users.
While I empathize with these novice Linux users (who may not be self-aware of just how novice they are), I did stop to consider what a distro that avoids all of their problems would look like.
And, a big part of that is a huge colorful chart about what hardware is supportable and is not supportable during the install, along with the user typing out something like:
"I understand this hardware is not supportable and this is the fault of the vendor of the hardware. I understand that if I want a supported system I can buy one at https://ProductionLinuxInc ; furthermore, I understand and agree that by typing this acknowledgement and clicking "Accept" that I am legally bound to this agreement and legal penalties apply if I bitch about this operating system and my bitching either by its contents or what my bitching alludes to are factually wrong".
The main criticism I see as valid here is first point: software distribution is tricky. For macOS and Windows you basically have a model where you can go to a website, download a binary, install it, and it runs.
Most Linux distros won’t jibe with that very well. And because every Linux distro uses a different theme and one of a dozen UI libraries/DEs/etc., a downloaded GUI application will look bad because it won’t fit in. Flatpaks and snaps really are a solution that just lets you avoid polluting your OS file system (which was never designed with the concept of apps: there are no reasonable namespaces, no easy ways to uninstall, etc.) and statically compile the app so the next OS update doesn’t break it.
If the distro contains everything you want in its repositories you can use an “App Store” front end to get what you need and the OS will helpfully keep it up to date which is more than you can say for Windows and almost what you can say for macOS. But if you need a third party app you are out of luck: including it in the distro is a process most people won’t want to undertake.
The solution is likely not going to make the nerds happy: something like snaps are the right abstraction for Applications (I am using the capitalized version here to distinguish it from system programs). What will also happen is that some Linux distros will support this kind of solution out of the box and will be the desktop distros. Not every Linux distro will be blessed with being able to run popular apps. Ubuntu is likely on the right track for this and the Unix purists will hate it more every year as it loses the original shared system philosophy and inches closer to what macOS provides.
So the real question is: do you use your Linux distro desktop for the kernel, the package system, the package repositories, or the 1337 cred?
> And because every Linux distro uses a different theme and one of a dozen UI libraries/DEs/etc. the downloaded GUI application will look bad because it won’t fit in.
As opposed to, say, Windows, where there is only one GUI and one theme by one company so everything is perfectly match- wait, no, they have bits and pieces running every GUI they've ever invented all the way back to the 90s ( https://www.dvhardware.net/article75347.html , or https://ntdotdev.wordpress.com/2021/02/06/state-of-the-windo... though that's old now ) and also lots of applications using Qt or Electron or whatever, just like on Linux. So clearly that's not the problem; if it's good enough for the most popular desktop OS, it can't be that bad.
The reality is I can just target winforms on windows, or GDI if I need that.
To open a window there is exactly one way if I want it.
To open a window on Linux, do I call into X11 or Wayland? Do I also need to provide a DRM layer in case it needs to run directly on top of DRM, because BIG_USER_WITH_MONEY wants that?
Should I use Qt to get a decent UI library, or GTK, or one of the many other less popular ones?
As a developer that would want to release a GUI app it's not extremely easy to pick for linux. There's no clear guidelines stating X has these features but Y doesn't (winforms vs whatever Microsoft's latest attempt at UI is).
Let's not even get into actually needing to ship a fully cross-platform app. C# is blessed on Windows, Swift on macOS, and C on Linux... which means all the business logic will end up at the very least exporting a C API, if I don't decide to just bite the bullet and pay for Qt.
All this is why you will likely find that someone will develop a Windows release, and maybe macOS if they have a userbase that can justify the development cost. Maybe they are nice and run a test on Wine and treat that as a supported platform, but I only know of exactly one company that does that, outside of maybe some game devs.
And that doesn't even touch anything in the article on why Linux on the desktop won't happen because the 1% of devs that went down this rocky road then also finds ABI breaks.
Not as much anymore. Part of it is due to the mess that is Microsoft's first-party GUI framework situation, and part due to .NET's own APIs and semantics having stopped evolving together with Windows a long time ago.
Funny anecdata: someone* recently made a small app to indicate when (now broken) CTRL+C is actually triggered in Windows. It was initially written in WPF and was exhibiting a weird unmanaged memory leak. Only after porting it to Avalonia (which works everywhere) did the issue go away: https://x.com/KooKiz/status/1873726346897330243
So your experience might actually be better with a cross-platform GUI framework!
Mind you, it's still in its "homeland" on Windows - things like access to the Registry are part of the standard library - but you may find it performing better on Unix systems otherwise. The IO abstractions certainly do!
* well, not someone but one of the co-authors of Pro .NET Memory Management.
His journey tracks exactly what I've heard from Windows development: if you need more than winforms, use Avalonia, because Microsoft cannot create a viable UI toolkit... Which is extremely funny considering the article we are having a discussion under...
My point however was that compared to C# you are fighting an uphill battle using any other language to do native windows UI development...
Also why I stopped looking pretty early into native on windows and built a webapp...
Well, Uno (AvaloniaUI's competitor) uses WinUI 3 as a back-end on Windows. And NAOT-compiled Avalonia on Windows is practically native anyway. You can use both to build against macOS too if you want to target both major OSes. The offerings have improved significantly over the years.
But yes, I get your point: if you're not familiar with C# (or Dart or Kotlin for that matter), there is mostly only one way out, in the form of a web app.
> The reality is I can just target winforms on windows, or GDI if I need that.
And the reality is that you can just target GTK on Linux. That you happen to have picked a single option on NT doesn't magically make it different, it just means you already made the decision.
Which version of GTK should I pick since you seem so confident?
And should I target X or Wayland? Which Wayland extensions are ubiquitous enough so that I don't need to manually reimplement things in my app? Should I target pipewire for audio? Pulse?
If I pick a modern stack like Wayland + GTK 4 + pipewire it might not work on older linux distros. Maybe that's fine for my needs, maybe it's not. If I pick X + GTK 3 + pulse I might end up reimplementing half of the desktop manager in my app and need to constantly rewrite my app.
Do I distribute deb/rpm, or do I distribute a flatpak/snap... Actually, which one?
And don't bring up that GTK abstracts Wayland/X because if I target Wayland I have access to a ton of Wayland extensions which aren't there in X so I do need to make that decision the moment I need to get access to that stuff.
So even just picking GTK, I now have a matrix of 9 different configurations. GTK is not the de facto framework on Linux, so we now need to add Qt to the mix and the whole collection of choices there.
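To make one cell of that matrix concrete: before touching a toolkit at all, an app has to figure out which display server it is even talking to. A minimal sketch (Python used purely for brevity; the function name is made up for this example) of the usual heuristic, based on the conventional WAYLAND_DISPLAY/DISPLAY environment variables:

```python
import os

def detect_display_server(env=None):
    """Best-effort guess at the session type from standard env vars.

    Wayland compositors set WAYLAND_DISPLAY; X servers set DISPLAY.
    XWayland sets DISPLAY too, which is why Wayland is checked first.
    """
    env = os.environ if env is None else env
    if env.get("WAYLAND_DISPLAY"):
        return "wayland"
    if env.get("DISPLAY"):
        return "x11"
    return "none"  # headless, or running straight on top of DRM/KMS

print(detect_display_server({"WAYLAND_DISPLAY": "wayland-0"}))  # wayland
print(detect_display_server({"DISPLAY": ":0"}))                 # x11
```

Toolkits like GTK and Qt do a variant of this internally, which is exactly why the moment you need a Wayland-only extension you're back to making the decision yourself.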
It seems like you've picked 1 option for Windows and decided that it's perfect, and then looked at every single remotely possible option on Linux, and then pretended that that was somehow a fair comparison. If we use every GUI toolkit that Microsoft has ever offered, and also every sound API they've ever offered, and also every distribution format (do you use the store? .msi? bundle up a Portable App so you don't need an installer?), your matrix will also blow up.
Feel free to look at the other comment thread here for my opinion on windows toolkits.
But to answer your question: if I pick winforms + an exe + WASAPI for audio, it will work on all supported versions of Windows and I don't need to compromise on features. It will very likely also work under Wine on Linux, and macOS for that matter...
I use KDE Neon because Windows seems almost childish in comparison.
I use KDE Neon because I haven't bought a Mac yet but I don't know if that would be an upgrade. Neon is just superb.
I just can't imagine how anyone could get used to Neon and then use Windows without thinking it is a joke. Especially if you turn off the bells and whistles, everything is instant and instantly responsive.
I barely know any Linux commands, but installing software has been trivial with the various software managers and apt-get.
Beyond that, everything just works. I think this experience is predicated on not playing games, though. I have no idea what the state of games is for Linux, Windows or Mac.
As long as Neon has the critical mass to keep existing, I couldn't care less whether more people use it or not.
The fact that there’s flatpak AND snap (and the distro’s packages) just points to another nail in the coffin — there are just too many standards. If people could at least rally around snap OR flatpak, then we’d have more of a fighting chance.
Windows has been isolating dependencies for a while now too, but for security, since their entire security model was pretty awful before, largely because of the need to support decades-old applications that business customers / governments still rely on. https://support.microsoft.com/en-us/windows/device-protectio...
I have really come around to this way of thinking. Either you use the walled garden that is the distro’s own repos with only the blessed software of specific slightly out of date versions, or you allow arbitrary software but have to isolate it. The former works for some use cases but not for “Linux on the desktop” as most typical users define it.
> And because every Linux distro uses a different theme and one of a dozen UI libraries/DEs/etc. the downloaded GUI application will look bad because it won’t fit in.
Most apps/DEs/etc. use either GTK or Qt. If your application uses GTK or Qt, it will use the user's preferred GTK/Qt theme by default, and fit in with all the other GTK/Qt apps on their system.
Sure, you might get unlucky, and your GTK app won't look great on a KDE desktop where that user doesn't have any other GTK apps and it sticks out. Or vice versa for your Qt app on a GTK-based desktop where the user hasn't picked a matching Qt theme.
But unless, for some reason known only to yourself, you decided to write your app for FLTK, EFL, wx, or some other niche toolkit, your app is going to work like at least one other app on 90% of users systems. And for those users running FLWM, Enlightenment, or some other weird graphical environment, I'm sure they'll cope.
So the world consists of apps like Slack, and Photoshop, and Final Cut Pro and none of them are written in Qt or GTK or any of the other libraries you mention. Any one of those applications has more installs than Debian on a desktop (this is an educated guesstimate).
I have to use Windows to test my apps, and each and every time it just infuriates me with ads and a clusterfuck UI, constant notifications about different products, ads in the Start menu.
It's only consistent in delivering ads, not in how the user interface is organized.
You have to tweak the registry to remove the ads, but you still can't get rid of OneDrive's reminders, and then come the forced updates. Settings are spread between a million places, extremely confusing and twisted.
Search results are random, what the hell?! You can't even launch your most-used app with confidence from search, which is just absurd.
Managing software is a freaking pain. Everything is everywhere, with no proper centralization whatsoever. choco is good, but it's a CLI, which many people have issues with.
And I have seen people suffering from bugs that were never solved. I had to set up another file browser just so a client could use his computer; otherwise it would take hours to load a simple folder.
The default editor has no syntax highlighting whatsoever; they can't even provide a good editor that compares with the most basic editor that comes with GNOME.
Windows is just not usable, and in my view it's only feeding on its legacy, the apps. But Linux is becoming much better now.
Yeah, with hundreds of contributions to multiple Linux projects including the Linux kernel, GCC, KDE, Wine, etc.
A total hater.
> How you have to tweak registry to remove the ad, but still can't get rid of OneDrives reminder, and then comes the forced updates.
Takes roughly 2 minutes, ok.
> Settings are spread in between millions of places, extremely confusing, and twisted.
Not sure what you mean. In Linux, settings are spread all over as well: the UI, /etc, ~/.config, assorted dotfiles in $HOME, etc. etc. And you need to learn the shell and console editors to configure stuff.
> Search results are random, what the hell?! Can't even run the most used app with confidence, from search. Which is just a absurd.
Search is not even available for multiple Linux DEs.
> Managing software, is freaking pain. Everything is everywhere, no proper centralization whatsoever. choco is good, but its cli, with which people have so much issues with.
MSI and GPO exist. Most serious applications nowadays have auto-updaters. Others often need no updates.
> Windows is just not usable, and in my way, it's only feeding on the legacy, the apps.
Works near perfectly for 2 billion users.
> But Linux becoming much better now
Barely works for its 40 million users, riddled with bugs/regressions and missing features.
How does one make hundreds of contributions to a number of important projects yet be so woefully ignorant about things as you are? In this discussion, you have claimed that Qt is under the GPL, said that running old software on Linux systems is impossible, claimed Windows runs near perfectly and said that Linux barely works. All of that is provably false. Here is one of many examples of Windows having profound issues where Linux systems do the right thing:
That is far from near perfectly and there are many severe issues that have plagued Windows systems over the years if you take the time to look.
By the way, those auto-updaters you mention waste system resources. Linux systems use package managers that prevent you from having O(n) auto-updaters consuming resources all the time.
> For the purpose of complying with the LGPL (any extant version: v2, v2.1 or v3):
> If you statically link against an LGPLed library, you _must_ also provide your application in _an object_ (not necessarily source) format, so that a user has the opportunity to modify the library and relink the application.
Basically not a single software developer is going to jump through these hoops to statically link against LGPL code. I don't care about theory crafting. Linux has been there for over 30 years now. Show me the applications that comply with that. I dare you.
Actually, I know a single example (!), and that's the proprietary NVIDIA drivers. Not an application, BTW. You cannot imagine how difficult Linux kernel developers have made life for NVIDIA. Yeah, there's a huge object file to be linked with the kernel. Other examples? None. On Windows you compile a driver and have it work flawlessly for at least a decade.
Sorry, I'm done with you. You're now baselessly and shamelessly accusing me of lying where a simple Google search reveals everything I say is true.
Windows requires none of this crap. What's funny is that Win32 is the only stable Linux API. Not even glibc provides one. What's even funnier is that DirectX is still a far richer set of APIs than what Linux provides, despite Open Source making it as hard as possible to link to it. And basically no one does.
With most laptops and pre-assembled PCs, Windows comes for "free" (included in the cost). If you're a builder, you can get a Windows license for as low as $10 and live happily ever after using your old applications, including, I don't know, Adobe Photoshop CS2, released almost 20 years ago. Still works fine in Windows 11.
You just provided evidence that your remarks were wrong and then claimed it validated your remarks as being right. Rather than accuse you of lying, I would accuse you of having a severe issue in your basic reasoning skills.
Furthermore, your rant is consistent with disorganized thinking, which is often a symptom of schizophrenia. I suggest seeing a psychiatrist.
That is not what I claimed and Nvidia does not statically link to LGPL code as far as I know. If you refer to the Nvidia kernel modules, there is no static linking being done with LGPL code.
I studied introductory psychology in college. I am a decent judge of whether people could benefit from psychiatric help and the more you write, the more convinced I become that you could benefit. Please go for an evaluation. Ask your regular physician for a referral. A psychiatrist likely could help you.
Oh, too bad, you should turn it on, I will remind you in a few days. 3 days? 5 days?
Oh, sorry, "never" is NOT an option.
How can that be acceptable!
At least you know where your configuration might be! On Windows, you don't even know where your apps' configurations are. Are they in some deep underground level of the registry? Or in AppData, Local or Roaming, or somewhere else?
On Linux, I know user configs are in `~/.config` and system-wide configs are in `/etc`. Some older apps may store their config directly in the home directory, but that's about it. How is that difficult?! And everything is just text files, most of the time very nicely commented.
And I probably haven't needed to touch anything in `/etc` for almost a year, unless I'm messing with something.
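For what it's worth, the lookup order described here is codified in the XDG Base Directory convention. A rough sketch (function and app name are hypothetical, for illustration only) of how an app typically resolves its per-user config directory:

```python
import os
from pathlib import Path

def user_config_dir(app_name, env=None):
    """Resolve an app's per-user config dir per the XDG Base Directory
    convention: $XDG_CONFIG_HOME if set, otherwise ~/.config.
    System-wide defaults would live under /etc instead."""
    env = os.environ if env is None else env
    base = env.get("XDG_CONFIG_HOME") or os.path.join(env.get("HOME", ""), ".config")
    return Path(base) / app_name

print(user_config_dir("myapp", {"HOME": "/home/alice"}))           # /home/alice/.config/myapp
print(user_config_dir("myapp", {"XDG_CONFIG_HOME": "/tmp/conf"}))  # /tmp/conf/myapp
```

The "older apps" exception mentioned above is exactly the apps that predate this convention and still drop a dotfile straight into `$HOME`.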
> Search is not even available for multiple Linux DEs.
What?! GNOME has excellent application search: Win + one or two key presses and I can launch my app, and it's actually consistent. No ads inserted randomly to interrupt my muscle memory.
KDE also has it. In i3 I can use `rofi` to do the same.
> MSI and GPO exist. Most serious applications nowadays have auto-updaters. Others often need no updates.
Oh, lol. How can one excuse the mess of going to each application's website, downloading binaries and installing them, then letting each and every app phone home to check for updates, then going through the upgrade dialogs while keeping watch for any search-box or toolbar opt-outs the installer has slipped in? That is just absurd! I can just run `yay -S --noconfirm`, put in my password, and my update kicks in. Most apps don't even care whether I am running their latest version or not.
> Works near perfectly for 2 billion users.
Because, again, it's only feeding on its legacy, and because people pay the Windows tax without knowing it, or, in my country, because it came for free with new purchases (piracy).
If people had to install Windows by themselves, I can confidently say most of them would stay far, far away from it.
> Barely works for its 40 million users, riddled with bugs/regressions and missing features.
I have been using Linux for 20+ years. I am not riddled by any of those 'bugs' and 'regressions'. The funny thing is, sometimes there are minor hitches, but again, they are minor.
And if I face a problem, I can communicate with the respective community and get a solution. With Windows, which is a black box, you never know if you'll get any solution for your inconveniences. Like getting rid of OneDrive permanently. Or File Explorer getting stuck indefinitely, hour after hour, with no solution to it. The client suffered for years, and he had no way to reinstall.
Citations needed. Doesn't happen to anyone else but you.
> Oh, too bad, You should turn it on, I will remind you in a few days. 3 days, 5 days. Oh, Sorry, never is NOT an option.
Again 5 minutes of Googling disables it.
> On Windows, You don't even know where your apps configurations are.
In Linux they can be in a dozen different locations as well. And GNOME even has (had?) a sort of registry.
> What?! Gnome has excellent application search, Win + one or two key presses I can launch my app, and it's actually consistent. No ads inserted randomly to interrupt my muscle memory.
"I use Gnome/KDE and I don't give a damn about all other DEs."
Got it!
> How can one excuse the mess of going to each application's website, download binaries and install, and then let each and every app phone home to check for update, and then go through the upgrade dialogs, keeping watch if the app has inserted any search box or toolbar opt-outs? That is just absurd! I can just run `yay -S --noconfirm` and put password, and my update kicks in. Most of the apps are not bothered if I am running their latest version or not.
Most serious Windows software doesn't need manual updates. Windows and MS Office update themselves. Adobe, Corel and 3ds Max products do so as well. Tiny utilities? Who cares? They just work.
It's Linux where people get obsessed with updates, because there's no API/ABI compatibility per se and you have no choice. On Windows? Run something from 20 years ago and don't bother.
> Because, again, it's only feeding on the legacy, and because people pay the Windows Tax,
$10 tax. Yeah. I've spent hundreds of hours fixing Linux. No tax at all.
> I am not riddled by any of those 'bugs' and 'regressions', funny thing is, sometimes there are minor hitches, but again, they are minor.
Anecdotal evidence rears its ugly head AGAIN AND AGAIN.
"If it works for me, Linux is perfect." Never mind the people who actually do the dirty work of fixing hundreds of regressions every Linux kernel release.
Nothing ever changes with the Linux cult. "It works for me, I'm a seasoned developer, I know my way around the console, so Linux must work for everyone."
Even in this discussion this has been repeated ad nauseam.
If your approach to fixing things in Linux is like the way you talk about Linux, no wonder you've spent hundreds of hours fixing it. Work smarter, not harder.
It's neither "if it works for me, it works for everyone," nor should it be "if it doesn't work for me, then it doesn't work for anyone."
You're making some points that most of us Linux users know to be wrong, and we're pointing that out.
You're just a hater with some outdated information about Linux. Obviously Linux is not for everyone; no one is saying it is. But it provides an open, no-bullshit, interruption-free environment that I own, not some corporation doing business with my data. That alone is enough for me, and many others.
wxWidgets isn't niche; it's probably the most commonly used third party toolkit to build commercial Linux desktop applications (yes they exist!). I guess Qt licensing is too scary, and GTK too crap.
The “go to a website, download a binary, install it, and it runs” model has significant security issues.
The security issues are twofold. One is that you could easily be tricked into installing malware. Two is that you will never be able to keep up with security updates, leaving yourself vulnerable to being compromised by flaws in network applications.
Making that model of software distribution difficult is a feature, not a bug.
I fully understand that. And yet until you can convince Debian to include a copy of Adobe Photoshop in their repository you will cut off a significant amount of desktop users. How do you propose to solve that problem?
(Yes of course the solution is to fund Gimp until it’s better than Photoshop by so much that the industry switches. It was the solution 20 years ago too. Adobe has run away with this market. Photoshop is but one of many applications desktop users need.)
Most computer users do not use photoshop. For those that do, there are options. One is to use Krita. Another would be to use Wine to run the Windows version of Photoshop:
So I am friends with a bunch of photographers. Some are proponents of open source. Others don’t care. But universally they would rather have a thing that just works and feels right than mess with Wine. You can do almost anything. When your paycheck depends on your software working your patience is likely to be shorter.
I say this as a person who is a huge fan of Linux/Unix on the desktop. I wish I still was able to run it as my daily workhorse like I did for a decade and a half (and that was before the distros actually got good at this). You can make it work. But an average user isn’t interested because there are only downsides and little benefit.
It sounds like you're violently agreeing with the parent.
I worked for an open source company for over a decade. But, although things improved over time, within about a week of joining I went out and bought a MacBook to use (for the first time) because it was more politically acceptable than Windows but I didn't want to deal with either a locked-down corporate Linux distro or an unsupported free Linux.
My neighbor is a photographer. He uses photoshop on MacOS. It works for him. Until Adobe ports Photoshop to Linux, the “must have Photoshop with no issues whatsoever” demographic is better served by MacOS.
Yeah. I was hired to do a job and didn't want something that would really piss people off (Windows). But Macs were increasingly acceptable and that seemed a comfortable middle ground at the time. (They became increasingly so over time, and I really grew to like them.)
> And yet until you can convince Debian to include a copy of Adobe Photoshop in their repository you will cut off a significant amount of desktop users.
Shouldn't you try to convince Adobe instead? AFAIK, they're already using their own bespoke UI, and if Sublime can do it, Adobe can surely deploy Photoshop on Linux, even if they only want to bless Ubuntu.
What macOS provides is that each app is bundled with all the libraries it needs. That is basically what AppImage is on Linux and it works just as well. Most people find it too wasteful so Flatpak has become more popular. At least Flatpak allows apps to share many of the libraries that they both rely on.
Although Apple has also made a mistake, IMO, in not having a proper package manager, in spite of hiring the writer of Homebrew at one point. Probably less important than it used to be, but not all roads lead to Apple's apps or the apps of a few other companies.
The fundamental thing people need to realize is that a package and an app are different abstractions. They are not one-to-one related, and they are not orthogonal. Distros deliver some open source apps as packages, but that's not the same as delivering apps.
You need to dial down the hysterics by about 1000%, or alternatively you might want to head back to the YouTube comment section since your vernacular seems more appropriate for it.
The lack of deduplication, or rather of avoiding duplication, of what are basically the same files across some of these implementations is a problem.
But it's not a problem for me. It's also not a problem for most people. And it's not at all a problem next to the dumpster fire of closed-source OSes; so ultimately there is no real problem.
Yet, all of that is moot, because someone already solved the problem. But I'll wait until they publish their work to talk about it.
> If somebody does to desktops like what Google did for smartphones, then Linux will finally have its moment.
That's already happened, and Google did it. They are called Chromebooks. They are Linux underneath, use the GNU userspace stack and Wayland for GUI. They have a button that installs Debian, and GUI packages you "apt install" appear in the Chrome app launch folder.
Apparently ChromeOS is being swallowed by Android, which I suspect isn't a big move. Crostini, the LXC thingy that hosts Debian on ChromeOS, has already been ported. That means Android will be powering laptops. Does that make it a desktop OS? If so, does that mean we can count Android's 45% market share of all personal computing devices towards a desktop OS? And if we do that, does that mean Linux has finally won the desktop wars? By any definition it's already won the personal computing devices wars.
Fair point, but I think we can measure the success of them in their respective device markets by their market shares. And while ChromeOS was a great attempt, it didn't really disrupt much beyond the US education market.
> That means Android will be powering laptops. Does that make it a Desktop OS? If so, does that mean we can count Android's 45% market share of all personal computing devices towards a Desktop OS?
OS market share for a particular device type is typically measured as the respective portion of the OSes that run on that hardware. If you are looking at desktop/laptop market share, you'd only count that :)
> For everyone else, the reason they don't use Linux on their desktop is because it didn't come installed on it.
My experience lines up with this.
Both of my elderly parents have run Zorin OS on their separate laptops for the past 6 years; they understand it as a kind of 'free Windows'.
They don't care about the inner workings, they go online, check email, do shopping, make travel arrangements, print a ridiculous amount of paper. And I stopped getting calls because 'the printer is not working'.
Most people only have a few "essentials" applications they want to use (the browser in most cases) and don't care about the OS. Whatever can run these applications reliably will get a free pass. And for a lot of these use cases, if they have someone that can get Linux working well on their hardware, they only notice the speed and the lack of ads.
> Outside of the Linux kernel, there is very little shared technology between Android and the Linux desktops.
lol... so, all of it? Linux is a kernel.
Regardless, my point had absolutely nothing to do with technology. Android's popularity has absolutely nothing to do with Linux and/or what it shipped with or didn't ship with. Its popularity has everything to do with the fact that it shipped preinstalled on devices that people bought.
Linux isn't going to become popular on desktops until it is shipped on desktops.
A lot of netbooks back in the day did ship with a Linux distro pre-installed, and you know what people did?
Replaced them with Windows.
Because the Linux desktop is, and continues to be, a sub-par experience for the average Joe. GNOME requires a modification to even run Discord properly, but anyone who points this out gets shouted at.
I was those people. I bought the original eee 701 at launch. I ordered it online because local stores didn't stock them. Xandros sucked not because it ran linux, but just because it was some little-tikes toy OS. But Windows didn't even run correctly on the damn thing either, because the 480px vertical height of the screen wasn't tall enough to fit the minimum height of many windows. The device was basically a toy for nerds, no normal person would have picked up one of these devices and thought "this is a computer I can use for real things"
People don't not use Linux because they've considered it and found out that their applications won't run on it. They don't use it because they've not considered it at all, because 99%+ of people do not choose the OS on their computer. They go to the electronics store and buy a Dell, HP, Acer, Lenovo, or Apple. There's no Asus Eee running Xandros at my local Best Buy, and if there ever was, it wasn't on the shelf for very long.
I've been using Linux on my desktop even longer, but I agree with at least the premise of the article (even though the arguments are poorly thought out and expressed most of the time).
No Linux-based OS has gained mainstream acceptance. You can try to blame that on marketing or corporations or whatever, but I don't buy that. I think open source software is the best way to make technically-superior software, but I think often it's a poor way to create the best user experiences, because the kinds of people who create open source software often don't care to focus on UX.
> That’s what consoles are for.
Millions (tens? hundreds? more?) of Windows gamers would disagree with you. You buy a console and do what you want, sure, go for it. But there is a market for Windows gaming, a very big one.
Gnome tries to focus on UX and does a pretty good job at it. But they're yet to make a regular user oriented distro, which I think could be a better bet at realizing that 'mainstream linux' idea, because at least they'd care enough to try and figure out the things that get in the way of users having linux running.
Some standard, good enough, apps would help further this idea. Right now, I can’t find a mail client for Gnome that’s as comfortable to use as Apple’s Mail app. I use Mailspring, and it comes really close, but still could be nicer. It’s also not part of Gnome.
Gnome is a very well managed project, but its resources aren’t infinite.
There are a lot of games that will never make it to consoles because they don't play well (or at all) without a keyboard and mouse. It's also easier (sometimes much easier) to develop games on PC.
That having been said, most games work fine on Linux nowadays anyway.
Same, well, earlier, but Warty was my first time going all in and not dual booting.
I don't game, so that doesn't bother me, though I read it works pretty well these days.
I've used nothing but Linux since, of various flavors. Use it for work, use it for home. Everything just works.
One thing I will note is that despite my general distaste for Electron, it's probably been the biggest recent accelerant for making this all easier. Slack, VSCode, Teams(yuck), MongoDB compass, Postman, etc all work flawlessly. Prior to it existing, we were stuck with Wine, open source alternatives in various states of disrepair, or missing out. I can't even remember the last time I had to 'miss out' because of Linux, I think it was some proprietary VPN blob around 2010 or so.
Games happen to be software, that hundreds of millions use, especially younger tech-oriented people. So yes, it IS essential if you want to make a PC people will use.
Games are particularly interesting though, because they're likely the only legacy software an average person might want to run, while also being the most likely to be closed source.
Anyway, desktop Linux keeps getting relatively better when you compare the increasing hell that a Windows user is subject to.
Besides proton/wine/VMs for gaming which are all very good, console/computer emulation for gaming is solid in Linux, including literal plug and play joystick support.
With Proton, Wine, Lutris etc you certainly can also game on a Linux if you want. I was just playing Path of Exile 2 on the Steam Deck (a Linux desktop)
I've been patiently trying a distro of Linux every other year for the past 20... Someday. The general trend: Things work well from a native install. When you start installing software and drivers, the OS gets totalled, sooner or later.
> What's worse is that software compiled for the current version of Linux X will not necessarily work for the current version of Linux Y. Linux distros insist that all the software must be compiled for their current releases or provided as source code. The problem with source code is that normal users won't bother to compile anything, and secondly, it's not always possible to compile software because it may depend on a specific compiler or dependencies that your distro doesn't provide
This is at the core of it, I think. Forcing a user to compile C code is a bad user experience; it often fails due to linking or other dependency-related errors.
> When you start installing software and drivers, the OS gets totalled
I have been using multiple Linux distros for over 20 years (and supporting other people) and this has simply never happened to me. Not even since I started using a distro that is supposedly less stable (Manjaro).
> Forcing a user to compile C code is a bad user experience
That is why very few distros expect users to do it. Are you thinking of Gentoo or similar? I think its good that there are distros that cater for people who want to compile software, but why choose one of those distros unless you actually want to compile software?
From the point of view of most users, Linux is very similar to iOS or Android - you install software from the "store".
> From the point of view of most users, Linux is very similar to iOS or Android - you install software from the "store".
I believe the comment you're replying to is stating that Linux (and its app ecosystem) is much more fragmented than iOS and Android. Each distro has a "store" of apps that only reliably work on that distro.
It's a "distribution" of software. People taking care to make a number of programs works together well enough on top of the kernel in order for you to use your hardware. But the build scripts for one works well enough if you're trying to use a less common distro (I shamelessly copied from Debian and Arch for a few software on Alpine).
I’ve been using Linux on the same timeline starting with RHEL3, then Gentoo, Ubuntu, and finally ArchLinux and I’ve had to arch-chroot from a live usb several times over the last five years to fix broken upgrades*. It hasn’t been more than an hour detour each time but it’s been far from painless. I just recently had to nuke and pave a Thinkpad X1 extreme to reset KDE/Plasma to a configuration that didn’t crash Plasma randomly or kernel panic when resuming from sleep (power management was especially bad for that old gen1 model until recently and I had to build a custom thermald or something). I’ve still got a problem with a monitor not working after power save - just monitor off, not even computer sleep - until I power cycle it. Some problems are even starting to crop up on my System76 Pangolin.
A significant fraction of the software I use is installed through yay’s extra packages which do frequently require building from source so it’s not just a Gentoo thing. It’s far from as seamless (or as limited) as Android/iOS app stores.
* and not even half of them can be blamed on NVIDIA
> A significant fraction of the software I use is installed through yay’s extra packages which do frequently require building from source so it’s not just a Gentoo thing.
Which is why I said "or similar". I actually had AUR in mind (as I use Manjaro).
It's not "supposedly": a "rolling distro" can't be stable because it has no stable ABI (upgrading a few dependencies is enough to change the ABI exposed to applications).
When your compiled binaries work one moment and then don't work after a system update, that's considered unstable and is one of the reasons people pay Red Hat to support old Linux versions for 20 years.
Some people think "stability", when it comes to distros, means the absence of crashes; it does not. It means interfaces not changing out from under you.
Could you give examples of what bricked your system when you "start installing software and drivers", or of when you were forced to compile C code as a user? Because this all sounds so alien to me; I've been using Linux for over 20 years and everything has gotten so much smoother over the years that all this criticism feels completely ignorant to me.
I used to run Ubuntu in a VM years before switching to installing a Linux distribution on my PC and the constant distro upgrades got on my nerves.
The way I started was by installing Arch Linux on a 32GB flash drive. No, I did not write the ISO onto the flash drive. I used VirtualBox to load the Arch Linux ISO, mounted the flash drive into VirtualBox, and installed onto the flash drive as if it were a regular HDD/SSD. The only annoyance was figuring out the bootloader for this particularly exotic way of installing Arch Linux.
After that I spent most of my time really just doing everything in a web browser for several months. I did this, because I wanted to have a less distracting computing environment where almost nothing was installed. The fact that games didn't run on Linux was supposed to be a feature, until I realized that they mostly run on Linux anyway. Then I bought a 512GB SSD and installed Arch Linux on it in 2018 and never even thought or cared about what Linux distribution I am using ever since.
But you don't install software and drivers on Linux, you edit your nix flake, switch and it always works. If it doesn't work up to your liking, you boot into the previous generation.
Also for some mysterious reasons, compiling C code under Nix always works and dependencies can't be missing.
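The declarative workflow being described might look roughly like this hypothetical minimal flake (the hostname `mybox` and the package choices are made up for illustration):

```nix
# flake.nix -- "installing software" is editing this file and running
# `nixos-rebuild switch`; a bad change is undone by booting the previous
# generation from the boot menu.
{
  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-24.05";

  outputs = { self, nixpkgs }: {
    nixosConfigurations.mybox = nixpkgs.lib.nixosSystem {
      system = "x86_64-linux";
      modules = [
        ({ pkgs, ... }: {
          # The whole installed-software set, declared in one place.
          environment.systemPackages = [ pkgs.firefox pkgs.git ];
        })
      ];
    };
  };
}
```

A real configuration would also declare filesystems, a bootloader, and users; this sketch only shows the shape of the declarative model.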
In my experience, Nix makes some things very easy and other things very, very difficult.
I hope the situation has improved, but the last time I had to deal with Nvidia (kernel/driver/library dependencies/updates/etc.) under Nix it did not remotely work out of the box and it was a huge pain. (And switching to AMD or intel was not really an option.) Doing anything off the beaten path seems to be challenging even if you have experience with Nix.
The arcane Nix language doesn't help matters.
I think immutable is the future, and I wish I could recommend Nix (and other immutable systems) to normal people, but at this point I think you really need to be a Nix expert/fan to set it up and actually enjoy it.
Sure, if I want to spend weeks and months of time learning how to:
* move my arms again, so I can learn how to
* push myself up again, so I can learn how to
* move my legs again, so I can learn how to
* crawl, so I can learn how to
* walk.
Cost of entry is a very real issue with NixOS and many/most of us are not interested in taking that time, so until it's easy / reasonable to use for most people's lived experiences, it'll sit on the backburner of Linux installations.
In my case I managed to get a working desktop in about 15 minutes without any prior Nix knowledge, just by following their tutorial; then about 16 hours to read/learn/experiment, 8 more hours to complete my environment, and 8 more to move my homelab to Nix. Since then I spend around two hours every six months updating and polishing my flakes.
That Nix knowledge has also helped me tremendously in managing a heterogeneous codebase of more than a million LoC.
This subthread started with “When you start installing software and drivers, the OS gets totalled, sooner or later.” Then came the objection that one wouldn’t do that with Nix. Then that yeah, but Nix is too difficult to use. Then that no, it isn’t that difficult after you spend a weekend or two learning it. Which isn’t for the average user. Now you say there are plenty of other options, presumably non-Nix. I feel we are back to where this subthread started.
I have been running Fedora as a daily driver for two years now. It’s even the Sway spin so it’s not the happy path. No issues at all. Been through at least one major version upgrade with zero issues. Maybe even two. It’s a total non-issue.
> This is at the core of it, I think. Forcing a user to compile C code is a bad user experience; it often fails due to linking or other dependency-related errors.
Huh? If you’re compiling your own code in a mainstream Linux distribution then I’m not surprised you have to nuke and pave.
It’s a distribution, meaning they build and distribute the software for you.
Flatpak deals with all the applications that don’t need to be built into the operating system itself.
> Forcing a user to compile C code is a bad user experience; it often fails due to linking or other dependency-related errors.
I've had no failures building major projects like CPython (multiple versions) as long as I RTFM and ensure that clearly identified build dependencies are installed first.
The main reason why Linux on the desktop has never gone mainstream is that it is extremely difficult to find a good laptop that comes with Linux preinstalled, where the OEM is responsible for ensuring a seamless experience, as any bugs in WiFi, audio, drivers, peripherals, etc. are the OEM's responsibility, not the consumer's.
99% of computer users will never consider replacing the operating system that came installed on their system, they don't want to risk bricking it or losing data.
The other question is why should any normal person want to run Linux as their primary OS except for philosophical reasons?
I was forced to use Windows for a year as a developer. I threw WSL2 on it and it was fine. I installed Docker desktop for Windows and it worked fine for Linux.
But especially on the Mac, while I do have to download updated versions of some commands, it is genuinely certified UNIX.
On both Macs and Windows I also get mainstream software support and on Macs, I get a laptop that can run 14-20 hours on a battery and runs cool and quiet.
> The other question is why should any normal person want to run Linux as their primary OS except for philosophical reasons?
it's free.
it's stable.
it's easier to use.
it doesn't have ads.
it uses less resources.
it runs on older hardware.
it doesn't contain spyware.
it doesn't violate my privacy.
it includes more powerful features.
it offers more alternatives for the GUI.
i can trust it because the kernel has been audited.
it is way easier to configure to my personal preferences.
it allows me to replace or remove any application that i don't like.
> Mac [...] is genuinely certified UNIX
why should any normal person care about that? UNIX is dead. Linux would be a genuinely certified UNIX if anyone would bother to pay for the certification. in fact, at some point someone did: https://unix.stackexchange.com/questions/293396/is-there-a-l...
You must be joking right? A step in any direction and you must use CLI.
> it doesn't have ads.
Neither does Windows.
> it uses less resources.
Proof required.
> it runs on older hardware.
Good luck using Firefox or Chrome on "older hardware". Good luck watching videos on older hardware that lacks HW H.265/VP9 decoding. Older hardware for what? For NAS? Why would you need desktop Linux on it?
> it doesn't contain spyware.
Prove that Windows spies on you. And also, Windows is used by NSA/CIA/FBI and pretty much all governments of the world some of which are openly hostile to the US. So much for "spyware".
> it doesn't violate my privacy.
See earlier.
> it includes more powerful features.
Examples or bust.
> it offers more alternatives for the GUI.
Windows has had CLI for aeons and now not just one but many including Bash (WSL).
> i can trust it because the kernel has been audited.
That's an abject lie.
> it is way easier to configure to my personal preferences.
And Windows doesn't need configuration as it just works.
> it allows me to replace or remove any application that i don't like.
What would you like to replace in Windows that's not "replaceable"? Examples or bust.
I could continue, but where have you been the past 10 years? Windows has been spyware since Windows 10. There are oodles of telemetry, and you need to have been living under a rock not to know about this. This seems appropriate:
By the way, long before Microsoft started blatantly spying on its users, they put a back door into Windows allowing them to push updates to Windows machines, even if you disabled updates. We know because they used it at least once:
When a company can push updates to a machine, it can change whatever it wants. Such changes can include giving themselves full remote control of the machine, if they did not have full control already.
I still have to buy the computer so an OS being “free” is inconsequential and what does “using less resources” get me? Does your computer run for 16 hours+ on one charge? Does it run fast, quiet and cool like my MacBook Air?
By the way you can buy a brand new M1 MacBook Air from Walmart for $699. These are still being manufactured by Apple and distributed via third parties.
As far as "trusting it" because it has been audited: the Heartbleed bug sat for years in OpenSSL, which Linux distros shipped, before it was discovered.
What “more powerful features” does it have than a typical Mac?
Exactly what are the specs of this $100 laptop? Am I then going to carry around 7 laptops so I can get 14 hours of battery life? Am I going to run a cluster of them when I need to do something worthwhile with them?
What exactly do you do “daily” with a $100 laptop?
From your past comments, I’m assuming you are developer, are you really using a $100 computer?
I'm currently posting from a Dell Inspiron 3505 which has an 8-core AMD CPU and 6gb of ram. I got it new for roughly $300 and from quickly looking on ebay can be had for under $200 used.
I'm waiting on the mail for a $70 Chromebook with 8gb of ram and a 2-core/4-thread CPU to replace this laptop because of a half-damaged charging port. I picked a Chromebook for the 10+ hour battery life, the same reason you chose a Macbook that costs 10x as much.
I use a terminal text editor to write code and I use a web browser for everything else. If it wasn't for browsers being so bloated and desktop applications made with electron, I would be happy to settle with a laptop that had even just 4gb of ram.
I have 16GB of RAM on a 2019 Dell I bought for $140 on eBay and I rarely get past 6GB (mostly browsing and reading PDFs, and the rare WhatsApp tab). It came with Windows 11 and it was a toaster for the one hour I put up with it. I'm not really keen on privacy, but I draw the line at waste and exploitation.
My MBA is much more ergonomic and user-friendly, but it's like living with an annoying roommate who insists on sleeping in the same bed and showering with you, instead of having an apartment all to yourself.
If I just needed to run a suite of microservices in Docker containers on a Chromebook, I would have gotten one for $170 instead of $70.
I'm not the person who posted about privacy - I consider chasing it a dead and fruitless end. Something will always be watching you. It's mere luck that preferring Linux happens to give me an environment that's mostly private by default except for anything involving the web browser.
Recently a friend of mine asked me to help him with his Surface laptop. He was unable to install Windows whatever he tried. I didn't believe him but it was exactly as he said - during the first boot the laptop would turn off and then show a generic error next boot. Ubuntu worked just fine.
What was the issue? Immediately after power-management initialization during the first boot, the system saw that the battery was in bad condition and tried to hibernate, but for some reason crashed instead. Out of warranty, and with a battery that is extremely hard to replace, a registry quirk injected through a Linux boot CD immediately after setup did the trick.
When I see articles like this, I believe the title should be "Linux is not Windows so I do not like it".
I have used Linux for just shy of 30 years. It is different than Windows and far more powerful. It is just that people do not want to learn different things. Plus Linux has no spyware.
These days, you can do anything on Linux that you do on Windows, except for some 3D gaming. And that is because hardware vendors refuse to support Linux the way they support Windows. It is the fault of video vendors, most notably Nvidia.
So off to my usual rant :) If I was the "ruler" of Linux, I would make sure no Nvidia GPUs would work under Linux until Nvidia opens up their GPU.
> These days, you can do anything on Linux that you do on Windows, except for some 3D gaming. And that is because hardware vendors refuse to support Linux the way they support Windows. It is the fault of video vendors, most notably Nvidia.
Even GPU drivers work fine for 90-99% of games on Linux. The biggest issue keeping certain games from running on Linux is actually anti-cheat support:
> And that is because hardware vendors refuse to support Linux in the way they support Windows
Because Linux does not support hardware (or at least new hardware) the same way Windows does.
Say AMD makes a new graphics card; compare the steps on Windows vs Linux to release an updated driver supporting it:
Windows - Update driver, release via Windows Update / Website etc
Linux - Send patches months in advance to the LKML using email, make sure any patches needed in Mesa are also sent in with a completely different system. Then, wait for your patches to be peer reviewed and land, then wait for a kernel release, then wait for distros to package said kernel and release it, which if you're on a distro that fixes to a kernel version, could be waiting 6 months.
> It is just that people do not want to learn different things
The differences from Windows versions 7 to 10 to 11 are substantial. To me it seemed like a consistent desktop experience from Linux Mint is much better for those who do not want change at all.
Agreed. Going from Mint 20 to 22, the experience in Cinnamon (/ Muffin / LightDM / Nemo stack) is a ton more stable, and there's a lot more readily available customization that's a lot nicer to do overall (notwithstanding the bits that are hidden in a binary file designed for `dconf`).
> When I see the articles I believe the title should be "Linux is not Windows so I do not like it".
I've used Linux for about as long, only I'm a tad more humble. I'm mentioned more than a dozen times in the kernel git log. Not a big achievement, but it kinda shows I'm not just a bystander.
You've never said how much time you've wasted fixing Linux or making software work. Must be a non-issue I guess.
I've wasted several weeks of my life on broken Windows Updates. Windows Update is a horrible shit show. Meanwhile updates on Linux are so benign I don't have qualms about updating. The worst thing that can happen is that you're thrown into a major version update of whatever software you've gotten used to.
> I'm a tad more humble. Mentioned more than a dozen times in the kernel git log.
Exceptionally humble, indeed.
> You've never said how much time you've wasted fixing Linux or making software work.
This comes up often and I'm always fascinated by the implied assumption that time isn't wasted fixing Windows or MacOS computers and making software work on those systems. The age-old "No, I will not fix your computer!" is not a Linux meme - in 99% of cases it refers to a Windows machine.
We can trade anecdata until the heat death of the universe, and in fact you've nicely listed many of the pro/con points in your articles. However having read both versions (current/technical and final), I must agree with parent. This is just a rant from someone who likes Windows more.
I also want to call out your "solving linux" section (in the technical article). You've listed basically the entire Ubuntu playbook (which is, btw, a UK company, not an African company). Mark "poached" Debian developers (DDs), poured millions of his own money into it, spearheaded many of the desktop-specific innovations (now supplanted by other software, but Ubuntu provided the initial push), created an app store, did an enormous amount of work popularizing Linux specifically for the desktop, was the first to start an "officially checked and approved" hardware list, etc.
And the results show even if we don't attribute them to Ubuntu today, and even if we're using alternatives: Linux desktop today is a much smoother experience than 20 years ago.
So, while I can sympathize with many of the points in your technical article (https://itvision.altervista.org/why.linux.is.not.ready.for.t...) even if I do not agree with them, the linked one really seems like distillation to a "I like Windows more" rant.
Could you be specific about what constitutes "a rant"?
And no I've spent far less time "fixing" Windows. In fact I don't remember doing that ever. I've had multiple cases when Windows stopped booting and in absolute most of them I just had to reinstall the system from scratch. That's all "I've wasted".
Linux on the other hand?
"It works for you everything is made up in the article".
Never mind how once I had to fix a glaring issue in the Linux kernel which rendered tens of thousands of systems unbootable and which took me over 24 hours of hard work to unravel.
Yeah, indeed I've never had anything like that in Windows in the 30+ years I've been using it. Never once has a Windows update rendered the system completely dead. And I'm not talking about my system; I'm talking about hundreds.
Lastly, I'm just a poor Linux kernel Bugzilla maintainer who actually oversees a lot of stuff and sees a staggering number of fixes in every stable Linux release.
I follow way too many bug trackers and LKML as well. It's all good, please disperse.
> And no I've spent far less time "fixing" Windows. [...] I just had to reinstall the system from scratch.
That's one way of dealing with it.
> Never mind how once I had to fix a glaring issue in the Linux kernel which rendered tens of thousands of systems unbootable.
You didn't have to fix it, you could've booted the old kernel. You chose to debug the issue and help. That's commendable, but do you think people (who work on Windows or hardware, or in IT departments in any organization) don't do that for Windows as well?
> And I'm not talking about my system, I talking about hundreds.
Is it tens of thousands, or hundreds?
> Lastly I'm just a poor Linux kernel Bugzilla maintainer who actually oversees a lot of stuff and sees a staggering number of fixes in every stable Linux release. I follow way too many bug trackers and LKML as well.
Well if you're Linux kernel bugzilla maintainer, it's natural that you see many more Linux bugs than Windows or MacOS bugs. Again, commendable that you choose to participate, help, and make it better! You choose to live on the bleeding edge, and so you bleed. But that's far from typical Linux desktop user experience.
> Could you be specific about what constitutes "a rant"?
You're obviously very passionate about, and involved with, Linux. But you also seem to hate it so I'm confused why do it? Buy a Mac, be happy, live and let live, etc.
Windows can be reinstalled, Linux bugs never go away by themselves.
> You chose to debug the issue and help.
No, people don't do that for Windows because Microsoft has much better QA/QC. Not a single Windows update since 1998 has rendered my system unbootable. Microsoft is certainly not perfect and Windows sometimes has regressions, but those normally affect very few people.
> Is it tens of thousands, or hundreds?
Windows installation user base: 2 billion. Linux: 40 million. Have fun with mental gymnastics.
> You choose to live on the bleeding edge, and so you bleed.
There's a huge number of bugs affecting "stable" Linux distros that use "stable" software including the kernel. Linux bugs are not specific to the bleeding edge but they are more prominent.
This is not a counter argument and I will not waste any more of my time with you. Try using the dictionary argument with your children, "You are a retard, my boy! Why? Look it up in the Merriam Webster dictionary, that's why".
You literally have to reinstall windows because of how broken it got, and that's somehow okay?
The amount of BSOD I've seen and unfixable issues after spending hours googling and browsing windows shitty support sites are just the tip of the iceberg.
Windows is utter trash and the only reason it won is because of the monopoly it established, forcing everyone to use it, hardware manufacturers to support it and once the dominance was established, you can't escape it.
MacOS is literally only used with hardware lock-in. People don't choose it. It's forced onto whoever chooses to overpay for an Apple computer.
The only reason Linux won't win the desktop is because it doesn't have a monopoly to support it.
These discussions that pretend any of this is due to merits of one OS over the other are moot. It doesn't matter at all when it comes to who or how many people use it. These are all determined by external factors.
> You literally have to reinstall windows because of how broken it got, and that's somehow okay?
Last time I did it was in the Windows XP SP0 days.
> The amount of BSOD I've seen and unfixable issues after spending hours googling and browsing windows shitty support sites are just the tip of the iceberg.
When was the last time you got BSOD? I've not seen them after Vista.
> The only reason Linux won't win the desktop is because it doesn't have a monopoly to support it.
"Win"? Start with software compatibility. There's no one Linux, there's a billion of incompatible distros that make it impossible to run old software.
> These discussions that pretend any of this is due to merits of one OS over the other are moot.
API and ABI compatibility are swear words to you, I get it.
I'm sorry if you don't care about those, the world will shrug you off. No other OS has this madness.
> by external factors.
Yeah, for Linux fans it's always someone or something else. The fact that Linux is a bug-ridden mess of software with no compatibility and no stability isn't an issue of course. It's because Microsoft/Oracle/NVIDIA actively interfere with Linux. Or aliens? Must be aliens.
I've been running Linux exclusively since RedHat 5.0 (no, that's not RHEL, that's more than a decade earlier).
This entire article is 1000% true. Every word of it. You can "opine" and "disagree" with it as much as you want. It's become too damn popular in the western world lately to opine about anything.
Feel free to opine about gravity, just don't fall off the top of a 10-storey building.
BTW this is why the entire world is running "ad-ridden"/"privacy-invading" Windows.
People run it because
1. Windows doesn't get in the way and allows them to run their software.
2. Windows is rock stable and doesn't break randomly after every upgrade (sans occasional conflicts with third-party software here and there affecting ~0.05% of people).
3. Windows is simple.
Linux has no implied compatibility, it's full of regressions, it's complicated as hell and it still has poor HW support. Yes.
People run Windows because it comes installed in their machines.
> Windows is rock stable and doesn't break randomly after every upgrade (sans occasional conflicts with third-party software here and there affecting ~0.05% of people).
What? I've lost count of the amount of times that I've had to restore botched Windows updates. It has gotten better lately, but it still happened more times than I've had to deal with update issues on my Linux installation. And I run my Linux near the bleeding edge where, in theory, everything should break all the time, right? Well, it doesn't.
My point stands. People don't choose their OS. It's chosen by the hardware manufacturer, their employers or their requirements.
I'm gonna repeat this because it's extremely important: people don't spend a single second choosing their OS.
If you want to game, you have to use Windows (yes I know about Proton and love it, but mainstream online games require Windows).
If you want to use your computer for work, you need Office. Even with Google Docs, people in my country simply default to the Office suite. Again, not because they choose it, but because it was already chosen for them by the monopoly that is already in place.
Aside from Office, you have all of the other specialized tools like the Adobe suite and many others that force you to use Windows again.
Windows doesn't have anyone competing with them. Their monopoly guarantees they can get away with anything. It's not a competition.
Windows 11 is a great example of this. Nobody asked for it. It's garbage with more ads in it. It wasn't built to improve the system for users, it was built to sell ads and force people to buy new computers. Being the only option allows them to do all of this. Windows simply doesn't need to add, change or remove features because of competitors since there aren't any.
Of course in the unicorn land of Linux cultists it "works" too, but then you open any Linux related forum and multiple threads about something not working anymore are posted daily.
I invite you to r/Fedora for example.
> My point stands. People don't choose their OS.
There's no point.
There's an actual OS, Windows.
And then there's a software compilation X version N that comes with no warranties whatsoever, it's called Linux Distro X version N.
People choose to run an actual OS with the ability to create and run third-party software for at least a decade, in the case of Windows for up to three decades.
Linux offers NONE of that.
No stability, no quality, no QA/QC, and no implied or enforced backward or forward compatibility.
Continue to OVERLOOK all of that and believe "But mom, Windows is preinstalled!"
I know a TON of people who have left Linux after a few years of struggling with it at every turn. They are all now running Windows or MacOS.
Reading your answers, you seem unhinged. Calm down, mate.
> Of course in the unicorn land of Linux cultists it "works" too, but then you open any Linux related forum and multiple threads about something not working anymore are posted daily.
As opposed to Windows where nothing ever breaks? That doesn't make any sense.
It's a known fact that Linux users report way more problems than Windows users because they are simply more technical, more in line with the open source spirit and dig deeper.
When I had issues on my Windows, it was extremely hard to find any solution online. Nobody goes to r/Windows for their troubles. From the few answers I've found, pretty much all of them ended up being "re-install Windows" or "re-install parts of Windows".
> People choose to run an actual OS with the ability to create and run third-party software for at least a decade, in the case of Windows for up to three decades.
Sorry but, that's simply not true for the average person. I'm not talking about the typical HN user here. The absolute vast majority of users *do not choose an OS*. Surely you can't argue with that, right?
> Continue to OVERLOOK all of that
No. I'm well aware that there are many things to improve on Linux (especially desktop linux), but my whole point is that it's utterly irrelevant until people actually get to choose an OS. Linux could have zero issues and work flawlessly everywhere and it would still have almost nobody using it outside of our tech savvy bubble.
If Linux as a whole was as bad as you say, it wouldn't be used in the majority of computers in the world. It's only desktop Linux that suffers from your complaints, and it's simply because it doesn't have companies with infinite money and monopolies behind it making it more seamless for the average user while forcing vendors to support it.
> It is the fault of video vendors, most notably Nvidia.
And even that's gotten better in recent years, too. Nvidia hasn't fixed everything, but two of the biggest concerns (not supporting Wayland and fighting community drivers) have begun to be addressed recently. It's beginning to look like Nvidia is taking desktop Linux seriously, albeit slowly. Most of these changes landed in the 555-series drivers, which are still pretty fresh, maybe even still in beta IIRC.
The Windows desktop is unusable shite, not the Linux one. OS X is also fine.
My Linux desktop machine does not crash. It runs whatever I ask of it boringly reliably. So does a laptop. That's been my experience for twenty years now.
My shiny new Windows 11 work laptop locks up about twice a day. Reboots if I hold power for long enough to start running again. Gives me no indication what is going wrong. Doesn't even give a blue screen with an error message.
Bluetooth usually connects after twenty minutes or so of repeated attempts. Accidentally plugging in a dock overrides mic/speaker settings, in multiple places, to ones that don't exist. It routinely demands reboots to update itself with no observable change in behaviour.
The Windows system isn't even stable enough to host VirtualBox; that upsets the network connection.
Windows definitely has its problems, but hundreds of millions of people use it daily without these kinds of issues. You’re doing something very, very wrong if that’s truthfully your daily experience.
Where I've had crashes have typically been with closed source drivers, usually nvidia. Often the fix was to change out the video card for one that works.
My retired parents would disagree, having happily used Linux Mint for years now. Whenever I need to use Windows while teaching in school, I wonder how people put up with a system that is so user-unfriendly, some would even call it hostile.
The main difference, I feel, is that with Windows you can tell the system is built with a company's interests in mind, while in Linux this feeling is absent, and instead, you can tell the user is at the center of attention. And even if the system may not be perfect, it constantly evolves for the better. Just this difference ensures I will never go back again, nor will I ever recommend anything else.
With Windows, the system is the boss; with Linux, you are the boss :)
I have only read section “1” but I already have more than one reply's worth of thoughts….
Windows software from yesteryear frequently requires an installer that will not run on modern Windows. 16-bit installers are very common. So, the situation is not quite as rosy as reported.
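For the curious, that 16-bit limitation is detectable from the file header alone: 64-bit Windows dropped the 16-bit subsystem, so anything whose extended header starts with "NE" (the 16-bit New Executable format) simply won't launch. A rough sketch of the check, using the classic MZ/NE/PE header layout (not production-grade parsing):

```python
import struct

def executable_kind(data: bytes) -> str:
    """Classify a DOS/Windows binary by its headers: "NE" is the
    16-bit New Executable format that 64-bit Windows can't run."""
    if len(data) < 0x40 or data[:2] != b"MZ":
        return "unknown"
    # e_lfanew (offset 0x3C) points at the extended header, if any.
    (e_lfanew,) = struct.unpack_from("<I", data, 0x3C)
    sig = data[e_lfanew:e_lfanew + 2]
    if sig == b"NE":
        return "NE (16-bit)"
    if sig == b"PE":
        return "PE (32/64-bit)"
    return "MZ (plain DOS)"
```

Running something like this over a folder of old installers would flag the 16-bit ones before you waste time double-clicking them.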
My main complaint though is calling Flatpaks “light-weight VMs”. If you are technical, you know what he means. If you are not, this leaves a totally wrong impression. Flatpak emulates no hardware. So there are zero “virtual machines”, light-weight or otherwise. Flatpaks run in a sandbox, but mostly that is just a feature of the Linux kernel. Flatpak apps run directly on the host kernel just like any other app. It is totally true that Flatpaks “waste” lots of space because the way they provide portability is to bundle all the libraries that the app may need.
If we are going to complain about Flatpaks though, let’s be honest about how older Windows apps work on modern Windows. They do it by bundling all the libraries that the app will need. If you try to run ReactOS, you will learn that often you need to install old versions of the C and C++ standard libraries ( along with other things ). It is true that Windows includes much of this “out-of-the-box” for compatibility. And guess what, you pay the storage space for that on Windows. There is a reason that Windows is 30 GB freshly installed. Linux does not make you pay that tax. Instead, you have to install that stuff the first time you install a Flatpak. If you install a second Flatpak requiring the same libs, they share the space—just like in Windows.
So let’s not dump on Flatpak for wasting space just because it waits until you need it to install all these support libraries. I do not want my Linux distro to require 30 GB minimum drive space just in case I install Space Cadet Pinball.
Do I like Flatpaks? No actually. I use a distro with a huge repo largely to avoid them. Do I think Linux backwards compatibility could be better? Yes, actually. But please, let’s at least be honest and accurate when we talk about these things.
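The space-sharing point is easy to demonstrate. Flatpak's runtime storage (via OSTree) deduplicates identical files using hardlinks, so a second app that needs the same libraries adds essentially nothing. Here's a toy sketch of the mechanism with made-up file names (this shows the idea, not Flatpak's actual code):

```python
import os
import shutil
import tempfile

# Toy sketch of OSTree-style dedup: one stored copy of a file,
# hardlinked into every "runtime" directory that needs it.
root = tempfile.mkdtemp()
store = os.path.join(root, "objects")
os.makedirs(store)
lib = os.path.join(store, "libexample.so")  # hypothetical library name
with open(lib, "wb") as f:
    f.write(b"\x7fELF" + b"\x00" * 1024)  # fake "shared library"

# Two "apps" need the same library: hardlink it into each, copy nothing.
for app in ("app-one", "app-two"):
    appdir = os.path.join(root, app, "lib")
    os.makedirs(appdir)
    os.link(lib, os.path.join(appdir, "libexample.so"))

nlink = os.stat(lib).st_nlink
print("hard links to the single stored copy:", nlink)  # 3
shutil.rmtree(root)
```

Both app directories appear to contain the library, but the filesystem stores exactly one copy of its bytes.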
1: I just don't see who these articles are written for. Linux is on the desktop, it's called Steam Deck, and millions are using it. So, the author is wrong here in both the spirit and the letter of the question.
2: The statement "Linux developers spend very little to no time checking that their code changes don't cause regressions or breakages" cannot be applied to all or possibly most developers. What packages is TFA talking about here? Maybe TFA means "Denuvo developers spend very little time checking that their code changes don't cause rootkits or kernel exploits - in Windows."
3: "Lack of general software" - TFA seems to be arguing that because software is not specifically compiled for Linux, it is therefore not "available", which is laughably false. My entire Steam library works just fine on the Steam Deck and in Debian. In fact, many of those games run on Steam/Proton when they could not be made to run on modern versions of Windows.
4: "Poor file and folder sharing" - I don't think TFA has ever tried Samba in an all-Linux environment, but I will grant that permissions across machines can be tricky. You know, like sharing folders between Windows XP, 7, 10, and 11.
5: "Lack of funding on the desktop" - Listen, I don't think IBM/RedHat are underfunded. I don't prefer Gnome Desktop, but it's perfectly functional, and funding doesn't seem to be an issue there. Maybe we should ask Canonical if they've ever funded desktop work.
6: "Hardware support and compatibility" - Listen, I'm using an HP 1012 laser printer. Microsoft stopped writing drivers for this perfectly usable printer in Windows 8. I'm using a modern GPU (Radeon 6700XT) that has drivers built right into the kernel. I'm using a scanner that was new when Windows 2000 was new, and I can tell you, it won't work in Windows anymore but it will for sure work in Linux. Once again, TFA does not give examples.
7: Ok, I'll give him this one. You can pay for commercial support for Linux if you want it, though! Canonical and Redhat would love to have your money.
Summing up, it seems like this guy is just complaining that Linux isn't a commercially supported OS, except it is, if you pay for it. Maybe he means "Linux isn't Windows." So, his entire premise is wrong, and trolly. And speaking of trolls, you'll see TFA's name in the ars forum if you search, and I'll give you two guesses what his posts there look like.
The Steam Deck is not the desktop and it’s really amazing that this line gets trotted out in defense of desktop Linux every time a criticism gets posted.
It’s like saying that FreeBSD is on the desktop just because Sony used it as the base for the PS5. These are specific scenarios with specific hardware optimized for each use case. Desktop Linux encompasses the sheer amount of available PCs and laptops that one might install it on, and there is a miles long list of “just about everything works except $x thing” articles.
I otherwise generally agree with your points and think the article itself is more troll-ish than it needs to be.
The Steam Deck literally has a desktop mode, and you can connect a keyboard, mouse, and monitor to it, and play games that way. Both my son and I play games using this method. The Steam dock was designed to allow users to take advantage of this mode, among other things.
You can install regular Linux applications in Desktop mode, including Libreoffice and Firefox.
It's probably fair to say the majority of users don't do this, but it's a completely accessible, usable, and supported Linux desktop. So, yes, the Steam deck is a Linux desktop.
Even gaming mode is a usable "Desktop" environment, usable with a virtual or actual keyboard, touchpad or regular mouse.
You can even install SteamOS on a regular PC! You might have to tweak it a bit to get your particular hardware supported, but you'll end up with...a Steam Linux desktop.
Even if we grant that point - which I'm not willing to do, since:
> It's probably fair to say the majority of users don't do this
You still missed the part where I noted:
> These are specific scenarios with specific hardware optimized for each use case. Desktop Linux encompasses the sheer amount of available PCs and laptops that one might install it on, and there is a miles long list of “just about everything works except $x thing” articles.
This is and has been the thorn in Desktop Linux's side for the last ~20 years.
Everything you're describing here is literally true of Windows.
How many desktop PCs aren't certified to run Windows 11? Can you hack it and make it run with esoteric commands and dubious registry edits? Sure, but then you're in the same spot Linux users are.
How much hardware do people throw away because manufacturers don't write updated drivers anymore? Right, you're back in the just about everything works but situation again, except in Windows.
The only real difference between Linux and Windows is that new hardware is better-supported in Windows, and comes with a shelf life, and older hardware is better-supported in Linux with a shelf life that Microsoft can't touch.
When you add in that Windows includes telemetry and data mining that Microsoft won't let you turn off and won't tell you what's included, I can't see why anyone would want Windows.
> These are specific scenarios with specific hardware optimized for each use case. Desktop Linux encompasses the sheer amount of available PCs and laptops that one might install it on, and there is a miles long list of “just about everything works except $x thing” articles.
> This is and has been the thorn in Desktop Linux's side for the last ~20 years.
The Steam Deck literally does not have these problems. It's the Linux Desktop, perfected!
> How many desktop PCs aren't certified to run Windows 11? Can you hack it and make it run with esoteric commands and dubious registry edits? Sure, but then you're in the same spot Linux users are.
> How much hardware do people throw away because manufacturers don't write updated drivers anymore? Right, you're back in the just about everything works but situation again, except in Windows.
A significant amount of goalpost moving is happening here. My original point holds true: general Desktop Linux, installed on $randomly-bought-general-class-desktop-or-laptop-machine, is not the generally solid experience that Windows is.
> When you add in and then Windows includes telemetry and data mining that Microsoft won't allow you to turn off and won't tell you what is included I can't see why anyone would want Windows.
I wasn't even debating these points with you. I don't think anyone should use Windows either, but here we are.
> The Steam Deck literally does not have these problems. It's the Linux Desktop, perfected!
It is a gaming machine that happens to run Linux, and you have done nothing to convince me otherwise.
Very trollish. The second such article I have seen recently on the HN front page.
What does "not ready for the desktop" even mean? Its been ready for MY desktop for over 20 years. Points 1,2,3 and 4 are pretty much wrong. Point 5 is a problem for other OSes too (there are bugs in applications!), point 6 is only relevant if you have particular hardware you want to work with Linux, and point 7 is not my experience.
Hardware support is so much better in Linux it's not even funny. I mean, you can't even install Windows Server on most PCs because there aren't any drivers! This trolling is beyond ridiculous.
Seconded.
Back in 2015 I got a Toshiba Satellite Radius convertible laptop.
X11 ran there even with touchscreen support.
Plus, USB3 ran smoothly.
Now I'm using a Dell, and again, full hardware support with zero issues using Wayland and Pipewire.
And I use Slackware, not exactly a sophisticated distro, but one that just offers upstream software with no modification.
And don't forget all of the devices that become obsolete under windows and still "just work" with Linux, like old scanners, printers, cards and adapters of all kinds.
I find it hard to understand why someone would take the time and effort to write something like this.
I can assure you 'linux' works just fine on the desktop (something I have done for a couple decades), and while there is some truth to the security and funding challenges, I find taking the article seriously difficult given the current alternatives.
Both Microsoft and Apple are openly trying to force you into their vision of computing, which is predicated on their control and profit.
I have machines that run Windows and MacOS, but those both get worse every year, while Debian (and other distros) get slightly better.
I can run many AAA games on Proton, have access to the best development tools available, and am granted choice everywhere.
I can't even log in with a local-only account on the latest version of Windows, and MacOS makes it harder and harder to install my own software (unless I pay them a tax via a yearly dev account payment).
Linux is closer to the way forward, we just aren't there yet.
I've been working on a cross-platform Windows/MacOS/Linux application for the last 2 years. All three of them have very significant pain points. Linux isn't ready for the desktop, but neither are Windows and MacOS in significantly different ways.
You can't, because it's a static executable whose only runtime dependency is the file system.
And if kernel devs would bless us with a stable graphics ABI, then we could do this for games, too. Unfortunately, the way drivers work currently relies on dynamic linking in a way that drags in a dependency on differences in distributions, which is completely unnecessary and maddening.
All I can say as a person using Linux exclusively for the last 10 years:
- You're using the wrong Linux distribution
- You're using the wrong Linux kernel version
- You have the wrong hardware
- You're using Linux incorrectly
- You're asking for too much
- Go debug or fix it yourself.
Edit: This is sarcasm by the way. These bullets come from the article.
In all seriousness though. I think another poster had the right idea. Linux requires a mindset shift. You must be willing to be a master of your own destiny. I've used Linux largely problem free for 10 years. I took some time to learn how to do things "the Linux way" but once I did it's been smooth sailing.
I recently have had to use Windows for work and it's annoying as hell. Every day after lunch one of my monitors just turns off and rearranges all my windows. On Linux, I could dig into the problem, and worst case just hack a solution together with an xrandr script. On Windows I'm just SOL, staring at dead Windows message board threads.
>and Stack Overflow's "Question Marked as Duplicate"
Based on my experience on Meta, and various random Reddit threads where people complain: in like 90% of cases where someone complains that a Stack Overflow question was unjustly marked as a duplicate, I could go take a look and then write out a clear step by step explanation of not only how it is very obviously a duplicate by Stack Overflow's standards, but exactly how to apply the advice from answers on the target to the original problem.
Even if I didn't previously know anything about the technologies involved.
Please keep in mind that it is not the purpose of Stack Overflow to solve the problem you are currently encountering. The purpose of Stack Overflow is to answer a question that can be useful to others. If your program doesn't work, it is explicitly, by policy, your responsibility to debug first; the point of creating a Minimal Reproducible Example is to demonstrate that you have found the problem and have a legitimate question about why it is a problem.
Someone who closes your question as a duplicate is doing you a kindness. You immediately get answers and don't have to wait for someone else to write one. Closing duplicates is also doing everyone else a kindness, by helping to make sure that, when people use a search engine to find an answer on Stack Overflow, they all end up at the right place with the best-stated version of the question and the best answers for it. In many cases, the person who identifies a duplicate has gone out of the way to figure out the problem (despite an unclear explanation or an attempt to sneak in side questions about unrelated matters).
It's a shame you can't downvote submissions here, and flagging is not appropriate. We've heard these tired arguments a million times, if all of this gets fixed, the author will find another reason why he can't or won't switch.
Anecdotally, over the last 15 or so years I've been a professional programmer the number of fellow devs I've seen using Linux as the primary OS of choice on their dev machines has gone down considerably. Most are on macOS now, and I'd say even Windows has greater usage than Linux in recent years. Docker I think is the primary reason for the switch. You can develop anywhere and be reasonably certain that your application will work the same on a Linux server. I'm not sure if there even is a target market for Desktop linux anymore.
I don't agree with the author, but fair enough, he makes some good points. Where he really loses me though is in section 7. Now maybe I've just been miraculously fortunate, and found the best part of the Linux/Steam Deck community, but I've literally never been told to fix it myself, that I'm expecting too much, etc. What I have found is that people are almost overwhelmingly friendly and eager to help, they see a new person entering their beloved realm of Linux gaming and they want you to fall in love like they did.
I've been running Linux on my desktop since 1994. I've never actually run Windows (or DOS) as my main computer. From the first time I saw commercial Unix in 1991 I knew that was what I wanted. I have about 15 computers running Linux.
The author mentions non technical people and I will happily say that I'm a computer engineer designing semiconductors. Linux may not be the right OS for billions of people out there but it works perfectly fine as a desktop for a few million.
The reality is most people are well served by either Windows and/or macOS.
Both platforms have received billions of dollars in development from MS and Apple over the years, and both have extremely rich and mature app ecosystems. For most people, there's no real need or desire to use anything else.
That's it.
It's not an issue of drivers, games or app compatibility. If there were serious demand for Linux on the desktop, there would be good money to be made there and these issues would get fixed pronto.
I've been using various forms of linux desktop since 1998, now I have settled on arch/manjaro/lxde, I can get away with a cheap machine with only 4 gigs of ram and still do everything I want to with it. I use it for pro audio, writing and coding. For employers for a long time they would provide me with windows, and then in the 2010s that switched to Mac but both of those are intentionally bloated today. Linux Desktop is good for the environment.
I'm using Linux as my main desktop for so long now (>15yrs) that I probably wouldn't be able to handle a modern Windows system anymore. I have not touched any Windows installation for years, and every time I have to help my wife with some issue on her Windows laptop I just find it crazy how people can tolerate that.
It might not be on the same quality level as MacOS but I like the freedom of choice Linux gives me.
1) Lackluster hardware support (not entirely Linux’s fault, but the distros could have a qualified vendor list, for example)
2) Incomplete UI for the OS. Not a single time in the past 20 years did I set up a Linux desktop without the need to spawn a terminal. This is completely unacceptable and user-hostile. None of the other platforms (even Android) require this.
When I think about all of what I do in the CLI and what would be needed to have a GUI, even a basic beginner version, I'm not surprised no one is doing it.
I believe that computers are like cars and should be treated as such. Driving one is easy, but any maintenance requires being knowledgeable about how it works. If you don't have an IT dept, go grab a book and learn how to use the computer.
Points 4 (file sharing) and 6 (hw compat), and part of point 1 (package distribution), are valid critiques. The rest, not so much.
The main reason Linux hasn't proliferated on the desktop is that businesses don't use it on the desktop and that's because they prefer to pay for a license so they get support if something goes wrong, and because Windows has always dominated business software (much of which isn't even available on other platforms).
The second reason is because Mac blows most everything else away when it comes to laptops; it's almost rare these days to come across someone that doesn't have a Mac as their laptop (unless it's a work-issued laptop in which case it's a Dell or something like that, or they want a budget laptop).
Linux is not likely to be mainstream for the desktop because there's not really a business model that drives it like there is for Windows and Mac. And that's fine. I use Linux as my daily driver for work (by choice), though for my personal laptop I have a Mac.
I actually just installed Ubuntu desktop on a spare mini PC I had to try out some cpp game development, and it is a surprisingly nice experience! I prefer it over my windows PC which seems plagued with weird occasional glitches. I still need windows for some things like CAD tools, but I think I'll stick with Ubuntu for most things!
This article could be titled "What's wrong with the Linux desktop" - but the current title is plain stupid. Plenty of people are using it as their main OS, mainly because all the alternatives have either privacy issues or some dumb gilded-cage limitations - or both. I don't think the author is dumb - it's a catchy title born of hubris.
I don't deny having my own biases, but I feel like I could write an equally multi-pointed article entitled "Why Windows is not ready for the desktop." Linux certainly has its compromises but not any less than the more popular options, depending on your perspective and priorities.
The constant struggle is the lagging support for the best hardware. Unless you want to buy a machine from a dedicated Linux shop (Framework, System76, Tuxedo), you are taking a risk buying anything remotely new without a lot of research into the hardware and hunting for posts about people who have had success. Those dedicated Linux shops are usually at least a generation behind too. It doesn't help that architectures are evolving rapidly in response to Apple's chips (not a bad thing long term).
Just bought an HP omnibook flip with an Intel 258v. From what I've read I expect most things to work, or work soon with kernel updates. All applications I care about work well. Would be nice to see OpenAI make a desktop app for Linux, or some other open source equivalent.
If the vast majority of applications and games are built for Windows, and that means I shouldn't use Linux, I suppose the same is true about this Macbook I'm on right now. You know what, I'm tossing it out the window the second after I post this message.
1. This section is basically pedantry. "You can't use the same software across multiple Linux distros." OK? And PC software doesn't run on a Mac. They are different OSes. "Linux" being confused as its own OS is just, whatever.
2. Windows and Macs have plenty of bugs too. So much so that we don't do in-place server upgrades, it's a full rebuild each version. I mean look at the Windows 10-11 upgrade situation, it's not great.
3. Lack of general software and games - What? You can play AAA games on Steam on Linux out of the box for the most part. I will concede that industry specific software is likely going to be Windows only, but you can run Windows in a VM. There are options here.
4. Network file sharing isn't any different than on Windows, this is just nonsense.
5. Lack of funding - hand wavy and again, whatever.
6. Hardware support - I haven't really had hardware issues in years. Every Lenovo laptop I've installed Linux on works flawlessly, even Wifi drivers are present in setup.
7. Nonsense again. The Linux community is super helpful.
I built a new PC in the fall of 2022 and set it up to dual boot Windows and Fedora. I haven’t booted the Windows partition in over a year. Fedora does everything I need from a desktop, including Steam.
I'm surprised no one has mentioned atomic Linux distros yet in this thread. The really hard thing here is that people aren't all talking about the same things. My experience on Arch isn't my experience on Pop. Things on Pop are amazing on my MSI rebuilt PC with Nvidia GPU. I don't even know if I really need to upgrade to NixOS except to satiate my curiosity.
> Countless software titles in Linux have a huge number of bugs and missing features. That's because Linux is severely underfunded on the desktop. While Linux is unrivaled on servers and has been the world's most popular operating system for over two decades, the situation on the desktop is quite bleak.
Applications for Linux are underfunded. Linux itself is one of the few open-source projects that attracts enough money that a significant number of people could in principle actually be employed and properly paid. (Although a huge percentage of Linux Foundation money is going to things that... are not the kernel.) Funding Linux more isn't going to fix those software titles, because they're developed independently. Even desktop managers etc. aren't the Linux Foundation's responsibility. It's not like imagining a world where Edge is developed by a separate company that isn't Microsoft; imagine if Windows Explorer were separate from Windows proper.
Linux won't be popular on the desktop because normal people don't change the operating system on their computer. They either don't know how (because they don't need to) or don't care because everything works fine.
I think Chromebooks will continue to get larger market share, as plenty of people just need a web browser, but that's about it.
What are we comparing this to? With the exception of Windows software being able to run on Windows, you could make every one of those claims in spades about Windows. I would recommend OP go administer an enterprise Windows network and stare into the abyss like I did during my stint in the corporate world. And I won't be doing that again.
I think Linux is not ready for desktop the way Windows is. That's for sure. But I have been daily driving it for 10+ years and recently casual gaming as well. And I can tell you that it can handle a lot of things these days. Especially for general casual users and certain specific niches.
If people are still asking "Which distro should I use?",
I wrote a Linux distro recommendation framework article for beginners based on my experience.
The idea is that if you don't like my recommendations, I hope you can find a distro that meets your needs with the knowledge in the article. Something that I wished I had when I started with Linux.
A relevant discussion would be if an OS built on Linux such as Fedora workstation, Ubuntu desktop, RHEL or even ChromeOS is ready for general desktop use. But that would be more work because you would actually have to test and evaluate different features and eco-system support.
Are Dell’s like this encompassed by that article? https://www.dell.com/en-us/search/linux … What about System76? It seems like the article makes blanket statements that might apply if you’re building your own desktop and expecting tons of software to work … for free.
Would it be right to use the above devices with no additional installs as a benchmark? And then just compare ongoing updates & upgrades to macOS & Windows.
It seems like all modern OSes ship with the core apps you really need. Yes, if they do that you might have some bloat; and need or want to uninstall. But maybe the best OS is as opinionated as the author…
The “desktop” in “desktop Linux” to me refers to the category of software environment running on the device, not the device’s location in the physical world. The categories are “server Linux” (standard Linux kernel, traditional distro user-space, text based user interface via ssh or similar), “desktop Linux” (standard or realtime Linux kernel, traditional distro user-space, GUI login system with windowing), “embedded Linux” (similar to server Linux but almost entirely without a userspace) or “Android Linux” (Android kernel, AOSP Android user space, Android GUI and app system).
It’s running the “KDE Desktop Environment” on top of Arch Linux distro. So, that’s definitely desktop Linux in my view.
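That taxonomy can even be written down as a (deliberately simplistic) classifier. The inputs and cutoffs here are my own illustration of the categories above, not any standard:

```python
def classify_linux_system(android_userspace: bool,
                          interactive_userspace: bool,
                          graphical_session: bool) -> str:
    """Map three distinguishing traits onto the four categories.
    Purely illustrative; real systems blur these lines."""
    if android_userspace:
        return "Android Linux"   # AOSP userspace, Android app system
    if not interactive_userspace:
        return "embedded Linux"  # kernel plus almost no userspace
    if graphical_session:
        return "desktop Linux"   # GUI login system with windowing
    return "server Linux"        # text UI over ssh or similar
```

By this rubric a Steam Deck running KDE on Arch is `classify_linux_system(False, True, True)`, i.e. "desktop Linux".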
1. How many people use Steam Deck or Asus whatever as a desktop OS for running actual productivity applications, doing work, printing and scanning, installing and removing apps, and not just running games curated by Valve?
2. Can you install an average Linux distro on the said two devices and have them work flawlessly? Nope.
All I’m saying is that the position “desktop Linux can’t into games” is very, very wrong in 2024. I wouldn’t pick desktop Linux for productivity use because I need a laptop, and Linux doesn’t seem to have figured out power management or reliable suspend/resume yet. Sleep power management is the Steam Deck’s biggest weakness - a day asleep will drain the battery.
> All I’m saying is that a position “desktop Linux can’t into games” is very very wrong in 2024.
Ping me when you find a way to play modern online Windows games with a kernel level anticheat.
Oh, and people don't care that Linux can play games from 2015. People are interested in AAA titles and something that OTHER people play. Leave obscure indies to yourself.
Linux runs (emulates!) Windows games just fine, except when it does so horribly or doesn't do at all.
Yeah, great, kudos to Codeweavers (a commercial company) and Valve (a commercial company). The Linux community has done barely anything to make running Windows applications under Linux a reality.
The one 2024 game release I wanted to play but couldn't at release on my Deck was Star Wars Jedi: Survivor (although recent reviews say it runs great now). Every other 2024 game I bought worked flawlessly. Maybe kernel anti-cheat is important to you, but I haven't wanted to play any such titles ¯\_(ツ)_/¯.
You are welcome to "no true scotsman" and "move the goalposts" about this but I still think it's a silly position to hold that "desktop Linux can't into games".
> the core issue is that [Software] is in a constant state of flux. Regressions are introduced all the time because [Software] developers spend very little to no time checking that their code changes don't cause regressions or breakages outside of the problems they're trying to fix or features they're implementing
There, I fixed it for you.
This is not specific to Linux. See Microsoft, Mozilla, Google.
Linux desktop is fine since the first release of XFree86 for it.
As always, it depends, right? I'm perfectly happy with Linux and use it both for personal tasks and work. Office is in the cloud these days so that's covered. Our designers still use Windows or MacOS though.
For me, 2024 was the year I finally got completely rid of Windows and switched fully to Ubuntu on all computers. I had kept Windows for playing Steam games, but I realized Proton will play them just fine on Linux when you enable it in the Steam compatibility settings.
It feels really good to get rid of WSL, Docker in VM and other hacks that kind of make Windows work for development, but always cause various issues. Developing natively on Linux just lets you forget all those hacks.
One thing I also did was switch from NVIDIA RTX to Radeon RX. I haven't had to worry about GPU drivers and settings and heat issues in Linux after that.
The security of desktop Linux is just not there. Flatpak is a thing, but most users won't bother, and they run all their desktop apps under the same uid with zero auditing or sandboxing. There is just no easy-to-use solution, and none of the existing mechanisms are enabled by default out of the box.
One may argue that running an application means you have to trust the developer/distributor/etc. But no, I don't trust a single binary because I don't and won't have a full knowledge of what it's doing.
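The same-uid point is worth making concrete: without a sandbox, any program you launch can read any file your user owns, including other apps' private data. A toy demonstration (the "secret" is a stand-in temp file, not any real app's config):

```python
import os
import tempfile

# Stand-in for another application's private data (a hypothetical
# config file, not any real app's) -- owned by the same user as us.
secret = tempfile.NamedTemporaryFile(mode="w", suffix=".conf", delete=False)
secret.write("token = hunter2\n")  # made-up secret
secret.close()

def nosy_app(path: str) -> str:
    # A completely unrelated "application" run by the same user: the
    # kernel checks the uid, sees a match, and hands over the file.
    # No portal, no sandbox, no audit trail.
    with open(path) as f:
        return f.read()

assert os.getuid() == os.stat(secret.name).st_uid  # same user, same access
leaked = nosy_app(secret.name)
print(leaked.strip())
os.unlink(secret.name)
```

This is exactly what Flatpak's portals and filesystem holes are meant to constrain - and why "everything under one uid" is a real gap when they're not in play.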
Personally I think immutable distros are the future on the desktop for the average user, so yes not for anyone reading and posting here on HN. Fedora Kinoite is excellent for example.
Linux has a stable API for gaming now and there are many AAA games available that target it. It is the Win32 API and DirectX. There is even kernel level support for some of it now.
There's one answer that I would give that's not present on the list - Linux is free, so you don't get to complain if there's something not working properly. You did not pay for it.
With Windows, that's a completely different story, and yet you still can't really complain; my tickets to Microsoft were resolved in ~2% of cases.
"Linux is not ready for the desktop" -- from a usability or technology point of view -- is an idea with zero fundamental merit and it's rather brain-rotty to look at what Linux can do on a desktop through this particular lens.
There are already many Linux "desktops."
A question with merit would be: How can we collectively best bring the advantages of Linux to broader audiences.
Your mileage may vary, but given the choice between other desktop operating systems and their gradual decline from simply unfriendly to practically unusable, Linux is the least frustrating option. Computers with Windows pre-installed have hurdles from step one, where they are pushing you to create online account and even that flow is not straightforward.
I wonder who he keeps running into. Not only do I use Linux on a daily basis, I run about 20 virtual *nixes from that Linux desktop, and everything runs really great. When I do run into problems, the Arch community (i.e. colleagues, forums, mailing lists) has been nothing but extremely helpful and friendly. Without exception.
Some people really hate it when they ask a question that's answered in TFM and get a link to an entire TFM page in response. There's a huge demand for "please help me understand exactly how this applies in my exact situation", especially among less technically minded people.
The Mint forums are helpful enough, but they tend to demand tons of ultimately irrelevant system info up front, commonly misdiagnose problems, and give blanket advice that often ignores more detailed concerns. (Also, the site has been very unstable lately.)
Android is a repudiation of the traditional "distro" Linux userspace. I think it's the Android approach that has the best chance of reaching mainstream adoption with consumers in the laptop form factor.
I'm not saying "Android on a laptop is the way of the future". I'm saying the Android model, sweeping aside the status quo distros and starting fresh with a new approach to userspace, is the path I see as most likely to bring Linux to the masses in the laptop/desktop form factor.
If the people you see using Linux on the laptop are developers, then I don't think they count as "consumers" - they're on the production side of things! I don't know any consumers who use Linux on a laptop. I last gave it a shot in 2018 but decided it's not for me.
In my experience, you are still much more likely to get broken hardware support when updating kernels on Linux, though I have no idea why. I almost never see or experience things like a laptop camera not working at all after an update on Windows, but it does happen on Linux. The same goes for GPUs, which can break if you update your kernel often.
It's not necessarily the kernel's fault, but it's something that does happen often on distros like, say, Fedora.
I think the main difference is that Linux supports and runs on almost everything, but in a lot of use cases a specific version of the kernel will be used for a product's lifetime (for embedded products), or every update will be very heavily curated by a third party like Red Hat. In those cases, Linux is rock stable, far more than Windows can be.
But for regular, personal usage, I genuinely think that Linux does break more often.
I don't disagree at all. I'm completely allergic to ads at this point, hence why I use Linux a lot more :). But let's be honest: for the average user, the choice is super obvious. Yes, Windows shows you ads in the start menu, but it "works", and consistently so. They don't need to worry about a Windows update borking their GPU drivers, or about having to use GRUB to boot into an older kernel because some proprietary driver stopped working after an update.
I personally know enough to fix or even prevent those issues, but for the vast majority of desktop users, even less casual users like gamers, they really don't care about the stuff you listed. It's sad, because it absolutely ruins the experience of what would otherwise be a great OS (the core of windows is great imo, just not everything on top of it).
That is somewhat true. The state of AMD's video drivers is not good on Windows and atrociously bad on Linux. NVIDIA's drivers are barely of acceptable quality on Linux.
And that's just when considering core system components. But non-technical users will expect any odd-ball peripheral they pick up at Office Depot or Best Buy to work as advertised out of the box, and it probably will not.
> Compare this to Windows 11 64, where the vast majority of software titles released in the last 30 years run almost flawlessly.
Modern Linux can still run binaries that were compiled in the 90s; you just need copies of the libraries they use. For something that old, you probably want to dump all of the libraries into a directory somewhere and set LD_LIBRARY_PATH to point to it, since you are likely not getting the older versions from your distribution repository, and that is okay. Needing to run old binaries is the exception rather than the norm. Keeping legacy libraries around that are not used by modern software, just in case someone tries to run a binary from the 90s, is one reason why Windows is so bloated.
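A minimal sketch of that approach; `./legacy-libs` and `./old-app` are hypothetical names, and the old `.so` files have to come from somewhere (e.g. the original distribution's archives):

```shell
# Collect the era-appropriate shared libraries in one directory...
mkdir -p ./legacy-libs
# ...copy the old libc, libX11, etc. into ./legacy-libs here...

# ...then tell the dynamic linker to search that directory first.
LD_LIBRARY_PATH="$PWD/legacy-libs" ./old-app
```

Since LD_LIBRARY_PATH is set only for that one invocation, modern software on the same machine never sees the legacy libraries.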
> QA/QC, bugs and regressions
Here is one of many examples that show Windows is not a bug/regression free experience:
There is a huge amount of software for Linux systems. If you want to talk about games, which is evidently what you mean by software, you can even run most Windows games on Linux:
> Poor file and folder sharing situation on the local network
> Linux doesn't offer a native technology similar to Windows file sharing that is easily configurable, discoverable, encrypted and password protected. What does exist in Linux, Samba, is quite painful to set up shared folders, even more so in distros that use SeLinux (advanced security mechanism), create users, assign and change passwords, and not worry about folder permissions.
There are NFS, SSHFS and others. While I do not use either NFS or Samba, it is fairly simple to set up Samba shares, or NFS shares with ZFS.
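For what it's worth, a minimal Samba share is only a few lines of `/etc/samba/smb.conf` (the share name, path and user below are illustrative), plus `smbpasswd -a alice` to set a share password:

```ini
[shared]
   path = /srv/shared
   valid users = alice
   read only = no
   browseable = yes
```

SELinux-based distros do add a step (e.g. labeling the directory with the `samba_share_t` context), which is presumably the pain the article is referring to.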
> Lack of funding
> Countless software titles in Linux have a huge number of bugs and missing features. That's because Linux is severely underfunded on the desktop. While Linux is unrivaled on servers and has been the world's most popular operating system for over two decades, the situation on the desktop is quite bleak.
I thought you said "Linux is not an operating system". In any case, if you want more funding going to desktop use cases, use MacOS.
That said, the Linux desktop has been a fairly good experience for a number of years.
> Especially when it comes to brand new hardware, you may find that your laptop's Wi-Fi network card, webcam, sound, and even keyboard do not work.
I can believe most of those, but not the keyboard outside of truly exceptional situations that are not worth mentioning. As for the rest, issues with hardware compatibility have been rather rare for some time. When I used Windows, you needed to fetch drivers from hardware manufacturers for everything. Off the top of my head:
* I needed a SATA driver since SATA was made after Windows XP.
* I needed a NIC driver since my NIC was made after Windows XP.
* I needed a graphics driver since my graphics card was made after Windows XP.
Things became better with Windows 7, only to gradually become worse over time as newer things came out that postdated Windows 7. Having nearly all of the drivers in one place is awesome. Even if you need to update the kernel, this is generally fine, since newer kernels are backward compatible.
That said, most hardware is supported by Linux these days. It is a very different situation than it was in the 90s.
> You will have problems with Linux, I can promise you that.
I can promise that you will have problems with any non-trivial OS. You just learn to handle them over time. Then when you try something new, you will complain about every new problem you face as if there were not any problems in your old OS.
How is it that Linux is ready for huge swaths of industrial applications and yet it somehow isn’t ready for the desktop? Personally, I have been enjoying “Linux on the desktop” for years now, and I find the distribution that I use exceedingly user-friendly (in its default configuration anyway); hell, it’s less confusing and annoying than Windows (not sure about how it compares with the recent versions of MacOS, an older one drove me crazy enough I wiped it out and installed Linux in its place); not to say anything about the trustworthiness…
For example, Windows or Android software from ten years ago will still work in Windows 11 or Android 15 or whatever their current versions are. For Windows, software compatibility is actually excellent: a lot of 32-bit Windows 95 software still runs perfectly on Windows 11 64 almost 30 years later. Nothing remotely close exists for Linux.
I completely disagree.
Android applications made for older versions DO NOT work nowadays. Windows applications often do not install on current versions; to run them, you have to install them on an old version and upgrade. Backward compatibility is no longer a key point for OS makers, and I would say they even want to break it somewhat (like Android's new permission model and the removal of old apps from the store).
This is untrue. You can run old Linux binaries on modern Linux systems if you have the older libraries. You want to set LD_LIBRARY_PATH to point to where the old libraries are located.
By the way, any Android applications using Google's old messaging APIs no longer work and need to be rewritten to use Firebase. I know people who were bitten by that multiple times, since Google has had to keep reinventing push notifications.
Use the other system's ELF interpreter (which is conveniently also a library). Then it should work. Saying the ability to run older binaries does not exist is just wrong.
By the way, when someone asserts that something is always true, it only takes one counterexample to disprove it. The assertion that you cannot run old Linux binaries on a newer system has been disproven by a counterexample.
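To make the counterexample concrete: every dynamically linked ELF binary names its interpreter, and that interpreter can be invoked directly with a chosen library path. The sketch below uses the running system's own linker and `/bin/true` as stand-ins, since it does not assume an old distribution tree is on hand:

```shell
# Read the interpreter path out of the binary's ELF program headers.
INTERP=$(readelf -l /bin/true | sed -n 's/.*interpreter: \(.*\)]$/\1/p')

# Invoke the interpreter explicitly; --library-path substitutes for
# LD_LIBRARY_PATH and would point at the old libraries in the real case.
"$INTERP" --library-path /usr/lib /bin/true && echo ok
```

With an actual old binary, `$INTERP` would be the old distribution's `ld-linux` and `--library-path` its library directory.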
> Saying an ability to run older binaries does not exist is just wrong.
For 99.99% of users this "ability" does not exist. End of story. Period.
I don't care about hacks on top of tricks on top of hacks. People are not interested in this madness.
Every second comment in this discussion comes from a hardcore software developer who desperately tries to vindicate core Linux flaws by providing workarounds that would work only for people like you.
This is NOT how the world works. This is NOT what people care about or are interested in. If cars, refrigerators, or microwave ovens worked like that, no one would buy them.
Then you acknowledge that your previous claims about the capability not existing were wrong. You also contradict yourself by saying elsewhere that mostly geeks run Linux, which implies far more than 0.01% of its users can do this. It is not a hack either; it is the way you are supposed to run old software by design.
I run Gentoo, not Ubuntu. I have run graphical software from Ubuntu on my machine in a chroot that spoke to my X11 server in the past. I did it to get a Windows game running whose anticheat had Wine support that was incompatible with glibc versions newer than 2.32. A similar process should work with a binary from an old Linux distribution.
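A rough sketch of that chroot setup, with every path hypothetical (it assumes the old distribution's root filesystem is already unpacked under `/srv/old-distro` and an X server is running on display `:0`):

```shell
OLDROOT=/srv/old-distro                         # hypothetical unpacked rootfs

# Expose the host's X11 socket inside the chroot...
sudo mount --bind /tmp/.X11-unix "$OLDROOT/tmp/.X11-unix"
xhost +si:localuser:root                        # ...and let root's clients connect.

# Run the app against the old tree's glibc and libraries.
sudo chroot "$OLDROOT" env DISPLAY=:0 /usr/bin/old-app
```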
That said, you refuse to recognize that you are wrong no matter how much evidence is presented and even go so far as to present evidence debunking what you say while claiming it supports what you say. If I were to run a graphical application from the 90s on my machine, you would accuse me of making up the result. Since you will say the same thing even if I did this exercise, why should I bother?
Correct, there are some GNU/Linux-en, and there is Android/Linux.
Android may not be a classical desktop OS, but some platforms, like Samsung phones, support floating window management and multitasking, making it an okay desktop OS.
And Android has a huge market share, so perhaps "Linux won over Windows" already, just not on the desktop.
Did the author do any fact checking? Proton (developed for SteamOS) runs so many games it is unbelievable.
Absolutely. Current Windows is truly horrible: ads, ads, ads, crash :( If I were designing something to be deliberately distracting, annoying, and confusing, it would look a lot like Windows!
Yeah, I get that this is called out under business / professionals, but I've also used Ubuntu (more recently Pop!_OS, and Fedora before that) as a daily driver for >a decade and given small-form-factor PCs with Ubuntu to relatives who have successfully used them for web and basic LibreOffice work for years without major issues.
When people ask me for recommendations for a family PC these days, I tell them to just get an iPad or Mac Mini though.
Is there even a Linux community that could express what it wanted from desktop Linux in a unified, actionable message? I rather doubt it. I mean, the fact that GNOME and KDE still exist as separate entities after 20+ years seems evidence enough.
Some people seem to want a deep-pocketed daddy to come and fix it, but even if such an entity existed, I fail to see how they would achieve any return on their investment. And the community is so fractious/holds such high standards (take your pick) that whatever they do is sure to be criticized from the first byte to the last. It's a rare PM who would relish that fight inside the executive suite and out on the mean streets of the mailing lists.
Other entities? Eh, not so confident. Bodies like the Linux Foundation feel like places where projects go to die (corporations pay a small tax to get a Gold badge, dump some deadweight code, take some tax relief, and burnish a resume). Not seeing any mileage here.
Smaller UI shops like Zorin and PopOS? Doing the Lord's work cleaning and polishing, but in the absence of a clean and stable set of interfaces, it's all very fragile.
HW vendors have no incentive to care, since desktop Linux is a tiny market. (I find the requirement that desktop Linux must work on anything mystifying, because there are probably only five PC makers that really matter in terms of volume, and anyone else can wait and see before committing.) If you are Dell, all you really care about is Linux on PowerEdge, and if some weird state government insists on Linux, well, fine: you'll freeze the BOM on a few rounds of Dell Latitude, charge extra, and it'll be fine.
It would be a delicious irony if the best Linux desktop environment ends up being WSL2 after all these years.
I've been using Linux as my main OS since 2017, and I'm neither a gamer nor a total nerd/Linux fanboy. I use my laptop like any casual user (except for dev work).
Although I do have issues with it sometimes, it's most of the time because I'm trying to do something super specific for my job that requires technical knowledge. Otherwise, the way normal people use a PC nowadays is: 1) start the PC, 2) open the browser.
Next, don't modify anything. Use the DE it comes with, never open the terminal, treat it like it's MacOS or Windows and you're your grandma. Then you'll have a good, solid desktop.
The problem people have is using hobbyist distros like Mint, some flavour of Arch that's just a buggier Arch, some terrible DE that's riddled with bugs, or using distros that I'd only recommend to actual developers, like Debian or (real) Arch.
Or they modify shit because they saw it on some YouTube or blog. Then they break things and don't know how to unbreak them.
Lots of distros are poorly funded and buggy. That's a thing. But the big, corporate money distros are ultra stable, especially in their default configuration.
Also the thing about games, just look at the Steam deck catalogue... That's not an excuse anymore in the current year.
Now - I'm not super deep in Linux. I get to tinkering with it every few years since College.
Early on, you had a lot of weird config stuff and cryptic error messages you had to figure out to get it up and running normally. Then it was OK until something weird happened. The chances of something weird happening go wayyy up if you don't heed your advice:
> Next, don't modify anything. Use the DE it comes with, never open the terminal, treat it like it's MacOS or Windows and you're your grandma. Then you'll have a good, solid desktop.
But even so - occasionally, weird stuff will happen. Just like it happens on any OS over time, the farther you get from a clean install. Software Entropy, if you will. May god have mercy on your layman's soul if that is the case.
> But even so - occasionally, weird stuff will happen. Just like it happens on any OS over time, the farther you get from a clean install.
Immutable distros solve this. Your base OS is basically always in the state the developers intend (as long as you don't mess with it). Applications are all containerized, and if you do dev stuff, that's also done in containers.
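On Fedora Kinoite/Silverblue, for example, the day-to-day pattern looks roughly like this (the container name `devbox` is made up):

```shell
# GUI apps arrive sandboxed via Flatpak...
flatpak install flathub org.mozilla.firefox

# ...and mutable dev environments live in disposable containers,
# so the base image never drifts from what the developers shipped.
toolbox create devbox
toolbox enter devbox
# inside the container: sudo dnf install gcc make ... (base OS untouched)
```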
Mint being a "hobbyist" distro that provides a worse experience for regular users than Ubuntu is definitely a bespoke take. Nothing against my upstream as a Mint user, but the entire point of this distro is to get the "life's too short to configure stuff" things out of the way. I do do my fair share of "hobbyist" Linux config stuff, but it's stuff I could do on any distro.
> Or they modify shit because they saw it on some YouTube or blog.
The Youtube content about modifying Linux is overwhelmingly how-tos rather than advocacy. People are watching those videos and following those steps because they already wanted to make the modification.
> Finally, since drivers in Linux are generally part of the kernel (there are a few exceptions, including NVIDIA), you cannot upgrade them to the latest version or downgrade them to the version that worked for you on the fly. You have to boot into a different kernel. This is extremely inconvenient and not always possible
I mean, that's just false. Most drivers are built as kernel modules, and you can load and unload them without missing a beat.
If he's talking about graphics cards or similar subsystems, you might need a reboot, much as you would on Windows.
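For example, reloading a driver is a two-command affair on a running system (module name illustrative; a module currently in use, like an active GPU driver, is the case where a reboot comes in):

```shell
# Unload and reload the webcam driver without rebooting.
sudo modprobe -r uvcvideo     # remove the module (fails if still in use)
sudo modprobe uvcvideo        # insert it again (or an updated build of it)
lsmod | grep -w uvcvideo      # confirm it is resident
```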
IMO, this reads like a gamer reviewing Linux because they thought it would be cool, without knowing much about how to use it as a workstation to get shit done.
> I mean, that's just false. Most drivers are done as kernel modules and you can push/pull them in without missing a beat.
0.1% of people in the world could do that. 0.001% of people in the world would do that.
That's not false; your understanding of a decent OS is just lacking. In Windows you can swap drivers however you want, provided they support your Windows version.
Windows is installed on 2 billion PCs, give or take. It works great on them, too. Long gone are the days of malware waves or driver failures that break the system.
Linux is on 40 million.
And the amount of pain I see daily on Linux-related forums trumps Windows' woes by an insane margin.
I would not call limping along from extreme levels of telemetry, AV and other bloat “works great”. Discussing “driver failures that break the system” is a strawman since bad drivers never really broke the system. They just crashed it such that it needed a reboot. That said, Microsoft lets garbage that has no business being in the kernel into the NT kernel, which resulted in this fiasco that actually broke Windows systems:
Windows systems often suffer from cryptolockers, which could be considered malware waves.
You have a selection bias from looking at forums where people are asking for help. I have installed Linux for a number of non-computer people to solve Windows problems, and it has served them well. They all like it better than Windows. The average person really is fine with an Ubuntu install. The only reason Windows has such usage is that it is installed by default due to history. If a Linux distribution were installed by default, there would not be many Windows users.
> I would not call limping along from extreme levels of telemetry, AV and other bloat “works great”
2 billion users don't notice too much "limping" and "telemetry".
1. This entire post is whataboutism; the article is about Linux.
2. NSA/CIA/FBI and pretty much all governments of the world use Windows that is ostensibly riddled with spyware and telemetry
3. In the past 20 years not a single person with WireShark and MITM has proven that Windows leaks any personal data (e.g. files). This can be set up in less than 10 minutes. Go produce a single shred of effing evidence I dare you!!
Parroting BS on the Internet doesn't make your post any more valid.
That's disgusting coming from an "IT specialist". You have issues with basic logic.
> Parroting BS on the Internet doesn't make your post any more valid.
> That's disgusting coming from an "IT specialist". You have issues with basic logic.
I assume you are projecting here since I have never once in my life described myself as an “IT specialist”. This fits what you have been doing far better than it fits me.
It leaked a while ago that Microsoft Windows automatically uploads WiFi passwords, which are sensitive information, to Microsoft's servers. This was after the WiFi Sense debacle.
As for not noticing limping, of course they would not notice it when they have never seen the machines run faster. Install a Linux distribution on the machine and they will be amazed by the speed. It happens every single time I do it.
I am not sure what you think your remark about TLAs and governments achieves. It just seems like the ravings of someone spreading FUD to me.
Knowing that Microsoft uploads WiFi passwords to its servers does not even need wireshark. You just need to log into the same Microsoft account on a different machine and observe that your WiFi passwords are synced. Given that the upload is likely encrypted, it would be difficult to observe the passwords being uploaded to Microsoft servers using wireshark. While you would see the packets, you would not know what they contain unless you somehow get the encryption keys out of the system encryption library.
Your prior remarks on running old Linux software did not require that things be easy (a matter of opinion) or graphical. It is possible with the right libraries, as modern Linux systems (minus Android, unless you install an X11 server) can display X11 applications.
It could use a real good look at the "user journey". How would one even get to having Linux on their desktop? It seems like some basic stuff is still overlooked. Like having a website that's straightforward and doesn't bury how to get a distro. Instructions written with clarity that don't just drag on for dozens of steps (and even if they must, they should be written smarter, not in a way that overwhelms someone and makes them check out). Granted, even just putting install media together in the first place might be enough to make some people check out already, but still.
There are things like Fedora's Media Writer, which is good, but even that is needlessly buried under several clicks, because their website just isn't oriented toward regular users, I guess (which is a real common thread). There's server and enterprise stuff, which is just a distraction and irrelevant to such a user, so why is it even there, or why isn't there a "consumer" Fedora website? And it's just stuff like this. Take a look at Ubuntu's website and you'll see a similar (but also worse) situation: where would a regular person even start there? They'll see all this enterprise stuff and just leave. Debian: there's a huge download button, great! It just downloads an image and, uhhh, what is one supposed to do with it? Bizarrely, there isn't even an "installation instructions" link next to it on that page. There's a "help" link, but it doesn't lead to installation instructions; those are actually under an "other downloads" link, buried under several overcomplicated pages, with the manual itself so overly long it'll make anyone give up on using it, simply because it has a poorly usable interface.
It falls apart before someone would even get on board, because there kind of isn't a solid path for them. Like, back to basics: how about making stuff easier to get? Really basic stuff like unburying download links and making things easier to understand. Well, that is, if those distros and companies are even interested in "consumer Linux" anymore, which barely seems to be the case sometimes. Maybe the user-friendly way isn't to just throw an ISO file at someone (if they'll even find it) and have them go figure. Maybe that's exactly why the Linux user base is not growing substantially.
Also, just try to not even tell people to use the terminal. Like, don't even start with "you can apt get anything". They won't. Just say "get apps from the app store". That's actually user friendly, and it already exists in distros. As a further challenge, one might want to consider any time a user has to use the terminal and/or blindly copy commands they don't understand in order to do or fix something a usability failure (because it is).
> so I'll just say it bluntly: these are all lightweight virtual machines. It's crazy to think that they solve software incompatibility in Linux, they just work around it by making the user allocate and run gobs of binary code, unnecessarily taxing their storage, CPU and RAM.
And... what exactly do you think Windows is doing to achieve this same feat? Isn't "Windows on Windows" _precisely_ what you describe?
I was thinking about adding a "Security" section but I wasn't convinced it was necessary. Security in Linux is horrible in general.
* People use `sudo` for anything and everything all the time.
* `git pull whatever_crap_from_the_web` and run it happily.
* Tens of thousands of pieces of malware via NPMs and other frameworks.
* No AV of any kind (i.e. one that actually executes code in a sandbox and verifies it's safe).
* Permissions/users/groups/pam.d/logind/selinux/apparmor - it's a huge intricate mess.
Verifying that arbitrary code is safe is not something AV can do (in general it reduces to the halting problem). If you know some way to do what you describe, you can commercialize it and become rich. Many would pay top dollar for such technology.
Windows gives a semblance of trusted code; Microsoft also sells code-signing certificates to developers.
Nothing (!) like that exists in the Linux world aside from PGP cross-signing, which barely any project uses because of its sheer complexity.
Don't lie about Windows and don't lie about Linux.
And Windows 10/11 both mandate that installers are signed before you can even launch them, so there's a very decent first line of defense which is TOTALLY missing in Linux.
Also, "trusted sources"? Who and why would you trust? Seasoned engineers in my company git pull any crap from the web and run it happily without thinking twice because AGILE and things like RoR. Who do you trust among thousands of modules?
God damn it.
Linux fans will make up all sorts of crappy pseudo-arguments to portray it as a decent/secure/stable OS and it's none of that.
On servers where you don't touch anything and never install third-party software? Yeah, surely. No one cares.
Distribution package maintainers generally vet the packages they make available, and repository updates are typically PGP-signed with cryptographically secure hashes for the files. I say typically because I am only familiar with Gentoo and, to a lesser extent, the Debian family.
As for Windows, there is no signing requirement according to Microsoft:
> Windows doesn't require software developers to digitally sign their code
Continue to use a crap pseudo-OS with no security, and keep believing everyone around is a geek willing to learn the CLI, bash, vi, git bisect, reading man pages, etc. just to use it.
You have now accused me of saying something I never said; you've crossed the line. Goodbye.
This website is for hackers, not "geek squad". I am very disappointed that something of such low quality could manage to be linked on one of the first few pages of this site.
You are commenting on a site that effectively represents the VC-centric startup world, not the /.-esque hacker world of yesterday. That it uses the term “hacker” in its name doesn’t make it Linux-central.
The garbage is a result of me using Linux for almost 30 years and also being an active Linux maintainer.
I can vouch for every word in the article except maybe the last section but this discussion alone speaks for itself, so even that part is not totally wrong.
Here in this discussion we've got all the falsehoods that I've heard a billion times already:
* It works for me, so it must work for everyone else (never mind that other people have different hardware and use different software)
* I've not had any issues, so it's all made up (the same)
* I can show this, that and that exception (yeah, exceptions normally prove the rule)
Have fun with a no-OS. I mean, your specific distro version, with the specific set of libraries and applications you're using today, that is not compatible with anything else under the sun.
What do you maintain? If it is on my system, I want to get it off my system as I have zero confidence in your competency after seeing all of the FUD you are spreading.
Everyone I know that uses linux for their PC, has constant, daily issues and complaints, and they all use a different distro that they promise is the best. Without even going into any of the other issues, that's enough to keep me away. Even as a nerdy programmer, I don't have the time, energy or any reason to use something that will make my life harder and impact my ability to do what I want to do. I can't see any conceivable path forward where linux becomes an actual viable and mainstream platform for regular PC users. It seems like people mainly use linux for ideological reasons, not because it's better - actually in spite of it being worse. In my personal experience, the vast 99% majority of professional programmers use windows/mac, and I see more than anything people switching to linux just for the novelty of it. Linux users can also be very dismissive "Photoshop? You don't need that. Use LibreGimp, but use X fork as it has essential bugfixes."
Everyone I know is happy with Linux and does not complain at all; they prefer it over other systems. Having used Linux professionally for the last 15 years, I'm just so, so happy that a system like this exists. Maybe I've been lucky, but I've never had permanent problems with my last four computers, three of them notebooks, one a desktop. Nothing gets in my way, nothing breaks, and I'm more productive than I'd ever be on any other system.