The approach of the New Year isn't complete without developers across the world asking whether this coming year is going to be the year that Linux conquers the desktop. The year that Windows and macOS will be dethroned, the year that open source will win and distributors of proprietary software will cower in fear. This isn't a new discussion - it has been taking place for years, and is usually met with the same optimism.
But this isn't going to be the year that Linux wins at home, and it isn't going to be the year to throw out the proprietary operating systems of the world. To be frank, next year won't be, either. Or the year after that.
So what's the issue? Linux is already sitting at the core of servers worldwide, Android devices, home wireless routers and a whole range of other commodity items. Linux is, no doubt, wildly successful. Doesn't it stand to reason that it can be just as successful on your standard home or office PC?
In fact, there are a number of issues that prevent Linux from enjoying the same success on the desktop.
The distribution model is user-hostile
Linux distributions are operating systems made up of a Linux kernel, a collection of software utilities and often a package management system. Many of these distributions are free to obtain and advertise themselves for a variety of purposes: some are for embedded systems or for a specific purpose, but there are a whole host of general-purpose distributions, such as Debian, Ubuntu, Mint, Gentoo and CentOS.
For power users and developers, choosing a distribution might be second nature. It might be that you prefer a source-driven package management system, like emerge on Gentoo, or perhaps you would prefer to avoid systemd like the plague. Perhaps you would like to stick with a distribution that claims to be "pure" and doesn't contain closed-source binary-only drivers. Maybe you are using an obscure computer architecture that is supported by some distributions and not others.
However, for regular users at home, choosing a distribution is a daunting and thoroughly confusing task. Often it is not clear whether one distribution will provide any real benefits for a given user over another.
The diversity between different distributions can cause headaches not just for inexperienced users, but also for software developers. The creators of different distributions often pick different system libraries, or even different versions of the same library, when building their system. This means there is absolutely no guarantee of binary compatibility between Linux distributions. There's no "write once" or "compile once", because the system that you built the application on probably doesn't look anything like the system that your users will run on. You don't even have a guarantee that the correct prerequisites are on your user's system. Which leads us to a phenomenon known as "dependency hell".
The fires are still burning hot in Dependency Hell
Let's imagine that you have a library on your system that takes an MP3 file and plays it, or a library that takes a JPEG photograph and renders it. You want to write an application that takes advantage of functionality provided by these libraries, so you set off writing your application.
You then take your newly written application to a friend's machine and try to run it. It fails to launch. What went wrong? It turns out that your friend is probably running either a different distribution, or a different release of the same distribution, or maybe they've just not installed any patches in the last six months. In any case, the library you leveraged in your application is a different version on the target machine, and the developers of that library were not careful enough to perfectly preserve API compatibility.
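In practice, a failure like this usually surfaces as an unresolvable shared library. As a rough diagnostic sketch (`./myapp` stands in for a hypothetical application binary), `ldd` will show which of a binary's dependencies the dynamic linker can't find on a given machine:

```shell
# List the shared libraries a binary needs and flag any that the
# dynamic linker cannot resolve on this machine. Lines marked
# "not found" are the dependencies that will stop the program
# from launching. ("./myapp" is a hypothetical binary.)
if ldd ./myapp | grep -q 'not found'; then
    echo "missing libraries:"
    ldd ./myapp | awk '/not found/ { print $1 }'
else
    echo "all dependencies resolved"
fi
```

Note that even when every library is found, `ldd` says nothing about whether the *versions* found actually preserve the API the application was built against - which is exactly the trap described above.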
Is it possible to avoid this issue?
You can perhaps build the library into your application directly. This way, you do not have a dependency on the target computer having the correct version of the library that you need. This sounds good in theory, but has some unintended side effects, namely that your application bloats in size, especially if the library in question is large or complex. It also makes the assumption that the library itself has no specific dependencies. Many do, so this falls over quickly.
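The difference between the two approaches comes down to how the application is linked. A minimal sketch, using made-up names (`player.c` and `libplay` are purely illustrative):

```shell
# Hypothetical build of a music player ("player.c" and "libplay"
# are made-up names for illustration).

# Dynamic linking (the default): the target machine must already
# have a compatible version of libplay.so installed.
gcc -o player player.c -lplay

# Bundling instead: link the library's object code directly into
# the binary, so the target machine needs nothing extra - at the
# cost of a larger executable, and only if libplay itself has no
# further runtime dependencies of its own.
gcc -o player player.c /usr/lib/libplay.a
```

The second form is exactly the "build the library in" approach: nothing to resolve at install time, but every application carries its own copy of the code.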
Alternatively, you can package your application so that it will only install through a package management system if specific dependencies are met. This is the more commonly used approach: the package manager resolves the dependencies itself when installing your application, making sure that the prerequisites are present. The issue is that those specific versions of those specific libraries also need to be available through the package manager, and many aren't - it's not common practice for package repositories to maintain an entire back-catalogue of every version of every library or utility, and there's no guarantee that the versions you need will be available at all from a given repository. Even when they are, you may end up with multiple versions of the same library installed on the same system.
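On Debian-style systems, for example, this dependency contract is spelled out in the package's control file. A minimal sketch for a hypothetical package (the package name and version constraints are illustrative; the field names are standard Debian ones):

```
Package: myplayer
Version: 1.0-1
Architecture: amd64
Depends: libmpg123-0 (>= 1.25), libjpeg62-turbo (>= 1.5)
Description: Hypothetical media player used as an example
 The Depends field lists the libraries, with minimum versions,
 that the package manager must resolve before installation.
```

If the repository configured on the user's machine can't satisfy those `Depends` constraints, installation simply fails - and it is the user who is left to work out why.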
Neither of these approaches is solid. The only reason this is less of a problem with commercially developed Windows or macOS is that those systems are much more tightly controlled to preserve compatibility between releases, and there are no multiple distributions of them that need to be considered in the way that there are in the Linux world.
Can a normal user really be expected to understand what is taking place when installing packages or resolving dependencies?
There's still not much vendor support for hardware
You've just bought a new printer. You bring it home, open the box and plug it in. Nothing happens. Oh, wait. We haven't installed the driver software. Ordinarily you'd get the CD out of the box (or download the drivers from the web), install them, and be done - the printer now prints exactly as advertised.
The problem is that many hardware manufacturers simply don't produce hardware drivers for Linux. Many drivers that are available in the Linux kernel have been developed by the open-source community to fill a gap, but often these drivers are not perfect either, and either only cover basic functionality or are simply incomplete due to lack of proprietary knowledge of that particular product. How do you know that the peripheral that you've just bought will actually work on your Linux computer at home?
In many ways, the scene is not as dire as it once was. For example, NVIDIA and AMD are now fairly good at providing drivers for their graphics cards and chipsets. On the other hand, try finding drivers for some Intel kit: the PowerVR-based graphics adapters in certain Intel Atom chips have never had proper Linux drivers, nor does Intel provide a native Linux driver for its Rapid Storage Technology RAID. How does a user at home even know, when they install Linux on their PC, that all of their hardware will be fully supported?
Inexperienced computer users simply don't have the knowledge either to recompile the kernel or to load additional kernel modules when drivers are needed. The process of handling and managing drivers in the Linux world has never been streamlined nor simplified.
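When a driver does exist as a loadable kernel module, the manual workflow looks something like the following sketch (it requires root; `r8169`, a common Realtek Ethernet driver, is used purely as an example module name):

```shell
# Load a driver shipped as a kernel module, along with whatever
# modules it depends on ("r8169" is a common Realtek Ethernet
# driver, used here purely as an example; requires root).
modprobe r8169

# Confirm the module is now loaded.
lsmod | grep r8169

# Unload it again when it is no longer needed.
modprobe -r r8169
```

None of this is difficult for an administrator, but expecting a home user to know that `modprobe` exists, let alone which module their hardware needs, is clearly unrealistic.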
There still isn't much vendor support for software, either
More often than not, big-name software doesn't appear on Linux desktops either. Perhaps the most famous example is Microsoft Office, the de facto standard that almost everyone uses. Other common applications like iTunes also have no Linux support. Steam offers a small number of games on Linux, but they are very few compared to those available on Windows, or even on the Mac.
Many open-source alternatives are available, but often they are lacking either in features or in usability. It's not reasonable to suggest that OpenOffice is really a suitable replacement for Microsoft Office, nor that GIMP is really a suitable replacement for the Adobe Creative Suite. This is also not helped by the fact that common day-to-day utilities can change dramatically even just between different desktop environments, of which there is no shortage. Just ask the average crowd of Linux users about their favourite text editor, let alone anything more complicated than that. Can we really expect at this stage that the open-source community is going to be able to produce a whole desktop that works for the majority?
The future of Linux probably isn't on the desktop anyway
If you want to look at some major Linux success stories, look no further than Android, Google's originally-mobile-now-everywhere operating system. It's largely successful because a huge amount of effort was placed into the Android runtime to follow the "write once, run everywhere" model. It's also really not very Linux-y at all. Core Android kernel patches have since been upstreamed into the main Linux kernel source tree, but on most Android devices, even the user-space utilities beneath the "pretty" user interface bear little resemblance to those of a conventional Linux distribution.
And you know what? It doesn't matter, because nobody who writes Android applications needs to worry about what user-space Linux utilities will or won't be present on the system, or even to a certain extent which system libraries are present, as their needs will largely be met by the Android runtime. This is very much closer to the kind of model that Microsoft and Apple use, providing a common and unchanging API.
The open-source community just outright lacks the cohesion to maintain a unified vision of its product in the way that the software giants do. This is why there are so many different desktop environments available on Linux-based distributions, with most of them completely unable to agree on even common design or usability principles. Often the technically brilliant individuals of the open-source community do not understand normal, real-world users, and don't have the funds, the time or the capability to properly research what really works for everyone else out there. (At this point, it feels only appropriate to mention Richard Stallman - no doubt a genius, but also one with large and frequent completely-not-of-this-earth moments.)
So in the meantime, we'll continue to see the Linux kernel appearing at the heart of other products, like Android. Linux-based desktop distributions won't disappear either, remaining largely reserved for the technically capable or the particularly willing. Manufacturers might even provide Linux as an alternative operating system, like we saw five or six years ago with the great netbook explosion (which, understandably, failed).
But the Year of the Linux Desktop? The year where you step into John Lewis or Currys and pick from swathes of Linux-powered computers? It's just not going to happen.