This post was inspired by some recent controversy around Valve and their support for Linux, but the bulk of it comes from long-term observation. One of the biggest boosts to the viability of Linux on the desktop was Valve’s Proton, a Wine fork integrated into Steam that allows almost any Windows game to work out of the box. To Linux users, life was good. However, with the recent announcement of the Steam Deck, a handheld device powered by Linux, Valve’s marketing towards developers explicitly mentions that no porting is required. Valve has pushed this message aggressively enough that they’ve allegedly told developers simply not to bother with Linux ports anymore – enough that it makes commercial porters like Ethan Lee concerned.
However, I suspect this is the long-term result of other factors, and games are only one aspect of it. After all, we all know the Year of the Linux Desktop is around the corner, along with nice applications. Linux won’t rule the world just from games, even if some people really want it to be true. How did it come to this, and why?
Shipping Linux software: the current state, and a tale of two houses
I think it’s interesting to contrast the games space in Linux, which has a de facto champion pushing for specific outcomes, versus the applications and desktop space, which doesn’t have that. As a result, there’s a lot of context you have to understand for one, that doesn’t really apply to the other.
The struggle on Steam. Since Valve started their Linux ambitions almost a decade ago, there have been some difficulties along the way. Some of that has been Valve’s own missteps and market misreads with Steam Machines1, but a lot of it is herding game developers, a traditionally Windows-focused group.
To try to cut through the morass of distributions, Valve has the “Steam Runtime”, which is based on a specific version of Ubuntu and bundled with Steam. Developers can target this environment as a lowest common denominator, as opposed to figuring it out themselves. The Steam Runtime includes a lot of common libraries (from SDL to curl) that effectively provide a base platform. A base platform matters a lot as a developer, as it provides a foundation you know you can rely on for functionality you don’t have to vendor – especially when the Linux space is less a single target and more multiple operating systems with a very large overlap.
(Of course, there were initial growing pains with the runtime. Backwards compatibility was a challenge with the initial approach, as the system glibc and libGL had to be exposed, and things like curl ran into ABI problems. Valve looked into containerization with pressure-vessel and Flatpak; the latter will come up for non-game contexts.)
The problems appeared once you got past the indie games Linux users were mostly getting before. Once the AAA developers came in, there were instances of ports not being kept in parity, ports not being multiplayer compatible (i.e. possible desyncs), or “native” ports (likely using their own custom wrappers) performing worse than Proton. When you have problems like this, simply recommending and using Proton is the easier and more consistent option – and consistency matters when you’re trying to build a platform!
As an example of some of the difficulties with native games, consider other distributions. Outside of the happy path of expecting your users to run the same distribution as you (i.e. Ubuntu), you can easily run into trouble with games if the distribution is divergent, as Gentoo and Arch found out. It can usually be worked around, but it can be a lot of grief for what should be a “just works” experience. While many of these issues could be blamed on using a less popular distribution, some of them are documented to happen on said “normal” distribution, just in edge cases.
Valve also has greater ambitions. Their goal of letting you suspend a game on your desktop and resume it on your Deck will likely require them to standardize on a single platform. Since most gamers are on Windows, and Valve is pushing Proton, Windows seems the obvious candidate. (It’s not like Valve cared about Apple anyways.) There are some additional challenges with this approach (i.e. restoring graphics state when the GPU has been swapped out from under you), but sticking to a single platform (well, one developers are familiar with, and one Valve controls that emulates the semantics of the familiar one) removes some concerns with synchronizing lower-level state across multiple platforms.
> Valve’s team also told me they’re looking into ways to cloud-sync suspended games between desktop and Steam Deck, meaning you could hop between platforms without even needing to save and quit, but that functionality wasn’t in place yet during my time with it.
There’s also the question of how much backwards compatibility comes into play. In the Windows world, backwards compatibility is the default; in the Linux world, less so. As much as Linus makes noise about never breaking userspace in the kernel, the reality is that userspace libraries can and have introduced compatibility breaks at varying levels. Even if they hadn’t, there are still thousands of Windows-only games that will never get updated. Applications might get evergreen updates (though not always), but games are usually “once it’s developed, it’s done”, outside of some minor post-release support. Proton giving Valve thousands of those games for free, with minimal or no developer intervention, is a big win over hoping developers invest in ports for older titles.
Proton also changes the role of many porting studios and consultants, like the previously mentioned Ethan Lee, but also larger companies like Aspyr. While many of these had their own tooling for porting, Proton means developers need to make very minimal changes (if any), reducing the need for their services. However, all’s not lost for these companies; if native ports present some benefit over wrapping, they can offer services there. And of course, services to get the most out of Proton are still on the table.
With all the effort Valve has spent on Linux, it makes you realize they have a surprising amount of control over gaming on Linux as a whole; they’re the ones pushing a lot of the funding and work (even indirectly, i.e. working with Collabora on the graphics stack, especially with VR, Valve’s other interest), while other stores put in token effort at best for Linux users. This also leads to a fanatically supportive userbase.
Desktop software. Even without the 3D graphics requirements of games, there are many concerns you have to deal with as a developer. There’s also nothing like Steam to cut the Gordian knot and short-circuit said concerns. This means a lot of things that got solved for games are still up for debate for normal applications, and there are a lot of things to go over.
As mentioned, there is no obvious commercial champion for applications on the Linux desktop the way Valve is for games. This hurts the dynamics of adoption. Canonical and Red Hat flirt with this goal, but their commitment seems sporadic – and who can blame them, when the desktop is a fractional part of their userbase that requires a lot of money for little gain? The closest thing to a steward is freedesktop.org, composed of the largest desktop projects like KDE and Gnome. (Arguably, the vendors exert influence here through employing developers of these projects, but to what degree depends on who you ask.)
What is harder is distribution of your application. Most Linux distributions (hey, we’re overloading that term) were focused around shipping applications (built from source, assuming FLOSS) with system libraries. A lot of distro decisions can be explained as having their roots in shipping the entire world on 7 CDs in 1999, and treating it as one continuous OS you’re carving like a ham. However, the trend is shifting back towards how distribution is done on other platforms, where the developer is responsible for it, as well as for libraries beyond a base platform. This is happening regardless of whether your application is proprietary or libre – you don’t want to wait for distros to package it, nor wait for them to package updates at their own release cycles.
On the topic of the base platform, there isn’t much you can count on outside of libc. (And even then, that can differ too – hi, musl.) There is no consistent set of base functionality (the LSB might have counted years ago, but it’s pretty archaic now) on the level of Valve’s base Steam Runtime, let alone on, say, Windows. While you could hope that things like libX11, OpenSSL, or Qt are built the way you expect across all distros and use the distro copies, it’s safer to vendor them with your application. Even things like the C library can have subtleties, as Gentoo and Fedora found out, and there are no guarantees of ABI stability.
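To make the vendoring point concrete, here’s a minimal sketch of the common launcher-script approach. The app name and directory layout are invented for illustration: the binary and its bundled libraries are unpacked together, and a wrapper points the dynamic linker at the bundled copies first.

```shell
# Hypothetical layout: the app ships as a directory with bin/ and lib/.
# The launcher prepends the bundled lib/ to the dynamic linker's search
# path, so vendored copies of Qt, OpenSSL, etc. win over the distro's,
# while anything not bundled still resolves from the system.
bundle_lib_path() {
    # $1 = directory the bundle was unpacked to
    printf '%s/lib%s' "$1" "${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
}

# The launcher script itself would then be roughly:
#   HERE="$(dirname "$(readlink -f "$0")")"
#   LD_LIBRARY_PATH="$(bundle_lib_path "$HERE")" exec "$HERE/bin/myapp" "$@"
```

An alternative is baking `$ORIGIN/lib` into the binary’s rpath (e.g. with patchelf), which avoids the wrapper entirely but means patching or rebuilding the ELF.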
That’s not even mentioning what you actually develop your app with! While toolkits might have been a problem at one time (Gtk vs. Qt arguments still happen to this day), a lot of applications are already targeting cross-platform toolkits like Qt or Electron. (Arguing about their merits is for another day.) Using a cross-platform toolkit isn’t hard anymore, though they do come with compromises. Of course, even with a common substrate, desktop environments provide their own layer of base platform on top – which could be sidestepped as well.
So you can’t just ship a single binary – what options do you have? There are multiple competing solutions, all vying for your support:
- AppImage, which works by essentially being a self-extracting bundle with the application’s files and some (but not necessarily all) dependencies. It basically gets you back to where things were in the Windows world of downloading .exe files from the vendor (minus installers, mostly), for good (i.e. easy and independent) and bad (no infrastructure for updates). This is fairly popular and a conservative choice, though I feel it’s a bit fragile.
- Flatpak, which combines a bunch of different things (containers, platforms as one big package, immutable content-addressable images via OSTree, distribution mechanisms) into one gestalt. It seems to be gaining adoption (with pushing from major players in the distro and desktop environment space like Fedora and Gnome, and others like Valve), but it has some teething issues with UX, security promises, and integration (nothing that can’t be solved by thinking with portals). While the main flow is oriented around distros or Flathub providing applications, nothing stops developers from shipping standalone Flatpak files or managing a repository directly.
- Snap, which is another containerized application format. Unlike Flatpak, though, it’s also encouraged for server and embedded (you know, “IoT”) applications. It’s pushed somewhat aggressively by Canonical in Ubuntu (to the point there’s a variant that uses only snaps), and is even more controversial in the Linux community; I’m not familiar with it enough to judge. I do know that the Snap store is centralized, which probably doesn’t help matters with perception.
- A tarball; let the user figure out how to deal with it. Very simple, but it punts the burden to the user, so it only makes sense if you’re dealing with fairly advanced users. Except usually you’ll also have to provide an install script (maybe making it a self-extracting package), and the user often ends up picking up the pieces, writing launchers themselves to integrate it with the desktop.
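For the tarball route, “picking up the pieces” usually means the user (or an install script) writing a desktop entry by hand so the app shows up in menus. A minimal sketch, assuming a made-up app unpacked under ~/.local/opt/myapp:

```shell
# Hypothetical post-extract step: register a menu entry for an app
# shipped as a plain tarball. All names and paths here are invented.
APPDIR="$HOME/.local/opt/myapp"
mkdir -p "$HOME/.local/share/applications"
cat > "$HOME/.local/share/applications/myapp.desktop" <<EOF
[Desktop Entry]
Type=Application
Name=MyApp
Exec=$APPDIR/bin/myapp %U
Icon=$APPDIR/share/myapp.png
Categories=Utility;
EOF
```

Without this glue, the app only launches from a terminal – and it’s exactly this sort of integration work that AppImage, Flatpak, and Snap try to handle for you.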
Note that I haven’t mentioned using distribution package management for your application. It is indeed a common choice to ship RPMs or Debian packages, but there are a lot of caveats that make it hard to recommend. You’re going to be targeting a narrow subset of distros, and you have to worry about distro dependencies even more than normal. (As a concrete example, Discord suffered when Debian dropped the libappindicator1 package.) Supporting multiple versions, let alone upgrades, may be harder. It gets even worse if you run a repository; PPAs have a notorious reputation for breaking systems for a reason. Ultimately, it depends on the extent to which you rely on package management and distribution – it might end up being sustainable in some scenarios.
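To illustrate why this is fragile: a .deb declares hard dependencies in its control file, and every name there must exist in each distro release you target. A hypothetical stanza (package name and versions invented; the Discord case involved a Depends entry of this shape):

```
Package: myapp
Version: 1.0.0
Architecture: amd64
Depends: libc6 (>= 2.17), libgtk-3-0, libappindicator1
Description: Hypothetical third-party application
```

When Debian dropped libappindicator1, anything with a hard Depends on it became uninstallable until the vendor shipped an updated package.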
I think the paralysis of choice is obvious at this point. Choice is great for flexibility for technical users and for entities wanting to define their own platform (sides have been drawn on whether to use e.g. AppImage or Flatpak), but as a developer (or user) dealing with what are essentially multiple operating systems with low-level things in common, it’s quite confusing to try to find the common subset. It’s unfortunate that a lot of the messaging around Linux focused on it being one single target; arguably it is in embedded and server, but not so much on the desktop.
“What’s in it for me?” – Or, “why play the game?”
Looking at the bewildering options and issues in the Linux ecosystem, it’s no wonder people might choose not to participate at all. Even Valve, while still aggressively going forward with Linux, pulled back a bit from native applications in favour of investing in Wine. It’s controversial, but if I were to develop a desktop application targeting Linux desktops, I would strongly consider shipping the Windows version in Wine. For games, going through Proton seems even more clearly acceptable unless significant optimizations are on the table – though Valve doesn’t have much messaging about when native is preferable over Proton.
After all, if Wine is nowadays well funded, well developed, and supporting your application, why go native at all? For games, it’s likely you’ll never notice otherwise, and for desktop applications, Linux users tend to be less picky about any possible UX issues induced by Wine. They’ll tolerate a lot of flaws if it means they can have your application at all. Not to mention the deep library of existing applications they can run as well – there’s something to be said for bringing 25 years of backwards compatibility to another OS.
1: I wrote about this sad story before if you’re interested in juvenilia. The writing is sophomoric, but I stand by my opinions then. Of course, the game changed with Proton later.