Win32 is the stable Linux userland ABI (and the consequences)

This post was inspired by some recent controversy around Valve and their support for Linux, but the bulk of it comes from long-term observation. One of the biggest boosts to the viability of Linux on the desktop was Valve’s Proton, a Wine fork integrated into Steam that allows almost any Windows game to work out of the box. To Linux users, life was good. However, with the recent announcement of the Steam Deck, a handheld device powered by Linux, Valve’s marketing towards developers explicitly mentions that no porting is required. Valve has been aggressive enough with this message that they’ve allegedly told developers simply not to bother with Linux ports anymore; enough that it makes commercial porters like Ethan Lee concerned.

However, I suspect this is the long-term result of other factors, and games are only one aspect of it. After all, we all know the Year of the Linux Desktop is around the corner, along with nice applications. Linux won’t rule the world just from games, even if some people really want it to be true. How did it come to this, and why?

Shipping Linux software: the current state, and a tale of two houses

I think it’s interesting to contrast the games space in Linux, which has a de facto champion pushing for specific outcomes, versus the applications and desktop space, which doesn’t have that. As a result, there’s a lot of context you have to understand for one, that doesn’t really apply to the other.

The struggle on Steam. Since Valve started their Linux ambitions almost a decade ago, there have been some difficulties along the way. Some of that has been Valve’s own missteps and market misreads with Steam Machines1, but a lot of it has been herding game developers, a traditionally Windows-focused group.

To try to cut through the morass of distributions, Valve has the “Steam Runtime”, which is based on a specific version of Ubuntu and bundled with Steam. Developers can target this environment as a lowest common denominator, as opposed to figuring it out themselves. The Steam Runtime includes a lot of common libraries (from SDL to curl) that effectively provide a base platform. A base platform matters a lot as a developer, as it provides a foundation you know you can rely on for functionality you don’t have to vendor. Especially so when the Linux space is less a single target and more multiple operating systems with a very large overlap.

(Of course, there were initial growing pains with the runtime. Backwards compatibility was a challenge with the initial approach, as the system glibc and libGL had to be exposed, and things like curl ran into ABI problems. Valve looked into containerization with pressure-vessel and Flatpak; the latter will come up for non-game contexts.)

The problems occurred once you got past the indie stuff Linux users were mostly getting before. Once the AAA developers came in, there were instances of ports not being kept at parity, ports not being multiplayer compatible (i.e. possible desyncs), or ports performing worse “natively” (likely via their own custom wrappers) than in Proton. When you have problems like this, simply recommending and using Proton is the easier and more consistent option – and this matters when you’re trying to build a platform!

As an example of some of the difficulties with native games, consider other distributions. Outside of the happy path of expecting your users to run the same distribution you target (i.e. Ubuntu), you can easily run into trouble with games if the distribution is divergent, as Gentoo and Arch found out. It can usually be worked around, but it can be a lot of grief for what should be a “just works” experience. While many of these issues could be blamed on using a less popular distribution, some of them are documented to happen on said “normal” distribution, just in edge cases.

Valve also has greater ambitions. Their goal of being able to suspend a game on your desktop and resume it on your Deck will likely require standardizing on a single platform. Since most gamers are on Windows, and Valve is pushing Proton, Windows seems the obvious candidate. (It’s not like Valve cared about Apple anyways.) There are some additional challenges with this approach (i.e. restoring graphics state when the GPU has been swapped out from under you), but sticking to a single platform (well, one developers are familiar with, and one Valve controls that emulates the semantics of the familiar one) removes some concerns with synchronizing lower-level state across multiple platforms.

Valve’s team also told me they’re looking into ways to cloud-sync suspended games between desktop and Steam Deck, meaning you could hop between platforms without even needing to save and quit, but that functionality wasn’t in place yet during my time with it.

There’s also the question of how much backwards compatibility comes into play. In the Windows world, backwards compatibility is the default, but in the Linux world, it’s less so. As much as Linus makes noise about never breaking userspace in the kernel, the reality is that userspace libraries can and have introduced compatibility breaks at varying levels. Even if they hadn’t, there are still thousands of Windows-only games that will never get updated. Applications might get evergreen updates (though not always), but games are usually “once it’s developed, it’s done”, outside of some minor post-release support. Proton giving Valve thousands of those games for free, with minimal or no developer intervention, is a big win over hoping developers invest in ports for older titles.

Proton also changes the role of many porting studios and consultants, like the previously mentioned Ethan Lee, but also larger companies like Aspyr. While many of these had their own tooling for porting, Proton means developers need to make very minimal changes (if any), reducing the need for their services. However, all’s not lost for these companies; if native ports present some benefit over wrapping, they can offer services there. And of course, services to get the most out of Proton are still on the table.

With all the effort Valve has spent on Linux, you realize they have a surprising amount of control over gaming on Linux as a whole; they’re the ones pushing a lot of the funding and work (even indirectly, i.e. working with Collabora on the graphics stack, especially for VR, Valve’s other interest), while other stores put in a token effort at best for Linux users. This also leads to a fanatically supportive userbase.

Desktop software. Even setting aside the 3D graphics that games require, there are many concerns you have to deal with as a developer. There’s also nothing like Steam to cut the Gordian knot and short-circuit said concerns. This means a lot of things that got solved for games are still up for debate for normal applications, and there is a lot to go over.

As mentioned, there is no obvious champion commercial vendor for applications on the Linux desktop, the way Valve is for games. This hurts the dynamics of adoption. Canonical and Red Hat flirt with this goal, but their commitment seems sporadic. Who can blame them, when the desktop is a fractional part of their userbase that requires a lot of money for little gain? The closest thing to a steward is a loose coalition of the largest desktop projects like KDE and Gnome. (Arguably, the vendors exert influence here through employing developers of these projects, but to what degree depends on who you ask.)

What is harder is distribution of your application. Most Linux distributions (hey, we’re overloading that term) were built around shipping applications (built from source, assuming FLOSS) with system libraries. A lot of distro decisions can be explained as having their roots in shipping the entire world on 7 CDs in 1999 and treating it as one continuous OS you’re carving like a ham. However, the trend is shifting back towards how distribution is done on other platforms, where the developer is responsible for it, as well as for any libraries beyond a base platform. This is happening regardless of whether your application is proprietary or libre – you don’t want to have to wait for distros to package it, nor wait for them to ship updates on their own release cycles.

On the topic of the base platform, there isn’t much you can count on outside of libc. (And even then, that can differ too – hi, musl.) There is no consistent set of base functionality (the LSB might have counted years ago, but it’s pretty archaic now) on the level of Valve’s Steam Runtime, let alone on the level of, say, Windows. While you can hope that things like libX11, OpenSSL, or Qt are built the way you expect across all distros and use the distro copies, it’s safer to vendor them with your application. Even the C library can have subtleties, as Gentoo and Fedora found out, and there are no guarantees of ABI stability.

Not to even mention what you actually develop your app with! While toolkits might have been a problem at one time (Gtk vs. Qt arguments still happen to this day), a lot of applications already target cross-platform toolkits like Qt or Electron. (Arguing about their merits is for another day.) Going cross-platform isn’t hard anymore, though the toolkits come with their own compromises. Of course, even with a common substrate, you then have desktop environments providing their own layer of base platform on top – which can be sidestepped as well.

So you can’t just ship a single binary – what options do you have? There are multiple competing solutions, all vying for your support:

  • AppImage, which works by essentially being a self-extracting bundle with the application’s files and some (but not necessarily all) of its dependencies. It basically gets you back to the Windows world of downloading .exe files from the vendor (minus installers, mostly), for good (i.e. easy and independent) and for bad (no infrastructure for updates). This is fairly popular and a conservative choice, though I feel it’s a bit fragile.
  • Flatpak, which combines a bunch of different things (containers, platforms as one big package, immutable content-addressable images via OSTree, distribution mechanisms) into one gestalt. It seems to be getting adoption (with pushes from major players in the distro and desktop environment space like Fedora and Gnome, and from others like Valve), but it has some teething issues with UX, security promises, and integration (nothing that can’t be solved by thinking with portals). While the main flow is oriented around distros or Flathub providing applications, nothing stops developers from distributing builds themselves, whether as standalone Flatpak files or by managing a repository directly.
  • Snap, which is another containerized application format. Unlike Flatpak, though, it’s also encouraged for server and embedded (you know, “IoT”) applications. It’s pushed somewhat aggressively by Canonical in Ubuntu (to the point that there’s a variant of Ubuntu that uses only snaps), and it’s even more controversial in the Linux community; I’m not familiar enough with it to judge. I do know that the Snap store is centralized, which probably doesn’t help matters with perception.
  • A tarball; let the user figure out how to deal with it. Very simple, but it punts the burden to the user, so it only makes sense if you’re dealing with fairly advanced users. Except usually you’ll also have to provide an install script (maybe making it a self-extracting package), and the user often ends up picking up the pieces anyway, like writing launchers to integrate it with the desktop.

Note that I haven’t mentioned using distribution package management for your application. Shipping RPMs or Debian packages is indeed a common choice, but there are a lot of caveats that make it hard to recommend. You’re going to be targeting a narrow subset of distros, and you have to worry about distro dependencies even more than usual. (As a concrete example, Discord suffered when Debian dropped the libappindicator1 package.) Supporting multiple versions, let alone upgrades, may be harder. It gets even worse if you run a repository; PPAs have a notorious reputation for breaking systems for a reason. Ultimately, it depends on the extent to which you rely on package management and package distribution – it might end up being sustainable in some scenarios.

I think the paralysis of choice is obvious at this point. Choice is great for flexibility, for technical users and for entities wanting to define their own platform (sides have been drawn on whether to use i.e. AppImage or Flatpak), but as a developer (or user) dealing with what are essentially multiple operating systems with only low-level things in common, it’s quite confusing to try to find the common subset. It’s unfortunate that a lot of the messaging around Linux has focused on it being one single target; arguably it is in embedded and server, but not so much on the desktop.

“What’s in it for me?” – Or, “why play the game?”

Looking at the bewildering options and issues in the Linux ecosystem, it’s no wonder people might choose not to participate at all. Even Valve, while still aggressively going forward with Linux, pulled back a bit from native applications in favour of investing in Wine. It’s controversial, but if I were to develop a desktop application targeting Linux desktops, I would strongly consider shipping the Windows version in Wine. For games, it seems even more obviously fine unless significant optimizations are on the table, though Valve doesn’t have much messaging about when native is preferable over Proton.

After all, if Wine is nowadays well funded, well developed, and supporting your application, why go native at all? For games, it’s likely you’ll never notice otherwise, and for desktop applications, Linux users tend to be less picky about any possible UX issues induced by Wine. They’ll tolerate a lot of flaws if it means they can have your application at all. Not to mention the deep library of existing applications they can run as well – there’s something to be said for bringing 25 years of backwards compatibility to another OS.

1: I wrote about this sad story before if you’re interested in juvenilia. The writing is sophomoric, but I stand by my opinions then. Of course, the game changed with Proton later.

22 thoughts on “Win32 is the stable Linux userland ABI (and the consequences)”

  1. Sok Puppette February 27, 2022 / 5:59 pm

    Almost without exception, if your software is not in my distro’s package format, or isn’t packaged *properly* in that format, I am not going to install it, period. There is definitely no *game* that could justify dealing with that.

    flatpak is not an option. AppImage is not an option. Containers are almost never an option. Emulating Windows is ALL KINDS OF NOT AN OPTION. Install from source is maybe sort of an option if I’m very desperate.

    Furthermore, if your software isn’t in my distro’s *native repositories*, and there is any remotely usable alternative that is in those repositories, then that alternative is going to get used.

    The biggest reason for this is that I want to know what’s running on my system, and you packaging old, probably insecure versions of libraries is exactly what I do *not* want. Another reason is that I don’t want to install software that I can’t trivially uninstall, or that creates files I can’t trivially identify as belonging to that software.

    • Curious citizen February 27, 2022 / 7:31 pm

      How much would you be ready to pay for games packaged to your requirements?

      • Sok Puppette February 27, 2022 / 8:42 pm

        I don’t really play games much. But if I already wanted to buy a game *or any other software*, and I had an alternative that was equivalent in every way *except* for packaging, I would pay up to maybe 50 bucks or 10 percent of the software price, whichever was less, without even thinking about it. And I would think about paying more.

        If I’m buying $20,000 per seat CAD software or something, then I expect proper packaging for free.

        • Sok Puppette February 27, 2022 / 8:44 pm

          Oh, and by the way, once again, I have frequently done completely without both commercial and open source software that I would otherwise have used (and paid for in the commercial case), because I didn’t want to take on the administrative or cognitive load from crappy packaging. Something has to be really important for me to take that on at all.

    • Every Software Company February 27, 2022 / 8:21 pm

      We are thrilled to *not* have you as a customer.

  2. Daveee February 27, 2022 / 8:33 pm

    Although I agree with the thrust of your argument, I have only begun the read and you got a key fact wrong. “SteamOS” v3 is based on Arch, not Ubuntu.

    You might want to change that, it hurts credibility of an otherwise good post.

    • cb February 27, 2022 / 8:57 pm

      The Steam runtime, as far as I’m aware, was still Ubuntu based even when SteamOS was on Debian (and now Arch).

      • James February 28, 2022 / 6:29 pm

        There are several different versions of Steam Runtime. The first one was based on Ubuntu 12.04 with some dependencies backported. Newer versions are based on Debian 10 and (soon) Debian 11.

  3. sam February 28, 2022 / 1:12 am

    A consequence of the main point of the post is that you can’t actually change the runtime easily without breaking everything. So I think you may have just misunderstood.

    AFAIK the runtime is still Ubuntu 12.04 or w/e based.

  4. fgsfds February 28, 2022 / 7:28 pm

    I think there are two serious problems, and one annoying (but fixable without rewriting the world) problem, with ABI compatibility on “native” Linux:

    1. No two-level namespaces. Windows has had something like this since forever and OS X got support for this in 10.1. Without this I don’t know how you can reliably distribute binary-only plugins (e.g. VSTs or AUs) that themselves have dependencies on other libraries without inventing your own shared library format and somehow getting every developer in your ecosystem to buy in. A Collabora employee made libcapsule as an attempt to work around this using dlmopen – because apparently games can ship their own copy of libstdc++, which causes problems when the user-mode shared object component of an OpenGL driver also depends on a totally different copy of libstdc++ – without solving the actual underlying problem with ELF dynamic linking. It looks pretty gross. Classic Windows “DLL hell” has nothing on this.

    2. HTTPS. If you’re careful, you can depend on just a handful of symbols from:
    a. libc (to get access to dlopen and a common ABI for thread control blocks/TLS/etc.),
    b. libxcb (because of Vulkan), and
    c. libX11 (because GLX depends on Xlib datatypes e.g. Display).
    These don’t change very much. You can then bundle your own copy of Qt or GTK or whatever and probe at runtime for libpulse, libjack, and libasound. With enough effort and care you can make a binary that can run on either a glibc or musl system if you just patch the ELF interpreter path in your binary. But if you ever want to use HTTPS you’ll need to deal with OpenSSL in some way, and the ABI for that changes all the time. You can bundle your own TLS implementation but you’ll need to access system-wide certificates somehow; I’ve experimented a bit with temporarily loading a “system” copy of OpenSSL to fetch the certificate directory with “X509_get_default_cert_dir”, but it feels fragile.

    3. Every C and C++ toolchain for Linux is trying to hurt you. The defaults are insane. GNU LD doesn’t do identical code folding and it’s mega slow. Dead code elimination requires adding yet more flags to your compiler/linker invocation. You’re bludgeoned with symbol interposition, lazy symbol binding, PLTs, versioned symbols, and a bunch of other esoteric and slow junk that approximately nobody has ever needed when shipping real-world client facing software for over 20 years on Windows and OS X. You can opt out of most of this, but there’s no low-effort way to explicitly depend on older symbol version tags. And patching an ELF binary after the fact to remove or rewrite symbol versions is kind of a pain.

    (Tangent on audio ABIs: OSSv4 has a great ABI that doesn’t depend on any of this shared object misery. It’s too bad that CUSE never took off.)

    • Jackson March 1, 2022 / 10:50 pm

      I’m a C++ dev on Linux, and I usually (for my game) just pass -Ofast -flto=32 as flags. Doesn’t this pretty much enable all the optimization? I’m adding these flags in CMake. Honestly, though I know what it does, I’ve never touched the linker directly. How are these issues for most devs?

      And isn’t “produce the executable ASAP” what you want during your dev builds, aka 99% of your compiles?
