Software Bloat: The Red Queen’s Race

VP of Operations, Opreto

6 minute read

For decades, computers have been growing in power at a meteoric pace, and the army of programmers writing software for them is now twenty-eight million strong. So why does it feel like the applications we use every day—not even the brand new, bleeding-edge stuff, just basic things we’ve had forever, like word processors and e-mail clients—are slower and clunkier than ever?

“Well, in our country,” said Alice, still panting a little, “you’d generally get to somewhere else—if you run very fast for a long time, as we’ve been doing.”

“A slow sort of country!” said the Queen. “Now, here, you see, it takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!”

By 1987, the distinction between microcomputers and their larger cousins was beginning to blur, as the micros became increasingly capable. Geoffrey Welsh opined in the Toronto PET Users Group newsletter that the line could be drawn at the new generation of micros, which were arguably too complex for programmers to learn to code for in assembly language. As had long been the case with minicomputers and mainframes, programmers would instead code in higher-level languages, made possible by the increased computing power.

Fast forward a few years. It is the nineties, and what Andy giveth, Bill taketh away. Intel’s processors increased massively in power throughout the decade, yet Microsoft’s dominant operating system always found a way to take up the headroom. Thinking back, my Pentium II running Windows 98 toward the decade’s end didn’t feel any faster than my i386 running Windows 3.0 at its start (if anything, the opposite).

Today, as predicted by Niklaus Wirth in a famous 1995 article, computers take longer than ever to boot, and applications are at peak sluggishness even on state-of-the-art hardware. I have to buy a new phone every few years just to get back to bearably responsive functional parity with the old one. Despite the continual churn of new and exciting programming technologies, we perpetually live with stage-four software rot. Everyone experiences it, nobody denies it, and everyone just sort of accepts it.

The typical laptop or smartphone of 2023 is absurdly powerful in comparison to the typical microcomputer of 1987. There is no perfect apples-to-apples benchmark, but we are talking on the order of a million times the processing horsepower (without even touching the GPU), memory capacity, bulk storage capacity, and data throughput, roughly in line with Moore’s law.

I will say that again. Remember the best home computer you could buy in 1987? We have computers we can carry around with us now that are a million times more powerful.
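
As a rough sanity check on that figure (my own back-of-the-envelope arithmetic, not a benchmark), here is what straight Moore’s-law doubling predicts for the 36 years between 1987 and 2023, under the two commonly quoted doubling periods:

```python
# Back-of-the-envelope check of the "million times" figure: cumulative
# Moore's-law doublings from 1987 to 2023, assuming capability doubles
# every 18 to 24 months (the two commonly cited periods).
years = 2023 - 1987  # 36 years

for doubling_period in (1.5, 2.0):  # doubling period in years
    doublings = years / doubling_period
    factor = 2 ** doublings
    print(f"doubling every {doubling_period:g} years: "
          f"2^{doublings:g} = {factor:,.0f}x")

# doubling every 1.5 years: 2^24 = 16,777,216x
# doubling every 2 years:   2^18 = 262,144x
```

A factor of a million sits comfortably between those two estimates.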

How did we manage to squander six orders of magnitude and actually make things slower? Obviously, we have applications in 2023 that do things that were infeasible or impossible in 1987, and a wealth of new features we could only dream of back then. But with a million times the computing power at our disposal, couldn’t we have spared a bit to make things feel snappy?

I’ve written before about some of the market forces and business practices I think contributed to modern software becoming so grotesquely inefficient and inexcusably unstable. Of course, more functionality requires more software complexity, which in turn requires more computing power, but prioritizing features over efficiency and stability leads to avoidably poor results on both counts.

We can reframe this situation more generally in terms of the modern reality that hardware is cheap and programmers are expensive, which is probably the most common answer to the FAQ “why is modern software so bad?” This lets us break the vague notion of software getting “slower” or “worse” over time—that is to say, for the purposes of this article, consuming more computational resources, though we could treat stability similarly—into three distinct categories.

First, we have the minimum amounts of extra CPU cycles, memory use, and so on that are theoretically necessary to add some piece of functionality to software. Though these numbers are academic and practically unknowable, the fact that they exist and are generally not zero pardons some portion of the bloat. However, as the advancing frontier of the demoscene illustrates, this portion is likely very small in most cases.

Second, we have the overhead introduced by abstractions whose primary purpose is to simplify the programmer’s work. This includes pretty much everything that supports the system’s application software and allows it to be written as something other than a bare-metal assembly monolith: operating systems, hardware drivers, network stacks, high-level language compilers and interpreters, libraries, frameworks, middleware, runtime environments, database servers, virtualization, containers. Other than in the smallest embedded systems, this menagerie has become fiendishly complex over the years. Even if each layer is implemented with perfect efficiency, it is a matter for debate how much of the resultant overhead is an acceptable trade for programmer time.
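
To make the second category concrete, here is a deliberately contrived Python sketch of my own (the Counter class and the two timed functions are illustrative, not drawn from any real stack): the same arithmetic performed twice, once through a C-implemented built-in and once routed through a single thin layer of object-oriented convenience. Real systems pile up dozens of such layers.

```python
# Toy illustration of abstraction overhead: identical work, with and
# without one extra layer of programmer convenience.
import timeit

N = 1_000_000

class Counter:
    """A thin object-oriented wrapper around an integer."""
    def __init__(self):
        self.value = 0

    def add(self, amount):
        self.value += amount

def bare():
    return sum(range(N))    # the loop runs inside the C-implemented built-in

def wrapped():
    c = Counter()
    for i in range(N):      # every step pays for an interpreted loop,
        c.add(i)            # a method call, and an attribute lookup
    return c.value

assert bare() == wrapped()  # same result either way

for fn in (bare, wrapped):
    print(f"{fn.__name__:>8}: {timeit.timeit(fn, number=5):.3f}s")
```

On a stock CPython interpreter the wrapped version is typically well over ten times slower; whether that cost is worth the programmer time such conveniences buy is exactly the matter for debate.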

Third, we have all of the additional overhead that comes from software components being just plain bad at what they’re supposed to do. Even granting one particular architecture of abstraction layers and interfaces, the implementations of each piece in the hierarchy are unlikely to be anywhere near perfectly efficient. Optimizing code is a time-consuming programming task, and there are things code should be other than optimal (e.g., comprehensible), so here again we have a matter for debate.
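
The third category is easiest to see at the scale of a single function. In the sketch below (again my own toy example, not taken from any particular codebase), both versions remove duplicates while preserving order, and both are perfectly readable, but the first does quadratic work for what is fundamentally a linear-time job:

```python
# Two functionally identical de-duplicators; the first is accidentally
# quadratic because membership tests against a list scan it end to end.

def dedupe_slow(items):
    seen = []
    out = []
    for item in items:
        if item not in seen:   # O(len(seen)) scan on every iteration
            seen.append(item)
            out.append(item)
    return out

def dedupe_fast(items):
    seen = set()
    out = []
    for item in items:
        if item not in seen:   # O(1) hash lookup on average
            seen.add(item)
            out.append(item)
    return out

data = list(range(5_000)) * 2
assert dedupe_slow(data) == dedupe_fast(data)
```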

I have no quantitative metrics by which to evaluate any particular piece of software on this basis, nor any specific prescriptions around how and where to prioritize making code more efficient (or stable), but to express my sense of the general situation I’ll just quote Nikita Prokopov: where we are today is bullshit.

The burgeoning retrocomputing hobby is presumably driven by nostalgia, but do we feel nostalgic for the specific systems, or for a generic computing experience that doesn’t feel totally disconnected from the computer by a soaring tower of abstractions? To me, retrocomputing is an escape from the ennui of modern software, in the same way that participating in the Society for Creative Anachronism is an escape from the ennui of modern life. The first computers I recall using were the Commodore 64 and the IBM PC/AT, but I built a Z80-based RC2014 to get a little closer to the “pure” microcomputer experience.

There are still practical business niches for Real Programming, as Welsh styles it. Many embedded applications demand reliable behaviour and predictable timing, which ultimately translates into coding closer to the metal. There are typically some abstraction layers, perhaps a C compiler and FreeRTOS, but requirements (on reliability, safety, etc.) and their verification create a natural bulwark against bloat. The state of affairs on our PCs, tablets, and phones simply wouldn’t be tenable in this world. In effect, whatever is driving these requirements, regulatory or otherwise, forces good engineering in a way that the familiar consumer market does not.

Alice never could quite make out, in thinking it over afterwards, how it was that they began: all she remembers is, that they were running hand in hand, and the Queen went so fast that it was all she could do to keep up with her: and still the Queen kept crying “Faster! Faster!”

Can ordinary, non-embedded software be better? Some people think so. We’re accustomed to the local optimum, and it’s a Herculean lift to break free, but all programmers are individually motivated to step up their game. I want to see the software engineering discipline earn back the engineering part of its name. If we can make understanding the system at all levels part of the culture again, in the same way that we’ve collectively internalized things like design patterns and clean architecture, maybe we can take the shackles off our computers, and quit driving Ferraris at bicycle speeds.

That is, if using ChatGPT to do basic arithmetic doesn’t become the new normal…
