James's Blog

Sharing random thoughts, stories and ideas.

Technological Ideologies

Posted: Jan 2, 2021
◷ 8 minute read

Software is going to shit, apparently. From Robert C. Martin’s Clean Code to Casey Muratori’s lecture on The Thirty Million Line Problem, to the even more dramatically titled Preventing the Collapse of Civilization talk by Jonathan Blow, it seems that at least some experienced and passionate engineers believe this to be the case. We don’t make software like we used to: our standards are worse than ever - we routinely commit crap code and don’t refactor; our users’ expectations are lower than ever - nobody expects software to “just work” anymore, and everyone simply shrugs off bugs; complexity is more bloated than ever - the “Thirty Million Line Problem” refers to the roughly thirty million lines of code (the Linux kernel alone is on that order) that one must go through even for something as simple as displaying a line of text on the web. I agree that this is mostly true. They are right about the problems, but I think their diagnosis is off.

What are the most recognized causes of these problems in software technology? Here are some of the key ones that I’ve seen and synthesized:

  • The continued improvement in hardware speed (Moore’s law) has made it less necessary to write efficient software. Most progress in software technologies in the past few decades has come from software engineers riding on the progress made by hardware engineers
  • The ever-increasing drive for and reliance on higher levels of abstraction in programming have caused most engineers to lose knowledge of the lower layers (what percent of JS developers today know what cache coherency is?). People no longer understand how things work at the fundamental levels, and so can only resort to piling even more things on top of the already messy heap to move forward, perpetuating the vicious cycle
  • The lack of a formal guild in software - something near-universal in other fields of engineering - means that there is no ubiquitous education on the standards, quality, and ethics of software work. There is nothing to stop or slow the inevitable decline in the average level of competence of the field as more people join
  • Most users don’t value the quality, speed, efficiency, reliability, and simplicity of software, at least not as much as the cost to achieve them, and so businesses don’t prioritize them. Instead, they focus on time-to-market, marketing, and sales

While I think all these points have some validity, I don’t believe that the decline in software quality over time stems mainly from intellectual negligence by software engineers. Of course, many have been asleep at the wheel and definitely contributed to the problem. But there has also been a lot of great innovation in software technologies in the last few decades, as many skeptics of the “software is in decline” narrative correctly point out. A more careful reflection reveals that these innovations are perhaps not evenly distributed, which causes this dissonance of viewpoints: depending on where you look for progress, things range from limited gains or even regressions to massive leaps forward.

The last of the four points above hints at what I think is the main issue, though it isn’t exactly driven by what the users value, but rather by what the businesses want.


In The Thirty Million Line Problem, Casey notes that the core functional requirements of most software (e.g. word processors, operating systems) have not changed much over the past few decades, yet the complexity of these programs, including what’s required to run them, has grown beyond control. This is of course somewhat of an exaggeration, which Casey also acknowledges. Most modern programs do more than their older counterparts; the point is that the increase in complexity far outpaced the increase in functionality. But even so, I still think this is a woefully inaccurate characterization. Yes, the surface-level (i.e. user-facing) common use cases indeed haven’t changed much for a lot of software, but the underlying nature of most commercial software is completely different today. These programs are no longer the mere tools they were years ago; instead, they have become complex instruments of continuous user value extraction for the businesses that created them. Regardless of how functionally similar the user-facing parts appear, a word processor or operating system today is nothing like its ancestors. The piece exposed to the end user, which used to be the main and only component, is now just the tiny tip of a massive iceberg. Most of the complexity now lies in the body of the iceberg, below the surface and hidden from view.

I think that most of the resources, effort, and attention in the software industry in recent years have been directed at a specific narrow class of problems, to the detriment of others. I’m usually not a fan of technology buzzwords, but in this case they do help indicate where recent innovations have been concentrated. The Cloud, NoSQL, NewSQL, High Availability, Eventual Consistency, Infrastructure as Code, DevOps, Chaos Engineering, Serverless - these are some of the hottest new developments of the past couple of decades. The list goes on, but the theme is clearly unified. They are all solutions to facilitate the instrumentation, collection, and processing of large amounts of data by a centralized entity, for the sole purpose of efficiently extracting value from users.

So when taken as a whole (and not just the parts exposed to the end user), the word processor or operating system of today is far more complex than the ones from the past. Some of that complexity undoubtedly comes from bad engineering, but most of it exists because this software now needs to perform a completely new and different set of tasks. The end user experience didn’t change much, and arguably got worse. However: every action performed by users is tracked and reported to a central location (which means everything must be networked); all the collected data is analyzed to inform how the software will be changed in the future (which requires large processing systems independent of the main software); unintended usages of the software (e.g. ways that lower the potential value extracted from it) are prevented. And most important of all, this is done simultaneously for millions or billions of people, while maintaining the illusion that each person is the only user - the illusion of the simple program of yesterday - which requires complex consensus algorithms and fault tolerance across large networks of machines.
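To make the first of those tasks concrete, here is a minimal sketch of the kind of client-side telemetry that a modern application embeds alongside its core functionality. Everything here is hypothetical and illustrative - the class, endpoint, and event fields are my own invention, not from any real product - but the shape is typical: every user action becomes a structured event tied to a session, buffered locally, then shipped in bulk to a central collector.

```python
import json
import time
import uuid


class TelemetryClient:
    """Buffers user-interaction events for periodic upload to a central server.

    Purely illustrative: a real client would batch on a timer, retry on
    failure, and POST the payload over the network.
    """

    def __init__(self, endpoint: str):
        self.endpoint = endpoint              # central collection URL (illustrative)
        self.session_id = str(uuid.uuid4())   # ties all events to one user session
        self.buffer = []                      # events awaiting upload

    def track(self, event: str, **properties):
        # Every user action becomes a structured, timestamped event record.
        self.buffer.append({
            "session": self.session_id,
            "event": event,
            "ts": time.time(),
            "props": properties,
        })

    def flush(self) -> str:
        # A real client would POST this payload to self.endpoint; here we
        # just serialize it to show what would leave the user's machine.
        payload = json.dumps(self.buffer)
        self.buffer = []
        return payload


client = TelemetryClient("https://telemetry.example.com/v1/events")
client.track("document_opened", name="essay.txt")
client.track("key_pressed", count=42)
print(client.flush())  # the JSON batch a central service would receive
```

Note that none of this code serves the user directly - it exists so the business can observe the user - and yet it drags in networking, serialization, session management, and (on the server side) the whole scalable-ingestion stack described above.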

If you are looking specifically for progress in the core functionality of software, there doesn’t appear to be much - certainly not enough to justify the increase in complexity. But if you look at what has been created for what it truly is, at the innovation and ingenuity it took to get here, then a lot of progress has indeed been made, at least in a narrow range centered on ideas like Scalability and Big Data. This is where the key problems of the largest players in the tech space (the FAANGs) reside, and solving them is a financial imperative for these businesses. This class of problems is what a significant portion of the industry has been working on in recent years, and things have improved here much more than elsewhere.

Over time, I think the prevailing narratives, or ideologies, of the tech space shifted accordingly. In part this helps give the people who work on these problems purpose and meaning, while also serving as a powerful recruitment mechanism. According to the prevalent technological ideology, the class of problems concerning centralized scaling became the coolest and most worthwhile for the intellectual elite to work on. If you don’t have millions of users, aren’t scaling across thousands of containers on multiple cloud providers, and aren’t regularly ingesting petabytes of data, then you are an uninteresting nobody.

The side effect is that, like all ideologies, once they permeate, they steer the minds of the people who live within them and blind them to the outside. In this case, the “outside” is everything not related to large-scale systems, where things are seen as uninteresting and have not been improving as fast. Perhaps it isn’t a coincidence that Casey Muratori and Jonathan Blow are both indie game developers, from a world quite distant from that of the FAANGs and uninfluenced by these ideologies. Conversely, maybe they are also somewhat blind to the work that has been done.


The natural tendency of technology is to decline, as Elon Musk once pointed out. Things only get better when a large number of people devote a lot of their time to making them better, so to neglect something is to let it rot. In recent decades, our ability to scale systems and extract value out of software has shot through the stratosphere, while the other aspects of technology have been comparatively ignored. Maybe the rest of software, especially for the end user, is going to shit.

Many have come to the realization that a significant portion of our brightest minds today are focused on getting people to click on ads, and that maybe this isn’t the best use of that cognitive capital. But at least people generally agree that the tech itself is cool and useful. In a sense, this idea of broader technological ideologies steering the industry’s thinking takes that realization a step further. It’s not just about ad-tech or surveillance capitalism - what the tech is being used for today. Are these innovations themselves, in scaling, really that useful outside of centralized value extraction?

Our problem solving attention is being directed to a fairly narrow set of problems - in which we have made and are making tremendous progress - and it’s causing us to ignore the rest. Other interesting problems, ones that aren’t directly about scaling large systems, but could have transformative value for individuals, are being underserved as a result. To broaden our attention and balance out our efforts, we can start by questioning what we’ve been made to believe.