Does Microsoft Have a Boeing 737 Max Style Crash Every Week?
Welcome to BIG, a newsletter about the politics of monopoly. If you’d like to sign up, you can do so here. Or just read on…
One of the best things about writing this newsletter is that it is becoming a bit of a community. The feedback and ideas you send me are essential to keeping this going. Over the past few days, I’ve spoken to doctors, booksellers, aerospace experts, and cheerleading professionals and parents, all of whom are angry about the increasing concentration of control across nearly every industry in the economy.
Yesterday I was on the excellent TV show The Rising to discuss JP Morgan CEO Jamie Dimon’s comments at Davos, in which Dimon dismissed the anger we all feel. As I was getting mic’d up, one of the producers told me how his daughter was hurt in a cheer competition. Meanwhile, Varsity flacks are combing gym message boards for negative comments, and now I’m hearing from officials at both the state and federal levels who are interested in the problem.
Today I have a guest piece by software professional William Wechtenhiser. Wechtenhiser will describe the perilous state of the consumer software industry, which looks a lot like the pre-2008 banking sector. And he’ll describe the policy changes necessary to get us back to being creative tinkerers and honorable business professionals using the remarkable digital technology with which we are blessed.
So here we go.
Does Microsoft Have a Boeing 737 Max Style Crash Every Week?
My name is William Wechtenhiser. I’m a software engineer, and I want to tell you a story about how monopolies and financialization affect my industry, and through my industry, all of you. First a caveat. I love software, what we can do with code is the closest thing to magic that exists. We each carry a small supercomputer with unimaginably powerful sensors in our pockets. With the right policies, we could have a flourishing paradise of liberty and improve our lives in remarkable ways.
But that isn’t what we have. Instead, the policy choices of the last thirty years seem to have led to Boeing 737 Max-style crises, everywhere, but out of sight. Don’t be bummed out, because as scary as this stuff sounds, it’s all fixable, mostly by doing a lot of the things Matt tends to write about.
Here’s the story.
Last week, Microsoft announced a vulnerability in its Windows software, in an anodyne note. Most of us didn’t notice. Why would we? Dozens of updates for Windows packages are released every month, including fixes for serious security vulnerabilities. If we use Windows, we just hit ‘Update’ and keep going. To the extent antitrust enforcers noticed, they would just think: Windows is an old technology, it doesn’t matter, and there’s cooler stuff Google is doing anyway.
In Maryland, the National Security Agency had a very different notion of how serious this breach was, and of how much Microsoft’s control over our computer systems matters. In fact, it was so serious that it was the NSA that disclosed the existence of the issue to Microsoft. This was the first time Microsoft had credited the NSA for a reported security vulnerability. In other words, in cyber speak, it’s a five alarm fire.
The potential impact of this issue is virtually unlimited in terms of compromising computers running Windows. And Windows really matters. There are over a billion systems running Windows, and Microsoft’s More Personal Computing division, which includes Windows, had $45 billion of revenue in 2019. Windows is one of the most important software products in the world.
What were the consequences of this breach? We don’t know yet. Windows is embedded into an enormous number of systems, and now each of them may be vulnerable, but each hack will look different, and seem unrelated to Windows.
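For readers who want a feel for what this kind of flaw looks like, the NSA-reported bug (later catalogued as CVE-2020-0601) involved Windows’ certificate-validation code trusting cryptographic parameters supplied by the attacker, so forged certificates could pass as legitimate. Here is a deliberately toy Python sketch of that class of bug; the HMAC stand-in, the dictionary “certificates,” and all the names are illustrative inventions, not how Windows actually works.

```python
import hashlib
import hmac

TRUSTED_ROOT_KEY = b"root-ca-secret"  # stand-in for a root CA's signing key

def sign(subject: str, key: bytes) -> str:
    """Toy 'signature': an HMAC over the certificate subject."""
    return hmac.new(key, subject.encode(), hashlib.sha256).hexdigest()

def validate_strict(cert: dict) -> bool:
    # Correct behavior: verify against OUR copy of the trusted root key,
    # ignoring whatever key material the certificate itself claims.
    return hmac.compare_digest(cert["sig"], sign(cert["subject"], TRUSTED_ROOT_KEY))

def validate_buggy(cert: dict) -> bool:
    # The bug in miniature: trust the key the certificate supplies
    # (analogous to accepting attacker-chosen curve parameters) and
    # "verify" the signature against that.
    return hmac.compare_digest(cert["sig"], sign(cert["subject"], cert["claimed_key"]))

# A legitimate certificate signed by the real root:
good = {"subject": "update.example.com", "claimed_key": TRUSTED_ROOT_KEY,
        "sig": sign("update.example.com", TRUSTED_ROOT_KEY)}

# A forgery self-signed with an attacker's key:
forged = {"subject": "update.example.com", "claimed_key": b"attacker-key",
          "sig": sign("update.example.com", b"attacker-key")}

print(validate_strict(good), validate_strict(forged))  # True False
print(validate_buggy(good), validate_buggy(forged))    # True True
```

The buggy validator happily accepts the forgery, which is why a flaw like this lets an attacker impersonate software updates or websites to any affected machine.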
Microsoft is not alone among large software vendors in routinely exposing users to severe issues. I mean, Mark Zuckerberg’s slogan is “Move Fast and Break Things,” and his mentor is Bill Gates. Apple, Google, Facebook, Amazon and others have also suffered vulnerabilities, outages and failures, as have the protocols and implementations our technology depends on, such as SSL and Bluetooth. As a software professional, I am routinely amazed at society’s tolerance for this level of failure in the major systems we all depend on in our daily lives.
So when I say Boeing plane crash, I’m using a metaphor, but it’s closer to real than you might imagine. The problems I’m going to detail that led to this breach are rife throughout all of software development. I mean, even Boeing’s 737 Max catastrophic failures came from a software patch!
While the Windows breach probably didn’t affect your machine, it could affect your life. Breaches of Windows or other underlying operating systems, or just bad software development practices, regularly cost time, lives and money. Sometimes it’s inconvenient, like when airline reservation systems go down and cancel or delay thousands of flights. It can be annoying, like in 2016 when AirBnB, Paypal, Twitter and Reddit went down due to an attack by infected ‘Internet of things’ devices, or when hackers took control of tornado sirens near Dallas in 2017 and switched them on and off between 2:30 and 4:00 AM one morning.
Sometimes it leads to identity theft after hacks of financial institutions; there were 3,494 successful cyber-attacks on financial institutions in just the first six months of 2019. It can cost lives; in 2017, 16 hospitals in England shut down all non-critical operations because of a ransomware hack, as did utilities in Spain. And it’s a critical national security problem, with Windows XP embedded deeply in, say, nuclear submarine control systems.
The risk is hidden and thus the problem seems out of sight. But that just means we’re in the early part of the theme song from Jaws.
Why did this breach happen?
To regulators, these companies say that they take security and quality very seriously, but that it is unreasonable to expect them to maintain millions of lines of code without serious issues cropping up from time to time.
To the public, industry leaders have created a narrative about why these software problems exist. And that is, bugs and errors are just a fact of life. Software is complicated and involves math and code and sorcery, and you’re not an engineer and Bill Gates is.
One very popular line is that being more careful with code means that creating software “won’t scale.” Such an argument is also, not coincidentally, used by YouTube and Facebook to explain why they can’t/won’t organize content except through automated addiction algorithms. When they say “it won’t scale,” what they mean is “it’s not profitable.”
Essentially, software companies presuppose it is unreasonable to bear an increase in the costs of providing a service as more people use it, or as they add more features. Of course, most businesses “don’t scale” in this sense – if a car company sells more cars in more countries, it will encounter more situations with different weather and traffic conditions and will have to modify its products accordingly. If a car company doesn’t, and people die due to product defects, it is sued and has to pay the costs. Internally this creates a culture where engineers have power over safety decisions, even if executives wish they didn’t.
But software executives can boss software engineers around, because we don’t treat these executives like they are building critical systems. Executives are used to public deference based on the technical jargon they use to portray themselves as sorcerers, and they believe their margins should increase as they sell more software, instead of recognizing that costs increase as scale and scope do.
The argument that safety doesn’t scale is also self-evidently absurd. Isn't Big Tech in the business of finding scalable solutions to these kinds of problems? If software quality is a high priority for them, but potentially expensive, we would expect to see massive investments in methods, culture, people and innovation. That's what companies do when they have big, expensive problems, right?
According to CVE Details, Microsoft disclosed an average of 225 security vulnerabilities per year between 1999 and 2014. What did Microsoft do to address this? It dismantled its testing processes and then, when this predictably led to a really bad day, decided to stagger its releases so that users could do more of the testing the company was no longer doing itself. As a result, the average number of vulnerabilities has increased to 627 per year in the five years since. Microsoft looks exactly like a company run by financiers focused on short-term gains, with no fear of legal consequences and no competition in the market.
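To put those averages in perspective, here is the back-of-the-envelope arithmetic, using only the figures quoted above and treating 1999–2014 as sixteen years:

```python
# Rough totals implied by the CVE Details averages quoted above.
early_avg, early_years = 225, 2014 - 1999 + 1  # 1999-2014 inclusive: 16 years
late_avg, late_years = 627, 5                  # the five years since

early_total = early_avg * early_years  # vulnerabilities over those 16 years
late_total = late_avg * late_years     # vulnerabilities in just 5 years

print(early_total, late_total)  # 3600 3135
```

In other words, nearly as many disclosed vulnerabilities in five years as in the previous sixteen.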
In other words, this looks exactly like what Too Big to Fail banks did for years before the financial crisis.
How did we get here? And what can we do about it?
When I was young in the industry, Bill Gates was seen as a god. He was not a god at producing safe and reliable software, but at the new field of ‘business strategy.’ His goal was to capture power in the market. Four policy changes catalyzed his corporate trajectory. The first was the antitrust suit against IBM, which caused the then-dominant computer player to license its operating system from Gates’s small company (it didn’t hurt that Gates’s mother served on a charity board with the head of IBM). The second was the end of antitrust enforcement in the 1980s under Reagan’s antitrust chief Bill Baxter and FTC Chair James Miller, which ironically was conveyed most clearly by the dropping of the IBM suit. The third was the ability to copyright software, which Congress enabled in 1976 and 1980.
The fourth was a decision at the National Security Agency to focus not on making computer networks more secure through defensive strategies, but to focus on offensive capabilities. The NSA wanted to be able to hack our enemies when they use our own software, and that meant keeping our software shitty. Essentially the shadow regulator of software security became our spies, and they regulated to ensure there would be more vulnerabilities, not fewer.
But back to the software market itself. Unlike cars or trains, the software industry emerged in a legal environment conducive to monopoly. In the 1980s, the executives ruled, and even though Gates was an engineer, his real job was to be the biggest, baddest executive of them all. He did it through mergers, rebates, and coercive contracts. Matt went into this a few months ago; it’s not really much different from Standard Oil. But in terms of security, and this is where I’m going, this concentration and financialization of the industry created an incentive to push risk onto the public. This happened in other industries too. From the 1980s onward, big banks moved risk onto the public through opaque derivatives, until the great crisis of 2008 revealed the depth of the corrupted business models. In software, it’s happening through these security breaches.
Like most American industries, software today is composed of large corporations focused on financial engineering, mergers and acquisitions, and managed revenue growth. Marketing and "strategy" drive product decisions rather than the reverse. Legal and lobbying machinations take priority over technical innovation. In fact, we are so overexposed to this reality that even the jokes about out-of-touch managers and leaders at tech companies sound stale and overused these days - Dilbert and "Office Space" nailed this entire category of humor decades ago ("Office Space" came out in 1999!).
What’s important to understand is that, while there are always going to be some defects, most of these errors and vulnerabilities are not inevitable. They are not a result of technological problems, they are a result of corrupt business models induced by bad public policy around markets.
Software developers have endless stories about how an executive intentionally sent out an unsafe product. I once argued with my boss about how we shouldn’t be shipping a widely used product that was crashing browsers and draining cell phone batteries. He told me, “Well, it’s not politically feasible to rebuild it.” And that was that. We took the Microsoft path.
Eventually we were barred from the iPhone, not because Steve Jobs was being a good steward but, in all likelihood, because Apple wanted to launch a competitive product. That said, because there was competition, people could turn to a different product. By contrast, Windows has no real competition, so Microsoft can effectively introduce price hikes in the form of security vulnerabilities, without consequence.
Software engineering could have been a practice like other branches of engineering, with ethical codes about safety and reliability. Other professions, particularly those that involve real risks to human life and property, have very clear licensing requirements, including training, testing, certification and continuing education. Structural engineers, as one example, need an engineering degree from an accredited program before being allowed to sign off on the design of an overpass, as well as, typically, four years of supervised work as an engineer in training, and they must pass at least one state-administered exam. Mostly this is because these professions emerged when we had strong public rules ensuring both competitive markets and public interest protections.
By contrast, a software engineer who works on Microsoft Windows or, perhaps, bankofamerica.com is required to ... pass a job interview. And that is assuming that these jobs have not been offshored to a software contractor in India or Vietnam. But in an environment where raw power and only raw power mattered, we moved risk onto the public, instead of reducing Bill Gates’s bank account balance by a zero or two.
While Big Tech is busy cutting costs and denying liability, businesses in other industries are looking to "improve" products or productivity with software "innovations". If it's not fair to say that every company is a software company today, it's close. The managers and software developers working at these companies are generally just less qualified versions of the same people working in Big Tech and are attempting to apply the same methods that are already failing for larger, more focused companies with unlimited resources and more talent.
Making Software Magical Again
Software has become essential to our way of life, and its failures have many real consequences, whether due to security flaws or other issues. In 1877, the Supreme Court in Munn v. Illinois offered a definition of such a service, noting that when a person “devotes his property to a use in which the public has an interest, he, in effect, grants to the public an interest in that use, and must submit to be controlled by the public for the common good.” It’s hard to look at something like Windows and not be compelled by this logic.
There are a number of paths to addressing the crisis of concentration and financialization that has led us to such a vulnerable position. We could start by reversing the policies that caused the crisis in the first place. First, reimpose an assertive anti-merger and antitrust policy, so that as new software platforms emerge there is competition between them for users instead of competition among executives to see who gets to own the whole market. Second, begin real antitrust action, or public utility regulation, of large concentrated platform corporations whose software underpins essential facilities.
Breaking up Big Tech will create companies with more incentive to focus on the quality and reputation of the products they develop. It will also help to prevent (along with enforcement of existing laws) horizontally integrated monopolies from leveraging their positions of dominance in one area to kill threatening startups in many areas. Preventing obviously anti-competitive mergers and acquisitions will force companies to compete on value, whether that be features, quality or price. Sending signals that these and other kinds of anti-competitive behavior will not be tolerated will also open the spigots of investment in all manner of promising small companies. It has become a cliché that startups focus on silly features, but one of the major reasons for that may be that startup companies hoping to tackle ambitious problems (and therefore potentially threatening to Big Tech) simply don't get funded. Investors seem to assume these companies will simply get killed somewhere along the way.
Third, and I’m honestly not sure why this hasn’t happened already, but begin putting together class action lawsuits against corporations like Microsoft so they are liable for bugs in their software. Or change the law so they are directly liable. Nothing will force an executive to listen to an engineer’s concerns like the implicit threat of legal action if they don’t.
Finally, we need a change to our cybersecurity posture. Remember at the beginning of this piece when I mentioned that last week’s reported vulnerability was the first security issue in Windows ever disclosed by the NSA? That is because what national infrastructure we have that is focused on software security and vulnerability is tied up in the NSA, an organization that pursues an almost entirely offensive mission, its mention of cybersecurity in its Mission and Values notwithstanding. It is widely assumed in the security community that the NSA hoards knowledge of unpublished vulnerabilities (known in the field as zero-day vulnerabilities) to be exploited for offensive purposes. This approach leaves all American citizens and businesses exposed to additional risk from potential attackers as a matter of policy. We should flip the presumption and refocus the NSA on defensive operations first. The NSA and our cyber capabilities should be oriented around regulating for a high trust web, not a low trust one.
We shouldn’t lose sight of just how awesome computing and software are and can be. Moore's law has led to unprecedented exponential increases in our capabilities for decades. Software running on these computers can solve any problem for which we can conceive a solution. The only limitation is our skill and the only cost is our time. Technology that is effectively free, grows exponentially and is limited only by human creativity sounds too good to be true. This should have led to the greatest flourishing of productivity and creativity in human history. And it still can - if we can create a level playing field and a system that incentivizes the creation of real value instead of rewarding the legal and accounting tricks employed by financialized oligopolies the world over.
Thanks for the essay, William!
And if you know why there aren’t class action suits around security vulnerabilities, let me know.
Thanks for reading. And if you liked this essay, you can sign up here for more issues of BIG, a newsletter on how to restore fair commerce, innovation and democracy. If you want to really understand the secret history of monopoly power, read my book, Goliath: The 100-Year War Between Monopoly Power and Democracy.