Is Software Abstraction Killing Civilization?
Or: Jonathan Blow is wrong but also right and we still
need to prevent the collapse of civilization
Early 2021
I recently stumbled upon a talk by game development guru Jonathan Blow, about how software abstraction will lead to the end of civilization. Quick summary:
- Information passed on between generations is diluted.
- Practice is better than theory for keeping skills alive.
- Software runs the world.
- Abstraction fosters ignorance about low-level programming.
- If we forget the low-level stuff, civilization will fall apart since we won’t be able to keep vital software running.
It was one of those talks that might at first glance seem perfectly reasonable, not least because similar ideas are regularly discussed in programmer circles by a great number of people. Then you start thinking about what’s said, and all the errors and misconceptions presented make you feel troubled, because it’s both tempting and easy to perpetuate the punchlines without considering their implications. I agree with Blow that it’s important to pass knowledge between generations. Therefore, the veracity of that information is extremely important. I propose that the information given in this very talk about the importance of such information is in fact wrong on many counts. That’s why I, in this rather long text, have tried to examine Blow’s claims in detail. (TL;DR: Someone is wrong on the Internet – man with ample spare time on his hands to the rescue!)
Some examples of collapsed civilizations and artefacts lost in time
I’m not a historian and will not comment on this first part of the talk. It doesn’t matter much, though; my gripe is mainly with the second part.
Five nines
Blow says we used to use the “five nines” (99.999% uptime) metric when selling computer systems. Since his laptop has the habit of rebooting when in sleep mode, it can never be a candidate for 99.999% uptime. This part is true: five nines means a total yearly downtime of just above five minutes. This is presented as proof that we have lost a rhetoric of quality. This part is false.
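For reference, here’s a quick back-of-the-envelope sketch of my own – nothing from Blow’s talk – showing what the various “nines” translate to in yearly downtime. Five nines works out to roughly five and a quarter minutes per year:

```c
/* Back-of-the-envelope downtime budget for a given availability level. */
#include <stdio.h>

int main(void) {
    const double minutes_per_year = 365.25 * 24 * 60;      /* ~525960 minutes */
    const double nines[] = { 99.0, 99.9, 99.99, 99.999 };  /* "two" to "five" nines */

    for (int i = 0; i < 4; i++) {
        double downtime = minutes_per_year * (1.0 - nines[i] / 100.0);
        printf("%8.3f%% uptime -> %9.2f minutes of downtime per year\n",
               nines[i], downtime);
    }
    return 0;
}
```

Run it and the last line comes out at about 5.26 minutes – the figure those long contracts are actually written around.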
In reality, five nines usually applies to things like emergency response switchboards (911), hospital systems, financial transaction processing, etc. It’s also a metric usually accompanied by long contracts detailing various downtime scenarios that couldn’t possibly count as downtime. It has never, ever been used to sell consumer laptops or the word processors we run on them.
It’s also incorrect that five nines isn’t used anymore: several companies, such as IBM and Amazon, sell such systems and services. Incidentally, IBM actually goes further and claims their Power9 platform can provide even higher availability. Provided, I’m sure, the network used to reach it is functioning and the client computers are working and, and, and… you get the point.
So, no, we haven’t lost this rhetoric of quality. We’ve simply never used it the way Blow claims.
Oh, and in case anyone’s wondering, my laptops never reboot unless I tell them to and one of them currently has an uptime of 55 days.
“The industry hasn’t produced robust software for decades”
What is robust? Is it my iPhone, going for weeks and months without a reboot? Is it the famous uptime of Novell’s file and printer servers, with documented instances running for 16 years? Is it the multi-year uptimes of a plethora of Unix, Windows and VMS machines? Is it the 365+ days (and counting) uptime of some random Linux web server I’ve got access to? Is it turnkey systems like IBM i, offering continuous availability?
Apparently none of the above, according to Blow.
“Tech companies are no longer about pushing tech forward”
Well, yes and no. I agree that the lifestyle app startups of Silicon Valley (and elsewhere) are hardly about pushing tech forward – but people more interested in making money than in amazing technology have always been around. Was Infocom’s umpteenth text adventure about pushing tech forward? Was any one of the ten bazillion (a rough estimate) jumpy-shooty platform games produced? Was MS-DOS, single-tasking its way through 15 years of building Microsoft’s empire right up until its last breath in version 6.22?
There’s a lot of exciting stuff happening on the “hard tech” front. Blow does mention machine learning earlier in his talk, but “boring” things like file systems, web servers, databases and programming languages are also continually developed and improved upon. A lot of effort is also going into improving the various layers of abstraction, virtualization and containerization Blow is so averse to – but his disliking them doesn’t mean they’re not about pushing tech forward.
“Abstraction leads to loss of capability.”
Blow lists a number of programming languages, some of which have fallen out of fashion because of abstraction. His take basically boils down to learning assembly coding, memory management and pointers.
I agree that a large number of programmers today (myself included) are very happy with not having to deal with things like memory allocation and pointers. And yes, there are certainly horrifying examples of when abstraction gets the better of us. Plenty of web sites use a lot of completely unnecessary JavaScript framework voodoo to render a simple blog and plenty of supposed “desktop apps” are really running (very slowly) inside a bundled browser.
Despite that, I’m willing to bet a few bucks there are more people around today (including youngsters) who can program in C than ever before, and that more C and assembly code is being written than ever before. Linux and NetBSD, for example, are continuously being ported to pretty much everything that even vaguely resembles a CPU. Rust, a new language with a heavy focus on robustness, does feature pointers and lets the programmer manage memory. Harvard’s introduction to Computer Science, CS50, is publicly available on YouTube and each year it features a two-hour lecture on stuff like memory layout, pointers, malloc() and free().
I agree that the things mentioned here are good to know about and I think anyone seriously into programming should at least try them out some time, if for no other reason than to understand why a lot of us prefer to avoid them if given the option to do so.
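For anyone who hasn’t touched this stuff, here’s a minimal sketch of my own (not from Blow’s talk or CS50) of the kind of manual memory handling in question – allocate, use through a pointer, check for failure, and remember to free:

```c
/* A minimal example of manual memory management in C:
   allocate a buffer on the heap, fill it through a pointer, then free it. */
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    size_t count = 10;
    int *numbers = malloc(count * sizeof *numbers);   /* heap allocation */
    if (numbers == NULL) {                            /* malloc can fail */
        fprintf(stderr, "out of memory\n");
        return 1;
    }

    for (size_t i = 0; i < count; i++)
        numbers[i] = (int)(i * i);                    /* write through the pointer */

    printf("numbers[9] = %d\n", numbers[9]);

    free(numbers);                                    /* forget this and you leak */
    numbers = NULL;                                   /* don't leave a dangling pointer around */
    return 0;
}
```

None of it is hard once you’ve seen it a few times – but it’s also exactly the bookkeeping that garbage collectors and higher-level languages quietly do for you, which is why so many of us happily let them.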
Oh, and garbage collection and functional programming aren’t new abstractions. Lisp did both in the late 1950s and it’s been used in a lot of “hardcore” settings, including NASA’s Jet Propulsion Lab. And, unsurprisingly, a Lisp programmer will of course claim that Lisp makes you much more productive than something with malloc() ever will.
Another early high-level language, COBOL, is completely ignored by Blow during his whole talk. It’s fairly fundamental for our current level of civilization, considering it forms the backbone of our banking and financial infrastructure – but I suppose a high-level language doing really critical work (as opposed to games written in C) doesn’t fit Blow’s narrative.
“The productivity of Facebook employees is approaching zero”
Here we’re shown some graphs of the rising number of Facebook and Twitter employees and Blow states that since Facebook doesn’t really get that many new features each year, all those programmers mostly sit around and do nothing.
First of all, I’m pretty sure Facebook employs a wide range of staff who are not feature developers: lawyers, accountants, graphic designers, sysadmins, researchers, HR, middle management, etc. When not working on Facebook the site, employees of Facebook the company are also working on Instagram, WhatsApp and Oculus VR, to name some of their other pursuits. I’d also argue that (measurable) individual output in general drops as a result of a company growing. You need to reach a certain critical mass to have a whole team work hard on a feature only to have it dropped, by some managerial decision, right before it’s finished. Things like that happen all the time in large software companies and won’t show up in Blow’s rather flimsy metric.
More importantly, Blow constructs this argument around the assumption that Facebook’s product is Facebook, in the sense of the social platform. This is of course wrong. Facebook doesn’t have to add or change very many features on their social platform every year; they just have to keep things smooth enough for those who actually use those functions to stay content.
This is because Facebook’s real product is an ad delivery platform.
As such, it collects massive amounts of personal and private data and churns it into targeted ads, all the while sucking people in with cleverly designed attention-grabbing dark patterns. I’m sure plenty of work is done on this by a whole lot of programmers, but it’ll never appear as “features” to the user, only as corporate revenue. In that respect, Facebook programmers seem to be highly productive.
Ken Thompson’s “Three Week Unix”
First off, I’m not out to belittle Ken Thompson’s efforts here. Writing an assembler, editor and basic kernel in three weeks is highly respectable work by any standard. It’s also a great piece of computer lore and fits Blow’s narrative perfectly – especially with Kernighan’s little quip about productivity in the end. Of course, we don’t know how “robust” Thompson’s software was at this stage, or how user friendly, or what kind of features it had (Note that what’s discussed here isn’t First Edition Unix or even PDP-7 Unix, for which there’s source code available: it’s the first version of what was used
18 Comments
dostick
It appears that the author is from that newer generation, and completely misses the point because he just doesn’t know. Ironically, the article is an example of what Blow was talking about.
Something similar happens if I talk about how Figma is destroying the design world on an unprecedented scale by normalising the bad UX, UI and product management found in Figma itself, and get responses from younger designers that everything is great, which leaves me completely baffled.
You have all that knowledge because you grew up in that environment. They didn’t, and it’s not likely they can learn the equivalent of that culture and experience anywhere.
gtsop
It is unfortunate that someone needs to pick apart a flawed thesis in such detail (as the author did with Blow). The pure empiricist is equally as detached from reality as the pure theorist, and as such Blow is making up arguments just because they fit his experience, cherry-picking examples that fit his rants and promoting the exception to be the rule.
rat87
No
https://en.wikipedia.org/wiki/Betteridge%27s_law_of_headline…
Of course not
And if it were, it wouldn't be something C or C++ related but all the banks and unemployment systems still written in COBOL.
knodi
Why stop here… why not say easy access to clean water is destroying your health? You should walk two miles to the watering hole, carry it back, collect wood for a fire and boil the water yourself…
sd9
Blow often makes fantastic points about development, and often completely misses the mark.
He’s accomplished great things and has ideas worth listening to – but also plenty of nonsense that’s presented indistinguishably.
I felt quite strongly that the collapse of civilisation talk was one of those pieces of nonsense, and I’ve largely ignored it (despite listening to it twice). I’m grateful to OP for providing a more principled rebuttal.
Don’t even get me started on Casey Muratori, who tries to do the Blow thing but doesn’t even get the good parts right.
keninorlando
I miss assembler and maximizing code efficiency to juggle 2K of RAM.
recursivedoubts
I teach the systems class at Montana State, where we go from transistors up to a real computing system, and I have students who don't understand what a file system really is when they start my class.
I agree that Blow is wrong on some details, but I really think we need to be thinking hard about a NAND-to-Tetris style education that starts in high school for our technical students.
I use "outdated" models like Little Man Computer and a simple visual MIPS emulator which, even though they aren't realistic, at least give the students a sense of where we came from, with a level of complexity that a normal human can get their head around. When I look at modern 64-bit architecture books that get suggested for me to teach with, I just laugh.
Anyway, not to say I agree 100% with Blow on everything, but connecting technology down to the roots is a hard problem.
low_tech_punk
I think this post is related: We are destroying software (https://antirez.com/news/145)
I don't disagree that software engineering is not as rigorous as it was. But software has also spread into a much wider area, allowing many more people to participate as programmers, all thanks to abstractions.
I'm drawn to the romantic/pure/spartan aspect of low level programming too but I also need to make a living by solving problems in practical and scrappy ways.
ilrwbwrkhv
I think JavaScript on the server and React and these things have really made the web a mess of software development compared to how little stuff it actually does.
I know for a fact a bunch of kids now do not even know that HTML is what gets rendered in the browser. They think that React is itself what browsers render.
Not to mention the absolute idiot of a CEO of Vercel who thinks React is the Linux kernel of development.
ninetyninenine
No, it’s not. Software is killing the US, and not because it abstracts low-level stuff. It’s because it abstracts reality in such a way that the US is becoming a nation of software engineers ignorant even of how to build the stuff it programs. We’ve moved up the stack, and moving up the stack requires people knowledgeable about the bottom.
One can’t exist without the other, so the world isn’t getting destroyed. China is handling the bottom of the stack. And by understanding the bottom of the stack, they also become good with the top.
I used to hear shit like "China only knows how to manufacture stuff; the US is good at design." And I keep thinking, well, if you submit an entire design to China to manufacture, don’t you think they will figure out how to design? Knowing how to design is easy. Manufacturing, or turning that design into reality, is hard.
Software isn’t killing civilization. It’s killing the US, and that may seem like the world if you’ve lived here all your life.
torlok
As somebody who works in embedded, and does kernel programming and low-level networking, I wish any of Blow's fear-mongering were true. It would do wonders for my feeling of job security and self-importance.
harrall
If an older web developer rants about abstraction, they will target React developers.
If a Python dev rants about abstraction, they will target older web developers.
If a C++ application dev rants about abstractions, they will target Python developers.
If a firmware dev rants about abstractions, they will target application developers.
If an electrical engineer rants about abstractions, they will target firmware developers.
Drawing the line for “excessive abstraction” based on what you personally know and calling everything beyond it “killing civilization” is quite a take.
tolciho
> I'm not a historian and will not comment on this first part of the talk. It doesn't matter much,
Okay.
> What is robust? … Is it the multi-year uptimes of a plethora of …
Big uptime systems are dubious. Probably a lack of kernel patches, hardware patches, and who knows if, on reboot, the system will actually boot and start the relevant services correctly. A bank once had their mainframe fail, and they were flailing around for a good long while, possibly because it had been about a decade since their last failure and everyone forgot what to do, or maybe Bob had retired. Or how about that host that boots four years into the future and now various things are quite broken? There was NTP, but an unrelated change had broken that on some firewall. "Normal Accidents" are somehow a thing in complex systems, as are black swan events. Quite possibly either or both were involved in the late bronze age whateveritwas, but naturally history doesn't matter much.
> Oh, and garbage collection and functional programming aren't new abstractions. Lisp did both in the late 1950s
PAIP (Norvig) recounts that the garbage collection was so bad it was turned off and the LISP machines were left to run until they ran out of memory, at which point they were rebooted. I guess this is a point for improved robustness in certain areas, though there are probably still "*/5 * * * * reboot-something" type cron jobs out there for services that leak too much memory. No, management did not grant time to fix that service last I had to put in the most recent five minute reboot script. Many don't get such an intimate view of the rusty bowels of the internet.
> open up a Unix-type command line in Linux/MacOS/*BSD/WSL, type "ed" at the prompt and see how far you get with your text editing
Some wacky systems do not install ed, or link it to nano or some other nonsense, so you may need to actually install ed to get the standard editor. If you happen to be stuck on such a system, `busybox vi` is tolerable—vim gussied up with color spam is not, especially the unreadable blue on the black console—though learning enough about ed might be good if you, hypothetically, ever have to fix an old system over a serial line at 3 AM. There isn't much to learn for such cases: "a" to append new lines, "." on a line by itself to end that, "/foo" to search, "c" to change that whole line (and then the "." thing), and then "wq" to save. Great for edits where you don't want other folks seeing the contents of, say, /etc/hostname.wg0. Or sometimes a cat of the file should be done to preserve it in the scrollback buffer, which has saved a non-zero number of configuration files across the internet. Ideally this sort of disaster training should be practiced by some now and then, but that does take time away from other things.
Back to the unloved history thing. A collapse can take a few centuries, which may be a problem given the recent emphasis on the current sprint or on how the stock will be doing Tuesday (as opposed to the lease for Hong Kong, though a hundred years is a magnificently short period of time). So a few folks crying wolf or playing Cassandra might be a good thing, to help point out past issues, and maybe from that future shocks can be made less bad.
And of course one should beware the C people.
Sparkyte
In a sense abstraction does. You stop learning the fundamentals of how something works. As enough time goes by, no one knows how the fundamentals work and a company fails.
foxes
Billionaires are killing civilisation
travisgriggs
Not all simplifications are abstractions. Not all abstractions are simplifications. But the pursuit of simplification is usually what motivates an abstraction. I don’t think that abstractions kill software, or civilization for that matter, but ill-begotten abstractions in the name of short-win simplifications put a drag on the flexibility, agility and approachability of either.
Take syntactic sugar in just about any language. There’s usually a breaking point, where you can say the localized gain in simplification for this particular nuance is not worth how complex this tool (language) has become. People don’t make more mistakes in a syntax heavy language because of any particular element, but because using the tool well to solve complex problems just gets difficult (regardless of what a compiler might guarantee you).
I look at the complexity that async and coroutines adds to my experience when programming anything “threaded like” in Kotlin compared to how I deal with the same sorts of problems in Elixir/Erlang and it’s just night and day difference. Both have abstractions/simplifications to the age old problem of parallel/async computing, but one (the former) just multiplies simplicities to end up with something complex again and the other has an abstraction/simplification that Just Works(tm) and really is simple.
bionhoward
One reason I like Rust is that it lets me replace a lot of high-level Python stuff with faster and more correct code, while also opening access to a new world of low-level C stuff I never had an opportunity to learn about. It’s been super fun to tinker with and I look forward to continuing to work with it!
Definitely good to broaden our horizons, but also crucial to maintain focus… there is so much to learn. How can we balance the joy and productivity of software with the strategic imperative to build locally grown hardware?