Software is eating the world. But progress in software technology itself largely stalled around 1996. Here’s a sample of what we already had by then, in rough chronological order: Unix, C, Smalltalk, SQL, C++, HyperCard, Visual Basic, Java, JavaScript.
Since 1996 we’ve gotten:
IntelliJ, Eclipse, ASP, Spring, Rails, Scala, AWS, Clojure, Heroku, V8, Go, Rust, React, Docker, Kubernetes, Wasm.
Those of us who worked in the ’70s through the ’90s surfed endless waves of revolutionary change. It felt like that was the nature of software: to be continually disrupted by new platforms and paradigms. And then it stopped. It’s as if we hit a wall in 1996.

What the hell happened in 1996? I think what happened was the internet boom. Suddenly, for the first time ever, programmers could get rich quick. The smart, ambitious people flooded into Silicon Valley. But you can’t do research at a startup (I have the scars from trying). New technology takes a long time and is very risky. The sound business plan is to lever up with VC money, throw it at elite programmers who can wrangle the crappy current tech, then cash out. There is no room for technology invention in startups.
Today only megacorps like Google/Facebook/Amazon/Microsoft have the money and time horizons to create new technology. But they only seem to be interested in solving their own problems in the least disruptive way possible.
Don’t look to Computer Science for help. First, most of our software technology was built in companies (or corporate labs) outside of academic Computer Science. Second, Computer Science strongly disincentivizes risky long-range research. That’s not how you get tenure.
The risk-aversion and hyper-professionalization of Computer Science is part of a larger worrisome trend throughout Science and indeed all of Western Civilization that is the subject of much recent discussion (see The Great Stagnation, Progress Studies, It’s Time to Build). Ironically, a number of highly successful software entrepreneurs are involved in this movement, and are quite proud of the progress wrought from commercialization of the internet, yet seem oblivious to the stagnation and rot within software itself.
But maybe I’m imagining things. Maybe the reason progress stopped in 1996 is that we invented everything. Maybe there are no more radical breakthroughs possible, and all that’s left is to tinker around the edges. This is as good as it gets: a 50-year-old OS, 30-year-old text editors, and 25-year-old languages. Bullshit. No technology has ever been permanent. We’ve just lost the will to improve.
[The discussion continues at Open source is stifling progress.]
[Jan 5: Deleted then reinstated by popular demand. The argument could certainly be improved. I should at least note that there has been substantial technical progress in scaling web performance to meet the needs of the MegaTechCorps and Unicorns. The problem is that much else has languished, or actually worsened. For example small-scale business applications as were built with Visual Basic, and small-scale creative work as with Flash, and small-scale personal applications as with HyperCard. This trend is alarmingly similar to the increasing inequality throughout our society.]