Screwtape at 2017-06-04T05:44:09Z

Yesterday on Hacker News, I read an article that claimed to be an inside perspective on What Really Happened With Windows Vista. It's pretty interesting, and it raised one idea I hadn't previously encountered. A given software implementation balances functionality against the resources (time, storage, etc.) available, but both user expectations and hardware resources change over time. The article mentions two different responses to such changes:

  • Hardware Supports Software: software is always complex and difficult, so if we can add complexity to the platform (OS, compiler, runtime) in a way that makes applications simpler, more reliable, more maintainable with the same level of functionality, we should. If that means spending extra hardware resources on computational overhead like runtime checks and managed code, that's a trade worth making. The article claims this was Microsoft's mindset at the beginning of Vista's development.
  • Software Supports Hardware: people buy hardware to fulfil some particular function, but while some software is necessary to make it work, too much software just slows everything down. Therefore, software should be restricted to what's necessary to provide functionality, and extra hardware resources should mean everything just runs better. The article claims this was the mindset behind Apple's iOS.

Both viewpoints have merit, but they're clearly incompatible. Now that they've been pointed out, I suspect a lot of the technical discussions I've participated in over the years were at least partly based on "Hardware Supports Software" people looking at a project based on "Software Supports Hardware" ideals and boggling, or vice versa. For example, I reckon the systemd and GNOME teams are squarely in the Hardware Supports Software camp, while the Software Supports Hardware camp presumably hosts the suckless guys, and anybody who owns an Arduino.
