Evolving systems and pushing the reset button
I've been trying to put together a bigger picture of how software and technology evolve as some parts graduate from being difficult to being relatively simple.
Take Ethernet networking and TCP/IP. Once upon a time, getting on a network required expensive additions to your computer hardware, a fair amount of programming and configuration skill, and continuous trial and error. Now you can just walk into a store and buy any number of consumer products which will plug into a wired network or associate with a wireless one using little more than a couple of GUI prompts.
Look at just TCP/IP. Assume you already had a working stack. At some point, it was a matter of running 'ifconfig' and 'route' by hand. Then it probably got scripted - the same commands were happening, but now they were automated. After that, those scripts morphed again to pick up parameters from an external source. The scripts themselves would no longer be edited, since they'd find their settings in a configuration file.
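To make that concrete, here's a rough sketch of both stages. The interface name, the addresses, and the /etc/netconfig path are placeholders I made up for illustration, not something any particular system shipped. Stage one, typed at the prompt every single time:

    ifconfig eth0 10.0.0.2 netmask 255.255.255.0 up
    route add default gw 10.0.0.1

Once it got scripted and parameterized, it might have looked more like this:

    #!/bin/sh
    # Same commands as before, but the settings now live in a
    # separate file (path and variable names are made up here).
    . /etc/netconfig
    ifconfig "$IFACE" "$ADDR" netmask "$MASK" up
    route add default gw "$GATEWAY"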
The configuration file itself probably required manual editing at some point. Then a later version hid it behind a series of text-based prompts provided by the OS packager, and your answers were written to that same file. That interface probably changed to be a text-mode pseudo-GUI, and then maybe even a "real" graphical mode interface with pointing, clicking, and all of that.
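The file itself might have looked something like the following. Again, the names and values are invented for the sake of the example:

    # /etc/netconfig -- values read by the startup script
    IFACE=eth0
    ADDR=10.0.0.2
    MASK=255.255.255.0
    GATEWAY=10.0.0.1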
Obviously, DHCP helped quite a bit here, but even if it had never happened, things still got easier for regular users. They no longer needed to summon a network guru just to get online and be productive.
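With DHCP in the mix, most of that hypothetical file could disappear entirely. On a lot of systems it eventually came down to something as small as this running at boot:

    dhclient eth0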
I wonder what would have happened if TCP/IP hadn't been relatively stable since it was unleashed upon the world, and instead had kept changing in non-trivial ways. Would we have ever made it past the level of slapping patches on our kernels and running config commands by hand every time we wanted to use the network?
This seems like a callback to my original disconnect between config files and source code, back when I didn't yet know how compilers worked. When a once-complicated system gets to the point where it can be expressed as a finite series of configuration directives instead of requiring the flexibility of an actual program, you can guess that "normal" users are not far behind.
Of course, if someone keeps pulling the rug out from under the tech to reset it every couple of years, will it ever get to that point?
Come to think of it, there are probably some profitable (if nefarious) reasons for doing that sort of thing every so often. It's the technical equivalent of creating a sick system: the users have to depend on you, the programmer who actually understands this stuff.
Why torture them? What have they ever done to you?