By Dave Winer on Sunday, November 06, 2011 at 9:31 PM.
I had a problem the other day that involved running a Java app. The problem, apparently, was that the developer, a large Taiwanese company that generally makes good products, had written the code a couple of years ago, and it wouldn't run on any of my Macs.
Frustrated, I had an idea. I have a netbook that I haven't run in over a year, sitting in my closet. I imagined that they had tested the Java software on Windows, and since nothing had updated on the machine in a year, it might just work.
My theory proved correct! It worked. I was able to run the configuration app and get on with the rest of my project.
But then I left the machine running, so the battery could charge, and all hell broke loose. Every app on the machine, every little bit of system code, wanted to update itself. Not only did it waste a ton of my time supervising all this updating, but I could never be sure when it was finished. And after the updating, a lot of stuff didn't run anymore.
Now the typical answer is that this is good for the ecosystem: sweep out all the code that isn't being watched or maintained. And, of course, turn the poor user into a janitor for the tech industry, and assume he or she understands all the questions being asked, and all the implications.
But in what way is breakage good for users? (It's not.)
In contrast, if I leave a car parked in front of my house, and go away for a while, when I come back, the radio still works. So does the heater, and the engine. I've had batteries go bad while cars sat idle. Once I froze an engine block in a Wisconsin winter. But none of this was done to the car, deliberately, by companies in the car industry. Generally when my car breaks it's because I did something to it. (There are exceptions of course.)
The tech industry is update-happy. The rationale that breakage is somehow not only acceptable but good is nonsense.