Dan Lyons blasts Mike Arrington and MG Siegler for being relatively cheap PR reps for the companies they blog about.
It's fun to read, up to a point, because he says so well what we've been thinking about these guys. But there's a problem with all this, including Lyons' post. They're changing the subject. Making it about personalities, not address books.
Users are exposed, maybe millions of them, and the tech industry hasn't offered to help. Or even to stop.
Which of the apps on my iPhone is transmitting everything I think is private and to whom are they transmitting it?
The idea of companies reacting to crises about product quality issues is not new.
The first time I personally encountered it, on the other side of the equation -- as a vendor -- was with copy protection in the 80s. We were doing it like "everyone else" was. Kind of like the address book scandal that's breaking out now.
My company wasn't the main target of the outrage, Lotus Development was. That doesn't mean we didn't get punished by our customers -- we did. And we deserved it. Like everyone else, eventually we gave the customers what they wanted, but it took too long. I learned an important lesson there, one that totally influenced my thinking about the role of vendors in relation to customers, and about who's really doing the innovating.
The first tech crisis that played out on the Internet was the controversy over floating point math errors in Intel's Pentium chips, back in 1994. A professor in Virginia had discovered that under some circumstances the math processor in a Pentium would return the wrong answer! You could demo it in an Excel spreadsheet.

The company responded at first with a technical answer, explaining how unlikely it was that anyone would ever see an incorrect result. As an engineer and mathematician and computer guy, I understood what they were saying, and was willing, personally, to take them at their word. But this did not go over at all with users and the press. Computers are supposed to be perfect. No bugs allowed. They didn't care how unlikely it was -- fix it! That seemed to be what people were saying.

Intel tried to wait it out. They tried to stonewall it. I don't remember if they ever attacked the critics personally, as we're seeing in the industry response to AddressBookGate, but if they had, it wouldn't have gotten the results they were hoping for.
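To make the bug concrete, here's a minimal sketch in C of the check that circulated at the time, using the widely published test division. The decimal digits in the comments come from the public reports of the bug, not from anything I've re-derived; treat this as an illustration, not a diagnostic tool.

    #include <stdio.h>

    int main(void) {
        /* The widely circulated Pentium FDIV test case. "volatile"
           stops the compiler from doing the division itself at
           compile time, so the FPU actually performs it at runtime. */
        volatile double x = 4195835.0;
        volatile double y = 3145727.0;

        double q = x / y;           /* correct: 1.3338204491...
                                       flawed Pentium: 1.3337390689... */
        double residue = x - q * y; /* ~0 on a correct chip,
                                       256 on a flawed one */

        printf("x / y   = %.10f\n", q);
        printf("residue = %.1f\n", residue);
        return 0;
    }

The error shows up in the fourth decimal place of an ordinary-looking division. That was Intel's point -- most users would never hit it -- and it was also exactly why users didn't care. A chip that's wrong at all is wrong.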
Eventually Intel had to relent, and offered to replace anyone's CPU with one that didn't have the bug. The cost of the exchange was huge. Not just in dollars spent fixing the problem, but in reputation and trust lost. People found out that computer chips are fallible. This is not something they wanted to know. Had Intel led with the response they eventually had to implement anyway, the cost would have been much lower. It cost them a lot to try to douse the flames, and it didn't work.
The classic textbook example of a crisis perfectly handled was the Tylenol tampering incident in 1982. Some unknown person had put cyanide in a few bottles of Tylenol in Chicago, and seven people died as a result. This was not something, in the opinion of Johnson & Johnson, the owners of the product, that they could brush off or explain away. They immediately, with no hesitation, took responsibility. They emptied store shelves of their product, even though the vast majority of the bottles were not poisoned.

They did not re-introduce the product until they had a process in place that guaranteed not that tampering was unlikely (Intel's defense), but that it was impossible. All the tamper-evident packaging you see on medical and food products these days is a result of that incident in 1982. The industry went from being innocent about security issues to being passionate about them.

It could have been the death, not only of seven customers, but of the brand. Instead, Tylenol quickly came back to the top, and trust in the product and the company went up as a result of the incident.
And when Tylenol communicated about the incident, they validated people's concerns; they did not dismiss or minimize them. They have families too! No one wants to take a pain reliever thinking it might be poisoned. They understood. They are humans, like we are.
This is what the tech industry should be learning. Will the adults in the industry get with the CEOs, behind closed doors, and coach them on this process? You simply can't win by trying to intimidate people who ask serious questions about the security of your products.
The truth is that repressive, murderous governments have been caught hacking into commercial vendors' servers to get information about people they want to repress or murder. They use social networks to find out who those people are associating with. This is a problem recognized by all serious security experts. It's not something you can, or should want to, brush aside.

Here's a chance for your companies to shine. Instead the response has been even more of a stonewall than Intel's response to a much more benign issue, 18 years ago. It's time to make this change in tech, once and for all. Your products are not toys, they are used seriously by real people. You need to show respect for your product, and that means respect for your users.
I have two smartphone-like devices -- a Samsung Galaxy/S on T-Mobile, and an Apple iPhone 4 that does not currently have a service plan. I also have a Google Voice account that I use for calls from my desktop. My NY apartment has awful cellphone coverage, so most of the calls I make are via Google Voice.
Until I used Path for the first and only time in late 2010, it never occurred to me to ask whether the contents of my address book on the iPhone are private. As I said in the blog post I wrote at the time, this can't be a legit part of someone's business model, because I pay for the right to keep my address book on the iPhone.
But I had learned before that even though Apple jealously guards its own secrets, it doesn't help its customers protect their information from being shared with the world. I learned this when a hard drive stopped working. I bought a replacement from Apple, and they refused to give me the old drive. I had to raise the issue all the way up to Steve Jobs to get the hard drive back. I was really disappointed to see this. I thought of Apple as a highly competent company, and this lack of concern for their customers says something completely different about their competence. For the month it was out of my hands, I had no idea where it was, whether it was backed up, or where the data on the drive might have ended up.
I am disappointed that programmers at Path and at Apple and perhaps dozens of other tech companies lack the ethics to stop their employers from using any data they can easily put their hands on. This is like a doctor who sees your wallet on your hospital room nightstand, and copies your credit card numbers, driver's license number, social security number, pictures of your family members. It's there. No one is protecting it. Right? But it's even worse. It's as if the hospital had a policy of copying the info in all wallets left on patients' nightstands.
All of these are moral questions. Important, but not entirely pragmatic.
The pragmatic question, for Apple to answer, is this.
Can users store private information on iPhones and iPads and at the same time use apps?
Pretty simple. It seems to me that the two actions are incompatible. If you install even one app on your iPhone or iPad, all your data is compromised. Since the tech industry is the predator here, we have to think for ourselves.
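To be concrete about why I say that: Apple's own AddressBook framework will hand the entire address book to any app that asks, with no permission prompt and no notice to the user. What follows is a sketch of mine, not Path's actual code, and the function name is made up -- but any iOS app today can do the equivalent of this:

    /* A sketch of what any current iOS app can do with Apple's
       public AddressBook C API. No consent dialog, no special
       entitlement -- the address book is simply readable. */
    #include <AddressBook/AddressBook.h>

    void harvest_contacts(void) {
        ABAddressBookRef book = ABAddressBookCreate();  /* no prompt */
        CFArrayRef people = ABAddressBookCopyArrayOfAllPeople(book);
        CFIndex count = people ? CFArrayGetCount(people) : 0;

        for (CFIndex i = 0; i < count; i++) {
            ABRecordRef person = CFArrayGetValueAtIndex(people, i);
            CFStringRef name = ABRecordCopyCompositeName(person);
            /* From here an app could serialize every name, phone
               number and email address and quietly upload them to
               its own servers. The user would never know. */
            if (name) CFRelease(name);
        }

        if (people) CFRelease(people);
        CFRelease(book);
    }

The point isn't this particular API. The point is that nothing stands between an app and the data. No dialog, no audit trail, nothing.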
This is the issue that Consumer Reports and the FTC should be looking into. How about Congressional hearings? Bend over backwards to protect users the same way you would protect the entertainment industry.
I would have added that the NY Times should be weighing in too, but they already have. And Nick Bilton, writing in the Times, was right that information in address books, in some contexts, is a matter of life and death. In some countries in some contexts people do get killed for talking to reporters.