Friday, July 4, 2014 at 7:13 AM

About Facebook users and Facebook

Briefly, why the recent Facebook eruption is unlike all previous ones.

  1. We need a name for this event.

  2. Conducting tests on humans without their consent is unethical for doctors and academics. Nothing vague about this. And there are good reasons for it. Testing can do harm.

  3. You say that A/B testing is done all the time on websites. Well, perhaps people should have to consent to that too. But this was different. They were testing the Facebook newsfeed. That's not like Amazon testing a product page to find out which one gets more business. People know they're being sold products there. They willingly chose to go to Amazon to be sold. The choice is very clear.

  4. The newsfeed is in dangerous territory. People thought they had an idea of what it was. But this news raises questions, things people clearly don't want to have to think about.

  5. Am I reading ads? Yes, I know that. That's clear.

  6. Am I reading ideas and news from friends? Yes, I can see that too.

  7. Am I being tested like a lab rat in a science experiment? Well, that's not obvious at all, until now. And I don't like the idea. I feel important. I am the center of my own universe. I think somehow Facebook doesn't understand that I am a person, just like they are. In fact, now I'm sure of it.

  8. That last one is probably the biggest issue for Facebook. If they want their service to work, long-term, they have to have suspension of disbelief, like a good book or movie. You're inducing a trance. We're being captured by the story. If the real story is scientists in lab coats, with notebooks, measuring us like lab rats, well that might be good for a while, a social network for lab rats, but I'm probably not going to spend ALL my time there, and probably only if I'm sort of kinky and into S&M a little.

  9. Ideally, you want the user not to think about Facebook at all. To get lost in the story of their own timeline. To feel as if it were something created just for them, with knowledge of what makes them special, and with care and love. It's impossible (I think) for an algorithm to love you, but it should feel that way.

  10. So that's the problem. Facebook just broke the fourth wall. All of a sudden we see the man behind the curtain. And it's ugly and arrogant and condescending and all around not a nice feeling.

  11. To make matters worse, the COO says they didn't want to upset us, implying that the mistake was that we found out they're doing it. Well, at least we know she's not being coached by PR people.

  12. I'm not a PR consultant myself, and I don't really like PR. What I'd like instead is for Facebook to sober up a bit and come to grips with what it has accomplished and the responsibility that comes with it. More than anything FB has a bad case of hubris. Sure, you're talented, and special in many ways. But no human being can be so deep and smart and strong as to understand a social network with this many people using it. It was a huge accomplishment to create it, and huge every day it keeps running, but beyond that, the network is being created by the users. And when you get involved with the network in such a plainly disrespectful way, you break the implicit contract with the users, that you will just facilitate, and let them create their own experience here.

  13. Facebook needs something like the code of ethics that news organizations have, if it wants to keep the trust of users. That's the main conclusion. And clean house: people who don't understand the role of facilitator, and feel like gods, should go. That's the downfall of all tech companies, it seems. This feeling that you're a new kind of human. You are not. Every generation thinks it is, only to find out it's not true.

PS: This was originally posted on Facebook.



By Dave Winer, Friday, July 4, 2014 at 7:13 AM. Still diggin!