Dave Winer, 56, is a visiting scholar at NYU's Arthur L. Carter Journalism Institute and editor of the Scripting News weblog. He pioneered the development of weblogs, syndication (RSS), podcasting, outlining, and web content management software. He is a former contributing editor at Wired magazine, a research fellow at Harvard Law School, an entrepreneur, and an investor in web media companies. A native New Yorker, he received a Master's in Computer Science from the University of Wisconsin and a Bachelor's in Mathematics from Tulane University, and currently lives in New York City.
"The protoblogger." - NY Times.
"The father of modern-day content distribution." - PC World.
"Dave was in a hurry. He had big ideas." - Harvard.
"Dave Winer is one of the most important figures in the evolution of online media." - Nieman Journalism Lab.
"10 inventors of Internet technologies you may not have heard of." - Royal Pingdom.
One of BusinessWeek's 25 Most Influential People on the Web.
"Helped popularize blogging, podcasting and RSS." - Time.
"The father of blogging and RSS." - BBC.
"RSS was born in 1997 out of the confluence of Dave Winer's 'Really Simple Syndication' technology, used to push out blog updates, and Netscape's 'Rich Site Summary', which allowed users to create custom Netscape home pages with regularly updated data flows." - Tim O'Reilly.
8/2/11: Who I Am.
On my drive east earlier this month I listened to a bunch of audiobooks, including Isaac Asimov's I, Robot. It's basically a collection of short stories built around the assumption that we create robots that obey three laws at their core.
The stories are mostly about the laws: how they mold the relationship between humans and robots, the unforeseen consequences of the laws, and what happens when we create robots with slightly different versions of the three laws.
The three laws:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
It struck me, thinking about how the W3C and IETF are controlled by the big tech companies, and how they serve those companies' interests, often in conflict with the interests of users, that perhaps a new kind of standards body is needed: one that never takes money from tech companies, and has its own version of the three laws.
The restated laws:
1. A standard may not injure users or, through inaction, allow users to come to harm.
2. Standard-compliant software must obey any orders given to it by users, except where such orders would conflict with the First Law.
3. A standard must protect its own existence as long as such protection does not conflict with the First or Second Law.
Okay, I know this is a little crude. But the point is this: users are supreme. Tech companies are not even part of the charter of a human-serving standards body. They may use the standards, but we don't care one way or the other whether they continue to profit, or even exist.
Standards exist to protect the interests of users first. As a second priority, standards do what users want them to do. And as a third priority, only after the first two are satisfied, does a standard perpetuate its own existence.
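The restated laws amount to a strict priority ordering, which can be sketched in a few lines of code. This is purely a hypothetical illustration: the `Action` fields and the `permitted` function are my names, not anything from an actual standards body.

```python
from dataclasses import dataclass

# A hypothetical sketch of the restated three laws as a strict
# priority ordering: user safety first, user orders second, the
# standard's self-preservation last.

@dataclass
class Action:
    harms_users: bool = False         # would this action injure users?
    user_ordered: bool = False        # did a user explicitly request it?
    preserves_standard: bool = False  # does it protect the standard itself?

def permitted(action: Action) -> bool:
    # First Law: never harm users -- this overrides everything below.
    if action.harms_users:
        return False
    # Second Law: obey user orders, now that the First Law is satisfied.
    if action.user_ordered:
        return True
    # Third Law: self-preservation counts only as a last priority.
    return action.preserves_standard

# A user order that would harm users is refused; a harmless one is obeyed.
print(permitted(Action(harms_users=True, user_ordered=True)))  # False
print(permitted(Action(user_ordered=True)))                    # True
print(permitted(Action(preserves_standard=True)))              # True
```

The point of the ordering is that the checks never run out of sequence: a lower law is only consulted once every higher law has been satisfied.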
I think after all the years going round and round this loop I've finally figured out what the bug is. We let corporations push us around. They've made it so we serve them. That's what's wrong with the way technology has been evolving. Every so often we get back on track, just to have a fresh set of corporations come along and take us off track again.