Dave Winer, 56, is a visiting scholar at NYU's Arthur L. Carter Journalism Institute and editor of the Scripting News weblog. He pioneered the development of weblogs, syndication (RSS), podcasting, outlining, and web content management software; former contributing editor at Wired Magazine, research fellow at Harvard Law School, entrepreneur, and investor in web media companies. A native New Yorker, he received a Master's in Computer Science from the University of Wisconsin, a Bachelor's in Mathematics from Tulane University and currently lives in New York City.
"The protoblogger." - NY Times.
"The father of modern-day content distribution." - PC World.
"Dave was in a hurry. He had big ideas." -- Harvard.
"Dave Winer is one of the most important figures in the evolution of online media." -- Nieman Journalism Lab.
10 inventors of Internet technologies you may not have heard of. -- Royal Pingdom.
One of BusinessWeek's 25 Most Influential People on the Web.
"Helped popularize blogging, podcasting and RSS." - Time.
"The father of blogging and RSS." - BBC.
"RSS was born in 1997 out of the confluence of Dave Winer's 'Really Simple Syndication' technology, used to push out blog updates, and Netscape's 'Rich Site Summary', which allowed users to create custom Netscape home pages with regularly updated data flows." - Tim O'Reilly.
8/2/11: Who I Am.
My 40 most-recent links, ranked by number of clicks.
FYI: You're soaking in it. :-)
First, previous Bloggers of the Year.
In 2001, I chose a BOTY by nominating several people and letting the readers vote. That year, the choice was Joel Spolsky, who went on to do many great things, such as the Joel on Software books and the Stack Overflow website.
Then in 2007, I named NakedJen as the BOTY. To Jen, being a blogger means being vulnerable, exposing who you are, and standing by yourself. A blogger is a sole practitioner, who sticks his or her neck out, but does it with conviction and belief. That's what choosing NJ said about blogging, to me.
In 2008, my choice was Jay Rosen. Part of being a blogger is being a teacher and a thinker. Jay's mind is so flexible, he can put himself into impossible situations, the kinds of situations we find ourselves in, and figure out quickly where the chips must fall.
In 2009, with so much else going on, a move to NY, the passing of my father, it must have slipped my mind that I had a light to shine. But in 2010, the tradition resumes.
This year it could have been Doc Searls for his pioneering work in the economics of the Internet. Or Paul Krugman, for blogging so well inside the beast. There are dozens of bloggers I admire and would happily sing the praises of.
But not this year -- because this year -- we're on the verge of big change. It's as if the Internet has turned into a giant Tiananmen Square, we're having a Summer of Love, but on the side the tanks are assembling. We know the story isn't over. Not by a lot.
As I explained in this morning's podcast, the world of news might have been split in two this year. On one side are the people and organizations who want to use the information in the WikiLeaks cables. And on the other side, those who don't. So far, the US government is on the Don't side. CNN, for some mysterious reason, is there too. Some of the things Bill Keller at the NY Times has said indicate that he is on that side, while his organization is solidly on the other. The Guardian is kicking butt on the Do side. As is Spiegel, Le Monde, El Pais, and dozens of other news organizations digging in and reporting WikiLeaks-related stories every day of the week.
Fox News is starting to love WikiLeaks. And Iran is blocking it. Remind me, why is the US against it?
There really isn't much gray there, not much ambivalence. Either you're for em or you're agin em.
Funny thing, as time goes by, I bet the number of people on the Don't side will shrink, until there comes a time when we won't remember when the public didn't know what the government was up to. Governments will pre-empt the leakers by leaking on themselves. People will wonder why WikiLeaks was seen as so threatening. Parents will explain to children that change is always scary to adults.
But in the meantime, we still have a lot of processing to do.
We have a fantastic communication system in the Internet -- will we use it?
Julian Assange is a very powerful and famous man, and he put himself in that position by doing something incredibly brave. It's hard to imagine someone risking so much, for a cause, but there he is. But as strange as it seems, he's from our world, and his values and ours are the same. There's not very much light between what I believe and what I understand that he believes.
NakedJen is an evangelist for radical transparency. Jay Rosen says the news process is turning upside-down. And Julian Assange put both ideas together. He says let's know all there is to know. Let's tell the people who take us to war and destroy countries and kill hundreds of thousands, for profit -- no more secrets. We're not just going to suspect you're doing it, we're going to know. And maybe, if they know we'll know, they won't do it.
So while the people on the Don't side try to discredit the man, and what he's done, the story is still getting out. There are new revelations every day. As Arianna says, all it takes is one story to electrify everything. I think in our guts we know, if the process is allowed to go forward, we can never go back.
WikiLeaks is America's Tiananmen. Julian Assange is the tank guy. We all hold our breath to see if we go all the way.
Today, I did a 1/2 hour podcast in the coffeenotes thread.
It's about WikiLeaks, Wired, Salon and the freedom of the Internet.
New actors, same story!
Hope you like...
In the heat of summer I stood in line, a long line, outside the Apple Store on 14th St to wait for my iPhone 4, which I had reserved a few weeks before.
Was it worth the upgrade? Yeah, probably. But my mobile device act is still far from what I want it to be. I wonder if the next iteration of the iPhone will give me a chance to get closer to my version of nirvana.
Here's what I carry with me now:
1. An iPhone 4, which I primarily use for photos. I see it as a communicating camera. It's much less bulky than the new honker Canon EOS 5D that I almost always carry with me now (in my knapsack, on my back, along with the 11-inch MacBook Air), and it takes perfectly good pictures, and of course it can communicate. The Canon can't.
To store the pictures I use Flickr. But given the instability of Yahoo, and my new (developing) mantra No Corporate Blogging Silos -- I feel a sense of urgency to convert to a photo platform that I trust. I love Flickr, and would be happy to stay there, but they'd have to spin it out for me to feel good about it. Or show some serious love (like bringing back Stewart or Caterina and giving them autonomy to love the product and its users). That might actually be a good mission for Yahoo, with its public stock and huge flow -- to be an investment banker for proven, seasoned entrepreneurs. A federation of Internet talent. The United Artists of the Internet. Some serious potential there. But I have wandered off-topic.
2. A Droid, which I use for text messaging, GChat, to check email, and for maps. It's also my telephone, but like many other people, I don't use the phone very much these days. I have a theory that AT&T's lousy connectivity and the popularity of the iPhone have killed the phone for modern people such as myself. Can't prove it of course.
3. A MiFi device that I use to connect my MacBook Air to the Internet when I'm working in a restaurant or a coffee place with inadequate wifi.
The carriers are: AT&T, Verizon and Verizon.
At home I have FIOS, and Verizon's cable TV package.
So I'm pretty close to being an all-Verizon dude.
Now here are the upgrades I'd love to get.
1. I understand that Verizon now has a super-fast wireless way to access the Internet, what they call 4G, and it's available in NYC, where I live. Of course I'd love to get that.
2. The Droid is looking realllly old. I hate the form-factor. I have no use for the keyboard. The screen is embarrassingly bad compared to the iPhone. I gave away my Nexus One because it didn't work with Verizon, and stuck with the Droid. But now there are all these sexy Android phones coming out and I want one. Maybe even a small Android tablet.
3. I am annoyed by my monthly AT&T bill. I would like to, if possible, get rid of it.
4. I am also annoyed by the MiFi device. I think my Verizon phone should be doing that job.
Of course it's too much to hope that Verizon, which is now (apparently) about to become an iPhone seller, has planned for people like me, and has a nice package to offer that allows me to get rid of AT&T and the MiFi device, get a nice new Android, and convert my iPhone to Verizon (obviously swapping hardware). If Verizon were really aggressive they'd be ready to mass-convert AT&T users. I hope.
When I see a situation like this, I wish I had a can of bright orange spray paint to mark this guy's bike. Nothing that would hurt him physically. But this is crazy. Old frail people use the sidewalk. Parents with children. People on cell phones. They can't compete with people on bikes, and they should be able to use the city too. We should all be able to use the city, and bikes don't belong on sidewalks. It's not fair to pedestrians. Or safe.
Yeah I sometimes ride on the sidewalk (I'm not a saint), but when there are people around, I dismount, and walk. We have bike lanes and streets, and cars are fuckers, but that's no excuse for bikers to be fuckers like this guy is.
To the credit of the people in front of him, they didn't make way for him, despite his pleading, and he had to pedal at a walker's pace to the corner.
One more thing, it's three days after the storm. Why isn't the sidewalk clear?
I'm probably a fool for stepping into the middle of this, but here goes.
I skimmed Glenn Greenwald's scolding of Wired -- and it was a scolding -- trying to pick out the core issue, which seems to be this: Wired has the complete transcript of Private Manning's confession to Adrian Lamo and Greenwald wants it, and thinks other members of the press should have it, and Wired isn't providing it. He has some theories about why Wired is withholding it, but I didn't read that stuff carefully to get the gist of it.
Then, last night, I read the two-part response from Wired from editor Evan Hansen and reporter Kevin Poulsen. The gist of their response is that there is stuff in the transcript that has no bearing on the story, that would be embarrassing and/or damaging to someone, presumably Manning, and that to release it would be irresponsible. They also make some pretty nasty statements about Greenwald, that I find really disturbing. I've met Hansen, and respect the work they do at Wired.
I also admire Glenn Greenwald. I try to read everything he writes, esp on WikiLeaks. Then I started to read his response to Hansen and Poulsen, and got to the part where he says: "I'm going to address each and every one of their accusations in order" and hit the Back button. No way am I going there. (Note: I eventually did read both his pieces today.)
There's a lot of bad blood here, obviously, but please, just drop it for now and focus on the core stuff.
Wired, could we get a third-party opinion to confirm your belief that the transcripts shouldn't be released in full? Perhaps a couple of j-school profs could review the material, and decide independently which parts would help other journalists covering this story? I'm not volunteering myself, to be clear.
Hoping we can put the feelings aside for a moment, and make a good collective decision here?
Of all the great literature our species has created, the story we repeat most seems to be that of the naked emperor, whom everyone compliments on how nice his clothes are.
There's an amazing scene in a Frontline episode, set in the Iraqi parliament while Saddam Hussein ruled Iraq. It was time for a purge. Saddam had decided to execute the whole legislature. I guess they didn't know this when the doors were locked and they started taking out people one by one and shooting them. Saddam, sadist that he was, videotaped it. So you could see what these guys were doing as they figured out, one by one, what was happening. So what did the condemned legislators do while waiting to be shot? They gave speeches, denouncing each other as the real enemy of the state, and proclaiming their love for Saddam, the best friend they ever had. It didn't work; they were all killed.
I'd love to see that video. Such a perfect example of our species at its most basic.
Let's see. Over the last decade we've seen the US start a war based on lies, a crazy war that probably bankrupted the country. A war without hardship, no draft, rationing or higher taxes. To keep the people quiet, we had tax cuts. Unless you watched the news, and the reporting was lame anyway, you wouldn't have known there was a war. We lived high, financing our lifestyle by inflating a bubble around our last asset that was worth anything, our houses. As if that wasn't bad enough, when the bubble burst, we learned that the banking industry had built a house of cards around our homes and when it came down we found out that if we didn't bail them out, we would be left without a financial system. That being unthinkable, we bailed them out. Ouch!
And the Republicans, who were in power for all of this, blame the Democrats!
We're so crazy, what did we do? According to the Republicans, we decided that government was the problem and got it out of the way of the bankers, until of course they need the taxpayers to bail them out -- then government is the answer and it's time to get out the checkbook and (we) take it up the ass. Really? Did we really do that?
It doesn't matter what any of us say because no one is listening.
We want to hear that everything is all right. So we can keep believing in the things we believe in, and everything will be as it always was, when we were pre-teens and eating at home, riding bikes, with parents keeping an eye on us, making sure we didn't screw up too badly. That's what we want. Or so it seems.
Dennis Lehane, the American novelist, in an interview with Chris Lydon, told a story about radical radio commentator Glenn Beck. Apparently Beck is in his mid-40s. When he talks about the good old days, as Lehane tells it, Beck is talking about the 70s. Nixon, Kent State, Watergate, Vietnam, spiraling stagflation, the hostage crisis, OPEC, it goes on and on. It's conceivable that Beck doesn't know that the world was totally crazy when he was growing up.
And then Lehane says something that is so obviously true. What people want is to be children again.
I think that's the story of the human race!
BTW, on a micro level, everything is great. That's because we're in another bubble. We paved over the problems of the last bubble by creating a new one. Not sure how many more times we'll be able to do that.
Suppressing dissent doesn't help your cause, because it doesn't change minds. The belief is still there, just suppressed.
If you stay out of the way of self-expression, everyone wins. You get to hear something that's dissonant to you, the other person gets to express him or herself. Having their point of view heard and accepted (if not agreed with), removes one obstacle to change. Maybe not all of them. And it's possible, just possible, having really listened to the politically incorrect thought, you might find your position shifts.
Aside from that, believe it or not, it's not all about you. Other people have a right to speak even if you don't like what they're saying.
Update: There's a difference between "dissent" and laying stinky turds in the middle of the room.
A couple of days ago Roland Boon, in a comment here, asked why not believe Amazon's explanation for why they cut off WikiLeaks. I explained that whether I believe or not isn't the question. It's whether I trust them that matters. And will I hold back on what I say about them for fear of being cut off?
That said, I think it's fairly obvious why Amazon cut them off. It's the 800 pound gorilla in the room.
Let me explain...
Today I got a promotional email from Kay Kinton, Senior Public Relations Manager for Amazon Web Services, entitled "Amazon Web Services Year in Review." It contained a paragraph, quoted below, that explains how their government business grew in 2010.
"Government adoption of AWS grew significantly in 2010. The Recovery Accountability and Transparency Board became the first government-wide agency to migrate to a cloud-based environment when it moved Recovery.gov to AWS in March 2010. Today we have nearly 20 government agencies leveraging AWS, and the U.S. federal government continues to be one of our fastest growing customer segments. The U.S. General Services Administration awarded AWS the ability to provide government agencies with cloud services through the government's cloud storefront, Apps.gov. Additional AWS customers include Treasury.gov, the Federal Register 2.0 at the National Archives, the openEI.org project at DoE's National Renewable Energy Lab, the Supplemental Nutrition Assistance Program at USDA, and the Jet Propulsion Laboratory at NASA. The current AWS compliance framework covers FISMA, PCI DSS Level 1, ISO 27001, SAS70 type II, and HIPAA, and we continue to seek certifications and accreditations that make it easier for government agencies to benefit from AWS. To learn more about how AWS works with the federal government, visit: http://aws.amazon.com/federal/."
It makes perfect sense that the US government is a big customer of Amazon's web services. It also makes perfect sense that Amazon wouldn't want to do anything to jeopardize that business. There might not have even been a phone call, it might not have been necessary.
Update: Of course most tech companies do business with the US government, and if they don't they probably want to. For example, a couple of weeks ago, a story came out about the Army equipping every soldier with an iPhone or Android phone. Not saying there's a connection, but a week later Apple banned the WikiLeaks app from their store.
As far as I know the issues around Amazon's decision to evict WikiLeaks from EC2 have not been discussed in the tech blogosphere. If I've missed the discussion, please post pointers in a comment on this post. I want to read what has been said.
In a previous post here on Scripting News, Matt Terenzio, who works as a system manager at a small Connecticut newspaper, said that basically Amazon can't be used to host independent news. I quoted Matt in my talk at the PDF conference a couple of Saturdays ago. What he says is not only true, but very important to journalists and bloggers.
Initially, I said that I wouldn't take my sites off Amazon because of their decision to not host WikiLeaks. I'm re-thinking that, but I want the benefit of a really good examination. Perhaps Amazon would like a chance to clarify their intentions, now that the dust has settled. What would they like their customers to think about this, as it relates to their work?
Where would I move my sites? Do other vendors have a more clear statement of what they will and won't do under pressure from the US government?
We need to look at this as dispassionately as possible.
The question is this: What service-level guarantees do we need from vendors to make it possible to use their services in our public writing?
Can we use S3 and EC2 to host free speech? Not a question I ask lightly, since this page, as of 12/24/10, is hosted on EC2.
Update: This piece is also running on the Atlantic.
Every year we have what's called the NakedJen Film Festival, and Murphy-willing, this year will be no exception.
This is how it works. On Christmas Day you go to a lot of movies.
It doesn't matter where you are. It's called the NJFF because NakedJen invented and perfected it, and spread the good word like Johnny Appleseed with the apples. If you ever get a chance to do the NJFF with NakedJen, I say go for it. She's a wonderful person to go to the movies with. And she was my choice for Blogger of the Year in 2007. Hard to go wrong with that combo!
Anyway, given that it's December 23, it's now time to start considering which movies we will see at this year's FF in NYC.
In no special order...
2. True Grit. A Coen Brothers movie with Jeff Bridges. Say no more. Must-see. Every year there's one of these. Sometimes they are disappointing. Last year it was Sherlock Holmes. Totally boring. A few years back, Sweeney Todd. I'm going to say True Grit is the Big Hope for the blockbuster of NJFF 2010.
3. Tron. It's always good to have a schlock scifi movie as an option. I don't seriously think this will make the cut, but you never know.
4. King's Speech. This has the makings of a truly boring movie. I used to joke, when imitating British speech (I'm good with accents), that I'm speaking with marbles in my mouth -- and in this movie (I hear) they actually do speak with marbles in their mouths.
5. Somewhere. Okay Sofia Coppola is an automatic. Lost In Translation was a masterpiece. If Somewhere is 1/150th as good it's a must-see. Only problem is it's playing in just one theater in Manhattan, and it's waaaaaay uptown.
6. Rabbit Hole. Nicole Kidman. I could watch her in a shitty movie, and have many times. This one doesn't sound like much fun, but I'm in. xoxox
Those are the movies I'm thinking about for this year!
I want a node.js module that does that. A web app that takes two params, the URL of a JSON file and the name of a callback. Example:
What it would return is this:
myLocalSubroutine (["Oregon", "Pizza", "Wheat"])
Then Step 2 would simplify it to:
I wanted to provide this functionality in Frontier, but my guys are polling this thing every 5 seconds (which is BS, they shouldn't do that, but WTF) and that would cost me $90 per month and it wouldn't run very well as soon as their stuff got a few hundred users, which is what we hope happens. A lose-lose-lose.
What we need is a bare-bones machine language thing that does this. Node.js would be ideal.
Caveat: Of course, I'm sure something like this already exists!
Tis the season and ho ho ho!
The JSONification of the River was a big hit, and now the river renderings are starting to get real beautiful. Here's the latest one.
Ain't she pretty!
That's cool, I thought -- so what's next? Well I'm getting ready to do a really simple blogging tool, like the one we did in 2002, but even simpler. And I'm going to need to display the blog posts, so why not do it in a really modern way.
1. JSONify the RSS from the blog.
Yes! That's certainly worth a try.
So this is what I did.
1. I did a JSONified version of the feed for Scripting News. You'll see that it bears a strong resemblance to the XML version, from which it is derived.
It's worth a note to say that now RSS has escaped the confines of XML. It's become a more general language for describing stuff that updates.
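If you want to see how direct the correspondence is, here's a rough sketch of how an RSS 2.0 channel might map to JSON. The field names mirror the RSS 2.0 elements; the exact shape of my feed may differ, this is just to show the idea.

```javascript
// Map one RSS item to a plain JSON object. Field names follow RSS 2.0.
function rssItemToJson(item) {
  return {
    title: item.title,
    link: item.link,
    description: item.description,
    pubDate: item.pubDate,
    guid: item.guid
  };
}

// Map a whole channel, preserving the nesting of the XML version.
function rssChannelToJson(channel) {
  return {
    rss: {
      version: "2.0",
      channel: {
        title: channel.title,
        link: channel.link,
        description: channel.description,
        items: channel.items.map(rssItemToJson)
      }
    }
  };
}
```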
I walked across the Brooklyn and Manhattan bridges today with the new camera and new eyes.
This is also a test of a new Scripting News feature for JSON hackers.
I expect to write it up tomorrow.
Great piece of interviewing by Ryan Tate.
Glad to see him speaking so frankly, publicly.
"As for social, I expect that Google will find greater success with their self-driving car and moon landing initiatives."
On Twitter I see random messages making broad statements about men re the rape charges against Julian Assange. I think we're going into dangerous territory, and there's a good chance we're being manipulated, and before it goes too far, I want to try to moderate it, and talk about what we know and what we don't know.
1. As far as I know there aren't any charges against Julian Assange, in Sweden or elsewhere.
2. What I've read in the Guardian about the charges sounds to me like he might not be a very nice person. But where I come from, that is not a crime -- nor is it in Sweden, which seems like a fair country.
3. Rape is awful. But I think it's almost as awful to falsely accuse someone of rape, because that's going to radically change an innocent person's life, for the worse. And it's so easy to do, it's one person's word against another's.
4. Some people use these issues as a way of saying sexist things about men. General statements that indict a whole gender, or a big chunk of one (like "men in tech," for example). I'm sure they don't think every man is guilty, and if you stop them and say that, they claim that isn't what they're saying. But if you simply flip the genders, and make broad sweeping negative statements about women, I guess they wouldn't have any trouble seeing the sexism.
5. I agree with and support feminism. The line I won't cross is condemning men in the cause of feminism. Nor would I support it the other way. If you want to speak up for your gender, or my gender -- great! But not if you're going to go negative on the other guys.
6. One rule I try to remember: We create each other. My mother was a woman. So was yours. Your father was a man. Mine too. We learned their values, like it or not. I am who I am not solely because I am a man, but also because I am my mother's son, and my grandmothers' grandson. And the student of all the women teachers I had (most of them were women). And I am also the product of every relationship I've had. It's a big mixed bag of genders that form whole people. If you want to find the cause of something, it's never as simple as one gender doing it to the other.
7. Let's be smart. What do you think the chances are that the charges against Assange, coming in the middle of a political shitstorm, from a country whose Prime Minister is closely tied to the US government, are exactly what they seem to be? Suppose it were men making charges against a woman, in similar circumstances? Would your bullshit detector be a little more alarmed? It shouldn't be, if you're not being sexist.
8. Let's seek balance. I'm not going to stand up and say the charges are bullshit. But I'm not going to say they're not. In the US, the country I come from, the standard is "innocent until proven guilty." It doesn't say "except if you're a man charged with rape." I like that standard a lot. Why don't we stay there until there's a verdict?
9. One more thing. This all should be kept separate from the work that WikiLeaks is doing. Please.
I closed comments for this thread. I know I'm grabbing the third rail. If you have something to say about this, put it on your blog or tweet it or tell a friend. I look for charged issues like this one to explore, because these are the places where the greatest growth is available.
This is a huge feature for people who like to walk in NYC, as I do.
Now when you click on a blue M, which is the symbol for a subway station, it highlights the route of one of the lines going through that station.
For example, I clicked on the High Street/Brooklyn Bridge station, and it showed the route of the A and C lines in blue.
Click on Grand Central and it shows the routes for the Lexington Ave lines, the 4, 5 and 6. And the Flushing line (the 7).
This is a very important connection. Not sure when it came online, but all NYers and esp people who visit, will want to know.
I've long believed that the Internet will turn commerce upside-down -- that what happened to journalism, software, music and video is also happening to research and development.
The benefits of centralization fade out, smaller markets work better with fast world-wide shipping networks. And development teams can be widely distributed. But most important -- it's easier to hear what users want. And the barriers to entry for users becoming developers drop, as technical information flows better.
Users creating products for users.
In the centralized way of doing R&D there was distance and secrecy. A team of geniuses is sequestered for however long it takes to make their magic. Then they come down from the mountain to deliver the product, and we all buy one and talk about it and love it, while they return to the mountain and we wait for the next one. This isn't just how Apple does it, it's how everyone does it. They've just mastered the process better than anyone else.
The other extreme is users creating products for other users.
I felt blogging would play a key role in bootstrapping this method. That communities of users would coalesce at blogs, exchange ideas over long periods of time, and gradually a consensus would develop of what the next version of a product will look like. That much certainly did happen.
Then I expected companies to form out of these blogs, to create the products the users wanted, that we knew they wanted (because they said so, and we were listening).
We needed to learn how to listen better, and we had to learn how to be easier to listen to.
Note that this is different from Craig's List and eBay -- which are users selling to other users. Definitely a step on the path, but not all the way there.
Users making products for users. No middlemen. No ivory tower. No sequestering. No waiting. An open visible development process. No secrecy.
Well, anyway, now it's happening. And it's a race to see who has the model that works. Maybe they all will work, or maybe a hybrid, combining the features of one or more, will be the way it works. But in the next few years, this is where the big innovation in commerce will be. And it's going to be bigger than Google or Apple or Facebook. In many ways those companies are just providing role models for the next generation of entrepreneurs. Perhaps in some ways as negative role models, showing how not to do it.
Anyway, here are four examples of users creating products for users.
I'll let the sites speak to you for themselves.
When my father died last year, he left behind several cameras. I got one of them, and am now starting to use it. It's a Canon EOS 5D. I don't know much about cameras, but Doc Searls says it's a good one. It's the one he uses, and he takes wonderful pictures.
So I bought an inexpensive 50mm lens, and yesterday I started taking pictures. And this morning when I went out to breakfast, of course I took the camera. Here are three of the pictures. If you click on the thumb you will get a medium resolution version of the picture.
To give you an idea of how much detail is there, here's a full-resolution section of one of the pictures.
The first thing I noticed is that you get new eyes when you get such an upgrade. Now when I walk around I see things I didn't see before, because I'm looking for pictures I could never take before.
I think this is going to be as big an eye-opener as getting the bike was back in August.
Bijan Sabet writes that he plans to dig into the source material on WikiLeaks. I imagine a lot of people are in the same place. Hearing conflicting news about this and that, and not hearing enough substance to form an opinion. The story isn't actually all that hard to follow. But you have to go just a little below the surface to see what's going on. Following CNN and reading the home pages of the news sites is where the confusion is published. To get the clarity you have to pick your sources more carefully.
I was in the same place last weekend, and decided to do something about it. I created an aggregator that streams WikiLeaks stories from the four major news organizations that have all the cables, and other news organizations who are downstream from them. They are the ones who decide what to write and when to write it and they release the cables if they need to. There's been a lot of misinformation spread by the press about WikiLeaks.
However a few news outlets have been doing outstanding work to clear the fog. By far the most productive and useful reports are coming from the Guardian. Great stuff.
WikiLeaks itself has a feed where they point to other news orgs that have been working on it. And I supplemented all this with a feed I'm managing myself from scattered stories I pick up from other sources, including people on Twitter.
Key point: You don't have to dive into the source material -- the reporters are doing that for us.
This is where all those streams come together...
Not looking for a plug, or a link -- I just want to help myself and don't mind sharing what I came up with.
PS: While I was writing this piece, a Spiegel interview with the German Interior Minister was published. He says "WikiLeaks Is Annoying, But Not a Threat."
I saw Reporters Without Borders issue a position on WikiLeaks, urging the US govt to not prosecute.
I've been looking for a way to make a similar statement from the point of view of someone rooted in tech, saying that it's important for the future of the net itself that it get the same freedom that offline media gets.
What I came up with...
You don't have to agree with the exact statement in order to endorse it, in fact it's better if you don't. Copy the text, edit it as you see fit, and post it on your site. That's why it has a Creative Commons license. That's how we can show that we're not standing alone.
Or copy it exactly as you found it, and link to it. Think of it as an open-source letter signing, on the net.
I don't expect tech companies to get on board here (though it would be nice if they did). But I am hopeful that the individuals in tech will stand up for the First Amendment.
It's time to stand up. Find a way to say, no matter where you live, that freedom of speech happens everywhere, including the net.
Net neutrality is a concept that the tech industry rallies around, but it is hypocrisy.
The idea is that the transport layer, operated by telephone companies and cable companies, must transport all bits across their lines at the same rate and cost. Nice idea, but it's hypocritical to demand that of their vendors when they don't provide it to their users. For some reason they are never called on this hypocrisy by the tech press.
At the PDFleaks conference in NYC last Saturday I said that after Amazon booted WikiLeaks from EC2 that signaled very clearly that there is no such thing as net neutrality. Here's a service provider, very analogous to Comcast and Verizon, that decided it wasn't in its economic interest to carry a user's bits. It wasn't just about the level or cost of the service, they cut them off totally. Without adequate explanation of why. Saying they were doing something illegal is no explanation at all. That's not for Amazon to decide, that's for the courts. Due process is required to prove that something illegal is happening. And many legal experts believe that there's nothing illegal about WikiLeaks.
Something like this happened to UserLand in April 2000. We were having serious problems with our ISP; our T1 line was dead for most of a month. When I had connectivity, I was writing about the problems on my blog. I had to; people needed to know why we weren't on the net. We had our own customers, and they weren't being very patient, nor should they have been. I don't think the supplier ever understood that we had customers, even though we said it repeatedly.
Out of the blue, without any explanation, they terminated our service. That cost us hugely, not only in human time, but in startup fees with a new ISP. We were a small, thinly financed company, so it really hurt.
But that's reality. Most of us live with anything-but net neutrality. When the tech industry asks us to stand up for them, they're really pushing the chutzpah.
Read your user-agreements carefully. Most of them give the vendor the right to terminate your service at will without any appeal possible. That's not net neutral. Not even close.
As you may recall, I wrote a JSON compiler and decompiler in October. After running a bunch of tests against other people's JSON, I was satisfied that it worked, and moved on. I mostly focused on the compiler, because that's usually the hard part, trying to understand all the variables that people can throw at you. JSON was, as advertised, pretty easy to work with (though not as easy, imho, as it could have been). No matter.
But there's a problem with the JSON.
Each group of news bits is organized as a bunch of scalar data, like feed name, url, when the feed was last read, etc. Then there are one or more news items. If there's one item I just include a struct named item. If there's more than one I include a list of structs. The list is named item. I understood this is the convention for repeating elements in JSON.
In the screen shot above, there are two "updatedFeed" elements. The first has only one item, the second has more than one.
This causes problems for people in some languages because (apparently) it's hard for them to deal with an object without, in advance, knowing its type. So they say the solution is simple, always make it a list. Simple for them, but...
But this is not so simple on my end. Because I'm using a generic JSON serializer, and it would have no way of knowing that "item" should always be a list. Unless...
One way of dealing with this (that I don't like and won't do) is to make everything a list.
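For code on the consuming side, the usual workaround is a small helper that accepts either shape. A minimal Python sketch -- the "updatedFeed" and "item" names come from the post, everything else here is illustrative:

```python
import json

def as_list(value):
    """Return value wrapped in a list unless it already is one.

    Lets a consumer treat a single "item" struct and a list of
    "item" structs uniformly."""
    return value if isinstance(value, list) else [value]

# One feed has a single item struct, the other a list of them.
doc = json.loads("""
{"updatedFeed": [
    {"feedName": "A", "item": {"title": "only story"}},
    {"feedName": "B", "item": [{"title": "first"}, {"title": "second"}]}
]}
""")

for feed in doc["updatedFeed"]:
    for item in as_list(feed["item"]):
        print(feed["feedName"], item["title"])
```

This doesn't fix the serializer, but it does let readers in any language cope with either form in the meantime.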
I was just wondering what other JSON-producing environments do in situations like this.
PS: I have a workaround I can do fairly quickly that side-steps the problem, so it won't ultimately be a deal-stopper. But I'd like to solve the problem entirely in the serializer if possible, so the solution can work in other cases.
Twitter is useful, imho, for two things:
1. As a way to share links.
2. As a way to speak your mind.
These days I use it almost exclusively for #1. Very little of #2.
People just aren't that interested in what other people think. And it's damned difficult to speak your mind 140 characters at a time. Most of the time you can anticipate in advance what the misunderstandings will be, and self-edit. Then self-censor. Why bother going through all that mishegas?
But as a link-sharing tool, it is really excellent.
How you can get the links I send through Twitter...
1. You can, of course, read them in Twitter.
2. There's also a site that has all my links going back to April 2009.
3. Recently I started flowing the links through a WordPress blog.
4. There is, of course, a realtime RSS feed of my links.
5. And a Top 40 list of recent links, ranked by click-throughs.
Then it just struck me -- sheez -- I'm mostly using Twitter the way people use del.icio.us.
Maybe there's a lesson in there. Perhaps if we figure out how to decentralize del.icio.us, we'll be on the way to decentralizing Twitter? Maybe all del.icio.us needed was to become realtime, and it would have become Twitter?
BTW, I often have the same idea about Flickr. It's a gem, with a huge and influential user base, to this day. With a little love and care it might blossom into something really wonderful.
Yes, I believe in resurrection!
I have a few brief comments then I'll STFU and let the community work on this. I never was a big del.icio.us user. But I am familiar with the problems that the community will face in a general way.
Here are the priorities in preserving what was created by the users.
1. As Stephen Jones says on Twitter, the data. Years worth of links and collections. The most important thing, but not the only important thing.
2. The domain. If the community is to survive, there needs to be something at del.icio.us (and delicious.com). Ideally it would be more or less exactly what's at the Yahoo site now.
3. The API. For apps that were developed to work with del.icio.us, preserving the API is important.
One more thing, however: if the activity of doing what del.icio.us does is to continue and to grow, don't make the same mistake again. Find a way to host your data in a place where you're seen as a customer, where there is a sufficient revenue flow to keep it operating.
The result is wikiriver.org.
It updates every ten minutes with all the latest WikiLeaks news, or more frequently for news from realtime feeds.
We still need more feeds, so if you know of a good one that we're missing, please post a link as a comment to this post.
I am hand-curating a feed to supplement the river until we get all the feeds we need in the mix.
The effort to create more visually pleasing rivers is going really well. Martin Duffy published a howto, and we have a developers mail list going to share know-how. This work is based on the JSON and static JSONP renderings of the various rivers I'm hosting.
You may include the content from wikiriver.org in your site, but please point back to wikiriver.org.
If you're working in a news organization on WikiLeaks stories, I am developing a new editorial tool for managing rivers like wikiriver.org. Get in touch if you're interested in collaborating. We'll be working over the holidays, in NYC. (Often a very productive time of year.)
I have more features in mind, as this moon mission project continues.
Nieman Journalism Lab at Harvard is running a year-end series by people such as Clay Shirky, Steve Brill, Vivian Schiller, Michael Schudson, Markos Moulitsas, Kevin Kelly, Geneva Overholser, Adrian Holovaty, Jakob Nielsen, Evan Smith, Megan McCarthy, David Fanning, Matt Thompson, Bob Garfield, Matt Haughey.
And today they ran one of mine!
It was a good one to get into the journalism discussion.
It's not a question of where to put the paywall, it's that there's no market for paywalls.
I'm going to write some more stuff for them in the new year. It will help the circulation of ideas in this space, which is a much-needed good thing, imho.
When I sent a pointer to a friend who was head-scratching through the JSON River stuff, he said this was exactly what he needed.
I asked Martin to start a Google Group for this stuff, so that all communication doesn't have to go through me. We're at the point now where there's been enough uptake that it would probably help.
And thanks to Martin for helping the JSON guys!
PS: Wikiriver.org also has a JSON version, of course.
I've been emailing with Brewster Kahle, founder of archive.org, about web hosting. In the last go-round I tried to concisely state the problem.
1. If I want to start a blog or post a research paper, or denounce the government, or post a leaked document, I don't really have a choice but to use a hosting provider. Most people don't even do that, they just start a Facebook page. I've always believed these companies would bend under pressure from the government, even in the US. Now that's not a theory, it's happened.
2. I started writing DaveNet in 1994, that's sixteen years worth of writing. A month or two after I die, the only way to get to that stuff will be through archive.org. That's not good enough. I want scripting.com to survive me. I have plenty of money to pay for it. But there's no one to give it to that I trust. No one is set up to do that business.
3. I am responsible for the web presence of two relatives who have passed on. My father and my uncle were both bloggers (in their own way). My father was a photographer. I am holding sites for them that they were working on.
4. And of course that's just me. I have a Comp Sci degree, and 30-plus years programming experience, and I have the time and the money. If I don't have this problem solved -- clearly almost no one does.
5. Jeff and I went to see the Library of Congress this fall. They're totally under-prepared for this, plus they had to shut down the WikiLeaks stuff because they're part of the US govt.
So now we know what we need. Now is the time when people's minds are open about this. We can move at double or triple speed. It pays to focus here now.
I'm looking for news organizations that have sections specifically for covering WikiLeaks-related or WikiLeaks-derived stories, that also have RSS or Atom feeds for those sections.
The feeds are for: http://wikiriver.org/.
I already have feeds for the Guardian, the NY Times, Der Spiegel, El Pais, Wired, Le Monde, Macleans, Fox News, CNN, Time and WL Central.
If you know of others, please post the link in a comment below or send it to me via email.
Update: Most major news orgs appear not to have feeds specifically for WikiLeaks stories. Seems they should; it's a long-term thing.
Another example: Salon has a WikiLeaks topic page, in HTML -- but doesn't appear to have a feed for that topic.
I expected CBS News to have a topic feed, since their site is (presumably) managed by the CNET people and they certainly understand feeds, but I couldn't find a feed there either. CNET itself is very confusing. They have a topic page for WikiLeaks, but it points to a feed for Politics. I included it in my scan, at least to begin with since it seems to have mostly WikiLeaks stuff in it.
The discussion at this weekend's flash conf in NYC on WikiLeaks raised the question of where we can store our web writing and photos so that they are as safe as they possibly can be. Trusting corporations to manage this is obviously not a good idea. If this was theoretical before, it's now pragmatic, after Amazon cut off WikiLeaks.
That suggests that we need a new kind of institution that is part news organization, university, library and foundation -- one that acts as a guarantor of best-possible freedom from corporate and government limitations. We already know some things about this organization, I believe.
These are just back-of-the-envelope scribbles. Consider this a discussion-starter for the next meetup.
1. It must be long-lived, like a university -- probably with an endowment, and a board of trustees, and operations limited to what's described below. It can't operate any other kind of business.
2. It must create a least-common-denominator storage system that is accessible through HTTP. Everything must be done with open formats and protocols, meaning all components of its system are replaceable.
3. It must cost money, so the user is a customer and is treated as one. This also allows the vendor to assume its own independence from the interests of the publisher who uses the system. The same way the operator of a printing press was not responsible for the words he or she printed on the paper.
4. Simplicity of the user experience is primary so it can be accessible to as many as possible, and so that technical people don't provide yet another filter for the free flow of ideas. Factor and re-factor for simplicity.
5. The trust must serve the bits exactly as they were published. No advertising.
That's where I want to pick up the discussion.
When people in the news business try to figure out how to make news pay after the Internet, it seems analogous to the French, after being invaded by Germany in World War II, trying to figure out where to put the new Maginot Line.
The Maginot Line would have been a perfect defense in World War I. It didn't help much in the second war.
Analogously, there was a perfect paywall in the pre-Internet news business: the physical product of a newspaper. There is no equivalent in the new distribution system.
Howard Weaver's latest post put this into focus for me. That, and the recent attention on Groupon, which it seems to me has usurped, again, one of the big roles that local news organizations could have played, obviating the need to find the new paywall.
The first usurpage was of course Craigslist. It wasn't so obvious then that this was the natural domain of the press, because Craigslist made a small fraction of the money the news industry used to make from classifieds. It looked like CL was just undermining the press, not competing with it. But Groupon -- this is the fastest-growing company of all time. The founder says what they do is find ways for people to get out and enjoy their city. And they make a boatload of money doing it.
Here's one way of looking at what both Groupon and local news organizations do -- they put smart hard-working people into the field to keep tabs on what people in the community are doing. Some of what they are doing is robbing and killing each other, that's what news is interested in. Another part of what they're doing is buying from and selling to each other. Groupon is making huge bucks on that.
It seems there's still time for a philosophy change in the news business. Become more focused on the commerce of your communities, and the opportunities to make money will become more apparent. Seems common sense to me.
This of course is fodder for tomorrow's Rebooting the News.
Okay I've decided to confront my Unix phobia head-on.
I used to be a Unix programmer, many moons ago, when I was a grad student at the University of Wisconsin. It was my first interactive experience with a computer. I remember it fondly. And what I have to do with this Unix server is not very ambitious. Here's the deal.
It's a server running on Slicehost, which they have given me to play with for $0. Can't beat that deal.
It's got Ubuntu 10.04 LTS (lucid) freshly installed. I have root access, and have been able to log on via SSH. All is good so far.
Here's what I want to do:
1. Get Apache running. Apparently it's not running now on the server.
2. Install a GUI that I can use over VNC. I basically need to be able to edit text files. In a pinch I could do without this if the FTP access is good.
3. Make sure FTP is working so that I can FTP into the HTTP directory.
4. SMB server set up so I can access the file system from my Mac desktop at home and from my Windows servers.
Maybe there's another version of Linux that would be easier to set up? Here's a screen shot that shows what my other choices were (although some of them may have a surcharge).
I missed an opportunity to explain something today at the PDF Forum.
When people come together and feel a need to communicate, that's when new tools for communicating have a chance to bootstrap. For a simple reason: we have the users' attention. They're interested in doing new things. Normally they're doing something else, living their lives, being busy -- their attention is elsewhere. But when they are confronting a political problem, which is often mostly a communication problem, that's when bootstrappers get their boots on! Cause we can do some good now.
I've had this feeling before. In 1996 Congress had passed and the President signed the Communication Decency Act, which gave the government broad powers to control what the Internet was used for. An outcry came on the Internet from people who loved the Internet enough to be broken-hearted at this turn.
So we started a moon mission. Our goal was to have free hosting for people's stories and pictures, what we called 24 Hours of Democracy. It really worked. It wasn't easy but it was huge fun (most of the time). Here's the key point -- this single exercise probably pushed up the adoption of blogging by a few years. Because we had to try a lot of ideas out in a fixed space in time, what came out at the end had to work.
It's time to create again what we created in 1996. A place for people's thoughts and pictures, but this time, with an eye toward permanence. Create something that's independent and as protected as possible from governments and corporations. As safe a harbor as we can create.
I was just making a small pot of coffee, just two cups. So I measured out two full cups of water, and guessed how much coffee to put in the basket. The coffee came out great, but could just have easily come out awful.
So if Wikipedia is the web's encyclopedia, where is the web's user's manual? It would be infinitely expandable, aspire to cover everything, and take a practical 1-2-3 approach to doing things we humans do on this planet in the times we live in.
Here's how I set up an Apache server...
1. Find apache_2.2.9-win-x86-no_ssl-r2.msi, download install.
2. Create aliases of these files on the desktop:
3. Change the root directory from htdocs in the Apache folder to something much shorter. I like to use C:\www. If you don't shorten this, all your file names are going to be very long and waste a lot of typing and visual space.
4. Uncomment the inclusion of the vhosts.conf file in httpd.conf.
5. Uncomment the inclusion of the proxy module
6. Add default.html to the list of index files
7. Make your .htaccess files work by setting AllowOverride to All
8. Uncomment the include for mod_rewrite. This allows your .htaccess files to do reasonable kinds of redirects, like sending references to www.somedomain.com to somedomain.com.
9. Uncomment the include for proxy_http_module
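Taken together, the edits to httpd.conf look roughly like this. This is a sketch for Apache 2.2 on Windows; the exact paths, module filenames, and the name of the vhosts include file may differ on your install:

```apache
# 3. Shorter document root
DocumentRoot "C:/www"

# 4. Virtual hosts
Include conf/extra/httpd-vhosts.conf

# 5, 9. Proxy modules
LoadModule proxy_module modules/mod_proxy.so
LoadModule proxy_http_module modules/mod_proxy_http.so

# 6. Add default.html to the index list
DirectoryIndex index.html default.html

# 7. Let .htaccess files override settings
<Directory "C:/www">
    AllowOverride All
</Directory>

# 8. mod_rewrite for .htaccess redirects
LoadModule rewrite_module modules/mod_rewrite.so
```

After saving, restart Apache so the changes take effect.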
My server isn't designed to do the kind of task that JSONP requires it to do, esp in service of apps that make lots of calls to the server each minute (for each user). I just spent a few weeks cleaning up all kinds of things like this, so I'm not willing to keep hosting it.
To give you an idea of what it's doing to the server it's running on, here's what its performance monitor graph looks like now. Obviously that's not workable.
So I've modified my app to produce a static file that I can serve without this cost. It calls a routine in your page named onGetRiverStream. You don't get the flexibility to say what routine you want me to call, but you can have that routine call the routine you want.
Here's the new location you should include. It's a script.
The static file it's generated from is still available, and you can have your own JSONP handler work with it if you like. Or if you really want to be community-minded, you could let other developers access it as well.
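The server side of that is simple enough to sketch. Something like this -- the callback name onGetRiverStream comes from the post, but the file names are illustrative -- wraps the static JSON in the fixed callback each time the river is rebuilt:

```python
import json

CALLBACK = "onGetRiverStream"

def wrap_jsonp(json_text, callback=CALLBACK):
    """Wrap already-serialized JSON text in a fixed JSONP callback call."""
    return "%s(%s)" % (callback, json_text)

def rebuild_static_files(river):
    """Write the plain JSON file and its static JSONP twin side by side."""
    json_text = json.dumps(river)
    with open("river.js", "w") as f:          # plain JSON
        f.write(json_text)
    with open("riverStream.js", "w") as f:    # static JSONP
        f.write(wrap_jsonp(json_text))

# A page then includes riverStream.js with a script tag and defines
# its own onGetRiverStream(data) function to receive the river.
```

Because the wrapped file is static, it can be served as cheaply as any other file, no per-request computation.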
Update: Here's what the performance monitor graph looks like after I switched to this new policy.
Among the big components we need for the loosely-coupled 140-character network are user interfaces for browsing rivers.
If you're working on a display, please post a link in a comment and I'll add it to this post.
Right now I'm aware of:
Martin Duffy's river viewer, the first to appear on this page.
Ben Trask's entry in the River of News freestyle competition.
Here's another from crashbang.co.nz.
This is amazing: riverofnews.glowdart.com -- it's my personal river, flowing through the glowdart engine, thanks to JSONP.
Hopefully, more to come.
I guess it works, so here's the JSONP version of my River of News.
Refer to the writeup for the JSON version for an idea of what this is and what you can do with it.
There have been a bunch of requests that I support JSONP, which I am looking into.
I understand that JSONP is needed so you can access data on domains other than the one from which the page originates. I also understand how it works in a simple case where a small amount of JSON is being returned. But I'm returning 70K of JSON text and it includes tabs, carriage returns and linefeeds.
1. So the first question is -- should I strip out the whitespace characters?
2. Are there any characters that need to be escaped or neutered? In XML you'd be fine if you encoded left-angle-brackets.
I'm sure there are other questions.
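For what it's worth, a standard JSON serializer answers both questions: tabs, carriage returns and linefeeds inside string values must be escaped as \t, \r and \n anyway, and whitespace between tokens is optional, so stripping it is safe and shrinks the payload. A quick Python illustration:

```python
import json

value = {"title": "line one\nline two", "col": "a\tb"}

# Default output: control characters inside strings are escaped,
# so the serialized text itself contains no raw tabs or newlines.
text = json.dumps(value)
assert "\n" not in text and "\t" not in text

# Compact output: drop the optional whitespace between tokens.
compact = json.dumps(value, separators=(",", ":"))
print(compact)
```

So the 70K payload can be minified freely, and nothing beyond the serializer's own string escaping should be needed before wrapping it in a callback.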
Update: There is now a JSONP version of the river.
WikiLeaks is the perfect storm for all past issues on the net, but I'm afraid it also will draw us into a future that I've believed was coming and didn't want to talk about. We don't like to think about how much our civilization depends on the proper running of computer networks, and how vulnerable they are. Whoever it is that's attacking MasterCard and PayPal is anonymous. They could be teenagers (that's what we hope) but they could also be professionals working for foreign governments, or even the US government.
I watch my friends root for the attackers and think this is the way wars always begin. The "fighting the good fight" spirit. Let's go over there and show them who we are. Let's make a symbolic statement. By the time the war is underway, we won't remember any of that. We will wonder how we could have been so naive to think that war was something wonderful or glorious. People don't necessarily think of wars being fought on the net and over the net, but new technology comes to war all the time, and one side often doesn't understand.
However, there is another side to it. The United States has been tempting someone to do this to us. The Internet and thumb drives posing as Lady Gaga music made it possible to move around massive amounts of sensitive information. Anyone who is skilled at using the net has been burned the way our government is now getting burned. And we've been pushing around governments and their people, and it's no surprise they resent it. We're afraid to see what our government has been doing in our name. That's why we should see it, and why WikiLeaks must be allowed to proceed, without impediment.
On Saturday, the Personal Democracy Forum is hosting (what I call) a flash conference to discuss the issues swirling around WikiLeaks. It's all moving so fast that it's hard to know exactly what we'll discuss there.
When we meet on Saturday I'm going to say the Internet no longer has to fight for a right to exist. The people want it. But what kind of Internet we get, and what kind of government we get, those two things are now very deeply intertwined, and absolutely not decided. And how our financial system functions, that's going to be what the war is fought over, if we can't avoid having a war -- which we should, if we can.
In the meantime, I highly recommend listening to Glenn Greenwald on today's Brian Lehrer show on WNYC. As with most of the mainstream press people, Lehrer can't wrap his mind around the idea that what WikiLeaks is doing and what the NYT and Guardian are doing are exactly the same. Whatever punishment or banishment you advocate for WikiLeaks, you must also want for the professional news orgs. Greenwald, always tenacious and brilliant, holds the line.
Every day I push a bunch of links through Twitter. I have this process fairly well streamlined. I click on a bookmarklet, it grabs the headline and URL and shoots it to an app on one of my servers. It shortens the URL and then sends it to twitter.com, where it all appears in my status box. I edit it, and click the Tweet button and off it goes.
Another app checks in with Twitter every five minutes to see if I've posted anything new and if it has a link in it. If it does, it is saved in a database. This database has been going since April 2009.
On Monday I added another element to the flow. Now the current day's tweets-with-links are pushed to a post on Protoblogger.com, which is a site running on wordpress.com. The connection is very simple, through the MetaWeblog API. So now there's another way to get my links, without going through Twitter.
The links have been available via RSS for quite some time.
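The WordPress leg of that flow needs nothing beyond XML-RPC. A hedged sketch -- the endpoint URL, blog id and credentials are placeholders; metaWeblog.newPost takes a blogid, username, password, a struct and a publish flag:

```python
import xmlrpc.client

def links_post_body(links):
    """Render the day's (title, url) pairs as a simple HTML list."""
    items = ["<li><a href=\"%s\">%s</a></li>" % (url, title)
             for title, url in links]
    return "<ul>%s</ul>" % "".join(items)

def post_links(links):
    # Placeholder endpoint and credentials -- substitute your own.
    server = xmlrpc.client.ServerProxy(
        "https://example.wordpress.com/xmlrpc.php")
    struct = {"title": "Today's links",
              "description": links_post_body(links)}
    return server.metaWeblog.newPost("1", "username", "password",
                                     struct, True)

# post_links([("Scripting News", "http://scripting.com/")])
```

The app that polls Twitter every five minutes just calls something like post_links (or editPost, to update the current day's post) with whatever new links it found.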
A few links.
1. This is my Drupal test website.
2. My username is Dave Winer. He mailed me my password.
3. The URL of the MetaWeblog API is here.
I think I have everything I need to test it.
Let's see if, when I do a metaWeblog.getPost, I get back the source code.
metaWeblog.getPost failed, but this time I got an error response. (Earlier errors were silent.)
The server, yiwen.pair.com, returned error code -32602: Server error. Invalid method parameters.
The only parameter that seems like it could be causing the problem is the postid, which I sent across as the number 6.
Then I changed it to the string 6 and it worked.
Did it return a value for postSource? It did. Was it what I sent?
Yeah it looks pretty good.
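The fix generalizes: some XML-RPC servers type-check the postid as a string, so coercing it before the call avoids the -32602 fault. A small sketch (endpoint and credentials are placeholders):

```python
import xmlrpc.client

def get_post(server, postid, username, password):
    """Call metaWeblog.getPost with the postid coerced to a string.

    Sending the integer 6 drew a -32602 "Invalid method parameters"
    fault from this server; the string "6" worked."""
    return server.metaWeblog.getPost(str(postid), username, password)

# server = xmlrpc.client.ServerProxy("http://example.org/xmlrpc.php")
# post = get_post(server, 6, "username", "password")
# print(post.get("postSource"))
```

Stringifying unconditionally is harmless for servers that accept either type, so it's a safe habit for clients that talk to many implementations.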
First a few preambles...
1. I'm a big believer in the River of News style of feed reader. Reverse-chronologic. Scan all the feeds every hour or every ten minutes, and accept notifications that feeds have updated in realtime. When there are new items, rebuild the river.
2. For the last month I've been posting pointers to my personal river, so people can get an idea of how it works for a real person.
3. A few weeks ago I issued a design challenge asking designers to take a look at the River of News idea and see if they could come up with a better way to display it. I'm using tables. They suck. Make it nicer. There was some interest, but so far nothing to actually look at.
So... here's what's new: Now...
When I build the HTML version of my River, I also build a JSON river.
If you view the content of that file you'll see it's a series of updatedFeed elements, each of which contains a list of items that are new.
This is the raw data of my River, with none of the formatting. In a language that every browser understands at its core.
Now it should be simple to experiment with new renderings. And not in a mockup. This is a real river, for a real user, that updates at least six times an hour, and usually quite a bit more often. (There are a number of frequently-updating realtime feeds I'm subscribed to.)
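Under stated assumptions about field names -- only updatedFeed and item come from the post; feedName, feedUrl and whenLastUpdate are illustrative -- building such a river from freshly-scanned items might look like:

```python
import json

def build_river(new_items_by_feed):
    """Group new items by feed, newest feeds first, newest items first.

    new_items_by_feed maps a feed's (name, url) tuple to a list of
    (timestamp, title, link) tuples gathered by the feed scanner."""
    feeds = []
    for (name, url), items in new_items_by_feed.items():
        items = sorted(items, reverse=True)      # reverse-chronologic
        feeds.append({
            "feedName": name,
            "feedUrl": url,
            "whenLastUpdate": items[0][0],
            "item": [{"title": t, "link": l} for _, t, l in items],
        })
    feeds.sort(key=lambda f: f["whenLastUpdate"], reverse=True)
    return {"updatedFeed": feeds}

river = build_river({
    ("Scripting News", "http://scripting.com/rss.xml"): [
        (1291900000, "Morning coffee notes", "http://scripting.com/1"),
        (1291950000, "JSON river", "http://scripting.com/2"),
    ],
})
print(json.dumps(river, indent=1))
```

A renderer then just walks updatedFeed in order; the reverse-chronologic sorting is already done for it.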
One more thing...
I released this last night as a feature of River2, an app that runs in the OPML Editor. To get the rendering for your river, you just have to turn on the Dropbox preference that allows a static rendering of the river. I will eventually add a dynamic page to the site that makes this step unnecessary.
That's it -- that's the news. I'm pretty jazzed about this and hope to create some new connections.
PS: It isn't just new in that this is in JSON, it's the first time there's been a data-only rendering of a river.
Update: There is now a JSONP version of the river.
I'm a veteran of many free speech campaigns on the Internet, dating back to the Communication Decency Act in 1996. I've been around this block many times. So when people said "I thought we were boycotting Amazon for their treatment of WikiLeaks" after I posted a link to Amazon's new programmable DNS feature (which I've been waiting for, thanks), I saw it all coming around again.
First, there certainly are things worth going to the mat for. But this is not one of them. For a variety of reasons.
To begin with, if I were in Amazon's shoes, and I have been, I'm not sure I wouldn't have done exactly what they did. They're running a business, not a government. They aren't the guarantors of anyone's rights; that's the government's job. There are plenty of choices for web hosting besides Amazon. And because they charge money for their services, they guarantee that there always will be. If Twitter had shut them off, that would be serious, because Twitter's pricing model, and lack of federation, more or less cuts off competition. There it would be really chilling. Here, not even slightly chilling.
Further, I depend on Amazon a lot more than most people do. For me it's not at all a casual decision.
And when I make a principled stand, I tend to stick by it. For example, I was a vocal critic of Twitter's Suggested Users List. When they put me on the list, I had to request to be removed. Painful thing to do, because millions of followers, like it or not, have huge PR value. No matter, I had put my stake in the ground.
I boycotted Amazon once before, for the one-click patent. It was when I learned that other open tech advocates were using (and loving) Amazon, years after I quit, that I realized my stand was pointless. I wasn't making a statement any longer, I was just cutting myself off from a convenient service.
We don't know all the facts about WikiLeaks. I see the press taking their side of the story and publishing it as fact. So maybe people who want a boycott are being misled. Maybe WikiLeaks didn't have to go to Amazon, maybe the DOS attack isn't severe. We don't know. I see, in WikiLeaks, a man and an organization that is very good at manipulating the public. They're not alone in that, the Republicans are good at it too, as are the North Koreans and Al Qaeda. Being good at manipulating us isn't good or bad. But then you have to ask "Do they really need my help?" And do they need it this way? And will everyone feel so strongly about boycotting Amazon in a couple of weeks, or will we all go back to business as usual?
All these things weigh in a decision as to whether or not to make a principled stand.
And it adds up to this, now, re Amazon: Not even a close call.
Just read a press release about Mitch McConnell and what he will and won't let the government do about taxes and jobless benefits.
This is a guy who has only been elected to represent the state of Kentucky, and leads the minority in the Senate. But he's talking as if he has the absolute last word. And given the ineptness of our President and his party he's probably pretty close to being right.
A chilling thought. We now live in the McConnell Administration. He's the boss. Here's a big picture of our new leader. Brrr.
But if he makes a mess, he should be held responsible for it.
I wrote a bunch of web content management tools in the mid-late 90s and early part of the last decade (we still don't have a name for it!) that turned into what we now think of as blogging tools. The category that's led by WordPress, Typepad and Blogger (and Posterous and Tumblr and certainly others).
Along the way a bunch of features fell by the wayside. One of them in particular is so important I'd like to pitch it to all the vendors of today's blogging tools. If one of them implements it, I'll put at least a year (Murphy-willing) into developing tools that work with the feature. I think that's a pretty good deal, when you realize how little work the feature will take.
Here's the deal. You all support the MetaWeblog API. That's great. When I save a blog post using newPost or editPost, allow me to include in the struct a new element called postSource, which contains the source code for the post. This is not executable code like Java or Python or C; it's structured text, in XML form, that the post was rendered from.
That's the punchline, now here's the background.
There's a lot of power in separating content from the rendering of the content. All of today's blogging tools have that concept to a degree. A series of posts flows through a set of templates and is rendered into a set of pages that are part of the blog website. But the posts themselves don't have the separation. When you write the post you're writing in HTML. For most people, most of the time, that's what they want. But if we have the ability to make tools that have the separation, we can build higher-level, more powerful editors that are also easier. In fact, I already have such an editor. I just can't use it with your blogging environment.
You just have to store the XML text along with the HTML text, and when I getPost, send it back to me. It's a black box you don't have to look into (but you should probably verify that it's just XML and not executable bits).
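To make the pitch concrete, here's a sketch of what a newPost call carrying the proposed postSource element might look like, using Python's standard xmlrpc.client. The blog id, credentials, and the shape of the source XML are all made up for illustration; the only new piece is the extra postSource member in the struct.

```python
import xmlrpc.client

# Rendered HTML: what blogging tools store today.
rendered_html = "<p>Hello <b>world</b></p>"

# The proposed postSource element: the structured XML text the post
# was rendered from. The tool treats it as an opaque black box and
# hands it back on getPost. (This shape is made up for illustration.)
post_source = "<post><p>Hello <b>world</b></p></post>"

struct = {
    "title": "Example post",
    "description": rendered_html,   # existing MetaWeblog element
    "postSource": post_source,      # proposed new element
}

# Serialize the metaWeblog.newPost call (blogid, username, password,
# struct, publish) without sending it anywhere, just to show the
# payload a server would receive.
payload = xmlrpc.client.dumps(
    ("blog123", "user", "secret", struct, True),
    methodname="metaWeblog.newPost",
)
```

A server that supports the element would just store postSource alongside the rendered HTML and include it in the struct returned by getPost.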
As with RSS, podcasting was not invented.
But with podcasting there was an Aha! moment, in a hotel suite in NYC, listening to a rock star ramble incoherently (or so it seemed) about media technology. The rock star was Adam Curry. And behind the confusion was a brilliant idea, which I explained in this piece, written on Halloween in 2000, a little over ten years ago.
I remember when it clicked for me, as if it happened just the other day. Once I understood it, and went back to California, it was less than two months before both ends of the system were working, using RSS as the transport format, and Radio UserLand as both the authoring and podcatching tool. A few years later, by the fall of 2004, lots of content was flowing across this pipe.
There are always three parts to every standard: reading, writing and content. It's also true of the process: dreamer, tech, media. (Sometimes one person plays two roles, sometimes all three.)
Who knows, maybe there's something here.
And while podcasting didn't make anyone rich, it certainly made our lives richer. Every day I go for a walk in NYC listening to Chris Lydon or Terry Gross interview someone, or the Radio Lab guys teach me something. Or Selected Shorts entertains me. Or whatever. "Rich" is the way I feel about the medium. It worked.
I posted this on an internal list, and thought it would be a good idea to make it public.
Finally I think I've done most of the fussing with my servers, and I actually backed up and turned off the server that was causing all the problems. Its work is now distributed among three servers. Seems likely that I'll have enough bandwidth.
And since the AFP app was unplugged by AFP, I now have the freedom to move that server. They were sending the pictures to a fixed IP address; if I had ever moved it, I would have lost the pics.
After all this fussing I now see what nirvana would be...
An integrated web server, registrar and DNS.
Without all the ads and upsell and pictures of female gymnasts.
The whole thing in one package, where you configure the sub-domains in the same place as you say where their content is served from and how it is served. One Get Info panel where you set it all up.
The difficulty in all this is all the different places you have to go and that have to agree with each other in order for anything to work. That's also where the fragility comes from.
And a reduction of the number of concepts you have to master. Most of them are irrelevant vestiges of the way some engineer who's probably dead used to think about this stuff before anyone understood it.
Lots of links (thanks!), but perhaps the most interesting to me, was the roundup posted at Berkman Center, quoting Ethan Zuckerman, Rebecca Mackinnon and myself. I think a lot about Berkman these days, and wonder if this was the event it was founded for.
Its full name is Berkman Center for Internet and Society. This one lands on Berkman's desk, I'm afraid.
Is it time to "do something" to create a haven for the free web?
Would the school form a legal braintrust to work on this? Could it happen, or is it hopeless? Do all lawyers want to work in government, and therefore would be working against their own interests if Wikileaks found a safe place to operate?
We now understand that we can't look to the tech industry or even the Library of Congress. The tech industry more or less failed the neutrality test, and the LOC has failed the unwritten code of librarians everywhere. They had a tough choice, no doubt, as part of the US government (think of their name, they take it seriously) they were obligated to maintain its confidentiality, but as librarians, they had an obligation to provide access to the information. I have a feeling that if you're a member of Congress, you can access the info. But the people, whom the government serves, may not enter.
Twitter, who I often am critical of, is singular in not banning Wikileaks. Good for them and thanks.
It makes sense that Switzerland, long the exemplar of neutrality, and the birthplace of the web, would also stand by Wikileaks.
Ron Paul, alone, in the government -- speaks up for Julian Assange.
I think of my friend and former boss, John Palfrey, who is now responsible for the Harvard Law School library. He has the same decision to make that the Library of Congress had, I assume.
And when you think about the future-safety of our writing, remember that it must be safe against a future where the government is even less perfect than the government of today. Sending archives into deep space doesn't seem too wacky when you think in those terms. Very deep space.
It's a perfect storm if you're a tech guy, a political guy and a writer, such as myself. Seems to me this is the kind of problem Berkman was founded to work on. I know they have moved in a different direction, but the Internet and Society need us now.
To paraphrase JFK: Ask not what the Internet can do for you, ask what you can do for the Internet.
Joe Lieberman calls Amazon to say they should cut WikiLeaks off. Amazon cuts them off, then says the Lieberman call had nothing to do with it. We have no reason to doubt Amazon. It's consistent with their philosophy of not taking sides in political battles. Still I wish Amazon, who I'm a big fan of, had stood with them, had maintained its neutrality.
Then their DNS host cuts them off, claiming they were being hit with a massive Denial-of-Service attack. Again, no reason to doubt their word, but we wish they had found a way to work with them. Then WikiLeaks says you can find them through wikileaks.ch, which redirects to a dotted IP address. People exchange this information on Twitter. So in a weird sneaker-net sort of way we have implemented a human DNS.
Now France runs a press release saying WikiLeaks can't be hosted there. We assume they had some reason to think that they might have this problem.
All the while there's a huge glaring 800-pound-gorilla elephant-in-the-room size contradiction. The pols say that businesses can't support WikiLeaks, but the Guardian and the NY Times and the newspapers in Pakistan, Indonesia, Israel -- all over the world -- they're businesses too. So if they're really serious about this, we're in First Amendment territory. They ought to be careful, because the politicians depend on these businesses to sell their product. Without them, they have no way to lie to us.
See the problem isn't that WikiLeaks is lying, the problem is that they're telling the truth. This is not business as usual.
While the politicians and reporters are getting a fumbling on-the-job education in the architecture of the Internet (an NPR reporter said, hesitatingly, that it appears as if the server is now in Switzerland), the next question is where does the running stop? When does the situation reach equilibrium? What's the best outcome for the people of the planet?
It seems to me that at the end of this chain is BitTorrent. That when WikiLeaks wants to publish the next archive, they can get their best practice from eztv.it, and have 20 people scattered around the globe at the ends of various big pipes ready to seed it. Once the distribution is underway the only way to shut it down will be to shut down the Internet itself. Politicians should be aware that these are the stakes. They either get used to operating in the open, where the people they're governing are in on everything they do, or they go totalitarian, around the globe, now.
That must be what they're discussing behind the scenes in government. And don't miss that this is equally threatening to media. They won't be able to engage in spin rooms and situation rooms, appearances and perception. When we can see the real communiques, that kind of mush won't do.
Ethan Zuckerman, via email: "Dave, [the torrents are] already there - the cables themselves are being distributed as a torrent. What's so crazy about all this is that the 'illegal content' everyone claims to be worried about hosting is basically just a promotional page - the sensitive stuff is out there in torrentspace and virtually impossible to stop from being distributed."
Daniel Bachhuber joined us for this week's Realtime Brunch.
He's a nice-looking young developer guy who moved from Portland, Oregon to the Big City to work at the J-school at CUNY.
He also writes WordPress plug-ins. Interesting guy.
He did something I've been thinking about -- he quit using Twitter and Facebook.
I had to shake his hand so I could say I shook the hand of someone who had quit.
I talk about it, but I haven't had the guts to do it yet.
I even have a site ready for the day I do it.
Every time I adorn it with another hamster cage I'm thinking. Thinking.
But I believe in not slamming the door on the way out. So that's all I'm going to say now.
Anyway, here's the post where Dan announced that he deleted his Twitter account.
His girlfriend wasn't pleased.
But the deed was done.
If you're looking for the single place in NYC to position yourself for maximum access to the rest of the city, Union Square is the place for you.
You can get to the upper east or west sides, the financial district, Brooklyn, Queens, the Bronx, the airports.
It's one of only two Manhattan streets that have a cross-town subway, the miraculous L train, which is probably responsible for the resurgence of Williamsburg, which is a single stop from Manhattan and just three stops from the magic of Union Square. (The other cross-town street is 42nd.)
On a walk today I happened by an entrance to the subway that tells the story.
At breakfast this morning Jeremy said something I've been thinking. He wants to be able to blog a single idea, without researching anything. A note like the notes he puts in the book he keeps in his breast pocket.
I said I agree! In fact that's what Scripting News was like in the early days.
DaveNet was where I wrote my longer pieces. It was more like what people think of as a blog today.
Jeremy Zilar and I have been having a roughly weekly breakfast in Manhattan for much of 2010, usually at a wonderful French restaurant in the West Village. We'd usually have the same meal, Feuille de Brique. Very tasty.
Since I moved to the east side we've been in search of a replacement and today we may have found it. The Silver Spurs diner at 771 Broadway. It has all the features you look for in a regular tech meetup place.
1. It's relatively empty for the morning meal on a weekday morning.
2. The tables aren't nailed to the floor so they can be reconfigured if 20 people show up.
3. The prices aren't bad.
4. It's got a bit of kitsch.
In fact it reminded me of Buck's Woodside, but without the kooky quirkiness.
So if we have a breakfast meeting don't be surprised if I ask to meet at the Silver Spurs on Broadway.
As you weave among the obstacles on the sidewalks of Manhattan, it's easy to get distracted from your thoughts and pay attention to the people you're encountering. It's okay to do that if you're at a stop, but if you're in motion, if your eyes engage with another's, that signals that you would like to negotiate. Not good. A sign of weakness. Whether the oncoming traffic is aware or not, he or she will take advantage of this weakness and charge right into your path, all the while not making eye contact. There is no appeal.
All you can do is shift out of their path, but even this won't avoid a collision because your adversary will unconsciously shift closer to you. Your weakness is attractive. Your space is up for grabs. At this point you have no choice but to collide, and in the etiquette of NY street walking you're responsible.
That's why the people who check their smartphones for text messages or emails while walking so totally command the sidewalks. They are heat-seeking missiles, and it's your heat they seek.
I don't think this is just New York, it's a feature of the human species. We seek companionship.
For a while in 2005 I lived on the beach in northeast Florida outside St Augustine. The beach is so long and relatively empty, they let you drive on the beach to find the perfect spot to bathe, and if you're willing to drive a bit you can be alone. So I would drive to a secluded spot, park my car and go out into the surf. When I came back, more often than not, there was a car parked right next to mine. They could have parked anywhere in a mile in either direction and had it all to themselves.
In the server mess I had to take the FeedHose server off the air so I could reorganize things. Now I'm ready to turn it on again.
This server is for testing for interop only. Please don't build serious systems on it.
The default hose on this server is Hacker News.
The main entrypoint is the default page. It's meant to be called from an application. It's a long-poll, meaning it returns when something new is available from the indicated hose, or if it times out. In either case you're expected to loop back around and call it again.
Here's an example call that times out after three seconds, returns JSON-formatted news from Hacker News. (If no new stories came in during the three seconds, which is almost a certainty, you'll just get metadata.)
There are four optional parameters, all of which have defaults:
1. name, the name of the hose you're inquiring about. If it's not specified the information returned is about the default hose (on this server, Hacker News).
2. timeout, the number of seconds before the call times out. If it's not specified, the default value is 120 seconds.
3. format, the format of the returned text; the example above asks for JSON.
4. seed, a string returned from a previous call to the hose, passed back to the next call. If any new items came in while you were processing the previous call, or if your client rebooted or otherwise went away for a long time, passing back the seed assures that you don't miss any news. If you don't mind possibly missing some news you don't have to specify the seed.
None of the parameters are case-sensitive. In other words, JSON is the same as json is the same as JsOn.
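A minimal long-poll client can be sketched as follows. The base URL is a placeholder (the real server address is in the example link above), and the response shape in the trailing comments is assumed; only the parameter names come from the description.

```python
from urllib.parse import urlencode

# Placeholder base URL -- the real server address is in the post's
# example link and isn't reproduced here.
BASE = "http://feedhose.example.com/"

def hose_url(name=None, timeout=None, fmt=None, seed=None):
    """Build a request URL from the four optional parameters."""
    params = {}
    if name is not None:
        params["name"] = name
    if timeout is not None:
        params["timeout"] = timeout
    if fmt is not None:
        params["format"] = fmt
    if seed is not None:
        params["seed"] = seed
    return BASE + ("?" + urlencode(params) if params else "")

# A client loops: each call returns when news arrives or the timeout
# expires; pass the returned seed into the next call so nothing is
# missed. (fetch() and the response shape are assumed, not specified.)
#
# seed = None
# while True:
#     response = fetch(hose_url(timeout=120, fmt="json", seed=seed))
#     seed = response.get("seed")
#     for item in response.get("items", []):
#         process(item)
```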
The values returned are straight from RSS 2.0: elements like title, link, description, pubDate, etc. There are three additional values that are passed back, the feedUrl, feedTitle and feedLink of the feed the item came from. Hoses can be made up of many feeds, and it's sometimes useful to know which feed an item came from.
There are two other entry-points:
1. recent, which just returns the latest stories. It's useful to see what you get back from the feedhose while you're developing, or to show someone roughly what the hose looks like at a technical level. It looks for several optional parameters, as described above: name and format that determine which hose is queried and what format the returned text is in. There's an additional optional parameter, count. If not specified it's 3. The maximum count is 15.
2. seed, which returns the current seed without waiting and without returning any news. You can specify the name and format, as optional parameters, as described above.
As part of my server reorg, I now maintain a mirror of the static content of scripting.com on S3. The address of this archive is s3.scripting.com. If you go there you'll get an error message; there's no need to report it.
This backup includes the .htaccess files, sitemaps, and could possibly include other metadata. If you have any suggestions let me know.
Uploading the huge folder was a messy operation.
So here's a feature request for the Amazon team: Allow us to upload a zip archive containing the initial contents of a bucket. That way I can arrange it exactly as I want it, and have the upload take a relatively short period of time. All the tools I use are relatively slow at uploading tens of thousands of files in a single batch.
Even better of course would be if Amazon could understand the .htaccess files, and even understand a subset of the httpd.conf file. Then we'd really be cooking with gas!
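For illustration, the client side of that hypothetical zip-based upload, packing a folder into an archive that preserves the bucket's intended layout, could look like this with Python's standard zipfile module. The bucket-from-zip feature itself doesn't exist; this only prepares the archive a client would hand to it.

```python
import pathlib
import zipfile

def zip_folder(folder, zip_path):
    """Pack a local folder into a zip, preserving relative paths, so a
    hypothetical bucket-from-zip upload could recreate the layout.
    Note: zip_path should live outside the folder being packed."""
    folder = pathlib.Path(folder)
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(folder.rglob("*")):
            if path.is_file():
                # Store entries relative to the folder root with forward
                # slashes, matching how S3 object keys are laid out.
                zf.write(path, path.relative_to(folder).as_posix())
    return zip_path
```

One zip means one upload, instead of tens of thousands of individual PUTs.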
As part of my server reorganization, I'm once again maintaining sitemaps on scripting.com.
There's a single sitemapindex file that links to all the sitemaps.
I also released the code that generates the sitemaps, running in the OPML Editor. Here's a source listing.
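The sitemapindex format itself is simple. Here's an illustrative sketch in Python using the standard library; the real generator runs in the OPML Editor, and the URLs in the usage example are made up.

```python
import xml.etree.ElementTree as ET

def sitemap_index(sitemap_urls):
    """Render a sitemapindex document linking to each per-section
    sitemap, per the sitemaps.org protocol."""
    root = ET.Element(
        "sitemapindex",
        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9",
    )
    for url in sitemap_urls:
        sm = ET.SubElement(root, "sitemap")
        ET.SubElement(sm, "loc").text = url
    return ET.tostring(root, encoding="unicode")
```

A robots.txt Sitemap: line or a search-engine ping then only has to point at the single index file.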
Update: When the nightly backup code runs it also produces a JSON file listing the URLs of files that changed since the last time it backed up. It's stored in a calendar-structured folder.