Dave Winer, 56, is a software developer and editor of the Scripting News weblog. He pioneered the development of weblogs, syndication (RSS), podcasting, outlining, and web content management software; former contributing editor at Wired Magazine, research fellow at Harvard Law School and NYU, entrepreneur, and investor in web media companies. A native New Yorker, he received a Master's in Computer Science from the University of Wisconsin, a Bachelor's in Mathematics from Tulane University, and currently lives in New York City.
"The protoblogger." - NY Times.
"The father of modern-day content distribution." - PC World.
"Dave was in a hurry. He had big ideas." -- Harvard.
"Dave Winer is one of the most important figures in the evolution of online media." -- Nieman Journalism Lab.
10 inventors of Internet technologies you may not have heard of. -- Royal Pingdom.
One of BusinessWeek's 25 Most Influential People on the Web.
"Helped popularize blogging, podcasting and RSS." - Time.
"The father of blogging and RSS." - BBC.
"RSS was born in 1997 out of the confluence of Dave Winer's 'Really Simple Syndication' technology, used to push out blog updates, and Netscape's 'Rich Site Summary', which allowed users to create custom Netscape home pages with regularly updated data flows." - Tim O'Reilly.
I loved a lot of things about the Apple II and came to hate quite a few others. But the biggest thing it brought to my world was memory-mapped video. It's probably a foreign concept to programmers today, because we're so far from it, but it was the coolest thing. It was the guarantee that you could make the hardware do anything it could do that you could think of. Which back then wasn't really very much.
Here's the deal. There was a two-dimensional array of memory that you could read from and write to, just like any other memory. But the display hardware read the memory sixty times a second and smashed the bits out onto the display. So the way you changed a pixel from white to black was by clearing a bit in the right place. No APIs, just write directly to the memory.
There were actually two screen buffers, or was it four?
There was a bitmapped display and a character-mapped display.
And there were two versions of each so you could prepare a new screen out of view of the user, and then change the address of the screen buffer and blam the bits would all change at once.
Maybe that was on the IBM PC -- it also had a memory-mapped display.
That idea didn't make it to the Macintosh, though I wish it had. But they had something even cooler, QuickDraw. Those were the days.
The reason I think of it is that I have become a rabid consumer of CSS tricks to make today's screens do impressive things that wouldn't even be slightly impressive on an Apple II or PC of the early 1980s.
If Woz is out there reading this, a virtual hug to you! What a great hack. I totally loved it then, and I love the memory of it today. Keep on truckin'.
Some things I push to Twitter are bits that I want to come back to.
I use Twitter the way people used Delicious, Instapaper, or Readability.
Because there's room in my mind for just one bookmarklet that "Routes this somewhere."
The same channel is used for must-read bits. Or pictures I took in the park, or on a bike ride. It's a mishmash that no one is supposed to make sense of or remember in any general way. It's my contribution to the fog of realtime.
That's all I had to say today.