I loved a lot of things about the Apple II and came to hate quite a few others. But the biggest thing it brought to my world was memory-mapped video. It's probably a foreign concept to programmers today, because we're so far from it, but it was the coolest thing. It was the guarantee that you could make the hardware do anything it could do that you could think of. Which back then wasn't really very much.
Here's the deal. There was a two-dimensional array of memory that you could read from and write to, just like any other memory. But the display hardware read that memory sixty times a second and smashed the bits out onto the display. So the way you changed a pixel from white to black was by clearing a bit in the right place. No APIs, just write directly to the memory.
There were actually two screen buffers, or was it four?
There was a bitmapped display and a character-mapped display.
And there were two versions of each so you could prepare a new screen out of view of the user, and then change the address of the screen buffer and blam the bits would all change at once.
Maybe that was on the IBM PC -- it also had a memory-mapped display.
That idea didn't make it to the Macintosh, though I wish it had. But they had something even cooler, QuickDraw. Those were the days.
The reason I think of it is that I have become a rabid consumer of CSS tricks to make today's screens do impressive things that wouldn't even be slightly impressive on an Apple II or PC of the early 1980s.
If Woz is out there reading this, a virtual hug to you! What a great hack. I totally loved it then, and I love the memory of it today. Keep on truckin.
Some things I push to Twitter are bits that I want to come back to.
I use Twitter the way people used Delicious, Instapaper or Readability.
Because there's room in my mind for just one bookmarklet that "Routes this somewhere."
The same channel is used for must-read bits. Or pictures I took in the park, or on a bike ride. It's a mishmash that no one is supposed to make sense of or remember in any general way. It's my contribution to the fog of realtime.
That's all I had to say today.