News and commentary from the cross-platform scripting community.
Mail Starting 3/31/97
Dave, some time ago we met. I started The Computer Museum in 1979 .. and now, in 1996, The Computer Museum added a division here in Silicon Valley .. The Computer Museum History Center, devoted to the preservation of computing.
From: email@example.com (Gwen Bell);
Sent at 4/1/97; 10:24:34 PM;
saving software .. and more
Long ago, I learned that computing is a verb ... it is about doing. .. And things that get done in software on one turn of the revolution, get done in hardware on the next, and then back to software and so on.
Collecting and preserving the history of computing means collecting the hardware (and trying to keep some of it running), the programs -- software and their documentation, personal accounts of how things were 'really done', films and videotapes of machines, interactions, commentary, photographs, and the ephemera that provided a commentary for this.
I hope you remember me, because The Computer Museum is another place that really has done and can show you what Fred Davis talks about. For example, The Computer Museum has the original SuperPaint program done at Xerox PARC by Dick Shoup. The hardware is based on a DG NOVA with lots of special stuff. We have the documentation, the hardware and software in "runnable condition" with Dick around, and a video of Dick giving a demonstration. We also maintain the original "SpaceWar!" program on a running PDP-1, circa 1962 .. there are no more of these around (at least that I know of). We have a caged copy of the Morris worm .. and we actually have these available now.
Fred Davis and the San Francisco Computer Museum made an offer to take the Livermore collection of computers (hardware) that has been in jeopardy. They had to renege on their offer; they did not have the resources to carry out the project. Fortunately, The Computer Museum History Center has taken the first component and will take the entire collection.
We have been there for some 15 years collecting and preserving computing. But we don't do it all. The Babbage Institute in Minnesota has the collection of materials on computing languages, and they also keep corporate records. Stanford, MIT, and other university libraries/museums are maintaining what was developed at their institutions. And I must mention that some corporations, such as Microsoft, IBM, DEC, Fujitsu, and Intel, preserve the records of the computing they have made. Finally, the Air and Space Museum and the National Museum of American History in the US, the Deutsches Museum and the Nixdorf Museum in Germany, and the Science Museum in London have active collecting/preservation programs that include software as well as hardware, i.e., they encompass computing. And most of us in the business today support each other, so that the public can understand the evolution of this significant new advance.
I hope that we can count on you to help in this process.
To the people who posted mail about the Boston Computer Museum:
From: firstname.lastname@example.org (Fred Davis);
Sent at 4/1/97; 9:55:10 PM;
Thanks for your interest in this topic. Yes, I do know about the Boston Museum, and about their plans for a History Center. In fact, the folks on my board had a meeting with the Bells, Len Shustek, and Carol Welsch last fall. But there are some important differences between us and some popular misconceptions about their project. Here are some points to consider:
-- The History Center will NOT be open to the public. It is a place for scholarly researchers to use (including us, I hope). The Bells' public Museum will remain in Boston. In contrast, the Computer Institute's SF Computer Museum will be open to the public and also have a major Virtual Museum on the Web.
-- The History Center places its emphasis on hardware and other physical collectibles. The Computer Institute's SF Computer Museum encompasses hardware, software, computer and multimedia art (most often neglected), culture, and people. We don't just put old boxes on display, we also highlight the achievements of the people and place it all in both a historical and forward-looking context.
-- The History Center is Valley-based. The Computer Institute's SF Computer Museum is international. Locating in SF (one of the world's top tourist destinations) opens it up to the widest possible access and exposure. We plan to work with other major museums around the world to further broaden our reach.
-- The Computer Institute also has strong support from the community. Our early supporters include Senator Dianne Feinstein, Mayor Willie Brown, SF Supervisor Leslie Katz, Ted Nelson, David Bunnell, Michael Leeds, Christine Comaford, Marc Canter, Brewster Kahle, Katie Hafner, Rich Levandov, Andrew Eisner, George Coates, Steve Beck, Raines Cohen, and many others. Our curator and VP is Kip Crosby, head of the Computer History Association of California http://www.chac.org, and publisher of the Analytical Engine, the only publication devoted to computer history.
These differences are major in both content and direction: The Computer Institute and its San Francisco Computer Museum are charged with providing the widest possible range of computer and computer-related historical, contemporary, and future content to the greatest number of people, whereas the Boston Museum and its History Center are limited in scope, subject, and attendance.
I'm not sure if you are aware of The Computer Museum, based in Boston, which has also initiated a project to build a History Center here in Silicon Valley.
From: email@example.com (John Shoch);
Sent at 4/1/97; 6:38:14 PM;
The Museum in Boston is a major public entity, with a special focus on education, etc.; the History Center here will have more of a focus on collecting and retaining important artifacts of computing. [The Computer Museum provided the materials for the historical exhibition which you may have seen at Comdex this year, and is also the sponsor of the annual East-West Computer Bowl.]
It seems that the "SF Computer Museum" has some very ambitious ideas for an entertainment-oriented mega-project, while The Computer Museum is already collecting machines. Much of The Computer Museum's collection of equipment has already been moved to the West Coast; it is not yet on display, but can be visited by appointment in their space at Moffett Field. In addition, I believe they have acquired the collection from Lawrence Livermore. It is a great place to visit when they have an open house -- everyone re-connects with an "old friend" they once programmed on!
This Valley-based project has strong support from the community (early contributors include Gordon and Gwen Bell, Len Shustek, Eric Benhamou, Steve Blank, Bill Davidow, Dan Lynch, Jon Rubinstein, Charles Simonyi, etc., etc.....(and me, of course)).
For more information, see: http://www.tcm.org/info/press/wpr-silvalley.html or contact Carol Welsch, Director of the W. Coast Office.
Cheers, John Shoch
[PS: I do recognize that the problem of preserving old software is very important (but different from building a history center). I agonized long and hard about throwing out all my old Alto disk packs (2.5MB each!), which included my PhD thesis, but concluded that it would be easier to have someone scan and OCR the hard-copy! But there are a number of old Xerox alums with Altos, D-series machines, and Stars.....]
FYI - There was a very good article in Scientific American in (I think) mid-1995 on this very subject. It provided several very good scenarios illustrating the problem. Sorry I don't have a more complete reference but if you're interested you ought to be able to find the article.
From: firstname.lastname@example.org (Brian Molyneaux);
Sent at 4/1/97; 3:40:31 PM;
The described situation is what feeds an entire industry. If everything were reusable and rebootable, a lot of people would have to look for new jobs.
Sent at 4/1/97; 5:36:16 PM;
Here's another aspect on conservation:
BOEING (is this a great phonetic logo, or what) conducted a study of resource management in their engineering department.
The result: 75% of the time is spent looking for information that has already been created, or recreating information that exists but that people can't find or apply for some reason.
Unlike manufacturing, where lean thinking and waste management are #1 topics, in research & design (and marketing too) we live happily on tremendous piles of information waste.
Put that in juxtaposition with the never-ending information redundancy and overflow from TV, newspapers, and magazines, and you'll see that we have huuuuuuuge problems with information management and handling in general.
And by the time we find solutions for certain types of problems, technology will present us with new tools to create even more chaos.
Anyway! Good luck with your cleaning efforts.
Reply to: RE>Saving Software
From: email@example.com (Richard Landry);
Sent at 4/1/97; 2:10:42 PM;
Hi, Dave. Why not use software emulators for the older systems? How long do you think Fred will be able to keep the Xerox Star going? All these older systems were built less solidly than today's PCs, with more fragile storage media to top it off, so there's no question that they will ultimately break. If the idea is to keep the software accessible, then the only certain way to do that is to host the software in a new body.
What about the (original) Computer Museum in Boston that Gwen Bell started? They have tons of old computers, and I'll bet they've got a lot that run, too!
From: firstname.lastname@example.org (Catherine Mikkelsen);
Sent at 4/1/97; 12:23:53 PM;
computer museum in boston
My old computer science teacher in High School _still_ runs Cromemcos. I remember using them with one of the first versions of Informix to put our National Honor Society on computers. It was way cool...Let me know if you'd like to get in touch with him. I don't have an email address for him but I do keep in touch with him in other ways.
From: email@example.com (Omar Javaid);
Sent at 4/1/97; 1:44:05 PM;
I had just read the comments on NetObjects Fusion and the "Saving Software" piece, and an interesting sort of mental cross-pollination happened -- we don't just have to worry about disk formats changing, we also have to worry about how things are stored on those disks. Most of the software I've written lives on disks I can still read, albeit with effort, but it's still unusable because the development environments saved sources in a tokenized format. It's not source code, it's a binary file. Early MS Basic did this, as did Mac Pascal (the precursor to Lightspeed Pascal).
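To see why tokenized storage ties a source file to its original environment, here is a toy sketch in Python. The token table is invented purely for illustration -- it is not the real MS Basic encoding, which varied between versions, and that variation is exactly why these files are unreadable without the original interpreter.

```python
# Toy model of tokenized source storage. Interpreters like early MS Basic
# replaced keywords with single bytes to save memory; the byte values
# below are made up for illustration, NOT the actual MS Basic token table.
TOKENS = {0x81: "PRINT", 0x82: "GOTO", 0x83: "IF", 0x84: "THEN"}

def detokenize(data):
    """Expand token bytes back into keywords; pass plain ASCII through."""
    out = []
    for b in data:
        out.append(TOKENS.get(b, chr(b)))
    return "".join(out)

# A stored "line" of tokenized source and its recovered text:
line = bytes([0x81]) + b' "HELLO"'
detokenize(line)  # -> 'PRINT "HELLO"'
```

Without the token table -- that is, without the original environment -- the bytes on disk are just an opaque binary file, which is the author's point.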
From: firstname.lastname@example.org (Richard Clark);
Sent at 4/1/97; 10:58:13 AM;
Saving source code
This presents an interesting problem -- have you ever tried to run Mac Pascal on a modern Macintosh? This is where I get concerned with Fusion et al. storing web content in a proprietary format -- unless there's a workable HTML source copy someplace, those materials may be lost as platforms move forward, programs drop compatibility with their ancient file formats, etc.
I've been looking into this archive problem in the computer art field (not that hacking code is any less of an art form). We have three decades of interactive computer art that is running the risk of being lost for all time.
From: email@example.com (wanda);
Sent at 4/1/97; 11:22:55 AM;
Interactive Art Backups
Not only do we have to maintain working CPUs and OSes, but documentation has to be done on how these pieces were created in the first place. Many of the great telepresence and VR pieces were hacked together the night before out of three versions of the code.
I'm working on an online log of the art pieces: where they were exhibited, what hardware and software were required, and a place for artists to drop copies of their code.
I talk to Fred every now and then, and think he should be pushing the restoration of interactive art as the case for saving the old CPUs and OSes. Backers who don't understand the importance of preserving code and the ability to run it can better grasp preserving an art form they are familiar with.
I have a couple of URL's for you to check out on this topic.
From: firstname.lastname@example.org (David Weingart);
Sent at 4/1/97; 1:36:12 PM;
The first one is for the Obsolete Computer Museum:
That's a good jumping off point for learning about some older computer technology. This site concentrates mostly on hardware.
The other URL is for the Emulation on the Macintosh site:
This site lists the bewildering array of emulators that exist for the Mac. There's an emphasis on games, but many old operating systems are represented as well.
Sent at 4/1/97; 9:59:03 AM;
An example, [Fred Davis] believes he has the only bootable Xerox
Star on earth. I hope he's wrong!
I think he is: http://wildflower.meer.net/ is a Xerox Dandelion (the CPU in the original Xerox Star) belonging to Alan Freier email@example.com, complete with a TCP/IP stack and HTTP server.
I applaud Fred's project, and want to point out that in addition to preserving old hardware (especially important to read old magnetic media), it's possible to emulate most old machines in real time or faster. An interesting article that covers both approaches is:
Maxwell M. Burnet and Robert M. Supnik, "Preserving Computing's Past: Restoration and Simulation," Digital Technical Journal, Vol. 8, No. 3 (December 1996), pp. 23-38. Available online at http://www.digital.com/DTJN00/
Supnik has written emulators for many old computers from DEC and other companies, and has negotiated hobbyist licenses for a number of the operating systems.
This piece was very timely. Web pages are "media" + "software". They are also "collages" of files.
From: firstname.lastname@example.org (Stephen Bove);
Sent at 4/1/97; 9:57:43 AM;
I've produced or co-produced a number of big commercial websites in my "web-life". I've also produced or been part of producing shows for TV and feature films.
In Hollywood, you assemble your work into a "reel", which is really your resume. If you forget to get something onto your reel in the heat of production or post production, it's pretty easy to get a hold of a videotape of the work at a later date.
I recently decided it would be fun to have a "reel" of my web-work (up till now I've just been pointing people at URLs). But alas, the web sites I've been part of making are "alive". They change every day/week/month. My handiwork is still present in some of them, but, with one exception, the "UI" for all of them has changed. Is this still my work? How can I get copies or printouts of the pages I played a direct role in creating?
Guess what. It's almost impossible - because of the compound nature of the "medium". GIFs and JPEGs are scattered all over the place, many of them lost. Databases upon which dynamic pages are based have been re-architected and overwritten. Text templates, CGIs and HTML files have been changed beyond recognition and the originals are almost never to be found.
Except for going into back-up tapes, which will require enormous effort on the part of various overworked sys-admins and web masters, I'm out of luck (and even then, there is no guarantee that the back-up tapes contain orderly content archives or have not been "rotated" through a few overwrites by now). Bottom line: except for a few color xeroxes of pages that I made as proofs, I'm SOL.
So, if you make web-stuff, save copies of your work methodically (all the pieces!!). And also, make good color copies of the pages you really like and save them in a portfolio.
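A minimal Python sketch of that advice (the flat filename scheme and the single-page scope are my own simplifications; a real archive would also need to save the GIFs, JPEGs, stylesheets, and CGI output each page depends on):

```python
import os
import re
from urllib.request import urlopen

def local_name(url):
    """Map a URL to a safe, flat filename for the archive directory."""
    name = re.sub(r"^[a-z]+://", "", url)         # drop the scheme
    name = re.sub(r"[^A-Za-z0-9._-]", "_", name)  # neutralize path separators
    return name or "index"

def archive_page(url, directory="web-reel"):
    """Fetch one page and save its raw bytes. A sketch only: the images
    and templates the page references must be saved the same way."""
    os.makedirs(directory, exist_ok=True)
    path = os.path.join(directory, local_name(url))
    with open(path, "wb") as f:
        f.write(urlopen(url).read())
    return path
```

Run on each page as you ship it, this builds the "reel" at the moment the work exists, rather than hoping the sys-admins' backup tapes still hold it later.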
Just wondering why you mentioned Object Design on your News website?
From: email@example.com (Andrew Migliore);
Sent at 4/1/97; 9:21:59 AM;
Re:"Object Design's Objectstore OODB"
As I mentioned around seven months ago, the product I am working on, called DynaBase, uses ObjectStore for its OODB.
Now that you are running a few NT machines, you might want to check out http://dynabase.ebt.com again. DynaBase is a web configuration management system with a full-text index and link management, and it is scriptable through WebBasic.
It's been snowing heavily for over a day in Boston. This morning when I woke up, the city was shut down. Nobody trying to get anywhere. Subways weren't running. No taxicabs. No newspaper, no mail delivery.
From: Sally@kins.com (Sally Atkins);
Sent at 4/1/97; 12:06:49 PM;
the blizzard, day 2
Still, at noon, the snow continues to fall. The only traffic outside my window is foot traffic.
Hundreds of thousands of people around the metropolis of Boston have no power due to trees which snapped under the weight of the wet snow in the high winds last night.
Everyone has a day to reflect on the power of Mother Nature. It's beautiful. This is one of the things I missed when I lived in California. THE SNOW!
Time to go sledding
I've had a similar idea, but on the software, rather than hardware side. With a basement full of old and new software, I routinely pack boxes in hopes that someday they'll be of interest to people. The rest I typically give away to schools, but even still it's a losing battle... the software pours in faster than I can do anything with it..
From: firstname.lastname@example.org (Rich Santalesa);
Sent at 4/1/97; 4:40:38 PM;
In the course of this finger-in-the-dike game I thought of an idea that I wanted to bounce off you, but you beat me to it... I wanted to create a software museum that, as soon as software came off the market, placed the executables (not the code) in the "public domain," as it were.
The idea was to offer both an FTP site so people could download that old copy of Jazz or Symphony, and a physical location where the actual manuals and boxes would be "archived." For programmers it'd be a real resource, I think, for if we forget the past, we're doomed to waste time re-coding it. ;-)
For instance, how many people remember that HP's NewWave -- six years ago -- put a scriptable, OO shell on top of Windows? Not many, but I have a copy downstairs. In many ways, today's software lacks huge areas of functionality that we took for granted years ago...
If you are having trouble finding one of these (and actually want to), you might try a company called ColorGraphics Systems in Madison, Wisconsin. They were one of the companies that made "weather computers" and were still using Cromemco machines up until I left the broadcasting business. I think the first system I used (back in the late 70's) was based on a Z2D. Weren't all real computers rack-mountable?
From: email@example.com (Brad Pettit);
Sent at 4/1/97; 8:49:08 AM;
From: firstname.lastname@example.org (Chris Trimble);
Sent at 3/31/97; 4:03:07 PM;
Re:"scripting languages (ousterhout article)"
Excellent pointer to an excellent article. Thanks for mailing that out. I am doing a lot of my work in Python http://www.python.org these days, and am really loving it. It is a structured approach to scripting: it offers the dynamically typed nature of a scripting language, yet large Python projects are easier to maintain because of that structure.
Also check out Dave Beazley's SWIG. SWIG is a great tool for anyone who wants to use C or C++ code from Tcl, Perl, or Python. Pretty much all you need is the header file, and you can make an interface from one of these languages to just about any C library.
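SWIG itself reads a C header and generates wrapper code that must then be compiled, so it can't be shown self-contained here. But the glue it automates looks roughly like this hand-rolled equivalent using Python's ctypes module -- a different tool, named plainly, used only to illustrate what "an interface from a scripting language to a C library" means:

```python
import ctypes
import ctypes.util

# SWIG generates this kind of binding automatically from a header file.
# As a hand-written illustration of the same idea, ctypes binds a C
# library function directly -- here sqrt() from the C math library.
_libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")
_libm.sqrt.restype = ctypes.c_double
_libm.sqrt.argtypes = [ctypes.c_double]

def c_sqrt(x):
    """Call the C library's sqrt() through the binding."""
    return _libm.sqrt(x)

c_sqrt(16.0)  # -> 4.0
```

The declarations of restype and argtypes are exactly the information SWIG pulls out of the header file for every function at once, which is what makes it so convenient.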
Reading the WebMonkey link regarding Cascading Style Sheets and the discussion of cross platform type issues, I want to point something out regarding this statement:
From: email@example.com (Brad Pettit);
Sent at 3/31/97; 8:33:20 AM;
Cross-Platform Type Issues
>Windows, on the other hand, ignores convention, choosing instead to
>display type for all the old farsighted folks at home - making a
>12-point font as big as 16 points. So when I specify a font to look like
>12 points for that 75 percent of the Web on Windows by using a 9-point
>face, the other 25 percent of readers can't read it.
Windows is not the culprit here. Most of the Windows documentation teaches the usage of type based on logical coordinates, which is what the article is griping about. The foundation of logical coordinates is supposedly based on the resolution and size of VGA monitors and how much text will fit across the display. However, the Windows API has the means to get the physical resolution of a graphics device as well as the attributes of a font. It is simple from there to handle text correctly. Even a glance at Petzold 95 will show the way.
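The arithmetic behind the quoted gripe is simple: a point is nominally 1/72 of an inch, but Windows' conventional logical resolution is 96 dpi while the classic Mac assumed 72 dpi. A short sketch (the 96 and 72 dpi figures are the conventional defaults, not something every physical display honors):

```python
def points_to_pixels(point_size, dpi):
    """Convert a nominal point size (1 pt = 1/72 inch) to device pixels."""
    return round(point_size * dpi / 72)

# The same 12-point request renders at different pixel heights:
points_to_pixels(12, 96)  # -> 16 pixels at the Windows default of 96 dpi
points_to_pixels(12, 72)  # -> 12 pixels at the classic Mac's 72 dpi
```

That is precisely the "12-point font as big as 16 points" complaint; querying the device's actual resolution through the API, as described above, lets a program undo the mismatch.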
After spending time wading through cross-platform type issues in the OpenDoc framework (the best cross-platform development library!), I consider the basic Windows API superior to the basic Mac API calls for text. The most commonly used Mac API (GetFontInfo) is overly simplified, in my opinion.
One of the greatest differences is not in font resolution, but the reference for text size. When one system includes descent in a font height and the other doesn't, and neither has reasonable performance when getting font metrics, resolution is the least of your problems.
It's time that the responsibility for poor text handling in Windows was placed where it belongs -- on the developers and the Windows programming books, rather than on Microsoft. Windows software is crappier than Mac software not because of Microsoft, but because Windows users have learned to tolerate crappiness more than Mac users have.