News and commentary from the cross-platform scripting community.
Mail Starting 5/29/97
From: firstname.lastname@example.org (webworks);
Sent at 6/2/97; 11:33:59 AM;
Re:Jesse Berst on Web Development
the problem is that there really is *no* database that has been built for this purpose or medium. Oracle, Informix, Sybase - they're building databases for financial systems. they're building databases to handle extremely complex queries on very large mainframe systems on internal networks. Microsoft is building databases for LAN-based workgroups (a la Notes). In neither case are their products built around TCP/IP or designed with APIs that have been developed for integration with existing Web products.
wish list for a Web database:
1) optimized for low transaction time with relatively simple queries.
2) scalable to millions of transactions per day, with high tolerance for heavy concurrent usage.
3) optimized for simple SQL queries (SELECT, INSERT, UPDATE, DELETE).
4) small memory footprint.
5) well-worked-out API for integration with JDBC, NSAPI, CGI, CORBA, RMI, etc.
6) native TCP protocol.
7) minimized resource usage.
8) cross-platform (in this case, all Unixes and NT).
9) less than $20k.
i actually think there is a market for a product like this, but people seem to be more interested in either repurposing existing expensive SQL database products that are overkill and cost too much, or repurposing existing desktop or workgroup database products that don't have the performance, scalability, or functionality needed.
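The simple query workload the wish list targets can be sketched in a few lines. This is only an illustration, using Python's bundled sqlite3 module as a stand-in for the hypothetical lightweight web database (the table and column names are invented):

```python
import sqlite3

# In-memory database standing in for the hypothetical lightweight web DB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (path TEXT PRIMARY KEY, hits INTEGER)")

# The four simple statements the wish list calls for:
conn.execute("INSERT INTO pages VALUES (?, ?)", ("/index.html", 0))   # INSERT
conn.execute("UPDATE pages SET hits = hits + 1 WHERE path = ?",
             ("/index.html",))                                        # UPDATE
row = conn.execute("SELECT hits FROM pages WHERE path = ?",
                   ("/index.html",)).fetchone()                       # SELECT
conn.execute("DELETE FROM pages WHERE path = ?", ("/index.html",))    # DELETE
conn.commit()
print(row[0])  # → 1
```

No joins, no stored procedures, no report writer: just fast, small statements under heavy concurrency, which is exactly the profile the big financial-system databases were not tuned for.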
When you dynamically build a website, you get all the intra-site links for free. And it is trivial to dramatically increase the number of navigation links, which makes the site easier to use.
From: email@example.com (Robert J. Woodhead (AnimEigo));
Sent at 6/2/97; 8:56:42 AM;
Re:Jesse Berst on Web Development
http://www.animeigo.com/ is built that way, using a custom MPW shell program I hacked together (MPW PASCAL is my "comfort language" for small tool projects) about a year ago. I recently revamped the site layout totally, without changing the content at all. Each of the content pages is a "stub" file containing the content and macros for invoking buttons, etc.
The program also generates a permuted site index and recent-changes lists automatically. It is hack piled on hack, reading a bunch of tab-delimited database files that sort of accumulated over time as we added features to the website, but it works and is damn near bulletproof.
For highly dynamic content, on-the-fly is required, but for a site where the content may accumulate but individual content pages are static between updates, generating to HTML files is usually the best approach because the Mac file-system is much harder to mung, and much easier to repair, than most DBs. Plus of course, one less point of failure [DB app] on the webserver.
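The stub-plus-macro idea described above can be sketched in a few lines. This is not the poster's MPW Pascal tool; it is a minimal hypothetical illustration (the `{{NAV}}` macro name and the page names are invented) of how generating static HTML from content stubs also yields the intra-site navigation links for free:

```python
# Hypothetical content stubs: each holds content plus a {{NAV}} macro
# that the generator expands at build time.
stubs = {
    "about.html": "<h1>About</h1>\n{{NAV}}\n<p>Company info.</p>",
    "news.html":  "<h1>News</h1>\n{{NAV}}\n<p>Latest updates.</p>",
}

def build(stubs):
    # The navigation bar comes "for free" from the list of stub files,
    # so a layout revamp never touches the content itself.
    nav = " | ".join(f'<a href="{name}">{name}</a>' for name in sorted(stubs))
    return {name: body.replace("{{NAV}}", nav) for name, body in stubs.items()}

site = build(stubs)
print(site["about.html"])
```

The output pages are plain files, so as the poster notes, the web server has one less point of failure and the "database" is just the file system.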
I too have seen some nasty and slow Java pages, but I place the blame at the programmer's feet.
From: firstname.lastname@example.org (John Jensen);
Sent at 6/1/97; 6:17:16 AM;
Regarding Java Needs Marimba?
Java generates very dense bytecodes, but 1) loads classes separately (until JARs arrive), and 2) downloads data (gifs, etc.) as separate files. I think the slowest sites are pulling in more classes than they need, and downloading way too much data.
An early offender was the bouncing head program. The program was small, but it had to download a dozen gifs before it could run.
The best sites I've seen use algorithmic approaches, rather than the heavy, data-driven ones. The 3d maze I posted at
is (too) simple and lightweight. It weighs in at 3,712 bytes.
First let me say that I enjoy your "rants" even though I understand only a little of what you do. The issues in much of what you express are clear even to someone who has little education in your area of expertise.
From: email@example.com (Winsor Crosby);
Sent at 5/30/97; 5:44:17 PM;
Jesse Berst's Website
I just disconnected myself from AnchorDesk because it was so frustrating to get the email teaser, go to his website, and wait minutes for each page to load. For me, it really is much slower than a lot of graphics-laden sites. His database may make it easy to add content to his pages, but it seems to me to be poorly organized from the user's point of view. The pages are very dense, do not scroll except in slow little jerks, and to get to the "detailed article" you have to go through several of these pages, which can add up to 10 minutes of download time on a 28.8 modem, only to find an article that frequently has little content except a few references to other web sites. You may have been watching his site longer and may have seen some good content, but in the month or so I have looked at it, what good stuff is there is not worth the aggravation.
In Java Needs Marimba? you note that Java over the "net" was too slow. I agree. Long ago (in net terms), I ditched applets as a target market and began writing Java applications on my Mac. The trick then became finding appropriate applications for this language, ones that deemphasize its weaker aspects. The good thing is that IMO projects come together relatively quickly in Java vs. C/C++. That is where Java will see the benefits that sustain it as a development tool, IMHO.
From: firstname.lastname@example.org (Richard Freytag);
Sent at 5/30/97; 12:51:09 PM;
Re:Java Needs Marimba?
One of the big things in Java 1.1 is the introduction of JAR (Java archive) files.
From: email@example.com (Preston Holmes);
Sent at 5/29/97; 10:19:38 AM;
Re:Java Needs Marimba?
Much of what is slow about the current Java stuff is that the client has to establish a new HTTP connection for every .class file (sometimes hundreds). JAR files will provide a way of bundling all the .class files, gif images, sounds, etc. into one file that can be downloaded over a single connection. Then it should be no worse than downloading, say, a 500K-1000K movie clip (depending on content, obviously).
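The bundling idea can be sketched with Python's zipfile module standing in for a JAR (a JAR is itself a zip archive): dozens of small files collapse into one archive fetched over a single connection. The file names and sizes here are invented for illustration:

```python
import io
import zipfile

# A hypothetical applet: many small .class files plus some image data.
# 0xCAFEBABE is the real Java class-file magic number; the rest is filler.
files = {f"Sprite{i}.class": b"\xca\xfe\xba\xbe" + bytes(200) for i in range(50)}
files["head.gif"] = bytes(1024)

# Bundle everything into one JAR-like archive: one HTTP connection
# instead of one connection per file.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as jar:
    for name, data in files.items():
        jar.writestr(name, data)

archive = buf.getvalue()
print(len(files), "files ->", "1 download of", len(archive), "bytes")
```

The win is not the byte count (which may even grow slightly from archive headers) but eliminating the per-file connection setup, which dominates over a modem.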
Of course I still agree with you that Castanet is the way to go for larger Java projects, especially those that people might use repeatedly (at a news site, for example).