The immediate concern at Netscape was it must look like Java. People have done Algol-like syntaxes for Lisp but I didn't have time to take a Scheme core so I ended up doing it all directly and that meant I could make the same mistakes that others make.
Eich's interview rushes by like a whirlwind. He is clearly such an intense and active thinker, and has so much he wants to talk about, that there just isn't enough room in a few short pages to contain it all.
Luckily, there are a number of places on the web where you can find his recorded talks and writings, so you can learn more, and in more detail. For example, here is a recent talk he gave at Yahoo.
As I was reading through the chapter, I found myself pausing about every other page to chase references to bits of knowledge I hadn't been aware of:
- Hindley-Milner type inference
- The Curry-Howard correspondence
- Valgrind, Helgrind, Chronomancer, and Replay
It must be exhausting to share an office with Eich :)
And, as a person who loves to use a debugger while studying code, I was pleased to read that Eich shares my fondness for stepping through code in the debugger:
Seibel: Do you do that with your own code, even when you're not tracking down a bug?
Eich: Absolutely -- just sanity checks. I have plenty of assertions, so if those botch then I'll be in the debugger for sure. But sometimes you write code and you've got some clever bookkeeping scheme or other. And you test it and it seems to work until you step through it in the debugger. Particularly if there's a bit of cleverness that only kicks in when the stars and the moon align. Then you want to use a conditional break point or even a watch point, a data break point, and then you can actually catch it in the act and check that, yes, the planets are all aligned the way they should be and maybe test that you weren't living in optimistic pony land. You can actually look in the debugger, whereas in the source you're still in pony land. So that seems important; I still do it.
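The kind of "clever bookkeeping scheme" Eich describes is easy to sketch. Here is a minimal, hypothetical example (not from the interview): a ring buffer whose cleverness is tracking monotonically increasing read/write counters instead of wrapping indices. The assertions are the "sanity checks" that drop you into the debugger the moment the invariant breaks:

```python
# Hypothetical example: a ring buffer with "clever bookkeeping" --
# ever-increasing read/write counters instead of wrapped indices.
# The asserts are the invariant checks Eich describes; if one botches,
# you land in the debugger right at the scene of the crime.
class RingBuffer:
    def __init__(self, size):
        self.size = size
        self.data = [None] * size
        self.reads = 0   # total items ever popped
        self.writes = 0  # total items ever pushed

    def push(self, item):
        # Invariant: at most `size` unconsumed items.
        assert self.writes - self.reads < self.size, "buffer overflow"
        self.data[self.writes % self.size] = item
        self.writes += 1

    def pop(self):
        # Invariant: never read ahead of what was written.
        assert self.writes > self.reads, "buffer underflow"
        item = self.data[self.reads % self.size]
        self.reads += 1
        return item
```

To "catch it in the act" the way Eich describes, a debugger like gdb offers exactly the two tools he names: a conditional breakpoint (`break ring.c:42 if writes - reads == size`, with a hypothetical file and line) fires only when the stars align, and a watchpoint (`watch writes`), i.e. a data breakpoint, stops the moment the bookkeeping value changes.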
"Optimistic pony land" -- what a great expression! It captures perfectly that fantasy world that all programmers are living in when they first start writing some code, before they work slowly and thoroughly through the myriad of pesky details that are inherent in specifying actions to the detail that computers require.
Well, more thoughts will have to wait for later; I'm heading back to pony land :)