Norvig is the second of three Google employees that Seibel interviews in his book, I believe. Is 20% too high a share for a single company to have? Perhaps, but there's no disputing that Google has assembled a fantastic array of talent, and Norvig is reputedly one of the reasons why, so I'm pleased to see him included.
At this point in his career, Norvig is much more of a manager/executive than a coder, so his observations on coding are perhaps not as immediate as those of some of the others Seibel speaks to, but Norvig still has some very serious programming chops. Joel Spolsky and Jeff Atwood told a story on their podcast recently about their DevDays conference, which had the unusual format of being a short conference held multiple times, successively, in various locations around the world. At each location, one of the events they organized was a tutorial introduction to the programming language Python, taught by a different instructor chosen from those local to that conference site. Each time, Joel and Jeff suggested that the instructor could build the tutorial around an in-depth analysis of Norvig's spelling corrector. Apparently 10 of the 12 instructors took the suggestion, and each time, the technique of analyzing a single brilliantly-written piece of software was successful.
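For anyone who hasn't seen it, the heart of that program is astonishingly small. Here's a stripped-down sketch in the spirit of Norvig's corrector (not his exact code): count word frequencies in a corpus, generate every string one edit away from the misspelling, and pick the known candidate with the highest count. Norvig's real version also considers two-edit candidates and a cleaner probability model, and the corpus file name here is just the placeholder he happens to use in his essay.

```python
from collections import Counter
import re

# Word frequencies from a large text corpus; 'big.txt' is the corpus file
# from Norvig's essay -- substitute any large plain-text file you have.
WORDS = Counter(re.findall(r'[a-z]+', open('big.txt').read().lower()))

def edits1(word):
    """Every string one edit (delete, transpose, replace, insert) from word."""
    letters = 'abcdefghijklmnopqrstuvwxyz'
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [L + R[1:] for L, R in splits if R]
    transposes = [L + R[1] + R[0] + R[2:] for L, R in splits if len(R) > 1]
    replaces = [L + c + R[1:] for L, R in splits if R for c in letters]
    inserts = [L + c + R for L, R in splits for c in letters]
    return set(deletes + transposes + replaces + inserts)

def known(words):
    """The subset of candidate strings that actually appear in the corpus."""
    return {w for w in words if w in WORDS}

def correction(word):
    """Prefer the word itself if known, else the most frequent one-edit neighbor."""
    candidates = known([word]) or known(edits1(word)) or {word}
    return max(candidates, key=lambda w: WORDS[w])

print(correction('speling'))   # -> 'spelling', given a reasonable corpus
```

It's easy to see why it makes good tutorial material: list comprehensions, sets, dictionaries, and a probabilistic idea all show up in a couple dozen lines a beginner can actually read.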
One of the things that must be fascinating about being at Google is the scope and scale of the problems. Norvig says:
At Google I think we run up against all these types of problems. There's constantly a scaling problem. If you look at where we are today and say, we'll build something that can handle ten times more than that, in a couple years you'll have exceeded that and you have to throw it out and start all over again. But you want to at least make the right choice for the operating conditions that you've chosen -- it'll work for a billion up to ten billion web pages or something. So what does that mean in terms of how you distribute it over multiple machines? What kind of traffic are you going to have going back and forth?
Although the number of such Internet-scale applications is increasing, there still aren't many places where programmers work on problems of this size.
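I have no idea how Google actually partitions its index, but as an illustration of the kind of question Norvig is gesturing at ("how do you distribute it over multiple machines?"), here is a toy consistent-hashing sketch: pages hash onto a ring of shards, and adding or removing a shard only relocates a small fraction of the pages. All of the names here are invented for the example.

```python
import bisect
import hashlib

class ConsistentHashRing:
    """Toy consistent-hash ring mapping keys (e.g. page URLs) to shards."""

    def __init__(self, shards, replicas=100):
        self.replicas = replicas        # virtual nodes per shard, to even out load
        self.ring = []                  # sorted list of (hash, shard) pairs
        for shard in shards:
            self.add(shard)

    def _hash(self, key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def add(self, shard):
        for i in range(self.replicas):
            self.ring.append((self._hash(f"{shard}#{i}"), shard))
        self.ring.sort()

    def lookup(self, key):
        """The shard responsible for a key: first ring entry at or after its hash."""
        idx = bisect.bisect(self.ring, (self._hash(key),)) % len(self.ring)
        return self.ring[idx][1]

ring = ConsistentHashRing([f"shard-{n:02d}" for n in range(20)])
print(ring.lookup("http://example.com/some/page"))
```

The appeal of the scheme is exactly the point Norvig makes about operating conditions: when you outgrow twenty shards and add twenty more, only about half the pages move, instead of nearly all of them as with a naive hash-mod-N assignment.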
Another interesting section of the interview (for me) involves Norvig's thoughts about software testing:
But then he never got anywhere. He had five different blog posts and in each one he wrote a bit more and wrote lots of tests but he never got anything working because he didn't know how to solve the problem.
...
I see tests more as a way of correcting errors rather than as a way of design. This extreme approach of saying, "Well, the first thing you do is write a test that says I get the right answer at the end," and then you run it and see that it fails, and then you say, "What do I need next?" -- that doesn't seem like the right way to design something to me.
...
You look at these test suites and they have assertEqual and assertNotEqual and assertTrue and so on. And that's useful but we also want to have assertAsFastAsPossible and assert over this large database of possible queries we get results whose precision value is such and such...
...
They should write lots of tests. They should think about different conditions. And I think you want to have more complex regression tests as well as the unit tests. And think about failure modes.
During this long section of the interview, which actually spans about 5 pages, Norvig appears to be making two basic points:
- Testing is not design. Design is design.
- Many test suites are overly simple. Testing needs to start simple, but it needs to be pursued to the point where it is deep and powerful and sophisticated.
I think these are excellent points. I wonder what Norvig would think of Derby's test harness, which has support functions that are much more sophisticated than assertEquals and the like: assertFullResultSet, assertSameContents, assertParameterTypes, etc.
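As a thought experiment, here is roughly what Norvig's imaginary assertAsFastAsPossible and "precision over a large database of queries" assertions might look like if you bolted them onto Python's unittest. The helper names, the stub search function, and the tiny query set are all invented for the example; they are not from any real framework, Derby's included.

```python
import time
import unittest

class SearchQualityTest(unittest.TestCase):
    # A stand-in for the "large database of possible queries": query -> relevant docs.
    QUERIES = {
        "python spelling corrector": {"doc1", "doc4"},
        "consistent hashing": {"doc2"},
    }

    def assertPrecisionAtLeast(self, search, minimum):
        """Average precision over the query set must meet the threshold."""
        total = 0.0
        for query, relevant in self.QUERIES.items():
            results = set(search(query))
            total += len(results & relevant) / len(results) if results else 0.0
        self.assertGreaterEqual(total / len(self.QUERIES), minimum)

    def assertFasterThan(self, fn, seconds):
        """The call must finish within a wall-clock budget."""
        start = time.perf_counter()
        fn()
        self.assertLess(time.perf_counter() - start, seconds)

    def test_stub_search_meets_the_bar(self):
        # A stub search engine, just to show the assertions in use.
        stub = lambda q: ["doc1", "doc4"] if "spelling" in q else ["doc2"]
        self.assertPrecisionAtLeast(stub, 0.9)
        self.assertFasterThan(lambda: stub("python spelling corrector"), 0.1)

if __name__ == "__main__":
    unittest.main()
```

That, I think, is Norvig's point about depth: the interesting assertions are statements about whole distributions of inputs and about performance budgets, not just about one value equaling another.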
I'm quite pleased that Seibel raises the often-controversial question of Google's "puzzle" style of interviewing, and Norvig's answer is quite interesting:
I don't think it's important whether people can solve the puzzles or not. I don't like the trick puzzle questions. I think it's important to put them in a technical situation and not just chitchat and get a feeling if they're a nice guy.
...
It's more you want to get a feeling for how this person thinks and how they work together, so do they know the basic ideas? Can they say, "Well, in order to solve this, I need to know A, B, and C," and they start putting it together. And I think you can demonstrate that while still failing on a puzzle. You can say, "Well, here's how I attack this puzzle. Well, I first think about this. Then I do that. Then I do that, but geez, here's this part I don't quite understand."
...
And then you really want to have people write code on the board if you're interviewing them for a coding job. Because some people have forgotten or didn't quite know and you could see that pretty quickly.
So the puzzle question, in Norvig's view, is just a way to force the interviewer and the interviewee to talk concretely about an actual problem, rather than retreating into the abstract.
Over the years (decades), I've interviewed at many interesting software companies, including both Microsoft and Google, and I think this approach to the interviewing process is quite sensible. Although in my experience it was actually Microsoft, not Google, where I encountered the full-on trick-puzzle interview, I can see that, as a technique, it works very well. And, as the interviewee, I appreciate getting past the small talk and directly to the "code on a wall" portion of the interview.
I really enjoyed the Norvig interview.
Your reviews are awesome. I just read the book, too, and I loved the Norvig interview.
The spelling corrector, separately, is brilliant.