John LeRoy Hennessy (founder of MIPS Technologies)

June 22nd, 2009
Note: Answers to these questions were received in audio form; these can be found here - https://GameHacking.org/downloads/John_Hennessy_Q&A_MP3.7z


LB: You pioneered the modern RISC architecture, most pivotally when you founded MIPS Computer Systems (now MIPS Technologies) and introduced the MIPS R2000, the first commercial RISC microprocessor. What inspired you to do so?

JLH: Of course, the RISC architecture that MIPS developed had its roots in the Stanford MIPS research project. We were inspired to start that project because we had observed that instruction set architectures which had been designed and honed for mainframes and minicomputers might need to change due to the shift to microprocessor technology, with its different trade-offs. I had also, given my experience as a consultant working on the first VLSI VAX microprocessor, reached the opinion that architectures had become overly complicated without a significant increase in performance to pay for that complication. I think both those things influenced our belief that we should rethink instruction set architecture, and that a leaner, meaner instruction set architecture should be the direction we took. Now, after we completed the research project, like good academic researchers, we published our papers, and we expected that industry would see the significantly greater performance - 2, 3, 4 times what other microprocessors were capable of, and much faster than the VAX minicomputers of that time, at a fraction of the price. But the technology was largely ignored, and we were convinced, largely by Gordon Bell, who had been the head of engineering at Digital Equipment Corporation, that if we were not willing to start a company to develop the architecture and make a commercial version of our research project, the technology, the RISC approach, would die. Of course, we redesigned the entire microprocessor, and a new architecture, based on the same concepts but re-architected, became the MIPS R2000; as you said, the first commercial RISC microprocessor.
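
[To make the "leaner, meaner" load/store idea concrete, here is a toy sketch in Python - our illustration, not the actual MIPS R2000 instruction set: in a RISC design, only loads and stores touch memory, arithmetic works purely on registers, and a statement like a = b + c becomes a short sequence of simple, uniform instructions.]

    # A toy register machine in the RISC spirit: arithmetic only touches
    # registers; LW/SW are the sole instructions that access memory.
    # (Illustrative sketch only - the mnemonics echo MIPS, but this is
    # not the real R2000 instruction set or encoding.)
    memory = {"b": 2, "c": 3, "a": 0}    # named "addresses" for simplicity
    regs = [0] * 4                       # a tiny register file

    def run(program):
        for op, *args in program:
            if op == "LW":               # load word: reg <- memory[addr]
                rd, addr = args
                regs[rd] = memory[addr]
            elif op == "ADD":            # add: rd <- rs + rt (registers only)
                rd, rs, rt = args
                regs[rd] = regs[rs] + regs[rt]
            elif op == "SW":             # store word: memory[addr] <- reg
                rs, addr = args
                memory[addr] = regs[rs]

    # a = b + c compiles to four simple, fixed-format instructions
    run([("LW", 1, "b"), ("LW", 2, "c"), ("ADD", 3, 1, 2), ("SW", 3, "a")])
    print(memory["a"])  # -> 5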


LB: As founder and lead scientist of MIPS Computer Systems/MIPS Technologies, you were responsible for the processors of several of the gaming systems (N64, PSX, PS2, etc.) we've all come to love (and hack). We're guessing this isn't the first time you've been hailed as one of the fathers of modern gaming; did you expect the MIPS architecture to figure so heavily in the electronic entertainment industry?

JLH: The emergence of the MIPS architecture in gaming systems happened in an interesting way. As you probably know, Nintendo was initially looking for a major breakthrough in gaming, and they had developed special-purpose graphics processors to deal with that, but it became quite clear that in order to get the overall performance they needed, they needed a processor that was much more adept at moving data quickly. At that time we had just designed the 64-bit version of the MIPS architecture, and it was the only processor that could possibly deliver the capability...the bandwidth, really, to supply the data these new graphics processors needed to draw. So that was really the beginning of it, and of course, as you point out, it went on to the PSX and PS2. So in some ways it was accidental, although one thing that certainly drove it was that the MIPS architecture was one of the cleanest, fastest architectures around, and in the gaming industry, that speed, that performance capability, turned out to be much more important than compatibility with an old line of x86 architectures. So, I didn't expect the MIPS architecture to do that, but it certainly skyrocketed the architecture to much broader use, and really established MIPS as the standard in a whole new segment of the industry.


LB: As we're a part of the greater video game community, we have to ask: what is your favorite video game of all time?

JLH: My favorite video game...well, I have to admit that the first video game I ever saw was Pong, so I grew up on very early things like Pong, Donkey Kong, and Pac-Man; those were my first generation. Super Mario Bros. was a big step forward. I think one of the games that I look at as a real breakthrough was Doom. It was a game that not only really brought first-person shooters into their own, putting you in the perspective of the player, but used lots of new, interesting graphics techniques to make a major breakthrough. Normally, when I get to play video games, which isn't very often, I'm more of a strategy type of person, but I did admire the real breakthroughs that those early games, and later ones like Doom, represented.


LB: Between your current presidency of Stanford University, your professorship of computer science and electrical engineering, your chairmanships and board memberships at various industry-shaking companies (Google, Cisco, Atheros, etc.), and day-to-day living, you're definitely a busy guy. That said, do you still find time to work on new ideas and technologies? If so, assuming you're free to talk about it, what's currently in the works, and, perhaps of even more interest, what's still on the drawing board?

JLH: You're absolutely right; I'm very busy, and most of the things I think about these days are solving various problems that exist, either at the university primarily, or on the boards of companies I sit on. I don't have as much creative time (I used to call it "free thinking time", the kind that occurs when you're taking a shower in the morning) because I have a lot on my mind. Much of what I think about now is, "Where are the opportunities in terms of emerging technologies?", and, because of my responsibilities at the university, that's broad...well, I think about things like "What are the opportunities in stem cell technology?" and "What are the opportunities in bio-engineering?" - one of the fields I'm most excited about is bringing high-quality engineering talent to biological applications, particularly through nanotechnology and integrated circuit technology that can be applied in the bio sector. New battery technologies, new solutions for clean energy...and, of course, computer science, where I think there are great opportunities. I think we've seen some real breakthroughs recently. Stanley, the car developed by my colleagues in computer science, which won the DARPA Grand Challenge and demonstrated that we could make an autonomous vehicle that could navigate on its own, was a real breakthrough, and I think we'll continue to see breakthroughs of that form. Nowadays, the challenge is not so much, "Can we deliver the performance and the memory necessary?"; it's creating innovative uses of that technology that really make a big step forward for users.


LB: Within the realm of technology, is there anything in particular that someone else has accomplished recently, which you find especially impressive?

JLH: One of the things I find most impressive is some work that one of my colleagues in materials science, Yi Cui, is doing: a new nanotechnology-based battery structure that uses...rather than lithium with carbon, lithium with silicon. It could represent the biggest breakthrough in battery technology in twenty years. Now, of course, it's still at the research stage, but if it really pans out, it could revolutionize the amount of capacity we have in batteries, making your iPhone and various portable devices last days or weeks rather than hours, and potentially making electric cars a highly viable technology [here's a website describing the technology: http://news.stanford.edu/news/2008/january9/nanowire-010908.html]. I think we are also just on the cusp of really liberating machine learning to do really interesting things. We've seen some important applications in spam detection and in Stanley...in autonomous vehicles. I think we're going to see more of that, with the computer intuiting what people mean when they do a search, when they ask for a summary of information, all those kinds of things, and I think those applications are going to be interesting. We're also in the infancy of wireless technology; as wireless becomes more and more pervasive, we'll see interesting new applications where you can get whatever information you need, any time, any place, right away, and I think that will change the way we think about using the Web and lots of online information.


LB: What do you like to do with your free time?

JLH: Well, I don't have that much these days. I do a lot of reading; I'm usually working on two books at once - one fiction, one non-fiction. I usually read the non-fiction book in conventional form - right now I'm working on a biography of Darwin, in preparation for a trip to the Galapagos - and for fiction, I read whatever I can get on my Kindle, because I read it while I'm exercising; right now it happens to be a crime/mystery/historical fiction novel about Louis XVI's son. So that's what I do with most of my free time. I also started playing golf a few years ago, and I've found it's a great game for getting me outside, enjoying some natural surroundings, and forcing me to concentrate on something other than my full plate at the university.


LB: Who did you look up to the most as you gained interest in, learned, mastered, and improved upon the world of digital technology? Who directly influenced you more than anyone else? [doesn't have to be the same person for both]

JLH: Well, certainly, over the years I've had different mentors who influenced me in different ways. My early interest came from my father, who was an engineer and got me interested in digital technology, and then from a good friend with whom I built a high school science project: a tic-tac-toe machine...built, believe it or not, out of surplus relays that we could get. Of course, as many people recognize, tic-tac-toe is a game you can never lose if you go first and play correctly. But, boy, it sure impressed a lot of my high school classmates to have a machine that could play tic-tac-toe. Later, in college, I had a young faculty member who stirred my interest in computer science. At that time, there were no computer science majors, so I majored in electrical engineering, but he gave me the opportunity to work in computer science and in his research laboratory, which was a really wonderful opportunity. That continued through graduate school, where I was lucky to get a research project early on that brought together real-time systems with the early microprocessor technology of the day. And then, when I came to Stanford, I worked with a number of faculty, including Forest Baskett, who really ran much of the research program under which MIPS was done and, before that, the Geometry Engine, which became Silicon Graphics. He inspired me to work in experimental computer science, and showed me how important it was; that, of course, is really where I made my career.
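
[Hennessy's tic-tac-toe claim can be checked mechanically. The sketch below - our illustration in Python, nothing like the hard-wired relay machine - runs minimax over the full game tree and confirms that perfect play never loses: the game's value is a draw, so the player who goes first can always force at least a tie.]

    # Minimax over the complete tic-tac-toe game tree.
    LINES = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]

    def winner(board):
        for a, b, c in LINES:
            if board[a] and board[a] == board[b] == board[c]:
                return board[a]
        return None

    def value(board, player):
        """Game value for X with perfect play: +1 X wins, 0 draw, -1 O wins."""
        w = winner(board)
        if w:
            return 1 if w == "X" else -1
        moves = [i for i, cell in enumerate(board) if not cell]
        if not moves:
            return 0                     # full board, no winner: a draw
        results = []
        for m in moves:
            board[m] = player
            results.append(value(board, "O" if player == "X" else "X"))
            board[m] = None
        return max(results) if player == "X" else min(results)

    print(value([None] * 9, "X"))  # -> 0: with perfect play, the first
                                   #       player can never lose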


LB: What has been the most exciting moment in your career to-date?

JLH: Well, that's hard to say. I think getting back the first MIPS research processor at Stanford and seeing it work was a wonderful moment. So was seeing the first R2000 processor ship, and the first boards, which were built for Prime Computer, and the first actual complete system we built. Those were truly extraordinary moments, and each of the major projects I've worked on has had a wonderful moment in which we demonstrated that some concept we had really worked, and worked in practice, at least in prototype form. Those were always terrific moments.


LB: What do you think will be our greatest technological challenge in near-future computing, and what must be done in order for us to properly weather this storm?

JLH: I'm going to say there are two challenges. One is in the applications domain, and it comes down to compelling applications, what are sometimes called 'killer apps': applications that are easy for people to use and really do serve a key function. I think with the invention of the iPhone and various other things, we're seeing more and more of these, because we've opened up a whole new level of creativity. In fact, there's an interesting parallel in what the PC earlier, and the iPhone and various game consoles since, have done: they've liberated lots more people to contribute. And that's in exact parallel to what happened with the VLSI revolution in the 1980s, where the availability of integrated circuit technology to a broader design community - not just to the people who manufactured the integrated circuits, the Intels, Zilogs, AMDs, and Nationals of the world - liberated a whole bunch of new designs. Out of that came, of course, both Silicon Graphics and MIPS, among others. So I think there's a big opportunity around applications: creative new applications that people really find compelling, that make their lives better, and that really make important contributions. We also have some important technical challenges. The most important is figuring out how to use parallelism and multi-core effectively; really effectively. We don't know how to do it yet. We don't know how to write software so that programs easily port between configurations of processors - between, let's say, a two-core and a four-core processor. We have a lot of work to do; we're way behind on software, and developing that software, and the frameworks for writing that code, is going to be a major challenge.
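
[As a concrete picture of the portability problem Hennessy describes, here is a minimal sketch - our example in Python, under the assumption that the work is CPU-bound and easily divided: the program states the work once, discovers the core count at run time, and lets a process pool spread the chunks across however many cores the machine has, so the same code runs unchanged on a two-core or a four-core processor.]

    import os
    from concurrent.futures import ProcessPoolExecutor

    def count_primes(lo, hi):
        """Count primes in [lo, hi) - a stand-in for any CPU-bound chunk."""
        count = 0
        for n in range(max(lo, 2), hi):
            if all(n % d for d in range(2, int(n ** 0.5) + 1)):
                count += 1
        return count

    if __name__ == "__main__":
        cores = os.cpu_count() or 2            # discovered at run time
        limit = 200_000
        step = limit // (cores * 4)            # more chunks than cores
        chunks = [(lo, min(lo + step, limit)) for lo in range(0, limit, step)]
        with ProcessPoolExecutor(max_workers=cores) as pool:
            total = sum(pool.map(count_primes, *zip(*chunks)))
        print(f"{total} primes below {limit} using {cores} cores")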


LB: One last question: if you had one thing to say to current, aspiring, and future hackers and innovators, what would it be?

JLH: Well, I think the thing that has made computing so exciting for me is that it's a discipline that's constantly changing, constantly providing new opportunities. The fact that hardware marches on, creating new capabilities and new opportunities, means that there are constantly new things to do. It's the next game, the next application, the next capability that you liberate that makes it so exciting. And I think one of the interesting things about computing is that, exciting as the core technology is - driving the hardware faster, figuring out how to get more done - it's the applications people build, whether it's a video game, a great new capability on my iPhone, or new capabilities on my desktop machine, that matter most. That, I think, is what's going to keep this discipline exciting, and I think there will be lots of new opportunities to go to the next step, to create an even more realistic environment, and to really show people what this technology can do.