Over the past decade, Luiz has played a leading role in the definition and implementation of Google’s cluster architecture, which has become a blueprint for the computing systems behind the world’s leading Internet services. As the first manager of Google’s Platforms Engineering team, he helped deliver multiple generations of cluster systems, including the world’s first container-based data center. His theoretical and engineering insights into the requirements of this class of machinery have influenced the processor industry roadmap toward more effective products for server-class computing. His book "The Datacenter as a Computer" (co-authored with Urs Hoelzle) was the first authoritative publication describing these so-called warehouse-scale computers for computer systems professionals and researchers. Luiz was among the first computer scientists to recognize and articulate the importance of energy-related costs for large data centers, and to identify energy proportionality as a key property of energy-efficient data centers. Prior to Google, at Digital Equipment Corporation's Western Research Laboratory, he worked on Piranha, a pioneering chip-multiprocessing architecture that inspired today’s popular multi-core products. As one of Piranha’s lead architects and designers, he produced papers, ideas, and numerous presentations that stimulated much of the research leading to those products years later.
In the last four years at Google, Dick led the team developing new camera systems and improved photographic image processing for Street View, while leading another team developing technologies for machine hearing and their application to sound retrieval and ranking. He is now writing a book with Cambridge University Press, and will teach a Stanford course this fall on "Human and Machine Hearing," returning to a line of work that he carried out at Xerox, Schlumberger, and Apple while also working on the optical mouse, bit-serial VLSI computing machines, and handwriting recognition. The optical mouse (1980) is especially called out in the citation because it exemplifies the field of "semi-digital" techniques that he developed, which also led to his work on the first single-chip Ethernet device. More recently, as chief scientist at Foveon, Dick invented and developed several new techniques for color image sensing and processing, and delivered acclaimed cameras and end-user software. A hallmark of Dick’s work throughout his distinguished career has been the close interplay between theory, including biological theory, and practical computing.
Muthu has made significant contributions to the theory and practice of Internet ad systems during his more than four years at Google. His breakthrough WWW’09 paper presented a general stable matching framework that, contrary to prevailing wisdom, yields a (desirable) truthful mechanism capturing all of the common variations and more. In display ads, where image, video, and other types of ads are shown as users browse, Muthu led Google’s Ad Exchange effort to automate the placement of display ads that had previously been negotiated offline by sales teams. Prior to Google, Muthu was well known for his pioneering work in the area of data stream algorithmics (including a definitive book on the subject), which led to theoretical and practical advances still in use today to monitor the health and smooth operation of the Internet. Muthu has a talent for bringing new perspectives to longstanding open problems, as exemplified by his work on string processing. He has made influential contributions to many other areas and problems, including IP networks, data compression, scheduling, computational biology, distributed algorithms, and database technology. As an educator, Muthu’s avant-garde teaching style won him the Award for Excellence in Graduate Teaching at Rutgers CS, where he is on the faculty. As a student remarked in his blog: "there is a magic in his class which kinda spellbinds you and it doesn't feel like a class. It’s more like a family sitting down for dinner to discuss some real world problems. It was always like that even when we were 40 people jammed in for cs-513."
For the past three years, Fernando has been leading some of Google’s most advanced natural language understanding efforts and some of the most important applications of machine learning technology. He has just the right mix of forward-thinking ideas and the ability to put ideas into practice. With this balance, Fernando has helped his team of research scientists apply their ideas at the scale needed for Google. From the time he wrote the first Prolog compiler (for the PDP-10, with David Warren) to his days as Chair at the University of Pennsylvania, Fernando has demonstrated a unique understanding of the challenges and opportunities facing companies like Google, with their unprecedented access to massive data sets and the application of those data sets to speech recognition, natural language processing, and machine translation. At SRI, he pioneered probabilistic language models at a time when logic-based models were more popular. At AT&T, his work on a toolkit for finite-state models became an industry standard, both as a useful piece of software and in setting the direction for building ever larger language models. And his year at WhizBang influenced other leaders of the field, such as Andrew McCallum at the University of Massachusetts and John Lafferty and Tom Mitchell at Carnegie Mellon University, with whom Fernando developed the Conditional Random Field model for sequence processing, which has become one of the leading tools of the trade.