
Systems Seminar

Title: A Decade of Machine Learning Accelerators: Lessons Learned and Carbon Footprint
Speaker: David Patterson
Date: Tuesday, May 17, 2022
Time: 4:00pm - 5:00pm
Event link: https://stanford.zoom.us/j/97905322347?pwd=Z3B5Q1N5dW9BNTk1YS90SDZIUzN6Zz09


Abstract:

The success of deep neural networks (DNNs) from Machine Learning (ML) has inspired domain-specific architectures (DSAs) for them. ML has two phases: training, which constructs accurate models, and inference, which serves those models. Google's first-generation DSA offered a 50x improvement over conventional architectures for inference in 2015. Google next built the first production DSA supercomputer for the much harder problem of training. Subsequent generations greatly improved the performance of both phases. We start with ten lessons learned, such as: DNNs grow rapidly; workloads evolve quickly with DNN advances; the bottleneck is memory, not floating-point units; and semiconductor technology advances unequally. The rapid growth of DNNs rightfully raised concerns about their carbon footprint. The second part of the talk identifies the "4Ms" (Model, Machine, Mechanization, Map) that, if optimized, can reduce ML training energy by up to 100x and carbon emissions by up to 1000x. By improving the 4Ms, ML held steady at <15% of Google's total energy use even though it consumed ~75% of Google's floating-point operations. Climate change is one of our most important problems, so ML papers should report emissions explicitly to foster competition on more than just model quality. External estimates have been off by 100x–100,000x, so publishing emissions also ensures accurate accounting, which helps pinpoint the biggest challenges. With a continuing focus on the 4Ms, we can realize the amazing potential of ML to positively impact many fields in a sustainable way.
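
To make the multiplicative nature of the 4Ms concrete, here is a minimal back-of-the-envelope sketch in Python. The per-factor numbers are illustrative assumptions, not figures from the talk; the point is that the savings from the first three Ms multiply into the overall energy reduction, and a cleaner grid (the Map factor) multiplies the emissions reduction further.

    # Back-of-the-envelope sketch of how the 4Ms compound.
    # All numbers are illustrative assumptions, not figures from the talk.

    # Energy-side improvements multiply:
    model = 10.0         # Model: e.g., a sparser, more efficient DNN
    machine = 5.0        # Machine: e.g., an ML DSA (TPU) vs. general-purpose hardware
    mechanization = 1.4  # Mechanization: e.g., a cloud datacenter with better PUE
    energy_reduction = model * machine * mechanization   # ~70x with these numbers

    # Map cuts carbon per unit of energy by siting work on a cleaner grid:
    map_grid = 10.0      # Map: e.g., 10x lower gCO2e/kWh in a low-carbon region
    emissions_reduction = energy_reduction * map_grid    # ~700x with these numbers

    print(f"Energy reduction:    ~{energy_reduction:.0f}x")
    print(f"Emissions reduction: ~{emissions_reduction:.0f}x")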



Bio:

David Patterson is the Pardee Professor of Computer Science, Emeritus, at the University of California, Berkeley, which he joined after graduating from UCLA in 1976.

Dave's research style is to identify critical questions for the IT industry and gather interdisciplinary groups of faculty and graduate students to answer them. The answer is typically embodied in demonstration systems, and these demonstration systems are later mirrored in commercial products. In addition to their research impact, these projects train leaders of our field. The best-known projects were Reduced Instruction Set Computers (RISC), Redundant Array of Inexpensive Disks (RAID), and Networks of Workstations (NOW), each of which helped lead to billion-dollar industries.

A measure of the success of these projects is the list of awards won by Patterson and his teammates: the ACM A.M. Turing Award, the C & C Prize, the IEEE von Neumann Medal, the IEEE Johnson Information Storage Award, the SIGMOD Test of Time Award, the ACM-IEEE Eckert-Mauchly Award, and the Katayanagi Prize. He was also elected to both AAAS organizations, the National Academy of Engineering, the National Academy of Sciences, and the Silicon Valley Engineering Hall of Fame, and named a Fellow of the Computer History Museum. The full list includes about 40 awards for research, teaching, and service.

In his spare time he coauthored seven books, including two with John Hennessy, the past President of Stanford University with whom he shared the Turing Award. Patterson also served as Chair of the Computer Science Division at UC Berkeley, Chair of the Computing Research Association, and President of ACM. He is currently Vice-Chair of the Board of Directors of the RISC-V Foundation.
