It is no coincidence that the rise of AI as a valuable tool for science has come at an interesting time for computing: the end of Moore's Law means that energy constraints are increasingly driving hardware innovation, and AI is playing a growing role in shaping computing architecture, even at the largest scales. The US Department of Energy (DOE) recently announced the USA's first exascale supercomputers, arriving in 2022 at a combined cost of $1.1B. These machines are based on GPU architectures and have been designed specifically for large-scale AI science applications, which will be developed in part by the DOE ExaLearn program, a new co-design center for exascale machine learning technologies. In addition, many other specialized, energy-efficient architectures designed for AI workloads have emerged in the past few years, including FPGAs, custom ASICs such as Google's TPU and Graphcore's IPU, and neuromorphic machines. In this talk I will explore the interplay between energy-efficient hardware and the rise of AI as a serious factor in scientific computing, examining the challenges and opportunities for AI in the exascale era, with particular attention to applications in cosmology.
Speaker: Debbie Bard, UC Berkeley
See weblink for connection information