University of Minnesota's AI Breakthrough Slashes Energy Use

Minneapolis, Friday, 2 August 2024.
Researchers at the University of Minnesota Twin Cities have developed a groundbreaking device that could cut the energy consumption of AI computing by a factor of up to 1,000. The innovation, computational random-access memory (CRAM), processes data within the memory array itself, eliminating power-intensive data transfers. With AI energy use projected to double by 2026, the technology offers a promising path toward sustainable AI computing.

Revolutionizing AI with CRAM

The University of Minnesota’s Twin Cities campus has made a significant leap in the semiconductor industry with the development of Computational Random-Access Memory (CRAM). This new technology is poised to revolutionize AI computing by drastically reducing energy consumption. At its core, CRAM processes data entirely within the memory array itself, circumventing the need for the energy-intensive data transfers typical of traditional AI operations.

How CRAM Works

CRAM operates by performing computations directly within memory cells, leveraging the unique properties of Magnetic Tunnel Junctions (MTJs). MTJs use the spin of electrons rather than their charge to store data, providing an energy-efficient alternative to conventional transistor-based memory. This integration of logic and memory breaks away from the traditional von Neumann architecture, where data must constantly move between separate logic and memory units, consuming significant power.
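
To see why eliminating data movement matters, consider a rough energy model of the two approaches. The sketch below is a toy calculation: the per-operation energy figures (the picojoule constants) are illustrative assumptions, not measurements from the Minnesota team, but they show how transfer costs dominate when logic and memory are separate units.

```python
# Toy energy model contrasting von Neumann-style computing with
# in-memory computing. All energy figures are illustrative assumptions,
# not measured values from the University of Minnesota CRAM work.

# Assumed per-operation energies in picojoules (pJ):
E_DRAM_TRANSFER = 100.0   # moving one operand between memory and logic
E_LOGIC_OP      = 1.0     # one arithmetic/logic operation in the ALU
E_IN_MEMORY_OP  = 0.1     # one logic operation done inside the memory array

def von_neumann_energy(num_ops: int) -> float:
    """Each op fetches two operands, computes, and writes one result back."""
    transfers_per_op = 3  # two reads + one write
    return num_ops * (transfers_per_op * E_DRAM_TRANSFER + E_LOGIC_OP)

def in_memory_energy(num_ops: int) -> float:
    """Each op executes where the data already lives; no bus traffic."""
    return num_ops * E_IN_MEMORY_OP

ops = 1_000_000  # a million elementary operations
vn = von_neumann_energy(ops)
im = in_memory_energy(ops)
print(f"von Neumann : {vn / 1e6:.1f} microjoules")
print(f"in-memory   : {im / 1e6:.1f} microjoules")
print(f"ratio       : {vn / im:,.0f}x")
```

Under these assumed numbers, the transfer term alone is hundreds of times larger than the compute term, which is why removing it, rather than speeding up the logic, yields the dramatic savings.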

The Benefits of CRAM

The primary advantage of CRAM is its potential to cut energy consumption by a factor of 1,000. In practical terms, that could shrink the energy use of AI applications from 1,000 TWh to as little as 1 TWh. Given that the International Energy Agency forecasts AI energy consumption will double to 1,000 TWh by 2026, roughly the total electricity consumption of Japan, CRAM offers a critical solution to this looming energy crunch. CRAM's efficiency also translates into faster processing and lower operational costs, making AI applications more sustainable and economically viable.
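
The headline arithmetic is straightforward: dividing the projected demand by the best-case reduction factor yields the 1 TWh figure. The snippet below simply restates that calculation using the numbers cited above.

```python
# Back-of-the-envelope check of the headline figures cited above.
# The factor of 1,000 is the reported best case, not a guaranteed gain.
projected_ai_demand_twh = 1_000  # IEA projection for 2026, in TWh
cram_reduction_factor = 1_000    # best-case efficiency improvement

best_case_twh = projected_ai_demand_twh / cram_reduction_factor
print(f"AI energy demand with CRAM (best case): {best_case_twh:.0f} TWh")
# Output: AI energy demand with CRAM (best case): 1 TWh
```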

The Innovators Behind CRAM

This breakthrough is attributed to the collaborative efforts of a multidisciplinary team at the University of Minnesota’s College of Science and Engineering. Key figures include Yang Lv, a postdoctoral researcher, and Jian-Ping Wang, a Distinguished McKnight Professor and Robert F. Hartmann Chair in the Department of Electrical and Computer Engineering. Their work, spanning over two decades, has culminated in multiple patents and the successful experimental demonstration of CRAM.

Industry Collaboration and Future Prospects

The University of Minnesota team is now working with industry leaders to scale up CRAM technology. This collaboration aims to produce hardware that can be integrated into existing AI systems, thereby extending the benefits of CRAM to a wider array of applications. The team’s research has been supported by grants from the U.S. Defense Advanced Research Projects Agency (DARPA), the National Institute of Standards and Technology (NIST), the National Science Foundation (NSF), and Cisco Inc., ensuring a robust foundation for future advancements.

A Sustainable Future for AI

As the demand for AI continues to grow, so does the need for sustainable computing solutions. CRAM represents a significant step towards meeting these demands, offering a path to more energy-efficient and cost-effective AI technologies. By integrating computation and memory, CRAM not only reduces energy consumption but also enhances the performance and flexibility of AI applications, paving the way for a more sustainable future in the field of artificial intelligence.

Sources

www.nature.com
www.innovationnewsnetwork.com
cse.umn.edu