Machine learning could speed the arrival of ultra-fast-charging electric cars

Using artificial intelligence, a Stanford-led research team has slashed battery testing times – a key barrier to longer-lasting, faster-charging batteries for electric vehicles – by nearly fifteenfold.

Battery performance can make or break the electric vehicle experience, from driving range to charging time to the lifetime of the car. Now, artificial intelligence has made recharging an EV in the time it takes to stop at a gas station a more realistic prospect, and it could help improve other aspects of battery technology as well.

For decades, advances in electric vehicle batteries have been limited by a major bottleneck: evaluation times. At every stage of the battery development process, new technologies must be tested for months or even years to determine how long they will last. But now, a team led by Stanford professors Stefano Ermon and William Chueh has developed a machine learning-based method that slashes these testing times by 98 percent. Although the group tested their method on battery charge speed, they said it can be applied to numerous other parts of the battery development pipeline and even to non-energy technologies.

“In battery testing, you have to try a massive number of things, because the performance you get will vary drastically,” said Ermon, an assistant professor of computer science. “With AI, we’re able to quickly identify the most promising approaches and cut out a lot of unnecessary experiments.”

The study, published by Nature on Feb. 19, was part of a larger collaboration among scientists from Stanford, MIT and the Toyota Research Institute that bridges foundational academic research and real-world industry applications. The goal: finding the best method for charging an EV battery in 10 minutes that maximizes the battery’s overall lifetime. The researchers wrote a program that, based on only a few charging cycles, predicted how batteries would respond to different charging approaches. The software also decided in real time what charging approaches to focus on or ignore. By reducing both the length and number of trials, the researchers cut the testing process from almost two years to 16 days.

“We figured out how to greatly accelerate the testing process for extreme fast charging,” said Peter Attia, who co-led the study while he was a graduate student. “What’s really exciting, though, is the method. We can apply this approach to many other problems that, right now, are holding back battery development for months or years.”

A smarter approach to battery testing

Designing ultra-fast-charging batteries is a major challenge, mainly because it is difficult to make them last. The intensity of the faster charge puts greater strain on the battery, which often causes it to fail early. To prevent this damage to the battery pack, a component that accounts for a large chunk of an electric car’s total cost, battery engineers must test an exhaustive series of charging methods to find the ones that work best.

The new research sought to optimize this process. At the outset, the team saw that fast-charging optimization amounted to many trial-and-error tests – something that is inefficient for humans, but the perfect problem for a machine.

The research team included, from left, Stanford Professor William Chueh, Toyota Research Institute scientist Muratahan Aykol, Stanford PhD student Aditya Grover, Stanford PhD alumnus Peter Attia, Stanford Professor Stefano Ermon and TRI scientist Patrick Herring. (Image credit: Farrin Abbott)

“Machine learning is trial-and-error, but in a smarter way,” said Aditya Grover, a graduate student in computer science who also co-led the study. “Computers are far better than us at figuring out when to explore – try new and different approaches – and when to exploit, or zero in, on the most promising ones.”

The team used this power to their advantage in two key ways. First, they used it to reduce the time per cycling experiment. In a previous study, the researchers found that instead of charging and discharging every battery until it failed – the usual way of testing a battery’s lifetime – they could predict how long a battery would last after only its first 100 charging cycles. This is because the machine learning system, after being trained on a few batteries cycled to failure, could find patterns in the early data that presaged how long a battery would last.
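To make that first idea concrete, here is a minimal sketch in Python of lifetime prediction from early cycles, using synthetic capacity data and an off-the-shelf ridge regression. The data, the feature choices and the model are assumptions made for illustration only; they are not the study’s actual features or code.

```python
# Minimal sketch of early-cycle lifetime prediction: fit a model on features
# computed from only the first 100 cycles of each training cell, then predict
# total cycle life for a new cell. All data here are synthetic; the published
# work used engineered electrochemical features and regularized regression.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)

def synthetic_capacity_curve(cycle_life, n_cycles=100):
    """Fake per-cycle discharge capacity for the first n_cycles of one cell."""
    cycles = np.arange(1, n_cycles + 1)
    fade = 1.0 - 0.2 * (cycles / cycle_life) ** 2   # slow early capacity fade
    return fade + rng.normal(0.0, 0.002, size=n_cycles)

def early_features(capacity):
    """Simple summary features of the early-cycle capacity data (illustrative)."""
    return np.array([
        capacity[-1] - capacity[0],                               # total early fade
        np.var(np.diff(capacity)),                                # cycle-to-cycle noise
        np.polyfit(np.arange(len(capacity)), capacity, 1)[0],     # fade slope
    ])

# Training cells cycled all the way to failure, so their true cycle life is known.
train_lives = rng.integers(400, 1600, size=60).astype(float)
X = np.stack([early_features(synthetic_capacity_curve(L)) for L in train_lives])
model = Ridge(alpha=1.0).fit(X, train_lives)

# A new cell: predict its lifetime from the first 100 cycles only.
new_life = 1100.0
x_new = early_features(synthetic_capacity_curve(new_life))
print("predicted cycle life:", round(float(model.predict(x_new[None])[0])))
```

The key point is that the model is trained only on cells whose full lifetimes are already known, and is then applied to new cells after just their earliest cycles, so each new test can stop long before failure.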

Second, machine learning reduced the number of methods they had to test. Instead of testing every possible charging method equally, or relying on intuition, the computer learned from its experiences to quickly find the best protocols to test.
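The explore-exploit balance Grover describes can be pictured with a simple bandit-style rule. The sketch below uses an upper-confidence-bound heuristic over a handful of made-up charging protocols; the lifetimes, noise level and selection rule are illustrative assumptions, not the optimization algorithm used in the study.

```python
# Minimal upper-confidence-bound (UCB) sketch of the explore/exploit trade-off:
# test protocols we are still uncertain about, but concentrate on the ones that
# look best so far. Protocols and lifetimes are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mean cycle lives (in cycles) for five candidate charging protocols.
true_mean_life = np.array([720.0, 840.0, 905.0, 860.0, 780.0])
noise_std = 60.0          # cell-to-cell variability in observed lifetime
n_rounds = 40             # total testing budget

counts = np.zeros(len(true_mean_life))     # times each protocol has been tested
estimates = np.zeros(len(true_mean_life))  # running mean of observed lifetimes

for t in range(1, n_rounds + 1):
    if t <= len(true_mean_life):
        arm = t - 1  # test every protocol once before trusting the estimates
    else:
        # Uncertainty bonus shrinks as a protocol accumulates tests.
        ucb = estimates + 200.0 * np.sqrt(np.log(t) / counts)
        arm = int(np.argmax(ucb))

    observed_life = rng.normal(true_mean_life[arm], noise_std)  # "run" one test
    counts[arm] += 1
    estimates[arm] += (observed_life - estimates[arm]) / counts[arm]

print("tests per protocol:", counts.astype(int))
print("best protocol so far:", int(np.argmax(estimates)))
```

Protocols that look unpromising after a few noisy tests stop being selected, so most of the testing budget flows to the strongest candidates instead of being spread evenly across every option.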

By testing fewer methods for fewer cycles, the study’s authors quickly found an optimal ultra-fast-charging protocol for their battery. In addition to dramatically speeding up the testing process, the computer’s solution was also better – and much more unusual – than what a battery scientist would likely have devised, said Ermon.

“It gave us this surprisingly simple charging protocol – something we didn’t expect,” Ermon said. “That’s the difference between a human and a machine: The machine is not biased by human intuition, which is powerful but sometimes misleading.”

Wider applications

The researchers said their approach could accelerate nearly every piece of the battery development pipeline: from designing the chemistry of a battery to determining its size and shape, to finding better systems for manufacturing and storage. This would have broad implications not only for electric vehicles but for other types of energy storage, a key requirement for making the switch to wind and solar power on a global scale.

“This is a new way of doing battery development,” said Patrick Herring, co-author of the study and a scientist at the Toyota Research Institute. “Having data that you can share among a large number of people in academia and industry, and that is automatically analyzed, enables much faster innovation.”

The study’s machine learning and data collection system will be made available for future battery scientists to freely use, Herring added. By using this system to optimize other parts of the process with machine learning, battery development – and the arrival of newer, better technologies – could accelerate by an order of magnitude or more, he said.

The potential of the study’s method extends even beyond the world of batteries, Ermon said. Other big data testing problems, from drug development to optimizing the performance of X-rays and lasers, could also be revolutionized by the use of machine learning optimization. And ultimately, he said, it could even help to optimize one of the most fundamental processes of all.

“The bigger hope is to help the process of scientific discovery itself,” Ermon said. “We’re asking: Can we design these methods to come up with hypotheses automatically? Can they help us extract knowledge that humans could not? As we get better and better algorithms, we hope the whole scientific discovery process may drastically speed up.”

Additional Stanford co-authors include Norman Jin, Yang-Hung Liao, Michael H. Chen, Bryan Cheong, Nicholas Perkins, Zi Yang, Stephen Harris and Todor M. Markov. Additional co-authors are from MIT and the Toyota Research Institute.

This work was supported by Stanford, the Toyota Research Institute, the National Science Foundation, the U.S. Department of Energy and Microsoft.

Media Contacts

Mark Golden
Stanford Precourt Institute for Energy
(650) 724-1629, mark.golden@stanford.edu

Stephen Hughes
Toyota Research Institute
(650) 422-8947, stephen.hughes@tri.global
