Artificial intelligence’s rapid growth has led to advancements like autonomous vehicles, virtual reality, and ChatGPT. But AI technologies, and the training of AI models in particular, require enormous amounts of energy, raising concerns about AI’s environmental impact and long-term sustainability.
To put AI’s energy usage into perspective, it took nine days to train MegatronLM, an early large language model developed by NVIDIA. According to TechTarget, those nine days of training consumed 27,648 kilowatt-hours of energy, roughly the amount three U.S. homes use over the course of an entire year.
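The comparison above can be sanity-checked with a quick calculation. The average annual household figure used below (about 10,500 kWh, an approximate U.S. Energy Information Administration estimate) is an assumption, not a number from the article:

```python
# Rough sanity check on the "three U.S. homes per year" comparison.
TRAINING_KWH = 27_648           # energy reported for training MegatronLM
HOUSEHOLD_KWH_PER_YEAR = 10_500 # assumed average annual U.S. household use (approx. EIA figure)

homes_per_year = TRAINING_KWH / HOUSEHOLD_KWH_PER_YEAR
print(round(homes_per_year, 1))  # about 2.6 -- i.e., roughly three homes
```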
In an effort to make AI more sustainable, Walid Saad, a professor in the Bradley Department of Electrical and Computer Engineering at Virginia Tech and the Next-G Faculty Lead for Virginia Tech’s Innovation Campus, is exploring the concept of green federated learning, or green FL, in partnership with Amazon. Federated learning is a distributed machine learning technique that enables the deployment of collaborative AI algorithms.
Saad and his team want to make federated learning systems, and more generally distributed AI systems, more sustainable and energy-efficient during both the training phase and inference phase, when algorithms are used to execute real-world AI tasks.
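To illustrate the federated learning idea described above, here is a minimal sketch of federated averaging, a standard technique in which each device trains on its own data and only model updates, never raw data, are sent to a server for averaging. The linear model, client data, and single gradient step are illustrative assumptions, not Saad’s actual algorithm:

```python
import numpy as np

def local_update(weights, X, y, lr=0.1):
    """One gradient-descent step on a single client's private data."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def fed_avg(global_weights, client_data, lr=0.1):
    """Each client trains locally; the server averages the resulting
    models, weighted by each client's dataset size."""
    updates, sizes = [], []
    for X, y in client_data:
        updates.append(local_update(global_weights.copy(), X, y, lr))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# Simulate three devices, each holding its own slice of data
# drawn from the same underlying linear relationship.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 2))
    clients.append((X, X @ true_w))

# Federated training rounds: the raw data never leaves the clients.
w = np.zeros(2)
for _ in range(50):
    w = fed_avg(w, clients)
```

Because only the small weight vector crosses the network each round, this structure is what green FL aims to optimize: fewer communication rounds and cheaper local computation translate directly into lower energy use across many wireless devices.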
By minimizing the energy expenditure of these algorithms and improving their scalability to larger numbers of wirelessly interconnected devices, the team aims to greatly reduce the environmental impact of these technologies.
“As more and more people adopt these types of technologies at scale (ChatGPT and Large Language Models being a case-in-point), we must find ways to make them more sustainable, energy-efficient, and friendly to the environment,” said Saad. “If not, we could reach a point where the benefits of AI become more of an ethical concern, particularly when we think about our carbon footprint.”