These simple changes could make AI research much more energy efficient.

Since a study of the technology’s environmental impact was published three years ago, a movement has grown among researchers to self-report the energy consumed and the emissions generated by their work. Having accurate numbers is an important step toward making changes, but getting those numbers can be tricky.

“You can’t improve what you can’t measure,” says Jesse Dodge, a research scientist at the Allen Institute for AI in Seattle. “The first step for us, if we want to make progress on reducing emissions, is to get good measurements.”

To this end, the Allen Institute recently partnered with Microsoft, the artificial intelligence company Hugging Face, and three universities to create a tool that measures the electricity consumption of any machine learning program running on Azure, Microsoft’s cloud service. It allows Azure users building new models to view the total amount of power consumed by graphics processing units (GPUs)—computer chips designed to perform computations in parallel—at every stage of their project, from model selection to training and use. This makes Microsoft the first major cloud provider to give users access to information about the energy impact of their machine learning programs.

While there are already tools that measure the power consumption and emissions of machine learning algorithms running on local servers, those tools don’t work when researchers use cloud services provided by companies like Microsoft, Amazon, and Google. These services do not give users direct visibility into the GPU, CPU, and memory resources their workloads consume, and existing tools such as Carbontracker, Experiment Impact Tracker, EnergyVis, and CodeCarbon need those values to produce accurate estimates.

The new Azure tool, which debuted in October, currently reports energy usage rather than emissions. So Dodge and other researchers worked out how to match energy consumption with emissions, and presented a companion paper on this work at FAccT, a major computer science conference, at the end of June. The researchers used a service called WattTime to estimate emissions based on the zip codes of cloud servers running 11 machine learning models.
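The core of that matching step is simple arithmetic: pair each unit of measured energy with the grid’s carbon intensity at the time it was drawn, then sum. A minimal sketch, using made-up hourly intensity figures rather than real WattTime data:

```python
# Sketch: converting measured energy into an emissions estimate by pairing
# each hour of a training run with the grid's carbon intensity at that hour.
# The intensity values below are hypothetical (gCO2/kWh), for illustration only.

def estimate_emissions(energy_kwh_per_hour, intensity_g_per_kwh):
    """Sum grams of CO2 over a run, given hourly energy draw and grid intensity."""
    assert len(energy_kwh_per_hour) == len(intensity_g_per_kwh)
    return sum(e * i for e, i in zip(energy_kwh_per_hour, intensity_g_per_kwh))

# A 4-hour run drawing 1.2 kWh each hour, on a grid whose intensity
# drops as renewable generation comes online:
energy = [1.2, 1.2, 1.2, 1.2]
intensity = [450, 400, 200, 250]  # hypothetical gCO2/kWh per hour
print(estimate_emissions(energy, intensity))  # ~1560 gCO2
```

The same total energy produces very different emissions depending on which hours the grid intensity covers, which is what makes the scheduling strategies below possible.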

They found that emissions could be significantly reduced if researchers used servers in certain geographic locations at certain times of day. Emissions from training small machine learning models can be cut by up to 80% if training starts when more renewable electricity is available on the grid, while emissions from large models can be cut by more than 20% if training is paused when little renewable electricity is available and resumed when there is more.
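The first of those strategies amounts to picking the lowest-carbon window in an intensity forecast. A minimal sketch of that idea, with a hypothetical forecast (the real work relies on actual grid data, not these numbers):

```python
# Sketch of a "flexible start" scheduler: given an hourly forecast of grid
# carbon intensity (hypothetical values), choose the start hour that
# minimizes total intensity over a job of fixed duration.

def best_start_hour(intensity_forecast, job_hours):
    """Return the start index whose window has the lowest summed intensity."""
    window_sums = [
        sum(intensity_forecast[s:s + job_hours])
        for s in range(len(intensity_forecast) - job_hours + 1)
    ]
    return min(range(len(window_sums)), key=window_sums.__getitem__)

forecast = [450, 420, 300, 180, 170, 210, 380, 460]  # made-up gCO2/kWh
print(best_start_hour(forecast, 3))  # hour 3: the 180+170+210 window is lowest
```

The pause-and-resume strategy for large models follows the same logic, except the job is split into chunks that each run only during low-intensity hours.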

