Lost in all of the hype around how artificial intelligence (AI) advances will change everything from how people work to how they live is just how much electrical power it will take to make that happen, and, equally important, what impact that power demand will have on efforts to meet legally binding sustainability mandates such as the Paris Agreement.
As Forbes noted in an article released late last year, “simply put, data centres and edge networks enable the magic that emerges from AI applications. They do so by using high-performance computing (HPC) clusters, which are made up of multiple servers connected through high-speed networks that allow for parallel processing and fast training times.
“These hard-working machines require considerable power and, as a result, generate a lot of heat. Because of this data- and compute-intensive AI workload, the need for high-density infrastructure, led by high-density cooling and power, is emerging at a record rate.”
Andrew Eppich, managing director of data centre provider Equinix Canada, said the company believes that “AI can provide sustainability-related benefits such as optimizing consumption in energy-intensive sectors, modeling and analyzing large data sets for weather and climate-related issues, and image recognition for wildlife conservation, to name a few.
“But we also recognize that training these models and analyzing massive data sets will require intensive computing resources that consume energy. This is why Equinix is committed to designing, building, and operating highly efficient data centres utilizing clean and renewable energy for our operations and our customers’ workloads.”
That highly efficient data centre capacity, said Eppich, will be “foundational to mitigating the carbon impacts of advanced computing such as AI and machine learning (ML).”
Jim Kalogiros, vice president of secure power at Schneider Electric Canada, said there is an interesting dichotomy now in play in that “data centres are going to be critical to driving the future in terms of technological advancement, and that is going to come by way of AI for the most part in the next 10-15 years. The catalyst, though, is that it is going to increase our consumption of electricity and increase emissions, so how do we do this in a mindful way without impacting the greater good, or the environment, or the climate?”
Schneider Electric, he said, has developed analytical software that allows data centre operators not only to lower their energy consumption but also to reduce their energy costs (a simplified illustration of that kind of carbon-aware scheduling follows his comments below). “With AI and the predictive nature of our software, it will tell you, ‘during these times of the day, you’re going to move your workload from those servers to this server, and it is going to reduce your energy consumption by X, which means you are going to get a carbon savings of Y.’
“If you look at the studies, AI compute is going to drive power consumption up by four or five times, which means there are a multitude of areas that are going to be impacted. Do we have the power grid that is going to be available to support this growth? Do we have the equipment that is going to be able to support this growth? Do we have the cooling solutions that are going to be able to support this growth?
“Now, the great thing is that we are working at developing a lot of new software and hardware to help drive this sort of revolution that’s happening from an AI standpoint, but the benefit is that AI is also going to help itself by coming up with solutions to solve a lot of these problems. It is like a blessing and a curse. We are working with a technology that is almost self-curing to some extent, because it is going to tell us what it needs to do to become better.”
Still, Kalogiros said that, in the short term, “I think we’ve got some challenges that we really have to face, which is, until we get these generative AI (GenAI) and these technologies up and running and out of incubation and humming along, we’re going to have to solve the power issue, we’re going to have to solve the cooling issue and we are going to have to be able to do it in a pragmatic way.”
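The kind of carbon-aware scheduling Kalogiros describes can be illustrated with a short, purely hypothetical Python sketch. It is not Schneider Electric’s software; the server names, power figures, and carbon-intensity forecast below are invented, and a real tool would also weigh electricity prices, service-level constraints, and forecast accuracy.

# Hypothetical sketch of carbon-aware workload scheduling (not Schneider
# Electric's product). Given an hourly forecast of grid carbon intensity and
# each server's power draw, pick the hour and server that minimise the
# estimated emissions of a deferrable batch job.
from dataclasses import dataclass

@dataclass
class Server:
    name: str
    power_kw: float      # average draw while running the job
    speed_factor: float  # 1.0 = baseline; higher finishes the job sooner

def best_slot(job_hours, servers, carbon_forecast):
    """Return (start_hour, server, est_kg_co2) with the lowest estimated emissions.
    carbon_forecast maps hour of day -> grid intensity in kg CO2 per kWh."""
    best = None
    for hour, intensity in carbon_forecast.items():
        for srv in servers:
            energy_kwh = srv.power_kw * (job_hours / srv.speed_factor)
            emissions = energy_kwh * intensity
            if best is None or emissions < best[2]:
                best = (hour, srv, emissions)
    return best

servers = [Server("rack-a1", power_kw=6.0, speed_factor=1.0),
           Server("rack-b2", power_kw=8.5, speed_factor=1.6)]
# Invented overnight forecast: intensity drops when wind output peaks.
forecast = {22: 0.18, 23: 0.14, 0: 0.09, 1: 0.08, 2: 0.11}
hour, srv, kg = best_slot(4.0, servers, forecast)
print(f"Run on {srv.name} at {hour:02d}:00 for roughly {kg:.1f} kg of CO2")

Even this toy version captures the dynamic Kalogiros points to: shifting a four-hour job to a faster machine during a low-intensity overnight window cuts both the kilowatt-hours consumed and the kilograms of CO2 emitted.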
Eppich, meanwhile, said Canadian data centres, like their global counterparts, face challenges in transitioning to more sustainable practices. “One of the significant challenges is that Canada’s energy mix includes both renewable and non-renewable sources,” he noted.
“Ensuring that a data centre operates at the speeds and reliability that customers expect without compromising on sustainability is an important indicator of a company that has invested strategically in sustainable practices within its business model.”
To overcome the challenge and support a target of 100 per cent clean and renewable energy coverage globally by 2030, he said that “Equinix is taking steps to build our renewable energy portfolio. Power purchase agreements (PPAs) are long-term contracts between electricity buyers and renewable energy generators like wind and solar farms.
“PPAs are a high-impact way for data centres to procure renewables and add new renewable energy sources to local markets. In 2023, we rapidly scaled our PPA purchasing globally, increasing our agreements by 300 per cent over 2022.”
Kalogiros contended that, while the Paris Agreement forces “people to do the right thing and legislation definitely helps, I would say there is an incentive for the data centre community to (reduce emissions) on their own for a multitude of reasons.
“The net benefit of being able to implement solutions that are going to drive efficiencies into your facility is that they are not just going to help the climate, they are going to help your bottom line. That is the beauty of this whole story.”