Combatting AI Energy Consumption through Renewable Sources

Could renewable sources of energy lead the way toward energy-efficient AI?

June 15, 2023


Although artificial intelligence has numerous applications and benefits across several sectors, business leaders must be cognizant of the technology’s costs – particularly when it comes to its energy consumption and carbon footprint, highlights Bill Haskell, CEO of Innventure.

Artificial intelligence has been touted as the future by many people thanks to its ability to improve efficiency in many industries. However, some very real costs of artificial intelligence use must be considered. Although much attention has been paid to the labor costs of AI – with many people worried they could lose their jobs to artificial intelligence – the energy consumption of AI technology should be a primary concern.

The True Cost of Artificial Intelligence

Many people do not understand just how much energy AI consumes. According to TechTarget, the total consumption of one model over nine days was no less than 27,648 kilowatt-hours (kWh). This staggering number is more than the amount of energy three households use in an entire year – and it took just one program a little more than a week to consume it.
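The comparison above can be checked with simple arithmetic. The per-household annual figure below is an assumption implied by the article's own numbers (roughly 27,648 kWh split across three households), not a cited statistic:

```python
# Sanity-check the article's comparison: 27,648 kWh consumed in nine days
# versus the annual consumption of three typical households.
model_energy_kwh = 27_648        # energy one model consumed over nine days
household_annual_kwh = 9_200     # assumed annual use per household (~27,648 / 3)

households_equivalent = model_energy_kwh / household_annual_kwh
daily_rate_kwh = model_energy_kwh / 9

print(f"Equivalent to {households_equivalent:.1f} households' annual use")
print(f"Average draw: {daily_rate_kwh:,.0f} kWh per day")   # 3,072 kWh per day
```

Under these assumptions, the model burned through a household-year of electricity roughly every three days.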

Although energy consumption has always been a concern with any new technology, the complexity of AI operations means AI requires far more resources than the simpler computers of yesteryear. Early computers were limited to solving mathematical problems. While those problems could be complex – often far beyond human capability – they required little computational power compared to what artificial intelligence demands today. Generating a lifelike text response, for example, takes much more computation (and therefore energy) than anything a computer could previously do.

The recent boom in interest in artificial intelligence technology, spurred by the introduction of highly visible and user-friendly tools like OpenAI’s ChatGPT, has led to an increased demand for AI infrastructure. The manufacturing, operation, and maintenance of this infrastructure accounts for substantial energy consumption – especially given that AI is being adopted in many different industries, and each use case typically requires servers, training, and data processing of its own.

Many pioneers in the artificial intelligence space describe every task an AI completes as a “transaction” between a memory storage unit and a processor, and each of these transactions consumes energy. Notably, the farther apart those components are, the more energy each transaction requires – even a distance of a few centimeters costs more than if they were part of the same unit. With data centers now spanning thousands of square feet, that distance must be factored into AI’s energy consumption.
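The distance effect described above can be sketched with a toy model. The per-bit energy figures below are illustrative assumptions, not measured values; the only point is that off-chip transfers cost orders of magnitude more than on-chip ones:

```python
# Toy model: energy to move data between processor and memory.
# Per-bit costs in picojoules are rough illustrative assumptions
# (order-of-magnitude only, not vendor figures).
ENERGY_PER_BIT_PJ = {
    "on_chip_register": 0.1,   # assumed: data already next to the ALU
    "on_chip_cache": 1.0,      # assumed: a few millimeters away
    "off_chip_dram": 100.0,    # assumed: centimeters away on the board
}

def transfer_energy_joules(num_bytes: int, path: str) -> float:
    """Energy in joules to move num_bytes over the given path."""
    picojoules = num_bytes * 8 * ENERGY_PER_BIT_PJ[path]
    return picojoules * 1e-12

# Moving 1 GB from off-chip DRAM vs. on-chip cache under these assumptions:
gb = 1_000_000_000
print(transfer_energy_joules(gb, "off_chip_dram"))   # ~0.8 J
print(transfer_energy_joules(gb, "on_chip_cache"))   # ~0.008 J
```

A hundredfold per-transfer gap, multiplied across the billions of transactions an AI workload performs, is why component distance matters at data-center scale.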


How AI Consumes Energy

Indeed, many AI offerings are powered by large facilities, often running at 100% utilization. The impact on energy consumption is even greater than one might expect. According to one professor at the University of Pennsylvania School of Engineering, each facility draws between 20 and 40 megawatts of power. Even at the lower end of that range, a facility running year-round consumes enough energy to power nearly 16,000 households – a staggering amount.
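The household equivalence can be verified with back-of-the-envelope arithmetic. The annual per-household consumption below is an assumption (chosen to be consistent with typical U.S. averages), not a figure from the article:

```python
# Back-of-the-envelope: a facility drawing 20 MW continuously for one year.
facility_mw = 20
hours_per_year = 8760
annual_kwh = facility_mw * 1_000 * hours_per_year   # MW -> kW, times hours

household_annual_kwh = 11_000   # assumed typical annual household consumption
print(annual_kwh)                                # 175200000 kWh per year
print(round(annual_kwh / household_annual_kwh))  # 15927 households
```

At the assumed household figure, 20 MW of continuous draw works out to just under 16,000 households, matching the comparison above.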

One of the most energy-hungry processes in AI is training. Teaching a model through repetition requires substantial processing power, which in turn consumes a great deal of energy. This is a necessary evil, however: without training, artificial intelligence cannot develop the adaptability that lets these programs and algorithms be used across so many industries.
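A rough way to reason about training energy is GPU count × per-GPU draw × training hours. Every input below is a hypothetical placeholder for illustration, not a figure from the article:

```python
# Rough training-energy estimate: aggregate power draw times duration.
# All inputs are hypothetical placeholders, not measured values.
num_gpus = 1_000
watts_per_gpu = 400          # assumed average draw per GPU under load
training_days = 9

hours = training_days * 24
energy_kwh = num_gpus * watts_per_gpu * hours / 1_000   # W-hours -> kWh

print(f"{energy_kwh:,.0f} kWh")   # 86,400 kWh under these assumptions
```

Even this modest hypothetical run lands in the tens of thousands of kilowatt-hours, which is why training dominates the energy budget of many AI projects.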

That’s not to mention the significant energy consumption that goes into producing the hardware AI needs to function. Artificial intelligence sometimes requires thousands of GPU and CPU chips, if not more, and manufacturing these components accounts for substantial energy consumption in itself – before they are ever used. Unfortunately, for artificial intelligence to handle its more complex tasks and functions, the supporting hardware must be capable of this level of data processing.

Solving the Problem of AI Energy Consumption

At this point, many in the AI industry are beginning to question which uses of artificial intelligence are worth their energy consumption and carbon footprint – and which are not. Although artificial intelligence has shown it can improve efficiency in many sectors, does that efficiency help the bottom line if the energy costs more than is saved in labor and waste? Other applications – such as artificial intelligence for research and data analysis in the medical field – make the costs of AI use far easier to justify.

As a solution to reduce this carbon footprint, many experts suggest that pioneers in the artificial intelligence community incorporate renewable energy into their operations. If AI’s energy needs cannot be reduced – and doing so does not appear practical right now – the focus should instead be on finding a more sustainable way of meeting those extensive needs.

Looking to the future, many have wondered where AI is heading in terms of energy consumption. Some argue, understandably, that as AI becomes more advanced, it will require even more computational power and intricate hardware, consuming still more energy than it already does. Others argue that developments in AI technology will make it better and more efficient, allowing it to perform more complex tasks with less energy. Artificial intelligence is constantly evolving, and it seems here to stay. But one evolution the technology must undergo, if we want to continue using AI sustainably, is toward energy efficiency.

Towards Sustainable Energy Efficiency

As newer processors get faster and hotter, we are approaching a time when traditional air conditioning will be unable to adequately cool the next generation of servers. This limit is referred to as the thermal wall, and a new solution is required to address it.

I believe that, over time, the majority of the data center cooling market will shift to liquid cooling. Organizations need to focus on and invest in technology designed to deliver the highest-performing cooling for servers, routers, switches, cell tower base stations, and other critical electronics platforms. It would be exciting to collaborate on a global scale on how we can meaningfully help solve the cooling dilemma while substantially reducing both the energy requirement and the correlated carbon footprint.
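One common way to quantify the cooling gain is Power Usage Effectiveness (PUE): total facility power divided by IT power, so everything above 1.0 is overhead, much of it cooling. The PUE values below are assumed for illustration; liquid cooling generally lowers PUE by cutting energy spent on chillers and fans:

```python
# Compare cooling overhead via PUE = total facility power / IT power.
# The PUE values used here are assumed for illustration.
def overhead_mw(it_load_mw: float, pue: float) -> float:
    """Non-IT (largely cooling) power for a given IT load and PUE."""
    return it_load_mw * (pue - 1)

it_load = 20.0                              # MW, low end of the range cited above
air_cooled = overhead_mw(it_load, 1.6)      # assumed air-cooled PUE
liquid_cooled = overhead_mw(it_load, 1.1)   # assumed liquid-cooled PUE

# Roughly 12 MW vs 2 MW of overhead under these assumptions.
print(air_cooled, liquid_cooled)
```

Under these assumed PUE values, the same 20 MW of IT load sheds about 10 MW of overhead when moved to liquid cooling – the kind of reduction in both energy and carbon footprint discussed above.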

How are you moving towards more energy-conscious processes? Share with us on Facebook, Twitter, and LinkedIn. We’d love to hear from you!



Bill Haskell
CEO and founding member of Innventure, Bill Haskell has spent over 20 years laying the framework for Innventure’s methodology and approach to building highly effective businesses. Bill has been a director of over a dozen private and public companies during his 30 years of experience in company creation and development. Most recently, Bill has been a partner at a boutique investment bank focused on converting private companies into employee-owned enterprises. He is also an acting principal for a blockchain technology company.