Generative AI doesn't have to be a power hog after all


Like a mid-life crisis, with dad throwing out his fuel-efficient sedan for a gas-guzzling muscle car, the rise of generative AI (gen AI) and its power-hungry GPUs looked to be the death knell for any concerns about compute power efficiency and corporate sustainability.

Early numbers seemed to show this nightmare coming true. After years of steady rises in commitments to reducing power consumption and creating efficient computing capabilities, Aberdeen's research in 2023 showed for the first time a small decline in these metrics, as businesses put aside efficiency commitments in the race to deploy powerful computing infrastructures to leverage artificial intelligence (AI).

Also: Gartner's 2025 tech trends show how your business needs to adapt - and fast

It's hard to fault businesses for this lapse. After all, the breathless hype that accompanied the first wave of gen AI left many organizations believing that they had to have AI solutions and capabilities as fast as possible. 

It was accompanied by stories of GPU shortages and the rise of a certain graphics chip company, all of which put the emphasis on compute power over power conservation. We even heard from some smaller businesses buying up gaming systems for AI development. For a little while there, it looked as though the age of sustainability, power conservation, and zero-carbon commitments was on pretty shaky ground.

But, along with a reduction in hype and the rise of reasonable skepticism about some of the more extreme AI predictions, trends in using and deploying generative AI show that many businesses can take advantage of AI while still reducing power consumption and costs. 

To understand some of these trends, Aberdeen recently completed a new survey into how businesses use and plan to use AI. We asked about the drivers pushing businesses to adopt AI, the challenges they face, the key strategies and technologies they are using, and the benefits they may already have seen. The findings will be presented at the upcoming SpiceWorld conference this November in Austin.

Also: Spiceworks invites ZDNET readers to its SpiceWorld 2024 IT conference - with a 40% discount

As one would expect, most organizations are taking AI very seriously, with over 90% using AI in some form and 25% making dedicated strategic investments in building AI (a number expected to grow by over 20% in the next six months). However, interestingly, we are also seeing the rise of a more practical and efficiency-focused approach to using gen AI. 

To a large degree, businesses are looking to leverage AI internally, with their own data, and using small custom language models. At first glance, this might lead you to worry that they will be investing in lots of power-hungry, GPU-intensive systems. 

But increasingly, businesses are finding that building small custom models doesn't require massive banks of powerful systems (some companies I've heard from built their models on a single engineer's laptop) and that power consumption can be kept to a minimum. In fact, when we asked what technologies businesses were purchasing to support their AI initiatives, GPUs had dropped to fourth place, while increased storage capacity and hybrid cloud capabilities topped the list.

These trends make sense, as businesses do need to manage and store the massive amounts of data needed to feed their AI models. And increasingly, hybrid cloud models that combine the flexibility of the cloud with the security and performance of on-premises systems are proving to be the core infrastructure for deploying AI solutions.

Also: The secret to successful digital initiatives is pretty simple, according to Gartner

Most importantly, businesses and AI users are no longer prioritizing compute power over efficiency when it comes to using generative AI. In our research, the earlier decline in concern over power usage had reversed itself: we saw a 10% increase in concern about power consumption, as organizations come to understand that they can benefit from AI without throwing power conservation and sustainability out the window.

Of course, when it comes to the biggest AI vendors in the world, power is still a necessity, which is why we see major players like Amazon, Microsoft, and Google making exclusivity deals for power from new nuclear plants. But there are other trends and potential new technologies around AI that could even reduce power consumption for these giants.

Also: The future of computing must be more sustainable, even as AI demand fuels energy use

Researchers in both the private and public sectors are demonstrating new techniques, such as linear-complexity multiplication and matrix multiplication-free models, that have the potential to massively reduce the power consumption of generative AI. And in the more immediate timeframe, we've seen several new server and system designs from major vendors that are built to be more efficient for developing and running AI solutions.
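The core idea behind this line of research is replacing expensive floating-point multiplications with cheap integer additions, since an integer add costs a small fraction of the silicon energy of a float multiply. The specific methods in the research above are beyond the scope of this article, but a classic illustration of the principle is Mitchell's logarithmic approximation (not the researchers' own algorithm, just a minimal sketch of the trick):

```python
import struct

def approx_mul(a: float, b: float) -> float:
    """Approximate a * b (both positive) using one integer addition.

    The IEEE-754 bit pattern of a positive float is roughly a scaled,
    biased log2 of its value, so adding two bit patterns and removing
    one bias approximates multiplication (Mitchell's approximation).
    """
    ia = struct.unpack("<I", struct.pack("<f", a))[0]
    ib = struct.unpack("<I", struct.pack("<f", b))[0]
    one = 0x3F800000  # bit pattern of 1.0 -- the bias to remove
    return struct.unpack("<f", struct.pack("<I", ia + ib - one))[0]

# Powers of two come out exact; other values land within roughly
# 11% of the true product.
print(approx_mul(2.0, 4.0))  # 8.0 (exact)
print(approx_mul(3.0, 5.0))  # ~14.0 vs. the true 15.0
```

For a neural network whose accuracy tolerates small numerical error, trading every multiply for an add at this scale is where the large projected power savings come from.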

All of this means that the era of generative AI and power conservation being mutually exclusive may be coming to an end. Businesses care about power savings and efficiencies and are finding that they can both benefit from AI and meet their power savings goals. 

Sort of like if that mid-life crisis dad ditched the muscle car for a high-performance EV.
