Fresh report analysing total power usage of AI servers suggests consumption is through the roof!
French firm Schneider Electric reports that the power consumption of AI servers will total around 4.3GW in 2023, slightly lower than the power consumption of the nation of Cyprus (4.7GW) in 2021. The company anticipates that the power consumption of AI workloads will grow at a compound annual growth rate (CAGR) of 26% to 36%, suggesting that by 2028 AI workloads will consume between 13.5GW and 20GW, more than Iceland consumed in 2021.
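As a quick sanity check on the range quoted above, the compound-growth arithmetic can be sketched in a few lines of Python. This is a minimal illustration, not part of the report: the five-year horizon (2023 to 2028) and the `project` helper are assumptions made here for clarity.

```python
def project(base_gw: float, cagr: float, years: int) -> float:
    """Apply a compound annual growth rate to a baseline power figure."""
    return base_gw * (1 + cagr) ** years

# 4.3GW baseline in 2023, grown for 5 years at the low and high CAGR bounds
low = project(4.3, 0.26, 5)   # roughly 13.7GW
high = project(4.3, 0.36, 5)  # roughly 20.0GW
print(f"2028 projection: {low:.1f}GW to {high:.1f}GW")
```

Both bounds land within the 13.5GW-20GW range the report cites.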
In 2023, the total power consumption of all data centres is estimated at 54GW, with AI workloads accounting for 4.3GW of that demand, according to Schneider Electric. Within these AI workloads, the distribution between training and inference is characterised by 20% of the power being consumed for training purposes and 80% allocated to inference tasks. This means that AI workloads will be responsible for approximately 8% of the total power consumption of all data centres this year.

Schneider Electric recommends transitioning from the conventional 120/208V distribution to 240/415V to better accommodate the high power densities of AI workloads. For cooling, a shift from air cooling to liquid cooling is advised to enhance processor reliability and energy efficiency, although immersion cooling might produce even better results. The racks used should also be more capacious, with specifications such as being at least 750mm wide and having a static weight capacity greater than 1,800kg.
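The figures above can be reproduced with some simple arithmetic. The variable names below are illustrative only; the numbers (4.3GW AI load, 54GW total, 20/80 training-inference split) come straight from the article.

```python
# 2023 figures as quoted from Schneider Electric's report
ai_gw, total_gw = 4.3, 54.0

training_gw = 0.20 * ai_gw    # 20% of AI power goes to training
inference_gw = 0.80 * ai_gw   # 80% goes to inference
share = ai_gw / total_gw      # AI's share of all data-centre power

print(f"training: {training_gw:.2f}GW, inference: {inference_gw:.2f}GW")
print(f"AI share of data-centre power: {share:.0%}")
```

The share works out to just under 8%, matching the "approximately 8%" figure in the report.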
See the full report here:
If corporations truly cared about the environment, then they'd be using LESS resources and not more.