Sources: EPRI, Powering Intelligence; Luccioni et al., Power Hungry Processing; a blog post by computer science professor Wim Vanderbauwhede. All freely available to read; I can provide direct links if folks are curious
Basically, data centers are growing rapidly and drawing MORE power (thanks, GPUs) after years of efficiency gains that kept their electricity demand roughly flat, and it’s straining the US power grid.
US data centers currently consume more power than the state of New Jersey
Direct link please, as I’m curious about the methodology. I played a bit with Stable Diffusion when it was new. Prompts took a few seconds to resolve on my desktop. I wasn’t running a voltmeter on it, but there’s no way the energy usage matches up with this
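For what it’s worth, the skepticism here boils down to energy = power × time. Here’s a rough back-of-envelope sketch in Python of what a local run might work out to per image; the 300 W GPU draw and 5 s per prompt are assumed numbers for illustration, not figures from any of the cited sources:

```python
# Back-of-envelope energy estimate for one locally generated image.
# Both inputs are assumptions, not measurements or cited figures.
gpu_power_watts = 300     # assumed GPU draw under load on a desktop card
seconds_per_image = 5     # assumed generation time per prompt

joules = gpu_power_watts * seconds_per_image   # energy in joules (W * s)
watt_hours = joules / 3600                     # convert joules to watt-hours

# Numerically, Wh per image equals kWh per 1000 images.
print(f"~{watt_hours:.2f} Wh per image, ~{watt_hours:.2f} kWh per 1000 images")
```

That works out to roughly 0.4 Wh per image under these assumptions. It’s GPU-only, so it ignores CPU, RAM, and power-supply losses, and whatever hardware and batching the paper benchmarks on will differ from a desktop setup, so the per-image numbers won’t line up exactly either way.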