CNBC reports on the overwhelming power demands of generative AI and their implications for global energy infrastructure. Even with more than 8,000 data centers worldwide, current capacity cannot keep pace with the surge in energy demand driven by AI services such as ChatGPT, Google's Gemini, and Microsoft's Copilot. A single ChatGPT query consumes nearly ten times the energy of a standard Google search, and training a large language model can generate CO2 emissions comparable to the lifetime emissions of five gas-powered cars. As data centers expand to accommodate AI, their energy consumption is projected to rise dramatically, potentially reaching 16% of total U.S. power consumption by 2030, up from just 2.5% before the AI boom. The aging electrical grid struggles to support this increased demand, raising the risk of blackouts and prompting data center companies to build facilities closer to energy sources. Operators are also exploring mitigations such as on-site power generation and renewable energy. The report emphasizes the urgent need for advances in energy infrastructure and efficiency to sustain the rapid growth of generative AI.
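To put the report's ratios in perspective, here is a minimal back-of-envelope sketch in Python. The tenfold per-query multiplier and the 2.5%-to-16% share projection come from the report; the absolute per-search energy (~0.3 Wh, a figure often attributed to Google) and total U.S. annual consumption (~4,000 TWh) are outside assumptions used purely for illustration.

```python
# Back-of-envelope arithmetic for the figures cited in the report.
# Assumptions (NOT from the report): ~0.3 Wh per Google search and
# ~4,000 TWh/yr of total U.S. power consumption.

GOOGLE_SEARCH_WH = 0.3        # assumed energy per standard Google search (Wh)
CHATGPT_MULTIPLIER = 10       # report: ~10x the energy of a standard search
chatgpt_query_wh = GOOGLE_SEARCH_WH * CHATGPT_MULTIPLIER

US_ANNUAL_TWH = 4_000                    # assumed total U.S. consumption (TWh/yr)
share_before, share_2030 = 0.025, 0.16   # report: 2.5% -> 16% by 2030

dc_before_twh = US_ANNUAL_TWH * share_before
dc_2030_twh = US_ANNUAL_TWH * share_2030

print(f"Energy per ChatGPT query: ~{chatgpt_query_wh:.1f} Wh")
print(f"Data-center demand: {dc_before_twh:.0f} TWh -> {dc_2030_twh:.0f} TWh "
      f"({dc_2030_twh / dc_before_twh:.1f}x growth)")
```

Under these assumptions, the projected share alone implies roughly a 6.4x increase in data-center electricity demand, which illustrates why the report treats grid capacity as the binding constraint.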