AI advancement is increasingly constrained by energy resources. In a video published on September 7, 2025, Caleb Writes Code explores this critical intersection of AI and energy. The discussion is timely: training and deploying large language models such as GPT-4 demands enormous computational power and energy, estimated to be comparable to a small city's monthly electricity consumption. This matters all the more as companies like OpenAI push to scale their AI systems. Caleb clearly explains computational parallelism, cooling requirements, and the massive infrastructure the AI industry depends on.

However, the video's optimism about future energy infrastructure runs up against concerns about escalating demand on finite resources. While it highlights advances in energy-efficient hardware and parallel processing, it leaves regulatory challenges and ecological implications understated.

Both the U.S. and China are depicted as competitors in the AI domain, pursuing different strategies to meet energy demand. The comparison of China's state-driven capacity build-out with the U.S.'s corporate-led efforts offers a stark contrast, and it raises questions about the efficiency and sustainability of each approach. The narrative invites viewers to weigh the broader implications of AI's ever-growing power consumption and to question whether this rapid growth is sustainable.
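To make the "small city" comparison concrete, here is a back-of-envelope sketch. Every figure below (GPU count, per-GPU draw, training duration, PUE, household consumption) is an assumption chosen for illustration, not a number from the video or from OpenAI:

```python
# Hypothetical back-of-envelope estimate of LLM training energy.
# All constants are illustrative assumptions, not reported figures.
NUM_GPUS = 25_000        # assumed accelerator count for a large training run
GPU_POWER_KW = 0.7       # assumed average draw per GPU, in kW
TRAINING_DAYS = 90       # assumed wall-clock training duration
PUE = 1.2                # assumed data-center power usage effectiveness

hours = TRAINING_DAYS * 24
# Total facility energy: GPU energy scaled by PUE, converted kWh -> MWh
energy_mwh = NUM_GPUS * GPU_POWER_KW * hours * PUE / 1000

# A "small city" of ~50,000 homes at an assumed ~900 kWh/home/month
city_monthly_mwh = 50_000 * 900 / 1000

print(f"Training energy:          {energy_mwh:,.0f} MWh")
print(f"City monthly consumption: {city_monthly_mwh:,.0f} MWh")
print(f"Ratio: {energy_mwh / city_monthly_mwh:.2f}x")
```

Under these assumed inputs the training run lands in the same ballpark as the city's monthly consumption, which is the shape of the claim the video makes; plug in different figures and the ratio shifts accordingly.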