The Challenge: AI’s Energy Crisis
The exponential growth of artificial intelligence has triggered unsustainable energy demand with global consequences. A single ChatGPT query consumes roughly 2.9 watt-hours, nearly ten times the energy of a standard Google search (0.3 watt-hours), while data centres now account for a share of global CO₂ emissions that, by some estimates, rivals or exceeds the aviation sector's carbon footprint.
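The per-query comparison above is simple arithmetic, and it compounds quickly at scale. The sketch below computes the ratio from the two figures in the text and extrapolates to a purely hypothetical daily query volume (the 100-million-queries-per-day number is an illustrative assumption, not a sourced statistic):

```python
# Back-of-envelope comparison using the per-query figures cited in the text.
CHATGPT_WH = 2.9  # watt-hours per ChatGPT query (figure from the text)
SEARCH_WH = 0.3   # watt-hours per standard Google search (figure from the text)

ratio = CHATGPT_WH / SEARCH_WH
print(f"One ChatGPT query uses about {ratio:.1f}x the energy of a search")

# Hypothetical daily volume, for illustration only (NOT a sourced figure):
queries_per_day = 100_000_000
daily_kwh = queries_per_day * CHATGPT_WH / 1000  # Wh -> kWh
print(f"At {queries_per_day:,} queries/day: {daily_kwh:,.0f} kWh/day")
```

Even under this rough assumption, the gap between 0.3 Wh and 2.9 Wh per query translates into hundreds of megawatt-hours per day of additional demand.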
By 2030, synthetic data generation is projected to grow fivefold, further straining power grids already burdened by energy-intensive workloads such as Large Language Model (LLM) training and inference, video streaming, and continuous data transmission. At the heart of this crisis lies conventional CMOS-based hardware, designed in an era of cheap, fossil-fuelled energy and ill-equipped to meet the efficiency demands of modern AI. Without radical innovation, the environmental cost of computation will render even the most advanced AI systems untenable in a climate-conscious world.