To build a digital economy by 2030, the Egyptian government is betting on artificial intelligence (AI). In May 2021, the ICT Ministry launched the country’s first five-year National Artificial Intelligence Strategy, making it the first in the region to do so. This year, it published a new one through 2030.
The second edition targets “integrating AI tools into education, enhancing professional development and fostering robust international partnerships,” the 2025-2030 strategy document said. It aims to create a $430 million to $860 million fund to support AI companies, stimulate venture capital funding, encourage Egypt-based businesses to use AI, and increase public awareness.
To maximize benefits, the government should train AI systems on local data stored in domestic AI data centers. According to GeoPoll, a research firm, localizing AI training data “can dramatically improve AI applications in emerging markets.”
Ensuring sufficient energy for those data centers is critical. “The power sector is rapidly becoming a protagonist in the AI story,” said a McKinsey report published in September. “As the power ecosystem grapples with meeting data centers’ voracious need for power, it faces substantial constraints, including limitations on reliable power sources, sustainability of power, and upstream infrastructure for power access,” among others.
Power to data centers
According to McKinsey, data centers have unique energy requirements compared with utility buildings and other facilities that operate around the clock: data centers always run at near maximum capacity, requiring backup energy storage systems to maintain top performance. Currently, energy accounts for 20% of a data center’s operating costs.
Another standout feature is “data center owners typically have a higher willingness than most other power customers to pay for power,” McKinsey said, mainly because data centers “have proved to be highly profitable for large companies” despite high energy consumption.
Lastly, unlike other facilities that rely on electricity, higher energy efficiency in data centers doesn’t translate to less consumption. “Breakthroughs that allow access to low cost, highly efficient computing … may increase the complexity of models that can be run and … enable more use cases that lead to more power demand,” McKinsey said.
According to a February research note from Goldman Sachs, data centers worldwide required approximately 59 gigawatts of electricity in 2024. Roughly 60% of this consumption was by large tech companies that offer public cloud services by clustering data centers in different jurisdictions (hyperscalers) and data center operators catering to a handful of large enterprise customers (wholesale operators). “Traditional corporate and telecom-owned data centers” accounted for the rest.
That energy consumption is increasing rapidly thanks to generative AI (GenAI), which produces bespoke replies to user queries using models trained on vast amounts of internet data. Since ChatGPT, the first free GenAI chatbot, launched in 2022, data centers worldwide have consumed 37% more energy, according to Goldman Sachs.
To meet GenAI’s energy demands, investors are looking to a new category of data centers. “The AI-dedicated data center is an emerging class of infrastructure,” said Goldman Sachs. “It is designed for the unique properties of AI workloads,” such as high absolute power requirements, greater power density and more intensive cooling. “They’re usually owned by hyperscalers or wholesale operators,” Goldman Sachs said.
AI-fueled surge
Estimates of AI data centers’ energy demand through 2030 vary greatly, mainly because it is difficult to determine how much humans will rely on the technology and how energy-efficient the next generation of AI hardware and algorithms might be. “While there have been numerous forecasts around energy demands of AI and the efficiency gains it will unlock, it is hard to predict these with certainty, given the rapidly evolving landscape,” the World Economic Forum said in January.
According to Vijay Gadepally, a senior scientist and principal investigator at MIT Lincoln Laboratory, a research center, “data centers accounted for 1% to 2% of overall global energy demand” as of January, “similar to what experts estimate for the airline industry.” By 2030, accelerated AI usage could increase data centers’ consumption to 21% of total generated electricity.
McKinsey forecasts AI data centers’ energy consumption will triple between 2023 and 2030, mainly due to “skyrocketing computation and data demands … further accelerated by gains in computing capabilities, alongside reductions in chip efficiency relative to power consumption.” Another reason for growing energy demand is that the “amount of time [needed] to double performance efficiency has increased from every two years to nearly every three years.”
Goldman Sachs expects the “arms race to develop” GenAI to increase electricity consumption by 50% between 2023 and 2027, rising by 165% by 2030. It added that most of the energy demand will be from newly built data centers. “The occupancy rate [of existing data centers] is projected to increase from around 85% in 2023 to [over] 95% in late 2026,” Goldman Sachs said. “That will likely be followed by a moderation starting in 2027, as more data centers come online and the AI-driven demand growth [rate] slows.”
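As a rough illustration of what those growth rates imply, the cited percentages can be applied to a baseline figure. The 2023 baseline in the sketch below is a hypothetical placeholder, since Goldman Sachs reports roughly 59 gigawatts for 2024 but no 2023 figure is given here:

```python
# Illustrative projection of global data-center power demand using the
# growth rates cited by Goldman Sachs: +50% between 2023 and 2027,
# +165% between 2023 and 2030.
# BASELINE_2023_GW is a hypothetical placeholder, not a reported figure.

BASELINE_2023_GW = 50.0  # hypothetical; Goldman Sachs cites ~59 GW for 2024

def project(baseline_gw: float, growth_pct: float) -> float:
    """Apply a cumulative percentage increase to a baseline in gigawatts."""
    return baseline_gw * (1 + growth_pct / 100)

demand_2027 = project(BASELINE_2023_GW, 50)   # ~75 GW under this baseline
demand_2030 = project(BASELINE_2023_GW, 165)  # ~132.5 GW under this baseline

print(f"Projected 2027 demand: {demand_2027:.1f} GW")
print(f"Projected 2030 demand: {demand_2030:.1f} GW")
```

Note that the cited percentages are cumulative from 2023, so they are applied directly to the baseline rather than compounded year by year.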
Sustaining data centers
Manav Mittal, a senior project manager at Consumers Energy, a utility company, highlighted several strategies to ensure AI doesn’t become a “burden on the grid.” “With thoughtful strategies and a proactive approach,” he said, “we can minimize the environmental and infrastructural costs of these data centers.”
The “first line of defense” is designing energy-efficient data centers, Mittal said. That requires using “innovative cooling methods” and “energy-efficient processors and [graphics processing units] to improve the performance-to-energy ratio.”
Mittal stressed the importance of making the energy infrastructure feeding AI data centers more efficient. “The grid, as it is built today, was not built to handle the enormous … energy demands of AI data centers,” he said. As they “become larger and more prevalent, the grid needs to evolve to accommodate them.”
Using renewables to power those AI data centers will be essential. “The shift to clean energy is one of the most impactful ways to reduce the strain on the grid,” Mittal said. Additionally, those zero-emission sources would not contribute to climate change.
Nuclear power is one solution hyperscale AI data center operators are pursuing. “Several big tech companies looking for low carbon, round-the-clock energy signed contracts for new nuclear capacity” last year, Goldman Sachs said in January. Nuclear “will be a key part of a suite of new energy infrastructure built to meet all of the increased data-center power needs.”
However, McKinsey predicts most AI data centers will use more affordable and less demanding renewables. “Many technologies – offshore wind, fission, fusion, geothermal, gas carbon capture and storage, and clean fuels – may be able to supply their energy [over the] medium to long term.” However, “the bulk of new clean generation is expected to come from solar and onshore wind.”
That plays to Egypt’s green energy strengths, as the government is pushing the private sector to invest in solar energy in Aswan’s Benban Solar Park and onshore wind farms along the Red Sea’s western coast. By 2030, the government wants to double clean energy’s contribution to the national power grid to reach 42%. By 2040, that share should reach 58%, Minister of Electricity Mohamed Shaker told the media in June.
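The arithmetic behind those targets is simple enough to make explicit. In the sketch below, the current clean-energy share is inferred from the “double … to reach 42%” phrasing rather than reported directly:

```python
# Egypt's clean-energy targets as cited: 42% of the national grid by 2030
# (described as a doubling of today's share) and 58% by 2040.
TARGET_2030 = 0.42
TARGET_2040 = 0.58

# Inferred, not reported: if 42% is a doubling, today's share is ~21%.
implied_current_share = TARGET_2030 / 2

# Additional share of the grid that clean energy must gain in the 2030s.
gain_2030_to_2040 = TARGET_2040 - TARGET_2030

print(f"Implied current clean-energy share: {implied_current_share:.0%}")
print(f"Additional share needed, 2030-2040: {gain_2030_to_2040:.0%}")
```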
Meeting those targets, if not exceeding them, will be essential to realizing Egypt’s long-term AI goals. “As we move from text to video to image, these AI models are growing larger and larger, and so is their energy impact,” said Gadepally of MIT Lincoln Laboratory. “This is going to grow into a pretty sizable amount of energy use and a growing contributor to emissions across the world.”