Alberta’s tech sector is embracing an AI data centre boom. Will it pay off?

Tech leaders say energy-hungry infrastructure will catalyze startup creation, but critics warn of "oversold" benefits.

In Alberta, a new long-term vision is afoot: to shape the province not only into a destination for AI development, but also into the backbone of the nation’s AI infrastructure.

As the provincial government moves forward on an ambitious $100-billion plan to build AI data centres, local tech leaders are enthusiastic. They say the projects will lead to company creation, reinforce control over Canadian data, and build on the tech sector’s momentum. But the proposed projects, which are multi-year efforts, could strain the province’s electrical grid and significantly drive up carbon emissions.

“Alberta aims…to be the host jurisdiction of that ripple effect of technological advancement in every industry.”

Nate Glubish
Alberta Minister of Technology and Innovation

Alberta is looking to attract $100 billion of investment through its AI data centre strategy. Released in December, the approach pitches Alberta as a source of cheap, abundant natural gas, a reduced-regulation environment for infrastructure projects, a cold climate for easy cooling, and a jurisdiction with a comparatively low tax burden for companies. The plan includes a comprehensive review of all regulatory timelines to cut “red tape,” a data centre “concierge” program to streamline projects, new programs and funding opportunities for AI development, and collaborations with municipalities as well as Indigenous groups.

Alberta’s technology and innovation minister, Nate Glubish, has been vocal about his desire to make Alberta the premier destination for AI data centres. He says this will make Alberta more attractive for AI companies looking to set up shop and make access to AI compute more affordable and reliable. 

“Alberta aims…to be the host jurisdiction of that ripple effect of technological advancement in every industry,” Glubish told BetaKit in an interview.

Traditional data centres provide computing power to store data and run applications. Canada is home to roughly 240 data centres, according to the Canada Energy Regulator, which is part of the federal natural resources ministry. AI data centres, which can power training and inference of large language models (LLMs), are significantly more energy-intensive. While traditional data centres require between five and 10 megawatts (MW) of power, one AI “hyperscale” data centre typically demands more than 100 MW, according to the International Energy Agency (IEA).

RELATED: Calgary says it has the makings of an innovation capital. What’s the next step?

Alberta’s plan aligns with a federal push to build out Canadian compute capacity. The federal government committed $2 billion toward AI infrastructure through its Canadian Sovereign AI Compute Strategy in December. The initiative aims to increase access to computing power for Canadian companies and researchers, the government said, and includes a $300-million fund for small companies. 

The newly appointed Minister of AI and Digital Innovation, Evan Solomon, wants to make Canada a world leader in AI, he said in video remarks at the Upper Bound AI conference in Edmonton earlier this month. In addition to appointing Solomon, Prime Minister Mark Carney has directed his cabinet to “deploy AI at scale” to “boost productivity” (though evidence on AI’s impact on productivity remains mixed).

Local tech leaders say AI data centres would be a boon to the Alberta tech ecosystem. Josh Rainbow, CEO of Future Summit, which organizes a variety of tech events, said access to more AI compute would have the downstream effect of building more companies in the region. He said it would create a more robust ecosystem uniting the software, infrastructure, and energy sectors.

“I think we have a generational opportunity,” Rainbow told BetaKit.

Cory Janssen, CEO of Edmonton-based venture studio and AI lab AltaML, agreed that expanded access to local compute will result in more AI companies getting their start, and even lead to “the next Cohere.” 

“AI is underhyped,” Janssen told BetaKit, adding that AI data centre demand is only going to grow. 

Part of the advantage of building AI data centres in Alberta, Janssen said, is data sovereignty for Canadian companies. Hosting data servers in Canada has become a hot topic as the country seeks to build out a domestic stack for AI compute amid trade tensions with the United States (US). 

Telecommunications giants Bell and Telus recently announced Canadian AI data centre projects, both built on exclusive partnerships with American hardware providers: Bell with chipmaker Groq and Telus with Nvidia. Bell’s AI data centre project lead Dan Rink said at Web Summit Vancouver that Groq does not store data processed through its hardware, allowing for a Canadian-controlled data stack.

“Sovereign compute matters,” Janssen said. “It doesn’t matter if the data centre [is] in Canada. If it’s through a foreign-owned firm, like through an American firm, the administration can get access to that data.”

The US CLOUD Act allows US law enforcement, armed with a warrant in a criminal proceeding, to compel US providers to hand over data they store, whether that data is hosted in the US or elsewhere.

Aside from the geopolitical considerations, some political leaders argue that the economic benefits of building AI data centres have been overblown. Naheed Nenshi, former mayor of Calgary and leader of the Alberta NDP, is not convinced it’s the best economic development strategy.

RELATED: Bell to build six AI data centres in Canada as telcos compete on infrastructure

“Data centres are not actually very job-creating,” Nenshi told BetaKit. “It may take 1,000 people to build, but maybe 100 people to run it. So the benefits, particularly in rural areas, have been largely oversold.”

He added that the province needs a more comprehensive economic development strategy beyond contracts for large infrastructure projects.

Glubish claimed that AI data centres create roughly 3,500 jobs for each gigawatt of infrastructure built. Once the centres are up and running, he said, each would create 400 to 1,000 long-term jobs to operate and oversee the infrastructure.

Vast compute, vaster demand 

Since the provincial plan’s rollout in December, proposals for AI data centre projects have already outpaced electricity availability.

According to a March 2025 report from the Alberta Electric System Operator (AESO), the arms-length provincial agency that operates Alberta’s power grid, demand from proposed large-load projects (data centres) reached 11,879 MW in Q1 2025, up roughly 60-fold from 200 MW the year before.

Carson Kearl, a senior energy transition analyst at Enverus, told BetaKit that this “queue” of proposed projects is requesting roughly 100 percent of Alberta’s current peak electricity demand, meaning that if every project were approved, the province’s electricity use would roughly double.
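
As a rough sanity check, the arithmetic behind those figures works out (a back-of-envelope sketch; the ~12,000-MW peak demand value is an outside assumption based on AESO’s reported historical peak, not a number from this article):

```python
# Back-of-envelope check on the AESO queue figures cited above.
# ASSUMPTION: Alberta's current peak demand of ~12,000 MW is an outside
# estimate, not a figure reported in this article.
queue_mw = 11_879        # proposed large-load (data centre) requests, Q1 2025
prior_year_mw = 200      # comparable requests a year earlier
peak_demand_mw = 12_000  # assumed provincial peak electricity demand

print(f"Queue growth year over year: {queue_mw / prior_year_mw:.0f}x")  # ~59x, i.e. roughly 60-fold
print(f"Queue vs. peak demand: {queue_mw / peak_demand_mw:.0%}")        # ~99%, i.e. roughly 100 percent
```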

However, not all these projects will be approved. Kearl says there is not enough natural gas infrastructure to support that level of energy consumption right now. 

“The risk is you go too fast and you start to strain the system,” Kearl said. “But there’s a bunch of guardrails in place to prevent that from happening.” 

“There’s going to be a bunch of winners and losers in that spend. But the end result of that is we’ll have the infrastructure.”

Cory Janssen
AltaML

According to the AESO, electricity requests from projects under assessment far surpass the amount of power generation those projects would add to the grid. It estimates that the electricity load of facilities requesting to come online will reach 10,000 MW by 2029, with only 2,500 MW of generation added back into the system, for example by capturing the heat these centres generate and putting it to work in place of electricity. The AESO told BetaKit it would share more information on its strategy for handling these requests during a June 4 webinar for industry players.

One of the biggest projects in the works comes from Beacon AI Centers, which recently announced it would invest up to $10 billion CAD to build six data centres near Edmonton and Calgary that it claims will go live in 2027 at the earliest. The centres will collectively demand 4,500 MW of power, the company said.

“If you can get billion-dollar investments, that’s great,” Nenshi said. “But you have to make sure that they’re not doing harm.”

The power needed to run data centres is expected to grow 160 percent globally by 2030 due to increased AI use, according to a May 2024 report from Goldman Sachs. Beyond potentially straining grids, data centres could produce double the emissions they do today and consume up to four percent of the world’s energy, compared to the one to two percent they use today. Researchers say data centres are already harming respiratory health in nearby communities because of their demand for energy supplied by fossil fuels.
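
Those projections hang together arithmetically (a quick sketch; the flat-total-energy assumption and the 1.5 percent midpoint are ours, not Goldman Sachs’):

```python
# Consistency check on the Goldman Sachs projections cited above.
# ASSUMPTIONS: total global energy use stays roughly flat to 2030, and we
# take 1.5% as the midpoint of the "one to two percent" current share.
growth = 1.60          # 160% projected growth in data centre power demand by 2030
current_share = 0.015  # assumed current share of world energy use (midpoint)

projected_share = current_share * (1 + growth)
print(f"Implied 2030 share of world energy: {projected_share:.1%}")  # ~3.9%, near "up to four percent"
```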

Rendering of a Beacon AI Centers data centre facility. Image courtesy Stantec.

These AI data centres would be powered by Alberta’s abundant natural gas, according to the province’s data centre strategy. Glubish claimed that natural gas is the only way to reliably power AI data centres in Alberta in the short term, as the province doesn’t have access to hydroelectric power, and nuclear plants would take too long to build. Alberta has officially opposed the federal government’s draft clean electricity regulations, which call for a net-zero electricity grid by 2035.

Using non-hydro renewables for AI data centres would be a “technology mismatch,” Kearl said, as these always-on facilities need a baseline electricity supply that intermittent solar and wind power can’t reliably provide.

Canadian telecommunications giants building AI infrastructure have so far avoided Alberta as a destination. Bell’s planned AI data centres, for example, will run on hydroelectricity in BC, with plans to expand to Manitoba and Québec. The project’s lead, Dan Rink, told Bloomberg at Web Summit Vancouver that he didn’t think it made sense to opt for emissions-heavy natural gas when hydroelectricity is abundant in Canada.

Tech leaders BetaKit spoke with touted carbon capture technology as the solution to increased emissions. Canadian tech company Deep Sky, for one, is building a pilot project in Alberta to suck carbon out of the air and store it.

RELATED: COO Alex Petre replaces Damien Steel as CEO of carbon removal startup Deep Sky

Once online, Deep Sky claims the project will capture 3,000 tonnes of carbon dioxide (CO2) per year, a fraction of the annual output of even one data centre. According to the environmental tech non-profit WattTime, a 100-MW natural gas-powered data centre in the US state of Virginia is responsible for 463,000 tonnes of CO2 emissions annually, while a California data centre with access to more renewable power sources emits 309,000 tonnes (for scale, annual greenhouse gas emissions per capita in Canada are roughly 18 tonnes).
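
Putting the capture claim beside WattTime’s estimates makes the gap concrete (a simple sketch using only the figures cited above):

```python
# How far does Deep Sky's claimed capture go against one data centre's output?
capture_tpy = 3_000         # Deep Sky pilot: tonnes of CO2 captured per year (claimed)
gas_dc_tpy = 463_000        # WattTime: 100-MW gas-powered Virginia data centre, tonnes/year
renewable_dc_tpy = 309_000  # WattTime: California data centre with more renewables, tonnes/year

print(f"Share of gas-powered centre's emissions: {capture_tpy / gas_dc_tpy:.1%}")            # ~0.6%
print(f"Share of renewable-heavy centre's emissions: {capture_tpy / renewable_dc_tpy:.1%}")  # ~1.0%
```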

Building the centres themselves, plus building associated energy infrastructure, will take years, Kearl said. Given how much money this infrastructure costs, he said, patience will pay off for companies that don’t build it all at once, but scale up gradually with demand.

There’s a question as to whether that demand will grow as projected, or if infrastructure will outpace it, creating an AI “bubble” similar to when companies overbought early internet infrastructure before the dot-com crash in 2000. Some hyperscalers, such as Microsoft and Amazon Web Services, have already cancelled proposed data centre contracts after spending tens of billions of dollars on compute. 

“There’s going to be a bunch of winners and losers in that spend,” Janssen said. “But the end result of that is we’ll have the infrastructure.” 

With files from Josh Scott. Feature image courtesy Kevin Ache via Unsplash.