New frontiers for AI data centres
The machines that power the digital world may one day be suspended in space, buried underground, or even sitting beneath the ocean.
Data centres, the facilities that power modern communication, business, entertainment and artificial intelligence, are essential to our digital lives. They are also becoming a growing flashpoint in the clean energy transition.
Australia is already home to an estimated 250 data centres, and in 2024 recorded the world’s second-largest investment in the sector, behind only the United States.
Investors are drawn to the sector’s seemingly unstoppable expansion, but whether traditional energy and water systems can keep pace with the massive workloads of AI-driven data centres has become an increasingly urgent question.
Australian data centres consumed around two per cent of grid-supplied power last year, a figure that could triple by 2030 as AI workloads grow, according to the Australian Energy Market Operator (AEMO).
Without sufficient renewable energy, this surge would significantly increase planet-warming greenhouse gas emissions. Energy use, however, is only part of the story. Data centres also consume enormous volumes of water.
A single one-megawatt data centre can use around 25.5 million litres of water each year for cooling alone, according to estimates from the World Economic Forum. This rising demand comes as global water scarcity continues to deepen.
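To put that figure in perspective, a quick back-of-envelope calculation shows what the estimate implies day to day. This is a rough sketch: the 100 MW campus size and the 2.5-megalitre Olympic-pool comparison are illustrative assumptions, not figures from the reporting.

```python
# Back-of-envelope scale check of the cooling-water estimate above.
ANNUAL_LITRES_PER_MW = 25_500_000  # World Economic Forum estimate for a 1 MW facility

litres_per_day = ANNUAL_LITRES_PER_MW / 365
print(f"Daily use, 1 MW facility: {litres_per_day:,.0f} litres")  # ~70,000 litres

# Scale to a hypothetical 100 MW AI campus (illustrative size only)
campus_mw = 100
olympic_pool_litres = 2_500_000  # a 50 m pool holds roughly 2.5 million litres
annual_campus_litres = ANNUAL_LITRES_PER_MW * campus_mw
pools = annual_campus_litres / olympic_pool_litres
print(f"A {campus_mw} MW campus: ~{pools:,.0f} Olympic pools per year")  # ~1,020 pools
```

At that scale, a single large AI campus would draw on the order of a thousand Olympic swimming pools of water a year for cooling alone.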
A recent UN report warned the world had entered an era of “global water bankruptcy”, with overuse, pollution and climate change pushing critical water systems beyond sustainable limits.
With agriculture, industry, energy generation and cities already competing for shrinking water supplies, the UN cautioned that adding new water-intensive infrastructure could worsen tensions in regions already nearing systemic failure.
These pressures, combined with community concerns about land use, noise and visual disruption, as well as anxiety about AI’s impact on jobs, have fuelled strong and growing local opposition to new data centre projects.
Local opposition contributed to the cancellation of at least 25 data centre projects in the United States last year, based on a review by Heatmap Pro of press coverage, public records and project announcements.
This resistance has not come as a surprise to data centre operators. Many are already experimenting with ways to ease these pressures, from investing in new power sources to less conventional approaches. Here is a snapshot of what is beginning to take shape.
Renewable energy
As technology giants roll out aggressive data-centre expansion plans, access to reliable power has emerged as a critical constraint. At the same time, ambitious climate targets are coming under increasing pressure. In response, some of the world’s largest players are adopting new models for procuring energy.
Amazon, for example, last year pledged $20 billion to develop new data centres in Sydney and Melbourne, alongside investments in three solar farms in Victoria and Queensland to help power its expanding footprint.
Microsoft has also said it is committed to sourcing 100 per cent renewable energy for its Australian data-centre operations and has supported large-scale renewable projects in NSW. Google, meanwhile, has entered long-term renewable energy purchasing agreements to align new capacity with clean-energy supply.
In the local operator space, NextDC’s M1 Melbourne data centre in 2012 became the first facility in the Asia-Pacific to deploy onsite solar power, and the company has since expanded rooftop solar across multiple sites while exploring additional renewable options.
Australian unicorn AirTrunk, with data centre platforms across Asia Pacific, says it aims to match 100 per cent of its electricity use with renewable energy by 2030 and is sourcing projects across Malaysia, Hong Kong and Australia. Acquired by Blackstone in 2024, AirTrunk joined Google and OX2 in 2023 on a 25 MW solar farm in the NSW Riverina.
Late last year, Australian startup Firmus Technologies raised $330 million to build a renewable-powered AI factory campus in Tasmania. “With Tasmania’s clean energy and our AI Factory platform, we believe this will be the most cost-effective, sustainable AI facility in the world,” Firmus co-CEO Oliver Curtis said at the time.
Reusing wastewater
One Australian innovator targeting the intense water demands of data centre infrastructure is Pacific Bio, whose wastewater treatment system produces large volumes of reusable water while using native macroalgae and sunlight to naturally clean sewage.
The system, known as RegenAqua, can generate about 3 million litres a day of A-grade reusable water from a single facility, according to Pacific Bio CEO Sam Bastounas. That output represents a significant new resource for water-intensive industries such as data centres, while also supporting urban expansion.
Developed over a decade at James Cook University, RegenAqua uses native macroalgae to remove nutrients such as nitrogen and phosphorus from wastewater before the treated water is disinfected using UV light. The nutrient-rich algae is then harvested and can be repurposed into products such as fertiliser, with Pacific Bio also researching ways to convert used algae into biofuels.
One full-scale RegenAqua facility already operates amid cane fields in Burdekin Shire, southeast of Townsville in Queensland, making it among the first in the world to use native macroalgae to clean wastewater at this scale.
Bastounas said the company is now expanding along the south-eastern seaboard, including NSW and Tasmania, and is working with Sydney Water on a Pacific Bio pilot facility that has been operating at Picton in NSW for almost three years.
In addition to producing reusable water that could be used for data centre cooling, Bastounas says RegenAqua is cheap and quick to build and uses far less energy than conventional treatment plants, with an operational facility able to be delivered in around 12 months.
“This is a massive opportunity. We’re harnessing nature to take the waste of humanity and deliver a low-cost solution for the environment, consumers and ratepayers,” he told The Zero Planet.
A cool change
The surge in AI development is also driving innovation in cooling systems and power efficiency, as Knight Frank wrote in its 2025 global data centres report.
The report highlights a range of advanced cooling technologies aimed at managing the rising heat generated by AI and high-performance computing. These include chip-level liquid cooling, used in Microsoft’s August 2024 design to improve efficiency while eliminating water use; waterless cooling, deployed at Edged Data Centres in Kansas City; and immersion cooling, which submerges servers in dielectric fluid to allow year-round, compressor-free operation and lower overall energy demand.
“To meet the evolving demands of today and the future, data centres are undergoing a fundamental transformation in infrastructure design, with a strong emphasis on efficiency, sustainability and performance,” the report said.
Firmus Technologies late last year also outlined plans to use advanced cooling systems at its AI data centre project in Tasmania, describing “advanced, liquid-everywhere data halls” that combine immersion, single-phase and two-phase cold-plate cooling platforms.
Going underground
Another emerging approach is to locate data centres below ground, repurposing abandoned mines, tunnels and bomb shelters.
Finland has become a global testbed, hosting underground data centres in disused mines and shelters beneath cities – including Helsinki. Deployments began in the mid-2010s and have accelerated over the past decade.
One underground data centre in Espoo already heats 40,000 homes, and another warms Helsinki’s cathedral district, where waste heat from the data centre servers is captured and fed into district heating networks.
In this scenario, cooling becomes cheaper for operators while cities reduce heating costs and emissions. Underground sites also minimise visual impact, land conflicts and community opposition, say project advocates.
“The decision to invest in a datacenter region that also provides surplus heat to our cities and homes is a win-win,” former Finnish Prime Minister Sanna Marin said at the project’s launch in 2022.
“I also hope that this collaboration can serve as a model to other countries and cities looking to achieve the double transformation of climate neutrality and digital competitiveness.”
Under the ocean
Oceans offer a vast, naturally cold heat sink, and several countries and companies are now exploring how to harness it.
China launched the world’s first commercially operational underwater data centre off the coast of Hainan in 2023, with facilities submerged about 35 metres below the surface and housing racks of hundreds of servers. The servers sit inside sealed capsules on the seabed, which its operators say dramatically reduces cooling energy requirements and freshwater use.
Microsoft’s Project Natick was a research initiative designed to test the feasibility of subsea data centres powered by offshore renewable energy. Prototype modules were submerged off the coasts of the United States and Scotland between 2015 and around 2024.
The project explored the benefits and challenges of deploying subsea data centres worldwide, and Microsoft said it found that underwater servers tended to fail less often than land-based counterparts while using less energy.
It said other advantages of the underwater model included a minimal land footprint, proximity to coastal users and reduced community opposition. Challenges remained, however, including potential impacts on marine ecosystems, maintenance complexity and high upfront costs.
Floating in the sea
Some operators are avoiding the complexity of the underwater model by putting data centres on barges. These floating facilities take in water from a river, canal or harbour and work by transferring heat from servers to the surrounding water, returning it slightly warmer and without consuming freshwater.
Advocates say the benefits of floating data centres include fast deployment timelines, minimal land use, and zero freshwater usage, offering a major advantage in water‑stressed regions.
This approach also does away with the need for cooling towers, chillers and water treatment chemicals, significantly reducing energy use for cooling compared with conventional facilities.
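The water-to-water heat transfer described above can be sketched with a simple energy balance: the warmer the returned water is allowed to get, the less of it the facility needs to draw in. The numbers below (a 5 MW IT load and a 4 °C temperature rise) are illustrative assumptions, not figures from any of the projects mentioned.

```python
# Rough energy balance for once-through water cooling on a floating data centre.
SPECIFIC_HEAT_WATER = 4186  # J/(kg·K), specific heat capacity of water

def cooling_flow_kg_per_s(it_load_watts: float, delta_t_kelvin: float) -> float:
    """Mass flow of water needed to carry away it_load_watts of server heat,
    if the water is returned delta_t_kelvin warmer than it was taken in."""
    return it_load_watts / (SPECIFIC_HEAT_WATER * delta_t_kelvin)

# Hypothetical 5 MW barge returning water 4 °C warmer:
flow = cooling_flow_kg_per_s(5_000_000, 4.0)
print(f"~{flow:,.0f} kg/s of river water")  # roughly 299 kg/s (about 299 litres/s)
```

The calculation also shows the trade-off operators face: halving the allowed temperature rise doubles the intake flow, which is why discharge-temperature limits set by port and river authorities shape how these facilities are sized.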
Companies are still in the early stages of testing the feasibility of this approach. In the early 2010s, Google built a fleet of four large ocean-going barges that it described as floating high-tech showrooms. Around this time it was also reported that the company held a water-based data centre patent.
The Google barges were docked in US ports including San Francisco Bay and Portland, Maine. While some industry watchers at the time speculated that the barges were floating data centres, the project was ultimately cancelled.
A current commercial example of this technology is a floating data centre from Nautilus Data Technologies, located on a barge in the Port of Stockton, California. It uses river water to provide cooling and reduce power use, with no refrigerants or wastewater. Nautilus is also working on floating data centre projects in Marseille, France, and with AltaSea at the Port of Los Angeles.
AI in space
What once may have sounded like science fiction is now attracting big investment, with tech giants and startups exploring orbital data centres powered directly by the sun.
Companies like SpaceX and Google, through its Project Suncatcher, along with startups like Nvidia-backed Starcloud, are working on space-based data-centre concepts. Billionaire Jeff Bezos's space firm Blue Origin is also said to be planning satellite networks that could serve similar functions.
The appeal of this model lies in continuous solar power, no land or water use and a lower chance of community opposition. The barriers, however, are formidable, including launch costs, the need for radiation-hardened hardware to withstand high-energy particles, and the cooling challenges in a vacuum where heat cannot dissipate through air. Opponents have also raised concerns about space debris and collision risks in increasingly crowded orbits.
Most analysts see space-based computing as a long-term prospect, but early demonstrations are underway, with Starcloud recently launching its first GPU-equipped satellite and running AI computations in orbit.
Data villages
No matter how efficient they become, today’s data-centre architectures will eventually reach a point where they are no longer fit for purpose, opening the door to the next wave of technological innovation, according to the world's largest PC maker, Lenovo.
According to its Data Center of the Future study, released late last year, architects and engineers are increasingly focused on integrating data centres into communities rather than imposing them, with biomimicry and aesthetic design concepts promoted as ways to reduce opposition and create infrastructure that communities can better tolerate.
The study explores the idea of “data villages”, where modular servers are clustered near urban edges or suburban campuses and export waste heat to homes, schools and pools. Building on this concept is the “data spa”, a data centre that integrates into natural landscapes such as valleys, lagoons or geothermal pools.
Supporters of this more tightly integrated data infrastructure model argue that its success will ultimately hinge on the ability of cross-sector regulators to keep pace with the rapidly evolving information age.