The Environmental Cost of Data Centres in Australia
Every time you stream a video, ask a chatbot a question, or back up your phone to the cloud, a data centre somewhere is drawing power. Lots of power. And in Australia, data centre energy consumption is growing at a rate that makes the country’s emissions reduction targets harder to hit.
This isn’t an argument against data centres. Modern economies need them. But the environmental costs deserve honest examination, particularly as AI workloads push energy demand higher than anyone projected five years ago.
How Much Energy Are We Talking About?
Australia’s data centre sector consumed approximately 5.5 TWh of electricity in 2025, according to estimates from the Australian Energy Market Operator (AEMO). That’s roughly 2% of national electricity consumption. To put it in perspective, that’s about the combined annual electricity use of a million average Australian homes.
And it’s growing fast. AEMO’s 2024 Integrated System Plan projected data centre electricity demand could reach 8-11 TWh by 2030, depending on the pace of AI adoption and cloud migration. Some industry analysts think even the upper estimate is conservative given the current trajectory of AI compute demand.
The main data centre hubs in Australia are western Sydney (the largest concentration), Melbourne’s inner suburbs, and growing capacity in Brisbane and Perth. Western Sydney alone hosts over 200MW of data centre capacity, with another 500MW+ in various stages of development and approval.
Why AI Changes the Equation
Traditional cloud computing — hosting websites, running databases, storing files — is relatively energy-efficient per unit of useful work. Modern servers are vastly more efficient than the ones from a decade ago.
AI training and inference are different. Training a large language model can consume as much energy as hundreds of Australian homes use in a year. And that’s just the one-off training cost. Inference, actually running the model to answer queries, draws ongoing power that scales with usage.
The explosion of AI applications in 2025-2026 has driven GPU-dense server deployments that draw significantly more power per rack than traditional computing. A standard server rack might draw 7-10kW. An AI-optimised rack with NVIDIA H100 or B200 GPUs can draw 40-100kW. Data centres designed for traditional workloads are running out of power capacity.
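The capacity squeeze follows directly from the arithmetic. A rough sketch, using the article's illustrative per-rack figures and a hypothetical 10 MW IT power budget (not any specific facility):

```python
# Back-of-envelope: how rack power density eats a facility's IT power
# budget. Per-rack figures are the article's illustrative ranges;
# the 10 MW budget is a hypothetical example.

def racks_supportable(it_budget_kw: float, kw_per_rack: float) -> int:
    """Number of racks a fixed IT power budget can host."""
    return int(it_budget_kw // kw_per_rack)

FACILITY_IT_BUDGET_KW = 10_000  # hypothetical 10 MW of IT load

traditional = racks_supportable(FACILITY_IT_BUDGET_KW, 8)   # ~7-10 kW/rack
ai_dense = racks_supportable(FACILITY_IT_BUDGET_KW, 70)     # ~40-100 kW/rack

print(f"Traditional racks: {traditional}")  # 1250
print(f"AI-dense racks:    {ai_dense}")     # 142
```

The same building that hosts over a thousand traditional racks supports only around 140 GPU-dense ones, which is why power, not floor space, is now the binding constraint.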
This is already visible in Australia. AirTrunk, the country’s largest data centre operator, has announced multiple expansions specifically for AI workloads. Equinix, Digital Realty, and NEXTDC are all building or expanding AI-capable facilities.
The Water Problem
Energy gets the headlines, but water consumption is a growing concern. Data centres use water for cooling — either directly in cooling towers or indirectly through the electricity generation process.
In Australia, where water scarcity is a persistent issue, this matters. A mid-sized data centre with evaporative cooling can use 10-25 million litres of water annually. That’s equivalent to the water consumption of several hundred households.
Some operators are shifting to air-cooled or liquid-cooled systems that reduce water use. But these are more expensive and less efficient in hot climates — which is most of Australia.
Western Sydney, where the largest cluster of data centres is located, already faces water stress. The rapid expansion of data centre capacity in the region has drawn scrutiny from local councils and environmental groups.
The Grid Impact
Data centres need reliable, 24/7 power. They can’t tolerate blackouts. This means they typically connect to the grid with high-reliability power feeds and maintain diesel backup generators.
The grid impact is twofold:
Baseload demand. Data centres run continuously, creating steady electricity demand that must be supplied around the clock. In a grid transitioning to variable renewables (solar and wind), this constant demand can be served by renewables during the day but requires gas, batteries, or other dispatchable generation at night.
Concentration. Having hundreds of megawatts of demand concentrated in one area (western Sydney) requires significant transmission infrastructure. This creates bottlenecks and can delay other grid projects.
The irony is that many of the companies building data centres have ambitious renewable energy targets. Microsoft, Google, and Amazon all claim to match their electricity use with renewable energy purchases. But “matching” through certificates isn’t the same as actually running on renewables 24/7. A data centre drawing power at 2 AM is using whatever the grid is producing at 2 AM, regardless of renewable certificates purchased for daytime solar generation.
Google has committed to running on carbon-free energy 24/7 by 2030. That’s a harder and more meaningful target. It requires either on-site generation, dedicated renewable supply with storage, or locating in areas with consistently clean grids.
What Australia Is Doing
The policy response has been mixed.
The NSW Government has designated western Sydney as a Special Activation Precinct for data centres, streamlining planning approvals. Environmental conditions are attached, but critics argue they’re insufficient given the scale of development.
Energy efficiency standards for data centres exist but aren’t mandatory in Australia. The industry self-reports through metrics like Power Usage Effectiveness (PUE), which measures total facility power divided by IT equipment power. A PUE of 1.2-1.4 is typical for modern Australian data centres, meaning cooling and other overheads add 20-40% on top of the IT load (roughly 17-29% of total facility energy). Best-in-class facilities achieve PUE below 1.2.
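The PUE arithmetic is simple enough to sketch. The figures below are illustrative, not measurements from any real facility:

```python
# PUE (Power Usage Effectiveness) = total facility power / IT power.
# A PUE of 1.0 would mean every watt reaches the servers; everything
# above 1.0 is cooling, power conversion, lighting and other overhead.
# Figures are illustrative only.

def pue(total_kw: float, it_kw: float) -> float:
    """Total facility power divided by IT equipment power."""
    return total_kw / it_kw

def overhead_share(pue_value: float) -> float:
    """Fraction of total facility energy NOT delivered to IT equipment."""
    return (pue_value - 1.0) / pue_value

print(f"{pue(1200, 1000):.2f}")        # 1.20
print(f"{overhead_share(1.2):.1%}")    # 16.7%
print(f"{overhead_share(1.4):.1%}")    # 28.6%
```

Note the two ways of quoting the same number: a PUE of 1.4 means 40% extra energy on top of the IT load, but only about 29% of total facility energy, which is why PUE comparisons need a consistent baseline.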
Renewable energy integration varies by operator. Some are signing Power Purchase Agreements (PPAs) with renewable energy projects. Others are installing on-site solar. But the scale of renewable energy needed to genuinely power Australia’s growing data centre fleet is enormous — and competes with other demand for renewable capacity.
The Positive Side
It’s worth noting that data centres also enable emissions reductions elsewhere. Cloud computing allows businesses to consolidate from many inefficient on-premises servers to fewer, more efficient shared data centres. Remote work (enabled by cloud infrastructure) reduces commuting emissions. AI optimisation of logistics, agriculture, and energy systems can deliver emissions savings that exceed the energy cost of running the AI.
Firms building custom AI systems routinely weigh compute costs, both financial and environmental, against the efficiency gains those systems deliver in production. The net environmental impact depends entirely on the specific application.
The question isn’t whether data centres should exist. It’s whether they’re being built and operated as efficiently as possible, powered by the cleanest energy available, and held accountable for their environmental impact.
What Needs to Change
Mandatory efficiency reporting. PUE and water usage effectiveness (WUE) should be publicly reported and regulated, not voluntary.
Grid-aligned renewable targets. “24/7 carbon-free energy” should be the standard, not certificate-matched annual averages.
Water usage limits. Data centres in water-stressed regions should face binding water consumption limits with real enforcement.
Planning that accounts for cumulative impact. Approving data centres individually ignores the cumulative effect on energy grids, water supplies, and local communities. Regional planning needs to account for the full pipeline.
Australia’s data centre sector is growing because demand for digital services and AI is growing. That growth isn’t stopping. But how we build and power these facilities is a choice, and the choices we make in the next few years will determine whether Australia’s data centre boom is compatible with its climate commitments.