New Zealand faces sustainability challenges of Generative AI

AI’s environmental and social impact need to be part of the conversation about the role it should play in New Zealand society (INL Stock Image)

Nathan Cooper and Amanda Turnbull
Hamilton, October 23, 2024

As investment in generative AI continues to grow globally, New Zealand’s government has been implementing its use across the public sector and encouraging businesses to embrace the technology’s potential.

But the environmental, social and governance risks and costs of AI remain under-investigated. In particular, the energy-hungry computations that generative AI requires mean an ever-expanding carbon footprint for this technology at a time when countries are expected to make more ambitious commitments to cut emissions at the upcoming United Nations climate summit (COP29) in November.

We argue that questions around AI’s environmental and social impact need to be part of the conversation about the role it should play in New Zealand society.

Societal costs and risks

Currently, the global use of AI gobbles up as much energy as a small country. This rate is expected to double by 2026. Increasingly sophisticated AI is also projected to double the number of data centres in the next four years.

This ever-expanding reliance on data centres brings sustainability worries. The average data centre already uses about 40% of its power for cooling, often relying on local water supplies.

AI also brings social risks to employees and users. Its capabilities may result in job displacement, and the wellbeing of staff who train AI could be affected by repeated exposure to harmful content.

Governance pitfalls include concerns about data privacy, breaches of copyright laws and AI hallucinations. The latter refers to outputs that sound plausible but are incorrect or irrelevant, yet nevertheless affect decision making.

Data centres already use about 40% of their power for cooling (INL Stock Image)

Why this matters

Like other countries, Aotearoa is rapidly adopting generative AI, from business to the courts, education and the work of government itself.

A recently announced collaboration between Microsoft and Spark Business Group means that New Zealand will enter the hyperscale data centre trend.

Hyperscale data centres allow for vast data processing and storage needs.

Once completed, a new hyperscale cloud region promises to enable New Zealand businesses to scale up locally, all powered by 100% carbon-free energy provided through an agreement with Ecotricity.

For the moment, many environmental and social costs of New Zealand’s growing AI use are being borne elsewhere. The issue for New Zealand is one of global entanglement. Asking ChatGPT a question in New Zealand means relying on overseas data centres, drawing heavily on electricity from their local grids and likely on their water for cooling.

Data centres are scattered across the world and many are located in developing countries. Even where data centres use renewable electricity sources, this diverts supply from other priorities, such as the electrification of public transport.

Ethical Problem

This is ethically problematic because other (often poorer) countries are shouldering the burden of New Zealand’s AI use. It may also be legally problematic.

As a developed country and party to the Paris Agreement, New Zealand is committed to taking the lead in addressing climate change. This means setting progressively more ambitious emissions reduction targets (known as nationally determined contributions).

Last year’s United Nations global stocktake on climate action confirmed that countries’ total efforts are insufficient to limit temperature rise to well below 2°C above pre-industrial levels. New Zealand’s contribution is also falling short.

The tension between the increasing use of generative AI and meeting climate goals is one that Climate Change Minister Simon Watts and his team will be wrestling with as they prepare for next month’s climate summit in Azerbaijan.

To press on with New Zealand’s commitment to address climate change, we need to focus on entangled solutions to deal with the growing environmental, social and governance costs of generative AI.

Digital Sobriety

“Digital sobriety” is a concept that encourages reduced technology use. It is one approach to thinking about the tensions between AI use and its escalating impacts.

This is similar to our approaches to reducing water consumption and waste.

It also involves asking ourselves whether we really need the latest smart device or bigger data plans.

Another potential remedy is to scale down slightly and make use of small language models instead of more data-hungry large language models. These smaller versions use less computational power and are suitable for smaller devices.

Integrating sustainability into AI guardrails would also help to balance some of the environmental impacts of generative AI. Guardrails are filters or rules that sit between the inputs and outputs of the AI to reduce the likelihood of errors or bias. Currently, these safeguards are mainly focused on fairness, transparency, accountability and safety.

As the Paris Agreement acknowledges, adopting sustainable patterns of consumption plays an important role in addressing climate change. Careful thinking now about how we adopt hyperscale generative AI in New Zealand in sustainable ways could help steer the country towards a more responsible relationship with this powerful and swiftly developing technology.

Nathan Cooper is an Associate Professor and Amanda Turnbull is a Lecturer in Law at the University of Waikato. The above article, which appeared in The Conversation, has been published here under Creative Commons.


Indian Newslink
