Artificial Intelligence, the Environment and Resource Conflict: Emerging Challenges in Global Governance

VOLUME 7

ISSUE 3

June 27, 2025

Artificial intelligence (AI) refers to a system’s capacity to autonomously collect and interpret data from its environment, learn from that data, and apply these insights to inform decision making, problem solving and actions that traditionally require human intelligence.1 At the heart of this technological process are data centres — facilities where AI models are trained, deployed and maintained.2 As AI infrastructures expand rapidly across the globe, they bring into sharper focus the often-overlooked costs of digital innovation, particularly their entanglement with extractive economies and conflict dynamics. For example, in 2024, the construction of a massive AI data centre by Elon Musk’s xAI in Memphis, Tennessee, sparked public backlash over its projected energy consumption, water use and environmental impact.3 Meanwhile, in the Democratic Republic of the Congo (DRC), the mining of cobalt — essential for powering AI hardware — continues to exacerbate environmental degradation and social instability, revealing the darker ecological and human costs of the digital revolution.4 Because critical minerals form the backbone of data centre infrastructure, the surging demand for these resources positions AI as a key driver of “digital extractivism,”5 reproducing colonial logics of exploitation that Abeba Birhane describes as “algorithmic colonization.”6

Through a critical analysis of scholarly literature, policy documents, media reporting and energy consumption data, this paper interrogates the evolving nexus between AI, environmental sustainability and conflict dynamics. Central to this inquiry are two pressing questions: Can AI fulfill its transformative potential without exacerbating environmental tensions and resource-driven conflict? And what are the implications for global AI governance? This paper argues that while AI technologies have revolutionized communication, commerce and innovation, they have also significantly expanded the environmental footprint of digital systems. By centring generative AI, recognized as the most resource-intensive form of AI due to its high energy demands, this paper foregrounds urgent concerns about the sustainability of emerging digital infrastructures.7

Accordingly, the paper advocates for the establishment of global governance frameworks that embed sustainability considerations across the entire AI lifecycle — from hardware manufacturing to algorithmic deployment. These frameworks must mandate robust energy disclosure, minimize the ecological footprint of AI systems and address AI’s indirect contribution to resource conflict. Without deliberate action, the AI revolution risks entrenching the very environmental and geopolitical crises it promises to alleviate.

AI and the Environment

The Fourth Industrial Revolution — an era of technological transformation characterized by the convergence of digital, physical and biological systems — is poised to significantly reshape global industries, as digital technologies increasingly permeate all aspects of human life.8 Within this broader transformation, AI has emerged as a key driver of both opportunity and concern. Recent research has examined AI’s contribution to environmental monitoring, particularly in improving the accuracy of disaster forecasting, pollution source identification, and air and water quality assessments.9 In the realm of natural resource governance, AI shows promise as a tool for addressing complex challenges related to environmental management, conservation and recycling systems. Christine Chan and Guo Huang, for instance, analyze how AI technologies, including expert systems, fuzzy logic and neural networks, are being applied to minimize and mitigate pollution.10 Their study highlights the potential of these techniques to optimize decision-making processes and advance sustainable environmental governance.

An emerging challenge in the field of environmental governance concerns the actual and potential effects of AI technologies on ecological systems. Increasingly, there is recognition that AI advancements depend on vast computational infrastructure and prolonged energy use, particularly during the training of complex algorithms. As Lauren Leffer observes, “every online interaction relies on a scaffolding of information stored in remote servers — and those machines, stacked together in data centres worldwide, require a lot of energy.”11 The growing scale and intensity of AI development thus raise urgent questions about its compatibility with global sustainability goals.

As Figure 1 shows, the pace of innovation in AI has been remarkable, but a critical and often overlooked concern is the significant energy consumption involved in training large AI models. As Lakshmi Varanasi observes, “the computational intensity of AI platforms such as ChatGPT is primarily attributable to their high electricity demands.”12 Varanasi cites Elizabeth Kolbert’s report in The New Yorker, which estimates that ChatGPT consumes more than half a million kilowatt-hours of electricity per day to process more than 200 million user requests. The fundamental concern raised by Kolbert is how humanity can achieve net-zero emissions while continually developing technologies that dramatically increase energy consumption.13 Such concerns underscore the imperative of global governance frameworks that mitigate the environmental impact of energy-intensive AI systems, which, if left unregulated, risk accelerating the climate crisis.

Figure 1: AI Power Consumption and Share of Total Data Centre Consumption Worldwide in 2023, with Forecasts

Source: Statista, 2023.
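To put these figures in perspective, a back-of-the-envelope calculation using only the two numbers Kolbert cites (roughly half a million kilowatt-hours per day across roughly 200 million requests) implies an average of about 2.5 watt-hours per request. The short Python sketch below makes that arithmetic explicit; it is illustrative only, since both inputs are rough estimates rather than measured values.

```python
# Back-of-the-envelope estimate of ChatGPT's energy use per request,
# using the rough figures cited by Kolbert (illustrative only).
DAILY_ENERGY_KWH = 500_000       # ~0.5 million kWh per day (cited estimate)
DAILY_REQUESTS = 200_000_000     # ~200 million user requests per day (cited estimate)

wh_per_request = DAILY_ENERGY_KWH * 1_000 / DAILY_REQUESTS
annual_twh = DAILY_ENERGY_KWH * 365 / 1e9

print(f"~{wh_per_request:.1f} Wh per request")         # ~2.5 Wh per request
print(f"~{annual_twh:.2f} TWh per year at this rate")  # ~0.18 TWh per year
```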

Figure 2 illustrates a significant rise in Google’s energy consumption over the past five years, increasing from 12.8 terawatt-hours in 2019 to 25.9 terawatt-hours in 2023. This surge is largely attributed to the global expansion of the company’s data centres. In response, Google reports having implemented a range of energy efficiency measures, including the deployment of customized high-performance servers, smart temperature and lighting systems, advanced cooling technologies, and machine learning algorithms to optimize energy use.14 While such initiatives suggest a commitment to sustainability, they exist in tension with a deeper set of structural critiques concerning the trajectory of AI development itself.

Figure 2: Energy Consumption of Google, Financial Years 2011-2023 (in gigawatt hours)

Source: Statista, 2024.
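For context, the endpoint values reported above imply that Google’s energy consumption roughly doubled in four years. The minimal sketch below computes the implied compound annual growth rate from those two figures alone; it is illustrative and assumes nothing beyond the numbers already cited.

```python
# Implied compound annual growth rate (CAGR) of Google's energy consumption,
# using the 2019 and 2023 values cited above (illustrative only).
energy_2019_twh = 12.8
energy_2023_twh = 25.9
years = 2023 - 2019

cagr = (energy_2023_twh / energy_2019_twh) ** (1 / years) - 1
print(f"Implied growth rate: {cagr:.1%} per year")  # roughly 19% per year
```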

One of the most foundational and prescient critiques of AI systems prior to the emergence of ChatGPT is articulated in the seminal work of Emily Bender and her colleagues, which foregrounds the ecological externalities associated with training large language models.15 Similarly, a study by Pengfei Li and colleagues reveals that “training large-scale models such as GPT-3 in Microsoft’s U.S.-based data centres can consume approximately 5.4 million liters of water in total.”16 The study further demonstrates that generating approximately “10 to 50 medium-length responses using the GPT-3 model can result in the consumption of a 500 milliliter bottle of water.”17 Recent estimates indicate that composing a 100-word email using GPT-4 consumes approximately 519 millilitres of water, primarily for data centre cooling.18
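Restated on a per-interaction basis, these figures imply a wide range of water footprints. The brief sketch below simply divides the cited numbers; it is illustrative only, since the underlying studies differ in model, location and what they count as water consumption.

```python
# Rough per-interaction water footprints implied by the figures cited above
# (illustrative only; the underlying estimates differ in scope and methodology).
BOTTLE_ML = 500                      # one 500 ml bottle of water (Li et al. estimate)
RESPONSES_LOW, RESPONSES_HIGH = 10, 50

low_ml = BOTTLE_ML / RESPONSES_HIGH  # if 50 medium-length responses share one bottle
high_ml = BOTTLE_ML / RESPONSES_LOW  # if only 10 responses share one bottle
print(f"GPT-3: roughly {low_ml:.0f}-{high_ml:.0f} ml of water per medium-length response")

EMAIL_ML = 519                       # 100-word email with GPT-4 (cited estimate)
print(f"GPT-4: roughly {EMAIL_ML} ml of water per 100-word email")
```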

These environmental costs are not isolated but are structurally embedded within the cloud infrastructure operated by tech companies. For example, Figure 3 shows that Amazon Web Services, Microsoft Azure, and Google are leading global vendors in the Infrastructure as a Service (IaaS) sector. As the chart indicates, Google Cloud Platform has been widely adopted by tech companies for cloud infrastructure, solidifying Google’s position as a dominant market player. According to Lucía Fernández, Google’s prominence is reflected in the proportion of tech firms that were utilizing its cloud services as early as 2018.19 However, this widespread adoption raises significant environmental concerns, as the vast data centres underpinning Google’s cloud infrastructure require substantial energy resources. As a major provider powering the expanding AI sector, Google’s energy demands underscore the broader carbon impact of the tech industry and reinforce the urgent need for sustainable innovation in green cloud infrastructure. As Steven Monserrate starkly puts it, “the Cloud now has a greater carbon footprint than the airline industry,” noting that “a single data centre can consume as much electricity as 50,000 homes.”20

Figure 3: Cloud Infrastructure as a Service (IaaS) Vendors in Use in Organizations Worldwide as of 2018

Source: Statista, 2022.

Recently, researchers have paid increasing attention to the rapid expansion of machine learning models and the growing concerns surrounding their environmental impact, particularly their carbon footprint.21 A seminal study by Emma Strubell, Ananya Ganesh and Andrew McCallum demonstrates that the training and development of large AI models generate significant environmental costs, largely due to the carbon emissions associated with powering tensor processing hardware.22 Similarly, David Patterson and colleagues have attempted to quantify the energy consumption and carbon footprint of recent large-scale AI systems. Their findings indicate that the energy required to train a machine learning model is shaped by multiple factors, including the algorithm used, the efficiency of its implementation, the number and power of processors, the data centre’s energy and cooling efficiency, and the carbon intensity of the energy supply.23 Sophia Chen notes that as of 2024, data centres accounted for approximately 1.5 percent of global electricity consumption, and this figure is expected to double by 2030 due to the intensive computational demands of models such as GPT-4.24
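The factors Patterson and colleagues identify can be combined into a simple estimate: training energy is approximately the product of training time, processor count and average processor power, scaled by the data centre’s power usage effectiveness (PUE), and emissions follow from multiplying that energy by the carbon intensity of the electricity supply. The sketch below is a simplified formulation of that relationship with hypothetical inputs; the function name and all numbers are placeholders chosen for illustration, not measurements of any particular model.

```python
def estimate_training_emissions(
    training_hours: float,        # wall-clock training time
    num_processors: int,          # accelerators used (GPUs or TPUs)
    avg_power_watts: float,       # average power draw per accelerator
    pue: float,                   # data centre power usage effectiveness
    grid_kgco2e_per_kwh: float,   # carbon intensity of the electricity supply
) -> tuple[float, float]:
    """Rough energy (kWh) and emissions (kg CO2e) for a single training run."""
    energy_kwh = training_hours * num_processors * avg_power_watts / 1_000 * pue
    emissions_kgco2e = energy_kwh * grid_kgco2e_per_kwh
    return energy_kwh, emissions_kgco2e


# Hypothetical example: 500 accelerators drawing 300 W each for two weeks,
# in a data centre with a PUE of 1.1, on a grid emitting 0.4 kg CO2e/kWh.
energy, emissions = estimate_training_emissions(24 * 14, 500, 300.0, 1.1, 0.4)
print(f"~{energy:,.0f} kWh and ~{emissions:,.0f} kg CO2e")
```

The same structure helps explain why published estimates diverge so widely: each factor varies by orders of magnitude across models, hardware generations, data centres and electricity grids.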

There is growing concern about accurately estimating the overall energy impact of AI technologies, particularly given the lack of transparency among major technology firms regarding their energy consumption. Recent estimates by Alex de Vries suggest that the AI sector could consume between 85 and 134 terawatt-hours of electricity annually by 2027 — comparable to the annual electricity usage of countries such as Argentina, the Netherlands and Sweden, and equivalent to approximately 0.5 percent of global electricity demand.25 In a related study reported by MIT Technology Review, researchers at the University of Massachusetts Amherst conducted a lifecycle assessment of several large AI models and found that “the training process can emit more than 626,000 pounds of carbon dioxide equivalent, nearly five times the lifetime emissions of the average American car.”26

This staggering environmental cost underscores the importance of examining data centres. Chandana Patnaik, citing data from the Institute of Electrical and Electronics Engineers, reveals that cooling systems alone account for 50 percent of a data centre’s total energy consumption, with servers and storage contributing 26 percent, power conversion systems 11 percent, network hardware 10 percent and lighting three percent (see Figure 4).27

Figure 4: Data Centre Energy Consumption Breakdown

Source: Extracted from Data Center Knowledge.

According to data compiled by Brightlio researchers, there were approximately 11,800 data centres operating worldwide as of March 2024.28 A March 2025 ranking of countries by number of data centres reveals that the United States leads by a significant margin, followed by Germany and the United Kingdom (see Figure 5).29 With the exponential rise in AI workloads and the proliferation of connected devices, data centres are expected to account for an increasingly substantial share of global electricity consumption and carbon emissions.30 It is imperative, therefore, that we accord greater attention to the environmental impacts of AI deployment.

Figure 5: Leading Countries by Number of Data Centres

Source: Statista, 2025.

AI and Resource Conflict

Natural resources underpin nearly every aspect of modern life, and they have become a foundational driver of the AI revolution. From the cars we drive to the smartphones and computers that enable cross-border interaction and remote problem-solving, AI technologies are increasingly embedded in the tools we rely on daily. The rapid development and integration of AI systems have intensified demand for specialized electronic components, including graphics processing units, central processing units and memory chips. Recent studies indicate that the hardware enabling AI development, such as data storage systems and servers, depends heavily on the extraction of rare earth elements and other non-renewable resources.31 While critical minerals are indispensable for the scalability and performance of AI infrastructure,32 their extraction is highly concentrated in ecologically fragile and politically unstable regions of Africa, Latin America and Asia. Despite the growing reliance on AI technology, empirical research explicitly linking its expansion to conflict dynamics remains limited.

An emerging challenge associated with the projected energy demands of AI technologies is their contribution to natural resource depletion, particularly fossil fuels, which remain a dominant source of global electricity generation. As Varanasi notes, the widespread adoption of generative AI could substantially increase global energy consumption, potentially exceeding the annual usage of entire countries.33 This concern is further compounded by the reliance of AI-enabled devices and data infrastructures on the extraction of critical minerals, including cobalt, lithium, nickel, copper, silicon, aluminum, and rare earth elements such as neodymium and dysprosium.34 As global demand for electronic and computing hardware intensifies, so too does competition for these finite resources, exacerbating environmental degradation through deforestation, water pollution and biodiversity loss. According to the World Bank, the accelerating demand for critical minerals represents a significant vulnerability in the transition to digital and low-carbon economies.35

In response to these mounting concerns, recent scholarly interventions have begun to critically interrogate the multiple dimensions of AI systems. Kate Crawford, for instance, foregrounds the material and planetary infrastructure of AI,36 while Salvador Regilme frames AI as a vehicle of “digital colonialism.”37 Paola Ricaurte extends this critique by arguing that dominant data regimes reproduce colonial hierarchies by erasing Indigenous ways of knowing.38 Similarly, Sábëlo Mhlambi and Simona Tiribelli expose the limitations of prevailing AI ethics frameworks, particularly their failure to account for the relational and contextual dimensions of harm.39 These theoretical critiques are further substantiated by empirical evidence documenting AI’s reliance on extractive supply chains.

For example, Figure 6 illustrates the reliance of major tech companies such as Google, Apple, Meta, Amazon and Microsoft (GAMAM) on conflict minerals sourced from African countries.40 As Florian Zandt observes in his analysis of the 2023 Conflict Minerals Report, Amazon’s operations are implicated in mineral sourcing from African countries, such as the DRC and South Sudan, where armed groups leverage resource extraction to sustain violent conflict.41 Similarly, Apple, Google, Meta and Microsoft have acknowledged that smelters in their supply chains may process minerals originating from conflict-affected regions, although the scope and specificity of sourcing remain unclear.42

Figure 6: Big Tech’s Reliance on Conflict Minerals: Number of Countries GAMAM Potentially Sourced Conflict Minerals from in 2023, by Risk Level

Source: Statista, August 2, 2024.

Beyond the geopolitical entanglements of conflict minerals, the material architecture of AI systems depends heavily on rare earth elements such as erbium, europium, gadolinium, holmium, lanthanum and terbium, which are essential to the production of semiconductors, optical fibres, hard disk drives and capacitors.43 Both the extraction and processing of these elements can exacerbate environmental degradation, including soil and water contamination and elevated carbon emissions.44 The implication is that while mineral extraction has contributed to economic growth and development in many countries, it has also produced significant negative externalities, particularly in regions marked by poverty, weak regulatory oversight and political instability.45

Since the 1990s, large-scale commercial gold mining has sparked widespread opposition globally, often manifesting in land disputes, environmental degradation and violent conflict.46 A notable example is the civil war in Sierra Leone during the 1990s, which was largely fuelled by competition over the country’s diamond resources.47 Similar patterns of mining-induced conflict have been documented in Afghanistan, the Central African Republic, China, Ghana and Mozambique.48 In such contexts, mineral exploitation often fails to benefit local communities, instead enriching companies that rely on these resources to power data centres and sustain global supply chains increasingly driven by AI technologies. Despite growing concerns about the ecological footprint of AI systems, the persistent lack of transparency in mineral extraction and trade data continues to impede corporate accountability and environmental justice.49

These extractive dynamics are mirrored in the rising resistance to water-intensive data centres. In Castilla‑La Mancha, Spain, Meta’s proposed €1 billion data centre project, expected to consume approximately 665 million litres of water annually, has sparked tensions with local farmers alarmed by the strain on water resources.50 Across North America, rising tensions between Big Tech companies and local communities reflect broader concerns over ecological degradation, as the rapid expansion of water-intensive data centres strains local resources while delivering limited community benefit.51 In early 2025, for example, the Sturgeon Lake Cree Nation in Canada opposed Kevin O’Leary’s proposed US$70 billion “Wonder Valley” AI data centre near Grande Prairie, Alberta, citing environmental threats to land and water resources.52

AI and Global Governance Innovations

In an increasingly interconnected world, where rapid technological advancements driven by AI intersect with mineral extraction and escalating environmental crises, the imperative for global AI governance has never been more urgent. A global survey of 84 AI ethics documents, conducted by Anna Jobin, Marcello Ienca, and Effy Vayena, reveals widespread consensus around principles such as transparency, justice and accountability.53 Christian Djeffal contributes to this discourse by proposing the Sustainable AI Development (SAID) model, which emphasizes participatory and legally grounded AI governance centred on access to justice and environmental sustainability.54 A recent study by Marie Francisco deepens the analysis by interrogating how global environmental governance institutions conceptualize and deploy AI.55 This foundational knowledge underscores the imperative for AI governance frameworks that bridge normative principles with demands for planetary accountability.

This paper proposes an Artificial Intelligence Governance Framework comprising three strategic pillars, as presented in Figure 7. The schematic illustrates how regulatory norms, transparency norms, and conflict prevention norms operate as mutually reinforcing pillars essential to a coherent framework for global AI governance.

Figure 7: Artificial Intelligence Governance Framework

Source: Author.

First, AI governance must prioritize the development and enforcement of regulatory norms that hold major technology firms accountable for their environmental impacts. These norms should be supported by robust global enforcement mechanisms to ensure that companies address the environmental externalities of AI technologies, particularly those linked to energy-intensive computation and data centre operations.

As the energy footprint of AI continues to grow, innovations that reduce computational demand and environmental cost are becoming critical benchmarks for sustainable development.56 Aili McConnon observes that DeepSeek-R1, developed by Chinese startup DeepSeek, has disrupted the global AI landscape by achieving GPT-4-level performance in mathematical and coding tasks with 96 percent lower training costs, using only a fraction of the computing resources typically required by US tech giants.57 This trend is part of a broader race toward smarter, more power-efficient AI systems, driven in part by China’s unique regulatory and geopolitical imperatives.58 In response to US export controls restricting access to high-performing AI chips, Chinese developers have adopted innovative strategies to sustain competitive productivity, exemplified by DeepSeek’s development of the R1 model. While this model challenges assumptions about the environmental cost of AI, there is a risk that its energy-efficiency claims may be used to greenwash AI development.59 The concern is not with the technical efficiency of DeepSeek-R1 itself, but with how such claims may be leveraged to sidestep deeper environmental responsibilities and obscure ongoing reliance on extractive infrastructures.

As extractive industries continue to exploit institutional weaknesses and regulatory gaps in developing regions,60 effective AI governance must include regulatory norms that move beyond performative commitments. These norms should institutionalize enforceable sustainability standards, such as mandating transparency, limiting reliance on unverifiable carbon offsets, and requiring AI systems to be energy-efficient and powered by verifiably renewable sources.

Second, global AI governance must include transparency norms that compel technology firms to disclose energy usage, carbon emissions and other environmental metrics associated with AI development and deployment. A report by the Öko-Institut reveals that carbon offset mechanisms frequently suffer from a lack of transparency, contributing to systematic greenwashing.61

Given the interconnectedness of environmental transparency and ethical sourcing, the proposed Artificial Intelligence Governance Framework incorporates supply chain accountability as a core component of transparent AI governance. Tech giants, such as Alibaba, Amazon, Apple, Google, Meta, Microsoft, Netflix, NVIDIA and Tesla, must be subject to international reporting standards to facilitate independent evaluation of their environmental footprints. Such disclosures would enable policymakers to design evidence-based interventions and accelerate the implementation of sustainable practices. Moreover, global governance must embed enforceable transparency norms throughout the AI lifecycle, including mandates for infrastructures to operate on verifiable, site-specific renewable energy sources, while considering the environmental and human rights externalities associated with the extraction of critical minerals used to power these technologies.
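As an illustration of what such reporting standards might require in practice, the sketch below defines a hypothetical disclosure record covering energy, emissions, water and mineral-sourcing fields. The schema, field names and example values are invented for illustration; they do not correspond to any existing standard or to any company’s actual reporting.

```python
from dataclasses import dataclass, field

@dataclass
class AIEnvironmentalDisclosure:
    """Hypothetical annual disclosure record for one AI facility or model
    (illustrative only; not an existing reporting standard)."""
    reporting_entity: str
    facility_or_model: str
    reporting_year: int
    energy_consumed_mwh: float              # total electricity consumed
    renewable_share: float                  # verifiable, site-specific renewable fraction (0-1)
    scope1_and_2_emissions_tco2e: float     # direct and purchased-electricity emissions
    water_withdrawn_megalitres: float       # cooling and other on-site water use
    critical_minerals_sourced: list[str] = field(default_factory=list)  # e.g., cobalt, lithium
    conflict_affected_sourcing: bool = False  # any smelters linked to conflict-affected regions

# Example record populated with placeholder values.
example = AIEnvironmentalDisclosure(
    reporting_entity="ExampleCloudCo",
    facility_or_model="example-data-centre-01",
    reporting_year=2024,
    energy_consumed_mwh=120_000.0,
    renewable_share=0.62,
    scope1_and_2_emissions_tco2e=18_500.0,
    water_withdrawn_megalitres=410.0,
    critical_minerals_sourced=["cobalt", "lithium", "rare earth elements"],
    conflict_affected_sourcing=True,
)
print(example.reporting_entity, example.renewable_share)
```

Standardizing even this small set of fields would allow independent analysts to compare facilities and to trace whether efficiency gains are offset by absolute growth in energy, water and mineral demand.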

Third, AI governance frameworks must proactively address the externalities of mineral extraction, including corruption, inequality and violent conflict, through AI-enabled conflict prevention and resource monitoring tools.

AI can enhance natural resource governance by leveraging large-scale data and remote sensing technologies to track illicit mining, promote supply chain transparency, and provide early warning signals in fragile contexts. Satellite imagery, for instance, can be combined with machine learning to detect unauthorized mining operations in real time. Initiatives such as the Global Witness partnership with DataKind and the Natural Resource Governance Institute demonstrate the feasibility of such approaches.62 Replicating these efforts across resource-rich, conflict-prone regions could enable AI to serve as both a catalyst for development and a safeguard for vulnerable communities.
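A minimal sketch of how such monitoring might work is shown below. It assumes pre-processed satellite image tiles summarized as simple per-tile spectral features and a labelled set of known mining sites, and it trains a generic scikit-learn classifier on synthetic stand-in data. The feature names and thresholds are invented for illustration; this is not a description of the systems used by Global Witness, DataKind or the Natural Resource Governance Institute.

```python
# Minimal sketch: classifying satellite image tiles as 'mining' vs 'non-mining'
# from simple per-tile spectral features. Synthetic data stands in for real
# labelled imagery; this illustrates the pipeline, not any deployed system.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Each tile: [mean_red, mean_nir, mean_swir, bare_soil_index] (hypothetical features).
n_tiles = 2_000
features = rng.normal(size=(n_tiles, 4))
# Synthetic labels: tiles with a high bare-soil index and low near-infrared
# reflectance are treated as "mined" for the purposes of this illustration.
labels = ((features[:, 3] > 0.5) & (features[:, 1] < 0.0)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0, stratify=labels
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

In a real deployment, the synthetic features would be replaced by band statistics or learned embeddings from actual imagery, and flagged tiles would feed an alerting workflow for investigators rather than a printed report.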

Conclusion

In recent years, a new strand of scholarship has emerged examining the actual and potential impacts of AI technologies on environmental sustainability and conflict dynamics.63 While the literature on AI ethics continues to expand, research linking AI to environmental and geopolitical risks remains nascent. This paper has demonstrated that AI systems — far from being neutral — are materially embedded within extractive economies and carbon-intensive technological regimes.64 Simultaneously, geopolitical instability is likely to intensify in mineral-rich regions where extractive practices required to sustain the AI economy exacerbate existing vulnerabilities. Widening inequalities in the Global South, where Big Tech corporations benefit disproportionately from AI-driven growth, further compound the risk of social fragmentation and conflict. Mapping the socio-environmental impacts of AI is thus essential to charting a more just and peaceful trajectory in an increasingly AI-mediated world.

This paper concludes by proposing an Artificial Intelligence Governance Framework as a global mechanism for addressing the environmental and social implications of AI. The framework integrates principles of sustainability, equity and accountability, prioritizing ecological resilience and the protection of communities most vulnerable to the externalities of technological innovation.

Acknowledgement

The author expresses gratitude to the two anonymous reviewers for their thoughtful and constructive feedback.

Endnotes

1. See Michael Chui, James Manyika and Mehdi Miremadi, “Where Machines Could Replace Humans—And Where They Can’t (Yet),” The McKinsey Quarterly (2016); Jay Lee, Hossein Davari, Jaskaran Singh and Vibhor Pandhare, “Industrial Artificial Intelligence for Industry 4.0-Based Manufacturing Systems,” Manufacturing Letters 18 (2018): 20–23, https://doi.org/10.1016/j.mfglet.2018.09.002; Georg von Krogh, “Artificial Intelligence in Organizations: New Opportunities for Phenomenon-Based Theorizing,” Academy of Management Discoveries 4, no. 4 (2018): 404–9, https://doi.org/10.3929/ethz-b-000320207.

2. See Nicola Jones, “How to Stop Data Centres from Gobbling Up the World’s Electricity,” Nature, September 13, 2018, https://www.nature.com/articles/d41586-018-06610-y; Petroc Taylor, “Leading Countries by Number of Data Centers Worldwide as of March 2025,” Statista, https://www.statista.com/statistics/1228433/data-centers-worldwide-by-country/.

3. Andrew R. Chow, “Elon Musk’s New AI Data Center Raises Alarms Over Pollution,” Time, September 17, 2024, https://time.com/7021709/elon-musk-xai-grok-memphis/; Dara Kerr, “Elon Musk’s xAI Powering Its Facility in Memphis with ‘Illegal’ Generators,” The Guardian, April 9, 2025, https://www.theguardian.com/us-news/2025/apr/09/elon-musk-xai-memphis; Lora Kolodny, “Elon Musk’s xAI Is Polluting Air in Memphis, Using More Gas Turbines Than Permitted, Advocacy Group Says,” CNBC, April 10, 2025, https://www.cnbc.com/2025/04/10/elon-musks-xai-accused-polluting-air-in-memphis-selc-says-in-letter.html.

4. Global Center on AI and Geopolitics, Geopolitics of AI: Africa’s Role in a Shifting Global Landscape, 2024, https://www.globalcenter.ai/analysis/articles/geopolitics-of-ai-africa-s-role-in-a-shifting-global-landscape.

5. Prabha Kannan, “Neema Iyer: Digital Extractivism in Africa Mirrors Colonial Practices,” Stanford Institute for Human-Centered Artificial Intelligence (HAI), August 15, 2022, https://hai.stanford.edu/news/neema-iyer-digital-extractivism-africa-mirrors-colonial-practices.

6. Abeba Birhane, “Algorithmic Colonization of Africa,” SCRIPTed 17, no. 2 (2020): 389–409, https://script-ed.org/?p=3888.

7. Generative AI is a subset of AI that uses deep learning models to simulate aspects of human cognition by extracting patterns from vast datasets to generate original content, such as text, images, video, and code, in response to user prompts. See Cole Stryker and Mark Scapicchio, “What Is Generative AI?,” IBM Think, accessed June 11, 2025, https://www.ibm.com/think/topics/generative-ai; OpenAI, GPT-4 Technical Report, 2023, https://cdn.openai.com/papers/gpt-4.pdf; Emily M. Bender, Timnit Gebru, Angelina McMillan-Major and Margaret Mitchell, “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?” in Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (FAccT ’21), March 2021, 610–11, https://doi.org/10.1145/3442188.3445922.

8. Mohammad Aazam, Sherali Zeadally and Khaled A. Harras, “Deploying Fog Computing in Industrial Internet of Things and Industry 4.0,” IEEE Transactions on Industrial Informatics 14, no. 10 (2018): 4674–82, https://doi.org/10.1109/TII.2018.2855198; Min Xu, Jeanne M. David and Suk Hi Kim, “The Fourth Industrial Revolution: Opportunities and Challenges,” International Journal of Financial Research 9, no. 2 (2018): 90–95, https://doi.org/10.5430/ijfr.v9n2p90.

9. Shankar Subramaniam, Naveenkumar Raju, Abbas Ganesan, Nithyaprakash Rajavel et al., “Artificial Intelligence Technologies for Forecasting Air Pollution and Human Health: A Narrative Review,” Sustainability 14, no. 16 (2022): 9951, https://doi.org/10.3390/su14169951; David B. Olawade, Ojima Z. Wada, Abimbola O. Ige, Bamise I. Egbewole et al., “Artificial Intelligence in Environmental Monitoring: Advancements, Challenges, and Future Directions,” Hygiene and Environmental Health Advances 12 (December 2024): 1–3, https://doi.org/10.1016/j.heha.2024.100114.

10. Christine W. Chan and Guo H. Huang, “Artificial Intelligence for Management and Control of Pollution Minimization and Mitigation Processes,” Engineering Applications of Artificial Intelligence 16, no. 2 (2003): 75–90, https://doi.org/10.1016/S0952-1976(03)00062-9.

11. Lauren Leffer, “The AI Boom Could Use a Shocking Amount of Electricity,” Scientific American, October 13, 2023, https://www.scientificamerican.com/article/the-ai-boom-could-use-a-shocking-amount-of-electricity/.

12. Lakshmi Varanasi, “ChatGPT Uses 17,000 Times the Amount of Electricity Than the Average US Household Does Daily: Report,” Business Insider, March 9, 2024, accessed April 2, 2024, https://www.businessinsider.com/chatgpt-uses-17-thousand-times-more-electricity-than-us-household-2024-3.

13. Elizabeth Kolbert, “The Obscene Energy Demands of A.I.,” The New Yorker, March 9, 2024, accessed April 2, 2024, https://www.newyorker.com/news/daily-comment/the-obscene-energy-demands-of-ai.

14. Lucía Fernández, “Google Energy Consumption 2011-2023,” Statista, October 11, 2024, https://www.statista.com/statistics/788540/energy-consumption-of-google/.

15. Bender, Gebru, McMillan-Major and Mitchell, “Stochastic Parrots.”

16. Pengfei Li, Jianyi Yang, Mohammad A. Islam and Shaolei Ren, “Making AI Less ‘Thirsty’: Uncovering and Addressing the Secret Water Footprint of AI Models,” arXiv preprint, 2023, https://arxiv.org/pdf/2304.03271, 2.

17. Li, Yang, Islam and Ren, “Making AI Less ‘Thirsty’,” 2.

18. Pranshu Verma and Shelly Tan, “A Bottle of Water per Email: the Hidden Environmental Costs of Using AI Chatbots,” The Washington Post, September 18, 2024, https://www.washingtonpost.com/technology/2024/09/18/energy-ai-use-electricity-water-data-centers/.

19. Fernández, “Google Energy Consumption 2011-2023.”

20. Steven Gonzalez Monserrate, “The Staggering Ecological Impacts of Computation and the Cloud,” The MIT Press Reader, 2022, https://thereader.mitpress.mit.edu/the-staggering-ecological-impacts-of-computation-and-the-cloud/.

21. David Patterson, Joseph Gonzalez, Urs Hölzle, Quoc Le et al., “Carbon Emissions and Large Neural Network Training,” arXiv, 2021, https://arxiv.org/ftp/arxiv/papers/2104/2104.10350.pdf.

22. Emma Strubell, Ananya Ganesh and Andrew McCallum, “Energy and Policy Considerations for Deep Learning in NLP,” in Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (Florence, Italy: Association for Computational Linguistics, 2019), 3645–50.

23. Patterson, Gonzalez, Hölzle, Le et al., “Carbon Emissions.”

24. Sophia Chen, “Data Centres Will Use Twice as Much Energy by 2030 — Driven by AI,” Nature, May 21, 2025, https://www.nature.com/articles/d41586-025-01113-z.

25. Alex de Vries, “The Growing Energy Footprint of Artificial Intelligence,” Joule 7, no. 10 (2023): 2191–94, https://doi.org/10.1016/j.joule.2023.09.004.

26. Karen Hao, “Training a Single AI Model Can Emit as Much Carbon as Five Cars in Their Lifetimes,” MIT Technology Review, June 6, 2019, https://www.technologyreview.com/2019/06/06/239031/training-a-single-ai-model-can-emit-as-much-carbon-as-five-cars-in-their-lifetimes/.

27. Chandana Patnaik, “Data Center Power: Fueling the Digital Revolution,” Data Center Knowledge, March 22, 2024, https://www.datacenterknowledge.com/energy-power-supply/data-center-power-fueling-the-digital-revolution.

28. Brightlio, “170 Data Center Stats for 2025,” accessed June 9, 2025, https://brightlio.com/data-center-stats/#pp-toc-huja89rng0qk-anchor-1.

29. Taylor, “Leading Countries by Number of Data Centers Worldwide.”

30. Patnaik, “Data Center Power.”

31. Ranjith Gundeti, Kaushik Vuppala and Varun Kasireddy, “The Future of AI and Environmental Sustainability: Challenges and Opportunities,” in Exploring Ethical Dimensions of Sustainability and Use of AI (IGI Global, 2024), 346–371, https://doi.org/10.4018/979-8-3693-0892-9.ch017.

32. World Bank, “Minerals for Climate Action.”

33. Varanasi, “ChatGPT Uses 17,000 Times the Amount of Electricity.”

34. International Energy Agency, Global Critical Minerals Outlook 2024, April 2024, https://www.iea.org/reports/global-critical-minerals-outlook-2024.

35. World Bank, “Minerals for Climate Action: The Mineral Intensity of the Clean Energy Transition,” World Bank Group, 2020, https://pubdocs.worldbank.org/en/961711588875536384/Minerals-for-Climate-Action-The-Mineral-Intensity-of-the-Clean-Energy-Transition.pdf.

36. Kate Crawford, The Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence (Yale University Press, 2021).

37. Salvador Santino F. Regilme, “Artificial Intelligence Colonialism: Environmental Damage, Labor Exploitation, and Human Rights Crises in the Global South,” SAIS Review of International Affairs 44, no. 2 (2024): 75–92.

38. Paola Ricaurte, “Data Epistemologies, the Coloniality of Power, and Resistance,” Television & New Media 20, no. 4 (2019): 350–365.

39. Sábëlo Mhlambi and Simona Tiribelli, “Decolonizing AI Ethics: Relational Autonomy as a Means to Counter AI Harms,” Topoi 42, no. 3 (2023): 867–80, https://doi.org/10.1007/s11245-022-09874-2.

40. Florian Zandt, “Big Tech’s Reliance on Conflict Minerals,” Statista, August 2, 2024, https://www.statista.com/chart/32755/number-of-countries-gamam-potentially-sourced-conflict-minerals-from/.

41. Zandt, “Big Tech’s Reliance.”

42. Zandt, “Big Tech’s Reliance.”

43. Walter Leal Filho, Richard Kotter, Pinar Gökçin Özuyar, Ismaila Rimi Abubakar et al., “Understanding Rare Earth Elements as Critical Raw Materials,” Sustainability 15, no. 3 (2023): 1919, https://doi.org/10.3390/su15031919.

44. V. Balaram, “Rare Earth Elements: A Review of Applications, Occurrence, Exploration, Analysis, Recycling, and Environmental Impact,” Geoscience Frontiers 10 (2019): 1285–1303, https://doi.org/10.1016/j.gsf.2018.12.005.

45. Macartan Humphreys, Jeffrey D. Sachs and Joseph E. Stiglitz, “What Is the Problem with Natural Resource Wealth?” in Escaping the Resource Curse, eds. Macartan Humphreys, Jeffrey D. Sachs and Joseph E. Stiglitz (Columbia University Press, 2007).

46. Bettina Engels, “Mining Conflicts in Sub-Saharan Africa: Actors and Repertoires of Contention” (No. 2), GLOCON Working Paper, 2016; Tony Andrews, Bernarda Elizalde, Philippe Le Billon, Chang Hoon Oh et al., The Rise in Conflict Associated with Mining Operations: What Lies Beneath (Canadian International Resources and Development Institute, 2017), 32.

47. Roy Maconachie and Tony Binns, “Beyond the Resource Curse? Diamond Mining, Development, and Post-Conflict Reconstruction in Sierra Leone,” Resources Policy 32, no. 3 (September 2007): 104–15, https://doi.org/10.1016/j.resourpol.2007.05.00.

48. “How Mining Fuels Conflict Across the Globe: From Rubies in Mozambique to Sand in the South China Sea,” Foreign Policy, April 30, 2023, https://foreignpolicy.com/2023/04/30/mining-conflict-minerals-rubies-gold-coal-sand-bauxite/.

49. Sam Leon, “How Can We Use Artificial Intelligence to Help Us Fight Corruption in the Mining Sector?” Global Witness, 2018, accessed March 20, 2024, https://www.globalwitness.org/en/blog/how-can-we-use-artificial-intelligence-help-us-fight-corruption-mining-sector/.

50. Clara Hernanz Lizarraga and Olivia Solon, “Thirsty Data Centers Are Making Hot Summers Even Scarier,” Bloomberg, July 26, 2023, https://www.bloomberg.com/news/articles/2023-07-26/extreme-heat-drought-drive-opposition-to-ai-data-centers.

51. Olivia Solon, “Drought-Stricken Communities Push Back Against Data Centers,” NBC News, June 19, 2021, https://www.nbcnews.com/tech/internet/drought-stricken-communities-push-back-against-data-centers-n1271344; Emilie Rubayita, “Alberta First Nation Voices ‘Grave Concern’ over Kevin O’Leary’s Proposed $70B AI Data Centre,” CBC News, January 16, 2025, https://www.cbc.ca/news/canada/edmonton/alberta-first-nation-voices-grave-concern-over-kevin-o-leary-s-proposed-70b-ai-data-centre-1.7431550.

52. Rubayita, “Alberta First Nation.”

53. Anna Jobin, Marcello Ienca and Effy Vayena, “The Global Landscape of AI Ethics Guidelines,” Nature Machine Intelligence 1, no. 9 (2019): 389–99.

54. Christian Djeffal, “Sustainable AI Development (SAID): On the Road to More Access to Justice,” in Technology, Innovation and Access to Justice: Dialogues on the Future of Law, eds. Siddharth Peter de Souza and Maximilian Spohr (Edinburgh University Press, 2021), 112–30, http://www.jstor.org/stable/10.3366/j.ctv1c29sj0.15.

55. Marie Francisco, “Making Sense of Artificial Intelligence for Global Environmental Governance: Ideas, Power, and Policy Pathways” (PhD diss., Linköping University, 2025), https://liu.diva-portal.org/smash/get/diva2:1955120/FULLTEXT01.pdf.

56. Monserrate, “Staggering Ecological Impacts of Computation and the Cloud”; Li, Yang, Islam and Ren, “Making AI Less ‘Thirsty’.”

57. Aili McConnon, “DeepSeek’s Reasoning AI Shows Power of Small Models, Efficiently Trained,” IBM, January 27, 2025, https://www.ibm.com/blog/deepseek-ai-small-models-efficiency.

58. Jared Cohen, George Lee, Lucas Greenbaum, Frank Long and Wilson Shirley, “The Generative World Order: AI, Geopolitics, and Power,” Goldman Sachs, December 14, 2023, https://www.goldmansachs.com/insights/articles/the-generative-world-order-ai-geopolitics-and-power.

59. Jon Truby, “DeepSeek’s AI Disruption: Implications for Global Climate Policy on Digital Decarbonisation, Energy Transitions and International Law,” Centre for International Law, National University of Singapore, February 8, 2025, https://cil.nus.edu.sg/blogs/deepseeks-ai-disruption-implications-for-global-climate-policy-on-digital-decarbonisation-energy-transitions-and-international-law/.

60. Nathan Andrews and Marcellinus Essah, “The Sustainable Development Conundrum in Gold Mining: Exploring ‘Open, Prior and Independent Deliberate Discussion’ as a Community-Centred Framework,” Resources Policy 68 (2020): 101798. https://doi.org/10.1016/j.resourpol.2020.101798.

61. Martin Cames et al., “How Additional Is the Clean Development Mechanism? Analysis of the Application of Current Tools and Proposed Alternatives,” Institute for Applied Ecology (Berlin), March 2016, https://climate.ec.europa.eu/system/files/2017-04/clean_dev_mechanism_en.pdf.

62. Leon, “How Can We Use Artificial Intelligence to Help Us Fight Corruption in the Mining Sector?”

63. Catrin Misselhorn, “Artificial Morality: Concepts, Issues and Challenges,” Society 55 (2018): 161–69, https://doi.org/10.1007/s12115-018-0229-y; Michael Haenlein and Andreas Kaplan, “A Brief History of Artificial Intelligence: On the Past, Present, and Future of Artificial Intelligence,” California Management Review 61, no. 4 (2019): 5–14, https://doi.org/10.1177/0008125619864925; von Krogh, “Artificial Intelligence in Organizations.”

64. United Nations Environment Programme, “AI has an environmental problem. Here’s what the world can do about that,” UNEP, June 5, 2024, https://www.unep.org/news-and-stories/story/ai-has-environmental-problem-heres-what-world-can-do-about; Hande Yuksel Sen, “The Artificial Intelligence Environmental Impacts Act of 2024: What You Need to Know,” Holistic AI, 2024, https://www.holisticai.com/blog/artificial-intelligence-environmental-impacts-act.

ISSN 2563-674X

doi:10.51644/bap73
