Can AI and the environment coexist?

9/20/2024

There are two terms I struggle to reconcile… No, it’s not me and being tanned after returning from vacation — it’s Artificial Intelligence (AI) and the environment.

Yes, I know many actors are working to put AI at the service of the environment, but that’s not what I mean here.

This article focuses on the impact of AI on the environment: how to measure it, how to control it, and what exactly we mean when we talk about frugal AI.

1. The environmental impact of digital technology driven by AI-based systems

As digital technology’s environmental footprint grows, this is an essential topic to examine. On July 2, 2024, Arcep reported that digital technology now accounts for 3–4% of global greenhouse gas emissions, a figure expected to rise sharply (+60% by 2040). It’s no secret: we urgently need to rethink how we approach and use digital tools.

And beyond digital technology’s footprint, the use of AI-based systems continues to grow.

  • According to the European Parliament, AI refers to any tool, software, or machine designed to “reproduce human-related behaviors, such as reasoning, planning, and creativity.”
  • AFNOR defines an AI-based system as consisting of an algorithm, data, and the hardware needed to execute it.

These systems may be used for machine learning, image classification, text analysis, and more. Reliable statistics on the number of such systems are scarce, but one widely known example is OpenAI’s ChatGPT.

Launched in November 2022, ChatGPT attracted over 100 million active users in just two months, reaching 180 million by the end of 2023. Several studies have since analyzed AI’s environmental impact, which is no longer in doubt. As AFNOR notes, we must account for everything required to execute these algorithms.

Though these processes feel immaterial, running a single ChatGPT query consumes significant resources. This urgency has led AFNOR to introduce two new definitions:

  • Efficient AI system: one that uses state-of-the-art optimization techniques to reduce hardware and energy requirements and associated emissions, while maintaining sufficient performance.
  • Frugal AI system: one where the necessity of using AI (rather than a less resource-intensive solution) has been demonstrated, good practices are adopted to reduce environmental impact, and usage remains within planetary boundaries.

2. What if other, more frugal algorithms exist?

Today, AI is deployed almost as casually as putting on socks. This makes it all the more important to ask: is AI really necessary here?

As a founder, I’ve often heard colleagues admit they used AI in projects simply as a buzzword to secure funding.

At WedoLow, we advocate questioning the type of algorithm chosen: perhaps another algorithm could meet the need just as effectively while consuming fewer resources. This echoes eco-design principles: define the required service quality early to avoid using a bazooka to kill a fly.

And what about “efficient AI systems”? Research on this subject is highly active, and many optimization techniques are already applied in industrial contexts. Three key examples:

a) Simplifying AI algorithms
A primary optimization lever is arithmetic simplification. Deep learning systems are remarkably resilient to errors introduced by modified arithmetic. This allows the use of integer rather than floating-point operations, reducing execution time and energy consumption. This is the basis of quantization.

However, this requires a clear understanding of the acceptable quality level. For example, the well-known SqueezeNet model classifies images. Its quality metric is classification accuracy: should it be wrong 1% of the time? 0.01%? The answer defines how much data simplification is acceptable. Variants include fixed-point or customized floating-point arithmetic.
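To make the idea concrete, here is a minimal sketch of symmetric int8 quantization in plain Python: floats are mapped onto the integer range [-127, 127] with a single scale factor, and the rounding error stays bounded by half a quantization step. This is an illustration of the general technique, not the specific tooling used at WedoLow; the weight values are made up.

```python
def quantize_int8(weights):
    """Symmetric int8 quantization: map floats onto [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float values from the integer codes."""
    return [v * scale for v in q]

# Toy "weights" standing in for a layer of a trained model.
weights = [0.82, -1.27, 0.05, 0.40, -0.91]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# Rounding error is bounded by half a quantization step (scale / 2).
max_err = max(abs(a - b) for a, b in zip(weights, restored))
```

In a real model the question from above returns: the scale (and bit width) you can afford depends directly on how much classification accuracy you are willing to trade away.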

b) Pruning
Pruning involves removing neural network parameters with little impact on accuracy, thus reducing model size. Large networks can reach hundreds of billions of parameters, but only a subset is truly essential. Weights are usually pruned, either during or after training. Combined with quantization, this is particularly valuable for resource-constrained systems (like embedded devices).
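The simplest form of this idea is magnitude pruning: rank weights by absolute value and zero out the smallest ones, on the assumption that they contribute least to the output. A minimal sketch (the sparsity level and weight values are illustrative):

```python
def magnitude_prune(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with smallest magnitude."""
    n_prune = int(len(weights) * sparsity)
    # Indices sorted from smallest to largest absolute value.
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    pruned = list(weights)
    for i in order[:n_prune]:
        pruned[i] = 0.0
    return pruned

# Prune half of a toy weight vector: the three smallest magnitudes go to zero.
pruned = magnitude_prune([0.5, -0.01, 0.3, 0.002, -0.8, 0.05], sparsity=0.5)
```

In practice the zeroed weights only save memory and computation if a sparse storage format or sparsity-aware hardware exploits them, which is why pruning is often paired with quantization on embedded targets.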

c) Tangled Program Graphs (TPGs)

Another promising approach for constrained environments is Tangled Program Graphs (TPGs). This paradigm shift combines genetic programming and reinforcement learning (“trial and error,” with rewards or penalties).

A TPG consists of:

  • Programs: arithmetic instruction sequences that map input states to real numbers
  • Teams: internal graph nodes
  • Actions: graph leaves

Together, these elements form the graph. TPGs achieve results comparable to state-of-the-art models but require 1–3 orders of magnitude fewer computations and 2–10 orders of magnitude less memory to store inference models.

Their efficiency can be further enhanced by implementing them in high-performance languages like C++ and enabling parallelization.

Conclusion

Several methods and techniques can move us toward more frugal AI systems: quantization, pruning, approximations, and new paradigms such as TPGs (the list is far from exhaustive). These approaches make it possible to align embedded targets with the needs of AI algorithms.

The key, however, is knowing the required output quality. Once defined, new quality/performance trade-offs can be explored using the techniques outlined here.

Sources
[1] Iandola, Forrest N., et al. “SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5 MB model size.” arXiv preprint arXiv:1602.07360 (2016).
[2] Milot, Quentin, et al. “Wordlength Optimization for Custom Floating-point Systems.” DASIP 2024 Proceedings. Springer Nature, 2024.
[3] Brown, Tom B., et al. “Language models are few-shot learners.” arXiv preprint arXiv:2005.14165 (2020).
[4] Kelly, Stephen, and Malcolm I. Heywood. “Emergent tangled graph representations for Atari game playing agents.” EuroGP 2017 Proceedings. Springer, 2017.
[5] Desnos, Karol, et al. “Gegelati: Lightweight artificial intelligence through generic and evolvable tangled program graphs.” 14th DASIP Workshop. 2021.

Ready to optimize your embedded code?

Get started with WedoLow and see how we can transform your software performance