Make AI Eco-Responsible

The race for artificial intelligence performance now collides with concerns about AI's eco-responsibility. The issue is still emerging, and many of the tools needed to achieve frugal AI are still missing.

At DIMS 2022, CEA's Timothy Sylvester urged those responsible for innovation not to approach disruptive technologies solely from a performance standpoint. These innovations must also take resilience into account, or even contribute directly to it.

This is one of the lessons learned from the health crisis. The climate crisis, in turn, encourages greater consideration of the eco-responsibility of technologies, including artificial intelligence. AI projects increasingly integrate an ethical dimension. Responsible AI, however, goes beyond ethics: it also takes into account data privacy and environmental issues, for example.

Frugal AI: awareness, but a lack of maturity

The Green AI movement is still recent, however, as Olivier Matz, R&D program manager for artificial intelligence at Capgemini Engineering, observes. "Until 2019, AI was characterized by a race for performance and accuracy," he notes.

The Green AI concept was formalized in 2019, notably at the initiative of researcher Roy Schwartz. Related work later measured that training a new NLP algorithm can generate up to 300 tonnes of CO2. "This is equivalent to the emissions of five internal combustion vehicles over their entire life cycle," notes Olivier Matz.

The use of artificial intelligence is therefore not neutral in terms of impact. Moreover, the computing power required to train AI models is growing exponentially: over a span of six years, it has multiplied by 300,000, according to figures cited by the Capgemini expert.
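To give that 300,000-fold figure some intuition, a quick back-of-the-envelope calculation (assuming smooth exponential growth over the six years, which is a simplification) yields the implied doubling time:

```python
import math

# Implied doubling time if training compute grew 300,000x over 6 years,
# assuming smooth exponential growth (a simplifying assumption).
growth_factor = 300_000
years = 6

doublings = math.log2(growth_factor)          # ~18.2 doublings
months_per_doubling = years * 12 / doublings  # ~4 months per doubling

print(f"{doublings:.1f} doublings, one every {months_per_doubling:.1f} months")
```

In other words, the compute budget of the largest training runs doubled roughly every four months over that period.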

Didier Gaultier, director of data science and AI at Business & Decision (Orange Group), also observes a tendency to reach, almost automatically, for the most complex models, which are also the highest CO2 emitters. Deep learning is thus "very often an abyss of energy," he warns, largely because of the huge volumes of training data it requires.

These results and indicators "prove that the development of new AI algorithms is not always efficient or sustainable from an environmental point of view," emphasizes Olivier Matz. This reality calls for awareness, and above all for action. The awareness is there, at least in the academic community.

Cost measurement: an essential step

At the industry level, the implementation of Green AI has begun as part of a more global approach aimed at greater frugality in digital uses and development. The level of maturity, however, remains low. Awareness of AI's environmental issues now needs to be translated into action plans.

At Capgemini Engineering, initiatives in this area are notably grouped within the SusAI project [Editor's note: sustainable AI], an in-house research and innovation program. Its purpose is to reduce the environmental impact of AI. This objective requires exploring several areas of work, the first of which is measurement.

The company is therefore developing tools and benchmarks aimed at measuring the environmental impact of artificial intelligence. These measurements will then make it possible to identify levers for reduction. Their implementation will first require raising awareness and promoting good practices in the AI community.

Capgemini Engineering is developing a tool, in the form of a library, to monitor in real time the consumption of different algorithms in production (computer vision, NLP, voice assistants) and to issue recommendations for reduction.
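Capgemini's library itself is not public, but the core principle behind this kind of monitoring can be sketched in a few lines: sample the average power draw of the hardware over a run, convert it to energy, then to CO2 using the local grid's carbon intensity. All figures below are illustrative assumptions, not measurements from the article.

```python
def co2_from_run(avg_power_watts: float,
                 duration_hours: float,
                 grid_intensity_kg_per_kwh: float) -> float:
    """Estimate the CO2 emissions (kg) of a compute run.

    avg_power_watts: average draw of the machine (GPU + CPU + overhead)
    grid_intensity_kg_per_kwh: carbon intensity of the local grid
    """
    energy_kwh = avg_power_watts * duration_hours / 1000
    return energy_kwh * grid_intensity_kg_per_kwh

# Illustrative comparison: the same 100-hour job at 300 W average draw,
# on a low-carbon grid (~0.05 kg/kWh) vs a carbon-intensive one (~0.7 kg/kWh).
low_carbon = co2_from_run(300, 100, 0.05)   # ~1.5 kg CO2
high_carbon = co2_from_run(300, 100, 0.7)   # ~21 kg CO2
print(low_carbon, high_carbon)
```

The same calculation also illustrates why datacenter location, discussed below, is such an effective lever: the job is identical, only the grid changes.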

A first version has been created and is available internally. This version will be enriched over 2022, then offered commercially in 2023. The following year, Capgemini also plans to expand its scope to cover most AI use cases.

An eco-score for AI algorithms

The company also wants to develop a Green AI certification. In the long run, the aim is to establish a score for AI, similar to the Eco-Score for food or the DPE energy rating in real estate. To take this project forward, Capgemini Engineering will need to enlist the support of partners, including a certification body.

In the short term, however, it is already possible to introduce more frugality into artificial intelligence through relatively simple initiatives. The location of the datacenter is one lever for improving CO2 emissions, as is the choice of GPU hardware.

"Newer GPUs cost more per hour, but their higher performance helps reduce the overall bill. In addition, their energy efficiency is better. It's an easy choice to implement, which makes it possible to save on both the cost and the environmental fronts," Olivier Matz cites as an example.

Reducing environmental impact therefore does not necessarily go hand in hand with increased costs; on the contrary, it can mean substantial savings.

As part of his responsibilities at Business & Decision, Didier Gaultier recently had the opportunity to reflect on these environmental issues with a number of colleagues specializing in data and artificial intelligence. From these exchanges, several ways to improve the energy frugality of AI emerged.

Collect and prepare data upstream

The first track consists of "collecting and preparing upstream data as much as possible, to adapt it to its exploitation by the algorithms." This also involves data augmentation and upstream enrichment through advanced feature engineering.

"The more relevant and well-suited the algorithm's input data, the faster and more easily results can be obtained downstream. The number and relevance of the available features matter far more than the complexity of the algorithms used."

"Thanks to better-prepared data, we can then afford simpler algorithms," which win out in terms of frugality, explainability, robustness, and error detection and correction.
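The idea that richer features allow simpler models can be illustrated with a classic constructed example (not from the article): points on two concentric circles cannot be separated by a linear rule on raw (x, y) coordinates, but a single engineered feature, the squared radius, makes a plain threshold sufficient.

```python
import math

# Toy dataset: inner circle (class 0) vs outer circle (class 1).
points = [(math.cos(t), math.sin(t)) for t in range(8)]           # radius 1
points += [(3 * math.cos(t), 3 * math.sin(t)) for t in range(8)]  # radius 3
labels = [0] * 8 + [1] * 8

# Engineered feature: squared distance from the origin.
def radius_sq(p):
    return p[0] ** 2 + p[1] ** 2

# With this feature, a simple threshold classifies perfectly --
# no complex model needed.
predictions = [1 if radius_sq(p) > 4 else 0 for p in points]
accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)
print(accuracy)  # 1.0
```

A deep network could learn the same boundary from raw coordinates, but at a far higher training cost; the one-line feature makes the problem trivially cheap, and the resulting rule is fully explainable.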

The director of data science and AI also makes other recommendations, including limiting the systematic centralized storage of all available data and favoring, where possible, edge computing or even data mesh approaches.
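The storage recommendation can be sketched with a minimal example (an invented scenario, not from the article): an edge node that summarizes sensor readings locally and transmits only the aggregate, instead of shipping every raw sample to a central store.

```python
# Minimal sketch of edge-side aggregation (illustrative scenario):
# instead of sending every raw reading to a central datacenter,
# the edge node keeps a running summary and transmits only that.

class EdgeAggregator:
    def __init__(self):
        self.count = 0
        self.total = 0.0

    def ingest(self, reading: float) -> None:
        """Process a raw reading locally; the raw value is not stored."""
        self.count += 1
        self.total += reading

    def summary(self) -> dict:
        """The only payload sent upstream: one record instead of N."""
        return {"count": self.count, "mean": self.total / self.count}

node = EdgeAggregator()
for reading in [20.1, 19.8, 20.4, 20.9, 19.6]:  # e.g. temperature samples
    node.ingest(reading)

print(node.summary())  # five readings reduced to a single record
```

The saving compounds: less data transmitted, less data stored centrally, and less data to scan when models are later trained.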

A third area for improvement: "prioritize better-designed architectures, processors and software." Optimizing the hardware and software architecture is an important path, and one that is not unrelated to DevOps and MLOps.

A movement in favor of data-centric AI

Olivier Matz advises data scientists to "choose the right framework for the use case." Notebooks, Python, TensorFlow, PyTorch: these technologies are favored by these experts. But from an environmental point of view, "they are among the worst from an energy standpoint," he says, recommending alternative, optimized frameworks and specific hardware for production, such as OpenVINO and ONNX Runtime.

There are also "more specific" methods of reducing the carbon footprint. Olivier Matz cites pruning the model's weights, or reducing their precision, as examples. Here again, these methods do not degrade performance, while helping to reduce energy costs by up to 10%.
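Reducing weight precision can itself be sketched in a few lines (an illustrative example, not the article's tooling): linearly quantizing 32-bit float weights to 8-bit integers with a shared scale cuts the memory for those weights by 75%, at the cost of a small, bounded rounding error per weight.

```python
# Sketch of linear 8-bit quantization of float32 weights (illustrative).
weights = [0.82, -1.47, 0.03, 2.15, -0.66]

scale = max(abs(w) for w in weights) / 127          # map range onto int8
quantized = [round(w / scale) for w in weights]     # ints in [-127, 127]
dequantized = [q * scale for q in quantized]

# Memory for these weights drops from 4 bytes (float32) to 1 byte (int8)
# each, i.e. a 75% reduction, in exchange for a small rounding error.
max_error = max(abs(w - d) for w, d in zip(weights, dequantized))
print(quantized)
print(f"max rounding error: {max_error:.4f}")
```

Real quantization schemes add refinements (per-channel scales, zero points, calibration), but the memory and bandwidth saving, and with it the energy saving, comes from exactly this substitution of narrow integers for wide floats.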

Frugal AI therefore also involves raising data scientists' awareness of better development practices. This is a novelty for profiles that do not come strictly from software development. To make progress, Olivier Matz finally points to the data-centric AI movement launched by Andrew Ng in 2021.

"It's a paradigm shift. We are no longer centered on the model for deep learning, but on the data," he explains. This translates into significantly lower data consumption for training, for example through low-resolution images, and therefore a reduced footprint for both AI training and deployment.
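The low-resolution example can be quantified with a tiny sketch (illustrative, not from the article): halving each image dimension by 2x2 average pooling divides the pixel count, and hence the per-image storage and compute, by four.

```python
# 2x2 average pooling: halve each image dimension (illustrative sketch).
def downsample(image):
    """image: list of rows of grayscale pixel values, even dimensions."""
    h, w = len(image), len(image[0])
    return [
        [
            (image[r][c] + image[r][c + 1]
             + image[r + 1][c] + image[r + 1][c + 1]) / 4
            for c in range(0, w, 2)
        ]
        for r in range(0, h, 2)
    ]

img = [[10, 20, 30, 40],
       [50, 60, 70, 80],
       [15, 25, 35, 45],
       [55, 65, 75, 85]]

small = downsample(img)
print(small)  # [[35.0, 55.0], [40.0, 60.0]]
# 16 pixels -> 4: a 4x reduction in data per image for training.
```

Whether a model trained on the reduced data still reaches the required accuracy is exactly the trade-off the data-centric approach asks teams to evaluate, rather than defaulting to full resolution.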
