AI (artificial intelligence) is the technology attracting the most attention right now. In particular, ChatGPT, a type of generative AI skilled at natural language processing, is being considered for adoption by large companies, government offices, local governments, and others, and expectations for its practical use are high.
Generative AI carries high expectations from many quarters, but in reality it consumes a large amount of electricity, and concerns have also been raised about its CO2 emissions and water consumption. The issue of "the impact of generative AI on the environment" will eventually become unavoidable. However, using AI does not automatically mean consuming large amounts of electricity; the main factor that determines the power efficiency of AI is the "algorithm" of the AI model.
Recently, the keyword "green AI", referring to AI that helps address environmental problems, has been gaining attention. The revolutionary "Green Micro AI" has a small impact on the environment. Let us introduce its features.
What is generative AI?
Generative AI is AI that learns from large amounts of data such as text, images, and audio, and creates new content based on the patterns and characteristics it finds. Among such systems, ChatGPT, which rapidly became a hot topic from the end of 2022, is said to be bringing about one of the fastest and largest changes in human history, and has attracted attention not only from researchers and developers but also from the general public. This ground-breaking technology, which enables natural interaction in a chat format and opens new possibilities for information gathering, is a representative example of generative AI.
Expectations for generative AI and environmental issues
Generative AI has a wide range of applications, including customer service, marketing, and creativity, and is bringing about major changes in all fields.
On the other hand, it has been pointed out that the rapid spread of generative AI, not just ChatGPT, consumes a large amount of electricity. For example, searching for "generative AI environmental issues" on a search engine turns up many related articles. Deep learning, the technology that supports generative AI, requires large amounts of resources, including electricity. So while advances in AI will brighten the future, consideration for the environment is also needed as the technology progresses.
The actual power consumption of generative AI, including ChatGPT
Generative AI processes a huge amount of information on cloud servers and in data centers, which consumes a large amount of electricity. In particular, large language models (LLMs) built with deep learning have enormous numbers of parameters, and the amount of computation they require has increased dramatically compared to conventional AI. As the number of parameters in an AI model grows, so does the number of GPUs required for inference and training (learning), which in turn increases power consumption.
What is an AI model?
Creating an AI model means finding characteristics and rules in data. In deep learning, the mainstream of AI including generative AI, there are many calculation layers between the input and output of the model, as shown in the diagram, and some models have hundreds of billions of parameters or more.
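As a rough sketch of where those parameter counts come from, the parameters of a fully connected network can be tallied layer by layer (the layer sizes below are made up for illustration and are far smaller than any generative model):

```python
# Hypothetical illustration: parameter count of a small fully connected
# network. Each pair of adjacent layers contributes weights (in * out)
# plus biases (out). Real LLMs stack many such layers and reach
# hundreds of billions of parameters.
def count_parameters(layer_sizes):
    """Sum weights and biases over consecutive layer pairs."""
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out + n_out
    return total

small = count_parameters([784, 128, 10])  # a tiny image classifier
print(small)  # 101770 parameters
```

Even this toy three-layer network has over 100,000 parameters; every one of them participates in the calculations performed during training and inference.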
What is the relationship between training (learning)/inference performed with AI models and power consumption?
AI builds a model through training (learning), then makes inferences with that model and outputs results. When training a deep learning model such as generative AI, a large amount of computation is first performed to adjust the parameters, that is, the connections within the network. When the model built in this way is then used for inference to produce output, the computer again performs a huge amount of computation.
In both training and inference, generative AI performs these calculations while consuming a large amount of power in each calculation layer.
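The "adjusting parameters" step above can be sketched with a toy example. This is a minimal one-parameter gradient descent on made-up data; real training repeats updates like this across billions of parameters, which is where the compute (and power) cost comes from:

```python
# Toy sketch of training: fit y = w * x by gradient descent.
# The data is fabricated so the true answer is w = 2.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w, lr = 0.0, 0.01
for _ in range(1000):
    # Gradient of the squared error, summed over the data
    grad = sum(2 * (w * x - y) * x for x, y in data)
    w -= lr * grad  # one parameter update

print(round(w, 3))  # converges to 2.0
```

Each loop iteration is one round of "adjusting connections"; a large model performs the analogous update over every parameter, for many passes over vastly larger data.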
How does ChatGPT consume power?
ChatGPT generates sentences as natural as a human's thanks to technology specialized in repeatedly predicting the next word (token). The "inference" that generates an answer to a question is performed not on the user's computer or smartphone but on servers in the cloud, and this seemingly simple process of "probabilistically predicting the next word" requires a large amount of computation and consumes a corresponding amount of power.
Also, although there is no official public information on the "training" phase that gave ChatGPT's model its remarkable capabilities, it is easy to imagine that a huge amount of energy was required for the server-side calculations.
Relationship between power consumption, CO2 emissions, and water consumption
Consuming electricity means that greenhouse gases such as CO2 (carbon dioxide) are emitted when that electricity is generated. Furthermore, the cooling systems that remove the heat generated by server operation also require fresh water; for ChatGPT, for example, it is estimated that the servers consume the equivalent of one 500 ml plastic bottle of water for every 20 to 50 questions.
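The electricity-to-CO2 link is simple arithmetic: energy consumed multiplied by the grid's emission factor. The factor below is an illustrative placeholder in the right order of magnitude for many national grids, not an official figure:

```python
# Back-of-the-envelope sketch: electricity use -> CO2 emissions.
# The emission factor is an assumed illustrative value, not official data.
EMISSION_FACTOR_KG_PER_KWH = 0.45  # assumed grid average, kg CO2 per kWh

def co2_kg(energy_kwh, factor=EMISSION_FACTOR_KG_PER_KWH):
    """Estimated CO2 in kg for a given electricity consumption in kWh."""
    return energy_kwh * factor

# e.g. a server rack drawing 10 kW continuously for 24 hours:
print(co2_kg(10 * 24))  # 108.0 kg of CO2 per day under this assumption
```

The real figure depends on the local energy mix; a coal-heavy grid roughly doubles the factor, while a renewables-heavy grid cuts it to a fraction.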
Reference: https://www.businessinsider.jp/post-268695
What companies are expected to do: introduce AI while also addressing ESG issues
Now that AI has evolved to this point, many companies are likely considering introducing some form of AI in order to support their own development and growth.
On the other hand, there is an accelerating trend in which companies are required to proactively disclose ESG* information. This is because ESG information is important for institutional investors in evaluating a company's growth potential and sustainability, and "responsible investment" is required, which means making investment decisions from an ESG perspective. Simply put, responding to ESG is essential for the smooth management of a company.
If a company introduces and operates large-scale AI, it may not necessarily be positive from an environmental (E) perspective. However, despite these concerns, there is no realistic option not to introduce AI. In this dilemma, companies are expected to balance the introduction of AI with contributing to the environment and addressing social issues.
*ESG: Activities such as management that takes into consideration the environment (E: Environment), society (S: Social), and governance (G: Governance).
KIBIT, an environmentally friendly “Green Micro AI”
FRONTEO's "KIBIT" is a unique AI engine that supports companies in solving problems through high-precision text analysis; it is extremely energy efficient and fits an ESG-oriented approach. "Green Micro AI" means AI that is environmentally friendly (= Green) and runs on a simple algorithm with few parameters (= Micro).
This efficiency comes from KIBIT's simple algorithm with few parameters. By making full use of mathematical approaches rather than relying on computationally expensive deep learning, it achieves a lightweight AI with low environmental impact. Its strength is that it delivers high accuracy with the minimum necessary parameters, and it can analyze and learn text at high speed with extremely low power consumption on the CPU of an ordinary PC, rather than on the GPUs used in data centers.
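To illustrate the general idea of text analysis with few parameters on a plain CPU, here is a lightweight linear bag-of-words scorer. This is purely an assumed example; it is not KIBIT's actual (proprietary) algorithm, and the word weights are invented:

```python
# Illustrative only: a linear bag-of-words text scorer. One learned
# weight per vocabulary word is the entire model, so it runs instantly
# on any CPU. This is NOT KIBIT's actual algorithm; the weights here
# are made up for the example.
WEIGHTS = {"breach": 2.0, "lawsuit": 1.5, "evidence": 1.0, "picnic": -1.0}

def score(text):
    """Sum the weights of known words; unknown words contribute 0."""
    return sum(WEIGHTS.get(w, 0.0) for w in text.lower().split())

print(score("Evidence of the breach"))  # 3.0
```

The contrast with deep learning is the point: a model like this has one parameter per vocabulary word instead of billions, so scoring a document costs a handful of additions rather than a GPU-scale matrix computation.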
Three features of KIBIT, an energy-saving AI “Green Micro AI”
1. Comparison of CO2 emissions (unit: lbs)
FRONTEO's AI "KIBIT" produces overwhelmingly small CO2 emissions.
*1 Excerpted from Energy and Policy Considerations for Deep Learning in NLP, College of Information and Computer Sciences University of Massachusetts Amherst (Jun 2019)
*2 Created by FRONTEO from Japan's CO2 emissions and population
*3 Created by FRONTEO using the same calculation method as the paper in *1.
2. Realizes a simple structure
KIBIT is a form of machine learning that utilizes an algorithm different from deep learning, based on the researchers' belief that all phenomena that occur in the world can be expressed using mathematical formulas.
High accuracy is achieved with a simple mathematical approach. KIBIT is approximately 1/400th the size of typical large-scale models used to analyze causal relationships.
Background of KIBIT's growth into "Green Micro AI"
KIBIT is FRONTEO's in-house developed AI engine that was introduced over 10 years ago and has undergone repeated improvements. Its origins were in supporting Japanese companies in US litigation.
In litigation, evidence must be found within a huge amount of data, and KIBIT was developed to solve this problem. Its algorithm, designed to "minimize the effort of expert lawyers and reach the goal as quickly as possible," achieves higher accuracy than deep learning while remaining flexible and consuming less power, which is a weakness of deep learning. This is what makes it a "Green Micro AI."
Promoting harmony between technology and the environment to build a sustainable AI society
KIBIT's low environmental load, lightweight operation, and small training-data requirements make it easy to implement, lowering the hurdles to adoption in society. It is an AI engine for companies that want to fulfill their business mission at a higher level with AI while also managing their activities in an environmentally conscious way from an ESG perspective.
KIBIT applies text analysis based on natural language processing to support experts in a wide variety of fields, including litigation support, forensic investigations, business intelligence, drug discovery, life sciences, and economic security.