Red AI Versus Green AI

Rory Donovan
January 20, 2023

Overview

An unfortunate byproduct of machine learning and deep learning is the high computational costs that have an adverse environmental impact. We call these wasteful AI systems red AI. But there is a positive alternative – green AI, which has been spearheaded by conscientious programmers who aim to reduce the computational expense resulting from deep learning algorithms.

How Did We Get Here? 

AI researchers are driven to innovate by producing state-of-the-art results. But achieving those results requires enormous quantities of compute – ever more resources poured in to mine incremental gains. This is what produces red AI, and a logarithmic curve describes the most favorable relationship one can realistically expect between computational power and model performance.

But the only way to make significant gains in performance is to create exponentially larger models that require exponential compute, training data, time, and other expensive inputs. In short, the current research tradition for artificial intelligence is subject to the law of diminishing returns.  
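To make the diminishing-returns point concrete, here is a toy calculation (all numbers are invented for illustration) of what a roughly logarithmic relationship between compute and performance implies: every tenfold increase in compute buys about the same small gain in accuracy.

```python
import math

def toy_accuracy(compute_units, base=70.0, gain_per_decade=3.0):
    """Hypothetical accuracy curve: a fixed gain for every factor-of-10 of compute."""
    return base + gain_per_decade * math.log10(compute_units)

for compute in [1, 10, 100, 1_000, 10_000]:
    print(f"{compute:>6} units of compute -> ~{toy_accuracy(compute):.1f}% accuracy")
# Prints 70.0, 73.0, 76.0, 79.0, 82.0: linear gains in return for exponential spend.
```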

Critics of this culture have pointed to AI leaderboards as problematic because they reward AI and deep learning researchers for producing accurate algorithms without any requirement (or equal recognition) for computational efficiency. Leaderboards also ignore other forms of efficiency, such as economic and social efficiency, and the resulting costs exclude many from entering the research world.

Red AI: An Explainer 

First, it is important to know that the cost of an AI system can be measured by the carbon it emits or the energy it uses, relative to the size of the dataset. That matters because model complexity and computational demand grow exponentially while accuracy gains remain incremental. Researchers build bigger and more complex models to achieve higher accuracy and precision – but this comes at a cost in terms of 1) model training cost; 2) dataset size; and 3) the number of experiments needed.

Model Training Cost. The drive toward accuracy and precision can be read as a measure of efficiency, but viewed from the cost side it is also a measure of inefficiency, because the expense is driven by the model’s complexity. Even processing a single example is often expensive in and of itself.

Dataset Size. The second cost can be measured by the size of the training set. One way to ascertain a model’s accuracy is to train and test it on several subsets of a larger data pool, requiring several iterations of an already expensive process. The challenge becomes apparent once one realizes the expense comes not just from processing the data but also from storing it. The law of diminishing returns applies here as well: the size of the dataset drives up the number of model executions, and with them the cost, while failing to deliver a proportional increase in accuracy.

Number of Experiments. The third cost is related to the size and number of experiments. When dealing with algorithms containing hyperparameters, results do not usually reach their desired plateau after a small number of trials. This means that a full exploration of a single model is so cost prohibitive that exploring competing models becomes untenable for many. Tuning hyperparameters also requires several runs of the algorithm, increasing overall runtime, energy usage, and computational demand. The problem compounds with the complexity and size of the models and experiments, because these variables drive the need for repeated trials.
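A back-of-envelope sketch can show how these three drivers multiply. The numbers below are entirely hypothetical, but the structure is the point: per-example cost, dataset size, cross-validation folds, and hyperparameter trials all compound one another.

```python
# All values are assumed purely for illustration.
cost_per_example_sec = 0.002      # seconds of compute to process one training example
dataset_size         = 1_000_000  # number of training examples
epochs               = 10         # passes over the data per run
cv_folds             = 5          # cross-validation folds
hyperparam_trials    = 50         # hyperparameter configurations tried

runs = cv_folds * hyperparam_trials                      # each configuration is retrained per fold
seconds_per_run = cost_per_example_sec * dataset_size * epochs
total_hours = runs * seconds_per_run / 3600

print(f"{runs} training runs, roughly {total_hours:,.0f} compute-hours in total")
# 250 runs at 20,000 seconds each is about 1,389 compute-hours for one model family.
```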

The Flipside: Green AI 

Because research has traditionally skewed toward red AI in the dedicated pursuit of high accuracy, green AI has not been a priority for most. But green AI deserves our support and focus because it addresses the computational expense of running deep-learning algorithms. Unlike red AI, which adds carbon to the atmosphere, green AI takes efficiency into account in its drive toward accuracy.

Another challenge for green AI is the significant financial cost of achieving it, which can prohibit stakeholders like academics, researchers, and students from engaging in deep learning research. As a result, large companies like Amazon, Google, and Microsoft already have a strong foothold in AI.

Some researchers have proposed that ethical approaches to AI should include a greening component that accounts for the efficiency of the algorithmic modeling process – a goal we fully support.

Measuring the Costs of AI 

Green and red AI systems can be measured by parameters related to the system itself as well as to the natural resources that run it. Such parameters include the cost of training, the size of the data, and carbon emissions. These costs can also be expressed as the quantity of resources consumed – carbon, electricity, the model’s runtime, money, and other measurable inputs.

The following subsections describe heuristics for measuring the cost of training a model.

1. Carbon Emissions

This heuristic measures the amount of carbon dioxide, or its byproducts, released into the environment. Carbon emissions are closely related to efficiency: the more efficient an algorithmic process, the less carbon it emits. Efficiency therefore minimizes carbon emissions by reducing the runtime and energy consumption that go into training and testing models.
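As a rough illustration, measured (or estimated) energy use can be converted into an emissions estimate with a grid carbon-intensity factor. The figures below are placeholders; real intensity values vary by region, grid mix, and time of day.

```python
def co2e_kg(energy_kwh, grid_kg_co2_per_kwh=0.4):
    """Estimated kilograms of CO2-equivalent for a given energy use (illustrative factor)."""
    return energy_kwh * grid_kg_co2_per_kwh

training_energy_kwh = 1200.0  # hypothetical energy consumed by a training job
print(f"~{co2e_kg(training_energy_kwh):.0f} kg CO2e emitted")  # ~480 kg with these assumed numbers
```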

2. Electricity

Electricity consumption correlates with carbon emissions, so measuring it gives some evidence of how green an AI system is by indicating how much carbon each run of the algorithm emits into the environment.
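One rough way to get at this number is to sample the accelerator’s power draw during a run and integrate over time. The sketch below assumes an NVIDIA GPU and the pynvml bindings; it is a minimal example, not a complete measurement tool.

```python
import time
import pynvml  # assumes the NVIDIA Management Library Python bindings are installed

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU

samples_watts = []
for _ in range(60):                              # sample once per second for one minute
    milliwatts = pynvml.nvmlDeviceGetPowerUsage(handle)
    samples_watts.append(milliwatts / 1000.0)
    time.sleep(1)

pynvml.nvmlShutdown()

avg_watts = sum(samples_watts) / len(samples_watts)
energy_kwh = avg_watts * len(samples_watts) / 3600 / 1000  # watt-seconds -> kWh
print(f"average draw {avg_watts:.0f} W, about {energy_kwh:.4f} kWh over the sampled minute")
```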

3. Runtime

A model’s speed can be measured by how much time it takes to perform its functions. This is called its runtime, and with all other variables held equal, a model that runs faster is doing less computation, which makes the algorithm more efficient.
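Measuring runtime needs nothing more than a wall-clock timer around the workload. The function below is a hypothetical stand-in for a training step or forward pass.

```python
import time

def workload():
    # Stand-in for a model's training step or forward pass (purely illustrative).
    return sum(i * i for i in range(1_000_000))

start = time.perf_counter()
workload()
elapsed = time.perf_counter() - start
print(f"runtime: {elapsed:.3f} seconds")
```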

4. Parameter Count

A possible measure of an algorithm’s operating cost is the number of parameters it contains. Parameters are weights that are either initially input into the system or learned during training. The parameter count correlates very closely with the work done by the machine, because calculations must be made using each parameter, either to determine (that is, learn or update) other parameters in deeper layers of the model or to produce the outputs that indicate the model’s accuracy. The more parameters an algorithm has, the more work it takes to run from start to finish.
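Counting parameters is straightforward in most frameworks. The sketch below assumes PyTorch and uses a deliberately tiny network; the point is only how the count is obtained.

```python
import torch.nn as nn

# A small illustrative network, not a real model.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

total = sum(p.numel() for p in model.parameters())
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"{total:,} parameters ({trainable:,} trainable)")
# 784*256 + 256 + 256*10 + 10 = 203,530 parameters for this toy model.
```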

5. Floating Point Operations

The number of floating-point operations (FPO) used to deliver an output gives some insight into the cost of deploying a model. It provides a general indication of the computational workload and can be weighted by operation – that is, assigning cost values to addition and multiplication operations to form an index of computational difficulty.
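As a toy example, the weighted FPO of a single dense layer can be estimated by counting its multiplications and additions and assigning each a cost; the weights below are placeholders, not established values.

```python
def dense_layer_fpo(in_features, out_features, add_cost=1.0, mul_cost=1.0):
    """Weighted operation count for y = Wx: roughly one multiply and one add per weight.
    Bias additions are omitted for simplicity; the cost weights are assumed."""
    multiplies = in_features * out_features
    adds = in_features * out_features
    return mul_cost * multiplies + add_cost * adds

print(f"{dense_layer_fpo(784, 256):,.0f} weighted operations")  # about 401,408 for a 784x256 layer
```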

Conclusion 

As developers, we should make green AI a top priority – it is ethical, good for our environment, and, if promoted correctly, could create jobs. The harsh truth is that red AI is contributing to global warming in ways not widely understood by society at large. As developers and managers, we have a responsibility to each other and to the future inhabitants of this Earth. We welcome your thoughts.

Interested in learning more about how to develop ethical AI? Our firm can help you put best practices in place to better serve your customers. Contact us! Quickly develop ethical AI that is explainable, equitable, and reliable with help from our complete AI IaaS. Sign up for FREE diagnostics.
