Description
AI, and large generative models in particular, might increase carbon emissions and water usage through their training and operation.
Why is impact on the environment a concern for foundation models?
Training and operating large AI models, building data centers, and manufacturing specialized hardware for AI can consume large amounts of water and energy, which contributes to carbon emissions. Additionally, water resources that are used for cooling AI data center servers can no longer be allocated for other necessary uses. If not managed, these impacts could exacerbate climate change.
Increased Carbon Emissions
According to the source article, training earlier chatbot models such as GPT-3 led to the production of 500 metric tons of greenhouse gas emissions, roughly equivalent to 1 million miles driven by a conventional gasoline-powered vehicle. The same model required more than 1,200 MWh during the training phase, roughly the amount of energy used by a million American homes in one hour.
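The equivalences above can be checked with a rough back-of-envelope calculation. The sketch below is illustrative only and uses assumed constants that are not from the source article: about 404 grams of CO2 per mile for an average gasoline passenger vehicle (a commonly cited EPA estimate) and roughly 10,500 kWh of electricity per year for an average American home (about 1.2 kWh per hour).

```python
# Back-of-envelope check of the equivalences cited above.
# Assumed constants (illustrative, not from the source article):
GRAMS_CO2_PER_MILE = 404                    # average gasoline passenger vehicle
KWH_PER_HOME_HOUR = 10_500 / (365 * 24)     # ~1.2 kWh per US home per hour

emissions_tonnes = 500        # metric tons of CO2 attributed to GPT-3 training
training_energy_mwh = 1_200   # MWh consumed during the training phase

equivalent_miles = emissions_tonnes * 1_000_000 / GRAMS_CO2_PER_MILE
equivalent_home_hours = training_energy_mwh * 1_000 / KWH_PER_HOME_HOUR

print(f"~{equivalent_miles / 1e6:.1f} million miles driven")
print(f"~{equivalent_home_hours / 1e6:.1f} million US homes powered for one hour")
```

With these assumptions, 500 metric tons works out to roughly 1.2 million miles driven, and 1,200 MWh to roughly one million home-hours, consistent with the figures cited.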
Parent topic: AI risk atlas
We provide examples covered by the press to help explain many of the foundation models' risks. Many of these events are either still evolving or have been resolved, and referencing them can help the reader understand the potential risks and work toward mitigations. These examples are highlighted for illustrative purposes only.