Yesterday the Wall Street Journal reported that OpenAI CEO Sam Altman wants to raise up to $7 trillion for a “wildly-ambitious” tech project to boost the world’s chip capacity, funded by investors including the U.A.E., which in turn would vastly expand the capacity available to power AI models.
While this may simply be a dreamy moonshot on Altman’s part, or an Elon Musk-like hype-generator, what is not in doubt is the environmental impact of such a massive effort, according to Sasha Luccioni, climate lead and researcher at Hugging Face.
“If it does work out, the amount of natural resources that will be required is just mind-boggling,” she told VentureBeat. “Even if the energy is renewable (which it isn’t guaranteed to be), the quantity of water and rare earth minerals required is astronomical.”
For comparison, in September 2023 Fortune reported that AI tools fueled a 34% spike in Microsoft’s water consumption; Meta’s Llama 2 model reportedly guzzled twice as much water as Llama 1; and a 2023 study found that OpenAI’s GPT-3 training consumed 700,000 liters of water.
And beyond the environmental impact, shortages of rare earth minerals such as gallium and germanium have even helped inflame the global chip war with China.
Luccioni criticized Altman for not focusing on more efficient methods to develop AI. Instead, she said, “he’s taking a brute force approach and people are calling it … visionary?”
GPU access has become a key Silicon Valley AI struggle
But the fact is, Altman’s desire to tackle the current GPU shortages and reshape the semiconductor landscape is not unusual. Last summer, VentureBeat reported on how access to Nvidia’s hard-to-come-by, ultra-expensive, high-performance computing H100 GPU for large language model (LLM) training was becoming the “top gossip” of Silicon Valley.
And just last week, Meta offered a deep dive into its AI strategy on its latest earnings call, where CEO Mark Zuckerberg said that to build “full general intelligence,” the first key requirement is “world-class compute infrastructure.” Zuckerberg went on to repeat what he had recently disclosed in an Instagram Reel: that by the end of this year Meta will have about 350,000 H100s, and, including other GPUs, roughly 600,000 H100 equivalents of compute.
The company plans to continue investing aggressively in this area, he explained: “In order to build the most advanced clusters, we’re also designing novel data centers and designing our own custom silicon specialized for our workloads.”
Luccioni has been critical of Nvidia’s lack of transparency about the carbon footprint of its products (which are designed by the company but manufactured by the Taiwan Semiconductor Manufacturing Company): “Nvidia has yet to publish any information about the environmental footprint of their manufacturing,” she said, adding that e-waste as a whole is also a “huge issue because people want the new GPUs and they’re essentially throwing out the old ones after a year or two.”
In Nvidia’s 2023 Corporate Responsibility Report, the company said “emissions are generated at every stage of our product lifecycle, including manufacturing within our supply chain. Since 2014, we’ve expected our key silicon manufacturing and systems contract manufacturing suppliers to report their annual energy and water usage, waste, greenhouse gas (GHG) emissions, and reduction goals and objectives through the RBA Environmental Survey or CDP. We also expect suppliers to have their GHG emissions verified by a third party. We use this supplier data to better understand our product manufacturing impact and allocate carbon emissions to our customers.”
More transparency from OpenAI is unlikely
Overall, Luccioni maintains that there is less transparency today when it comes to the environmental impact of AI, and that this is unlikely to change anytime soon amid Altman’s new fundraising march.
“If you look at the PaLM 1 paper from Google, which was in 2022, and then PaLM 2 [released in May 2023], the amount of information they provided drastically dropped,” she said. In the original paper, she explained, Google shared enough information so that energy-use estimates could be made.
“Now [companies] don’t even say how long it took [to train], how many chips they used, there’s absolutely no information provided anymore,” she said.
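To see why those disclosures matter, the kind of estimate Luccioni describes boils down to simple arithmetic: chip count times per-chip power draw times training hours, scaled by data-center overhead and grid carbon intensity. Below is a minimal sketch of that calculation; all the numbers in it are illustrative assumptions, not disclosed figures from any real model.

```python
# Back-of-envelope training-footprint estimate. Every input value here is
# a hypothetical placeholder, not a figure reported by any AI lab.

def training_footprint(num_chips, chip_power_kw, hours, pue, grid_kgco2_per_kwh):
    """Estimate energy (kWh) and emissions (kg CO2e) for a training run.

    pue: power usage effectiveness, the data-center overhead multiplier.
    grid_kgco2_per_kwh: carbon intensity of the local electricity grid.
    """
    energy_kwh = num_chips * chip_power_kw * hours * pue
    emissions_kg = energy_kwh * grid_kgco2_per_kwh
    return energy_kwh, emissions_kg

# Hypothetical run: 1,000 accelerators drawing 0.4 kW each for 30 days,
# PUE of 1.1, grid intensity of 0.4 kg CO2e per kWh.
energy, co2 = training_footprint(
    num_chips=1_000,
    chip_power_kw=0.4,
    hours=30 * 24,
    pue=1.1,
    grid_kgco2_per_kwh=0.4,
)
print(f"{energy:,.0f} kWh, {co2:,.0f} kg CO2e")
```

The point of the sketch is that the first two inputs — chip count and training duration — are exactly the numbers companies have stopped disclosing, which is why outside estimates have become so much harder to make.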
Still, Luccioni says she isn’t too worried: “I think this is just a moonshot project that won’t actually pan out,” she said. “But that will put [Altman] on par with Elon in terms of outlandish projects that attract attention and generate hype.”