OpenAI would have to spend over $1 trillion to deliver its promised computing power. It may not have the cash.

Daniel Nenni


OpenAI (OPAI.PVT) would have to spend more than $1 trillion within the next five years to deliver the massive amount of computing power it has promised to deploy through partnerships with chipmakers Nvidia (NVDA), Broadcom (AVGO), and Advanced Micro Devices (AMD), according to Citi analysts.

OpenAI's latest deals with the three companies include an ambitious promise to deliver 26 gigawatts worth of computing capacity using their chips, which is nearly the amount of power required to provide electricity to the entire state of New York during peak summer demand.

Citi estimates that it takes $50 billion in spending on computing hardware, energy infrastructure, and data center construction to bring one gigawatt of compute capacity online.

Using that assumption, Citi analyst Chris Danely said in a note to clients this week that OpenAI's capital expenditures would hit $1.3 trillion by 2030.

OpenAI CEO Sam Altman has reportedly floated bolder promises internally. The Information reported in late September that the executive has suggested the company is looking to deploy 250 gigawatts of computing capacity by 2033, implying a cost of $12.5 trillion.
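Both headline numbers fall directly out of Citi's single rule of thumb; here is a quick back-of-the-envelope check in Python (the $50 billion per gigawatt cost is Citi's assumption, not a measured figure):

```python
# Citi's rule of thumb: ~$50B of capex (chips, power infrastructure,
# and data center construction) brings one gigawatt of compute online.
COST_PER_GW = 50e9  # USD

def capex_for(gigawatts: float) -> float:
    """Estimated capital expenditure to deploy the given compute capacity."""
    return gigawatts * COST_PER_GW

# 26 GW pledged across the Nvidia/Broadcom/AMD deals
print(f"26 GW  -> ${capex_for(26) / 1e12:.2f} trillion")   # $1.30 trillion
# 250 GW reportedly floated internally for 2033
print(f"250 GW -> ${capex_for(250) / 1e12:.2f} trillion")  # $12.50 trillion
```

The multiplication confirms the article's arithmetic: 26 GW maps to the $1.3 trillion capex estimate, and 250 GW to $12.5 trillion.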

But there's no guarantee that OpenAI will have the capital to support the costs required to achieve its goals. While OpenAI's costs are set to soar to more than $1 trillion, Citi estimates the company's revenue will climb to a fraction of that figure — $163 billion — by 2030.

That disconnect has added to Wall Street concerns over a stock market bubble. Stocks have soared to new records this year largely on investor optimism over artificial intelligence.

OpenAI had already made big commitments to the global AI build-out ahead of its latest deals with chipmakers. The company in September announced a $300 billion deal with Oracle (ORCL) as part of its 10-gigawatt US AI infrastructure project called Stargate. OpenAI has unveiled additional Stargate infrastructure projects abroad in partnership with Nvidia in the United Arab Emirates and Norway. The company also committed $22 billion to purchasing data center capacity from Nvidia-backed AI data center provider CoreWeave (CRWV).

The tangled web of investments among the leading industry players has led to concerns that AI demand could be overstated.

"[OpenAI CEO Sam Altman] has the power to crash the global economy for a decade or take us all to the promised land, and right now we don't know which is in the cards," Bernstein analyst Stacy Rasgon wrote in an Oct. 6 note.

Adding to funding concerns, it's unclear whether US power infrastructure can scale up in time to meet the energy demands of the latest AI projects, which could prevent OpenAI from cashing in on its spending.

If, however, OpenAI meets its goals, chipmakers could see huge gains. Nvidia could see as much as $500 billion in revenue from OpenAI if the total deal amount is fulfilled, according to Bank of America analyst Vivek Arya. And Broadcom could see more than $100 billion in revenue from its own deal with the ChatGPT developer, Bernstein's Rasgon estimated.

 
"OpenAI's latest deals with the three companies include an ambitious promise to deliver 26 gigawatts worth of computing capacity using their chips"

Is the gigawatt the new unit of computing capacity then? Leaving aside the dimensional analysis fail here, I'm curious exactly who dreamed up this new metric, as it seems almost the opposite of what we should be optimising for. For raw performance, surely some sort of TFLOPS-type measurement applies. But don't we really want to keep our eyes on performance/watt and look for some sort of Moore's Law-type effect where we can scale up performance in the same power envelope?
 
Chips are getting more power efficient, but it's only something like 15% better performance per watt every two years. Given that, you can't really rely on efficiency gains to deliver the amount of compute required.
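To see why efficiency alone can't close the gap, here is a minimal sketch of how that ~15%-every-two-years figure compounds (the rate is the poster's ballpark, not a vendor roadmap):

```python
# Compound the quoted ~15% perf/W improvement per two-year cadence
RATE_PER_2YR = 1.15

def perf_per_watt_gain(years: float) -> float:
    """Cumulative performance-per-watt multiplier after `years`."""
    return RATE_PER_2YR ** (years / 2)

for y in (2, 4, 6, 8, 10):
    print(f"{y:2d} years: {perf_per_watt_gain(y):.2f}x perf/W")
# After a decade, efficiency alone only roughly doubles compute per watt.
```

At that rate, a decade of process and architecture gains yields about 2x performance per watt, nowhere near the order-of-magnitude buildout being discussed, which is why the capacity has to come from added power rather than efficiency.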

People are talking about power as the metric for datacenters because that's become the main constraint. It's no longer a matter of getting the chips - there is a four-year-long (and growing) wait to get a large load connected to the grid. Datacenters are building their own power plants to get around that, but power availability is still the gating factor. This is why you have seen such a rise in the stock of Bloom Energy: they offer what's probably the best way to get power to a new datacenter quickly, even if it's a bit more expensive (at least today).

I think these data centers will likely end up getting plopped on gas fields directly in Ohio and Texas and powered by fuel cells. There is basically no other way to get the power required at the speed required.
 
I agree but I do like the gigawatt measurement term in regards to how much power these datacenters will require. I'm not confident we have done the math here in regards to how much electricity we will need to power all of these "announced" AI datacenters.

As it is, in Northern California we are told to not use our appliances during peak power consumption periods. We also have water issues. There are also massive high density housing growth initiatives which will make things worse.

Exciting times.........
 
"TSMC previously projected that its data center AI related revenue would grow at a mid-40% annual rate through 2029. JP Morgan analyst Gokul Hariharan asked whether this forecast should be revised upward to reflect current market conditions. C.C. Wei agreed it probably should, but said that TSMC would provide updated guidance early next year. C. C. Wei described the current demand as “insane.”"

 

They won't be building too many new GW-scale datacenters in NorCal. Companies will avoid grid, pipeline, and water constraints by building datacenters directly on top of shale gas fields, powered by SOFC fuel cells, which produce water as a byproduct. There is no other way to bring on the capacity being talked about in the relevant time frames. Everything else takes too long - massive investments are being made in nuclear fusion, but it won't scale until beyond 2030.

What's incredible to me is this AI bubble will likely lead to another energy revolution.

In a way the mobile computing revolution directly led to the EV revolution, because it led to the development of cheaper batteries.

I think the AI bubble will lead to the next energy revolution.
 

I agree completely. Apple making their own chips was very disruptive. Apple and TSMC pioneered the low-power market, which paved the way for EVs. Now TSMC and their big customers will pioneer low-power datacenter chips.

Energy is one of the many industries that will be disrupted by AI. In fact, I cannot think of an industry that will not be disrupted by AI. Exciting times, absolutely.
 
"Citi estimates that it takes $50 billion in spending on computing hardware, energy infrastructure, and data center construction to bring one gigawatt of compute capacity online.

Using that assumption, Citi analyst Chris Danely said in a note to clients this week that OpenAI's capital expenditures would hit $1.3 trillion by 2030.

OpenAI CEO Sam Altman has reportedly floated bolder promises internally. The Information reported in late September that the executive has suggested the company is looking to deploy 250 gigawatts of computing capacity by 2033, implying a cost of $12.5 trillion."

Per Statista, 2022 US electricity capacity was about 1,200 GW; 2025 US GDP should be about $27T, growing at ~1.5%.

That's squarely at the national macroeconomic scale, for one company!

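Using the figures quoted in this post (~1,200 GW of US generating capacity, ~$27T of GDP), a quick sketch of the ratios:

```python
# National-scale context for the OpenAI numbers
US_CAPACITY_GW = 1200   # approx. 2022 US electricity capacity (per the post)
US_GDP = 27e12          # approx. 2025 US GDP (per the post)

for gw in (26, 250):
    print(f"{gw} GW = {gw / US_CAPACITY_GW:.1%} of US generating capacity")
print(f"$1.3T capex = {1.3e12 / US_GDP:.1%} of one year of US GDP")
```

The 26 GW already pledged is about 2% of the entire US grid; the 250 GW figure is over a fifth of it, and the $1.3T capex is roughly 5% of a year of US GDP.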
"But there's no guarantee that OpenAI will have the capital to support the costs required to achieve its goals. While OpenAI's costs are set to soar to more than $1 trillion, Citi estimates the company's revenue will climb to a fraction of that figure — $163 billion — by 2030."

Capex of $1.3T by 2030, and they still have to pay for the electricity and such. So even if OpenAI's 2030 revenue comes in at 10x Danely's estimate ($1.63T), it still might not be cash flow positive.

BTW, Mag7 + ORCL 2024 total revenue was $2.1T.

"[OpenAI CEO Sam Altman] has the power to crash the global economy for a decade or take us all to the promised land, and right now we don't know which is in the cards," Bernstein analyst Stacy Rasgon wrote in an Oct. 6 note.
I have seen crashes, but I do wonder what that Promised Land looks like!
 