What does OpenAI need a CEO for anyway? Just let ChatGPT run the company if they are so gung-ho about it.
I can’t wait for the AI bubble to burst.
We are all waiting. If they don’t come up with proven revenue opportunities in the next ~18 months, it’s going to be difficult to justify the astronomical capex spend.
The Bloomberg podcast series ‘Foundering – The OpenAI Story’ is quite insightful in regard to Sam Altman’s psyche.
There are five episodes; the first is here:
He will get that. The ultra rich ignore all healthy limits.
I’ve heard someone call it billionaire brain rot. I think at some point you end up with so much money and not enough people telling you no, that it literally changes your brain.
Seems likely.
I think it’s also likely that it’s very hard to amass billions unless you already have some sort of brain rot.
Imagine never hearing the word “No.” as a complete sentence ever again in your life.
Middle Eastern money
Something tells me the Saudis don’t want AI for the betterment of all humanity.
Could be the human rights abuses, dunno.
OpenAI has projected revenue of about $3 billion this year.
It is also projected to burn $8 billion on training costs this year.
Now it needs 5 gigawatts of data centers worth over $100 billion.
And new fabs worth $7 trillion to supply all the chips.

I get that it's trying to dominate a new market, but that's ludicrous. And even with everything so far, they haven't really pulled far ahead of competing models like Claude and Gemini, whose makers are also training like crazy.
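For scale, a quick back-of-envelope sketch in Python using the figures cited above (rough, unaudited estimates floating around this year, not reported financials):

```python
# Back-of-envelope ratios for the rough figures cited in the comment above.
# All values are unaudited estimates in US dollars, used purely for illustration.

projected_revenue = 3e9        # ~$3B projected revenue this year
training_burn = 8e9            # ~$8B projected training spend this year
datacenter_buildout = 100e9    # 5 GW of data centers, $100B+
fab_buildout = 7e12            # ~$7T of new chip fabs

print(f"Training burn vs. revenue: {training_burn / projected_revenue:.1f}x")
print(f"Data centers vs. revenue:  {datacenter_buildout / projected_revenue:.0f}x")
print(f"Fabs vs. revenue:          {fab_buildout / projected_revenue:,.0f}x")
# Roughly 2.7x, 33x, and about 2,333x a single year's projected revenue.
```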
There is no market, or not much of one. This whole thing is a huge speculative bubble, a bit like crypto. The core idea of crypto makes some sense long term, but the speculative value does not. The core idea of LLMs (we are nowhere near true AI) makes some sense, but it is half-baked technology. It hasn't even reached maturity and enshittification has already set in.
OpenAI doesn’t have a realistic business plan. It has a grifter who is riding a wave of nonsense in the tech markets.
No one is making a profit because no one has found a truly profitable use for what’s available now. Even the areas with potential utility (like healthcare) are dominated by focused companies working in limited scenarios.
IMO it’s even worse than that, at least from what I gather from the AI/Singularity communities I follow. For them, AGI is the end goal: a creative, thinking AI capable of deduction far beyond human ability. The company that owns that suddenly has the capability to solve all manner of problems that are slowing down technological advancement. Obviously owning that would be worth trillions.
However it’s really hard to see through the smoke that the Altmans etc. are putting up - how much of it is actual genuine prediction and how much is fairy tales they’re telling to get more investment?
And I’d have a hard time believing it isn’t mostly the latter, because while LLMs have made some pretty impressive advancements, they still can’t have specialized discussions about pretty much anything without hallucinating answers. I have a test I use for each new generation of LLMs: I interview them about a book I’m relatively familiar with. Even with the newest ChatGPT model, it still makes up a ton of shit, often contradicting its own earlier answers in the same conversation, all the while absolutely confident that it’s familiar with the source material.
Honestly, I’ll believe they’re capable of advancing AI when we get an AI that can say ‘I actually am not sure about that, let me do a search…’ or something like that.
This guy is losing touch with reality.
I think it’s a delusion of grandeur.