OpenAI, one of the best-funded names among AI startups, is exploring making its own AI chips. The idea isn't sudden: internal discussions about chip strategy have reportedly been under way since at least last year, driven by the worsening shortage of the chips needed to train AI models.
The shortage is already being felt. Microsoft warned in a recent earnings report of potential service disruptions caused by a lack of the server hardware needed to run AI workloads. Meanwhile, Nvidia's best-performing AI chips are reportedly sold out until 2024, a measure of just how intense demand in the GPU market has become.
At present, OpenAI relies heavily on GPU-based infrastructure to develop models like ChatGPT and DALL-E 3. GPUs excel at the massively parallel computation that training and running today's large AI models demands. The boom in generative AI has been a windfall for companies like Nvidia, but it has also put enormous strain on the GPU supply chain.
OpenAI depends on cloud GPU clusters not only to train its models but also to serve customer requests, and that dependence is expensive. An analysis by Stacy Rasgon, an analyst at Bernstein, suggests that if ChatGPT queries grew to even a tenth the scale of Google Search, OpenAI would need an initial outlay of roughly $48.1 billion in GPUs, plus about $16 billion in chips annually to keep the service running.
OpenAI wouldn't be the first to build custom AI silicon. Google designed its TPU (Tensor Processing Unit) to power its large AI systems, Amazon offers its own specialized chips to AWS customers, and Microsoft is reportedly collaborating with AMD on an in-house AI chip code-named Athena, which OpenAI is said to be testing.
Financially, OpenAI is on solid ground. It has raised over $11 billion in venture capital, its annual revenue is approaching $1 billion, and a reported share sale could value the company at around $90 billion on the secondary market.
Entering the AI chip business is far from easy, however. Just last year, AI chipmaker Graphcore saw its valuation fall by $1 billion after a deal with Microsoft collapsed, and it later announced layoffs, citing the difficult economic climate. Habana Labs, Intel's AI chip subsidiary, cut roughly 10% of its workforce. Meta, too, has stumbled in its custom AI chip efforts, shelving some of its projects.
Given these precedents, building a custom chip would be a long and costly undertaking for OpenAI: likely years of development and potentially hundreds of millions of dollars in spending each year. Whether its backers, Microsoft among them, have the appetite for that kind of bet remains an open question.