OpenAI, the company behind ChatGPT, is exploring making its own artificial intelligence chips and has gone as far as evaluating a potential acquisition target, according to people familiar with the company's plans. In recent internal discussions described to Reuters, the company has not yet decided whether to proceed. Since at least last year, according to people familiar with the matter, it has discussed various options to address the shortage of the expensive AI chips on which it relies. Those options include building its own AI chip, working more closely with other chipmakers including Nvidia, and diversifying its suppliers beyond Nvidia.
CEO Sam Altman has made acquiring more AI chips a top priority for the company. He has publicly complained about the scarcity of graphics processing units, a market Nvidia dominates with more than 80 per cent of the global share of the chips best suited to running AI applications.
Altman attributes the effort to two major concerns: a shortage of the advanced processors that power OpenAI's software and the "eye-watering" costs of running the hardware its models and products require. Since 2020, OpenAI has developed its generative artificial intelligence technologies on a massive supercomputer built by Microsoft, one of its most significant backers, that uses 10,000 Nvidia graphics processing units (GPUs).
Running ChatGPT is prohibitively expensive for the company. According to an analysis by Bernstein analyst Stacy Rasgon, each query costs about 4 cents. If ChatGPT queries were to grow to a tenth the scale of Google search, OpenAI would need roughly $48.1 billion worth of GPUs initially and about $16 billion worth of chips a year to keep it running.
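To put those figures in perspective, the back-of-envelope sketch below multiplies the roughly 4-cent per-query estimate by an assumed query volume. It is an illustration only, not part of Rasgon's analysis; the Google search volume used (about 8.5 billion queries a day) is an outside estimate rather than a figure from this article.

```python
# Illustrative back-of-envelope estimate of ChatGPT inference costs at scale.
# Assumption (not from the article): Google handles ~8.5 billion searches per day.

COST_PER_QUERY_USD = 0.04          # Bernstein (Stacy Rasgon) per-query estimate
GOOGLE_SEARCHES_PER_DAY = 8.5e9    # assumed outside figure
SHARE_OF_GOOGLE = 0.10             # "a tenth the scale of Google search"

queries_per_day = GOOGLE_SEARCHES_PER_DAY * SHARE_OF_GOOGLE
daily_cost = queries_per_day * COST_PER_QUERY_USD
annual_cost = daily_cost * 365

print(f"Queries per day: {queries_per_day:,.0f}")
print(f"Estimated daily inference cost:  ${daily_cost / 1e6:,.0f} million")
print(f"Estimated annual inference cost: ${annual_cost / 1e9:,.1f} billion")
```

Under those assumptions the arithmetic works out to roughly $34 million a day, or on the order of $12 billion a year in inference costs, the same order of magnitude as the roughly $16 billion a year in chips the analysis cites, with the $48.1 billion figure referring to the initial GPU purchase.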