ChatGPT might one day be powered by homegrown chips, if OpenAI does indeed decide to make its own. According to Reuters, the company is exploring the possibility of making its own artificial intelligence chips and has even evaluated a potential acquisition. OpenAI CEO Sam Altman previously blamed GPU shortages for users' complaints about the speed and reliability of the company's API, so he has reportedly made acquiring more AI chips a priority.
In addition to easing GPU shortages, using its own chips could make the costs of running OpenAI's products more manageable. Based on an analysis by Stacy Rasgon of Bernstein Research, each ChatGPT query costs the company around 4 cents. The service reached 100 million monthly users within two months of launch, which translates to millions of queries a day, though it lost users for the first time in July. Rasgon estimated that if ChatGPT's query volume grew to a tenth of Google's search volume, OpenAI would initially need $48.1 billion worth of GPUs and would then spend about $16 billion a year on chips to keep up.
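For a rough sense of where figures like that come from, here is a minimal back-of-the-envelope sketch. The 4-cent per-query cost and the "tenth of Google" scenario come from the article; Google's daily search volume is an outside assumption (a commonly cited estimate of roughly 8.5 billion searches a day), so the output only illustrates the order of magnitude rather than reproducing Rasgon's analysis.

```python
# Back-of-the-envelope inference cost estimate.
# Figures from the article: ~4 cents per query, a "tenth of Google" scenario.
# Assumption (not from the article): ~8.5 billion Google searches per day.

COST_PER_QUERY_USD = 0.04        # Rasgon's per-query cost estimate
GOOGLE_QUERIES_PER_DAY = 8.5e9   # assumed Google daily search volume
SHARE_OF_GOOGLE = 0.10           # the "tenth of what Google gets" scenario

daily_queries = GOOGLE_QUERIES_PER_DAY * SHARE_OF_GOOGLE
daily_cost = daily_queries * COST_PER_QUERY_USD
annual_cost = daily_cost * 365

print(f"Queries per day: {daily_queries:,.0f}")
print(f"Inference cost per day: ${daily_cost / 1e6:,.0f}M")
print(f"Inference cost per year: ${annual_cost / 1e9:,.1f}B")
# Lands around $12B a year under these assumptions; Rasgon's $16B-a-year
# chip figure rests on his own more detailed hardware and utilization math.
```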
At the moment, NVIDIA dominates the market for AI chips; the Microsoft supercomputer OpenAI used to develop its technology, for instance, uses 10,000 NVIDIA GPUs. That's why other, bigger players in the tech industry have started developing chips of their own. Microsoft, OpenAI's biggest backer, has been working on an in-house AI chip since 2019, according to The Information. The product is codenamed Athena, and OpenAI has reportedly been testing it.
OpenAI has yet to decide whether to push through with its plans, Reuters says. And even if it does move forward, it could take years before the company can start using its own chips to power its products.