Each request to ChatGPT costs OpenAI 4 cents.

Insider data from Reuters

Running ChatGPT is very expensive for OpenAI: each request costs about 4 cents, according to Bernstein analyst Stacy Rasgon. As a result, OpenAI is exploring the possibility of developing its own artificial intelligence chips, Reuters reports, citing people familiar with the company’s plans.
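
To put the per-request figure in perspective, here is a rough back-of-the-envelope sketch; the 4-cent cost comes from the estimate above, while the daily request volume is purely an assumed number for illustration, not a figure from the report.

```python
# Illustrative estimate only: cost per request is from the Bernstein figure cited above;
# the daily request volume is an assumed, hypothetical value.
cost_per_request_usd = 0.04            # ~4 cents per ChatGPT request
assumed_requests_per_day = 10_000_000  # hypothetical volume, for illustration only

daily_cost = cost_per_request_usd * assumed_requests_per_day
annual_cost = daily_cost * 365

print(f"Estimated daily cost:  ${daily_cost:,.0f}")   # $400,000 at the assumed volume
print(f"Estimated annual cost: ${annual_cost:,.0f}")  # ~$146,000,000 at the assumed volume
```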

As early as last year, the company discussed various options for addressing the shortage of the expensive artificial intelligence chips it relies on. Those options included building its own AI chip, working more closely with other chip makers, including Nvidia, and diversifying its suppliers beyond Nvidia.

CEO Sam Altman has made acquiring more artificial intelligence chips a top priority for the company. He has publicly complained about a shortage of graphics processors; Nvidia dominates that market, controlling more than 80% of the global supply of chips best suited to running artificial intelligence applications.

Efforts to get more chips face two main challenges: a shortage of the cutting-edge processors that run OpenAI’s software and the “incredible” costs associated with running the hardware needed to support current products.

Since 2020, OpenAI has been developing its generative artificial intelligence technologies on a massive supercomputer built by Microsoft that uses 10,000 Nvidia GPUs.
