Essentially nothing. Repeating a word indefinitely (until interrupted) is one of the easiest tasks a computer can do. Even if millions of people were making requests like this, it would cost OpenAI on the order of a few hundred bucks, out of an operating budget of tens of millions.
The expensive part of AI is training the models. Trained models are so cheap to run that you can run one on your cell phone if you're interested.
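For what it's worth, a small open model really does run on ordinary consumer hardware. A minimal sketch using the Hugging Face transformers library (GPT-2 is picked here only as an example of a small, freely downloadable model, not something mentioned above):

```python
# Minimal sketch: run a small, freely available model locally.
# GPT-2 is used purely as an example of a small open model;
# larger models work the same way but need more memory.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
output = generator("The cheapest part of AI is", max_new_tokens=30)
print(output[0]["generated_text"])
```

Bigger models are the same idea, they just need more RAM and a faster chip.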
Which is very cheap.
It’s still very cheap; that’s why they let people play with the LLMs. It’s training them that’s expensive.
GPT-4 definitely isn’t cheap to run.
Depends how you define “cheap”. They’re orders of magnitude cheaper to run than they are to train.
Well, it depends on what user experience and quality you’re after. Some of Meta’s Llama 2 models require several GB of GPU RAM to run responsively.
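As a rough sanity check on that, here's a back-of-envelope sketch of the weight memory alone (the parameter counts are the commonly cited Llama 2 sizes, not figures from this thread; real usage is higher once you add the KV cache and framework overhead):

```python
# Rough memory footprint for holding model weights in GPU RAM.
# Assumes the commonly cited Llama 2 parameter counts; activations,
# KV cache, and framework overhead add more on top of this.
def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    return num_params * bytes_per_param / 1e9

for name, params in [("Llama-2-7B", 7e9), ("Llama-2-13B", 13e9)]:
    print(f"{name}: fp16 ~ {weight_memory_gb(params, 2):.1f} GB, "
          f"4-bit ~ {weight_memory_gb(params, 0.5):.1f} GB")
```

So a quantized 7B model fits on a decent consumer GPU, while the bigger variants start needing serious hardware.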