

Building and running large language models requires specialized hardware. While traditional computer processors can run machine learning models, they're slow. Most training and inference now takes place on graphics processors, or GPUs, which were initially intended for 3D gaming but have become the standard for AI applications because they can do many simple calculations simultaneously. Nvidia makes most of the GPUs for the AI industry, and its primary data center workhorse chip costs $10,000. Scientists who build these models often joke that they "melt GPUs."

Training models

Analysts and technologists estimate that the critical process of training a large language model such as OpenAI's GPT-3 could cost more than $4 million. More advanced language models could cost over "the high-single-digit millions" to train, said Rowan Curran, a Forrester analyst who focuses on AI and machine learning. Meta's largest LLaMA model, for example, used 2,048 Nvidia A100 GPUs to train on 1.4 trillion tokens (750 words is about 1,000 tokens), taking about 21 days, the company said when it released the model last month. That works out to about 1 million GPU hours; at dedicated prices from AWS, the training run would cost over $2.4 million.
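For a sense of where those LLaMA numbers come from, here is a minimal back-of-envelope sketch in Python. The per-GPU-hour price is an assumed placeholder roughly consistent with the article's figures, not an actual AWS quote.

```python
# Back-of-envelope check of the training figures cited above.
# The per-GPU-hour price is an assumed placeholder implied by the
# article's numbers, not a quoted AWS rate.

num_gpus = 2048          # Nvidia A100s reported for Meta's largest LLaMA model
training_days = 21       # reported wall-clock training time
gpu_hours = num_gpus * training_days * 24
print(f"GPU hours: {gpu_hours:,}")   # ~1.03 million, matching the ~1 million figure

assumed_price_per_gpu_hour = 2.40    # hypothetical dedicated-instance rate, in USD
estimated_cost = gpu_hours * assumed_price_per_gpu_hour
print(f"Estimated training cost: ${estimated_cost:,.0f}")   # roughly $2.5 million
```

Under those assumptions the total lands just above the $2.4 million the article cites; a higher on-demand rate would push it well past that.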
Even when the software is built, or trained, it still requires a huge amount of computing power to run large language models because they do billions of calculations every time they return a response to a prompt. By comparison, serving web apps or pages requires much less calculation.
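To make "billions of calculations" concrete, here is a rough sketch using the common rule of thumb that a transformer forward pass costs about twice its parameter count in floating-point operations per generated token. The model size and response length are illustrative assumptions, not figures from the article.

```python
# Rough illustration of why inference is compute-heavy. Uses the common
# rule of thumb that generating one token costs about 2 * N_params
# floating-point operations. Model size and response length below are
# illustrative assumptions, not figures from the article.

params = 175e9          # a GPT-3-scale model with 175 billion parameters
tokens_generated = 200  # assumed length of a single response (~150 words)

flops_per_token = 2 * params
flops_per_response = flops_per_token * tokens_generated
print(f"~{flops_per_response:.1e} floating-point operations per response")
# ~7.0e+13, tens of trillions of operations for one prompt, which is why
# GPUs rather than ordinary CPUs are used to serve these models.
```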

The high cost of training and "inference" - actually running - large language models is a structural cost that differs from previous computing booms. That cost is an uncomfortable reality in the industry as venture capitalists eye companies that could potentially be worth trillions, and as big companies such as Microsoft, Meta, and Google use their considerable capital to develop a lead in the technology that smaller challengers can't catch up to. But if the margin for AI applications is permanently smaller than previous software-as-a-service margins because of the high cost of computing, it could put a damper on the current boom.

Latitude's pricey AI bills underscore an unpleasant truth behind the recent boom in generative AI technologies: the cost to develop and maintain the software can be extraordinarily high, both for the firms that develop the underlying technologies, generally referred to as large language models or foundation models, and for those that use the AI to power their own software.
"We spent hundreds of thousands of dollars a month on AI and we are not a big startup, so it was a very massive cost."īy the end of 2021, Latitude switched from using OpenAI's GPT software to a cheaper but still capable language software offered by startup AI21 Labs, Walton said, adding that the startup also incorporated open source and free language models into its service to lower the cost. "We joked that we had human employees and we had AI employees, and we spent about as much on each of them," Walton said. Personal Loans for 670 Credit Score or LowerĬompounding the predicament was that Walton also discovered content marketers were using AI Dungeon to generate promotional copy, a use for AI Dungeon that his team never foresaw, but that ended up adding to the company's AI bill.Īt its peak in 2021, Walton estimates Latitude was spending nearly $200,000 a month on OpenAI's so-called generative AI software and Amazon Web Services in order to keep up with the millions of user queries it needed to process each day. Personal Loans for 580 Credit Score or Lower Best Debt Consolidation Loans for Bad Credit
