So far, AI hasn’t been profitable for Big Tech

Big tech companies like Microsoft and Google are grappling with the challenge of turning AI products like ChatGPT into a profitable enterprise, reports The Wall Street Journal. While companies are heavily investing in AI tech that can generate business memos or code, the cost of running advanced AI models is proving to be a significant hurdle. Some services, like Microsoft’s GitHub Copilot, reportedly operate at a significant loss.

Generative AI models used for creating text are not cheap to operate. Large language models (LLMs) like the ones that power ChatGPT require powerful servers with high-end, energy-hungry chips. For example, we recently covered a Reuters report estimating that each ChatGPT query may cost 4 cents to run. Reflecting those costs, Adam Selipsky, the chief executive of Amazon Web Services, told the Journal that many corporate customers are unhappy with the high running costs of these AI models.

The current cost challenge is tied to the nature of AI computation, which typically requires fresh calculations for every query, unlike conventional software, which enjoys economies of scale. This makes flat-fee pricing for AI services risky: the more a customer uses the service, the higher the operational cost, so heavy users can turn into net losses for the company.
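The economics described above can be sketched with some simple arithmetic. The 4-cents-per-query figure comes from the Reuters estimate cited earlier; the $10/month flat fee is a hypothetical subscription price chosen for illustration, not a reported number.

```python
# Hypothetical illustration of flat-fee economics for an AI service.
# COST_PER_QUERY uses the ~4-cent figure cited in the article;
# FLAT_FEE is an assumed subscription price for illustration only.
COST_PER_QUERY = 0.04   # dollars per query (Reuters estimate)
FLAT_FEE = 10.00        # dollars per month (hypothetical)

def monthly_margin(queries_per_month: int) -> float:
    """Revenue minus compute cost for one subscriber in a month."""
    return FLAT_FEE - COST_PER_QUERY * queries_per_month

# Break-even point: how many queries per month exhaust the flat fee.
break_even = int(FLAT_FEE / COST_PER_QUERY)
print(break_even)              # 250 queries/month
print(monthly_margin(100))     # light user: 6.0 (profit)
print(monthly_margin(1000))    # heavy user: -30.0 (loss)
```

Under these assumed numbers, a subscriber who runs more than 250 queries a month costs the provider more than they pay, which is why per-usage pricing or capped tiers are the common alternatives to a flat fee.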
