LLMs Are Surprisingly Cheap to Run

2025-06-09

This post challenges the widespread misconception that large language models (LLMs) are prohibitively expensive to operate. By comparing LLM inference costs with those of web search engines and citing current LLM API prices, the author shows that inference costs have fallen dramatically, to the point where some LLM APIs are an order of magnitude cheaper per query than some search APIs. The author also rebuts common objections, such as claims that prices are subsidized below cost or that the underlying compute remains expensive, and argues that the real cost challenge lies in the backend services that interact with the AI, not in the LLMs themselves.
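The kind of comparison the post makes can be sketched with back-of-the-envelope arithmetic. The sketch below is illustrative only: the per-token and per-query prices are hypothetical assumptions chosen to resemble typical published rates, not figures taken from the post.

```python
def llm_cost_per_query(input_tokens: int, output_tokens: int,
                       price_in_per_m: float, price_out_per_m: float) -> float:
    """Cost in dollars for one LLM API call, given per-million-token prices."""
    return (input_tokens / 1_000_000) * price_in_per_m \
         + (output_tokens / 1_000_000) * price_out_per_m

# Hypothetical small-model rates: $0.15 / 1M input tokens, $0.60 / 1M output tokens.
# A modest query: 500 input tokens, 300 output tokens.
per_query = llm_cost_per_query(500, 300, 0.15, 0.60)
per_thousand = per_query * 1_000          # $0.255 per 1,000 queries

# Hypothetical search API rate for comparison: $5.00 per 1,000 queries.
search_per_thousand = 5.00
print(f"LLM:    ${per_thousand:.3f} per 1,000 queries")
print(f"Search: ${search_per_thousand:.2f} per 1,000 queries")
print(f"Ratio:  {search_per_thousand / per_thousand:.0f}x")
```

Under these assumed prices the LLM comes out roughly 20x cheaper per thousand queries, which illustrates the "order of magnitude cheaper" comparison; actual ratios depend entirely on the model, token counts, and providers involved.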