Everyone has already heard about the AI from China. I know, countless articles have been written about it, but let's start with a brief overview anyway.
DeepSeek is a Chinese company specializing in artificial intelligence and the creator of a family of large language models.
In January 2025, the release of DeepSeek-R1 triggered a sharp drop in the stock prices of NVIDIA, Microsoft, and other industry giants, sparking discussions about whether AI investments needed to be reassessed.
Most importantly, DeepSeek released its models as open-source. We’ll get back to this point later.
Why It Matters
At the very least, companies using ChatGPT now have a strong reason to consider alternatives. Why? Because it’s cheaper and faster. DeepSeek can significantly reduce costs for medium and large businesses.
The main issue for startups and small companies is that the APIs for powerful models like OpenAI's o1 or the GPT series can be quite expensive. DeepSeek offers much more affordable rates, making LLMs accessible to budget-conscious companies.
Let’s take a closer look at DeepSeek’s pricing:
- Cached input tokens: $0.14 per million.
- Non-cached input tokens: $0.55 per million.
- Output tokens: $2.19 per million.
To understand what tokens are, think of them as chunks of text. Input tokens are the text you send to the model for processing (e.g., a user’s query), while output tokens are the responses you receive. One token is approximately four characters of text.
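To see how these prices add up, here is a rough back-of-the-envelope sketch using the per-million-token rates listed above. The helper function and the monthly volumes are hypothetical, just to illustrate the arithmetic:

```javascript
// Rough monthly cost estimate based on the DeepSeek prices listed above.
// (Hypothetical helper for illustration, not an official SDK.)
const PRICES = {
  cachedInput: 0.14 / 1_000_000, // $ per cached input token
  input: 0.55 / 1_000_000,       // $ per non-cached input token
  output: 2.19 / 1_000_000,      // $ per output token
};

function estimateCost({ cachedInputTokens = 0, inputTokens = 0, outputTokens = 0 }) {
  return (
    cachedInputTokens * PRICES.cachedInput +
    inputTokens * PRICES.input +
    outputTokens * PRICES.output
  );
}

// Example workload: 10M cached input, 5M fresh input, 2M output tokens per month.
const monthly = estimateCost({
  cachedInputTokens: 10_000_000,
  inputTokens: 5_000_000,
  outputTokens: 2_000_000,
});
console.log(`~$${monthly.toFixed(2)} per month`);
```

Even at this volume the bill stays in single-digit dollars, which is the core of the cost argument below.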
For small businesses processing a high volume of requests, the price difference between DeepSeek and other models can be substantial: DeepSeek can be 10–30 times cheaper, freeing up funds to reinvest in other crucial areas such as marketing or product development.
The low cost of cached input tokens ($0.14 per million) is especially advantageous. If your users frequently ask the same questions, caching lets you cut expenses significantly, which makes DeepSeek a strong fit for chatbots, virtual assistants, and analytical tools that handle repetitive queries.
In my opinion, DeepSeek is not just another buzzword in the AI market; it's a powerful competitor with serious advantages. If you're building something with AI, be sure to try DeepSeek. You might find it better suited to your needs.
Comparison with ChatGPT
I decided to put DeepSeek to the test with a simple yet tricky question: "How can you parse HTML natively in Node.js?" Spoiler: you can't. It can only be done with third-party libraries. Both AIs handled the task, but DeepSeek pleasantly surprised me.
I won’t write too much about it; just compare for yourself:
This is a conversation with ChatGPT. Unfortunately, sharing a DeepSeek conversation isn’t possible, so I’ve attached a video instead:
(Sorry, the prompt is in Russian, because my original post was written in Russian.)
Notice how DeepSeek pointed out the lack of a native method and suggested the lowest-level solution for the task: the parse5 library.
ChatGPT also noted the absence of a native solution, but its primary recommendation was a heavier, jQuery-like approach (cheerio).
When it comes to coding, DeepSeek seems to think more deeply and offers more optimized solutions.
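For context, here is roughly what the two suggested approaches look like side by side. Both require installing a third-party package (`npm install parse5 cheerio`), since Node.js has no built-in HTML parser; this is a sketch, not output from either AI:

```javascript
// parse5: low-level, spec-compliant parsing into a DOM-like tree.
const parse5 = require('parse5');
const doc = parse5.parse('<p>Hello</p>');
// `doc` is a plain tree of nodes you walk manually.

// cheerio: a heavier layer with jQuery-like querying on top of a parser.
const cheerio = require('cheerio');
const $ = cheerio.load('<p>Hello</p>');
console.log($('p').text()); // selects and reads the <p> text
```

parse5 gives you the raw tree with minimal overhead; cheerio trades some weight for a far more convenient query API, which explains the two models' different recommendations.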
P.S. I've personally replaced ChatGPT with DeepSeek in my pet project. For me the transition was easy: my code's interface allows quick switching between AI providers, so all I had to do was add a new provider implementing the AI interface with methods like prompt, and so on.
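The provider-switching setup described above can be sketched like this. The class and method names are my own illustration, not the actual code from the project; the real API calls are stubbed out with placeholder returns:

```javascript
// Each provider implements the same minimal interface: an async prompt() method.
// Swapping providers then becomes a one-line change at the construction site.

class OpenAIProvider {
  async prompt(text) {
    // ...real implementation would call the OpenAI API here...
    return `openai: ${text}`;
  }
}

class DeepSeekProvider {
  async prompt(text) {
    // ...real implementation would call the DeepSeek API here; its API is
    // OpenAI-compatible, so often only the base URL and model name change...
    return `deepseek: ${text}`;
  }
}

// The rest of the app depends only on the prompt() interface:
const ai = new DeepSeekProvider();
ai.prompt('Hello').then((reply) => console.log(reply));
```

Because DeepSeek exposes an OpenAI-compatible API, in practice even existing OpenAI client code can often be pointed at DeepSeek by changing the base URL and model name.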