DeepSeek-R1 is a boon for enterprises, making AI apps cheaper, easier to build and more innovative
The release of the DeepSeek-R1 reasoning model has sent shock waves through the technology industry, the most visible sign being the sudden sell-off of major AI stocks. The advantage of well-funded AI labs such as OpenAI and Anthropic no longer seems so solid, as DeepSeek has reportedly been able to develop its o1 competitor at a fraction of the cost.
While some AI labs are currently in crisis mode, for the enterprise sector this is mostly good news.
Cheaper applications, more applications
As we have noted before, one of the trends worth watching in 2025 is the continued decline in the cost of using AI models. Enterprises should experiment and build prototypes with the most capable AI models regardless of price, knowing that continued price drops will allow them to eventually deploy their applications at scale.
That trend just saw a huge step change. OpenAI's o1 costs $60 per million output tokens versus $2.19 per million for DeepSeek-R1. And if you are concerned about sending your data to Chinese servers, you can access R1 through U.S.-based providers such as Together AI and Fireworks AI, where it is priced at $8 and $9 per million tokens, respectively. That is still a huge bargain compared to o1.
To be fair, o1 still has an edge over R1, but not enough to justify such a huge price difference. Moreover, R1's capabilities will be sufficient for most enterprise applications. And we can expect more advanced and capable models to be released in the coming months.
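To put the price gap in concrete terms, here is a minimal sketch that compares monthly inference costs at the per-million-token prices cited above. The workload size is a hypothetical example, not a figure from the article.

```python
# Per-million-token prices cited in the article (output tokens).
PRICES_PER_MILLION_TOKENS = {
    "openai-o1": 60.00,
    "deepseek-r1": 2.19,
    "r1-us-hosted": 8.00,  # Together AI / Fireworks AI range: $8-9
}

def monthly_cost(tokens_per_month: int, price_per_million: float) -> float:
    """Dollar cost for a given monthly token volume at a per-million price."""
    return tokens_per_month / 1_000_000 * price_per_million

# Hypothetical workload: 50 million tokens per month.
tokens = 50_000_000
for model, price in PRICES_PER_MILLION_TOKENS.items():
    print(f"{model}: ${monthly_cost(tokens, price):,.2f}/month")
# At these prices, the same workload costs $3,000 on o1 versus
# $109.50 on DeepSeek's own API and $400 on a US-hosted R1.
```

Even the US-hosted option comes in at well under a seventh of the o1 bill for the same token volume.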
We can also expect second-order effects on the broader AI market. For example, OpenAI CEO Sam Altman announced that free ChatGPT users will soon have access to o3-mini. Although he did not explicitly mention R1 as the reason, the fact that the announcement came so soon after R1's release is telling.
big news: the free tier of chatgpt is going to get o3-mini!
— Sam Altman (@sama) January 23, 2025
(and the plus tier will get tons of o3-mini usage)
More innovation
R1 still leaves many unanswered questions; for example, there are multiple reports that DeepSeek trained the model on outputs from OpenAI's large language models (LLMs). But if its paper and technical report are accurate, DeepSeek has managed to create a model that nearly matches the state of the art while slashing costs and removing some of the technical steps that require a lot of manual labor.
If others can reproduce DeepSeek's results, it could be good news for AI labs and companies that have been held back by the financial barriers to innovation in this field. Enterprises can expect faster innovation and more AI products to power their applications.
Today's "DeepSeek selloff" in the stock market — attributed to DeepSeek V3/R1 disrupting the tech ecosystem — is another sign that the application layer is a great place to be. The foundation model layer being hyper-competitive is great for people building applications.
— Andrew Ng (@AndrewYNg) January 27, 2025
What will happen to the billions of dollars that big tech companies have spent acquiring hardware accelerators? We still have not reached the ceiling of what AI can do, so leading tech companies will be able to do more with their resources. More affordable AI will, in fact, increase demand in the medium and long term.
Jevons paradox strikes again! As AI gets more efficient and accessible, we will see its use skyrocket, turning it into a commodity we just can't get enough of. https://t.co/omEcOPhdIz
— Satya Nadella (@satyanadella) January 27, 2025
But more importantly, R1 is proof that not everything depends on bigger compute clusters and datasets. With the right engineering chops and good talent, you can push the limits of what is possible.
Open source for the win
To be clear, R1 is not fully open source, as DeepSeek has only released the weights, not the training code or full details of the training data. Nevertheless, it is a big win for the open-source community. Since the release of DeepSeek-R1, more than 500 derivatives have been published on Hugging Face, and the model has been downloaded millions of times.
It's been released just a few days ago and already more than 500 derivative models of @deepseek_ai have been created all over the world on @huggingface with 2.5 million downloads (5x the original weights).
— clem 🤗 (@ClementDelangue) January 27, 2025
The power of decentralized open-source AI!
It will also give enterprises more flexibility in where to run their models. In addition to the full 671-billion-parameter model, there are distilled versions of R1 ranging from 1.5 billion to 70 billion parameters, allowing companies to run the model on a variety of hardware. Moreover, unlike o1, R1 reveals its full chain of thought, giving developers better insight into the model's behavior and the ability to steer it in the desired direction.
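One practical consequence of the exposed chain of thought: applications can separate the reasoning trace from the final answer. The sketch below assumes the common R1-style convention of wrapping the reasoning in `<think>...</think>` tags; the exact tag format may vary by provider, and the sample string is illustrative.

```python
import re

# Assumption: the model emits its reasoning inside <think>...</think>
# before the final answer, as R1-style models commonly do.
THINK_RE = re.compile(r"<think>(.*?)</think>", re.DOTALL)

def split_reasoning(output: str) -> tuple[str, str]:
    """Return (chain_of_thought, final_answer) from a raw completion."""
    match = THINK_RE.search(output)
    if not match:
        # No reasoning trace found; treat the whole output as the answer.
        return "", output.strip()
    reasoning = match.group(1).strip()
    answer = output[match.end():].strip()
    return reasoning, answer

# Illustrative completion, not real model output.
sample = "<think>The user asks for 2+2. That is 4.</think>The answer is 4."
cot, answer = split_reasoning(sample)
print(cot)     # The user asks for 2+2. That is 4.
print(answer)  # The answer is 4.
```

A developer might log or inspect `cot` to debug the model's behavior while showing only `answer` to end users.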
With open-source models catching up to closed ones, we can hope for a renewed commitment to sharing knowledge and research so that everyone can benefit from progress in AI.
To people who think "China is surpassing the US in AI", the correct thought is "Open source models are surpassing closed ones"

See ⬇️⬇️⬇️

— Yann LeCun (@ylecun) January 25, 2025