The article discusses how AI distillation techniques, particularly those demonstrated by DeepSeek, are challenging big tech companies’ dominance in AI development. DeepSeek has shown that smaller, more efficient models can match or exceed the performance of much larger ones through knowledge distillation, a process in which a compact “student” model is trained to reproduce the behavior of a larger “teacher” (a sketch of the standard objective appears below). The article suggests that by 2025 high-quality AI models could become cheaper and more accessible, potentially disrupting a market currently dominated by companies like OpenAI and Google.

As evidence, the article highlights DeepSeek’s 7-billion-parameter model, which achieved performance comparable to GPT-3.5 despite being far smaller. This suggests that AI capabilities may become a commodity sooner than expected, with implications for big tech companies’ business models and market positions. The piece also explores how this democratization of AI technology could increase competition and innovation in the field while eroding the competitive advantage currently held by the major players.

Experts cited in the article argue that the trend could accelerate the development of more efficient AI systems and put advanced AI capabilities within reach of smaller companies and independent developers. The conclusion emphasizes that while large language models won’t disappear, the ability to create smaller, equally capable models could fundamentally reshape the AI industry’s landscape.
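The article does not describe DeepSeek’s exact training recipe, but the classic distillation objective (soft targets from a teacher blended with hard labels, in the style of Hinton et al.) conveys the mechanism. The following is a minimal PyTorch sketch; the temperature, weighting factor, and toy tensors are illustrative assumptions, not details from the piece.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend hard-label cross-entropy with a soft-target KL term.

    The KL term pushes the student's temperature-softened output
    distribution toward the teacher's, which is the core idea of
    knowledge distillation. Values of temperature/alpha are
    illustrative assumptions, not figures from the article.
    """
    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)

    # KL divergence between temperature-softened distributions;
    # the T^2 factor keeps gradient magnitudes comparable.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    soft_loss = F.kl_div(soft_student, soft_teacher,
                         reduction="batchmean") * (temperature ** 2)

    return alpha * hard_loss + (1.0 - alpha) * soft_loss


if __name__ == "__main__":
    # Toy usage: a batch of 4 examples over a 10-class output space.
    student_logits = torch.randn(4, 10, requires_grad=True)
    teacher_logits = torch.randn(4, 10)  # frozen teacher outputs
    labels = torch.randint(0, 10, (4,))
    loss = distillation_loss(student_logits, teacher_logits, labels)
    loss.backward()
    print(f"combined distillation loss: {loss.item():.4f}")
```

In practice, the teacher’s logits come from a large pretrained model and only the student’s parameters are updated, which is why distilled models can be trained far more cheaply than their teachers.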