DeepSeek’s breakthrough has sent shockwaves through Silicon Valley, demonstrating that high-performance AI models can be developed at a fraction of the typical cost. This Chinese startup’s approach has prompted US competitors to scramble for cost-effective solutions and raised critical questions about the billions being invested in AI infrastructure.
The energy consumption crisis in AI development has become impossible to ignore. Training large AI models requires massive processing power from GPU clusters housed in energy-intensive data centers. According to Andreas Riegler, general partner at APEX Ventures, “The energy consumption of training a large AI model can produce emissions equivalent to the lifetime emissions of multiple cars.” As models grow larger, energy demands scale exponentially, creating urgent sustainability concerns.
Twelve innovative startups are leading the charge toward more efficient AI solutions. Business Insider consulted seven investors across Europe and the US to identify companies making AI cheaper and greener. These startups employ diverse strategies including improving software efficiency, developing energy-efficient chips, and leveraging renewable energy sources.
Notable companies include Berlin-based Mobius Labs, which claims to cut AI costs by using 10 times less computing power. The startup has successfully “quantized” Meta’s Llama 70B model, enabling it to run on a single GPU instead of four without sacrificing accuracy. German startup Gemesys has developed neuromorphic chips that mimic the human brain, processing multiple tasks simultaneously for applications such as pattern recognition and image processing.
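For readers unfamiliar with quantization, here is a minimal sketch of the general idea (illustrative only, not Mobius Labs’ actual pipeline): weights stored as 32-bit floats are mapped to 8-bit integers plus a scale factor, cutting memory roughly fourfold at the cost of a small, bounded rounding error.

```python
def quantize_int8(weights):
    """Map float weights to int8 values plus a per-tensor scale factor."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 representation."""
    return [v * scale for v in q]

weights = [0.81, -1.27, 0.05, 2.54, -0.42]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# Each int8 value needs 1 byte instead of 4 for float32, a 4x memory cut,
# while the round-trip error is bounded by half the scale factor.
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q, round(max_err, 4))
```

Real quantization schemes are more sophisticated (per-channel scales, calibration, mixed precision), but the memory-versus-precision trade-off they exploit is the one shown here.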
Toronto-based Cohere, having raised $975 million, provides enterprise-grade large language models with partnerships including McKinsey and Oracle. The company emphasizes “highest accuracy at the lowest cost,” delivering superior cost of ownership for customers. Swiss companies Corintis and Apheros are tackling the cooling challenge, with Corintis developing precision microfluidic cooling solutions that achieve 10 times more efficiency than current methods.
MIT spinout Liquid AI is pioneering liquid neural networks that maintain near-constant inference time regardless of context length, dramatically reducing computational resources and energy consumption. Meanwhile, Boston-based Mako automates GPU tuning to reduce compute costs by up to 70%, and Cambridge spinout PoroTech works with gallium nitride semiconductors to build more energy-efficient, light-based technologies.
Key Quotes
“The energy consumption of training a large AI model can produce emissions equivalent to the lifetime emissions of multiple cars. As models grow in size, the demand for energy scales exponentially, raising sustainability concerns for future applications.”
Andreas Riegler, general partner at APEX Ventures, highlighted the urgent environmental crisis facing AI development, emphasizing how current training methods create unsustainable carbon emissions that grow exponentially with model size.
“The company is providing high-performance LLM models, with the highest accuracy at the lowest cost — thus providing the lowest cost of ownership for its customers.”
Umesh Padval, managing director at Thomvest Ventures, explained Cohere’s value proposition, emphasizing how the Toronto-based startup delivers enterprise-grade AI solutions that prioritize both performance and cost-efficiency.
“By 2030, an estimated 6% of global energy consumption will be used for cooling data centers — and a need for cost- and energy-efficient liquid-based solutions is inevitable.”
Antonia Albert, principal at Founderful, underscored the critical importance of cooling solutions like those developed by Apheros, highlighting the massive energy demands that data centers will place on global infrastructure within just five years.
“Unlike traditional transformer-based models, whose memory usage and inference time increase with longer input sequences, LFMs maintain near-constant inference time and memory complexity regardless of context length.”
Chip Hazard from Flybridge Capital explained how Liquid AI’s innovative liquid neural networks fundamentally differ from traditional models, processing longer sequences without proportional increases in computational resources or energy consumption.
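The contrast Hazard describes can be illustrated with a toy comparison (hypothetical, and not Liquid AI’s actual architecture): a transformer-style model accumulates a key-value cache that grows with every token it reads, while a fixed-size recurrent state is updated in place and never grows.

```python
def transformer_like_memory(tokens, d_model=64):
    """KV cache gains one entry per token: memory grows linearly with length."""
    kv_cache = []
    for tok in tokens:
        kv_cache.append([float(tok)] * d_model)  # keys/values kept for every token
    return len(kv_cache) * d_model  # floats held at the end of the sequence

def recurrent_like_memory(tokens, d_state=64):
    """A fixed-size state is updated in place: memory is constant at any length."""
    state = [0.0] * d_state
    for tok in tokens:
        state = [0.9 * s + 0.1 * float(tok) for s in state]  # toy state update
    return len(state)

short, long_seq = list(range(10)), list(range(10_000))
print(transformer_like_memory(short), transformer_like_memory(long_seq))  # 640 vs 640000
print(recurrent_like_memory(short), recurrent_like_memory(long_seq))      # 64 vs 64
```

A 1,000-fold longer input costs the transformer-style sketch 1,000 times the memory, while the recurrent sketch holds the same 64 floats throughout, which is the property the quote attributes to LFMs.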
Our Take
DeepSeek’s disruption reveals a fundamental truth: the AI industry’s race toward ever-larger models may have been misguided. The startups profiled here represent a paradigm shift from brute-force scaling to intelligent optimization. What’s particularly striking is the diversity of approaches—from neuromorphic chips mimicking brain architecture to precision cooling solutions and automated GPU tuning. This suggests multiple pathways exist toward sustainable AI, not just one silver bullet.
The timing couldn’t be more critical. As AI becomes embedded in everyday business operations, unsustainable energy consumption threatens both the technology’s viability and public acceptance. These efficiency-focused startups aren’t just reducing costs; they’re ensuring AI’s long-term survival. The companies that master efficient AI deployment will define the next decade of technological innovation, making this one of the most important trends to watch in 2025.
Why This Matters
This development represents a pivotal moment for the AI industry’s sustainability and economic viability. DeepSeek’s cost-effective approach has exposed potential inefficiencies in the billions being spent on AI infrastructure, forcing an industry-wide reckoning about resource allocation and environmental impact.
The environmental stakes are enormous. By 2030, an estimated 6% of global energy consumption will be dedicated to cooling data centers alone. Without innovative solutions, AI’s carbon footprint could become unsustainable, potentially limiting the technology’s growth and societal acceptance.
For businesses, these efficiency gains translate directly to competitive advantage. Companies that can deploy high-performance AI models at lower costs will democratize access to advanced AI capabilities, enabling smaller enterprises to compete with tech giants. This shift could accelerate AI adoption across industries while addressing legitimate concerns about the technology’s environmental impact.
The emergence of these startups signals a maturation of the AI industry, moving beyond the “bigger is better” mentality toward smarter, more sustainable approaches that balance performance with responsibility.
Recommended Reading
For those interested in learning more about artificial intelligence, machine learning, and effective AI communication, here are some excellent resources:
Related Stories
- How AI Can Help Make Buildings More Energy Efficient
- Encharge AI Secures $21.7M Series A Funding to Revolutionize AI Chip Efficiency
- Wall Street Asks Big Tech: Will AI Ever Make Money?
- The AI Hype Cycle: Reality Check and Future Expectations
- Jensen Huang: TSMC Helped Fix Design Flaw with Nvidia’s Blackwell AI Chip
Source: https://www.businessinsider.com/ai-startups-efficient-cheaper-greener-vcs-2025-2