EnCharge AI Secures $100M Series B to Revolutionize Energy-Efficient AI Chips

EnCharge AI, a California-based startup founded in 2022, has successfully raised $100 million in Series B funding led by prominent investment firm Tiger Global. This substantial investment brings the company’s total funding to approximately $144 million and positions it to commercialize groundbreaking AI chip technology that promises to transform how artificial intelligence computations are performed.

The startup has developed an innovative in-memory computing AI chip that delivers a remarkable twenty-fold improvement in energy efficiency compared to traditional architectures. This breakthrough technology enables inference computation—the process where AI systems make predictions based on new data—to occur locally on devices rather than relying exclusively on cloud-based data centers. This capability opens possibilities for advanced AI applications to run on smaller computers, edge devices, and embedded systems across various industries.

EnCharge’s technology stems from research by its CEO, Naveen Verma, into non-volatile memory devices that retain stored information even without power. The company’s commercial chips, however, use static random-access memory (SRAM) for in-memory computing. Either way, the goal is the same: to address a critical inefficiency of conventional computing architectures, in which shuttling data between separate memory and processing units consumes excessive time and energy.
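The data-movement bottleneck described above can be made concrete with a rough back-of-envelope calculation. The sketch below is illustrative only: the per-operation energy figures are widely cited order-of-magnitude estimates for a 45 nm process (not EnCharge measurements), and the workload size is an arbitrary assumption.

```python
# Back-of-envelope: energy spent moving data vs. computing on it.
# Per-operation figures are illustrative order-of-magnitude estimates
# for a 45 nm process, NOT EnCharge benchmarks.

PJ_PER_32BIT_DRAM_READ = 640.0   # off-chip DRAM access (picojoules)
PJ_PER_32BIT_SRAM_READ = 5.0     # small on-chip SRAM access
PJ_PER_32BIT_FP_MAC = 4.6        # fused multiply + add, roughly

def inference_energy_pj(n_macs: int, bytes_moved: int,
                        pj_per_32bit_read: float) -> float:
    """Total energy (pJ): n_macs multiply-accumulates plus data movement."""
    words_moved = bytes_moved / 4  # count 32-bit words
    return n_macs * PJ_PER_32BIT_FP_MAC + words_moved * pj_per_32bit_read

# Hypothetical workload: one million MACs, 4 MB of weights/activations
# streamed through memory once.
macs, data = 1_000_000, 4 * 1024 * 1024

dram = inference_energy_pj(macs, data, PJ_PER_32BIT_DRAM_READ)
sram = inference_energy_pj(macs, data, PJ_PER_32BIT_SRAM_READ)

print(f"DRAM-bound: {dram / 1e6:.1f} uJ")
print(f"On-chip:    {sram / 1e6:.1f} uJ")
print(f"Ratio:      {dram / sram:.0f}x")
```

Even in this crude model, the energy of moving data dominates the energy of the arithmetic itself by orders of magnitude, which is the inefficiency that keeping computation inside or next to memory is meant to eliminate.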

Applications for the hardware span multiple sectors, including automotive sensing, smart retail, and industrial robotics, demonstrating the versatility of the technology. EnCharge is collaborating with semiconductor manufacturing giant TSMC to refine its first-generation commercial chips, leveraging world-class fabrication capabilities.

The Series B round attracted significant investor interest, with participation from Maverick Silicon, Capital TEN, SIP Global Partners, Zero Infinity Partners, CTBC VC, Vanderbilt University, Morgan Creek Digital, and others. Previous backers including RTX Ventures and Anzu Partners also joined the round, signaling continued confidence in the company’s vision.

With this fresh capital injection, EnCharge AI plans to accelerate bringing its AI accelerator solutions to market, potentially disrupting the current AI infrastructure landscape dominated by power-hungry data center operations. The company’s approach addresses growing concerns about AI’s environmental impact and operational costs while enabling new use cases for edge AI deployment.

Key Quotes

“This allows inference computation — when AI makes predictions based on new data — to take place outside the cloud and on local devices.”

This statement explains the core value proposition of EnCharge’s technology, highlighting how their chips enable decentralized AI processing that doesn’t depend on cloud infrastructure, which has significant implications for privacy, latency, and operational costs.

“Other computing architectures don’t use in-memory techniques, which can make devices unsuitable for running AI calculations because it takes too much time and energy to move data back and forth between memory and processing units.”

This quote identifies the fundamental inefficiency in traditional computing architectures that EnCharge’s technology addresses, explaining why conventional chips struggle with AI workloads and how in-memory computing provides a solution.

Our Take

EnCharge AI’s emergence reflects a broader trend toward specialized AI hardware designed for specific workloads rather than general-purpose computing. The twenty-fold efficiency gain is not merely an incremental improvement; it is potentially transformative for edge AI deployment. What’s particularly noteworthy is the timing: as regulatory pressure mounts around AI’s environmental impact and companies seek to reduce operational costs, energy-efficient solutions become increasingly valuable. The partnership with TSMC provides crucial manufacturing credibility, while the diverse investor base suggests confidence across multiple sectors. However, EnCharge faces significant competition from established chip makers and well-funded startups. Its success will depend on execution speed, real-world performance validation, and its ability to integrate into existing AI development workflows. The focus on inference rather than training is strategically sound, as inference represents the majority of deployed AI workloads.

Why This Matters

EnCharge AI’s funding and technology represent a critical development in addressing one of artificial intelligence’s most pressing challenges: energy consumption and efficiency. As AI models grow increasingly complex and computationally demanding, the environmental and economic costs of running them in massive data centers have become unsustainable. This twenty-fold efficiency improvement could fundamentally reshape AI deployment strategies.

The ability to perform sophisticated AI inference on edge devices and local hardware has profound implications for privacy, latency, and accessibility. Industries from automotive to retail can deploy AI capabilities without constant cloud connectivity, enabling real-time decision-making in autonomous vehicles, smart manufacturing, and IoT applications. This democratizes AI access beyond companies with massive data center infrastructure.

The $100 million investment from Tiger Global and other prominent investors signals strong market confidence in specialized AI chip architectures as alternatives to general-purpose GPUs. As the AI chip market becomes increasingly competitive, EnCharge’s in-memory computing approach represents a differentiated strategy that could capture significant market share in edge AI applications, potentially challenging established players in specific use cases.

Source: https://www.businessinsider.com/encharge-ai-funding-tiger-global-pitch-deck-2025-2