Companies and governments are looking for tools to run AI locally to reduce cloud infrastructure costs and build sovereign capacity. Quadric, a chip-IP startup founded by veterans of early Bitcoin mining firm 21E6, is trying to power that shift with its on-device inference technology, scaling beyond automotive into laptops and industrial equipment.
That expansion is already paying off.
CEO Veerbhan Kheterpal (pictured above, center) told TechCrunch in an interview that Quadric will generate $15 million to $20 million in licensing revenue in 2025, up from about $4 million in 2024. The company, which is based in San Francisco and has offices in Pune, India, is targeting up to $35 million this year as it builds a royalty-driven on-device AI business. Kheterpal said that growth has pushed the company's post-money valuation to between $270 million and $300 million, up from about $100 million at its 2022 Series B.
The growth has also helped Quadric attract investors. The company last week announced a $30 million Series C round led by the Accelerate Fund, managed by BEENEXT Capital Management, bringing its total funding to $72 million. Kheterpal told TechCrunch that the momentum comes as investors and chipmakers look for ways to move more AI workloads from centralized cloud infrastructure to devices and local servers.
Everything from automotive
Quadric originated in automotive, where on-device AI could power real-time functions like driver assistance. Kheterpal said the spread of transformer-based models into "just about everything" since 2023 has led to a sharp shift in business over the past 18 months, as more companies try to run AI on-premises rather than relying on the cloud.
“Nvidia has a strong platform for data-center AI,” Kheterpal said. “We were looking to create a similar CUDA-like or programmable infrastructure for on-device AI.”
Unlike Nvidia, Quadric doesn't make its own chips. Instead, it licenses its programmable AI processor IP, which Kheterpal described as a "blueprint" that customers can embed into their own silicon, along with the software stack and toolchain to run models, including vision and voice models, on the device.
The startup's technology is being designed into printers, cars, and AI laptops; its customers include Kyocera and Japanese auto supplier Denso, which makes chips for Toyota vehicles. Kheterpal told TechCrunch that the first products based on Quadric's technology are expected to ship this year, starting with laptops.
Kheterpal said Quadric is now looking beyond traditional commercial deployments and into markets exploring "sovereign AI" strategies to reduce reliance on US-based infrastructure. He said the startup is pursuing customers in India and Malaysia and counts Moglix CEO Rahul Garg as a strategic investor who helps shape its "sovereign AI" push in India. Quadric employs about 70 people worldwide, including about 40 in the US and about 10 in India.
Kheterpal said the move is driven by the rising cost of centralized AI infrastructure and the difficulty many countries face in building hyperscale data centers. That is fueling interest in "distributed AI" setups, where inference runs on laptops inside offices or on small on-premises servers rather than relying on cloud-based services for each query.
The World Economic Forum pointed to this shift in a recent article, noting that AI inference is moving closer to users and away from fully centralized architectures. Similarly, EY said in a November report that the sovereign AI approach has gained traction as policymakers and industry groups emphasize domestic AI capabilities spanning compute, models, and data, rather than relying solely on foreign infrastructure.
The challenge for chipmakers, Kheterpal said, is that AI models are evolving faster than hardware design cycles. He argued that customers need programmable processor IP that can keep pace through software updates rather than requiring expensive redesigns each time model architectures shift, as they have from earlier vision-centric models to today's transformer-based systems.
Quadric is positioning itself as an alternative to chip vendors like Qualcomm, which typically keeps its AI technology inside its own processors, as well as to IP suppliers like Synopsys and Cadence, which sell neural processing engine blocks. Kheterpal said Qualcomm's approach can lock customers into its silicon, whereas the traditional IP suppliers offer engine blocks that many customers find difficult to program.
Quadric's programmable approach lets customers support new AI models through software updates rather than hardware redesigns, an advantage in an industry where chip development can take years while model architectures now change every few months.
Still, Quadric is in its early stages: it has signed up only a handful of customers so far, and much of its long-term profit hinges on turning today's licensing deals into high-volume shipments and recurring royalties.