Sandlogic Pursues Nvidia-Scale Stack With In-House Chips

Bengaluru-based Sandlogic, led by founder Kamalakar Devaki, is pivoting from enterprise AI services to a full-stack edge AI strategy that includes custom silicon. Backed by about $4 million in external funding and over half a million in government grants, the company has built enterprise automation for clients such as Bajaj Finance and Indira IVF but has run losses while investing in chip design, models, and a runtime platform. Sandlogic aims to match the performance of Nvidia's A100 with its Krsna AI co-processor and to offer a CUDA-like portability layer via EdgeMatrix. The ambition is technically plausible for targeted edge workloads but will require substantial capital, IP, and ecosystem adoption to compete with entrenched players.
What happened
Sandlogic, a Bengaluru AI startup founded in 2018 by Kamalakar Devaki, is shifting from B2B enterprise AI services to building a vertically integrated edge AI stack that includes custom silicon, models, and a runtime platform. The company reports total external funding of around $4 million plus government grants, and last reported revenue of roughly Rs 1 crore. Its stated goal is to approach the performance profile of Nvidia's A100 with a bespoke Krsna AI co-processor while delivering developer portability through a CUDA-like runtime, EdgeMatrix.
Technical details
Sandlogic is combining three technical pillars to deliver a full-stack edge offering.
- A custom co-processor, Krsna AI, positioned for inference and edge workloads.
- Small and vision-language models, marketed as Shakti SLMs and VLMs, optimized for on-device deployment.
- A portability/runtime layer, EdgeMatrix, intended to let developers "build once, run anywhere" across heterogeneous edge hardware.
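EdgeMatrix's actual API is not public, but the "build once, run anywhere" pattern it describes typically comes down to registering per-backend kernels behind one interface and dispatching to whatever hardware is present. A minimal, purely illustrative sketch (all names hypothetical):

```python
# Hypothetical sketch of a portability layer in the spirit of EdgeMatrix.
# None of these names come from Sandlogic; they only illustrate the pattern.
from typing import Callable, Dict, List

class EdgeRuntime:
    """Registers per-backend kernels and dispatches to whichever is available."""

    def __init__(self) -> None:
        self._backends: Dict[str, Callable[[List[float]], List[float]]] = {}

    def register(self, name: str, kernel: Callable[[List[float]], List[float]]) -> None:
        self._backends[name] = kernel

    def run(self, inputs: List[float], prefer: str = "npu") -> List[float]:
        # Fall back to the CPU reference kernel if the preferred
        # accelerator backend was never registered on this device.
        kernel = self._backends.get(prefer) or self._backends["cpu"]
        return kernel(inputs)

rt = EdgeRuntime()
rt.register("cpu", lambda xs: [x * 2.0 for x in xs])  # reference implementation
out = rt.run([1.0, 2.0, 3.0])  # no "npu" backend registered -> CPU fallback
```

The application code calls `run` the same way everywhere; only the registered kernels change per device, which is the portability promise a CUDA-like layer makes.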
The stack targets low-latency, resource-constrained inference scenarios typical of manufacturing, logistics, fintech, and healthcare. Existing customer projects include document parsing, lead scoring for fertility clinics, and voice agents for call centers, which demonstrate Sandlogic's data and domain expertise. Matching A100-class throughput or versatility is a high bar; the A100 is a datacenter GPU optimized for large-batch training and large models. Sandlogic's realistic near-term path is to optimize for edge-specific performance metrics: power efficiency, quantized inference, model compression, and low-latency I/O.
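Of the edge metrics listed above, quantized inference is the most concrete: storing weights as int8 instead of float32 cuts memory roughly 4x and enables cheaper integer arithmetic. A toy sketch of symmetric per-tensor int8 quantization (real edge toolchains add per-channel scales and calibration data):

```python
# Toy symmetric int8 quantization -- the kind of model compression
# edge inference stacks depend on. Illustrative only, not Sandlogic's scheme.
from typing import List, Tuple

def quantize_int8(weights: List[float]) -> Tuple[List[int], float]:
    """Map float weights to int8 values using one symmetric scale factor."""
    scale = max(abs(w) for w in weights) / 127.0  # largest weight maps to +/-127
    return [round(w / scale) for w in weights], scale

def dequantize(q: List[int], scale: float) -> List[float]:
    """Approximate reconstruction of the original floats."""
    return [v * scale for v in q]

w = [0.5, -1.27, 0.02]
q, s = quantize_int8(w)          # ints fit in one byte each vs four for float32
w_hat = dequantize(q, s)         # close to w, within one quantization step
```

The reconstruction error is bounded by half a quantization step (`scale / 2`), which is the accuracy-for-footprint tradeoff edge deployments accept.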
Context and significance
The move reflects two converging trends: enterprises want tighter hardware-software integration for edge AI, and regional startups are seeking sovereignty over AI infrastructure. Nvidia's advantage is not only silicon but also developer lock-in through CUDA and a vast ecosystem. Sandlogic acknowledges this and is explicitly attempting to replicate the software portability and toolchain benefits that make Nvidia ubiquitous. However, competing with incumbent chipmakers requires deep IP in RTL design, fabrication partnerships, compilers, runtime optimization, and a tolerance for the long sales cycles of edge devices.
Financially, Sandlogic is making a classic deep-tech tradeoff: short-term profitability for long-term platform value. The company has operated profitably in the past but has recorded losses in the last two years while investing in chips and models. With $4 million in external funding, the runway for a chip-first strategy is limited; chip development typically demands tens to hundreds of millions to reach production at scale.
What to watch
Execution hinges on three variables: fundraising or strategic partnerships for silicon tapeout and manufacturing, demonstrable performance gains on real edge benchmarks versus existing accelerators, and adoption of EdgeMatrix by third-party developers or OEMs. If Sandlogic secures fabrication partners, a clear cost-performance lead on targeted workloads, and early OEM integrations, it could become a credible regional alternative for edge AI. If not, the pivot risks draining resources and eroding the enterprise business that funded earlier growth.
Scoring Rationale
The story matters because it highlights a rare regional attempt to assemble chips, models, and a runtime for edge AI, a relevant infrastructure trend. The company's limited funding and the scale of the challenge cap its near-term systemic impact, keeping the story notable but not industry-shaking.


