SiMa.ai, a Silicon Valley-based startup that makes embedded ML system-on-chip (SoC) platforms, today announced that it has raised a $70 million expansion funding round as it prepares to bring to market a second-generation chipset built specifically for multimodal generative AI processing.
According to Gartner, the global AI chip market is expected to reach $119.4 billion by 2027, more than doubling from 2023. However, only a few companies have started producing specialized semiconductors for AI applications, and most of the prominent players initially focused on supporting AI in the cloud. Nevertheless, various reports predict significant growth in edge AI, where the hardware processing the AI calculations sits closer to the data collection source rather than in a centralized cloud. SiMa.ai, named after “seema,” the Hindi word for “boundary,” aims to capitalize on this shift by delivering edge AI SoCs to organizations across industrial manufacturing, retail, aerospace, defense, agriculture, and healthcare.
The San Jose-headquartered startup, which targets the 5W to 25W power-consumption segment of the market, debuted its first ML SoC to enable AI and ML through a combination of integrated software and hardware. The package includes a proprietary chipset and no-code software called Palette, and the combination is already being used by more than 50 companies around the world, Krishna Rangasayee, founder and CEO of SiMa.ai, told TechCrunch.
The company touts that its current-generation ML SoC achieved the highest FPS/W results in the MLPerf Inference 4.0 benchmarks across the closed, edge, and power division categories. However, that first-generation chipset focused on classic computer vision.
As demand for GenAI increases, SiMa.ai is focused on bringing multimodal GenAI capabilities to its customers and plans to introduce its second-generation ML SoC in the first quarter of 2025. Rangasayee said the new SoC will be an “evolutionary change” over the previous-generation SoC, with “several architectural adjustments” to the existing ML chipset, while the basic concept remains the same.
Like the company's existing ML platform, the new GenAI SoC works with any framework, network, model, or sensor and is compatible with any modality, including voice, audio, text, and images. It will serve as a single edge platform for all AI across computer vision, transformers, and multimodal GenAI, the company said.
“You can't predict the future, but you can pick a vector and say, ‘This is the vector I want to bet on,’ and continue to evolve along that vector. That's kind of the approach we took architecturally,” Rangasayee said. “But fundamentally, we haven't really had to retreat or change our architecture significantly. That's also a benefit of being where we are.”
SiMa.ai uses Taiwan's TSMC as the manufacturing partner for its first- and second-generation AI chipsets, with Arm Holdings as its compute subsystem provider. The second-generation chipset is based on TSMC's 6nm process technology and includes the Synopsys EV74 embedded vision processor for pre- and post-processing in computer vision applications.
The company counts as competitors established players such as NXP, Texas Instruments, STMicro, Renesas, Microchip Technology, and Nvidia, as well as AI chip startups such as Hailo. However, like other AI chip startups, it sees Nvidia as its main competitor.
Rangasayee told TechCrunch that while Nvidia is “great in the cloud,” it doesn't build a platform for the edge. He believes Nvidia lacks adequate power efficiency and software for edge AI. Similarly, he argued that other startups building AI chipsets are not solving system problems, only providing ML acceleration.
“Among our peers, Hailo has done a really good job. And it's not like we're better than them. But from our perspective, our value proposition is completely different,” he said.
The founder went on to claim that SiMa.ai delivers higher performance and better power efficiency than Hailo, and that its system software is completely different and well suited to GenAI.
“As long as we're solving customer problems, and as long as we're better at it than anyone else, we're in a good position,” he said.
SiMa.ai's new all-equity funding, led by Maverick Capital with participation from Point72 and Jericho, extends the startup's $30 million Series B round, which was first announced in May 2022. Existing investors Amplify Partners, Dell Technologies Capital, Fidelity Management, and Lip-Bu Tan also participated in the additional investment. With this funding, the five-year-old startup has raised a total of $270 million.
The company currently employs 160 people, 65 of whom are based at its research and development center in Bangalore, India. SiMa.ai plans to grow its headcount by adding new roles and expanding its research and development capabilities. The company also hopes to develop a go-to-market team for its Indian customers. Additionally, the startup plans to expand its customer-facing team globally, starting with South Korea and Japan, as well as Europe and the United States.
“The computational intensity of generative AI has sparked a paradigm shift in data center architecture. The next step in this evolution is widespread adoption of AI at the edge, and the computing landscape is poised for a complete transformation. SiMa.ai has three key ingredients in place to capitalize on this seismic shift: a best-in-class team, cutting-edge technology, and forward momentum. We are excited to work with SiMa.ai to seize this once-in-a-generation opportunity,” Andrew Homan, senior managing director at Maverick Capital, said in a statement.