Beyond GPUs: Innatera and the Quiet Uprising of AI Hardware

MONews

While much of the tech world is fixated on the latest large language models (LLMs) powered by Nvidia GPUs, a quieter revolution is taking place in AI hardware. As the limitations and energy requirements of traditional deep learning architectures become increasingly apparent, a new paradigm called neuromorphic computing is emerging that promises to dramatically reduce the computational and power requirements of AI.

Mimicking Nature’s Masterpieces: How Neuromorphic Chips Work

But what exactly is a neuromorphic system? To find out, VentureBeat spoke to Sumeet Kumar, CEO and founder of Innatera, a leading startup in the field of neuromorphic chips.

“Neuromorphic processors are designed to mimic the way a biological brain processes information,” Kumar explained. “Instead of performing sequential operations on data stored in memory, neuromorphic chips use networks of artificial neurons that communicate via spikes, much like real neurons.”
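
To make that idea concrete, the snippet below is a minimal, illustrative leaky integrate-and-fire neuron, the textbook building block that spiking architectures approximate. It is a toy model for intuition only, not Innatera’s implementation, and all parameters are arbitrary.

```python
import numpy as np

def lif_neuron(input_current, threshold=1.0, leak=0.9):
    """Toy leaky integrate-and-fire neuron: it integrates incoming current,
    leaks a little charge each step, and emits a spike only when its
    membrane potential crosses the threshold."""
    membrane = 0.0
    spike_times = []
    for t, current in enumerate(input_current):
        membrane = leak * membrane + current      # leak, then integrate
        if membrane >= threshold:
            spike_times.append(t)                 # communicate via a spike
            membrane = 0.0                        # reset after firing
    return spike_times

# Mostly-quiet input: spikes (and hence work) occur only when inputs add up.
rng = np.random.default_rng(0)
print(lif_neuron(rng.random(50) * 0.3))
```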

This brain-inspired architecture gives neuromorphic systems distinct advantages, making them particularly well suited for edge computing applications in consumer devices and industrial IoT. Kumar highlighted several compelling use cases, including always-on audio processing for voice activation, real-time sensor fusion for robotics and autonomous systems, and ultra-low-power computer vision.


“The key is that neuromorphic processors can perform complex AI tasks using a fraction of the energy of traditional solutions,” Kumar said. “This enables capabilities like continuous environmental awareness in battery-powered devices, which was previously impossible.”

From Doorbells to Data Centers: The Emergence of Real-World Applications

Innatera’s flagship Spiking Neural Processor T1, unveiled in January 2024, demonstrates these advantages well. The T1 combines an event-driven compute engine with a traditional CNN accelerator and a RISC-V CPU to create a comprehensive platform for ultra-low-power AI in battery-powered devices.
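
As a rough mental model of how work might be divided across such a heterogeneous design, the sketch below uses plain-Python stand-ins for the three blocks. It illustrates the general “cheap always-on stage, heavy stages on demand” pattern, not Innatera’s actual programming model; all names and thresholds are invented for illustration.

```python
# Conceptual sketch only: plain-Python stand-ins for the T1's three blocks.
def spiking_engine(events):
    """Stand-in for the event-driven engine: touches only incoming events."""
    return [e for e in events if e > 0.5]          # toy salience filter

def cnn_accelerator(salient):
    """Stand-in for the CNN accelerator: heavier pattern classification."""
    return "person" if len(salient) >= 3 else "nothing"

def riscv_cpu(label):
    """Stand-in for the RISC-V core: application logic and I/O."""
    return f"action: {'wake camera' if label == 'person' else 'stay asleep'}"

sensor_events = [0.2, 0.7, 0.9, 0.6, 0.1, 0.8]
print(riscv_cpu(cnn_accelerator(spiking_engine(sensor_events))))
```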

“Our neuromorphic solution can perform computations using 500 times less energy than existing approaches,” Kumar said. “And we’re seeing pattern recognition speeds that are about 100 times faster than competitors.”
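
To put a 500-fold reduction in perspective, here is a back-of-the-envelope calculation showing how such a factor translates into battery life. Every number in it is a hypothetical placeholder chosen for illustration, not a measured or vendor-supplied figure.

```python
# Back-of-the-envelope battery-life comparison. All figures are hypothetical.
battery_mwh = 1000.0                         # assumed battery budget in mWh
conventional_mw = 50.0                       # assumed always-on draw, conventional
neuromorphic_mw = conventional_mw / 500.0    # applying the claimed 500x factor

print(f"Conventional:  {battery_mwh / conventional_mw:.0f} hours")
print(f"Neuromorphic:  {battery_mwh / neuromorphic_mw:.0f} hours "
      f"(~{battery_mwh / neuromorphic_mw / 24:.0f} days)")
```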

Kumar illustrated this point with compelling real-world applications. Innatera collaborated with Socionext, a Japanese sensor supplier, to develop an innovative solution for human presence detection. The technology, which Kumar demonstrated at CES in January, combines radar sensors with Innatera’s neuromorphic chips to create a highly efficient and privacy-preserving device.

“For example, let’s look at a video doorbell,” Kumar said. “Traditional video doorbells rely on power-hungry image sensors, so they need to be recharged frequently. Our solution uses radar sensors, which are much more energy-efficient.” The system can detect the presence of a person as long as they have a heartbeat, even when the person is completely motionless. And because it does not capture images, it preserves privacy until the camera actually needs to be activated.
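
The general principle behind radar-based presence detection is that even a motionless person produces a tiny periodic chest movement, which shows up as a low-frequency peak in the radar return. The toy simulation below illustrates that idea with synthetic data; it is not Socionext or Innatera code, and every parameter is an assumption.

```python
import numpy as np

# Toy simulation: a heartbeat-like periodic micro-motion stands out as a
# spectral peak even without any imaging. All values are synthetic.
fs = 20.0                                        # assumed sample rate in Hz
t = np.arange(0, 30, 1 / fs)                     # 30-second observation window
heartbeat = 0.02 * np.sin(2 * np.pi * 1.2 * t)   # ~72 bpm chest micro-motion
noise = 0.005 * np.random.default_rng(1).standard_normal(t.size)
displacement = heartbeat + noise                 # simulated radar return

spectrum = np.abs(np.fft.rfft(displacement - displacement.mean()))
freqs = np.fft.rfftfreq(displacement.size, 1 / fs)
band = (freqs > 0.8) & (freqs < 3.0)             # plausible heart-rate band
present = spectrum[band].max() > 5 * np.median(spectrum)
print("presence detected:", present)             # True for this synthetic input
```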

The technology has a wide range of applications beyond doorbells, including smart home automation, building security, and even occupancy detection in vehicles. “This is a perfect example of how neuromorphic computing can transform everyday devices,” Kumar said. “We’re bringing AI capabilities to the edge while actually reducing power consumption and enhancing privacy.”

Doing More with Less in AI Computing

These dramatic improvements in energy efficiency and speed are attracting significant industry interest. Kumar said Innatera is engaged with multiple customers and that momentum behind neuromorphic technology is steadily growing. The company is targeting the sensor-edge application market and has set an ambitious goal of bringing intelligence to 1 billion devices by 2030.

To meet this growing demand, Innatera is ramping up production. The Spiking Neural Processor is expected to enter production in late 2024, with volume deliveries beginning in the second quarter of 2025. This timeline reflects the rapid progress the company has made since it spun off from Delft University of Technology in 2018. In just six years, Innatera has grown to about 75 employees and recently appointed former Apple VP Duco Pasmooij to its advisory board.

The company recently closed a $21 million Series A round to accelerate development of its Spiking Neural Processor. The oversubscribed round included investors such as Innavest, InvestNL, EIC Fund, and MIG Capital. This strong investor support demonstrates the growing interest in neuromorphic computing.

Kumar sees a future where neuromorphic chips increasingly handle AI workloads at the edge, while larger models remain in the cloud. “There’s a natural tradeoff,” he said. “Neuromorphic chips excel at processing real-world sensor data quickly and efficiently, while large language models are better suited for reasoning and knowledge-intensive tasks.”
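
One way to picture this division of labor is an edge-first router that answers sensor-driven events locally and forwards only open-ended, knowledge-heavy questions to a large model in the cloud. The sketch below uses placeholder function names and omits any real network or hardware calls; it is an illustration of the split Kumar describes, not a real API.

```python
# Illustrative edge/cloud split: all names and thresholds are placeholders.
def edge_classify(sensor_frame):
    """Stand-in for an on-device (e.g. neuromorphic) classifier:
    cheap, low-latency, always available."""
    return "person_present" if max(sensor_frame) > 0.5 else "idle"

def cloud_llm_query(prompt):
    """Stand-in for a large cloud-hosted model used only for
    knowledge-intensive requests (network call omitted)."""
    return f"[cloud answer to: {prompt!r}]"

def handle(sensor_frame, user_question=None):
    event = edge_classify(sensor_frame)          # fast local path
    if user_question:                            # slow, knowledge-heavy path
        return cloud_llm_query(user_question)
    return event

print(handle([0.1, 0.7, 0.3]))
print(handle([0.1, 0.2], user_question="Who is at the door most often?"))
```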

“It’s not just about raw computing power,” Kumar said. “The brain achieves incredible intelligence at a fraction of the energy that current AI systems require. That’s the promise of neuromorphic computing: AI that’s not just more capable, but much more efficient.”

Seamless Integration with Existing Tools

Kumar highlighted a key element that could accelerate the adoption of neuromorphic technology: developer-friendly tools. “We’ve built a very comprehensive software development kit that allows application developers to easily target silicon,” he explained.

Innatera’s SDK uses PyTorch as its front end. “You actually develop your neural network entirely in a standard PyTorch environment,” Kumar said. “So if you know how to build a neural network in PyTorch, you can already use the SDK to target the chip.”
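
In practice, that means the model-definition step looks like ordinary PyTorch. The example below is plain, standard PyTorch of the kind a developer might start from; the subsequent compile-and-deploy step goes through Innatera’s SDK, whose actual API is not described in this article, so it is only indicated with a comment.

```python
import torch
import torch.nn as nn

# Ordinary PyTorch model definition: a small classifier over a short
# window of sensor samples. Nothing in this block is Innatera-specific.
class TinySensorNet(nn.Module):
    def __init__(self, window=64, classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(window, 32),
            nn.ReLU(),
            nn.Linear(32, classes),
        )

    def forward(self, x):
        return self.net(x)

model = TinySensorNet()
dummy = torch.randn(1, 64)
print(model(dummy).shape)        # torch.Size([1, 2])

# Deployment to the chip would go through Innatera's SDK from here;
# that step is omitted because its API is not covered in this article.
```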

This approach significantly lowers the barrier to entry for developers already familiar with popular machine learning frameworks, allowing them to tap the power and efficiency of neuromorphic computing while reusing their existing skills and workflows.

“This is a simple, turnkey, standard and very fast way to build and deploy applications on our chips,” Kumar added, highlighting the potential for rapid adoption and integration of Innatera’s technology across a wide range of AI applications.

Quiet Moves in Silicon Valley

While LLMs grab headlines, industry leaders are quietly acknowledging the need for radically new chip architectures. Notably, OpenAI CEO Sam Altman, who has been vocal about the imminent arrival of artificial general intelligence (AGI) and the need for massive investment in chip manufacturing, has personally invested in another neuromorphic chip startup, Rain.

The move is significant. Despite Altman’s public emphasis on scaling up current AI technologies, his investment suggests a recognition that the path to more advanced AI may require a fundamental shift in computing architecture. Neuromorphic computing could be one of the keys to bridging the efficiency gap that current architectures face.

Bridging the Gap Between Artificial Intelligence and Biological Intelligence

As AI permeates every aspect of our lives, the need for more efficient hardware solutions only grows. Neuromorphic computing is one of the most exciting frontiers in chip design today, with the potential to enable a new generation of intelligent devices that are more capable and sustainable.

While large language models make headlines, the real future of AI may lie in chips that think more like our own brains. As Kumar puts it: “We’re only scratching the surface of what’s possible with neuromorphic systems. The next few years are going to be very exciting.”

As brain-inspired chips enter consumer devices and industrial systems, we may be entering a new era of artificial intelligence that is faster, more efficient, and more closely aligned with the amazing capabilities of the biological brain.
