Ottonomy Brings Contextual AI 2.0 to Ambarella’s N1 Edge Computing Hardware
Ottonomy Inc. has announced the integration of its Contextual AI 2.0 onto Ambarella Inc.’s N1 edge computing hardware. The partnership marks a notable step in the evolution of robot intelligence, enabling robots to make more contextually aware decisions and exhibit more intelligent behaviors.
According to Amit Badlani, director of generative AI and robotics at Ambarella, “The integration of Ottonomy’s Contextual AI 2.0 with our N1 Family of SoCs represents a pivotal moment in the evolution of autonomous robotics. By combining edge AI performance with the transformative potential of vision language models, we’re empowering robots to process and act on complex real-world data in real time.”
A single Ambarella SoC supports multimodal large language models (LLMs) of up to 34 billion parameters at low power consumption, while the company’s new N1-655 edge GenAI SoC provides on-chip decoding of 12 simultaneous 1080p30 video streams, concurrently processing that video while running multiple multimodal vision language models (VLMs) and traditional convolutional neural networks (CNNs).
The integration of Contextual AI 2.0 has far-reaching implications for the field of robotics. By enabling robots to comprehend environments in a more nuanced manner, Ottonomy’s technology aims to advance robot perception, decision making, and behavior. The company claims that its delivery robots can now not only detect objects with precision but also understand the real-world complexities that surround them.
With this situational awareness, Ottonomy’s Ottobots can adapt to changing environments, operational domains, and even weather and lighting conditions, making them more effective across a wide range of applications. Ottonomy’s CEO Ritukar Vijay called it a significant step toward general intelligence for robotics: “LLMs on edge hardware is a game-changer for moving closer to general intelligence, and that’s where we plug in our behavior modules to use the deep context and adds to our Contextual AI engine.”
Ottonomy sees numerous applications for VLMs, including its SAE Level 4 autonomous ground robots delivering vaccines, test kits, e-commerce packages, and spare parts in indoor and outdoor environments, including large manufacturing campuses. With customers in healthcare, intralogistics, and last-mile delivery, the company is committed to developing innovative and sustainable technologies for delivering goods.
Stanford University students used Solo Server to deliver fast, reliable, fine-tuned artificial intelligence directly on the edge, helping deploy the VLMs and depth models used for environment processing. With Contextual AI 2.0, Ottonomy is positioned to enable robots to process complex real-world data in real time and exhibit intelligent behaviors that were previously out of reach.
The company’s approach has drawn attention from industry observers, who see potential for the technology across a range of industries. As Ottonomy scales globally, its continued investment in Contextual AI 2.0 signals a commitment to pushing the boundaries of robot intelligence.