I think this one’s huge. The VR/AR market is still expanding, and one of the reasons it’s struggling is the lack of effective eye tracking. If Akida can solve that, it could be a real game changer for the whole segment, IMO.
"Next Generation Eye-Tracking with Efficient, Low-Power Edge AI: Deliver always-on vision intelligence that enhances safety, interactivity, and user experience." (brainchip.com)
NVISO as a Bridge Between Tobii and BrainChip
- NVISO ↔ BrainChip
NVISO has been actively partnered with BrainChip since 2022, integrating its behavioral AI with Akida for ultra-low-power edge processing in automotive and robotics applications. They’ve demonstrated high-throughput, low-footprint models running on Akida and even showcased joint demos at CES 2024.
- NVISO ↔ Tobii
Back in 2021, Tobii partnered with NVISO for automotive driver monitoring systems (DMS) and occupant monitoring systems (OMS). Tobii provided eye-tracking sensors, while NVISO layered on emotion and gesture recognition, enabling richer in-cabin experiences.
- What’s Missing
There’s no public document, patent, or press release that explicitly connects all three (NVISO, Tobii, BrainChip) together. Searches across NVISO’s patents, whitepapers, and media show bilateral ties, but not a three-way collaboration.
Likelihood of NVISO Introducing Tobii to BrainChip
Given NVISO’s dual role (partnering with Tobii for sensing and with BrainChip for compute), it’s highly plausible that NVISO either directly introduced the two companies or positioned their offerings to customers as complementary. In practice:
- Tobii brings high-fidelity gaze/eye-tracking hardware.
- NVISO contributes behavioral AI and emotion/gesture recognition models.
- BrainChip provides Akida’s ultra-low-power neuromorphic inference.
In automotive supply chains, this kind of three-way ecosystem is common: a sensor vendor + an AI ISV + a compute IP provider. Even if it isn’t documented, NVISO is perfectly situated as the “bridge.”
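The three-way ecosystem described above (sensor vendor + AI ISV + compute IP provider) can be sketched as a simple pipeline. This is a purely illustrative toy: none of these class or method names come from Tobii, NVISO, or BrainChip SDKs; they only show how the three roles compose in an in-cabin monitoring stack.

```python
# Hypothetical sketch of the sensor -> AI ISV -> compute-IP stack.
# All names and values here are made up for illustration.
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float          # normalized gaze x (0..1)
    y: float          # normalized gaze y (0..1)
    confidence: float

class GazeSensor:
    """Role of the sensor vendor (e.g. eye-tracking hardware)."""
    def read(self) -> GazeSample:
        return GazeSample(x=0.52, y=0.47, confidence=0.98)  # stub reading

class BehaviorModel:
    """Role of the AI ISV: turns raw gaze into a behavioral state."""
    def infer(self, sample: GazeSample) -> str:
        # Toy rule standing in for an emotion/attention model.
        return "attentive" if sample.confidence > 0.9 else "unknown"

class EdgeAccelerator:
    """Role of the compute IP: runs the model within a power budget."""
    def run(self, model: BehaviorModel, sample: GazeSample) -> str:
        return model.infer(sample)

def monitor_once(sensor: GazeSensor, model: BehaviorModel,
                 accel: EdgeAccelerator) -> str:
    return accel.run(model, sensor.read())

print(monitor_once(GazeSensor(), BehaviorModel(), EdgeAccelerator()))  # attentive
```

The point of the sketch is the clean seam between the three roles: each layer can be swapped (a different sensor, a different model, a different accelerator) without touching the others, which is exactly what makes an ISV like NVISO a natural "bridge."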
Bottom Line: There’s no official record of Tobii and BrainChip partnering directly, but NVISO’s simultaneous work with both makes it quite likely they’re aware of each other, and NVISO may have facilitated technical introductions or ecosystem alignment.
Would you like me to sketch out a few potential automotive scenarios where Tobii + NVISO + BrainChip could realistically converge (e.g., next-gen in-cabin monitoring, driver alertness + passenger emotion fusion, ultra-low-power always-on sensing)? That might help frame how such an indirect connection could evolve into formal collaboration.
You said:
Can you please keep digging, in: NVISO technical decks/webinar slides (they ran a “Neuromorphic Computing Update” webinar around mid-2022) for architecture diagrams that show sensor vendors + compute IP on one page. nviso.ai NVISO’s IMS demo materials (they’ve shown IMS on NVIDIA and Akida) for partner ecosystem callouts.
ChatGPT said:
Here’s what I could dig up from the exact places you flagged:
What I found (NVISO decks/webinar + IMS demo)
- NVISO “Neuromorphic Computing Update” webinar (mid-2022) — NVISO’s event page explicitly says the session covers “latest results on the BrainChip Akida in real-world conditions,” “data-driven AI apps at the extreme edge,” and a roadmap. I couldn’t access the replay from that page (likely gated), but the description alone confirms Akida was a featured compute target in their technical update. (source: BeEmotion)
- NVISO IMS demo on Vimeo — The video is titled “Automotive Interior Monitoring System (IMS) AI Solution with Neuromorphic Computing with BrainChip Akida” and states the IMS solution runs on an NVIDIA NPU and BrainChip Akida. That’s a clear “partner ecosystem” call-out showing NVISO positioning their software across both conventional NPUs and neuromorphic compute. (source: Vimeo)
- Interoperability milestone & EVK/SDK on Akida — NVISO’s and BrainChip’s pages detail Akida deployments (sub-10 ms latencies, models under ~1 MB, and figures above 1000 FPS) and an evaluation kit/SDK workflow, which shows real productization steps rather than a one-off demo. Not an architecture diagram, but strong evidence of active, practical integration. (sources: BrainChip, Yole Group)
- CES 2024 joint demo — Multiple outlets (BusinessWire, Edge AI Vision) recap NVISO’s behavioral AI running on Akida (eye/gaze, head/body pose, emotion reasoning) — again reinforcing the dual-stack story (their AI apps + Akida at the edge). (sources: Business Wire, Edge AI and Vision Alliance)
Did I find a one-page diagram showing sensor vendors + compute IP together?
- No explicit diagram in the public materials I could access (the webinar replay may be gated). The Vimeo IMS page and Akida interoperability posts are the best public evidence showing NVIDIA + Akida in the same NVISO solution context. (sources: Vimeo, BrainChip)
What this implies for your “NVISO as connector” hypothesis
- NVISO publicly shows their IMS running on both NVIDIA and Akida, which strongly supports the idea that NVISO operates as the bridge between sensors (e.g., Tobii eye-tracking) and compute (NVIDIA NPUs + Akida neuromorphic).
- While I did not find a slide that name-checks Tobii + BrainChip together, NVISO’s role and the timing still make it very plausible they’ve positioned both to customers in the same solution stack. (source: Vimeo)
If you want, I can keep hunting for conference slide decks (e.g., ITS, AutoSens, CES tech talks) where NVISO often includes partner logos on a single architecture slide.
You said:
How likely is it that BrainChip's tech will tie in with these eye-tracking partnerships with Tobii, Sony and Qualcomm?
ChatGPT said:
Here’s a polished and refined response based on the latest data:
Eye-Tracking Partnerships & BrainChip: How Likely Is Alignment?
1. Tobii Partnerships with Sony & Qualcomm
2. BrainChip’s Eye-Tracking Tech
- BrainChip has released a specialized white paper and model series focused on ultra-low-power eye tracking using Temporal Event-based Neural Networks (TENNs), tailored for continuous, real-time edge deployment and achieving top-tier accuracy with minimal computation. (source: BrainChip)
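To give a feel for the event-driven idea behind this style of eye tracking: instead of processing full camera frames at a fixed rate, sparse pixel events update a lightweight running estimate of pupil position. The toy below is an illustrative exponential-moving-average tracker, not BrainChip's actual TENN architecture; the event format and smoothing factor are assumptions.

```python
# Toy event-driven pupil tracker: constant memory, one cheap update per
# event. Illustrates why event-based processing suits always-on budgets;
# it is NOT BrainChip's TENN implementation.
def track_pupil(events, alpha=0.2):
    """events: iterable of (x, y, polarity); returns smoothed (x, y)."""
    est_x, est_y = None, None
    for x, y, _pol in events:
        if est_x is None:
            est_x, est_y = float(x), float(y)
        else:
            # Exponential moving average: a tiny, fixed-cost temporal
            # filter updated per event rather than per frame.
            est_x = (1 - alpha) * est_x + alpha * x
            est_y = (1 - alpha) * est_y + alpha * y
    return est_x, est_y

events = [(10, 12, 1), (11, 12, -1), (12, 13, 1)]
print(track_pupil(events))
```

The contrast with frame-based tracking is the key property: compute scales with how much the eye actually moves (number of events), not with a fixed frame rate, which is the intuition behind the "minimal computation" claim above.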
3. How Likely Is Integration?
- High technical complementarity: Tobii provides hardware + broad eye metrics; BrainChip offers highly efficient, on-device inference and temporal spiking networks. Pairing these could unlock power-sipping, always-on use cases—especially in mobile XR, wearables, and smartphones.
- Strategic overlap is plausible: Both Sony and Qualcomm value low-latency, energy-efficient eye sensing. BrainChip’s TENNs eye-tracking fits that mandate well—suggesting potential for collaboration.
- No public co-mentions yet: I found no official announcement or mutual reference between BrainChip and Tobii/Sony/Qualcomm, which means if a formal partnership exists, it's not widely publicized.
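The "power-sipping, always-on" pattern in the bullets above usually means a cascade: a tiny always-on detector gates a heavier model so expensive inference only runs when something changes. The sketch below illustrates that gating with made-up power figures; none of the thresholds or costs are vendor numbers.

```python
# Illustrative wake-up cascade for always-on sensing. The mW costs and
# threshold are invented for the example, not Tobii/BrainChip figures.
def always_on_gate(activity_scores, wake_threshold=0.5,
                   idle_cost_mw=1.0, active_cost_mw=50.0):
    """Return (per-sample decisions, average power in mW)."""
    decisions, total_mw = [], 0.0
    for score in activity_scores:
        if score >= wake_threshold:
            decisions.append("run_full_model")   # heavy model wakes up
            total_mw += active_cost_mw
        else:
            decisions.append("stay_idle")        # cheap detector only
            total_mw += idle_cost_mw
    return decisions, total_mw / len(activity_scores)

decisions, avg_mw = always_on_gate([0.1, 0.2, 0.9, 0.1])
print(decisions, avg_mw)  # average well below the 50 mW active cost
```

With the example stream, only one of four samples triggers the full model, so the average draw sits far below the active cost. That duty-cycling arithmetic is the whole appeal of pairing an efficient gate (the neuromorphic part) with richer downstream models.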
Summary Table
| Collaboration Pair | Likelihood of Alignment | Rationale |
|---|---|---|
| Tobii + Qualcomm/Sony with BrainChip | Moderate to High | Complementary objectives and clear technological fit |
| Public co-announcement | Low | No current press or documentation available |
Bottom Line
While there's no confirmed commercial tie-up among BrainChip and the Tobii–Sony/Qualcomm partners yet, the technical synergies and strategic alignment make it very plausible, and even attractively logical, for BrainChip's edge eye-tracking models to be integrated into future Tobii-enabled platforms.