BrainChip gets a mention below concerning Sony. It looks more like they benchmark against us and intend to go with their own?
It's not clear.
Read the link. It could all be crap anyway. Are they suggesting that they are better than AKIDA? As far as I know, Sony has not 'invented' its own neuromorphic edge AI with on-chip learning.
Certainly if Sony's chip needs a brain, it needs AKIDA.
This emerging technology is on the verge of a breakout. Over the past decade, rapid advancements have quietly built momentum, offering early adopters—investors, innovators, and enthusiasts—a chance to ride the wave before it goes mainstream. From revolutionizing robotics and cybersecurity to...
neuromorphiccore.ai
"More on the Technology
Sony’s Event-Based Vision Sensors (EVS) deliver <1 µs latency and <10 mW power, processing only dynamic events for 10x efficiency over traditional CMOS sensors, ideal for high-speed applications like autonomous driving and robotics. The IMX500 AI chip provides 40 TOPS, enabling on-device intelligence with 5x lower power than GPUs for tasks like object detection. Compared to BrainChip’s Akida (40 TOPS/W), Sony’s EVS excels in vision-specific tasks, achieving 1,000 fps equivalent processing. SPAD sensors enhance low-light performance, detecting single photons for neuromorphic vision, making them suitable for healthcare imaging and industrial automation."
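For anyone unfamiliar with why event-based sensing cuts data and power so dramatically, here is a toy sketch of the idea in Python. This is purely illustrative, not Sony's actual pipeline: the sensor size, moving-patch scene, and contrast threshold are all made up. The point is that a frame-based readout transmits every pixel of every frame, while an event-based readout transmits only pixels whose brightness changed.

```python
import numpy as np

# Toy scene: 100 frames of a 64x64 sensor where only a small
# 8x8 bright patch moves one column per frame; the rest is static.
frames = np.zeros((100, 64, 64), dtype=np.float32)
for t in range(100):
    x = t % 56
    frames[t, 20:28, x:x + 8] = 1.0

threshold = 0.5  # made-up contrast threshold for emitting an event

# Frame-based readout: every pixel of every frame is transmitted.
frame_pixels = frames.size

# Event-based readout: transmit only pixels whose brightness
# changed by more than the threshold since the previous frame.
diffs = np.abs(np.diff(frames, axis=0))
event_count = int((diffs > threshold).sum())

print(f"frame-based pixels: {frame_pixels}")
print(f"events emitted:     {event_count}")
print(f"data reduction:     {frame_pixels / event_count:.0f}x")
```

In this contrived mostly-static scene the event stream is a few hundred times smaller than the full frame stream, which is the same mechanism behind the quoted "processing only dynamic events" efficiency claim; real EVS hardware does this per-pixel in analog circuitry rather than by diffing frames.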