What has not been mentioned so far is, firstly, that Brainchip has a link to Xilinx going back to at least 2019. Secondly, Xilinx, AMD and Kinara.ai are in the box. Thirdly, Kinara.ai is a multi-channel accelerator running at 3.5 watts. The AI inference sits on top of the box.
The speculation I am inviting here is this: if Xilinx, AMD and Kinara.ai are in the box, then the question needs to be asked, what is providing the AI inference on top of the box? And what could also provide the AI inference on top of the Nvidia box? Does anyone recall Nviso running their demo after the Brainchip AGM, with AKIDA running inference on top of the Nvidia Jetson box?
Sorry, did I mention that Rob Telson, when asked whether Nvidia was a competitor, said they saw Nvidia more as a partner moving forward?
Before today's confirmation that Brainchip and VVDN are jointly presenting at CES 2023, in the next five to eight days or so, we would not have been in any position to speculate on this. But as we now know this fact, what are the odds that the AI inference on top of the box is AKIDA technology in some form or other?
Could VVDN be using a less-than-Science-Fiction Beast to provide the on-top AI and inference? Does this even make sense? They are already pushing the power boundary in the Xilinx box, so for Edge computing the power savings from AI inference on top become very useful.
No guarantees in this world, as stupid decisions are made by bright people every single day of the week, according to Blind Freddie. As he says, he should know: he made a stupid decision in 1958 not to take his spade with his bucket to the beach, and the sand was so hot it was painful picking it up to fill his bucket. I can assure you he never made that mistake again, even to this day.
Anyway, my speculation and opinion only, so DYOR.
FF
AKIDA BALLISTA