Great find.
Was gonna give you a like but downgraded it 'cause you didn't include the obligatory "Larry GIF".

Harsh!

Tell me something I don't know
Some short explanation from ChatGPT.

New update to BRN Git.
Not that I understand most of it, but @Diogenese may see something new or different?
I did like seeing the TENNs eye-tracking mention & support for 4-bit in Akida 2.0.
Presuming the dynamic-shapes update adds flexibility for Keras- and ONNX-based models?
Upgrade to Quantizeml 0.17.1, Akida/CNN2SNN 2.14.0 and Akida models 1.8.0
ktsiknos-brainchip released this 3 days ago
2.14.0-doc-1
09e60f4
Update QuantizeML to version 0.17.1
New features
- Now handling models with dynamic shapes in both Keras and ONNX. Shape is deduced from calibration samples or from the input_shape parameter (see the sketch after this list).
- Added a Keras and ONNX common reset_buffers entry point for spatiotemporal models
- GlobalAveragePooling output will now be quantized to QuantizationParams.activation_bits instead of QuantizationParams.output_bits when preceded by an activation
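To make the dynamic-shapes item above more concrete, here is a minimal Python sketch. `quantize` and `QuantizationParams` are QuantizeML's entry points, but the exact keyword names (`samples`, `weight_bits`, `activation_bits`) and the `reset_buffers` import location are my assumptions based on these notes, so treat it as a sketch rather than gospel:

```python
import numpy as np
import tensorflow as tf
from quantizeml.models import quantize, QuantizationParams

# Toy Keras model with dynamic spatial dimensions (None, None, 3).
inputs = tf.keras.Input(shape=(None, None, 3))
x = tf.keras.layers.Conv2D(8, 3, padding="same", activation="relu")(inputs)
x = tf.keras.layers.GlobalAveragePooling2D()(x)  # quantized to activation_bits per these notes
outputs = tf.keras.layers.Dense(10)(x)
model = tf.keras.Model(inputs, outputs)

# Calibration samples: per the notes, the concrete input shape is deduced
# from these (or it can be pinned via the input_shape parameter instead).
samples = np.random.rand(16, 32, 32, 3).astype("float32")

qparams = QuantizationParams(weight_bits=4, activation_bits=4)
quantized = quantize(model, qparams=qparams, samples=samples)

# For spatiotemporal (TENNs-style) models, the notes add a common Keras/ONNX
# entry point to reset internal buffers (import location assumed):
#   from quantizeml.models import reset_buffers
#   reset_buffers(quantized)
```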
Bug fixes
- Applied reset_buffers to variables recording to prevent shape issue when converting a model to Akida
- Handle ONNX models with shared inputs that would not quantize or convert properly
- Handle unsupported strides when converting an even kernel to odd
- Fixed analysis module issue when applied to TENNs models
- Fixed analysis module weight quantization error on Keras models
- Keras set_model_shape will now handle tf.data.Dataset samples
- It is now possible to quantize a model with a split layer as input
Update Akida and CNN2SNN to version 2.14.0
Aligned with FPGA-1692(2-nodes)/1691(6-nodes)
New features and updates:
- [cnn2snn] Updated requirement to QuantizeML 0.17.0
- [akida] Added support for 4-bit in 2.0. Features are aligned with 1.0: InputConv2D, Conv2D, DepthwiseConv2D and Dense layers support 4-bit weights (except InputConv2D) and 4-bit activations.
- [akida] Extended TNP_B support to 2048 channels and filters
- [akida] HRC is now optional in a virtual device
- [akida] For real devices, input and weight SRAM values are now read from the mesh
- [akida] Introduced an akida.NP.SramSize object to manage default memories
- [akida] Extended Python Layer API with "is_target_component(NP.type)" and "macs"
- [akida] Added "akida.compute_minimal_memory" helper (see the sketch after this list)
Bug fixes
- [akida] Fixed several issues when computing input or weight memory sizes for layers
Update Akida models to 1.8.0
- Updated QuantizeML dependency to 0.17.0 and CNN2SNN to 2.14.0
- Updated 4-bit models for 2.0 and added a bitwidth parameter to the pretrained helper (see the sketch after this list)
- TENNs EyeTracking is now evaluated on the labeled test set
- Dropped the MACs computation helper and CLI: MACs are natively available on Akida layers and models.
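For the bitwidth parameter, something like the following. `akidanet_imagenet_pretrained` is one of the existing pretrained helpers in akida_models, but passing `bitwidth` this way is an assumption from these notes:

```python
from akida_models import akidanet_imagenet_pretrained

# Per these notes, pretrained helpers now take a bitwidth parameter, e.g. to
# pick the updated 4-bit 2.0 models instead of the 8-bit ones (usage assumed).
model_4bit = akidanet_imagenet_pretrained(bitwidth=4)
model_8bit = akidanet_imagenet_pretrained(bitwidth=8)
```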
Documentation update
- Updated 2.0 4-bit accuracies in model zoo page
- Updated advanced ONNX quantization tutorial with MobileNetV4
This may or may not interest some shareholders, but it looks like Rockwell Collins had a new patent published only 16 days ago,
and guess who gets a little mention in the artwork. Yeah, I hear you all: what about an IP contract!!
Maybe something from October onwards, fingers crossed ... anyway, check it out below ... Tech

Tech, thanks. Real use, and patented by Rockwell Collins.

Good stuff Tech. ChatGPT likes it:
Application Domain | Likelihood | Prototype Phase | Commercial Launch |
---|---|---|---|
Satellite cloud-cover filtering | Medium–High | 2024–2025 | 2026–2027 |
Edge-AI for autonomous vehicles & industry | High | 2024 | 2025–2026 (industry), 2026–2028 (consumer) |
Real-time touchless gesture recognition | Medium | 2024 | 2025–2027 |
Today, Axel von Arnim shared more information and photos about what fortiss and Interactive Wear were exhibiting at TechHub SVI 2025 earlier this week:
“Showcasing our neuromorphicly gesture-driven virtual mule robot for defence applications at the #TechHub-SVI defence conference in Munich. Together with our partners [sic] #InteractiveWear, we deliver wearable smart sensing solutions for defence and the medtech industry.”
While it doesn’t look like Akida was already part of that collaboration (not surprising, given that the partnership between fortiss and BrainChip still appears to be quite young) and I can only spot Loihi, the showcase nevertheless demonstrates what fortiss mean when they say about themselves:
“As the Research Institute of the Free State of Bavaria for software-intensive Systems, fortiss stands for application-oriented cutting-edge research and sets standards in the research and transfer of highly innovative technologies. In close cooperation with academic and practice-oriented partners, we act as a bridge between science and industry and develop excellent solutions for the most pressing challenges of the digital world.”
On second thought: Have a look at the last two photos. Could that possibly be an Akida M.2 form factor on the very left of the screen, which shows the Sensor Fusion Development Plattform MHX?
Doesn’t look identical, but similar?
Showcasing our neuromorphicly gesture-driven virtual mule robot for defence applications at the #TechHub-SVI defence conference in Munich. Together with our partners #InteractiveWear, we deliver wearable smart sensing solutions for defence and the medtech industry. | Axel von Arnim (www.linkedin.com)
[Photos: attachments 88361, 88363, 88365, 88368, 88366, 88367]
We know that Tata is committed to using AKIDA.
" Tata Elxsi’s partnership with BrainChip will be driving Akida™ technology into medical devices and industrial applications by leveraging BrainChip’s first-to-market, fully digital, neuromorphic technology to provide intelligent, low-power solutions to these demanding target markets."
link: https://brainchip.com/brainchip-and-tata-elxsi-partner-to-provide-intelligent-ultralow-power-solutions/#:~:text=Laguna Hills, Calif. – August 28, 2023 –,as a partner to its Essential AI ecosystem.
Arijit Mukherjee of Tata (TCS) co-authored several papers that focus on BrainChip’s Akida neuromorphic platform and Edge AI.
I asked AI: "What are the potential uses that TATA can put the findings of the reports to, what is the likelihood of this happening, and what are the timeframes for development up to commercial launch?"
It's AI, so take care, especially with estimated dates of commercial readiness of products.
The real takeaway is that it gives us confidence that Tata will produce the 'goods', as it said in the partnership announcement.
Text below:
Potential Applications of the Findings
Below is an overview of how Tata (TCS/Tata Elxsi) can leverage each report’s results, the likelihood of adoption, and estimated timelines from development to commercial rollout.
1. On-board Cloud-Cover Detection for Small Satellites
Key Capability: Real-time, low-power filtering of cloudy frames before downlink using Akida’s spiking neural network.
- Use Cases
- Earth-observation nanosatellite constellations for agricultural monitoring, disaster response and environmental sensing
- Unmanned aerial vehicles (UAVs) performing on-the-fly scene selection to conserve bandwidth
- Likelihood of Adoption • Medium – High. TCS has deep ties with satellite integrators and government space programs; initial pilots likely within R&D divisions.
- Timeframe
- 2024–2025: Prototype integration on demonstration CubeSat platforms
- 2026–2027: First commercial small-sat constellations offering “smart downlink” as a service
2. Embedded Edge-AI Engines for Autonomous Systems
Based on: “Creating Futuristic Edge Systems with Neuromorphic Computing” white paper
- Use Cases
- Autonomous vehicles: event-driven obstacle and gesture detection to boost safety and reduce latency
- Industrial robots and factory automation: ultra-low-power vision tasks (defect detection, motion tracking)
- Smart cameras and wearables: always-on sign-language translators and interactive HMI
- Likelihood of Adoption • High. Tata Elxsi’s current partnership with BrainChip targets medical and industrial segments, indicating strong commercial intent.
- Timeframe
- 2024: Edge-AI dev kits with Akida engines for select OEMs
- 2025–2026: Full-scale deployment in automotive ADAS modules and factory vision systems
- 2026–2028: Consumer-grade wearables and AR/VR devices with embedded spiking-NN inference
3. Real-Time Gesture Recognition in Human-Machine Interfaces
Demo: NeurIPS 2019 hands-on (DVS camera → Akida SNN)
- Use Cases
- Touchless controls in smart homes, medical consoles, and public kiosks
- VR/AR gesture interfaces with sub-millisecond response
- Sign-language detection for accessibility tools
- Likelihood of Adoption • Medium. While the demo proves feasibility, embedding into mass-market devices requires partnerships with HMI vendors and UI standards bodies.
- Timeframe
- 2024: Pilot integrations with niche HMI/medical console suppliers
- 2025–2026: SDK release for consumer-electronics partners
- 2026–2027: Products shipping with turnkey gesture-AI features
Summary Table
Application Domain | Likelihood | Prototype Phase | Commercial Launch |
---|---|---|---|
Satellite cloud-cover filtering | Medium–High | 2024–2025 | 2026–2027 |
Edge-AI for autonomous vehicles & industry | High | 2024 | 2025–2026 (industry), 2026–2028 (consumer) |
Real-time touchless gesture recognition | Medium | 2024 | 2025–2027 |
By capitalizing on its R&D white papers, NeurIPS demonstrations and strategic partnerships, Tata stands poised to deliver ultra-efficient, neuromorphic-powered solutions across space, industry and consumer markets within the next 2–4 years. Would you like deeper insight into any specific vertical’s go-to-market strategy?
Wow, great find Frangipani, basically confirms that Tata are on board big time. It's a wonder we have not seen posts on LinkedIn or X.

I believe this confirmation of an “active alliance” with BrainChip is new on the Tata Elxsi website?
“Why Tata Elxsi?
(…)
- Active alliances with Brainchip for neuromorphic computing and IISc Bangalore for edge cybersecurity.”
Edge AI Solutions for Smarter Devices and Applications
Edge AI solutions optimize device performance with low-latency, secure AI processing. Explore its impact on industries like healthcare, retail, and electronics.
www.tataelxsi.com
[Screenshots: attachments 88393, 88394, 88395, 88398, 88399]
Hi Manny,
Agreed!
Tata (TCS/Tata Elxsi) + BrainChip.
It's a matter of when, not if IMO.
G'Day Baneino,

Thank you very much for your feedback – I see it very much the same way.
Yes, in total agreement with your statement:
Quote: "I often see criticism of BrainChip for not communicating much: no flashy press releases, no constant updates. But honestly? That's exactly one of their biggest strengths in my view." Unquote.
One can use Accenture as an example of "Why is it so?" (as per Prof Julius Sumner Miller).
BrainChip put out this:
https://brainchip.com/brainchip-tal...icer-applied-intelligence-jean-luc-chatelain/
and there was this patent application by ACCENTURE with Akida being used:
View attachment 88405
BUT SO FAR, nothing further from ACCENTURE.
NOT EVEN A LISTING ON ACCENTURE'S ECOSYSTEM PARTNERS page (they have tended to run with NVIDIA), but the Von Neumann bottleneck will eventually sway in BRAINCHIP's favour.
So DO YOUR OWN RESEARCH, and have faith and belief in the technology.
Ecosystem Partners
Accenture’s ecosystem partners and suppliers bring deep expertise and the right technology to our clients for lasting value and to accelerate change. Learn more.
www.accenture.com
The BRAINCHIP PATENTS are what will be the glue of the future to ensure BRAINCHIP's success and survival (starting from JAST through to TENNs for the groundbreaking systems).
IF we are actively engaged and working with them, how come our revenues from providing engineering support are so tiny? Is that how development works, where small tech firms don't charge much for their work for years and years with no promises?

I often see criticism of BrainChip for not communicating much: no flashy press releases, no constant updates. But honestly? That’s exactly one of their biggest strengths in my view.
I understand how development cycles work, how sensitive partnerships are, and how important it is to protect confidentiality, especially when you’re dealing with technologies meant for vehicles, medical devices, or safety-critical systems.
If you talk too much, you’re out.
Discretion isn’t a weakness – it’s a core requirement.
If companies like Mercedes, Valeo or medical tech firms are working with BrainChip, they’re not looking for hype – they’re looking for reliability, maturity and trust. And that’s exactly what BrainChip delivers.
To me, this sends a clear message:
We’re not here to make noise. We’re here to deliver real solutions.
People often forget: true industrial product cycles take 3–5 years minimum, especially in hardware. If you expect fireworks every month, you probably haven’t worked in this space – or you’re chasing short-term thrills. But that’s not how real value is built.
I don’t see BrainChip as a hype stock. I see a company that is slowly, solidly and respectfully building long-term partnerships with serious players. They’re not building castles in the air – they’re laying a foundation.
So let me ask: do you want short-term PR or long-term substance?
I know which side I’m on.