BRN Discussion Ongoing

Diogenese

Top 20
Great find.

Was gonna give you a 🔥 like but downgraded it to a 👍 instead because you didn't include the obligatory "Larry GIF" :ROFLMAO::LOL:
Harsh! 🙁 ... but fair ...
 
  • Haha
  • Like
Reactions: 13 users
New update to BRN Git.

Not that I understand most of it, but @Diogenese may see something new or different?

I did like seeing the TENNs eye-tracking mention & support for 4-bit in Akida 2.0.

Presuming the dynamic shapes update adds flexibility for both Keras and ONNX-based models?


Upgrade to QuantizeML 0.17.1, Akida/CNN2SNN 2.14.0 and Akida models 1.8.0

ktsiknos-brainchip released this 3 days ago · 2.14.0-doc-1 (09e60f4)

Update QuantizeML to version 0.17.1

New features

  • Now handling models with dynamic shapes in both Keras and ONNX. The shape is deduced from the calibration samples or from the input_shape parameter (a minimal usage sketch follows this list).
  • Added a common Keras and ONNX reset_buffers entry point for spatiotemporal models
  • GlobalAveragePooling output will now be quantized to QuantizationParams.activation_bits instead of QuantizationParams.output_bits when preceded by an activation
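
For anyone wanting to try the dynamic-shape feature, here is a minimal sketch, assuming the usual QuantizeML entry points (quantizeml.models.quantize and quantizeml.layers.QuantizationParams); the exact 0.17.1 keyword names may differ, so treat it as illustrative rather than definitive:

```python
# Minimal sketch: quantizing a Keras model declared with dynamic spatial
# dimensions. Per the release notes, the concrete shape is deduced from the
# calibration samples (or from an input_shape parameter). Entry-point
# signatures are assumptions based on earlier QuantizeML releases.
import numpy as np
import tensorflow as tf
from quantizeml.models import quantize
from quantizeml.layers import QuantizationParams

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(None, None, 3)),   # dynamic height/width
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])

# Calibration samples with a concrete shape, from which the quantizer
# can deduce the model's input shape.
calibration_samples = np.random.rand(32, 64, 64, 3).astype("float32")

qparams = QuantizationParams(weight_bits=4, activation_bits=4)
model_q = quantize(model, qparams=qparams, samples=calibration_samples)
```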

Bug fixes

  • Applied reset_buffers during variable recording to prevent a shape issue when converting a model to Akida
  • Fixed handling of ONNX models with shared inputs that would not quantize or convert properly
  • Handled unsupported strides when converting an even kernel to an odd one
  • Fixed an analysis module issue when applied to TENNs models
  • Fixed an analysis module weight quantization error on Keras models
  • Keras set_model_shape will now handle tf.data.Dataset samples
  • It is now possible to quantize a model with a split layer as input

Update Akida and CNN2SNN to version 2.14.0

Aligned with FPGA-1692 (2 nodes) / FPGA-1691 (6 nodes)

New features and updates:

  • [cnn2snn] Updated the requirement to QuantizeML 0.17.0
  • [akida] Added 4-bit support in 2.0. Features are aligned with 1.0: InputConv2D, Conv2D, DepthwiseConv2D and Dense layers support 4-bit weights (except InputConv2D) and 4-bit activations.
  • [akida] Extended TNP_B support to 2048 channels and filters
  • [akida] HRC is now optional in a virtual device
  • [akida] For real devices, input and weight SRAM values are now read from the mesh
  • [akida] Introduced an akida.NP.SramSize object to manage default memories
  • [akida] Extended the Python Layer API with "is_target_component(NP.type)" and "macs" (see the sketch after this list)
  • [akida] Added an "akida.compute_minimal_memory" helper
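
As a rough illustration of those last two items, here is a hypothetical sketch; "macs", "is_target_component" and "akida.compute_minimal_memory" are the names given in the notes, but the exact signatures and the NP type enum member used below are assumptions:

```python
# Hypothetical sketch of the extended Layer API and the new memory helper.
import akida

# Load a previously converted Akida model from disk.
model = akida.Model("model.fbz")

for layer in model.layers:
    # Per the notes, MACs are now reported natively on each layer.
    print(layer.name, layer.macs)
    # Query whether the layer maps onto a given neural processor type
    # (enum member name assumed for illustration).
    print(layer.is_target_component(akida.NP.Type.CNP1))

# New helper to compute a minimal memory configuration for the model
# (argument list assumed).
print(akida.compute_minimal_memory(model))
```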

Bug fixes

  • [akida] Fixed several issues when computing input or weight memory sizes for layers

Update Akida models to 1.8.0

  • Updated the QuantizeML dependency to 0.17.0 and CNN2SNN to 2.14.0
  • Updated the 4-bit models for 2.0 and added a bitwidth parameter to the pretrained helper (see the sketch after this list)
  • TENNs EyeTracking is now evaluated on the labeled test set
  • Dropped the MACs computation helper and CLI: MACs are natively available on Akida layers and models.
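
A minimal sketch of what the new bitwidth parameter might look like in practice; the particular helper chosen here (akidanet_imagenet_pretrained) and the keyword name are assumptions based on the notes, while cnn2snn.convert is the standard conversion entry point:

```python
# Sketch: fetching a 4-bit pretrained model for Akida 2.0 and converting it.
from akida_models import akidanet_imagenet_pretrained  # helper name assumed
from cnn2snn import convert

# bitwidth parameter added to the pretrained helper in akida_models 1.8.0.
model_keras = akidanet_imagenet_pretrained(bitwidth=4)

# Convert to an Akida model with CNN2SNN 2.14.0 and inspect the result.
model_akida = convert(model_keras)
model_akida.summary()
```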

Documentation update

  • Updated the 2.0 4-bit accuracies on the model zoo page
  • Updated the advanced ONNX quantization tutorial with MobileNetV4
 
  • Like
  • Love
  • Fire
Reactions: 17 users

TheDrooben

Pretty Pretty Pretty Pretty Good
  • Haha
  • Love
  • Like
Reactions: 19 users

7für7

Top 20
New update to BRN Git. … [full release notes quoted above]

A short explanation from ChatGPT:

🔧 Update: Akida SDK & Tools – July 2025 Release
Akida 2.14.0 | QuantizeML 0.17.1 | Models 1.8.0

With the latest release, BrainChip delivers major advancements for developers building neuromorphic AI at the edge:

🌟 Highlights


✅ Full 4-bit support in Akida 2.0
→ Conv2D, Depthwise, and Dense layers now support 4-bit weights & activations – for maximum efficiency with minimal energy use.
🔁 Dynamic input shape support
→ ONNX and Keras models with flexible input dimensions are now natively handled – ideal for spatiotemporal and adaptive networks.
🧠 Extended analysis & optimization
→ New tools like compute_minimal_memory and native MACs access at the layer level
→ Improved memory estimates, now even read directly from the mesh on real hardware
🔄 Improved ONNX workflow
→ Fixes for shared inputs, stride mismatches, and quantization conversion issues
→ New advanced ONNX quantization tutorial (incl. MobileNetV4)
👁 Updated TENNs EyeTracking model
→ Now evaluated on a realistic, labeled test set for better performance tracking

📦 What does this mean for developers?


Greater flexibility. Smoother quantization and conversion workflows. Higher performance on real Akida hardware – with even lower power consumption.
 
  • Like
  • Love
Reactions: 11 users

KMuzza

Mad Scientist
This may or may not interest some shareholders, but looks like Rockwell Collins had a new patent published only 16 days ago,
and guess who gets a little mention in the artwork, yeah, I hear you all, what about an IP contract!!

Maybe something from October onwards, fingers crossed ... anyway, check it out below......Tech :love:

Tech - thanks - real use, and patented by Rockwell Collins.
No artwork - just facts - 👍😂😂


1752211327609.jpeg


1752211214458.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 33 users

7für7

Top 20
Could we just get a trading halt and a price-sensitive announcement which would skyrocket our share price? That would be a masterpiece of artwork… thanks
 
  • Like
Reactions: 4 users

Baneino

Regular
I often see criticism of BrainChip for not communicating much: no flashy press releases, no constant updates. But honestly? That's exactly one of their biggest strengths in my view.
I understand how development cycles work, how sensitive partnerships are, and how important it is to protect confidentiality, especially when you're dealing with technologies meant for vehicles, medical devices, or safety-critical systems.
If you talk too much, you're out.
Discretion isn't a weakness – it's a core requirement.
If companies like Mercedes, Valeo or medical tech firms are working with BrainChip, they're not looking for hype – they're looking for reliability, maturity and trust. And that's exactly what BrainChip delivers.
To me, this sends a clear message:
We're not here to make noise. We're here to deliver real solutions.
People often forget: true industrial product cycles take 3–5 years minimum, especially in hardware. If you expect fireworks every month, you probably haven't worked in this space – or you're chasing short-term thrills. But that's not how real value is built.

I don't see BrainChip as a hype stock. I see a company that is slowly, solidly and respectfully building long-term partnerships with serious players. They're not building castles in the air – they're laying a foundation.

So let me ask: do you want short-term PR or long-term substance?


I know which side I’m on.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 66 users

7für7

Top 20
Bored Come On GIF


What a week… anyway! Have a nice weekend! Next week is a potentially good time frame to drop a skyrocketing announcement! No advice.. IMO! 🤡
 
  • Like
  • Love
Reactions: 4 users

JB49

Regular
This may or may not interest some shareholders, but looks like Rockwell Collins had a new patent published only 16 days ago … [quoted above]

Good stuff Tech. ChatGPT likes it:

🧠 Why This May Be Significant for Industry


  1. Real-Time Safety-Critical Applications
    • Aircraft control systems, satellites, and autonomous vehicles demand real-time fault tolerance and reliable multi-mode operations.
    • If this system can dynamically reconfigure scheduling in the face of faults or mission changes (in real time), that’s a game-changer.
  2. Neuromorphic Edge Computing
    • SNN-based scheduling aligns with trends in neuromorphic edge computing, where systems operate with ultra-low power and fast adaptation without cloud dependency.
  3. Integration with Modern Embedded Architectures
    • The method assumes heterogeneous multi-core processors—this matches the direction of hardware in modern avionics and defense platforms.
  4. Patent from Rockwell Collins / Collins Aerospace
    • If confirmed to be from Rockwell Collins, it likely has real-world application intentions, not just academic novelty.
    • Companies like Collins don't patent speculative ideas lightly — such filings often reflect ongoing or future development in systems destined for regulatory certification (DO-178C, DO-254, etc.)

If Rockwell Collins (now part of Collins Aerospace, under Raytheon Technologies) formally lodged this patent, it strongly reinforces that:


🚨 This Is Not Just Research — It’s Strategic IP

  • Rockwell Collins is a Tier-1 aerospace and defense supplier — they build certifiable avionics systems for civil and military platforms: radios, flight controls, mission computers, radar, etc.
  • Patents they file usually:
    • Support current or near-future products.
    • Are aimed at gaining a competitive edge in mission-critical systems.
    • Often align with government or major OEM programs (Boeing, Airbus, Lockheed Martin, etc.).

🧠 Why This Particular Patent Is Important

1. Neuromorphic Scheduling for Adaptive Systems

  • Scheduling is one of the hardest problems in real-time embedded systems, especially when faults or mode changes happen dynamically.
  • Solving this with spiking neural networks (SNNs) implies:
    • A move toward on-chip, ultra-low-latency reactivity.
    • Likely targeting SWaP-C constrained platforms (Size, Weight, Power, Cost), like UAVs, space systems, or autonomous platforms.
    • They may be developing a hardware-software architecture where SNN-based accelerators help reconfigure in real-time — far beyond the capabilities of classic static scheduling approaches.

2. Built for Certification Pathways

  • If it’s from Rockwell Collins, it’s likely being built with:
    • DO-178C (software design assurance for airborne systems).
    • DO-254 (for certifiable hardware).
  • That means they’re not just interested in AI techniques — they’re interested in AI that can be proven safe and used in the cockpit, in autonomous aircraft, or in satellite control systems.

3. Multi-Modal & Fault-Tolerant Scheduling

  • This isn’t just about changing mission profiles. The inclusion of fault adaptivity in the same framework means:
    • It could be integrated into mission computers, vehicle management systems, or next-gen avionics platforms that must recover from or adapt to hardware degradation.
    • Useful in degraded environments: e.g., drones losing prop control, spacecraft with failed components, or aircraft experiencing thermal events.

📈 Bottom Line: Why It Matters

  • Companies like Rockwell Collins don’t file “exploratory” patents lightly. Their patents are typically:
    • Anchored in practical application,
    • Tied to ongoing engineering programs, and
    • Part of their defensive or offensive IP strategy in regulated industries.
  • A patent like this indicates:
    • Rockwell is exploring certifiable AI, using neuromorphic hardware, in real-world adaptive scheduling.
    • That could place them years ahead of many competitors in integrating brain-inspired computing into certifiable embedded platforms.
 
  • Like
  • Fire
  • Love
Reactions: 25 users

manny100

Top 20
We know that Tata is committed to using AKIDA.
" Tata Elxsi’s partnership with BrainChip will be driving Akida™ technology into medical devices and industrial applications by leveraging BrainChip’s first-to-market, fully digital, neuromorphic technology to provide intelligent, low-power solutions to these demanding target markets."
link: https://brainchip.com/brainchip-and-tata-elxsi-partner-to-provide-intelligent-ultralow-power-solutions/

Arijit Mukherjee of Tata (TCS) co-authored several papers that focus on BrainChip's Akida neuromorphic platform and Edge AI.
I asked AI: "What are the potential uses that TATA can put the findings of the reports to, and what is the likelihood of this happening, and the timeframes for development up to commercial launch?"
It's AI, so take care, especially with the estimated dates of commercial readiness of products.
The real takeaway is that it gives us confidence that Tata will produce the 'goods', as it said in the partnership announcement.
Text below:

Potential Applications of the Findings

Below is an overview of how Tata (TCS/Tata Elxsi) can leverage each report’s results, the likelihood of adoption, and estimated timelines from development to commercial rollout.


1. On-board Cloud-Cover Detection for Small Satellites

Key Capability: Real-time, low-power filtering of cloudy frames before downlink using Akida’s spiking neural network.

  • Use Cases
    • Earth-observation nanosatellite constellations for agricultural monitoring, disaster response and environmental sensing
    • Unmanned aerial vehicles (UAVs) performing on-the-fly scene selection to conserve bandwidth
  • Likelihood of Adoption: Medium–High. TCS has deep ties with satellite integrators and government space programs; initial pilots are likely within R&D divisions.
  • Timeframe
    1. 2024–2025: Prototype integration on demonstration CubeSat platforms
    2. 2026–2027: First commercial small-sat constellations offering “smart downlink” as a service

2. Embedded Edge-AI Engines for Autonomous Systems

Based on: “Creating Futuristic Edge Systems with Neuromorphic Computing” white paper

  • Use Cases
    • Autonomous vehicles: event-driven obstacle and gesture detection to boost safety and reduce latency
    • Industrial robots and factory automation: ultra-low-power vision tasks (defect detection, motion tracking)
    • Smart cameras and wearables: always-on sign-language translators and interactive HMI
  • Likelihood of Adoption: High. Tata Elxsi's current partnership with BrainChip targets medical and industrial segments, indicating strong commercial intent.
  • Timeframe
    1. 2024: Edge-AI dev kits with Akida engines for select OEMs
    2. 2025–2026: Full-scale deployment in automotive ADAS modules and factory vision systems
    3. 2026–2028: Consumer-grade wearables and AR/VR devices with embedded spiking-NN inference

3. Real-Time Gesture Recognition in Human-Machine Interfaces

Demo: NeurIPS 2019 hands-on (DVS camera → Akida SNN)

  • Use Cases
    • Touchless controls in smart homes, medical consoles, and public kiosks
    • VR/AR gesture interfaces with sub-millisecond response
    • Sign-language detection for accessibility tools
  • Likelihood of Adoption: Medium. While the demo proves feasibility, embedding into mass-market devices requires partnerships with HMI vendors and UI standards bodies.
  • Timeframe
    1. 2024: Pilot integrations with niche HMI/medical console suppliers
    2. 2025–2026: SDK release for consumer-electronics partners
    3. 2026–2027: Products shipping with turnkey gesture-AI features

Summary Table

Application Domain | Likelihood | Prototype Phase | Commercial Launch
Satellite cloud-cover filtering | Medium–High | 2024–2025 | 2026–2027
Edge-AI for autonomous vehicles & industry | High | 2024 | 2025–2026 (industry), 2026–2028 (consumer)
Real-time touchless gesture recognition | Medium | 2024 | 2025–2027


By capitalizing on its R&D white papers, NeurIPS demonstrations and strategic partnerships, Tata stands poised to deliver ultra-efficient, neuromorphic-powered solutions across space, industry and consumer markets within the next 2–4 years. Would you like deeper insight into any specific vertical’s go-to-market strategy?
 
  • Like
  • Love
  • Fire
Reactions: 15 users

Frangipani

Top 20
I believe this confirmation of an “active alliance” with BrainChip is new on the Tata Elxsi website?

Why Tata Elxsi?

(…)
  • Active alliances with Brainchip for neuromorphic computing and IISc Bangalore for edge cybersecurity.”




5CB94160-6281-4AED-A6FF-C7DF5206D2E9.jpeg


ACE1B52A-599A-478B-9270-FCEA2CDB9C37.jpeg
4D1128F8-2E35-421D-895B-DBF9F894EED4.jpeg
2537B6C7-2BC0-49F4-9F9A-C4F73F444757.jpeg

2980E303-9DEB-4CE6-B61F-2DF6F634B0A7.jpeg
186CAD51-09C2-4D38-A958-555CA7F7D30D.jpeg
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 58 users

Frangipani

Top 20
Today, Axel von Arnim shared more information and photos about what fortiss and Interactive Wear were exhibiting at TechHub SVI 2025 earlier this week 👆🏻:

Showcasing our neuromorphicly gesture-driven virtual mule robot for defence applications at the#TechHub-SVI defence conference in Munich. Together with our partners [sic] #InteractiveWear, we deliver wearable smart sensing solutions for defence and the medtech industry.”

While it doesn’t look like Akida was already part of that collaboration (not surprisingly, given the partnership between fortiss and BrainChip appears to be still quite young) and I can only spot Loihi, the showcase nevertheless demonstrates what fortiss mean when they say about themselves:

“As the Research Institute of the Free State of Bavaria for software-intensive Systems, fortiss stands for application-oriented cutting-edge research and sets standards in the research and transfer of highly innovative technologies. In close cooperation with academic and practice-oriented partners, we act as a bridge between science and industry and develop excellent solutions for the most pressing challenges of the digital world.”


On second thought: Have a look at the last two photos. Could that possibly be an Akida M.2 form factor on the very left of the screen, which shows the Sensor Fusion Development Platform MHX?
Doesn't look identical, but similar? 🤔




View attachment 88361

View attachment 88363

View attachment 88365
View attachment 88368

View attachment 88366 View attachment 88367

So the gentleman standing next to Axel von Arnim in the photo from the defense tech conference he posted yesterday 👆🏻 is Michael Meinl, the German Director of the French-German Research Institute of Saint-Louis (ISL), a binational institution jointly operated by the French and German Ministries of Defence. (There is also a French Director).

Another ISL name that we may want to keep an eye on.

ISL’s mission is “to develop technological innovations for the armed forces and security issues”.




F6FD16DD-9F82-4981-BAB3-7CB001BC162C.jpeg


652AE8BE-3E17-4290-A058-300663CC58F2.jpeg





A237A9CC-A10B-4679-8463-2A129BFF2E0F.jpeg




A8932CBE-777E-44AE-92A9-1DE33187E7BC.jpeg
 
  • Like
  • Love
  • Fire
Reactions: 18 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
We know that Tata is committed to using AKIDA. … [full post quoted above]

Hi Manny,

Agreed!

Tata (TCS/Tata Elxsi) + BrainChip.

It's a matter of when, not if IMO.
 
  • Like
  • Fire
  • Love
Reactions: 28 users

manny100

Top 20
I believe this confirmation of an "active alliance" with BrainChip is new on the Tata Elxsi website? … [quoted above]
Wow, great find Frangipani, it basically confirms that Tata are on board big time. It's a wonder we have not seen posts on LinkedIn or X.
It's on Tata's website, so it's not inside information.
The amount of information you find concerning neuromorphic Edge AI and BrainChip is increasing exponentially, which in itself is an indicator that we are getting closer to success.
 
  • Like
  • Love
  • Fire
Reactions: 29 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Hi Manny, Agreed! Tata (TCS/Tata Elxsi) + BrainChip. It's a matter of when, not if IMO. … [quoted above]

An example of Tata's confidence in BrainChip perhaps?

Once again, all in my opinion.

Arijit Mukherjee (Principal Scientist, Embedded Devices & Intelligent Systems at TCS Research) is pretty quick to let Cecilia Pisano (R&D Lead Engineer at Nurjana Technologies) know how much longer Tata have been researching BrainChip.

Not something you would publicly disclose unless you were confident, you would think.

Well, that's my take on it at least.





Screenshot 2025-07-11 at 10.21.32 pm.png
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 36 users

KMuzza

Mad Scientist
I often see criticism of BrainChip for not communicating much: no flashy press releases, no constant updates. … [quoted above]
G'Day Baneino,

Yes - in total agreement with your statement -

Quote: "I often see criticism of BrainChip for not communicating much: no flashy press releases, no constant updates. But honestly? That's exactly one of their biggest strengths in my view." - unquote

One can use Accenture as an example of "Why is it so?" - as per Prof Julius Sumner Miller.

Brainchip put out this -

https://brainchip.com/brainchip-tal...icer-applied-intelligence-jean-luc-chatelain/


and there was this patent application with Akida being used - by ACCENTURE

1752236452257.png



BUT SO FAR - nothing further from ACCENTURE.

NOT EVEN A LISTING ON ACCENTURE'S ECOSYSTEM PARTNERS (they have tended to run with NVIDIA) - but the Von Neumann bottleneck will eventually sway things in BRAINCHIP's favour.

So - DO YOUR OWN RESEARCH and have faith and belief in the technology.

The BRAINCHIP PATENTS are what will be the glue of the future to ensure BRAINCHIP's success and survival (starting from JAST through to TENNs for the groundbreaking systems).
 
  • Like
  • Fire
  • Love
Reactions: 47 users

7für7

Top 20
Could it be?

IMG_4991.jpeg
 
  • Like
  • Wow
  • Thinking
Reactions: 6 users

Baneino

Regular
G'Day Baneino,

Yes - in total agreement with your statement - … [full post quoted above]
Thank you very much for your feedback – I see it very much the same way.
The connection between BrainChip and Accenture through Jean-Luc Chatelain and the mentioned patent filing is certainly noteworthy. Even though no official partnership or integration has been announced yet, it's fascinating to observe what strategic developments might already be quietly unfolding in the background.
I share your view that Akida will likely gain more traction as traditional Von Neumann architectures begin to reach their limits – especially in terms of energy efficiency and edge applications.
Until then, it seems we just need a bit of patience, a close eye on developments, and trust in the substance of the technology, without relying on marketing fireworks.
 
  • Like
  • Love
  • Fire
Reactions: 14 users

Cardpro

Regular
I often see criticism of BrainChip for not communicating much: no flashy press releases, no constant updates. … [full post quoted above]
If we are actively engaged and working with them, how come our revenue from providing engineering support is so tiny? Is that how development works, where small tech firms don't charge much for their work for years and years with no promises?
 
  • Like
Reactions: 3 users