Had to spend my tax return on something. Hopefully a wise decision. OK, who was the one with small change of over $660,000 in their pocket that just wiped out the 22c line, over 3 million purchased at 10.35.40?
wasn't me

The last 2 quarters saw SP rises prior to the quarterly release. Prior quarters, however, saw the share price down into quarterlies. I posted charts on the crapper showing this. Green Friday??? Something is brewing folks… or the usual pump before we get some reports? I have no idea….
Thanks @Tothemoon24, interesting how Brn are actively promoting Edge Impulse.
Edge AI solutions have become critically important in today’s fast-paced technological landscape. Edge AI transforms how we utilize and process data by moving computations close to where data is generated. Bringing AI to the edge not only improves performance and reduces latency but also addresses the concerns of privacy and bandwidth usage. Building edge AI demos requires a balance of cutting-edge technology and engaging user experience. Often, creating a well-designed demonstration is the first step in validating an edge AI use case that can show the potential for real-world deployment.
Building demos can help us identify potential challenges early when building AI solutions at the edge. Presenting proof-of-concepts through demos enables edge AI developers to gain stakeholder and product approval, demonstrating how AI solutions effectively create real value for users within size, weight, and power constraints. Edge AI demos help customers visualize the real-time interaction between sensors, software, and hardware, helping in the process of designing effective AI use cases. Building a use-case demo also helps developers experiment with what is possible.
Understanding the Use Case
The journey of building demos starts with understanding the use case – it might be detecting objects, analyzing sensor data, interacting with a voice-enabled chatbot, or asking AI agents to perform a task. The use case should be able to answer questions like: what problem are we solving? Who can benefit from this solution? Who is your target audience? What are the timelines associated with developing the demo? These answers serve as the main objectives that guide the development of the demo.
Let’s consider our BrainChip Anomaly Classification C++ project, which demonstrates real-time classification of mechanical vibrations from an ADXL345 accelerometer into five motion patterns: forward-backward, normal, side-side, up-down, and tap. This use case is valuable for industrial applications like monitoring conveyor belt movements, detecting equipment malfunctions, and many more.
Optimizing Pre-processing and Post-processing
Optimal model performance relies heavily on the effective implementation of both pre-processing and post-processing components. Pre-processing tasks might involve normalization, image resizing, or conversion of audio signals to a required format. Post-processing might include decoding the model’s outputs, applying threshold filters to refine those results, creating bounding boxes, or developing a chatbot interface. The design of these components must ensure accuracy and reliability.
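As a concrete illustration of the point above, here is a minimal Python sketch of a normalization pre-processing step and a confidence-threshold post-processing step. The function names and the 0.6 threshold are assumptions for this example, not part of any BrainChip or Edge Impulse API.

```python
# Minimal sketch of generic pre-/post-processing around an edge model.
# Names and the 0.6 threshold are illustrative assumptions.
import numpy as np

def preprocess(window: np.ndarray) -> np.ndarray:
    """Normalize a raw sensor window (samples x channels) to zero mean, unit variance."""
    mean = window.mean(axis=0)
    std = window.std(axis=0) + 1e-8  # avoid division by zero on flat signals
    return (window - mean) / std

def postprocess(probabilities: np.ndarray, labels: list[str], threshold: float = 0.6):
    """Apply a confidence threshold to the model's output before reporting a class."""
    idx = int(np.argmax(probabilities))
    if probabilities[idx] < threshold:
        return "uncertain", float(probabilities[idx])
    return labels[idx], float(probabilities[idx])
```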
In the BrainChip anomaly classification project, the model analyzes data from the accelerometer, which records three-dimensional vibration at 100 Hz through the accX, accY, and accZ channels. The data was collected using Edge Impulse’s data collection feature, and spectral analysis of the accelerometer signals was performed during the pre-processing step to extract features from the time-series data. You can use this project and retrain the model, or bring your own models and optimize them for Akida IP using the Edge Impulse platform, which provides a user-friendly, no-code interface for designing ML workflows and optimizing model performance for edge devices, including BrainChip’s Akida IP.
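For the accelerometer use case, a spectral-analysis step might look roughly like the sketch below, which computes per-axis band power plus a simple time-domain statistic. The window length, band edges, and feature layout are assumptions for illustration and will differ from what the Edge Impulse spectral block actually produces.

```python
# Illustrative spectral features for 100 Hz accX/accY/accZ windows.
# Window length and band edges are assumptions, not the Edge Impulse defaults.
import numpy as np

SAMPLE_RATE_HZ = 100
WINDOW_SAMPLES = 200  # 2-second window at 100 Hz

def spectral_features(window: np.ndarray) -> np.ndarray:
    """window: (WINDOW_SAMPLES, 3) array of accX, accY, accZ. Returns a flat feature vector."""
    features = []
    for axis in range(window.shape[1]):
        signal = window[:, axis] - window[:, axis].mean()     # remove DC offset
        spectrum = np.abs(np.fft.rfft(signal)) ** 2           # power spectrum
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / SAMPLE_RATE_HZ)
        for lo, hi in [(0, 5), (5, 15), (15, 30), (30, 50)]:  # coarse frequency bands
            features.append(spectrum[(freqs >= lo) & (freqs < hi)].sum())
        features.append(signal.std())                         # simple time-domain feature
    return np.asarray(features, dtype=np.float32)
```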
Balancing Performance and Resource Constraints
Models at the edge need to be smaller and faster while maintaining accuracy. Quantization, together with knowledge distillation and pruning, improves model efficiency while sustaining accuracy. BrainChip’s Akida AI Acceleration Processor IP leverages quantization and also adds sparsity processing to realize extreme levels of energy efficiency and accuracy. It supports real-time, on-device inference at extremely low power.
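To make the quantization idea concrete, here is a generic 8-bit affine quantization sketch. It illustrates the concept only; it is not the Akida or Edge Impulse quantization flow, which the platform handles for you.

```python
# Conceptual 8-bit quantization: map float weights to int8 plus a scale factor.
# This illustrates the idea only, not BrainChip's actual tooling.
import numpy as np

def quantize_int8(weights: np.ndarray):
    scale = max(float(np.max(np.abs(weights))), 1e-12) / 127.0
    q = np.clip(np.round(weights / scale), -128, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

w = np.random.randn(64, 32).astype(np.float32)
q, s = quantize_int8(w)
print("max abs reconstruction error:", float(np.max(np.abs(dequantize(q, s) - w))))
```

The error printed at the end shows the accuracy cost of the smaller representation, which is exactly the trade-off these optimization methods manage.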
Building Interactive Interfaces
Modern frameworks such as Flask, FastAPI, Gradio, and Streamlit enable users to build interactive interfaces. Flask and FastAPI give developers the ability to build custom web applications with flexibility and control, while Gradio and Streamlit enable quick prototyping of machine learning applications with minimal code. Factors like interface complexity, deployment requirements, and customization needs influence framework selection. The effectiveness of the demo depends heavily on user experience, such as UI responsiveness and intuitive design. The rise of vibe coding and tools like Cursor and Replit has greatly accelerated the time to build prototypes and enhance the UX, freeing users to focus on edge deployment and optimizing performance where it truly matters.
For the Anomaly Classification demo, we implemented user interfaces for both Python and C++ versions to demonstrate real-time inference capabilities. For the Python implementation, we used Gradio to create a simple web-based interface that displays live accelerometer readings and classification results as the Raspberry Pi 5 processes sensor data in real-time. The C++ version features a PyQt-based desktop application that provides more advanced controls and visualizations for monitoring the vibration patterns. Both interfaces allow users to see the model's predictions instantly, making it easy to understand how the system responds to different types of mechanical movements.
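A stripped-down Gradio interface in the spirit of the Python demo might look like the sketch below; classify_window is a hypothetical stand-in for the real buffering, feature extraction, and Akida inference code.

```python
# Minimal Gradio sketch of a classification UI. classify_window is a placeholder
# for the real sensor buffering + feature extraction + model inference pipeline.
import gradio as gr
import numpy as np

LABELS = ["forward-backward", "normal", "side-side", "up-down", "tap"]

def classify_window(acc_x: float, acc_y: float, acc_z: float) -> str:
    scores = np.random.dirichlet(np.ones(len(LABELS)))  # fake probabilities for the sketch
    return f"{LABELS[int(scores.argmax())]} ({scores.max():.2f})"

demo = gr.Interface(
    fn=classify_window,
    inputs=[gr.Number(label="accX"), gr.Number(label="accY"), gr.Number(label="accZ")],
    outputs=gr.Textbox(label="Predicted motion pattern"),
    title="Anomaly Classification (sketch)",
)

if __name__ == "__main__":
    demo.launch()
```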
Overcoming Common Challenges
Common challenges in edge AI demo development include handling hardware constraints, maintaining performance consistency across different devices, and achieving real-time processing. By implementing careful optimization combined with robust error handling and rigorous testing under diverse conditions, developers can overcome these challenges. By combining BrainChip’s hardware acceleration with Edge Impulse’s model optimization tools, the solution can show consistent performance across different deployment scenarios while maintaining the low latency required for real-time industrial monitoring.
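As one small example of the robust error handling mentioned above, a sensor-read wrapper might retry transient I/O failures rather than let the demo crash; read_accelerometer here is a hypothetical stand-in for the actual ADXL345 driver call.

```python
# Retry wrapper for flaky sensor reads. read_accelerometer is a hypothetical
# callable standing in for the real ADXL345 driver.
import time

def read_with_retry(read_accelerometer, retries: int = 3, delay_s: float = 0.01):
    """Return one (x, y, z) sample, retrying briefly on transient I/O errors."""
    last_error = None
    for _ in range(retries):
        try:
            return read_accelerometer()
        except OSError as err:  # e.g. an I2C glitch on the Raspberry Pi
            last_error = err
            time.sleep(delay_s)
    raise RuntimeError(f"Accelerometer unavailable after {retries} attempts") from last_error
```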
The Future of Edge AI Demos
As edge devices become more powerful and AI models more efficient, demos will play a crucial role in demonstrating the practical applications of these advancements. They serve as a bridge between technical innovation and real-world implementation, helping stakeholders understand and embrace the potential of edge AI technology.
If you are ready to turn your edge AI ideas into powerful, real-world demos, you can start building today with BrainChip’s Akida IP and Edge Impulse’s intuitive development platform. Whether you're prototyping an industrial monitoring solution or exploring new user interactions, the tools are here to help you accelerate development and demonstrate what is possible.
- Explore the BrainChip Developer Hub
- Get started with Edge Impulse
Article by:
Dhvani Kothari is a Machine Learning Solutions Architect at BrainChip. With a background in data engineering, analytics, and applied machine learning, she has held previous roles at Walmart Global Tech and Capgemini. Dhvani has a Master of Science degree in Computer Science from the University at Buffalo and a Bachelor of Engineering in Computer Technology from Yeshwantrao Chavan College of Engineering.
A question was asked of BRN management, when Qualcomm took over, about the reason behind Edge Impulse holding off on further BRN business. Thanks @Tothemoon24, interesting how Brn are actively promoting Edge Impulse.
Interesting results from AI when you ask what AKIDA1000 enables Chelpis robots to achieve for industrial use and in Defence use. We've had a burst of AFRL/RTX(ILS) microDoppler news recently, and the SBIR was to run for 6 months to 1 year. It was announced in December 2024, so it could pop up any time from now to the end of the year.
Another opportunity is Chelpis. They are a Taiwanese cybersecurity company. Which country in the world would be under more constant cyber attack than Taiwan?
That urgency may explain why the Chelpis agreement is directed to Akida 1 SoC.
There's a lot to unpack in the Chelpis announcement:
1. Akida 1000 chips for immediate inclusion in M.2 cybersecurity cards for qualification and deployment
2. Collaboration to develop a PQ-AI robotic chip (PQ = post-quantum, i.e., hardened against future quantum-computer cyber attack).
3. Akida IP to also be used for NPU capabilities (Akida's primary function)
4. Exploring "advanced Akida IP visual GenAI capabilities" (Akida GenAI).
5. Applied for Taiwanese government support for the development
6. Made-in-USA strategy
https://www.chelpis.com/post/brainchip-collaborates-with-chelpis-mirle-on-security-solution
BrainChip Collaborates with Chelpis-Mirle on Security Solution
- May 2
LAGUNA HILLS, Calif.--(BUSINESS WIRE)--BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, event-based, neuromorphic AI, today announced that Chelpis Quantum Corp. has selected its Akida AKD1000 chips to serve as the processor for built-in post-quantum cryptographic security.
Chelpis, a chip company leading the Quantum Safe Migration ecosystem in Taiwan, is developing an M.2 card using the AKD1000 that can be inserted into targeted products to support their cryptographic security solutions. The M.2 card is based on a design from BrainChip along with an agreement to purchase a significant number of AKD1000 chips for qualification and deployment. Upon completion of this phase, Chelpis is planning to increase its commitment with additional orders for the AKD1000.
This agreement is the first step in a collaboration that is exploring the development of an AI-PQC robotic chip designed to fulfill both next-generation security and AI computing requirements. This project is a joint development effort with Chelpis partner company Mirle (2464.TW) and has been formally submitted for consideration under Taiwan’s chip innovation program. The funding aims to promote a new system-on-chip (SoC) integrating RISC-V, PQC, and NPU technologies. This SoC will specifically support manufacturing markets that emphasize a Made-in-USA strategy. Mirle plans to build autonomous quadruped robotics that mimic the movement of four-legged animals for industrial/factory environments. To enable this vision, Chelpis is exploring BrainChip’s advanced Akida™ IP to incorporate advanced visual GenAI capabilities in the proposed SoC design.
"The ability to add Edge AI security capabilities to our industrial robotics project that provides the low power data processing required is paramount to successfully achieving market validation in the robotics sector," said Ming Chih, CEO of Chelpis. "We believe that BrainChip’s Akida is just the solution that we further need to bring our SoC to fruition. Their event-based processing and advanced models serve as a strong foundation for developing a platform for manufacturing customers looking to leverage advanced robotics in their facilities."
"Akida’s ability to efficiently provide cyber-security acceleration with energy efficiency can help secure autonomous robotic devices," said Sean Hehir, CEO of BrainChip. "Akida’s innovative approach to supporting LLMs and GenAI algorithms could serve as a key contributor to Chelpis as they pursue government funding to develop their SoC and advance their industrial robotic initiatives."
It looks like Chelpis are in boots and all.
Totally agree, an AI question about the industrial jobs or tasks these robots could perform is interesting. A question about adapting the robots for Defence and future defence uses is interesting as well. It's Taiwan, so defence matters.
For vehicle-to-vehicle communication, there will need to be an agreed international standard communication protocol, just as there is for mobile phones. This may be a good one for @Diogenese to look into.
[Attachments: extracts only from the PatentPC article linked below]
Legal Challenges in Mercedes’ V2X Communication Patents - PatentPC
Explore the legal challenges surrounding Mercedes’ V2X communication patents and their significance in the future of connected vehicles.
patentpc.com
I’ve been spending some time in Japan lately, and things are starting to get interesting when you look at who’s actually connected to whom.
During a conversation with a business partner, a company called Vector came up …a German player… so let’s take a look.
Renesas, who, as we all know, licensed Akida from us, plays a pretty central role in all this. Not just because they’re integrating the Akida IP, but also because they worked with Vector years ago. Back then, it was about CANopen and industrial Ethernet… not exactly headline material, but definitely industrially relevant.
So what’s Vector’s role? Well, they’re not some small outfit. They’re playing in the top league, especially in the AUTOSAR space. And more recently, they’ve teamed up with Synopsys to develop SDV platforms.
Yes, Synopsys… the same one that’s part of the Intel Foundry Alliance… just like BrainChip.
What’s emerging here is a pretty tight-knit web of players. Not all directly connected, but definitely linked through shared nodes:
Renesas ↔ Vector
Renesas ↔ BrainChip
Synopsys ↔ Intel Foundry ↔ BrainChip
Synopsys ↔ Vector
And when you also consider that Vector has been working with Mercedes-Benz on SDV tooling and embedded software…
Well, maybe it’s worth taking a closer look at what’s possibly already brewing quietly in the background.
Coincidence? Or is there already something moving under the radar that just hasn’t made it to the headlines yet?
Just asking for a friend.
I’m no engineer…
But the network is definitely there.
I’m not good at researching so I leave it to you @Bravo
See Adiuvo mentioned. Hope this is them. Not sure if I’m seeing things, but the Edge Impulse project says last published 18 July 2025 at 16.34.10.
Seems it is.
Great find Bravo. This press release on 11 July 2025 got me wondering whether Tata Elxsi was first introduced to BrainChip by Synopsys.
See the screenshot below showing a picture of BrainChip's Akida in Synopsys's "Corporate Overview for Investors May 2022"
BrainChip and Tata Elxsi officially announced their partnership on August 28, 2023, when Tata Elxsi joined BrainChip’s Essential AI ecosystem to integrate Akida neuromorphic technology into medical and industrial applications.
Tata’s continued collaboration with Synopsys (2025) builds upon their existing relationship with BrainChip, completing a triangle of opportunity IMO.
Synopsys + Tata could potentially model and test complex ECUs before production which is critical for SDV.
BrainChip + Tata could potentially allow these ECUs and future ECUs to embed real-time, energy-efficient AI inference.
IMO. DYOR.
Tata Elxsi and Synopsys Collaborate to Accelerate Software-Defined Vehicle Development through Advanced ECU Virtualization Capabilities
Date: Jul 11 2025
Integrated capabilities aim to simplify and speed software development and testing to help reduce related costs and de-risk production timelines
Bengaluru, India – July 11, 2025 — Tata Elxsi, a global leader in design and technology services, today announced the signing of a Memorandum of Understanding (MoU) with Synopsys, a leader in silicon to systems design solutions, to collaborate to deliver advanced automotive virtualization solutions. The MoU was signed at the SNUG India 2025 event in Bengaluru by senior leaders from both companies.
The collaboration will provide customers pre-verified, integrated solutions and services that make it easy to design and deploy virtual electronic control units (vECUs), a cornerstone technology critical for efficient software development and testing in today’s software-defined vehicles. The collaboration brings together Tata Elxsi’s engineering capabilities in embedded systems and integration with Synopsys’ industry-leading virtualization solutions that are used by more than 50 global automotive OEMs and Tier 1 suppliers to help reduce development complexity and cost, improve quality of software systems, and de-risk vehicle production timelines.
Together, the companies are already collaborating on programs with several global customers to enable vECUs, as well as software bring-up, board support package (BSP) integration, and early-stage software validation. These solutions are being deployed across vehicle domains such as powertrain, chassis, body control, gateway, and central compute, helping customers simulate real-world scenarios, validate software early, and reduce reliance on physical prototypes.
Through the collaboration, Synopsys and Tata Elxsi will further explore opportunities to scale and accelerate the deployment of electronics digital twins for multi-ECU and application specific systems.
“Our partnership with Synopsys reflects a future-forward response to how vehicle development is evolving. As OEMs move away from traditional workflows, there is growing demand for engineering services that are tightly integrated with virtualization tools. This strategic collaboration enables us to jointly address that shift with focus, flexibility, and domain depth,” said Sundar Ganapathi, Chief Technology Officer of Automotive, Tata Elxsi.
“The automotive industry’s transformation to software-defined vehicles requires advanced virtualization capabilities from silicon to systems. Our leadership enabling automotive electronics digital twins, combined with Tata Elxsi’s engineering scale and practical experience operationalising automotive system design, will simplify the adoption of virtual ECUs and thereby accelerate software development and testing to improve quality and time to market,” said Marc Serughetti, Vice President, Synopsys Product Management & Markets Group.
Tata Elxsi & Synopsys Partner to Speed SDV with Advanced ECU Virtualization
Discover how Tata Elxsi and Synopsys are accelerating software-defined vehicle (SDV) innovation with advanced ECU virtualization for faster development and validation.
www.tataelxsi.com
Reminder - #28,573