BRN Discussion Ongoing

DK6161

Regular
Charts have been looking crap for some time, and your comment is a typical hot crapper comment

View attachment 85493
R U OK Pom?
You sound bitter, old mate. I thought you were going to keep accumulating and be greedy in times like this.
C'mon then, show us the buy order, old chap!
I am just stating what the charts are showing.
And remember, we are in the same boat, so I share your frustrations.
Not advice
 
  • Haha
  • Fire
Reactions: 2 users

manny100

Regular
R U OK Pom?
You sound bitter, old mate. I thought you were going to keep accumulating and be greedy in times like this.
C'mon then, show us the buy order, old chap!
I am just stating what the charts are showing.
And remember, we are in the same boat, so I share your frustrations.
Not advice
Chart-wise, BRN has been in a long range/base since around September 2023.
It's generally accepted that Neuromorphic AI at the Edge will experience exponential growth.
BRN is generally accepted as the leader in Neuromorphic AI.
At this stage of the cycle, those who believe the above is true will accumulate patiently.
The role of good news:
If you believe the above is true, then really good news will be the catalyst that breaks the equilibrium of the range/base and attracts new buyers, to the extent that we see higher highs and higher lows confirming a sustained uptrend.
Once the trend is established, we will then see momentum investing.
Generally, a long base at the lows followed by really good news on volume is a classic technical pattern for identifying potential significant price increases.
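Not advice either, but for anyone who likes to codify that kind of setup, here is a rough sketch of one way to flag a long, tight base followed by a breakout on above-average volume. It assumes daily OHLCV data in a pandas DataFrame, and the window and threshold values are arbitrary placeholders rather than a tested rule:

```python
import pandas as pd

def flag_base_breakout(df: pd.DataFrame,
                       base_days: int = 120,
                       max_base_width: float = 0.25,
                       vol_mult: float = 2.0) -> pd.Series:
    """Flag days where price closes above a long, tight trading range on heavy volume.

    df needs 'high', 'low', 'close' and 'volume' columns indexed by date.
    All thresholds here are illustrative placeholders, not a tested trading rule.
    """
    base_high = df['high'].rolling(base_days).max().shift(1)
    base_low = df['low'].rolling(base_days).min().shift(1)
    # "Base": the trailing range is narrower than max_base_width of its low.
    tight_base = (base_high - base_low) / base_low < max_base_width
    # Breakout: close above the prior range high on volume well above its rolling average.
    heavy_volume = df['volume'] > vol_mult * df['volume'].rolling(base_days).mean().shift(1)
    return tight_base & (df['close'] > base_high) & heavy_volume
```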
 
  • Like
  • Love
Reactions: 14 users

7für7

Top 20
Yeeeehaa, record breaking! Already 2 announcements this week… will we see 3 or 4???
 
  • Haha
  • Like
Reactions: 7 users
R U OK Pom?
You sound bitter, old mate. I thought you were going to keep accumulating and be greedy in times like this.
C'mon then, show us the buy order, old chap!
I am just stating what the charts are showing.
And remember, we are in the same boat, so I share your frustrations.
Not advice
Never bitter, just playing games with all the weasels from hot crapper.

 
Last edited:
  • Haha
  • Like
Reactions: 9 users
Isn't it Akida specifically that helps Nanose sniff particles out, or is Jetson able to do this as well?
 
  • Thinking
Reactions: 2 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Isn't it Akida specifically that helps Nanose sniff particles out, or is Jetson able to do this as well?
Could NVIDIA's Jetson series sniff out the VOC particles with the same accuracy as BrainChip's Akida in relation to Nanose?

ChatGPT said:​

The short answer is: not directly. NVIDIA Jetson modules and BrainChip’s Akida neuromorphic processor operate on fundamentally different architectures and processing models. While both can process data from an e-nose (electronic nose) sensor array, Akida has specific advantages for low-power, real-time inference of volatile organic compounds (VOCs) due to its neuromorphic design.

🔬 Breakdown: Jetson vs. Akida for VOC Detection (e.g., in NaNose)​

| Feature | BrainChip Akida | NVIDIA Jetson (e.g., Xavier NX, Orin) |
| --- | --- | --- |
| Architecture | Neuromorphic (Spiking Neural Network) | GPU-accelerated (deep learning, CNNs, LSTMs) |
| Data Model | Event-based / spike-encoded | Frame-based / dense tensor processing |
| VOC Sensor Interface | Optimized for real-time spiking data (e.g., AERO encoding) | Can interface but needs preprocessing |
| Latency | Ultra-low (milliseconds) | Higher (requires buffer + batch inference) |
| Power Efficiency | Micro-watt to milli-watt range | Several watts to tens of watts |
| Accuracy with NaNose | Demonstrated 97%+ accuracy using spike-based classification | Potentially similar, but with higher power cost and latency |
| Training Approach | Online/on-device continual learning | Typically pre-trained and updated via cloud or edge inference |
| Scalability | Excellent for embedded sensors | Better for multi-modal, compute-heavy scenarios |


🎯 Why Akida May Be Better Suited for VOC Detection in NaNose-Type Devices​

  1. Real-Time Spike-Based Classification
    • VOCs generate dynamic, time-dependent signals. Akida processes them directly as spikes, which mimics the biological olfactory system.
    • Jetson would require conversion of VOC sensor data into tensors, potentially losing time-domain resolution and increasing preprocessing complexity.
  2. On-Device Learning
    • Akida supports on-chip continual learning, useful for adapting to new users, environments, or sensor drift.
    • Jetson typically relies on cloud-based retraining or large local models with fixed parameters.
  3. Power Budget
    • NaNose aims for handheld, portable diagnostics — environments where Jetson’s 10–15W draw (or more) is impractical compared to Akida’s sub-1W footprint.
  4. Biological Relevance
    • Akida’s neuromorphic design is inspired by the human olfactory cortex, making it intrinsically well-suited to VOC processing.
    • Jetson is general-purpose AI hardware — very powerful but not biologically inspired or VOC-specialized.

🔄 Could They Be Used Together?​

Yes. A hybrid system is theoretically feasible:
  • Akida could perform first-pass classification or anomaly detection from e-nose input.
  • Jetson could handle broader context processing, GUI, connectivity, or integration with other modalities (e.g., image + breath analysis).

🧠 Conclusion​

While Jetson modules are powerful general-purpose AI edge processors, BrainChip’s Akida is purpose-built for ultra-low-power, spike-based sensory tasks like VOC detection. In a medical e-nose like NaNose, Akida is likely more efficient, biologically aligned, and better suited to the task, especially where power, latency, and real-time accuracy are critical.






ChatGPT can make mistakes. Check important info.
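For anyone curious what the "event-based / spike-encoded" distinction above means in practice, here is a toy delta-modulation encoder in Python. It is purely illustrative (my own simplification, not BrainChip's actual encoding scheme or the Akida SDK): a slowly varying VOC sensor channel only emits events when the reading changes meaningfully, which is the property event-based processing exploits to save power compared with dense frame-based inference.

```python
import numpy as np

def delta_spike_encode(signal: np.ndarray, threshold: float) -> np.ndarray:
    """Emit +1/-1 events whenever the reading moves by more than `threshold`
    since the last emitted event. A toy stand-in for event-based encoding."""
    events = np.zeros(len(signal), dtype=int)
    reference = signal[0]
    for i in range(1, len(signal)):
        if signal[i] - reference >= threshold:
            events[i], reference = 1, signal[i]
        elif reference - signal[i] >= threshold:
            events[i], reference = -1, signal[i]
    return events

# Example: a slowly drifting, noisy sensor channel produces only a handful of events.
t = np.linspace(0, 10, 1000)
channel = 0.5 * np.sin(t) + 0.01 * np.random.randn(1000)
spikes = delta_spike_encode(channel, threshold=0.1)
print("events emitted:", np.count_nonzero(spikes), "of", len(spikes), "samples")
```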
 
  • Like
  • Love
  • Fire
Reactions: 30 users

Diogenese

Top 20
Anastasi Nvidia Huawei Spray Tan

Looks like Anastasi was standing downwind of Donald’s morning application.



China's HUGE AI Chip Breakthrough: NVIDIA is out?
 
  • Like
  • Haha
  • Fire
Reactions: 9 users

Labsy

Regular
Chart-wise, BRN has been in a long range/base since around September 2023.
It's generally accepted that Neuromorphic AI at the Edge will experience exponential growth.
BRN is generally accepted as the leader in Neuromorphic AI.
At this stage of the cycle, those who believe the above is true will accumulate patiently.
The role of good news:
If you believe the above is true, then really good news will be the catalyst that breaks the equilibrium of the range/base and attracts new buyers, to the extent that we see higher highs and higher lows confirming a sustained uptrend.
Once the trend is established, we will then see momentum investing.
Generally, a long base at the lows followed by really good news on volume is a classic technical pattern for identifying potential significant price increases.
I call upon the omnipotent power of the universe to grant us a "Trading halt" followed by news of a significant partnership with big tech and/or an offer to buy us out for 5 dollars a share.... 🙏🙏🙏🙏🙏🙏🙏🙏🤞🤞🤞🤞🤞🤞
 
  • Like
  • Love
  • Fire
Reactions: 29 users
I call upon the omnipotent power of the universe to grant us a "Trading halt" followed by news of a significant partnership with big tech and/or an offer to buy us out for 5 dollars a share.... 🙏🙏🙏🙏🙏🙏🙏🙏🤞🤞🤞🤞🤞🤞
Followed by counter offers from other big tech competitors, ending up at $14 a share. One can dream, but if you have an offer you want more than one party interested.

SC
 
  • Like
  • Fire
  • Love
Reactions: 17 users

manny100

Regular
Could NVIDIA's Jetson series sniff out the VOC particles with the same accuracy as BrainChip's Akida in relation to Nanose?

This is why Nanose originally chose BRN:
" Because Akida’s “Edge AI” is a complete neural processor that does not require an external CPU, memory, or Deep Learning Accelerator (DLA), and because it is ultra-low power, it offers a realistic opportunity for an inexpensive hand-held diagnostic testing tool akin to a breathalyzer."
 
  • Like
  • Love
  • Fire
Reactions: 19 users
Followed by counter offers from other big tech competitors, ending up at $14 a share. One can dream, but if you have an offer you want more than one party interested.

SC
How about 70 of us on this forum purchase a syndicate ticket in next Tuesday's $70 million OZ Lotto draw?

:)
 
  • Haha
  • Like
  • Fire
Reactions: 10 users

manny100

Regular

"BrainChip Eyes Edge AI Market Dominance"​

" After acknowledging past inflexibility, BrainChip has overhauled its product offering, now boasting a suite of products designed to adapt to changing market conditions. The company has seen more engagements in the past five months than ever before, driving confidence in future bookings."
" CEO Sean Hehir highlighted the increasing demand for Edge AI solutions, fueled by the transition of Generative AI from data centers. BrainChip is focusing on State-Space Models (SSMs), particularly Temporal Event-based Neural Networks (TENNs), to establish leadership in Edge applications."
 
  • Like
  • Love
  • Fire
Reactions: 22 users

Slade

Top 20

Soon
 
  • Love
  • Like
Reactions: 18 users
Thanks Bravo, so hopefully the two (Jetson and Akida) are integrated into the Nanose, which would make sense IMO.
 
Last edited:
  • Like
Reactions: 2 users
How about 70 of us on this forum purchase a syndicate ticket in next Tuesday's $70 million OZ Lotto draw?

:)
Already have

SC
 
Yeeeehaa, record breaking! Already 2 announcements this week… will we see 3 or 4???
Were they official on the ASX?
 

Frangipani

Top 20
The EDGX DPU, based on an NVIDIA Jetson Orin NX and designed for LEO (Low Earth Orbit) missions, can now be sourced through https://satsearch.co/ - with an optional neuromorphic BrainChip Akida add-on:




View attachment 84896

View attachment 84897 View attachment 84898


View attachment 84899 View attachment 84900



View attachment 84901

The EDGX team recently presented their EDGX DPU to the Belgian Federal Minister for Public Enterprises, Civil Service, State Building Management, Digitalization and Science Policy & her entourage, which also included a Programme Manager from the Belgian Science Policy Office (BELSPO):


View attachment 84902

View attachment 84905


By the looks, EDGX are counting down to a launch 🚀 in Q1/2026:

View attachment 84903


As LinkedIn doesn’t provide the exact date of the post, I checked out the Minister’s Instagram profile for any clues - et voilà!

Besides finding out that her visit to Ghent must have been on 16 April itself, or a couple of days earlier at most (although the lack of time adverbs such as ‘yesterday‘ or ‘last week’ suggests it was likely a same-day post), I also spotted a BrainChip sticker at the EDGX headquarters in Ghent. 😊

So the launch must be scheduled for no later than 1 March 2026, then.

View attachment 84904

Our Belgian space-tech partner EDGX just unveiled their new logo with the catchphrase “Sky-high Intelligence” at SmallSat Europe 2025 in Amsterdam, where they’re showcasing their EDGX DPU that was “designed to enable real-time in-orbit edge computing for next-gen missions in Earth Observation, Satcom and In-Orbit Servicing”. It runs on an NVIDIA Jetson Orin NX and comes with an optional neuromorphic Akida module for ultra-low-power applications:












They also appear to have freshened up their website (which still has the old logo, though): https://www.edgx.space/

 
Last edited:
  • Like
  • Fire
Reactions: 10 users

Diogenese

Top 20
This video is about the updated NASA/DARPA/Lockheed AI Warthog:

 
  • Like
  • Fire
  • Wow
Reactions: 13 users

Baneino

Regular
Followed by counter offers from other big tech competitors, ending up at $14 a share. One can dream, but if you have an offer you want more than one party interested.

SC
$45 ❤️
 
  • Love
  • Like
Reactions: 4 users

Frangipani

Top 20
Uni researchers in the United Arab Emirates (from New York University Abu Dhabi and Khalifa University, Abu Dhabi) have experimented with Akida for neuromorphic AI-based robotics - the field our CTO Dr. Tony Lewis is an expert in:


View attachment 60342
View attachment 60343

View attachment 60344

Four Bachelor of Computer Engineering students from Abu Dhabi’s Khalifa University recently used Akida for their Senior Design Project titled “SNN-based Object Tracking using DVS Data”:


View attachment 84793

View attachment 84799

View attachment 84802

Future work will include “Benchmark on other neuromorphic chips (e.g. Intel Loihi)”.



Their supervisor was Athanasios (Thanos) Stouraitis, a Professor in Khalifa University’s Computer and Information Engineering Department.


View attachment 84797
View attachment 84798

While we know of a number of researchers (Jorge Dias, Fakhreddine Zayer) and students at Abu Dhabi’s Khalifa University who have been exploring Akida 👆🏻, others there have been researching neuromorphic sensors and processors by the SynSense Group (“the world’s first fully neuromorphic, end-to-end sensing and processing company”, cf. https://www.synsense.ai/synsense-an...orm-leading-neuromorphic-technology-provider/), which was created when SynSense acquired iniVation last year.

Today, Khalifa University announced a strategic collaboration between its spin-off Kumrah AI, a deep-tech startup from the Advanced Research and Innovation Center (ARIC), and Zurich-based iniVation.

The signed Memorandum of Understanding is “aimed at establishing a global joint venture to develop and commercialize advanced neuromorphic vision-based inspection and autonomy systems (…) The partnership will focus on creating cutting-edge AI-powered hardware and software solutions for industrial and aerospace applications, with plans to localize technologies for the MENA region* and scale them globally. (…) The joint venture that consolidates each party’s technological, commercial, and operational strengths will focus on developing and deploying neuromorphic sensors, processors, and adaptive software tailored for industrial inspection, quality assurance, and autonomous decision-making. The partners will also collaborate on system design, prototype development, and technical validation for key industry verticals, while jointly identifying funding sources and commercial opportunities.

*MENA region = Middle East and North Africa








Khalifa University Spin-off Kumrah AI and iniVation, a SynSense Group company, to Launch Neuromorphic Tech Joint Venture​

2025-05-29
By Christy Aerne


MoU Signed at ‘Make it in the Emirates’ to Drive Regional and Global Commercialization of Advanced AI-Powered Systems

Khalifa University of Science and Technology today announced a strategic collaboration between its spin-off Kumrah Artificial Intelligence (AI), a deep-tech startup from the Advanced Research and Innovation Center (ARIC) and Swiss neuromorphic technology leader iniVation, a SynSense Group company, aimed at establishing a global joint venture to develop and commercialize advanced neuromorphic vision-based inspection and autonomy systems.

The Memorandum of Understanding was signed by Hussain Sajwani, CEO, Kumrah AI, and Dr. Kynan Eng, Chief Commercial Officer, iniVation/SynSense, in the presence of Her Excellency Amal Al Jaberi, CEO, Khalifa University Enterprises Company (KUEC), and Dr. Abdelqader Abusafieh, SVP, Technology & Advanced Materials, Strata, Mubadala, during ‘Make it in the Emirates’ 2025. The MoU underscores the UAE’s commitment to fostering strategic, innovation-driven industrial partnerships that align with its ambition to be a global leader in AI.

Her Excellency Amal Al Jaberi, said: “This collaboration demonstrates Khalifa University’s and KUEC’s role in turning deep research into deployable solutions. By facilitating strategic partnerships through spin-offs like Kumrah AI from the Advanced Research and Innovation Center, we are strengthening the UAE’s leadership in frontier technologies while supporting economic diversification. As the country accelerates its position as a global hub for artificial intelligence, initiatives like this joint venture will help translate emerging research into scalable, market-ready solutions that benefit both local and international industries.”

Hussain Sajwani said: “Signing the MoU at Make it in the Emirates 2025, a platform that provides emerging technology startups the necessary momentum, also reflects Kumrah AI’s commitment to contribute to the UAE’s ambition to be a global leader in AI. Our collaboration with IniVation prepares Kumrah AI to play a defining role in the next wave of AI-driven technologies. We believe the synergy resulting from this partnership will firmly place us among future technology leaders in this arena, positioning the UAE at the forefront of next-generation computing technology.”

Dr. Ning Qiao, CEO, iniVation (a SynSense Group company), and Dr. Kynan Eng, Chief Commercial Officer, said: “Over the years we have been working to bring neuromorphic technology to real-world applications, we have evaluated many developments by customers all over the world. Our decision to collaborate with Kumrah AI and KUEC is based on our conviction that, together with our partners, we have all of the ingredients required to succeed in global markets in aviation, energy, and other industries. We are looking forward to moving forward rapidly together, to bring the benefits of our combined technology and expertise to our customers.”

The partnership will focus on creating cutting-edge AI-powered hardware and software solutions for industrial and aerospace applications, with plans to localize technologies for the MENA region and scale them globally. The joint venture will combine Kumrah’s access to licensed intellectual property and commercialization pathways with iniVation’s neuromorphic vision systems and technical expertise. IniVation is a Swiss firm affiliated with SynSense, together forming a world leader in neuromorphic sensing and processing technologies.

The joint venture that consolidates each party’s technological, commercial, and operational strengths will focus on developing and deploying neuromorphic sensors, processors, and adaptive software tailored for industrial inspection, quality assurance, and autonomous decision-making. The partners will also collaborate on system design, prototype development, and technical validation for key industry verticals, while jointly identifying funding sources and commercial opportunities.


Kumrah AI will lead commercialization and system integration in the MENA region, engaging with local customers, regulators, and partners to ensure successful market deployment. The company will also coordinate workforce development, and regional stakeholder engagement. IniVation will serve as the technical lead, contributing proprietary neuromorphic vision technologies, hardware platforms, and software tools. It will also spearhead global R&D, facilitate knowledge transfer and training, and drive expansion into international markets including Europe, North America, and Asia.

About Khalifa University Enterprises Company (KUEC)
Khalifa University Enterprises Company (KUEC), a fully owned subsidiary of Khalifa University of Science and Technology, was established to strategically commercialize intellectual property and assets from Khalifa University and beyond.
For more information visit: https://www.ku.ac.ae/kuec/

About IniVation AG
iniVation AG is a member of the SynSense Group. The group designs and produces advanced sensors and processors for applications in consumer electronics, industrial automation, aerospace, and other markets.
For more information visit: https://www.inivation.com and https://www.synsense.ai

About Khalifa University of Science and Technology
Khalifa University of Science and Technology, the UAE’s top-ranked research-intensive institution, focuses on developing world leaders and critical thinkers in science, engineering and health science. The world-class university endeavors to be a catalyst to the growth of Abu Dhabi and the UAE’s rapidly developing knowledge economy as an education destination of choice and a global leader among widely acknowledged international universities.
For more information visit: http://www.ku.ac.ae/




The gentleman standing next to iniVation/SynSense CCO Kynan Eng in the released photo is Yahya Zweiri, Professor in the Department of Aerospace Engineering at Khalifa University, Director of Khalifa University’s Advanced Research & Innovation Centre (ARIC) and one of the co-founders of Kumrah AI.









Not much information is yet available about Kumrah AI - the website https://www.kumrah.ai/ is still more or less in stealth mode…




But here is a short video from the recent ‘Make it in the Emirates’ Exhibition featuring Kumrah AI as one of three Khalifa University spin-offs:

 
Last edited:
  • Like
  • Thinking
  • Love
Reactions: 13 users