BRN Discussion Ongoing

Frangipani

Top 20
There will be an inaugural EDGE AI FOUNDATION Neuromorphic Livestream on 17 February at 4:30 pm CET (Central European Time):
“Beyond von Neumann Compute: Neuromorphic AI at the Edge”:



View attachment 94358

Looks like the EDGE AI FOUNDATION has shifted next week’s inaugural Neuromorphic Livestream from 17 February (Tuesday) to 18 February 2025 (Wednesday) - same time of the day, though (7.30 am Pacific Time corresponds to 4.30 pm Central European Time).


 
  • Like
  • Fire
Reactions: 12 users

7für7

Top 20
This guy continues to push his agenda 😂 no matter what the people actually involved in this project tell him

 
  • Haha
  • Like
Reactions: 4 users

White Horse

Regular
Bascom Hunter posted about their 3U VPX SNAP (Spiking Neuromorphic Advanced Processor) Card on LinkedIn earlier today.

Unfortunately they don’t mention the product’s “five BrainChip AKD1000 spiking neuromorphic processors” in their post (although they do link to https://bascomhunter.com/deg/digita...c-processors/asic-solutions/3u-vpx-snap-card/, where this is explicitly stated), and the image is (deliberately?) blurry where we know these five processors are…



View attachment 95027


View attachment 95028
Hi Frangi,
I wouldn't worry too much about a low-res pic on LinkedIn.
Their website has been upgraded in the last few weeks.
And we are still there, large as life.
 
  • Like
  • Love
  • Fire
Reactions: 9 users

White Horse

Regular
Hi Flenton,

I assume you actually meant to reply to my preceding post, the one about the guy from Brazil, whose texts and images are so obviously AI-generated?

That alone should be a HUGE red flag to any reader, given the frequent occurrence of hallucinations in such Generative AI outputs - a danger, which has been addressed on this forum multiple times. And the more of these texts and images get generated, the more those hallucinations will spread, as they will in turn feed newly generated texts and images…

Plus, I noticed that quite a few of those who upvoted those fictitious and misleading posts about an alleged SpaceX/Starlink and Akida/GR801 connection were not exactly BRN shareholders that you would associate with the proverb “A drowning man will clutch at a straw”. Maybe it is a consolation to some of them that they are in fact “in good company”: Even some well-known researchers who have first-hand experience with Akida either fell for the fake news or “liked” one of those posts after only glancing over it (I’d say the latter explains the 👍🏻👍🏻👍🏻 of at least one BrainChip and two Frontgrade Gaisler employees that I spotted).

However, I honestly can’t get my head around why any BRN shareholder who regularly likes BrainChip LinkedIn posts as well as those by our company’s partner Frontgrade Gaisler would even think it possible that GR801 - a future product which is still clearly marked as “under development” on Frontgrade Gaisler’s website, a space-grade SoC that is verifiably not yet taped out - could already have long been in the hands of SpaceX engineers?!

Frontgrade Gaisler on 4 February 2026:
“We plan to tape out the GR765 and GR801 products in the first half of 2026 […]. Early prototypes are expected at the end of this year. And together with these prototypes, we expect to have evaluation and development boards ready at the same time for early adopters.”

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-480760

View attachment 95008

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-481900

View attachment 95009

Although Kenneth Östberg unmistakably told Daniel Azevedo Novais to stop posting AI-generated images of GR801 “with no truth in them” without clearly stating they are not real, he (or a bot pretending to be a human?) continues to do so…


View attachment 95010

View attachment 95011


All these misleading images should be clearly marked as AI-generated and fictitious.

“Alternative facts” are not helpful when we want to see a sustainable share price rise.
I have just reported this clown for misinformation.
I suggest others with a LinkedIn a/c do the same.
 
  • Like
  • Fire
Reactions: 11 users
What's the difference between now and 3 years ago, when people were saying we're going to the moon?
Our company was nothing back then.
Look at us now, at how far we have developed and how many avenues of potential income we have with the array of products available.
Why did people believe then but not now?
Look at the prices. Do yourself a favour.
 
  • Like
  • Fire
Reactions: 3 users

manny100

Top 20
Bascom Hunter posted about their 3U VPX SNAP (Spiking Neuromorphic Advanced Processor) Card on LinkedIn earlier today.

Unfortunately they don’t mention the product’s “five BrainChip AKD1000 spiking neuromorphic processors” in their post (although they do link to https://bascomhunter.com/deg/digita...c-processors/asic-solutions/3u-vpx-snap-card/, where this is explicitly stated), and the image is (deliberately?) blurry where we know these five processors are…



View attachment 95027


View attachment 95028
It appears the Bascom Hunter 3U VPX is commercially available. They have a Product Brochure.
The question is what they had planned for the $100k worth of 1,500 chips in Dec '24.
Radar? Or something else?
 
  • Like
  • Fire
  • Love
Reactions: 16 users

7für7

Top 20
What's the difference between now and 3 years ago, when people were saying we're going to the moon?
Our company was nothing back then.
Look at us now, at how far we have developed and how many avenues of potential income we have with the array of products available.
Why did people believe then but not now?
Look at the prices. Do yourself a favour.
The difference with the “moon” statement is that we all knew back then it was just wishful thinking. It’s totally fine to have visions and debate what could be possible with the tech. People say “Terminator is coming” all the time… that doesn’t mean it will become real.

But what this guy is doing is different: he presents his claims as if they were real, uses real product and company names, and even fake images. That can easily be misleading, especially for new investors, and could make them think “BrainChip is a scam.”

Just my opinion — DYOR.
 
  • Like
Reactions: 4 users

Diogenese

Top 20
It appears the Bascom Hunter 3U VPX is commercially available. They have a Product Brochure.
The question is what they had planned for the $100k worth of 1,500 chips in Dec '24.
Radar? Or something else?
Hi Manny,

Only tangential to BH, but with the RTX microDoppler radar, the first requirement is a radar receiver sensitive enough to detect the small frequency variations (Doppler effect) in the reflected signal caused by the vibrations of the target object. So clearly RTX have such a radar TX/RX.

Then there is a need for a model/database correlating the frequency modulations with corresponding objects. Again, this must be developed from the received sensor signals, so RTX must have this.

Akida's role is to provide real time analysis/inference/classification of received signals.

The thing is that the Doppler signals in the model and in the reflected radar pulse contain additional information, which means that the Akida processor needs to be able to process the additional information. Akida 1 has 4-bit processing, while Akida 2 has 8-bit processing, so Akida 2 would be more suited to microDoppler than Akida 1.

Which brings us to the MegaChips/Acumino robot radar. For radar to be useful in a domestic robot, then, apart from basic navigation, it would be beneficial to have radar which could recognize items*. If, as we all would like to believe, this partnership incorporates the Megachips/Akida licence, it would be sensible for this project to incorporate Akida 2 (or higher). This would tally with the imminent production of Akida 2 ASIC.

* Radar is colour blind, so for some tasks, a RGB camera would be useful.
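Purely to illustrate the micro-Doppler effect described above, here is a toy numpy sketch. Every number in it (carrier frequency, vibration rate and amplitude, sampling rate) is invented for illustration and comes from no RTX or BrainChip source: a vibrating surface phase-modulates the radar echo, and an FFT over slow time exposes the resulting sidebands that a classifier would consume.

```python
import numpy as np

# Hypothetical illustration only: a 77 GHz radar observing a target whose
# surface vibrates at 40 Hz with 0.5 mm amplitude.
c = 3e8                 # speed of light, m/s
f_c = 77e9              # carrier frequency, Hz
f_vib = 40.0            # target vibration frequency, Hz
a_vib = 0.5e-3          # vibration amplitude, m
fs = 4000.0             # slow-time sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)

# Radial displacement of the vibrating surface and the resulting
# two-way phase modulation of the echo.
r = a_vib * np.sin(2 * np.pi * f_vib * t)
phase = 4 * np.pi * f_c * r / c
echo = np.exp(1j * phase)

# Peak micro-Doppler shift f_d = 2 * v_max * f_c / c,
# with v_max = 2*pi*f_vib*a_vib the peak radial velocity.
v_max = 2 * np.pi * f_vib * a_vib
f_d = 2 * v_max * f_c / c
print(f"peak micro-Doppler shift ~ {f_d:.1f} Hz")

# One windowed FFT frame of the slow-time signal: the kind of
# time-frequency content a classifier (neuromorphic or otherwise) would see.
win = 256
frame = echo[:win] * np.hanning(win)
spectrum = np.abs(np.fft.fftshift(np.fft.fft(frame)))
peak_bin = int(np.argmax(spectrum))
freqs = np.fft.fftshift(np.fft.fftfreq(win, 1 / fs))
print(f"strongest spectral component near {freqs[peak_bin]:.0f} Hz")
```

The point of the sketch: sub-millimetre vibrations of a target produce Doppler sidebands of only tens of Hz around a 77 GHz carrier, which is why both a sensitive receiver and a signature model are needed before any inference step.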
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 14 users

Diogenese

Top 20
Hi Manny,

Only tangential to BH, but with the RTX microDoppler radar, the first requirement is a radar receiver sensitive enough to detect the small frequency variations (Doppler effect) in the reflected signal caused by the vibrations of the target object. So clearly RTX have such a radar TX/RX.

Then there is a need for a model/database correlating the frequency modulations with corresponding objects. Again, this must be developed from the received sensor signals, so RTX must have this.

Akida's role is to provide real time analysis/inference/classification of received signals.

The thing is that the Doppler signals in the model and in the reflected radar pulse contain additional information, which means that the Akida processor needs to be able to process the additional information. Akida 1 has 4-bit processing, while Akida 2 has 8-bit processing, so Akida 2 would be more suited to microDoppler than Akida 1.

Which brings us to the MegaChips/Acumino robot radar. For radar to be useful in a domestic robot, then, apart from basic navigation, it would be beneficial to have radar which could recognize items*. If, as we all would like to believe, this partnership incorporates the Megachips/Akida licence, it would be sensible for this project to incorporate Akida 2 (or higher). This would tally with the imminent production of Akida 2 ASIC.

* Radar is colour blind, so for some tasks, a RGB camera would be useful.
Thinking a bit more about it, the see-in-the-dark radar application does not use Doppler modulation, because not everything vibrates.

So it is just the high precision radar TX/RX and a high definition object model that are needed, ie, the radar RX provides high precision information from the reflected signal, but does not need to demodulate the Doppler information and is independent of whether or not the reflected signal includes Doppler modulation. Akida's ultra low latency enables this information to be processed for inference/classification. With hindsight, I guess that any Doppler information present can be processed in the same manner. If this is correct, there would be no need for a separate Doppler frequency demodulation step.
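On the 4-bit vs 8-bit precision point quoted above, a generic toy sketch (this is not how Akida quantizes internally, just the textbook precision trade-off) shows why coarse quantization can swamp a small superimposed modulation riding on a larger signal:

```python
import numpy as np

# Uniform quantizer over [-1, 1] with the given bit width.
def quantize(x, bits):
    levels = 2 ** bits
    q = np.round((x + 1) / 2 * (levels - 1))
    return q / (levels - 1) * 2 - 1

t = np.linspace(0, 1, 1000)
carrier = 0.8 * np.sin(2 * np.pi * 5 * t)           # dominant signal
fine_detail = 0.02 * np.sin(2 * np.pi * 90 * t)     # small "Doppler-like" ripple
x = carrier + fine_detail

errs = {}
for bits in (4, 8):
    err = x - quantize(x, bits)
    errs[bits] = np.sqrt(np.mean(err ** 2))
    print(f"{bits}-bit RMS quantization error: {errs[bits]:.4f}")
```

With 4 bits the quantization step over [-1, 1] is about 0.13, so the RMS error exceeds the 0.02 ripple and the fine modulation is buried; with 8 bits the error is well below it. That is the shape of the argument for preferring higher activation precision when the signal of interest is a small modulation.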
 
Last edited:
  • Love
  • Like
  • Wow
Reactions: 8 users

7für7

Top 20
Off topic lol but FF is jealous because rayz got more than 70 likes haha

Fun Popcorn GIF
 
Last edited:
  • Haha
Reactions: 1 users

Diogenese

Top 20
Thinking a bit more about it, the see-in-the-dark radar application does not use Doppler modulation, because not everything vibrates.

So it is just the high precision radar TX/RX and a high definition object model that are needed, ie, the radar RX provides high precision information from the reflected signal, but does not need to demodulate the Doppler information and is independent of whether or not the reflected signal includes Doppler modulation. Akida's ultra low latency enables this information to be processed for inference/classification. With hindsight, I guess that any Doppler information present can be processed in the same manner. If this is correct, there would be no need for a separate Doppler frequency demodulation step.
Akida 3's new solid state interconnection mesh (replacing packet switched) will have even lower latency and lower power usage, as well as higher precision.
 
  • Like
  • Fire
  • Wow
Reactions: 17 users

Diogenese

Top 20
Akida 3's new solid state interconnection mesh (replacing packet switched) will have even lower latency and lower power usage, as well as higher precision.
I'd like to see BRN Marketing produce an inclusive product brochure with a table which explains the relative advantages of each chip/IP in our product portfolio, and also the future capabilities of the designs in the pipeline:

Akida 1 - Low latency, low power, 1 to 4 bit precision, applications, $ range, ...

Akida 1500 - Low latency, low power, 1 to 4 bit precision, applications, $ range, ...

Pico - Ultra-low power, applications, $ range, ...

Akida 2 - Low latency, low power, 1 to 8 bit precision, applications, $ range, ...

Akida 3 (in progress) - TENNs, ultra-low latency, ultra-low power, 16-bit/FP32 precision, applications, $ range, ...

Gen AI (in progress) - TENNs SLM, RAG, applications, $ range, ...

Maybe some comparative performance graphs?

Something to show customers and potential investors at a glance both the current capabilities and future potential.
 
  • Like
  • Love
  • Fire
Reactions: 20 users
Thinking a bit more about it, the see-in-the-dark radar application does not use Doppler modulation, because not everything vibrates.

So it is just the high precision radar TX/RX and a high definition object model that are needed, ie, the radar RX provides high precision information from the reflected signal, but does not need to demodulate the Doppler information and is independent of whether or not the reflected signal includes Doppler modulation. Akida's ultra low latency enables this information to be processed for inference/classification. With hindsight, I guess that any Doppler information present can be processed in the same manner. If this is correct, there would be no need for a separate Doppler frequency demodulation step.
Glad you corrected yourself Dio, 'cause those were my thoughts too 😂
 
  • Haha
  • Like
Reactions: 8 users
I'd like to see BRN Marketing produce an inclusive product brochure with a table which explains the relative advantages of each chip/IP in our product portfolio, and also the future capabilities of the designs in the pipeline:

Akida 1 - Low latency, low power, 1 to 4 bit precision, applications, $ range, ...

Akida 1500 - Low latency, low power, 1 to 4 bit precision, applications, $ range, ...

Pico - Ultra-low power, applications, $ range, ...

Akida 2 - Low latency, low power, 1 to 8 bit precision, applications, $ range, ...

Akida 3 (in progress) - TENNs, ultra-low latency, ultra-low power, 16-bit/FP32 precision, applications, $ range, ...

Gen AI (in progress) - TENNs SLM, RAG, applications, $ range, ...

Maybe some comparative performance graphs?

Something to show customers and potential investors at a glance both the current capabilities and future potential.
I hope you don't mind, I forwarded this to IR.
 
  • Like
  • Fire
  • Love
Reactions: 9 users
Any hint on what the “Watch us Now” statement was about?
 

manny100

Top 20
  • Fire
  • Like
  • Love
Reactions: 13 users
?


✅ November 2025: Sony Semiconductor Solutions released an upgraded Akida neuromorphic processor via collaboration with BrainChip, optimized for event-based vision in autonomous vehicles and robotics. This R&D advancement supports asynchronous processing for 50% latency reduction in dynamic scenes, aligning with Japan's mobility initiatives.
 
  • Thinking
  • Wow
  • Fire
Reactions: 11 users

jrp173

Regular
?


✅ November 2025: Sony Semiconductor Solutions released an upgraded Akida neuromorphic processor via collaboration with BrainChip, optimized for event-based vision in autonomous vehicles and robotics. This R&D advancement supports asynchronous processing for 50% latency reduction in dynamic scenes, aligning with Japan's mobility initiatives.

You are seriously asking TSE posters what they think about this BS??

Why don't you email the author and ask him where he got his bullshit from, instead of posting unsubstantiated nonsense on here.

You could have saved us from this crap....

Seriously, a website called "Open PR"....



Here's his details.

Contact Us -

Company Name: DataM Intelligence
Contact Person: Sai Kiran
Email: Sai.k@datamintelligence.com
Phone: +1 877 441 4866
Website: https://www.datamintelligence.com
 
Last edited:
  • Like
  • Haha
Reactions: 2 users

manny100

Top 20
I asked 'chat' to run a table showing possible revenue from different types of Neuromorphic Edge uses. Pretty much what is expected but good to see in a table.
Defense is the start, but not the endgame.

"Here’s a clear table showing the revenue potential of each market."

Revenue Potential Comparison

Market | Volume | Price per Chip | Revenue Potential | Notes
Defense (radar, EW, ISR, drones) | Low–medium | High margin | Millions | Long sales cycles, small batches, but stable
Defense wearables (soldier systems) | Medium | Medium | Tens of millions | Growing interest in all-weather navigation aids
Industrial robotics | Medium–high | Medium | Hundreds of millions | Warehouses, mining, logistics, agriculture
Consumer robotics | Very high | Low–medium | Billions | Home robots, drones, personal assistants
Assistive devices (vision-impaired) | Medium | Medium | Hundreds of millions | Strong social impact, regulatory hurdles
Smartphones / AR glasses | Extremely high | Low | Billions+ | If radar-AI becomes a standard sensor
Automotive (ADAS, autonomous) | Extremely high | Medium | Billions+ | Radar + neuromorphic AI is a perfect fit
 
  • Like
  • Fire
  • Haha
Reactions: 11 users

Rach2512

Regular
ARQUIMEA

 
  • Like
  • Love
  • Fire
Reactions: 10 users