BRN Discussion Ongoing

White Horse

Regular
I could give you a reasonable explanation for every single point – why I react the way I do and what I write. But there's no real point, because you clearly twist things into whatever suits you – taking comments out of context just to make me look bad. Edit: you still don't know about the "show ignored content" function.

You also seem to have no understanding of normal human behavior: sometimes people post emojis just for fun, and sometimes, after doing more research, they realise that a previous post wasn’t 100% accurate and then start questioning it. It’s completely normal to later come to a different conclusion and for earlier posts to become irrelevant.

Instead, you’re busy tracking what I post in other forums just to try and expose me here. You can keep trying if you like – I honestly don’t care. In the end, you’re only making yourself look more ridiculous, even though you actually had the potential to be taken more seriously by some people.

I am also requesting that you delete that post and stop this stalking behaviour.
If you tag me, mention me again, or repost my content here without my consent, I will take action – fully in line with the terms and conditions and usage policies of this forum.
You are a two-faced clown. And a hypocrite.
 
  • Like
  • Fire
  • Love
Reactions: 5 users

Cardpro

Regular
Argh... I'm so frustrated with the recent progress... and the capital raise... They said "Watch us now," but then went ahead with a CR when the share price is already so low… It makes me wonder whether they actually have any deals lined up or whether there will be any positive announcements coming soon... I am guessing no... Sean's $9M revenue target is starting to feel like complete BS at this point… ;(

IMO only...
 
  • Like
  • Fire
Reactions: 7 users

7für7

Top 20
I see Olivier has recently provided some comments on a post by a freelancer named Artiom, as per the link.

Olivier talks about some tech details I found interesting (my bold). Also, when you go to Olivier's LinkedIn, it says he's still working for BRN? I thought he had moved on, or am I mistaken about something?




Olivier Coenen
From theoretical models to chips: I design SOTA spatiotemporal NNs (PLEIADES, TENN (Akida 2+), SCORPIUS) that train like CNNs and run efficiently at the edge like RNNs.
6d

The Austrian Institute of Technology (AIT), where Christoph Posch developed the ATIS sensor, developed a linear event-based visual array about 3(?) pixels wide to scan objects on a conveyor belt moving really fast. We used the technique presented here to categorize moving grocery items and read their barcodes when put in a shopping basket. The difficulty was the effective temporal resolution, only about 0.1 ms at best, far from the ns resolution of the timestamps. So we needed better techniques, dynamical NNs, to extract all the info and compensate for the time-dependent response of a pixel: if it fired recently, it is less likely to fire again for the same contrast. We didn't have that NN then, and I think I have it now with PLEIADES and successors.
Olivier Coenen
From theoretical models to chips: I design SOTA spatiotemporal NNs (PLEIADES, TENN (Akida 2+), SCORPIUS) that train like CNNs and run efficiently at the edge like RNNs.
6d

The increased resolution brings out another main advantage of event-based vision sensors: one can eventually see/resolve objects that one couldn't at the same frame-based resolution. We could still track drones flying in the sky that were 1/10 the size of a single pixel, for example. Try that with a frame-based camera. We could generate Gigapixel-resolution images with a DAVIS240 (240 x 180) of our surroundings while driving a vehicle offroad on a bumpy ride.
Vincenzo Polizzi
PhD Candidate at University of Toronto | SLAM, Machine Learning, Robotics
3d

This is a great application, very nicely explained! One interesting aspect visible in your results is that edges parallel to the motion direction are much less represented, simply because they don't generate many events. A way to address this is to excite pixels from multiple directions, ensuring that all edges trigger events. This is exactly what we explored in a class project that eventually evolved into our paper VibES: we mounted the event camera on a small mechanical platform that oscillates in both X and Y, essentially a tiny "washing-machine-like" motion 😄. By injecting a known motion pattern (and estimating the sinusoidal components online), we obtained dense event streams and significantly sharper reconstructions, even for edges that would otherwise remain silent. I'd be glad to hear your thoughts!
Olivier Coenen
From theoretical models to chips: I design SOTA spatiotemporal NNs (PLEIADES, TENN (Akida 2+), SCORPIUS) that train like CNNs and run efficiently at the edge like RNNs.
6d

This brings out another point: a single cone/rod in our retinas doesn't have the angular resolution to clearly see stars in the night sky; our eyes just don't have the necessary resolution, yet we can clearly see stars. How is that possible? Eye movement and spike timing. Within a certain spatial resolution, spikes fire at the same spatial location when our eyes move. The temporal resolution of the spikes translates to the spatial resolution we can achieve when the spikes accumulate around the same spatial location where the star is. Be drunk and the stars should appear thicker, or will appear to move, because the temporal spike alignment is off. Thus, what we see is clearly a result of the neural processing, not just the inputs: the neural networks processing the spike firing. That's the neural processing we need to take full advantage of event-based (vision) sensors, without having to resort to the processing of periodic sampling of frames that the engineering world is stuck in today.
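
To make the spike-timing/motion idea above concrete, here is a minimal Python sketch of motion-compensated event accumulation: undo the known sensor motion at each event's timestamp, then bin the event onto a finer grid, which is how a known oscillation (VibES-style) or eye-movement-like jitter can resolve sub-pixel structure. The event format, motion model and numbers are my own assumptions for illustration, not Olivier's or BrainChip's actual pipeline.

import numpy as np

def accumulate_events(events, motion, scale=4, shape=(180, 240)):
    """Accumulate events onto a scale-times-finer grid, compensating known motion.

    events: iterable of (t, x, y, polarity), e.g. from a DAVIS240 (240 x 180)
    motion: callable t -> (dx, dy), sensor displacement in pixels at time t
            (a known injected oscillation, or estimated online as in VibES)
    """
    h, w = shape
    img = np.zeros((h * scale, w * scale))
    for t, x, y, p in events:
        dx, dy = motion(t)
        # undo the induced motion, then place the event on the fine grid
        u = int(round((x - dx) * scale))
        v = int(round((y - dy) * scale))
        if 0 <= v < h * scale and 0 <= u < w * scale:
            img[v, u] += 1 if p > 0 else -1
    return img

# Toy usage: a known sinusoidal "washing-machine" oscillation in X and Y
A, f = 1.5, 50.0  # amplitude in pixels, frequency in Hz (illustrative values)
motion = lambda t: (A * np.sin(2 * np.pi * f * t), A * np.cos(2 * np.pi * f * t))
events = [(0.0010, 120, 90, 1), (0.0012, 121, 90, -1)]  # (t in s, x, y, polarity)
fine_image = accumulate_events(events, motion)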
 
  • Like
  • Fire
Reactions: 12 users

GStocks123

Regular

Attachments

  • IMG_5035.png
    IMG_5035.png
    273 KB · Views: 64
  • Like
  • Fire
  • Love
Reactions: 18 users

Guzzi62

Regular
  • Like
  • Fire
Reactions: 9 users

Diogenese

Top 20
That was a very long but interesting read.

I recommend all BRN investors read it; the guy chatting with GPT is clearly very knowledgeable and asks hard questions.

I have to read it again and let it sink in.
Hi Guzzi,

About halfway through - it's quite repetitive, so I've probably glossed over some of the nuances.

Chatty suggests that a GPU is better at classifying several items in a field of view. Is this just a question of Akida not having a sufficiently developed model library, or is it based on processing power, remembering that Akida can come in herds?

On evolution - When a brave neuron is being attacked, it does not want to be hampered by a heavy MAC - fight or flight.

I was going to say something profound, but had to take a toilet break and lost the thread (not an image to conjure with).

We are told the yet-to-be-disclosed patent application achieves 32-bit accuracy with 4 bits.

Given that Akida is moving to FP32 in the future, will that give it CPU/GPU capabilities? Of course, the software support becomes critical.
 
  • Like
  • Fire
  • Haha
Reactions: 11 users

Esq.111

Fascinatingly Intuitive.
Good Morning Chippers,

Just saw this over at the other site, unsure of any connection to BRN, though possibly.

1764018340561.png


Regards,
Esq.
 
  • Like
  • Thinking
  • Fire
Reactions: 11 users

Mccabe84

Regular
So did management sell shares through the raise to JP Morgan, which then used those shares to pay back the shorts they had out, or am I totally misreading everything?
 
  • Thinking
  • Sad
Reactions: 2 users

HopalongPetrovski

I'm Spartacus!
So did management sell shares through the raise to JP Morgan, which then used those shares to pay back the shorts they had out, or am I totally misreading everything?
From what I can understand, over the period from 18 July through to 17 November they bought 120 million odd shares, which meant they had just over 5% voting power and so were mandated to announce.
They have since sold a bit over 10 million, leaving them with a balance of 109,736,076 as at the end of 20 November and so, slipping back under the 5% threshold, they have to inform the market.
God only knows what they will do with their remaining substantial stake, but whilst it remains below a reportable figure they can resume operating in relative anonymity.
About 8 million of the shares they have relinquished are classed as borrow return.
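
As a back-of-envelope check using only the two figures above (nothing official), the 5% threshold brackets the total shares on issue:

# Both notices together bracket BRN's total issue via the 5% threshold
over_5pct = 120_000_000    # holding reported as just over 5%
under_5pct = 109_736_076   # balance reported as just under 5%

lo_total = under_5pct / 0.05  # total must exceed this for 109.7M to be < 5%
hi_total = over_5pct / 0.05   # and sit below this for ~120M to be > 5%
print(f"Implied shares on issue: {lo_total/1e9:.2f}B to {hi_total/1e9:.2f}B")
# -> roughly 2.19B to 2.40B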
 
  • Like
  • Fire
Reactions: 13 users

Mccabe84

Regular
From what I can understand, over the period from 18 July through to 17 November they bought 120 million odd shares, which meant they had just over 5% voting power and so were mandated to announce.
They have since sold a bit over 10 million, leaving them with a balance of 109,736,076 as at the end of 20 November and so, slipping back under the 5% threshold, they have to inform the market.
God only knows what they will do with their remaining substantial stake, but whilst it remains below a reportable figure they can resume operating in relative anonymity.
About 8 million of the shares they have relinquished are classed as borrow return.
Thanks for your thoughts, I'm just a novice long-term holder and don't fully understand everything.
 
  • Like
Reactions: 4 users

Adika

Regular
From what I can understand, over the period from 18 July through to 17 November they bought 120 million odd shares, which meant they had just over 5% voting power and so were mandated to announce.
They have since sold a bit over 10 million, leaving them with a balance of 109,736,076 as at the end of 20 November and so, slipping back under the 5% threshold, they have to inform the market.
God only knows what they will do with their remaining substantial stake, but whilst it remains below a reportable figure they can resume operating in relative anonymity.
About 8 million of the shares they have relinquished are classed as borrow return.
Hi Hoppy,

I was looking at this before trading opened today.

Have we just cap raised, providing shares to an institution that then sold the shares to a company that has been actively shorting BRN?

As of 12 Nov short positions were 137mill.


Screenshot 2025-11-25 at 10.10.58 am.png



Shares traded either side of 17th total 45mill.

1764027701320.png



Cap raise was 200mill and shares were issued on 17 Nov at 17.5 cents a share.


Screenshot 2025-11-25 at 10.12.51 am.png


JP Morgan Chase became a substantial holder on 17 Nov due to this purchase of 46mill shares at 18 cents per share.

Screenshot 2025-11-25 at 10.12.21 am.png


As of 17 Nov short positions were 91mill, a difference of 46mill from 12th Nov.

Screenshot 2025-11-25 at 10.11.16 am.png
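
Putting the figures above side by side (all taken from the screenshots; the match is suggestive, not proof of who sold to whom):

# All figures in millions of shares, from the screenshots above
shorts_12_nov = 137
shorts_17_nov = 91
jpm_purchase = 46
cr_price, jpm_price = 0.175, 0.18  # dollars per share

covered = shorts_12_nov - shorts_17_nov
print(covered, covered == jpm_purchase)  # 46 True: drop in shorts equals JPM's buy
print(f"price gap: {(jpm_price - cr_price) * 100:.1f}c per share")  # 0.5c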
 
  • Like
  • Fire
  • Wow
Reactions: 15 users

HopalongPetrovski

I'm Spartacus!
Hi Hoppy,

I was looking at this before trading opened today.

Have we just cap raised, providing shares to an institution that then sold the shares to a company that has been actively shorting BRN?

As of 12 Nov short positions were 137mill.


View attachment 93300


Shares traded either side of 17th total 45mill.

View attachment 93304


Cap raise was 200mill and shares were issued on 17 Nov at 17.5 cents a share.


View attachment 93303

JP Morgan Chase became a substantial holder on 17 Nov due to this purchase of 46mill shares at 18 cents per share.

View attachment 93302

As of 17 Nov short positions were 91mill, a difference of 46mill from 12th Nov.

View attachment 93305
Hi Adika.
I really don't have the relevant resources, connections or inclination to accurately and forensically account for the intricate details of the dealings of an institution such as JP Morgan.
However, I understand that institutions such as they, collectively known as the BEOT, regularly hedge (short), employ bots, encourage FUDsters, and do their damnedest to manipulate markets and market behaviour in an effort to maximise their profits, so I wouldn't be surprised if you were right.
Once BrainChip, or any other company for that matter, creates and then sells shares, it loses control over what happens to them or into whose hands they fall.
Unfortunately, whilst the company is confined to these methods of raising capital in order to continue its existence, we will continue to be subject to this potentially negative behaviour, as has been the case over the past 2 or 3 years since we came to the attention of the predators with our inclusion in the ASX indices.
 
  • Like
  • Fire
Reactions: 11 users

Xray1

Regular
Argh... I'm so frustrated with the recent progress... and the capital raise... They said "Watch us now," but then went ahead with a CR when the share price is already so low… It makes me wonder whether they actually have any deals lined up or whether there will be any positive announcements coming soon... I am guessing no... Sean's $9M revenue target is starting to feel like complete BS at this point… ;(

IMO only...
The "FLUFF" word comes to my mind when the words "Watch Us Now" are used
 
  • Like
Reactions: 5 users

Diogenese

Top 20
Hufffff ... huffff ... thump ... thump ... testing! ... testing!


_._.
__._

_._.
__._

_._.
__._
_.. .
 
  • Haha
  • Like
Reactions: 4 users
FF

If you dig into NaNose Medical you will find out a number of things about Brainchip's first engagement with this company, dating back to 2020.

1. The initial engagement with Brainchip was for the purpose of analysing the Volatile Organic Compound profile of Covid-19 samples gathered by Chinese researchers on behalf of NaNose Medical (Diagnose at that time), using AKD1000 at the Brainchip Research Centre in Perth, Western Australia.

2. It was proven that AKIDA technology's ability to recognise patterns allowed it to excel; if memory now serves me, it initially achieved results of greater than 94% accuracy, which was state of the art, and as the trials progressed that accuracy increased to (again from memory) greater than 98%.

3. Brainchip's AKIDA technology has since been proven across numerous fields, including cyber security, showing it excels at pattern matching and identification tasks.

4. NaNose Medical was initially spun out of Technion in Israel, where the inventor of their nano sensor array, Professor Haick, was and still is employed.

5. Brainchip partnered with Cornell Tech which in turn is partnered with Technion under the Technion-Cornell Institute.

6. The purpose of NaNose Medical and Professor Haick has always been to develop a portable, non-invasive device; prior to successfully partnering with Brainchip, they had worked unsuccessfully for some years with Siemens Medical to this end. Professor Haick's ultimate ambition was to develop a low-cost handheld device to diagnose a range of diseases based on samples of breath and the Volatile Organic Compounds contained therein. Professor Haick was aiming for each test to cost between $2.00 and $3.00.

7. In all of the material presented on NaNose Medical's current website, they refer to their nanoparticle sensors and their algorithm but do not identify the compute upon which the algorithm runs. They do confirm that they train the nanoparticle sensors to recognise specific VOCs and convert same to electrical impulses that are then interpreted by their algorithm. They do, however, reference pattern matching many times.

8. This interview from March, 2025 taken from the NaNose Medical website is the most recent information concerning the development of their device:
https://www.linkedin.com/posts/nanose-medical_ilay-marom-of-nanose-medical-activity-7312802439059980288-hSt6/?utm_source=share&utm_medium=member_desktop&rcm=ACoAADbuxfYBCB0Ad15QrKdpKbGWKn9sob5MzLc

It is impossible to say either way whether they are using Brainchip's AKIDA technology. However, it would seem strange, at least to me, that having achieved state-of-the-art performance with Brainchip's technology, which has since undergone continuous improvement making it even more efficient and intelligent, they would have thrown out something that worked to state-of-the-art accuracy.

It also seems strange to me that, if they were using something as mainstream as, say, an Nvidia Jetson Nano or an ARM-based semiconductor, they would not happily disclose it, since their secret sauce is their nanoparticle sensor array and algorithm; I would have expected them to want to crow about how portable it was across any computing technology.

My opinion only DYOR

Fact Finder
 
  • Like
  • Fire
Reactions: 11 users
Ongoing Technical Integration in Core IP: NaNose's US Patent 20230152319A1 ("Device and Method for Rapid Detection of Viruses," filed April 2021, published May 2023) explicitly credits Brainchip for SNN analysis in VOC detection from breath samples. It describes:
  • Nanoparticle sensor arrays (gold nanoparticles with ligands like dodecanethiol) capturing VOC changes as electrical resistance signals.
  • AI algorithms (e.g., discriminant function analysis, linear discriminant analysis) processing these into "VOC prints" for classifying infections (e.g., COVID-19 vs. healthy/controls, 90-100% sensitivity/specificity).
  • SNN via Brainchip: achieved 90.5% accuracy (95% sensitivity, 87.87% specificity) on multi-use sensor data, outperforming other methods like LDA (86.5%).
The patent implies Brainchip's hardware for efficient, edge-based SNN processing, tying directly to Akida's neuromorphic strengths in bio-inspired pattern recognition.
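
For anyone wanting to check how those three percentages relate, the standard confusion-matrix definitions are below. The example counts are hypothetical, chosen only because they roughly reproduce the quoted rates; they are not the patent's raw data.

def metrics(tp, fn, fp, tn):
    """Sensitivity, specificity and accuracy from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)                # true positive rate
    specificity = tn / (tn + fp)                # true negative rate
    accuracy = (tp + tn) / (tp + fn + fp + tn)
    return sensitivity, specificity, accuracy

# Hypothetical counts (NOT from the patent) that roughly match the quoted rates
sens, spec, acc = metrics(tp=19, fn=1, fp=4, tn=29)
print(f"{sens:.1%} sensitivity, {spec:.2%} specificity, {acc:.1%} accuracy")
# -> 95.0% sensitivity, 87.88% specificity, 90.6% accuracy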
 
  • Fire
  • Love
  • Like
Reactions: 4 users

manny100

Top 20
Trump has spoken.
Very positive for AI.
Maybe a reason Tapson had a meeting with senator/s connected to appropriations and briefed House staff.
"Today, America is in a race for global technology dominance in the development of artificial intelligence (AI), an important frontier of scientific discovery and economic growth."
Perhaps connected with Sean's 'watch us now' statement?
 
  • Like
  • Love
Reactions: 5 users

Diogenese

Top 20
Ongoing Technical Integration in Core IP: NaNose's US Patent 20230152319A1 ("Device and Method for Rapid Detection of Viruses," filed April 2021, published May 2023) explicitly credits Brainchip for SNN analysis in VOC detection from breath samples. It describes:
  • Nanoparticle sensor arrays (gold nanoparticles with ligands like dodecanethiol) capturing VOC changes as electrical resistance signals.
  • AI algorithms (e.g., discriminant function analysis, linear discriminant analysis) processing these into "VOC prints" for classifying infections (e.g., COVID-19 vs. healthy/controls, 90-100% sensitivity/specificity).
  • SNN via Brainchip: achieved 90.5% accuracy (95% sensitivity, 87.87% specificity) on multi-use sensor data, outperforming other methods like LDA (86.5%).
The patent implies Brainchip's hardware for efficient, edge-based SNN processing, tying directly to Akida's neuromorphic strengths in bio-inspired pattern recognition.
Hi SS18,

Good pickup.

The patent application has a priority date of 21 April 2020 and was published in 2023.

The reference to BRN is to the original database test, so this does not prove an ongoing relationship.

The patent also discusses discarding "erroneous" sensor readings:

1764060639112.png



[0120] The first dataset was collected with the first-generation device with single use units that include the sensors of the invention. The dataset included subjects tested with the device at two sites: 35 samples from Northwell N.Y., and 31 samples from Shamir medical center IL. Each test file consisted of responses from a duplicated sensor array, and therefore each test file was split into two sample files, based on the sensor sets. Some of the sensors failed to respond, and therefore datasets that included failed sensors were discarded. The total number of sample files that were analyzed after the error-prone samples were discarded is: Northwell—35 sample files (representing 24 tested subjects—17 positives, 7 negatives) and Shamir medical center—31 sample files (representing 21 tested subjects—14 positives, 7 negatives). The data was analyzed by Brainchip with a Spiking Neural Network, the adjacent confusion matrix shows the results on the test set.
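
As a reading aid, the split-and-discard step described in [0120] amounts to something like this sketch (field names are illustrative, not from the patent):

def split_and_filter(test_files):
    """Split each test file into its two duplicated sensor-array samples,
    discarding any sample that contains a failed (non-responding) sensor."""
    samples = []
    for f in test_files:
        for array in (f["set_a"], f["set_b"]):      # the duplicated sensor sets
            if all(reading is not None for reading in array):
                samples.append(array)               # keep fully-responding sets only
    return samples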
 
  • Like
Reactions: 7 users

Diogenese

Top 20
Trump has spoken.
Very positive for AI.
Maybe a reason Tapson had a meeting with senator/s connected to appropriations and briefed House staff.
"Today, America is in a race for global technology dominance in the development of artificial intelligence (AI), an important frontier of scientific discovery and economic growth."
Perhaps connected with Sean's 'watch us now' statement?
Hi Manny,

The Genesis AI program is under the auspices of the DOE. There is an emphasis on cyber security in Genesis.

https://brainchip.com/brainchip-and-quantum-ventura-partner-to-develop-cyber-threat-detection/

Laguna Hills, Calif. – May 15, 2023 BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, event-based, neuromorphic AI IP, announced today Quantum Ventura Inc., a San Jose-based provider of AI/ML research and technologies, will use BrainChip’s Akida™ technology to develop new cyber threat-detection tools.

In this federally funded phase 2 program, Quantum Ventura is creating state-of-the-art cybersecurity applications for the U.S. Department of Energy under the Small Business Innovation Research (SBIR) Program. The program is focused on "Cyber threat-detection using neuromorphic computing," which aims to develop an advanced approach to detect and prevent cyberattacks on computer networks and critical infrastructure using brain-inspired artificial intelligence.

Who was it who visited the Capitol from BRN a couple of months ago?

https://www.linkedin.com/posts/jona...-i-visited-activity-7358570977691099136-iFIV/

Dr Tapson goes to Washington - I visited several legislators last week to promote our Federal contracting agenda. The first photo shows me at our discussion with Senator Cindy Hyde-Smith, who is on the Senate Committee on Appropriations; other appointments were with Senator Roger Wicker, Chair of the Senate Armed Services Committee, and Representative David Min, our Laguna Hills CA-47 Rep. I also gave a presentation on BrainChip to House staffers in the historic House Majority Hearing Room, which is easily the grandest venue I have ever presented in.

More seriously - the US AI industry is becoming increasingly integrated with Defense and associated Departments in the US Government, and companies such as Anduril and Palantir are showing the way. BrainChip will be part of this integration - I also visited potential partners in the Beltway on this visit. Looking forward to an exciting future! Many thanks to our excellent partners at Spartan Group for organizational and logistic support.

All this ... and TENNs.

"Watch us now!"
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 15 users