BRN Discussion Ongoing

White Horse

Regular
I could give you a reasonable explanation for every single point – why I react the way I do and what I write. But there’s no real point, because you clearly twist things into whatever suits you – taking comments out of context just to make me look bad. Edit: you still don’t know about the “show ignored content” function.

You also seem to have no understanding of normal human behavior: sometimes people post emojis just for fun, and sometimes, after doing more research, they realise that a previous post wasn’t 100% accurate and then start questioning it. It’s completely normal to later come to a different conclusion and for earlier posts to become irrelevant.

Instead, you’re busy tracking what I post in other forums just to try and expose me here. You can keep trying if you like – I honestly don’t care. In the end, you’re only making yourself look more ridiculous, even though you actually had the potential to be taken more seriously by some people.

I am also requesting that you delete that post and stop this stalking behaviour.
If you tag me, mention me again, or repost my content here without my consent, I will take action – fully in line with the terms and conditions and usage policies of this forum.
You are a two-faced clown. And a hypocrite.
 
  • Like
  • Fire
  • Love
Reactions: 4 users

Cardpro

Regular
Argh... I’m so frustrated with the recent progress... and the capital raise... They said “Watch us now,” but then went ahead with a CR when the share price is already so low… It makes me wonder whether they actually have any deals lined up or whether there will be any positive announcements coming soon... I am guessing no... Sean’s $9M revenue target is starting to feel like complete BS at this point… ;(

IMO only...
 
  • Like
  • Fire
Reactions: 5 users

7für7

Top 20
I see Olivier has recently commented on a post by a freelancer named Artiom, as per the link.

Olivier talks about some tech details I found interesting (my bold), and when you go to Olivier's LinkedIn, it says he's working for BRN? I thought he had moved on, or am I mistaken?

Olivier Coenen
From theoretical models to chips: I design SOTA spatiotemporal NNs (PLEIADES, TENN (Akida 2+), SCORPIUS) that train like CNNs and run efficiently at the edge like RNNs.
6d

The Austrian Institute of Technology (AIT), where Christoph Posch developed the ATIS sensor, developed a linear event-based visual array of 3(?) pixels wide to scan objects on a conveyor belt moving really fast. We used the technique presented here to categorize moving grocery items and read their barcodes when put in a shopping basket. The difficulty was the effective temporal resolution, only about 0.1 ms at best, far from the ns resolution of the timestamps. So we needed better techniques, dynamical NNs, to extract all the info and compensate for the time-dependent response of a pixel: if it fired recently, it is less likely to fire again for the same contrast. We didn’t have that NN then, and I think I have it now with PLEIADES and successors.
Like
Reply
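The refractory effect Olivier describes (a pixel that just fired is less likely to fire again for the same contrast, capping the effective temporal resolution well below the timestamp resolution) can be illustrated with a toy model. This is an editor's illustrative sketch, not AIT's or BrainChip's actual pixel model; the threshold and the 0.1 ms dead time are assumed numbers.

```python
import numpy as np

def emit_events(brightness_changes, timestamps_ns, threshold=0.1,
                refractory_ns=100_000):  # 0.1 ms dead time (assumed)
    """Toy event pixel: fire when the accumulated log-intensity change
    crosses the threshold, but stay silent during a refractory window."""
    events = []
    accumulated = 0.0
    last_fire = -np.inf
    for t, dc in zip(timestamps_ns, brightness_changes):
        accumulated += dc
        if abs(accumulated) >= threshold and t - last_fire >= refractory_ns:
            events.append((t, np.sign(accumulated)))
            accumulated = 0.0
            last_fire = t
    return events

# A fast-flickering edge: a threshold-crossing change every 10 us,
# far finer than the pixel's dead time.
ts = np.arange(0, 1_000_000, 10_000)   # 1 ms of 10-us steps (ns units)
changes = np.full(ts.shape, 0.12)
evs = emit_events(changes, ts)
gaps = np.diff([t for t, _ in evs])
# Despite ns-precision timestamps, events end up spaced >= 0.1 ms apart.
print(len(evs), gaps.min())
```

The point of the sketch: the stimulus changes 100 times in the millisecond, but the pixel only reports 10 events, which is why a dynamical NN that models the pixel's time-dependent response can recover more information than the raw event rate suggests.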
Olivier Coenen
6d

The increased resolution brings out another main advantage of event-based vision sensors: one can eventually see/resolve objects that one couldn’t at the same frame-based resolution. We could still track drones flying in the sky that were 1/10 the size of a single pixel, for example. Try that with a frame-based camera. We could generate Gigapixel-resolution images with a DAVIS240 (240 x 180) of our surroundings while driving a vehicle offroad on a bumpy ride.
Like
Reply
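The sub-pixel capability Olivier mentions can be illustrated with a toy 1-D example: an event fires at the moment a moving edge reaches a pixel, so the event's timestamp pins down the edge position far more finely than one pixel. Given known camera motion, re-projecting events onto a finer canvas makes them pile up at the true sub-pixel position. This sketch assumes an idealized linear sweep and made-up numbers; it is not any vendor's actual reconstruction pipeline.

```python
import numpy as np

SENSOR_W, UPSCALE = 24, 10
v = 2.5e-4        # camera sweep speed: pixels per microsecond (assumed)
edge_true = 7.30  # true edge position in pixels (sub-pixel!)

# Each pixel px fires the instant the edge reaches it: edge_true + v*t = px,
# so the timestamp is t = (px - edge_true) / v.
events = [(px, (px - edge_true) / v) for px in range(8, 18)]

# Motion-compensated accumulation onto a 10x-finer canvas.
canvas = np.zeros(SENSOR_W * UPSCALE)
for px, t in events:
    x_comp = px - v * t              # undo the known motion
    canvas[int(round(x_comp * UPSCALE))] += 1

# The peak lands at 7.3, even though the sensor only has 1-px resolution.
print(canvas.argmax() / UPSCALE)
```

The same accumulation-with-motion idea, run continuously while the camera shakes around, is what lets a low-resolution event sensor build up a much higher-resolution picture of a static scene.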
Vincenzo Polizzi
PhD Candidate at University of Toronto | SLAM, Machine Learning, Robotics
3d

This is a great application, very nicely explained! One interesting aspect visible in your results is that edges parallel to the motion direction are much less represented, simply because they don’t generate many events. A way to address this is to excite pixels from multiple directions, ensuring that all edges trigger events. This is exactly what we explored in a class project that eventually evolved into our paper VibES: we mounted the event camera on a small mechanical platform that oscillates in both X and Y, essentially a tiny “washing-machine-like” motion 😄. By injecting a known motion pattern (and estimating the sinusoidal components online), we obtained dense event streams and significantly sharper reconstructions, even for edges that would otherwise remain silent. Glad to hear your thoughts!
Like
Reply
1 Reaction
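Vincenzo's idea of injecting a known oscillation and estimating its sinusoidal components online can be sketched in a few lines: since a * sin(wt) + b * cos(wt) + c is linear in (a, b, c), a least-squares fit at the known frequency recovers the motion, which can then be subtracted from the event coordinates. A minimal sketch under assumed numbers and a single axis; the actual VibES estimator may well differ.

```python
import numpy as np

rng = np.random.default_rng(1)
f = 30.0                                  # platform oscillation, Hz (assumed)
t = rng.uniform(0, 1, 5000)               # event timestamps over 1 s
true_amp, true_phase = 3.0, 0.7           # ground-truth motion (pixels, rad)
x_motion = true_amp * np.sin(2 * np.pi * f * t + true_phase)
x_events = 12.0 + x_motion + rng.normal(0, 0.05, t.size)  # scene edge at x=12

# x = a*sin(wt) + b*cos(wt) + c is linear in (a, b, c): ordinary least squares.
w = 2 * np.pi * f
A = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
(a, b, c), *_ = np.linalg.lstsq(A, x_events, rcond=None)

amp_est = np.hypot(a, b)                  # recovered amplitude, close to 3.0
x_stabilized = x_events - (a * np.sin(w * t) + b * np.cos(w * t))
print(round(amp_est, 2), round(x_stabilized.std(), 3))
```

After subtraction, the residual scatter is down at the noise level, which is the "sharper reconstruction" part of the trick: the injected motion generates events on all edges, and knowing the motion lets you undo it.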
Olivier Coenen
6d

This brings out another point: single cones/rods in our retinas don’t have the angular resolution to clearly see stars in the night sky, our eyes just don’t have the necessary resolution, yet we can clearly see stars. How is that possible? Eye movement and spike timing. Within a certain spatial resolution, spikes fire at the same spatial location as our eyes move. The temporal resolution of the spikes translates into the spatial resolution that we can achieve when the spikes accumulate around the same spatial location where the star is. Be drunk and the stars should appear thicker, or will appear to move, because the temporal spike alignment is off. Thus, what we see is clearly a result of the neural processing, not just the inputs: the neural network processing the spike firing. That’s the neural processing we need to take full advantage of event-based (vision) sensors, without having to resort to the processing of periodic sampling of frames that the engineering world is stuck in today.
 
  • Like
  • Fire
Reactions: 11 users

GStocks123

Regular

Attachments

  • IMG_5035.png
    273 KB · Views: 44
  • Like
  • Fire
Reactions: 16 users

Guzzi62

Regular
  • Like
  • Fire
Reactions: 8 users

Diogenese

Top 20
That was a very long but interesting read.

I recommend all BRN investors read it; the guy chatting with GPT is clearly very knowledgeable and asks hard questions.

I have to read it again and let it sink in.
Hi Guzzi,

About half way through - it's quite repetitive, so I've probably glossed over some of the nuances.

Chatty suggests that a GPU is better at classifying several items in a field of view. Is this just a question of Akida not having a sufficiently developed model library, or is it based on processing power, remembering that Akida can come in herds?

On evolution - When a brave neuron is being attacked, it does not want to be hampered by a heavy MAC - fight or flight.

I was going to say something profound, but had to take a toilet break and lost the thread (not an image to conjure with).

We are told the yet-to-be-disclosed patent application does 32-bit accuracy with 4-bits.

Given that Akida is moving to FP32 in the future, will that give it CPU/GPU capabilities? Of course, the software support becomes critical.
 
  • Like
  • Fire
  • Haha
Reactions: 10 users

Esq.111

Fascinatingly Intuitive.
Good Morning Chippers ,

Just saw this over at the other site, unsure of any connection to BRN, though possibly.

1764018340561.png


Regards,
Esq.
 
  • Like
  • Thinking
  • Fire
Reactions: 9 users

Mccabe84

Regular
So did management sell shares through the raise to JP Morgan, which then used those shares to pay back the shorts they had out, or am I totally misreading everything?
 
  • Thinking
  • Sad
Reactions: 2 users

HopalongPetrovski

I'm Spartacus!
So did management sell shares through the raise to JP Morgan, which then used those shares to pay back the shorts they had out, or am I totally misreading everything?
From what I can understand, over the period from 18 July through to 17 November they bought 120 million odd which meant they had just over 5% voting power and so were mandated to announce.
They have since sold a bit over 10 million, leaving them with a balance of 109,736,076 as at end of 20 November and so, having slipped back under the 5% threshold, they have to inform the market.
God only knows what they will do with their remaining substantial stake but whilst it remains below a reportable figure they can resume operating in relative anonymity.
About 8 million of the shares they have relinquished are classed as borrow return.
 
  • Like
  • Fire
Reactions: 5 users

Mccabe84

Regular
From what I can understand, over the period from 18 July through to 17 November they bought 120 million odd which meant they had just over 5% voting power and so were mandated to announce.
They have since sold a bit over 10 million, leaving them with a balance of 109,736,076 as at end of 20 November and so, having slipped back under the 5% threshold, they have to inform the market.
God only knows what they will do with their remaining substantial stake but whilst it remains below a reportable figure they can resume operating in relative anonymity.
About 8 million of the shares they have relinquished are classed as borrow return.
Thanks for your thoughts, I'm just a novice long-term holder and don't fully understand everything.
 
  • Like
Reactions: 3 users

Adika

Regular
From what I can understand, over the period from 18 July through to 17 November they bought 120 million odd which meant they had just over 5% voting power and so were mandated to announce.
They have since sold a bit over 10 million, leaving them with a balance of 109,736,076 as at end of 20 November and so, having slipped back under the 5% threshold, they have to inform the market.
God only knows what they will do with their remaining substantial stake but whilst it remains below a reportable figure they can resume operating in relative anonymity.
About 8 million of the shares they have relinquished are classed as borrow return.
Hi Hoppy,

I was looking at this before trading opened today.

Have we just done a cap raise, providing shares to an institution that then sold them on to a company that has been actively shorting BRN?

As of 12 Nov short positions were 137mill.


Screenshot 2025-11-25 at 10.10.58 am.png



Shares traded either side of the 17th total 45 mill.

1764027701320.png



Cap raise was 200mill and shares were issued on 17 Nov at 17.5 cents a share.


Screenshot 2025-11-25 at 10.12.51 am.png


JP Morgan Chase became a substantial holder on 17 Nov due to this purchase of 46mill shares at 18 cents per share.

Screenshot 2025-11-25 at 10.12.21 am.png


As of 17 Nov short positions were 91mill, a difference of 46mill from 12th Nov.

Screenshot 2025-11-25 at 10.11.16 am.png
 
  • Fire
  • Wow
Reactions: 2 users