BRN Discussion Ongoing


2024 production models yum yum .. nothing about Akida in the article but a fair assumption
 
  • Like
  • Fire
Reactions: 3 users

BaconLover

Founding Member
No clue what the revenue would be. However I know two things.

1) Holders will be happy with whatever it is, because those who know, know about revenue coming in the second half of this year/early next year.

2) Non-holders will be unhappy and whinging.
 
  • Like
  • Love
  • Haha
Reactions: 30 users
It's a cold Saturday night here in Melbourne. At a guess, how much revenue do you think we'd see in the 4C in a couple of days?
(just for fun guys.....)
I'd say $5mil+
More than 2.5. Less than 5
 
  • Like
  • Fire
Reactions: 7 users

Deadpool

hyper-efficient Ai
  • Haha
  • Like
  • Fire
Reactions: 4 users

Slade

Top 20
Don't you mean China will do its best to STEAL the intel of Akida Technology rather than adopt it?

No doubt they will try.
 
  • Like
  • Thinking
Reactions: 6 users

Slade

Top 20
Designing Smarter and Safer Cars with Essential AI

Conventional AI silicon and cloud-centric inference models do not perform efficiently at the automotive edge. As many semiconductor companies have already realized, latency and power are two primary issues that must be effectively addressed before the automotive industry can manufacture a new generation of smarter and safer cars. To meet consumer expectations, these vehicles need to feature highly personalized and responsive in-cabin systems while supporting advanced assisted driving capabilities.

In this paper, we discuss how automotive companies are redefining the in-cabin experience and accelerating assisted driving capabilities by untethering edge AI functions from the cloud, and performing distributed inference computation on local neuromorphic silicon using BrainChip's AKIDA. We call this Essential AI. This model sequentially leverages multiple AKIDA-powered smart sensors and AKIDA AI SoC accelerators to efficiently capture and analyze inference data within designated regions of interest or ROI (samples within a data set that provide value for inference).

AI inference at the automotive edge starts with versatile AKIDA-powered smart sensors capturing entire input images and complete audio streams in real time. The raw data passes to AKIDA AI accelerators embedded on smart sensors, which analyze and infer meaningful information from specific regions of interest.

Really feel that this post kind of got lost amongst other posts. I would like to encourage everyone to click the link and read the white paper. I believe it's new and I think it will blow your socks off. It's a must read.
 
  • Like
  • Fire
  • Love
Reactions: 42 users

equanimous

Norse clairvoyant shapeshifter goddess
It's better to have lower expectations and be surprised than to have higher expectations and be disappointed.

The financial markets just had one of the worst half-years in financial history, so I wouldn't expect much this quarter.

BRN's foundations are solid, and the technology is proven in everyday use cases across just about every industry.

We don't need a capital raise.

Brainchip is primed for explosion.
 
  • Like
  • Love
  • Fire
Reactions: 32 users

TheFunkMachine

seeds have the potential to become trees.
I highly recommend reading the new EE Times article published on BrainChip's LinkedIn page: https://www.eetimes.com/cars-that-think-like-you/

The thing that was the most mind-blowing to me is when Jerome Nadel said that there are over 70 sensors in a car, and Akida has the potential to sit next to every one of those sensors. 70 Akidas in one car.

When you start doing the math on that it will blow your mind.

And that's just the potential of Mercedes, Ford, Valeo and Renesas, not to mention any other car sensors BrainChip is involved with behind the scenes.

Example:
1 million cars
70 million sensors
70 million potential Akidas
$10-15 per chip (I know the path is IP, but for the sake of the example I chose chip sales)

70 million x $10 = $700 million
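The back-of-envelope math above can be sketched in a few lines; all figures are the poster's assumptions, not company guidance.

```python
# Hypothetical revenue estimate: every sensor in every car pairs with one Akida chip.
cars = 1_000_000          # assumed annual production run
sensors_per_car = 70      # sensor count cited from the EE Times article
price_per_chip = 10       # low end of the assumed $10-15 chip price (IP licensing would differ)

revenue = cars * sensors_per_car * price_per_chip
print(f"${revenue:,}")    # $700,000,000
```

At the $15 end of the assumed price range the same sketch gives $1.05 billion, which is why the per-chip assumption dominates the estimate.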
 
  • Like
  • Fire
Reactions: 20 users

Xhosa12345

Regular
  • Like
Reactions: 4 users

Serengeti

Regular
It's a cold Saturday night here in Melbourne. At a guess, how much revenue do you think we'd see in the 4C in a couple of days?
(just for fun guys.....)
I'd say $5mil+
For shits and giggles:

- 4C between 500k - 1.7mil
- Next ASX price sensitive announcement: November 2022 with Ford / General Motors
 
  • Like
  • Fire
Reactions: 9 users

Deleted member 118

Guest
Hey all

I’m seeing a lot of great content but also a bit of general lifestyle chatter back and forth in the BRN forum / thread.

Can I please remind everyone that for non-BRN chat, we have a chat room and also The Lounge where you can talk about anything not specifically related to a stock: https://thestockexchange.com.au/forums/the-lounge-where-anything-goes.15/
I understand that many members will be getting tired of wading through the banter to pick out the good research that is being done. I've been guilty myself of posting a fair bit of nonsense over the years. I am thankful it was tolerated, but I look back and think it probably annoyed a few people. It might be an idea to create another thread that can accommodate more memes and banter, so as to keep this thread a bit tidier. It actually doesn't worry me, as I'm retired and have plenty of time on my hands; however, thinking of those that have busy lives, it might make things easier for them. I don't know what the solution is, but I am sure something could be worked out.

We also have a lot of different pages set up to discuss things like NASA, Renesas etc., and if people were to use these instead of posting everything here it would make things a lot easier as well. I'm guilty of posting inappropriate posts on this BRN page, but at least I make an effort to post all my findings in the correct places.
 
  • Like
  • Love
Reactions: 12 users

Mws

Regular
We also have a lot of different pages set up to discuss things like NASA, Renesas etc., and if people were to use these instead of posting everything here it would make things a lot easier as well. I'm guilty of posting inappropriate posts on this BRN page, but at least I make an effort to post all my findings in the correct places.
Like a photo of your hairy legs. Sorry
 
  • Like
  • Haha
Reactions: 5 users

Deleted member 118

Guest
Like a photo of your hairy legs. Sorry
Hands up, I'm guilty, but I was also known as the class clown and I'll never change, even 35 years later.
 
  • Like
  • Love
Reactions: 7 users

Sirod69

bavarian girl ;-)
Really feel that this post kind of got lost amongst other posts. I would like to encourage everyone to click the link and read the white paper. I believe it's new and I think it will blow your socks off. It's a must read.
So nice Slade, I think so too, and it's really important to read!!!
 
  • Like
  • Love
  • Fire
Reactions: 19 users

zeeb0t

Administrator
Staff member
Zeebot, why did rise leave?

If u don't know, all good. Just say.
I don’t know, I haven’t been told anything.
 
  • Like
  • Haha
  • Thinking
Reactions: 11 users

Xhosa12345

Regular
  • Like
Reactions: 5 users

TheFunkMachine

seeds have the potential to become trees.
  • Like
Reactions: 8 users

Diogenese

Top 20
Designing Smarter and Safer Cars with Essential AI

Conventional AI silicon and cloud-centric inference models do not perform efficiently at the automotive edge. As many semiconductor companies have already realized, latency and power are two primary issues that must be effectively addressed before the automotive industry can manufacture a new generation of smarter and safer cars. To meet consumer expectations, these vehicles need to feature highly personalized and responsive in-cabin systems while supporting advanced assisted driving capabilities.

In this paper, we discuss how automotive companies are redefining the in-cabin experience and accelerating assisted driving capabilities by untethering edge AI functions from the cloud, and performing distributed inference computation on local neuromorphic silicon using BrainChip's AKIDA. We call this Essential AI. This model sequentially leverages multiple AKIDA-powered smart sensors and AKIDA AI SoC accelerators to efficiently capture and analyze inference data within designated regions of interest or ROI (samples within a data set that provide value for inference).

AI inference at the automotive edge starts with versatile AKIDA-powered smart sensors capturing entire input images and complete audio streams in real time. The raw data passes to AKIDA AI accelerators embedded on smart sensors, which analyze and infer meaningful information from specific regions of interest.

Thanks Sirod,

Not that we didn't already know it, but it's good to see written confirmation that both in-cabin and external sensor designers are using Akida.

"With AKIDA, automotive manufacturers are designing sophisticated edge AI systems that deliver immersive end-user experiences, support the ever-increasing data and compute requirements of assisted driving capabilities, and enable the self-driving cars and trucks of the future."
...

"Neuromorphic silicon - which processes data with efficiency, precision, and economy of energy - is playing a major role in transforming vehicles into transportation capsules with personalized features and applications to accommodate both work and entertainment. This evolution is driven by smart sensors capturing yottabytes of data from the automotive edge to create holistic and immersive in-cabin experiences."

That's a lotta bytes.
...

"To efficiently process millions of data points simultaneously, we propose designing new LiDAR systems that leverage the principles of sequential computation with AKIDA-powered smart sensors and AI accelerators. This approach enables automotive manufacturers to significantly improve the inferential capabilities, scalability, and energy efficiency of LiDAR."
...

"These fast and energy efficient (AKIDA) ADAS systems are already helping automotive companies accelerate the rollout of increasingly advanced assisted driving capabilities."

... and remember, automotive is only one of the strings to Akida's bow.

"It won't happen overnight, but ..." "we gonna need a bigger bus."
 
  • Like
  • Fire
  • Love
Reactions: 86 users
I am not sure if this has been posted? I think many here are in the green zone, but selling now isn't a good idea, because BrainChip is just at the beginning.


You all say next week we get the Q2 report,
but I found 31.08. Which is right?

BRAINCHIP STOCK DATES

Event type                   Date
Quarterly figures Q2 2022    31.08.2022 (e)*
Quarterly figures Q4 2022    01.03.2023 (e)*
Quarterly figures Q2 2023    30.08.2023 (e)*
Quarterly figures Q4 2023    28.02.2024 (e)*

*estimated
(Link: to the quarterly estimates for BrainChip)
You're doing some great work Sirod69 👍
I know I've said it before that we're on the cusp, but now we're on the cusp of the cusp!

And the quarterly report is definitely out next week.
 
  • Like
  • Love
  • Fire
Reactions: 33 users

Dallas

Regular
With the Speak-to-Chat feature, transparency mode is enabled simply by talking: a short "hello" is enough to activate it. Unlike the finger-on gesture, the music stops completely. Voice recognition works very well, and as a result we get voice activation for transparency mode.
 
  • Like
Reactions: 8 users