BRN Discussion Ongoing

CHIPS

Regular
  • Like
  • Thinking
  • Haha
Reactions: 10 users

Damo4

Regular
"At the end of the year, the Group had consolidated net assets of $16,834,321 (2022: $23,718,406), including cash and cash equivalents of $14,343,381 (2022: $23,165,288)."

So if we ignore the cash and equivalents, BRN now holds approx $2.5m in non-cash net assets, up from about $550k?

What's Sean cooking? Akida2 chips? Edge boxes?
Or is it just boring lab equipment?
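A rough back-of-envelope check of those figures, just subtracting cash and equivalents from consolidated net assets (a crude proxy for "other assets", since it ignores how liabilities sit against the asset side; variable names are mine, numbers are from the quoted report):

net_assets_2023 = 16_834_321
cash_2023       = 14_343_381
net_assets_2022 = 23_718_406
cash_2022       = 23_165_288

# non-cash portion of net assets
print(net_assets_2023 - cash_2023)   # 2,490,940 -> approx $2.5m
print(net_assets_2022 - cash_2022)   # 553,118   -> approx $550k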
 
  • Like
  • Thinking
  • Fire
Reactions: 11 users

Teach22

Regular
Not great reading for the previous 12 months as far as revenue is concerned, but nothing the punters didn't know already.
The dialogue going forward sounds really good, and the market is a forward-thinking beast, is it not?

We'll see how Mr. Market reacts. Would love to see a bit of enthusiasm from Sean tomorrow.
 
  • Like
  • Love
  • Fire
Reactions: 13 users

IloveLamp

Top 20
[image attachment]
 
  • Like
  • Fire
Reactions: 18 users
"At the end of the year, the Group had consolidated net assets of $16,834,321 (2022: $23,718,406), including cash and cash equivalents of $14,343,381 (2022: $23,165,288)."

So if we ignore the cash/equivalents, BRN now holds approx $2.5m in assets, up from about $550k?

What's Sean cooking? Akida2 chips? Edge boxes?
Or is it just boring lab equipment?
Cooking?

Are we running some other type of lab on the side I don't know about? :LOL:
 
  • Haha
  • Like
Reactions: 14 users

SERA2g

Founding Member
They'd better be giving out free beer coolers with the BrainChip logo to everyone who logs on to the Webby tomorrow :) Not a great read so far.

Can someone explain to me how the Dec quarterly (ending Dec 2023) had receipts from customers at $778K but the yearly says income was $232K?

I'm confused......
This question is asked EVERY SINGLE TIME an annual report or half-yearly is released.

The annual report is on an accruals basis. The quarterlies are on a cash basis.

The quarterly obviously reported cash received for sales recognised in a period prior to this reporting period. Check the last report; there was a large trade debtors balance. It would have been that income being received as cash now.
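A toy illustration of the accruals-vs-cash point (the split below is made up purely to show how the two figures can coexist; it is not from BRN's books):

# hypothetical numbers only
collected_from_prior_year_debtors = 600_000   # invoiced last year, sat in trade debtors
collected_for_current_year_sales  = 178_000   # invoiced and paid this year

cash_receipts = collected_from_prior_year_debtors + collected_for_current_year_sales
print(cash_receipts)       # 778,000 -> what a cash-basis quarterly would report

invoiced_this_year_unpaid = 54_000             # will show up as cash in a later quarterly
accrual_revenue = collected_for_current_year_sales + invoiced_this_year_unpaid
print(accrual_revenue)     # 232,000 -> what an accruals-basis annual report recognises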
 
  • Like
  • Fire
  • Haha
Reactions: 30 users

Jchandel

Regular
  • Like
  • Fire
Reactions: 8 users

HopalongPetrovski

I'm Spartacus!
(quoting Teach22's post above)
I don't recall ever seeing Sean as anything other than enthusiastic. But then, that is his job, and perhaps knowing all that we can only guess about gives him the perspective required to do it honestly.
I think it's probably a tough position, knowing what he does and yet still having to dance around what can be divulged or even discussed due to NDAs and commercial-in-confidence restrictions, which we would all crucify him over should he let something slip inadvertently.
And yet here we are, all the myriad eyes, scrutinising every gesture and word in the hope of finding some clue.
It's weird to me how we all so desperately want something that could well be deleterious to us should we receive it. 🤣
 
  • Like
  • Love
  • Fire
Reactions: 23 users
Some of them are partners now... so I don't think it's ideal to name them when comparing
Well... in reality, we can hope so, but we can't definitively draw that conclusion, as they haven't identified anyone in the table, unfortunately.
 
  • Like
Reactions: 2 users

RobjHunt

Regular
The Yearly Report is what the Yearly Report does. I'm happy with it TBH, moving forward as expected.

We asked for transparency, and in my opinion we will receive just that tomorrow.

Makes sense now, and well planned. The report comes out today, and BrainChip has asked shareholders to ask questions at tomorrow's event.

Pantene Peeps 😉
 
  • Like
  • Love
  • Fire
Reactions: 44 users

7für7

Top 20
I guess FF will make an 80-page analysis of the annual report! Looking forward to reading the first 3 sentences, and liking it.
 
  • Haha
  • Like
Reactions: 21 users

Calsco

Regular
Does anyone have the link for this presentation yet? I subscribed for it but haven't received it yet.
 

Diogenese

Top 20
One of the things I've been wondering about is whether the adoption of ChatGPT by Mercedes will preclude the use of the "Hey Mercedes!" function of EQXX fame.

https://media.mercedes-benz.com/article/323212b5-1b56-458a-9324-20b25cc176cb
  • MBUX beta programme starts June 16, 2023
Mercedes-Benz is further expanding the use of artificial intelligence and integrating it into the voice control of its vehicles as the next step. By adding ChatGPT, voice control via the MBUX Voice Assistant's Hey Mercedes will become even more intuitive. An optional beta programme will start June 16, 2023 in the U.S. for over 900,000 vehicles equipped with the MBUX infotainment system[1].

Customers can participate via the Mercedes me app or directly from the vehicle using the voice command “Hey Mercedes, I want to join the beta programme”[2]. The rollout of the beta programme will happen over the air. Mercedes-Benz is integrating ChatGPT through Azure OpenAI Service, leveraging the enterprise-grade capabilities of Microsoft’s cloud and AI platform.

...
ChatGPT complements the existing intuitive voice control via Hey Mercedes. While most voice assistants are limited to predefined tasks and responses, ChatGPT leverages a large language model to greatly improve natural language understanding and expand the topics to which it can respond.

From the original Jan 2022 EQXX report:
https://media.mercedes-benz.com/article/d31bf12a-a2d4-4978-b176-17c6a0fea6dc

Neuromorphic computing – a car that thinks like you​

Another key efficiency feature of the VISION EQXX that takes its cue from nature is the way it thinks. It uses an innovative form of information processing called neuromorphic computing. The hardware runs so-called spiking neural networks. Information is coded in discrete spikes and energy is only consumed when a spike occurs, which reduces energy consumption by orders of magnitude.

Working with California-based artificial intelligence experts BrainChip, Mercedes-Benz engineers developed systems based on BrainChip’s Akida hardware and software. The example in the VISION EQXX is the “Hey Mercedes” hot-word detection. Structured along neuromorphic principles, it is five to ten times more efficient than conventional voice control.

Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years. When applied on scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.

...

The road trip sidekick in the VISION EQXX is also fun to talk to. The further development of the “Hey Mercedes” voice assistant is emotional and expressive thanks to a collaboration between Mercedes-Benz engineers and the voice synthesis experts from Sonantic. With the help of machine learning, the team have given “Hey Mercedes” its own distinctive character and personality. As well as sounding impressively real, the emotional expression places the conversation between driver and car on a whole new level that is more natural and intuitive, underscoring the progressive feel of the modern luxury conveyed by the UI/UX in the VISION EQXX.


... but really, nothing to see here ...
 
  • Like
  • Love
  • Fire
Reactions: 31 users

cassip

Regular
Sorry if posted before: Samsung presents its Galaxy Ring.

"Samsung wants to analyze most of the data locally on the smartphone. '90-plus percent of the data remains on the customer's device,' Pak assures."


English article, without mention that most of the data stays on the device:

 
  • Like
  • Wow
  • Thinking
Reactions: 10 users

Quatrojos

Regular
We are green already, but it is going up and down all the time, a bit of green and then a bit of red and back.


I was talking about this blend of green:

[image attachment]



Germany will enjoy a spike in tourism...
 
  • Haha
  • Like
  • Love
Reactions: 9 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
(quoting Diogenese's post above on Mercedes adding ChatGPT to MBUX and the neuromorphic "Hey Mercedes" feature in the VISION EQXX)
The good thing is now we can ask Sam Altman what he reckons about it, rather than going through Mercedes.(y)
 
  • Fire
  • Like
Reactions: 8 users

Diogenese

Top 20
  • Like
  • Thinking
  • Fire
Reactions: 20 users