BRN Discussion Ongoing

IloveLamp

Top 20
1000020929.jpg
1000020926.jpg
 
  • Like
  • Fire
  • Love
Reactions: 53 users

Frangipani

Regular
By the way, someone else from our now closed-down Perth Research Institute recently relocated from WA to CA - right into the heart of Silicon Valley:

View attachment 69048

For the past couple of weeks, I’ve been waiting for Vi Nguyen Thanh Le to come out of stealth by posting the customary LinkedIn message

“I am happy to share that I am starting a new position as …”

Santa Clara is of course the heart of Silicon Valley, where tech giants such as AMD, Intel and NVIDIA are headquartered alongside lots of smaller semiconductor companies and start-ups:


And then there is a plethora of further potential employers in the surrounding cities of Silicon Valley; Stanford University is also nearby, etc.

It will be very interesting to see who hired Vi Nguyen Thanh Le - as her LinkedIn profile says she is now based in the US, her work visa must have already been approved (once again, assuming she is not a US citizen). I am pretty sure she wouldn’t have updated her location on LinkedIn if her California stint was just a tourist or family visit.

What this reveal likely won’t tell us, though, is:
Was our company not able to offer her a US-based job after closing down her former workplace in Perth? Did she consciously choose to switch employers? Or will she be working for a company hiding behind an NDA with BrainChip…

Lakshmi Varshika Mirtinti is currently a PhD student at Drexel University (graduating in 2025) who - as we find out from her comment underneath our CTO’s post - was working with Akida for her PhD thesis and is now keen on applying for a job with our company. Nice!

But there is something else about their little dialogue that caught my eye:

View attachment 74032

View attachment 74033
View attachment 74041

View attachment 74038

Why was she expressing her interest in joining BrainChip’s Santa Clara team? Does that signify there is one? 🤔


Tony Lewis responded somewhat cryptically by saying “I am afraid the R&D team is located down south.”
He could have said “We don’t have a team in Santa Clara”, but he didn’t.

Not sure whether I am reading too much into his reply, but given that one of our former Perth Research Institute staff members - Vi Nguyen Thanh Le - has been based in Santa Clara for months now without ever updating her LinkedIn profile with regards to a new employer, could BrainChip either have placed a number of staff there with another company on a contract basis or possibly even have opened a small office in Silicon Valley?

Nothing but a wild idea so far - in case there is more to it regarding the latter, it will surely be featured in the upcoming podcast (even though I personally would expect them to announce such news through their social media channels at the time it eventuates).

I just find the concrete reference to a BrainChip Santa Clara team odd, and our CTO’s reply struck me as somewhat ambiguous… Time will tell.

View attachment 74040

Further to my previous posts 👆🏻 speculating about a potential BrainChip presence of some kind in Santa Clara, in the heart of Silicon Valley, where tech giants such as AMD, Intel and NVIDIA are headquartered alongside lots of smaller semiconductor companies and start-ups:

Earlier today, I noticed a new job opening listed on the Brainchip website, looking for a Sales Director, US/Bay Area (advertised as being a remote job, though), who “will spearhead the sales initiatives of BrainChip Inc. in the US (Bay Area), focusing on expanding market share, fostering customer relationships, and driving revenue growth. This position is a pivotal member of the sales team, reporting directly to the VP of Global Sales.”

For those among you unfamiliar with the geographical term Bay Area: it refers to the nine California counties surrounding the San Francisco Bay. The Bay Area’s southern part encompasses what has become known as Silicon Valley.

What I find particularly intriguing is the sentence: “Expectation to complete at least one contract/deal within your first year of employment.”


Initially, I was a little puzzled by the word “expand” in the very first of the listed “essential job duties and responsibilities”…

Market Expansion: Develop and implement strategic sales plans to achieve sales targets and expand BrainChip's IP adoption in the US/Bay Area market

… but then I recalled that our first IP license was not signed with Tokyo-headquartered Renesas Electronics Corporation, the Japanese parent company, but with its wholly owned US subsidiary Renesas Electronics America, headquartered in Milpitas, CA, which happens to be in Santa Clara County!

(So if there really is a BrainChip “Santa Clara team”, I suppose they could potentially also be staff assisting Renesas engineers with Akida 2.0 and/or TENNs?)




6F820572-C23D-41A7-A60E-0355EEE3A822.jpeg





8E6E1F3C-CEF8-4EC7-9113-96DABEAC963F.jpeg

 
  • Like
  • Thinking
  • Fire
Reactions: 48 users
I just watched the interview with the CEO of Weebit Nano.
He really spells out the difficulty in getting a deal when you are a small company without runs on the board.
A very insightful interview for the people who can’t get their head around the length of time it actually takes to get this beast moving.
Definitely worth a look.
 
  • Like
  • Fire
Reactions: 24 users
This may have been posted already ... I pop in & out of the forum.
I just had a look at the other place, had a shower to cleanse myself & will not return for a while.
The comments below are from the GM of our CONTRACTED customer.

There is much to like in the comments.

- integrated with RISC-V
- Frontgrade Gaisler had internal Akida product evangelists pushing for it. That's how I see Sounak Dey and team at TCS. That is what is needed to get early adopters across the line.
- their microprocessors have been deployed to every planet in the solar system. Future processors will be augmented with neuromorphic AI from BrainChip. That is, BrainChip will also reach across the solar system & beyond. That is cool!


View attachment 75230
Brainchip reaching across the solar system. Gets intercepted by alien life forms. They lock the tractor beam onto the satellite and reverse engineer Akida. Who is this genius, this PVDM? Let's head to Earth and probe this PVDM.
Look out Peter, prepare to be probed.

SC
 
  • Haha
  • Like
  • Fire
Reactions: 13 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Interview in link. I think we are a solid chance to be embedded within the Mercedes SDV. I’m not buying into the talk of neuromorphic technology being some time away. That said, there’s no mention of neuromorphic technology in the vid - interesting listen.

View attachment 75225




I'm 100% in agreement with you @Tothemoon24!

I'm quite confident our technology will be included in the “Hey Mercedes” voice control system in the electric vehicles anticipated to enter production in 2026, tying in with the first models expected to feature the new Mercedes MB.OS.

Some people think that it would be impossible for our technology to be included because we haven't achieved the relevant safety standards such as ISO 26262. However, when you listen to the video that you linked to, Magnus Östberg talks about the autonomous driving system being mission critical, with the brakes and drive-train requiring the highest level of safety standards.

Importantly, at the 2:30 mark Magnus states "Then we have areas which are less critical. For example we have voice assistant."

This demonstrates to me that there would be fewer hoops to jump through as a result, which is why I remain so confident.
 
  • Like
  • Love
  • Fire
Reactions: 34 users

manny100

Regular
Further to my previous posts 👆🏻 speculating about a potential BrainChip presence of some kind in Santa Clara, in the heart of Silicon Valley, where tech giants such as AMD, Intel and NVIDIA are headquartered alongside lots of smaller semiconductor companies and start-ups:

Earlier today, I noticed a new job opening listed on the Brainchip website, looking for a Sales Director, US/Bay Area (advertised as being a remote job, though), who “will spearhead the sales initiatives of BrainChip Inc. in the US (Bay Area), focusing on expanding market share, fostering customer relationships, and driving revenue growth. This position is a pivotal member of the sales team, reporting directly to the VP of Global Sales.”

For those among you unfamiliar with the geographical term Bay Area: it refers to the nine California counties surrounding the San Francisco Bay. The Bay Area’s southern part encompasses what has become known as Silicon Valley.

What I find particularly intriguing is the sentence: “Expectation to complete at least one contract/deal within your first year of employment.”


Initially, I was a little puzzled by the word “expand” in the very first of the listed “essential job duties and responsibilities”…

Market Expansion: Develop and implement strategic sales plans to achieve sales targets and expand BrainChip's IP adoption in the US/Bay Area market

… but then I recalled that our first IP license was not signed with Tokyo-headquartered Renesas Electronics Corporation, the Japanese parent company, but with its wholly owned US subsidiary Renesas Electronics America, headquartered in Milpitas, CA, which happens to be in Santa Clara County!

(So if there really is a BrainChip “Santa Clara team”, I suppose they could potentially also be staff assisting Renesas engineers with Akida 2.0 and/or TENNs?)




View attachment 75228




View attachment 75229

Thanks, very informative post. It all adds to Sean's statement that 2025 will be a great year.
Wise heads, it appears, have been accumulating holdings over the last week.
 
  • Like
  • Fire
Reactions: 14 users

7für7

Top 20
I'm 100% in agreement with you @Tothemoon24!

I'm quite confident our technology will be included in the “Hey Mercedes” voice control system in the electric vehicles anticipated to enter production in 2026, tying in with the first models expected to feature the new Mercedes MB.OS.

Some people think that it would be impossible for our technology to be included because we haven't achieved the relevant safety standards such as ISO 26262. However, when you listen to the video that you linked to, Magnus Östberg talks about the autonomous driving system being mission critical, with the brakes and drive-train requiring the highest level of safety standards.

Importantly, at the 2:30 mark Magnus states "Then we have areas which are less critical. For example we have voice assistant."

This demonstrates to me that there would be fewer hoops to jump through as a result, which is why I remain so confident.
I think similarly. Many people focus too much on safety-related topics that are crucial for autonomous driving. However, there are already functioning systems for that. Sure, Akida could optimize those, but it would take longer because it involves a high risk for the passengers.

I believe the topic of interior experience and communication with the car will become a significant focus. This not only enhances comfort but also makes the driving or travel experience more interesting when you can communicate more naturally with the vehicle or even have entire conversations with it—like K.I.T.T. It might sound funny, but that’s how it will be in the end.

On the other hand, the car could notify you of issues like, “I think it’s time to change my oil,” or “It seems like a tire has lost some air pressure,” or manage settings for climate control, reminders, preheating the interior, or even upcoming appointments. The possibilities are endless.

This area alone would be enough to make Akida highly useful.

I don’t think it will only be integrated in Mercedes. Sooner or later it will become standard in every car.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 10 users

Guzzi62

Regular
Found this on the other place.

Interesting take!

 
  • Like
  • Fire
Reactions: 2 users

Frangipani

Regular
  • Like
  • Fire
  • Love
Reactions: 66 users

Derby1990

Regular
Doing some Saturday dreaming. Can we get back here again next week? I'd like to see that.
Got RSI hitting that refresh button on that day, waiting to see that little number... 1



3.JPG
 
  • Like
  • Love
  • Fire
Reactions: 26 users

manny100

Regular
I'm 100% in agreement with you @Tothemoon24!

I'm quite confident our technology will be included in the “Hey Mercedes” voice control system in the electric vehicles anticipated to enter production in 2026, tying in with the first models expected to feature the new Mercedes MB.OS.

Some people think that it would be impossible for our technology to be included because we haven't achieved the relevant safety standards such as ISO 26262. However, when you listen to the video that you linked to, Magnus Östberg talks about the autonomous driving system being mission critical, with the brakes and drive-train requiring the highest level of safety standards.

Importantly, at the 2:30 mark Magnus states "Then we have areas which are less critical. For example we have voice assistant."

This demonstrates to me that there would be fewer hoops to jump through as a result, which is why I remain so confident.
Agree. BRN recently confirmed that we still retain a commercial relationship with Mercedes and the other marquee clients listed under their 'Why Invest' page.
I think that Mercedes, as the owner/builder of its operating system, is responsible for obtaining ISO approvals for it - that is what I have read. I do not believe we have to receive ISO certification separately for our chips.
BRN just supplied the chips that form part of the system. The system itself is rigorously tested by Mercedes before getting approvals.
All AKIDA1000 chips are the same whether they are for auto, space or health etc.
If, for example, Tata use our chips for a handheld medical instrument, they will have to get the finished product tested and approved by health authorities. We will not need to get our chips separately tested and approved.
So IMO we are well in the race for all Mercedes chip requirements.
Being chosen by Frontgrade for space and by the USAF for defense testing will make Mercedes look second-rate if they do not use us.
 
  • Like
  • Fire
  • Love
Reactions: 38 users
Doing some Saturday dreaming. Can we get back here again next week? I'd like to see that.
Got RSI hitting that refresh button on that day, waiting to see that little number... 1



View attachment 75236
Not without a significant IP deal or partnership with a FAANG-type company.

Otherwise, dreaming is all it is.

There was still big volume Friday, but quite a bit lower than the previous sessions.

If the whales were still there, they weren't as ravenous.
 
  • Like
Reactions: 8 users

hotty4040

Regular
Found this on the other place.

Interesting take!




Very interesting listen, Guzzi62: enter the "neuromorphic" solution to these dilemmas, possibly/probably!!!???
Surprised this guy didn't mention/allude to this possibility in his meandering thoughts.
Nice pickup from the "dark side" IMO......


Hotty...
 
  • Like
Reactions: 5 users

Esq.111

Fascinatingly Intuitive.
The latest Brains and Machines podcast just came out, interviewing Australian roboticist Rodney Brooks, who is an MIT Emeritus Professor and successful robotics entrepreneur.



The podcast’s next episode will FINALLY be about BrainChip! 🥳

View attachment 75235
Afternoon Frangipani ,

Good find.

I got halfway through the interview - very interesting - then at about the 26 min mark my phone went on random scroll and would not stop. THERE MAY BE A BUG ATTACHED TO THIS FILE.

Thank you once again and very much looking forward to their next episode.

Regards,
Esq.
 
  • Like
  • Wow
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Did someone say Cerence?

I just said Cerence.

Note to self: Don't get me started on Cerence! 😝


Some background info:

BrainChip's Akida technology, as utilized in the Mercedes-Benz Vision EQXX, employs neuromorphic computing to improve the efficiency of voice control systems.

Cerence has been a key provider of voice and AI-driven features in Mercedes-Benz's MBUX system, although it wasn't directly incorporated into the Vision EQXX's voice assistant.

Mercedes-Benz has been collaborating with NVIDIA to integrate advanced computing platforms into their vehicles. Currently, Mercedes-Benz utilizes NVIDIA's DRIVE Orin system-on-a-chip (SoC) to power its autonomous systems.

The announcement below describes an expanded partnership between Cerence and NVIDIA, noting that the "integration involves leveraging Nvidia’s AI Enterprise platform, a comprehensive software solution for cloud-native AI applications, alongside Nvidia DRIVE AGX Orin for specific functionalities of CaLLM Edge." Cerence's CaLLM technology is designed to enhance in-car voice assistants by providing intelligent, natural, and personalized interactions between humans and their vehicles.

It would make sense IMO to integrate Cerence's CaLLM technology with BrainChip's Akida to potentially combine the strengths of both systems: advanced AI-driven voice interactions alongside energy-efficient processing. This could lead to more responsive and efficient in-car voice assistants. Here's hoping this could be on the cards in the near future. All parties would have to be aware of one another.
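To make the idea concrete, here is a minimal, purely illustrative Python sketch of such a hybrid pipeline: a tiny, always-on keyword spotter (standing in for low-power on-device inference in the Akida style) gates calls to a much larger assistant model (standing in for something like CaLLM). All class and function names are hypothetical - this is not BrainChip's or Cerence's actual API, just a sketch of the gating pattern.

```python
# Hypothetical sketch: a cheap always-on spotter gates an expensive assistant.
from dataclasses import dataclass
from typing import Callable, Iterable, Optional


@dataclass
class AudioFrame:
    """A short chunk of microphone audio (e.g. 20 ms of PCM samples)."""
    samples: list[float]


class TinyKeywordSpotter:
    """Stands in for a low-power, always-on wake-word/keyword model.

    A real implementation would run a small quantised or event-based network
    on-device; here a fake score from frame energy keeps the sketch runnable.
    """

    def __init__(self, threshold: float = 0.5):
        self.threshold = threshold

    def score(self, frame: AudioFrame) -> float:
        if not frame.samples:
            return 0.0
        return min(1.0, sum(abs(s) for s in frame.samples) / len(frame.samples))

    def detected(self, frame: AudioFrame) -> bool:
        return self.score(frame) >= self.threshold


class LargeAssistant:
    """Stands in for the heavyweight assistant (cloud or embedded LLM)."""

    def answer(self, utterance: str) -> str:
        return f"[assistant response to: {utterance!r}]"


def run_pipeline(frames: Iterable[AudioFrame],
                 transcribe: Callable[[AudioFrame], str],
                 spotter: Optional[TinyKeywordSpotter] = None) -> list[str]:
    """Only invoke the expensive assistant when the cheap spotter fires."""
    spotter = spotter or TinyKeywordSpotter()
    assistant = LargeAssistant()
    responses = []
    for frame in frames:
        if spotter.detected(frame):            # cheap, always-on check
            utterance = transcribe(frame)      # more expensive ASR step
            responses.append(assistant.answer(utterance))  # most expensive step
    return responses


if __name__ == "__main__":
    frames = [AudioFrame([0.01] * 160), AudioFrame([0.9] * 160)]
    print(run_pipeline(frames, transcribe=lambda f: "hey mercedes, open the sunroof"))
```

The gating pattern is the whole point of pairing the two: the big model only wakes up when the cheap front end is confident it heard something relevant, which is where the energy savings would come from.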



Cerence Soars 120% on Expanded Nvidia Partnership​

January 3, 2025 Shira Astmann

Nvidia

Cerence Inc. (CRNC) saw its stock price surge by over 120% on Friday following the announcement of an expanded partnership with Nvidia (NVDA). This collaboration aims to enhance the capabilities of Cerence’s CaLLM™ family of language models, specifically bolstering both its cloud-based Cerence Automotive Large Language Model (CaLLM) and the CaLLM Edge, which is an embedded small language model.
The integration involves leveraging Nvidia’s AI Enterprise platform, a comprehensive software solution for cloud-native AI applications, alongside Nvidia DRIVE AGX Orin for specific functionalities of CaLLM Edge. This partnership marks a significant step towards creating more sophisticated in-car AI assistants that can interact seamlessly through both cloud and embedded systems, requiring a blend of hardware, software, and user experience (UX) expertise.
Cerence stated that through close collaboration with Nvidia’s engineers, it has been able to significantly accelerate the development and deployment of its AI technologies. The use of Nvidia’s TensorRT-LLM and NeMo frameworks has been instrumental. TensorRT-LLM optimizes large language models for inference on Nvidia GPUs, while NeMo provides an end-to-end platform for building, customizing, and deploying AI models. This synergy has allowed Cerence to:
- Boost the performance of in-vehicle assistants by utilizing Nvidia’s accelerated computing solutions and system-on-chips (SoCs). The result is a faster, more responsive interaction within vehicles, enhancing the driving experience.
- Develop specialized guardrails for in-car AI using Nvidia NeMo Guardrails. This ensures that Cerence’s AI systems can handle the unique conversational and safety requirements of automotive environments, navigating the complexities of human interaction in a moving vehicle.
- Implement an agentic architecture on the CaLLM Edge using Nvidia DRIVE AGX Orin. This approach not only optimizes performance but also paves the way for future advancements in vehicle user interfaces, offering a more personalized and intuitive interaction model.
This strategic alliance with Nvidia provides Cerence with the tools and infrastructure necessary to support its automotive partners in delivering cutting-edge user experiences. The focus is on creating systems that offer not just performance but also privacy, security, and resilience against malicious interactions, addressing key consumer and manufacturer concerns in the connected car era.
The market’s enthusiastic response to the announcement underscores the potential seen in Cerence’s ability to lead in the automotive AI space, leveraging Nvidia’s technology to push the boundaries of what’s possible in vehicle intelligence. This move positions Cerence to further solidify its role in shaping the future of in-car AI, where the emphasis is increasingly on seamless, secure, and user-friendly technology integration.

 
Last edited:
  • Like
  • Thinking
  • Love
Reactions: 37 users

TECH

Regular
Did someone say "Big Kev"... I'm excited! Remember how I and others have highlighted Accenture's patents throughout the year, including one podcast with an Accenture engineer (from memory)... didn't he say he loved Akida, or words to that effect? Hope their customers are tuned in.

At least 3 patents naming Akida in the artwork (from memory)... check this little bit of positivity, not naming us as such, but presenting at CES 2025... CES is huge.



Listen to the video attached... what this guy says relates to our company and all companies attending and presenting their wares at CES 2025.

Tech x
 
  • Like
  • Fire
  • Love
Reactions: 39 users
  • Like
Reactions: 3 users

Diogenese

Top 20
Did someone say Cerence?

I just said Cerence.

Note to self: Don't get me started on Cerence! 😝


Some background info:

BrainChip's Akida technology, as utilized in the Mercedes-Benz Vision EQXX, employs neuromorphic computing to improve the efficiency of voice control systems.

Cerence has been a key provider of voice and AI-driven features in Mercedes-Benz's MBUX system, although it wasn't directly incorporated into the Vision EQXX's voice assistant.

Mercedes-Benz has been collaborating with NVIDIA to integrate advanced computing platforms into their vehicles. Currently, Mercedes-Benz utilizes NVIDIA's DRIVE Orin system-on-a-chip (SoC) to power its autonomous systems.

The announcement below describes an expanded partnership between Cerence and NVIDIA, noting that the "integration involves leveraging Nvidia’s AI Enterprise platform, a comprehensive software solution for cloud-native AI applications, alongside Nvidia DRIVE AGX Orin for specific functionalities of CaLLM Edge." Cerence's CaLLM technology is designed to enhance in-car voice assistants by providing intelligent, natural, and personalized interactions between humans and their vehicles.

It would make sense IMO to integrate Cerence's CaLLM technology with BrainChip's Akida to potentially combine the strengths of both systems: advanced AI-driven voice interactions alongside energy-efficient processing. This could lead to more responsive and efficient in-car voice assistants. Here's hoping this could be on the cards in the near future. All parties would have to be aware of one another.



Cerence Soars 120% on Expanded Nvidia Partnership​

January 3, 2025 Shira Astmann

Nvidia

Cerence Inc. (CRNC) saw its stock price surge by over 120% on Friday following the announcement of an expanded partnership with Nvidia (NVDA). This collaboration aims to enhance the capabilities of Cerence’s CaLLM™ family of language models, specifically bolstering both its cloud-based Cerence Automotive Large Language Model (CaLLM) and the CaLLM Edge, which is an embedded small language model.
The integration involves leveraging Nvidia’s AI Enterprise platform, a comprehensive software solution for cloud-native AI applications, alongside Nvidia DRIVE AGX Orin for specific functionalities of CaLLM Edge. This partnership marks a significant step towards creating more sophisticated in-car AI assistants that can interact seamlessly through both cloud and embedded systems, requiring a blend of hardware, software, and user experience (UX) expertise.
Cerence stated that through close collaboration with Nvidia’s engineers, it has been able to significantly accelerate the development and deployment of its AI technologies. The use of Nvidia’s TensorRT-LLM and NeMo frameworks has been instrumental. TensorRT-LLM optimizes large language models for inference on Nvidia GPUs, while NeMo provides an end-to-end platform for building, customizing, and deploying AI models. This synergy has allowed Cerence to:
- Boost the performance of in-vehicle assistants by utilizing Nvidia’s accelerated computing solutions and system-on-chips (SoCs). The result is a faster, more responsive interaction within vehicles, enhancing the driving experience.
- Develop specialized guardrails for in-car AI using Nvidia NeMo Guardrails. This ensures that Cerence’s AI systems can handle the unique conversational and safety requirements of automotive environments, navigating the complexities of human interaction in a moving vehicle.
- Implement an agentic architecture on the CaLLM Edge using Nvidia DRIVE AGX Orin. This approach not only optimizes performance but also paves the way for future advancements in vehicle user interfaces, offering a more personalized and intuitive interaction model.
This strategic alliance with Nvidia provides Cerence with the tools and infrastructure necessary to support its automotive partners in delivering cutting-edge user experiences. The focus is on creating systems that offer not just performance but also privacy, security, and resilience against malicious interactions, addressing key consumer and manufacturer concerns in the connected car era.
The market’s enthusiastic response to the announcement underscores the potential seen in Cerence’s ability to lead in the automotive AI space, leveraging Nvidia’s technology to push the boundaries of what’s possible in vehicle intelligence. This move positions Cerence to further solidify its role in shaping the future of in-car AI, where the emphasis is increasingly on seamless, secure, and user-friendly technology integration.

Hi Bravo,

Cerence could certainly benefit from a sprinkle of Akida.

According to these patents, they use cloud-based software for keyword spotting:

US2022358924A1 METHODS AND APPARATUS FOR DETECTING A VOICE COMMAND 20130312 – 20220718

1735975102586.png






US11676600B2

1735975153480.png


We could get our mates at GMAC to ask "Do you want Akida with that?"
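For contrast with the cloud-based keyword spotting in those patents, here is a minimal, purely illustrative sketch of the event-driven principle that on-device neuromorphic spotting leans on: a leaky integrate-and-fire neuron only does work when an input spike arrives, so compute scales with events rather than with every audio sample. This is a toy illustration of the general idea only - not Akida's actual architecture or API.

```python
# Toy event-driven neuron: compute happens per event, not per audio sample.
import random


class LeakyIntegrateAndFireNeuron:
    """A single leaky integrate-and-fire neuron with event-driven updates."""

    def __init__(self, weight: float, leak: float = 0.95, threshold: float = 1.0):
        self.weight = weight
        self.leak = leak
        self.threshold = threshold
        self.potential = 0.0
        self.updates = 0  # counts how often we actually compute anything

    def on_event(self, strength: float = 1.0) -> bool:
        """Process one incoming spike; return True if the neuron fires."""
        self.updates += 1
        self.potential = self.potential * self.leak + self.weight * strength
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after firing
            return True
        return False


if __name__ == "__main__":
    random.seed(0)
    neuron = LeakyIntegrateAndFireNeuron(weight=0.4)

    # One second of 16 kHz audio is 16,000 samples, but only a small fraction
    # carry events (signal changes). Dense processing touches all 16,000;
    # event-driven processing only touches the events.
    events = [i for i in range(16_000) if random.random() < 0.02]

    fires = sum(neuron.on_event() for _ in events)
    print(f"events processed: {neuron.updates} of 16000 samples, spikes out: {fires}")
```

Sparsity is what does the work here: the fewer events the front end has to touch, the less energy it burns while it waits for a command.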
 
Last edited:
  • Like
  • Haha
  • Fire
Reactions: 22 users

Terroni2105

Founding Member
  • Like
Reactions: 5 users

Boab

I wish I could paint like Vincent
  • Haha
  • Like
Reactions: 5 users
Top Bottom