BRN Discussion Ongoing

BaconLover

Founding Member
  • Like
Reactions: 19 users

 
  • Like
  • Fire
Reactions: 21 users
From the 18 page Mercedes Benz EQXX pdf which is part of @Pmel ‘s
post above:

“Neuromorphic computing – a car that thinks like you
Another key efficiency feature of the VISION EQXX that takes its cue from nature is the way it thinks. It uses an innovative form of information processing called neuromorphic computing. The hardware runs so-called spiking neural networks. Information is coded in discrete spikes and energy is only consumed when a spike occurs, which reduces energy consumption by orders of magnitude.

Working with California-based artificial intelligence experts BrainChip, Mercedes-Benz engineers developed systems based on BrainChip’s Akida hardware and software. The example in the VISION EQXX is the “Hey Mercedes” hot-word detection. Structured along neuromorphic principles, it is five to ten times more efficient than conventional voice control.

Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years. When applied on scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.”
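For those who like to see the numbers behind that "energy is only consumed when a spike occurs" line, here is a toy sketch of my own (it assumes nothing about Akida's actual internals) that simply counts multiply-accumulate (MAC) operations for a conventional dense layer versus an event-driven one:

```python
# Toy illustration (my own, not BrainChip's implementation): why
# event-driven spiking networks can save energy. We count the
# multiply-accumulate (MAC) operations a layer performs: a dense layer
# touches every input every time, while a spiking layer only does work
# for inputs that actually fired.
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_neurons = 1000, 100
weights = rng.normal(size=(n_inputs, n_neurons))

# Sparse binary spike train: roughly 2% of inputs fire this timestep.
spikes = rng.random(n_inputs) < 0.02

# Dense layer: every weight participates regardless of the input.
dense_macs = n_inputs * n_neurons

# Event-driven layer: only rows whose input spiked are accumulated.
active = np.flatnonzero(spikes)
out = weights[active].sum(axis=0)
event_macs = active.size * n_neurons

print(f"dense MACs: {dense_macs}")
print(f"event MACs: {event_macs}")
print(f"reduction:  {dense_macs / event_macs:.0f}x")
```

With roughly 2% of inputs spiking, the event-driven count comes out dozens of times smaller, and once that sparsity compounds across layers you get the kind of gap the "orders of magnitude" claim is pointing at.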

Locked away in a PDF document yet made available for ongoing public consumption. BrainChip gives every appearance of being part of the furniture at Mercedes-Benz.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 71 users
https://www.design-reuse.com/redir3/35893/352793/8IDCcehr87FC7QuZSKvfeNO7vLEwt

The above link is taken from today's Design & Reuse daily email update, under the heading Artificial Intelligence News. It is free to subscribe, and BrainChip information and releases now feature regularly.

This entry appears with one other under Artificial Intelligence, which is about using AI and ChatGPT to optimise the design process for incorporating AI in your chips.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 26 users

Iseki

Regular
The OPPO article mentions production in H2 CY2023 for a launch next year, most likely February/March when they launch new models. And they will launch their own SoC.

So if any new smartphones next year are to have Akida IP, they will have to be manufactured in H2 CY2023.

The Vivo X90, launched 3 February 2023, was the first phone to use the Snapdragon 8 Gen 2 SoC. A new Vivo model next year could be the first phone to have Prophesee tech. One to keep an eye on for specs.


OPPO to Launch its Own SoC in 2024: Here’s What We Know​

TECHNOLOGY
By Sidharth Joseph Last updated Feb 22, 2023



Chinese smartphone manufacturer OPPO is planning to bring out the company's own self-developed SoC in 2024. This in-house chipset will make the brand much more independent and give it an extra advantage in performance and pricing over its competitors.

Reports on the Chinese microblogging platform Weibo reveal that OPPO has already started work on its own smartphone chipset, which the company would release in 2024. The chipset will use a 4 nm manufacturing process and is expected to be made by TSMC. It is also expected that the chipset will support 5G smartphones.

It was from a MediaTek executive that we first received information about OPPO's self-made SoC, and an OPPO insider later confirmed that the company had already started work. Reports also reveal that the company has invested about 1.4 billion in the research and development of its chipset.

OPPO’s first custom-made chipset, the MariSilicon X, was released in 2021 and was a 6 nm imaging chipset. The company also has a connectivity chipset called the MariSilicon Y. All of this indicates that the brand is trying to depend less on leading SoC manufacturers like MediaTek and Qualcomm in future, and to rely more on its own self-developed chipsets.

Leading smartphone manufacturers like Samsung, Apple and Google already have their own chipsets, and with OPPO following the same trend, an in-house SoC should help the company achieve independence and come to the forefront of the smartphone market.

https://www.thetechoutlook.com/news/technology/oppo-to-launch-its-own-soc-in-2024-heres-what-we-know/#:~:text=Chinese%20smartphone%20manufacturer%2C%20OPPO%20is,and%20pricing%20with%20its%20competitors.
Debayan Roy (Gadgetsdata)
@Gadgetsdata


Oppo's self-developed smartphone SoC is expected to arrive next year, in 2024.
• It is a 4nm SoC, supports 5G.
• The design process of this SoC is complete and it is ready to be sent to manufacturing in the second half of 2023.
• It might be manufactured by TSMC.
Via: Weibo



Another company to keep an eye on is MediaTek.

As the industry leader in developing powerful, highly integrated and efficient system-on-chip products, MediaTek is enabling the future of AI by creating an ecosystem of Edge-AI hardware processing paired with comprehensive software tools across its product range - smartphones to smart homes, wearables, IoT and connected cars.​


MediaTek NeuroPilot

We’re meeting the Edge AI challenge head-on with MediaTek NeuroPilot. Through heterogeneous computing capabilities such as CPUs, GPUs and APUs (AI processing units) embedded in our system-on-chip products, we provide high performance and power efficiency for AI features and applications. Developers can target these specific processing units within the system-on-chip, or they can let the MediaTek NeuroPilot SDK intelligently handle the processing allocation for them.
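The "let the SDK handle the allocation" idea is easy to picture. Here is a hypothetical sketch (my own illustration, not the real NeuroPilot API) of a scheduler that sends each operation to a preferred processing unit and falls back to the CPU:

```python
# Hypothetical sketch of the "let the SDK allocate" idea -- this is my
# own illustration, NOT the real MediaTek NeuroPilot API. A scheduler
# picks a processing unit for each operation from a preference table
# and falls back to the CPU when nothing better is free.
from dataclasses import dataclass

# Per-operation preference order (an assumption for illustration only).
PREFERRED = {
    "conv2d": ["APU", "GPU", "CPU"],   # heavy NN ops prefer the AI unit
    "matmul": ["GPU", "APU", "CPU"],
    "control": ["CPU"],                # scalar/branching code stays on CPU
}

@dataclass
class Device:
    name: str
    available: bool = True

def assign(op: str, devices: dict) -> str:
    """Return the first available unit in the op's preference order."""
    for name in PREFERRED.get(op, ["CPU"]):
        unit = devices.get(name)
        if unit is not None and unit.available:
            return name
    return "CPU"

devices = {d: Device(d) for d in ("CPU", "GPU", "APU")}
devices["APU"].available = False       # pretend the APU is busy

print(assign("conv2d", devices))       # prints "GPU" (APU busy, falls back)
print(assign("control", devices))      # prints "CPU"
```

The real SDK will be far more sophisticated (profiling, partitioning a model across units, and so on), but preference-plus-fallback is the basic shape of the idea.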




Many companies are becoming active in edge AI. Hopefully BRN's Akida will become a high-volume building block for the majority of devices in future.


Osram recently partnered with MegaChips' other AI partner, Quadric, and will be using Quadric's Chimera general-purpose neural processor. It appears this application may be better suited to Quadric in lieu of Akida. Or they don't know about Akida yet.

Quadric, Ams Osram to develop smart sensing solutions for edge-based applications​

Jan 11, 2023 | Abhishek Jadhav

Quadric, an edge AI chip provider, has partnered with Ams Osram, an optical solutions provider, to create a smart sensing solution for edge computing applications. The partnership will combine Ams Osram’s Mira family of CMOS sensors and Quadric’s Chimera general-purpose neural processors. Leveraging the power of both their respective strengths, Quadric says it has developed a low-power smart sensing module that combines image capturing and machine learning capabilities.

The Ams Osram Mira220 CMOS image sensor is designed for use in 2D and 3D consumer and industrial machine vision applications. With the intention of further enhancing the sensor and releasing versions with higher resolutions, Quadric is integrating future models of the CMOS sensor family with its Chimera processor lineup, which offers a range of computing power from 1 to 16 TOPS. Both companies showcased this sensing module at CES 2023.

The company says that the Mira CMOS image sensor family provides maximum resolution in a compact form factor while minimizing power consumption. By incorporating new modules within the Mira family, the company says it can provide a broad selection of resolutions that cater to different applications needing high performance and energy efficiency.

“The combination of Ams Osram’s sensors and Quadric’s processing into a single low-power module opens up vast new possibilities for the deployment of smart vision sensing,” said Joost Seijnaeve, the vice president and general manager of CMOS Image Sensors for Ams Osram.

The Quadric Chimera general-purpose neural processor has a unified hardware and software architecture optimized for on-device AI computing. This architecture enables the processor to execute matrix and vector operations, as well as scalar code within a single execution pipeline. The processor’s design combines a neural processor, digital signal processor and real-time CPU into an individual programmable core. This means that it is exceptionally proficient in dealing with multiple tasks, the company says.

“Quadric is excited to be joining forces with Ams Osram,” said Veer Kheterpal, the CEO at Quadric. “Empowering device makers with the capability for a fully-programmable smart sensing device at incredibly low power levels will open a vast new tranche of deployments of machine learning in edge devices.”

In March of 2022, Quadric raised $21 million in a Series B funding round led by mobility supplier Denso’s NSITEXE subsidiary. The company stated the funding would, among other things, accelerate advancements in its next-generation processor architecture.

We know Quadric. They are part-owned by MegaChips, who have a deal with us. So if Quadric can start using Akida IP, this will be huge.
 
  • Like
Reactions: 12 users

wilzy123

Founding Member
  • Like
Reactions: 1 users

Dhm

Regular
(quoting Fact Finder's Mercedes-Benz EQXX post above)
Whilst a great article, it was published in January 2022.
 
  • Like
Reactions: 4 users

Boab

I wish I could paint like Vincent
(quoting Fact Finder's Mercedes-Benz EQXX post above)
I also liked this bit..

By making efficiency the new currency, Mercedes-Benz has created a common denominator for quantifying
technological development across the board – beyond fuel efficiency alone. As well as meaning more range
from less energy, it also means more tangible luxury and convenience with less impact on nature, and more
electric mobility with less waste.
 
  • Like
  • Fire
  • Love
Reactions: 12 users

Fox151

Regular
Why are people getting all excited about the release of Chinese smartphones?? "We don't need China".
Also, I for one see the upside of not posting updates to the ASX. It means the hot crapper crew don't see them. If most of the shorts are living there, I don't mind them being the last to know.
 
  • Like
Reactions: 4 users

jtardif999

Regular
Prophesee already have the metavision sensor from Sony. The metavision sensor will be installed in phones with Qualcomm's Snapdragon SoC.

Qualcomm's SoC will require a processor for the Prophesee Metavision sensor. It will be either an in-house Qualcomm processor, as has been mentioned here, or SynSense (unlikely) or Akida (likely).

Another way around it is for the foundry to have the licence agreement.
Your logic (and persistence) is starting to win me over to thinking there could be something in this 😎
 
  • Like
  • Haha
  • Fire
Reactions: 12 users
Whilst a great article, it was published in January 2022.
And herein lies the point. On its public social media site on 3 March 2023, Mercedes-Benz says:

“Did you know the aim for the VISION EQXX was not to build yet another show car? The actual mission was to develop a technology programme that would bring innovative solutions into series production faster than ever. Learn more: http://mb4.me/the-vision-eqxx

When you click on the link it takes the public to the pdf containing these quotes.

You do not have to be Alex the Rocket Scientist, as Blind Freddie says, to understand that this means Mercedes-Benz is adopting the earlier article as a current and correct statement of the facts contained therein.

In other words what we said then is still current and is the program we are following to bring innovative solutions into series production.

It could not be clearer Brainchip and Mercedes Benz are still engaged together for the purpose of bringing neuromorphic computing solutions into series production.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 76 users

IloveLamp

Top 20


 
  • Like
  • Fire
Reactions: 12 users

TopCat

Regular
Your logic (and persistence) is starting to win me over to thinking there could be something in this 😎
Some more to think about 😀

Amon told CNBC’s Karen Tso and Arjun Kharpal. “For you to make that happen, you can’t run everything in a data center, you’re going to have to bring the AI to the devices.”

Large-language models will be generated entirely within smartphones, he said, meaning that they will be able to work without being connected to the internet.

“The ability to create that much processing power in a smartphone and run that without compromising the battery life is something that only Qualcomm can do,” he claimed.

 
  • Like
  • Fire
  • Love
Reactions: 24 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
There is absolutely no comparison between Tesla's autonomous driving or FSD and Mercedes'. Tesla is light years ahead; you can find a heap of comparisons on YouTube. One difference is that Mercedes only works on pre-mapped roads, Tesla on any road. Tesla does have a way to go before FSD is the real deal, but they are light years ahead of all the competition in autonomy.



 
  • Like
  • Fire
Reactions: 14 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Will Mercedes mandate the use of Akida with Luminar?

I'm going to say "yes".
 
Last edited:
  • Like
  • Haha
  • Love
Reactions: 12 users

buena suerte :-)

BOB Bank of Brainchip
Last edited:
  • Like
Reactions: 4 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
(quoting Fact Finder's Mercedes-Benz EQXX post above)


Hi Fact Finder, my favourite part would have to be this "when applied at scale throughout a vehicle"...because it doesn't mean when attached to a set of bathroom scales, it means when applied in LARGE quantities! 🥳
 
  • Like
  • Love
  • Fire
Reactions: 26 users

Bravo

If ARM was an arm, BRN would be its biceps💪!




New Mercedes E-class sedan will have artificial intelligence, Zoom calls, disco lights​

Mark Phelan
Detroit Free Press



Artificial intelligence and advanced entertainment and information features will adorn the next version of Mercedes’s E-class when the midsize luxury sedan debuts later this year.
Mercedes recently provided a glimpse, showing an elaborately equipped interior while hiding the car’s exterior styling under a shroud.
The 2024 Mercedes E-class sedan's interior will feature up to three screens - including one that's only active when the passenger seat is occupied.


Some of the features are shared with Mercedes new EVs, including the EQS and EQE sedans. Others likely will debut in the E-class, then work their way across the luxury brand’s whole model line.

Window-to-window screens

The dashboard will include as many as three separate screens. A high-def gauge cluster and central touch screen are standard. An optional passenger screen in front of the passenger seat can stream video and allow the passenger to suggest waypoints and new destinations during trips.
The 2024 Mercedes E-class sedan's interior will feature customizable ambient LED lights that can be synced with the Burmester sound system.


The passenger screen is intended to be visible to the front passenger, but not the driver when dynamic content like a movie is streaming. In addition, a gaze tracking camera makes sure the driver isn’t looking at the passenger screen when the car is moving. The screen displays a decorative pattern, stars on a black background, for instance, when the passenger seat is empty.

Ambient lights ring the dashboard. They can be programmed for a number of colors and effects.
Unfortunately, the interior appears to be short of buttons and dials for features like volume, fan, tuning and temperature.

Google maps, Mercedes design

Mercedes and Google are working on a connected navigation system that puts Google data on geography, traffic, points of interest and more into a visual display designed by Mercedes. Google’s database will include more than 200 million businesses around the world, with hours, ratings and reviews. The system will also have access to YouTube, though presumably not visible to the driver when the car is moving.
Mercedes and Google are developing a cloud-based navigation system that will be able to access info on more than 200 million businesses.


Mercedes is also investigating uses for Google’s artificial intelligence system.

Conference calls and selfies

Built-in selfie and video cameras can work with apps ranging from Zoom and Webex to TikTok. Video display pauses when the vehicle is moving.
Among the apps that will be available from launch:
  • Angry Birds
  • TikTok
  • Vivaldi web browser
  • Webex
  • Zoom
The Zync streaming platform will be available for sports, news and other programming.
The 2024 Mercedes E-class sedan's interior will allow apps like Zoom for conference calls - when it's not moving.


Music you hear, see and feel

Top models will have a 17-speaker Burmester system that includes transducers in the seat backrests for subsonic vibrations.


The LED light strip along the top of the instrument panel and doors can be set to respond to music or other audio sources. The changing light is synchronized to the music’s beat and varies with tempo changes. Bass, midrange and treble notes are presented as lights in different places and intensities.
 

  • Like
  • Fire
Reactions: 21 users

skutza

Regular
(quoting Fact Finder's post above about the 3 March 2023 Mercedes-Benz social media link)
And that is why anyone selling this bad boy is crazy as F!.

IMO, and in the opinions of others here, that is also why I will hold long and hard and will buy more when funds are available, a few thousand bucks at a time. It's pretty clear that Mercedes will be using Brainchip.






So does anyone believe that Sean won't put this bad boy into a trading halt when Mercedes comes to the table? The more time passes, the smaller the window becomes before the halt appears. Is it tomorrow, 1 week, 1 month? I have no idea. But what I am 100% sure about is what will happen when it does, and I know that I am in the passenger seat of the Brainchip rocket ship. Anyone that misses that rocket, bus, boat or whatever you measure it by, just know, in the words of Warren and our good man Tony,

You transferred your wealth to me and the rest of the holders here. And for that I truly thank you :) Actually my family and their family for the next few generations thank you. (Unless my daughter marries a loser who is shit at business or gambles heavily, but hey, you get the idea I'm sure.)

Tick tock, tick tock. Do you feel lucky, punk? Well, do you?


EDIT: Was just thinking, it's more likely that Mercedes will use a third party license, do you think?
 
  • Like
  • Love
  • Fire
Reactions: 41 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I think I read somewhere that the EQS has as many as 350 sensors, depending on the equipment.

And from this link #3,159 you can see that Markus Schäfer has stated "The plan is to bring most of the elements of the Vision EQXX into real vehicles in a time frame of 2024 to 2025." He also said that in the future, Mercedes will “make sure we have custom, standardized chips in the car. Not a thousand different chips.”

So, let's just ponder for a moment what this all means in the context of Markus Schäfer also having suggested neuromorphic systems will be applied at scale throughout a vehicle, and you'll feel quite warm and fuzzy inside.
 
  • Like
  • Love
  • Fire
Reactions: 30 users
Top Bottom