supersonic001
Chris Stevens on LinkedIn: It's great to be working with Edge Impulse - awesome partner with many…
It's great to be working with Edge Impulse - awesome partner with many exciting things in the works!
Hi Jesse,
A blog on the Brainchip website, take note of the language used.
Designing Smarter and Safer Cars with Essential AI
Essential AI shaping car design. BrainChip revolutionizes automotive technology. Explore our page for insights. brainchip.com
“Conventional AI silicon and cloud-centric inference models do not perform efficiently at the automotive edge. As many semiconductor companies have already realized, latency and power are two primary issues that must be effectively addressed before the automotive industry can manufacture a new generation of smarter and safer cars. To meet consumer expectations, these vehicles need to feature highly personalized and responsive in-cabin systems while supporting advanced assisted driving capabilities.
That’s why automotive companies are untethering edge AI functions from the cloud – and performing distributed inference computation on local neuromorphic silicon using BrainChip’s AKIDA. This Essential AI model sequentially leverages multiple AKIDA-powered smart sensors and AKIDA AI SoC accelerators to efficiently capture and analyze inference data within designated regions of interest or ROI (samples within a data set that provide value for inference).
In-Cabin Experience
According to McKinsey analysts, the in-cabin experience is poised to become one of the most important differentiators for new car buyers. With AKIDA, automotive companies are designing lighter, faster, and more energy efficient in-cabin systems that can act independent of the cloud. These include advanced facial detection and customization systems that automatically adjust seats, mirrors, infotainment settings, and interior temperatures to match driver preferences.
AKIDA also enables sophisticated voice control technology that instantly responds to commands – as well as gaze estimation and emotion classification systems that proactively prompt drivers to focus on the road. Indeed, the Mercedes-Benz Vision EQXX features AKIDA-powered neuromorphic AI voice control technology which is five to ten times more efficient than conventional systems.
Neuromorphic silicon – which processes data with efficiency, precision, and economy of energy – is playing a major role in transforming vehicles into transportation capsules with personalized features and applications to accommodate both work and entertainment. This evolution is driven by smart sensors capturing yottabytes of data from the automotive edge to create holistic and immersive in-cabin experiences.
Assisted Driving – Computer Vision
In addition to redefining the in-cabin experience, AKIDA allows advanced driver assistance systems (ADAS) such as computer vision to detect vehicles, pedestrians, bicyclists, signs, and objects with incredibly high levels of precision.
Specifically, pairing two-stage object detection inference algorithms with local AKIDA silicon enables computer vision systems to efficiently perform processing in two primary stages – at the sensor (inference) and AI accelerator (classification). Using this paradigm, computer vision systems rapidly and efficiently analyze vast amounts of inference data within specific ROIs.
Intelligently refining inference data eliminates the need for compute heavy hardware such as general-purpose CPUs and GPUs that draw considerable amounts of power and increase the size and weight of computer vision systems. It also allows ADAS to generate incredibly detailed real-time 3D maps that help drivers safely navigate busy roads and highways.
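[As an aside: the two-stage flow the blog describes - a cheap proposal step at the sensor, then classification of only the flagged ROI at the accelerator - can be sketched roughly as below. This is a toy illustration only; the function names, the thresholding "detector", and the intensity-based "classifier" are all made-up placeholders, not BrainChip's actual API or algorithms.]

```python
import numpy as np

def stage_one_detect(frame, score_threshold=0.5):
    """Stage 1 (at the sensor): a cheap proposal step that flags regions
    of interest (ROIs) worth analyzing further. Here the 'detector' just
    thresholds pixel intensity as a stand-in for a lightweight proposal
    network running on sensor-side silicon."""
    saliency = frame.astype(np.float32) / 255.0
    ys, xs = np.where(saliency > score_threshold)
    if len(xs) == 0:
        return []
    # One bounding box (x0, y0, x1, y1) around all salient pixels.
    return [(int(xs.min()), int(ys.min()), int(xs.max()) + 1, int(ys.max()) + 1)]

def stage_two_classify(frame, roi):
    """Stage 2 (at the accelerator): run the heavier classifier only on
    the cropped ROI instead of the full frame, saving compute."""
    x0, y0, x1, y1 = roi
    crop = frame[y0:y1, x0:x1]
    # Placeholder classifier: label by mean intensity of the crop.
    return "bright-object" if crop.mean() > 128 else "dark-object"

def two_stage_pipeline(frame):
    """Only ROIs that survive stage one reach the expensive stage two."""
    return [(roi, stage_two_classify(frame, roi)) for roi in stage_one_detect(frame)]

# Toy 8x8 grayscale frame with a bright 2x2 patch.
frame = np.zeros((8, 8), dtype=np.uint8)
frame[2:4, 3:5] = 200
print(two_stage_pipeline(frame))
```

The point of the split is the one the blog is making: the full frame is never shipped to the heavy compute stage, only the small ROI crops are.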
Assisted Driving – LiDAR
Automotive companies are also leveraging a sequential computation model with AKIDA-powered smart sensors and AI accelerators to enable the design of new LiDAR systems. This is because most LiDAR typically rely on general-purpose GPUs – and run cloud-centric, compute-heavy inference models that demand a large carbon footprint to process enormous amounts of data.
Indeed, LiDAR sensors typically fire 8 to 108 laser beams in a series of cyclical pulses, each emitting billions of photons per second. These beams bounce off objects and are analyzed to identify and classify vehicles, pedestrians, animals, and street signs. With AKIDA, LiDAR processes millions of data points simultaneously, using only minimal amounts of compute power to accurately detect – and classify – moving and stationary objects with equal levels of precision.
Limiting inference to a ROI helps automotive manufacturers eliminate compute and energy heavy hardware such as general-purpose CPUs and GPUs in LiDAR systems – and accelerate the rollout of advanced assisted driving capabilities.
Essential AI
Scalable AKIDA-powered smart sensors bring common sense to the processing of automotive data – freeing ADAS and in-cabin systems to do more with less by allowing them to infer the big picture from the basics. With AKIDA, automotive manufacturers are designing sophisticated edge AI systems that deliver immersive end-user experiences, support the ever-increasing data and compute requirements of assisted driving capabilities, and enable the self-driving cars and trucks of the future.
To learn more about how BrainChip brings Essential AI to the next generation of smarter cars, download our new white paper here.”
All this and transformers still to come!

Hi Jesse,
As you say, "take note of the language used".
The words I like:
"That’s why automotive companies are untethering edge AI functions from the cloud – and performing distributed inference computation on local neuromorphic silicon using BrainChip’s AKIDA. This Essential AI model sequentially leverages multiple AKIDA-powered smart sensors and AKIDA AI SoC accelerators to efficiently capture and analyze inference data within designated regions of interest or ROI (samples within a data set that provide value for inference)"
"multiple AKIDA-powered smart sensors and AKIDA AI SoC accelerators"
"With AKIDA, automotive companies are designing lighter, faster, and more energy efficient in-cabin systems that can act independent of the cloud."
"AKIDA also enables sophisticated voice control technology that instantly responds to commands – as well as gaze estimation and emotion classification systems that proactively prompt drivers to focus on the road."
"Assisted Driving – LiDAR
Automotive companies are also leveraging a sequential computation model with AKIDA-powered smart sensors and AI accelerators to enable the design of new LiDAR systems."
Present tense = it's happening now!
[Sorry - I have to stop - I'm getting a nose bleed.]
2nd hand embarrassment

In the same vein, sort of, because it is not Intel related: I noticed that Valeo posted a short video on LinkedIn of a group of happy employees dancing to the song YMCA. Cute and self-deprecating, but also about them just celebrating a great CES event that they had, backstopped and supported of course by their A.I. technology. That post was pointed out originally by others here, ...Learning... I believe.
The video is on a Valeo LinkedIn post and generated lots of backslapping and celebrating of THEIR moment, with some comments and hundreds of reactions. Many of the reactions were from Valeo employees.
Note that they were celebrating and thumbs-upping THEIR moment. Not ours.....but theirs. And among the actual comments there is one that simply says, "Akida 1000 ?" Meaning, I presume: is Valeo's A.I. tech powered by, and made worth demo-ing, dancing, and celebrating because of Akida technology? I ask this poster / commenter, who is obviously a shareholder and may be among this TSE crowd: 1) What did you expect Valeo to say to your comment? 2) Did you seriously think you would actually get a response? 3) If not, why say it? 4) What did you hope to accomplish by asking about Akida 1000 during their party? Feel free to respond to those questions.
As far as I'm concerned, the commenter crashed their "feel good moment party" and threw a turd in their punchbowl by commenting with the "Akida 1000?" words. A thumbs-up reaction would have been fine, but the comment crossed the line simply because it was inappropriate in this case.
I ask this person ....do you visit Brainchip's website? Do you see Valeo listed as a trusted partner? If yes, then why might Brainchip do that, what possible reason would they have?
Perhaps yet another cringeworthy moment for Brainchip's management, unfortunately. Regards, .... dippY
All in my opinion
On this part: "latency and power are two primary issues that must be effectively addressed".
Thank you immensely for that link - that's going to keep me busy for some time.

Info
Powering Transformative innovation and New Market Entrants: Arm at CES 2023
A look at some of the latest products and solutions built on Arm technology that were on display or announced at CES 2023. www.arm.com
Good post, thanks.

I would step back and make the following comments:
1. CES 2023 was a great success for BRN and its partners.
2. The potential leads have really increased and our technology is out there.
The pullback in the SP is the market at work - shorting, pumping, longs fighting, and many other variables outside the control of retail investors.
The lack of revenue is not a result of a lack of activity. I think the end product needs to be sold by the end suppliers before we see payments down the line.
I would also ask the few that poke these partners to be humble and let them market their products, whether it's Akida or not.
Not many companies are going to thank IP suppliers for the innovation, and they don't need to - it's their product, they spent the money, and that's it. Poking these people and companies will only slow us down and make it harder for BRN employees to market products.
I would even say that blackouts on info could be a result of shareholders circling the company behind the next product and telling them to put Akida on the package, lol.
Let's look at MegaChips: we don't even know who they work for, but they grow.
In the end, like the CEO says, watch the financial numbers. Those, I believe, are coming - there is a lot of activity, and soon we will start seeing it roll in.
Let's be professional. I know many have not been, but please, to those that poke these companies: let them enjoy their moments too. They engineered the product, and lots of amazing work went into the whole product.
I’m not selling until at least 4 stock splits on our full Nasdaq listing, folks. By then we could be 50-100 dollars or MORE. Cheers Chippers, Vlad

Mate, I reckon we're in good shape while the Fool keeps bagging us. Lookout when they decide to pump our stock and recommend it - that might be time to sell. Don't fret tho, it'll be 10 bucks plus by then.
SiFive
Did you miss us at the 2022 #RISCVSummit? We’ve got you covered! Learn more about our collaboration with
@Intel and the new Horse Creek development board from SiFive’s Jack Kang: