BRN Discussion Ongoing

I think that encapsulates the quandary - back in the last millennium, marketing used to talk about customers' wants and customers' needs.

Because the capabilities of Akida are far beyond those of known technology, customers do not know what they want it to do, let alone what they need it to do.

Although I would assume that the customers who don't know what they are looking for (yet) are probably in a part of the market that isn't that close to the edge. Every company that designs/builds sensors, or uses certain sensors for specific aspects of its (hardware) product, is probably aware of its topmost areas for improvement, e.g. power budgets, latency, etc.

Companies in other sectors might just be trying to get a foot in the door of this AI/ML thing and probably still need to figure out what they really want/need. At this stage they would probably try to avoid mistakes and not over-optimize right at the beginning. I assume they would be looking for something more "general purpose" ("you can run all the foundational models and then some"), even if the performance could be better.

But maybe Renesas, Arm and the like could help here until BrainChip becomes a brand in that space and gets recognition and/or validation through available end-user/B2B products.
 
  • Like
Reactions: 10 users
Interesting short article from the recent Autosens as an insight into the vehicle sensors required.

Also a snippet of a general statement from an MB presso. That's a shitload of data moving, and how much latency does that add?

Solutions like NVIDIA GPU AI may be one of the interim/transitional ones (imo), given their overall size, resources and supply ability. However, as per a conference paper in a recent prev post, this processing comes at a substantial power cost.

There will come a point where the power consideration becomes a selling point of differentiation and / or an absolute necessity to end users / developers.

That's where we must be ready, developed and at the cutting edge, after all the DD, testing etc. that's been going on with EAP, continuing partnerships and potential early adopters.



How Many Senses Do You Need To Drive A Car?​

Automotive computing, sensing, and data transport requirements are growing enormously.

JUNE 1ST, 2023 - BY: FRANK SCHIRRMEISTER

The recent AutoSens conference in Detroit left me questioning whether I should turn in my driver’s license. The answer the attending OEMs gave to all the discussions about the advantages of RGB cameras, ultrasound, radar, lidar, and thermal sensors was a unanimous “We probably need all of them in some form of combination” to make autonomy a reality in automotive. Together, these sensors are much better than my eyes and ears.

Technology progress is speedy in Advanced Driver Assistance Systems (ADAS) and autonomous driving, but we are not there yet holistically. So I am keeping my license for some more time.

I traveled to Auto City to speak on a panel organized by Ann Mutschler that focused on the design chain aspects, together with Siemens EDA and GlobalFoundries. Ann’s write-up “Automotive Relationships Shifting With Chiplets” summarizes the panel well. The conference was a great experience as the networking allowed talking to the whole design chain from OEMs through Tier 1 system suppliers, Tier 2 semis and software developers, to Tier 3s like us in semiconductor IP. Given that the panel had a foundry, an IP vendor, and an EDA vendor, we quickly focused our discussions on chiplets.

Concerning the sensing aspects, Owl.Ai’s CEO and co-founder, Chuck Gershman, gave an excellent presentation summarizing the problem the industry is trying to solve: 700K annual worldwide pedestrian fatalities, a 59% increase in pedestrian deaths in the last decade in the US, and 76% of fatalities occurring at night. Government regulations are coming for pedestrian nighttime safety worldwide. Owl.Ai and FLIR showcased thermal camera-related technologies, motivated by only 1 out of 23 vehicles passing all tests in a nighttime IIHS test using cameras and radar, and by RGB image sensors not being able to see in complete darkness (just like me, I should say, but I am still keeping my driver’s license).


Source: Owl.Ai, AutoSens 2023, Detroit

Chuck nicely introduced the four critical phases: “detection” (is something there?), “classification” (is it a person, car, or deer?), “range estimation” (what distance in meters is the object?), and “acting” (warning the driver or acting automatically). I liked Owl.Ai’s slide above, which shows the various sensing methods’ different use cases and flaws.
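For the code-minded, here is a minimal toy sketch of those four phases as a pipeline. All names and thresholds below are illustrative assumptions, not from Owl.Ai or any vendor SDK.

```python
from dataclasses import dataclass
from enum import Enum, auto

class ObjectClass(Enum):
    PERSON = auto()
    CAR = auto()
    DEER = auto()
    UNKNOWN = auto()

@dataclass
class Detection:
    present: bool       # phase 1, "detection": is something there?
    label: ObjectClass  # phase 2, "classification": person, car, or deer?
    range_m: float      # phase 3, "range estimation": distance in meters

def act(d: Detection) -> str:
    """Phase 4, "acting": warn the driver or act automatically (made-up thresholds)."""
    if not d.present:
        return "no action"
    if d.label is ObjectClass.PERSON and d.range_m < 15.0:
        return "automatic emergency braking"
    if d.range_m < 50.0:
        return "warn driver"
    return "monitor"

# A pedestrian detected 12 m ahead should trigger automatic braking.
print(act(Detection(present=True, label=ObjectClass.PERSON, range_m=12.0)))
```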

And in the discussions I had during the conference, the OEMs agreed that multiple sensors are needed.

Regarding the transition of driving from L3 to L4 robot taxis, Rivian’s Abdullah Zaidi showed the slide below outlining the different needs for cameras, radars, and lidars, and also the compute requirements.


Source: Rivian, AutoSens 2023, Detroit

No wonder automotive is such an attractive space for semiconductors. Computing, sensing, and data transport requirements are just growing enormously. And mind you, the picture above does not mention other cameras for in-cabin monitoring.

Besides the computing requirements, data transport is core to my day-to-day work. In one of his slides, Mercedes-Benz AG’s Konstantin Fichtner presented that the DrivePilot system records 33.73 GB of trigger measurements per minute – 281 times as much as it takes to watch a Netflix 4K stream. That’s a lot of data to transport across networks-on-chips (NoCs), between chips and chiplets. And it, of course, raises the question of on-board vs. off-board processing.
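Those numbers are easy to sanity-check. Taking the figures exactly as quoted, back-solving implies a 4K stream rate of about 16 Mbit/s (which is in Netflix's ballpark) and a sustained on-board data rate of roughly 4.5 Gbit/s:

```python
# Sanity-check the quoted DrivePilot figures: 33.73 GB of trigger
# measurements per minute, said to be 281x a Netflix 4K stream.
drivepilot_gb_per_min = 33.73
ratio = 281

implied_4k_gb_per_min = drivepilot_gb_per_min / ratio      # ~0.12 GB/min
implied_4k_mbit_per_s = implied_4k_gb_per_min * 8000 / 60  # GB/min -> Mbit/s

drivepilot_gbit_per_s = drivepilot_gb_per_min * 8 / 60     # sustained on-board rate

print(f"Implied 4K stream rate: {implied_4k_mbit_per_s:.0f} Mbit/s")  # ~16 Mbit/s
print(f"DrivePilot data rate:   {drivepilot_gbit_per_s:.1f} Gbit/s")  # ~4.5 Gbit/s
```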

Are we there yet? Not quite, but we are getting closer. On the last day of the conference, Carnegie Mellon University’s Prof. Philip Koopman sobered up the audience with his talk “Defining Safety For Shared Human/Computer Driver Responsibility.” His keynote walked the audience through the accountability dilemma when a highly automated vehicle crashes and made some constructive suggestions on updating state laws. In their recently published essay “Winning the Imitation Game: Setting Safety Expectations for Automated Vehicles,” Prof. Koopman and William H. Widen from the University of Miami School of Law suggest that legislatures amend existing laws to create a new legal category of “computer driver,” allowing a plaintiff to make a negligence claim.
To make that exact point, the universe created the situation the next day, which I took a picture of below. Can you see what’s wrong here?


Source: Frank Schirrmeister, May 2023

Yep, a pedestrian ghost in the machine.
In technology’s defense, there had been a jaywalking pedestrian about 30 seconds earlier, so the system probably erred on the side of caution. But still, this was a good reminder that future sensors will hopefully be better than my eyes, and a thermal sensor would have helped here too.

All soberness and glitches aside, let’s not forget the end goal: Reducing fatalities due to traffic situations. And as I joked in my last blog on ISO 26262 safety, How Safe is Safe Enough: “If aliens would arrive and assess how to reduce traffic-related deaths, they would most certainly take humans off the streets.”

Brave new autonomous world, here we come. And I am keeping my license. That 1997 Miata doesn’t drive itself!
 
  • Like
  • Love
  • Fire
Reactions: 24 users
Been wondering about the Renesas tape out and also BRN comments on communications opportunities.

I see that Renesas acquired Celeno Communications out of Israel in Dec 21.

They also acquired Reality AI, which is apparently strong in AI algos, and Renesas also obviously have a licence and tech partnership with us for SNN.

Just musing if our opp with Renesas could also dovetail into someone like Celeno as a subsidiary with the focus on AIoT and IIoT.

Would keep it all essentially an in-house working group :unsure:

Renesas actually have a bit going on, entering the auto radar mkt as well.




The New Convergence: IoT, AI, And 5G Bring Actionable Intelligence To The Factory Floor​



AIoT is driving a shift from centralized, cloud-based architectures to distributed, edge-based designs.

JANUARY 26TH, 2023 - BY: SAILESH CHITTIPEDDI

Last year, I reflected on the Renesas Renaissance in terms of how our long-term growth strategy is positioning the company as a full-spectrum, global technology solutions provider with an extended physical footprint in the U.S., Europe, and China. Thanks to the acquisitions of Intersil, IDT, Dialog Semiconductor, and Celeno, we now have expansive design capabilities that surround our embedded processor expertise with four core analog and mixed-signal competencies: sensors and sensor signal conditioning, connectivity, actuation, and power management.

In each case, these companies and their engineers and scientists are contributing significantly to our key growth objectives, with overall revenue increasing at a CAGR of 17.7% between 2019 and 2021 and operating margin growing 16.9% over the same period. Perhaps most telling is the fact that, between 2020 and 2022, our infrastructure and industrial businesses grew by 33% and 37%, respectively, while our internet of things (IoT) segment saw an astounding 79% jump (37% organically) in CAGR.

That the industrial IoT (IIoT) marketplace is expanding at such a torrid pace is not surprising. For Renesas it signals an opportunity to accelerate adoption among our customers and ecosystem partners by enabling the convergence of three key technology areas that are maturing at roughly the same time: IoT, 5G connectivity and artificial intelligence (AI). We call this AI IoT, or AIoT, and the trend is driving a shift in how we collect, store, process, distribute, secure and power data in order to turn it into actionable intelligence we can learn from.

Such a sea change entails a move away from centralized, cloud-based architectures to distributed, edge-based designs that use tiny machine learning (ML) nodes like MCUs and MPUs to define the endpoint, accelerate mathematical models and improve the performance of deep neural networks.
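As a concrete illustration of what deploying ML on such tiny nodes typically involves, here is a generic sketch of post-training int8 quantization with TensorFlow Lite, a common route for fitting a network into MCU-class memory. The toy model and calibration data are placeholders, and this is a generic workflow sketch, not Renesas's actual toolchain.

```python
import numpy as np
import tensorflow as tf

# Toy model standing in for a trained network (assumption: substitute your own).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(4, activation="softmax"),
])

def representative_data():
    # Calibration samples used to pick quantization ranges; replace the random
    # vectors with real sensor data in practice.
    for _ in range(100):
        yield [np.random.rand(1, 64).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()  # flatbuffer bytes, deployable to an MCU runtime
print(f"Quantized model size: {len(tflite_model) / 1024:.1f} KiB")
```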

While many people might associate AI with futuristic consumer applications like robotic assistants, the fact is that much of the initial impact will be felt in the industrial space, where IoT endpoint node creation is exploding at an 85% CAGR (2017-2025), yielding an almost unfathomable 73 zettabytes of data, according to IDC. From an applications perspective, these growth lines will open new markets and revenue channels in areas such as predictive maintenance, rapid defect detection, biometric recognition, and asset tracking, to name just a few.


That’s what led us to one of our most recent and significant 2022 acquisitions – Reality AI. The company is especially strong in developing algorithms for the industrial space, which is helping to fulfill our long-term vision of combining advanced signal processing and mathematical modeling with AI to build machine learning models that we can implement on our embedded processors – from 16 to 64 bits.

The Reality AI acquisition is an important component of realizing our AIoT vision, which this year also included investments in companies like Syntiant and Arduino that are extending our reach into more complex use cases, as well as a platinum sponsorship with the Tiny ML Consortium.

For Renesas, those strategic investments are part of a long-term, three-pronged approach to enabling AIoT, which includes access to world-class MCU and MPU devices and a commitment to the broader AIoT ecosystem, where we have more than 200 technology partners. Together with 5G and other forms of wireless connectivity, such as WiFi 6/6E/7 and near-field communications (NFC), we are enabling a fully distributed AIoT network that will revolutionize how the industrial workplace is managed by bringing constant, on-device learning and decision making to the factory floor.
 
  • Like
  • Fire
  • Love
Reactions: 35 users
Sensors Converge happening at the mo.


I see Prophesee and Oculi (recall that name) speaking.

But Renesas and Socionext are also exhibiting. Per Socionext:

At this year’s event, Socionext will showcase its new automotive radar sensor technology, including advanced RF CMOS sensors for in-vehicle driver and passenger monitoring systems that deliver ground-breaking functional and safety benefits.
 
  • Like
  • Fire
  • Love
Reactions: 22 users

FlipDollar

Never dog the boys
This thread has become more about every other company, than it is about BRN 😂😂

Still patiently waiting 😉 happy hump day!
 
  • Like
  • Haha
  • Thinking
Reactions: 8 users

Deleted member 118

Guest
This thread has become more about every other company, than it is about BRN 😂😂

Still patiently waiting 😉 happy hump day!
This page would be eerily quiet if they never did.
 
  • Like
  • Haha
Reactions: 6 users

Boab

I wish I could paint like Vincent
  • Like
  • Fire
  • Love
Reactions: 25 users

equanimous

Norse clairvoyant shapeshifter goddess
This thread has become more about every other company, than it is about BRN 😂😂

Still patiently waiting 😉 happy hump day!
Brainchip is essentially for every other company to use to their advantage. 2024-2025 is looking like the right time frame to shine.
 
  • Like
  • Fire
  • Love
Reactions: 10 users

Deleted member 118

Guest
  • Sad
  • Like
  • Haha
Reactions: 3 users

Boab

I wish I could paint like Vincent
  • Like
  • Sad
  • Haha
Reactions: 5 users

Giddy Up Chippas!

IMG_9341.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 8 users

Frangipani

Top 20
Morning Frangipani,

May help with leg re-attachment..

Two options ...

😃.

Regards,
Esq.
Thank you so much, both @Esq.111 and @cosors for your kind and helpful suggestions on how to best get our dining table repaired - I was very touched! 😍

By juxtaposing our three-legged table in need of surgery with that sturdy three-legged stool, I had simply intended to amuse and was totally not expecting any expert DIY or TYC (Tell Your Carpenter) advice! 🙏🏻💐

Wow, what an accomplished artist you are, @Esq.111 - those drawings are stunning! Sadly, I myself totally lack talent, when it comes to sketching...

Getting the broken table leg fixed may have to take a back seat, though, as using those earmarked 💶 for topping up more BRN shares at this wobbly share price instead is just too tempting at the moment. Carpe diem.
Luckily, we still have a garden table “on all fours” which can double up as our temporary dining table for a while…

P.S.: As for option 2 - LOL! 🤣
I’ll keep that in the back of my mind in the (unlikely) case of hyperinflation.

In Zimbabwe, where this continues to be a huge problem, investing in cattle is considered a much safer option to protect your money than putting it in a bank. Quite literally cash cows. Does that ring a cowbell?


 
  • Love
  • Haha
  • Like
Reactions: 10 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Could someone please ring Rob Telson and ask him to give Rene Haas (CEO Arm) a call, to see if Rene would mind giving Masayoshi Son (CEO SoftBank) a call and reminding Son, when he is speaking to Sam Altman (CEO OpenAI) tomorrow, to please quickly mention the recent LinkedIn post below from OpenAI's (ChatGPT's) Shyamal Anadkat, which mentions us?

TIA 😘




View attachment 38620

https://lnkd.in/gEN3NdXt


View attachment 38619
Won't be long now Brain Fam!

Masayoshi Son (CEO SoftBank) said today that Arm is at the beginning of an “explosive” period of growth!


budgies-always-play-it-cool.gif



Screen Shot 2023-06-21 at 12.56.22 pm.png


SoftBank pm.png

 
  • Like
  • Fire
  • Love
Reactions: 48 users
In the course of sales between 11:15am and 12:20pm, I counted 46 individual trades of either 125 or 126 shares. The 125 lots and the 126 lots generally seem to be adjacent to each other. Interesting sort of activity going on there.
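For anyone who wants to reproduce that count from a course-of-sales export, here is a rough sketch. The file name and the "time"/"volume" column names are assumptions about what your broker's CSV might look like, not any real feed.

```python
import pandas as pd

# Assumed course-of-sales export with "time" and "volume" columns;
# adjust the names to match your broker's actual CSV (hypothetical here).
cos = pd.read_csv("brn_course_of_sales.csv", parse_dates=["time"])

mask = (cos["time"].dt.time >= pd.Timestamp("11:15").time()) & \
       (cos["time"].dt.time <= pd.Timestamp("12:20").time())
window = cos[mask]

n_lots = window["volume"].isin([125, 126]).sum()
print(f"Trades of exactly 125 or 126 shares: {n_lots}")

# Count consecutive prints where a 125 lot sits right next to a 126 lot.
vol = window["volume"].to_numpy()
pairs = sum(1 for a, b in zip(vol, vol[1:]) if {a, b} == {125, 126})
print(f"Adjacent 125/126 prints: {pairs}")
```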
 
  • Like
  • Haha
  • Thinking
Reactions: 16 users

buena suerte :-)

BOB Bank of Brainchip
Won't be long now Brain Fam!

Masayoshi Son (CEO SoftBank) said today that Arm is at the beginning of an “explosive” period of growth!


View attachment 38654


View attachment 38652

View attachment 38653
Your happy faces are getting ...very funny! :-/
 
  • Haha
  • Like
Reactions: 5 users

HopalongPetrovski

I'm Spartacus!
In the course of sales between 11:15am and 12:20pm, I counted 46 individual trades of either 125 or 126 shares. The 125 lots and the 126 lots generally seem to be adjacent to each other. Interesting sort of activity going on there.
 
  • Haha
  • Wow
  • Like
Reactions: 7 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Haha
  • Love
Reactions: 26 users

ketauk

Emerged
Here is a good video that shows 3 of the leading autonomous driving systems racing from point A to point B in San Francisco; it's really impressive how far we are progressing.



- Waymo (fully autonomous, geolocked, 24/7, HD maps, Lidar, cameras)
- Cruise (fully autonomous, geolocked, restricted time of use, HD maps, Lidar, cameras)
- Tesla FSD (human supervision in autonomous mode, 24/7, use anywhere, cameras)





 
  • Like
  • Wow
  • Love
Reactions: 8 users

Diogenese

Top 20
  • Haha
Reactions: 4 users

buena suerte :-)

BOB Bank of Brainchip
  • Like
  • Love
Reactions: 3 users