BRN Discussion Ongoing


Deleted member 118

Guest

Boab

I wish I could paint like Vincent

Giddy Up Chippas!


Frangipani

Regular
Morning Frangipani,

May help with leg re-attachment..

Two options ...

😃.

Regards,
Esq.
Thank you so much, both @Esq.111 and @cosors for your kind and helpful suggestions on how to best get our dining table repaired - I was very touched! 😍

By juxtaposing our three-legged table in need of surgery with that sturdy three-legged stool, I had simply intended to amuse and was totally not expecting any expert DIY or TYC (Tell Your Carpenter) advice! 🙏🏻💐

Wow, what an accomplished artist you are, @Esq.111 - those drawings are stunning! Sadly, I myself totally lack talent when it comes to sketching...

Getting the broken table leg fixed may have to take a back seat, though, as using those earmarked 💶 for topping up more BRN shares at this wobbly share price instead is just too tempting at the moment. Carpe diem.
Luckily, we still have a garden table “on all fours” which can double up as our temporary dining table for a while…

P.S.: As for option 2 - LOL! 🤣
I’ll keep that in the back of my mind in the (unlikely) case of hyperinflation.

In Zimbabwe, where this continues to be a huge problem, investing in cattle is considered a much safer option to protect your money than putting it in a bank. Quite literally cash cows. Does that ring a cowbell?


 

Bravo

If ARM was an arm, BRN would be its biceps💪!
Could someone please ring Rob Telson and ask him to give Rene Haas (CEO, Arm) a call, to see if Rene would mind giving Masayoshi Son (CEO, SoftBank) a call and reminding Son, when he speaks to Sam Altman (CEO, OpenAI) tomorrow, to quickly mention Shyamal Anadkat from OpenAI's (ChatGPT) recent LinkedIn post below, which mentions us?

TIA 😘





https://lnkd.in/gEN3NdXt


Won't be long now Brain Fam!

Masayoshi Son (CEO SoftBank) said today that Arm is at the beginning of an “explosive” period of growth!








 
In the course of sales between 11:15 am and 12:20 pm, I counted 46 individual trades of either 125 or 126 shares. The 125 lots and the 126 lots generally seem to sit adjacent to each other. Interesting sort of activity going on there.
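For anyone who wants to eyeball this themselves, a quick sketch of the idea: pull the course-of-sales tape into a list, count the 125/126 lots, and count how often they sit next to each other. The trade data below is made up purely for illustration; it is not the actual BRN tape.

```python
from collections import Counter

# Hypothetical trade tape: (time, lot_size) tuples - illustrative data only,
# not actual BRN course-of-sales records.
trades = [
    ("11:16", 125), ("11:16", 126), ("11:21", 125), ("11:21", 126),
    ("11:34", 125), ("11:34", 126), ("11:52", 500), ("12:05", 125),
    ("12:05", 126),
]

# Count how often each lot size appears on the tape.
lot_counts = Counter(size for _, size in trades)

# Count adjacent pairs where a 125-lot trade sits next to a 126-lot trade.
paired = sum(
    1
    for (_, a), (_, b) in zip(trades, trades[1:])
    if {a, b} == {125, 126}
)

print(lot_counts[125], lot_counts[126], paired)  # 4 4 6
```

With a real course-of-sales export, you would read the trades in from a CSV instead of hard-coding them.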
 

buena suerte :-)

BOB Bank of Brainchip
Won't be long now Brain Fam!

Masayoshi Son (CEO SoftBank) said today that Arm is at the beginning of an “explosive” period of growth!


Your happy faces are getting... very funny! :-/
 

HopalongPetrovski

I'm Spartacus!
In the course of sales between 11:15 am and 12:20 pm, I counted 46 individual trades of either 125 or 126 shares. The 125 lots and the 126 lots generally seem to sit adjacent to each other. Interesting sort of activity going on there.
 

Bravo

If ARM was an arm, BRN would be its biceps💪!

ketauk

Emerged
Here is a good video showing three of the leading autonomous driving systems racing from point A to point B in San Francisco; really impressive how far we are progressing.



- Waymo (fully autonomous, geolocked, 24/7, HD maps, Lidar, cameras)
- Cruise (fully autonomous, geolocked, restricted time of use, HD maps, Lidar, cameras)
- Tesla FSD (human supervision in autonomous mode, 24/7, use anywhere, cameras)





Interesting short article from the recent Autosens as an insight into the vehicle sensors required.

Also a snippet of a general statement from a Mercedes-Benz presentation. That's a shitload of data moving, and how much latency is added?

Solutions like NVIDIA's GPU AI may be one of the interim / transitional ones (IMO) given their overall size, resources and supply ability; however, as per a conference paper posted previously, this processing comes at a substantial power cost.

There will come a point where the power consideration becomes a selling point of differentiation and / or an absolute necessity to end users / developers.

That's where we must be ready, developed and at the cutting edge after all this DD, testing etc that's been going on with EAP, continuing partnerships and potential early adopters.



How Many Senses Do You Need To Drive A Car?

Automotive computing, sensing, and data transport requirements are growing enormously.

JUNE 1ST, 2023 - BY: FRANK SCHIRRMEISTER

The recent AutoSens conference in Detroit left me questioning whether I should turn in my driver’s license. The answer the attending OEMs gave to all the discussions about the advantages of RGB cameras, ultrasound, radar, lidar, and thermal sensors was a unanimous “We probably need all of them in some form of combination” to make autonomy a reality in automotive. Together, these sensors are much better than my eyes and ears.

Technology progress is speedy in Advanced Driver Assistance Systems (ADAS) and autonomous driving, but we are not there yet holistically. So I am keeping my license for some more time.

I traveled to Auto City to speak on a panel organized by Ann Mutschler that focused on the design chain aspects, together with Siemens EDA and GlobalFoundries. Ann’s write-up “Automotive Relationships Shifting With Chiplets” summarizes the panel well. The conference was a great experience as the networking allowed talking to the whole design chain from OEMs through Tier 1 system suppliers, Tier 2 semis and software developers, to Tier 3s like us in semiconductor IP. Given that the panel had a foundry, an IP vendor, and an EDA vendor, we quickly focused our discussions on chiplets.

Concerning the sensing aspects, Owl.Ai's CEO and co-founder, Chuck Gershman, gave an excellent presentation summarizing the problem the industry is trying to solve: 700K annual worldwide pedestrian fatalities, a 59% increase in pedestrian deaths in the last decade in the US, and 76% of fatalities occurring at night. Government regulations are coming for pedestrian nighttime safety worldwide. Owl.Ai and Flir showcased thermal camera-related technologies, motivated by the fact that only 1 out of 23 vehicles passed all tests in a nighttime IIHS test using cameras and radar, and that RGB image sensors cannot see in complete darkness (just like me, I should say, but I am still keeping my driver's license).


Source: Owl.Ai, AutoSens 2023, Detroit

Chuck nicely introduced the four critical phases: "detection" (is something there?), "classification" (is it a person, car, or deer?), "range estimation" (what distance in meters is the object?), and "acting" (warning the driver or acting automatically). I liked Owl.Ai's slide above, which shows the various sensing methods' different use cases and flaws.
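The four phases read naturally as a tiny decision pipeline. The sketch below is my own illustrative rendering of detection, classification, range estimation, and acting; the class names, labels, and the 20 m braking threshold are assumptions for the example, not Owl.Ai's implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Observation:
    present: bool             # detection: is something there?
    label: Optional[str]      # classification: person, car, deer, ...
    range_m: Optional[float]  # range estimation: distance in meters

def act(obs: Observation, braking_range_m: float = 20.0) -> str:
    """Decide the 'acting' phase from the earlier three phases."""
    if not obs.present:
        return "no action"
    if obs.label == "person" and obs.range_m is not None and obs.range_m < braking_range_m:
        return "brake"        # act automatically for a close pedestrian
    return "warn driver"      # otherwise just warn the driver

print(act(Observation(True, "person", 12.0)))  # brake
print(act(Observation(True, "deer", 80.0)))    # warn driver
print(act(Observation(False, None, None)))     # no action
```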

And during the discussions I had at the conference, the OEMs agreed that multiple sensors are needed.

Regarding the transition of driving from L3 to L4 robot taxis, Rivian’s Abdullah Zaidi showed the slide below outlining the different needs for cameras, radars, and lidars, and also the compute requirements.


Source: Rivian, AutoSens 2023, Detroit

No wonder automotive is such an attractive space for semiconductors. Computing, sensing, and data transport requirements are just growing enormously. And mind you that the picture above does not mention other cameras for in-cabin monitoring.

Besides the computing requirements, data transport is core to my day-to-day work. In one of his slides, Mercedes-Benz AG’s Konstantin Fichtner presented that the DrivePilot system records 33.73 GB of trigger measurements per minute – 281 times as much as it takes to watch a Netflix 4K stream. That’s a lot of data to transport across networks-on-chips (NoCs), between chips and chiplets. And it, of course, raises the question of on-board vs. off-board processing.
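The quoted 281x figure holds up as rough arithmetic. A back-of-envelope sketch, assuming a Netflix 4K stream consumes about 7 GB/hour (my assumption; the article does not state its baseline):

```python
# Back-of-envelope check of the DrivePilot figure: 33.73 GB of trigger
# measurements per minute, quoted as ~281x a Netflix 4K stream.
# Assumption: Netflix 4K at roughly 7 GB/hour.
drivepilot_gb_per_min = 33.73
netflix_4k_gb_per_min = 7 / 60                       # ~0.117 GB/min

ratio = drivepilot_gb_per_min / netflix_4k_gb_per_min
throughput_gbit_s = drivepilot_gb_per_min * 8 / 60   # sustained line rate

print(round(ratio))                  # ~289, in the ballpark of the quoted 281x
print(round(throughput_gbit_s, 2))   # ~4.5 Gbit/s crossing the NoC and links
```

Roughly 4.5 Gbit/s of sustained recording traffic is what puts the pressure on the on-chip and chip-to-chip interconnect the author mentions.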

Are we there yet? Not quite, but we are getting closer. On the last day of the conference, Carnegie Mellon University's Prof. Philip Koopman sobered up the audience with his talk "Defining Safety For Shared Human/Computer Driver Responsibility." His keynote walked the audience through the accountability dilemma when a highly automated vehicle crashes and made some constructive suggestions on updating state laws. In their recently published essay "Winning the Imitation Game: Setting Safety Expectations for Automated Vehicles," Prof. Koopman and William H. Widen from the University of Miami School of Law suggest that legislatures amend existing laws to create a new legal category of "computer driver" to allow a plaintiff to make a negligence claim.
To make that exact point, the universe created the situation the next day, which I took a picture of below. Can you see what’s wrong here?


Source: Frank Schirrmeister, May 2023

Yep, a pedestrian ghost in the machine.
In technology's defense, there had been a jaywalking pedestrian about 30 seconds earlier, so the system probably erred on the side of caution. But still, this was a good reminder that future sensors will hopefully be better than my eyes, and a thermal sensor would have helped here too.

All soberness and glitches aside, let’s not forget the end goal: Reducing fatalities due to traffic situations. And as I joked in my last blog on ISO 26262 safety, How Safe is Safe Enough: “If aliens would arrive and assess how to reduce traffic-related deaths, they would most certainly take humans off the streets.”

Brave new autonomous world, here we come. And I am keeping my license. That 1997 Miata doesn’t drive itself!
 

Diogenese

Top 20

Bravo

If ARM was an arm, BRN would be its biceps💪!
Like this...

TNby.gif
 
Wonder if we get a look in with emotion3D as a processor on their new hook-ups and the solution to be shown in Sept '23?

They're looking at real-time drowsiness detection plus other features in a sensor-fusion setup with their software stack.

From the partnership blurb... note the situations it's being used for.




The partnership will allow emotion3D to leverage BrainChip’s technology to achieve an ultra-low-power working environment with on-chip learning while processing everything locally on device within the vehicle to ensure data privacy.

“We are committed to setting the standard in driving safety and user experience through the development of camera-based, in-cabin understanding,” says Florian Seitner, CEO at emotion3D. “In combining our in-cabin analysis software with BrainChip’s on-chip compute, we are able to elevate that standard in a faster, safer and smarter way. This partnership will provide a cascading number of benefits that will continue to disrupt the mobility industry.”

Among some of the situations covered by this optimized driver monitoring functionality are warnings for driver distractions and drowsiness, device personalization, gesture recognition, passenger detection, and more.





emotion3D, Chuhang Technologies and Sleep Advice Technologies Announce Innovative Partnership to Develop Multi-sensor In-cabin Analysis System Leveraging Camera and Radar Technology

June 14, 2023 08:00 AM Eastern Daylight Time

VIENNA--(BUSINESS WIRE)--emotion3D, a leading company for camera-based in-cabin analysis software, Chuhang Tech, a visionary startup developing radar technologies for AD and ADAS applications, and SAT, an innovative startup specialized in sleep onset prediction, have entered into a collaboration for the development of a multi-sensor in-cabin analysis system.

“We believe that our joint solution, combining radar technology with advanced imaging algorithms, will be a game-changer in drowsiness detection and child presence detection, making our roads safer for everyone. In the meantime, it provides a perfect multi-sensor fusion for smart in-cabin applications.”

Drowsiness detection is a critical aspect of road safety, as fatigue-related accidents account for 10-20% of crashes and near-crashes according to the Mobility & Transport department of the European Commission. Drowsiness is a highly complex state to analyse; therefore, multiple sensors and solutions complementary to each other can be applied to get more accurate results. With the aim of creating the new standard of automotive drowsiness detection, emotion3D, Chuhang Tech and SAT have started their joint work.

Within this collaboration, the companies will develop a multi-sensing solution for drowsiness detection. emotion3D's human analysis software stack CABIN EYE derives valuable information on the driver from camera images while the radar solutions of Chuhang Tech analyse the driver's vital signs. These two sensing modalities, combined with the sleep onset prediction algorithms of SAT, will accurately identify signs of drowsiness in real time. As a result, the system shall become the benchmark for the next generation of in-cabin monitoring systems, setting a new standard in the industry.
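As a rough illustration of how three such signals might be combined, here is a toy weighted-vote fusion sketch. The weights, thresholds, and signal choices are my own assumptions for the example, not the emotion3D / Chuhang Tech / SAT design.

```python
# Toy fusion of the three modalities described above: a camera-based
# drowsiness score, radar-derived vital signs, and a sleep-onset prediction.
# All weights and thresholds are illustrative assumptions.
def drowsiness_alert(camera_score: float,
                     breathing_rate_hz: float,
                     sleep_onset_prob: float) -> bool:
    # Map radar vitals to a score: slow breathing (< 0.2 Hz, i.e. under
    # ~12 breaths/min) is taken here as a hint of drowsiness.
    radar_score = 1.0 if breathing_rate_hz < 0.2 else 0.0
    # Simple weighted vote across the three modalities.
    fused = 0.5 * camera_score + 0.2 * radar_score + 0.3 * sleep_onset_prob
    return fused > 0.6

print(drowsiness_alert(0.9, 0.15, 0.8))  # True  - all three signals agree
print(drowsiness_alert(0.2, 0.3, 0.1))   # False - alert driver, normal vitals
```

A production system would of course fuse time series rather than single samples, but the weighted-vote structure shows why complementary sensors beat any single modality.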

In addition to drowsiness detection, this solution will bring a wide range of additional features required by Euro NCAP such as distraction warnings with the help of emotion3D’s CABIN EYE software and child presence detection enabled by Chuhang Tech’s mmWave radar technology. Moreover, numerous user experience features can be realized such as personalization, intuitive interaction and much more.

“We are working towards Vision Zero within all our projects and partnerships and we are happy to have made a step forward in drowsiness detection together with SAT and Chuhang Tech.” says Florian Seitner, CEO of emotion3D.

Similarly, Riccardo Groppo, CEO and Co-Founder of SAT, expressed his thoughts on the partnership by stating “We are glad to combine our sleep onset prediction algorithm with a non-intrusive sensing technology for vital signs. This is the way forward for automotive drowsiness detection.”

Wogong Zhang, the CTO and Co-Founder of Chuhang Tech mentioned "We believe that our joint solution, combining radar technology with advanced imaging algorithms, will be a game-changer in drowsiness detection and child presence detection, making our roads safer for everyone. In the meantime, it provides a perfect multi-sensor fusion for smart in-cabin applications.”

emotion3D, SAT, and Chuhang Tech are committed to the joint development and successful implementation of this advanced solution. By pooling their resources and expertise, the three companies aim to drive innovation and set new benchmarks in automotive safety, improving the lives of drivers, passengers, and pedestrians.

A first solution will be showcased at IAA Mobility in Munich starting September 5th 2023.
 

Labsy

Regular
I am the proud owner of more Brn shares....
 

TopCat

Regular
(Quoted the emotion3D / Chuhang Tech / SAT partnership post in full; see above.)

They are also showcasing at the moment with Behr-Hella at the Brussels InCabin expo. Hopefully we’re involved in some way.


Deleted member 118

Guest
Wasn’t the podcast out today?
 

HopalongPetrovski

I'm Spartacus!

Deleted member 118

Guest
Yes, was out this morning. Check earlier in this thread.

Must have been really boring then, as no one has quoted anything from it, unless I'm wrong.



Maybe if they stop the podcasts the SP might go up.

 

miaeffect

Oat latte lover
Hurry up Akida 2.0
Are we there yet???!!