BRN Discussion Ongoing

Bravo

If ARM was an arm, BRN would be its biceps💪!
Along with my previous post, I'm convinced. I literally CANNOT see NVIDIA doing this without us.





Yahoo Finance Video

Nvidia partners with Mercedes to pursue self-driving cars



Yahoo Finance
Fri, January 12, 2024 at 9:35 AM GMT+11



AI is the talk of CES 2024 (Consumer Electronics Show) in Las Vegas as the top developers and manufacturers unveil what their large language models can achieve, especially on the road. Yahoo Finance’s Dan Howley sits down with Nvidia Vice President of Automotive Danny Shapiro to discuss the chipmaker's autonomous driving projects.
Nvidia (NVDA) announced a partnership with Mercedes-Benz to hone self-driving capabilities. Starting with an assisted driving feature, Shapiro believes the technology through this partnership will “add greater and greater levels of autonomy” until full self-driving is achieved.
“Safety has got to be the top priority,” Shapiro insists, suggesting that a timeline for self-driving cars is hard to gauge.
Going beyond in-car technology, Shapiro states that the company’s generative AI is “leveraging the exact same data that is used to build the car,” by offering an automotive configurator that allows consumers to get the actual experience of the car before buying.
Click here to view more of Yahoo Finance's CES 2024 coverage this week, or you can watch this full episode of Yahoo Finance Live here.
Editor's note: This article was written by Eyek Ntekim.

Video Transcript

DAN HOWLEY: CES 2024 is in full swing. And right now, we're speaking with NVIDIA VP of Automobiles Danny Shapiro. Danny, thank you so much for joining us. I guess, you know, obviously, the big theme at CES 2024 is AI. So how does that fit into NVIDIA's automotive strategy?
DANNY SHAPIRO: That's a great question. We're super excited to be here in the Mercedes-Benz booth. And what we're showing behind me actually is the new CLA concept, which is going to be the first vehicle from Mercedes-Benz with NVIDIA DRIVE inside.
So that's our AI platform for automated driving, driver assistance, all kinds of convenience features. So we're basically bringing the type of AI from the cloud that we're used to seeing, but bringing it right into the car, processing the sensor data, and making the vehicle much safer to be in.

DAN HOWLEY: Yeah, I think that's one of the interesting things to point out, right, is we all talk about generative AI in general. But self-driving cars-- or self-driving technology only exists because of AI.
DANNY SHAPIRO: That's absolutely right. There's a massive amount of data that's being generated from all the cameras on the car, the radar, now LiDAR on this vehicle. And that has to be processed in real time. So that's where NVIDIA comes in, providing the horsepower to take all that data and make sense of it and understand exactly where the lanes are, where the potential hazards are, to be able to read signs, detect the lights.
And so we're bringing that out now to make these vehicles safer, to be an assistance feature for them. But they're software updatable vehicles. So over time, we're developing the software with Mercedes and all of our auto customers to be able to then add greater and greater levels of autonomy. And eventually, we'll get to self-driving.
DAN HOWLEY: I guess when it comes to self-driving, is there a thought on when that might come? I know it's always the, you know, I guess, billion dollar question that everybody's kind of bandying about. And the early prognosticators had said, oh, it'll be here in no time. But, obviously, it's a little bit longer than that probably.
DANNY SHAPIRO: Absolutely. You know, so this is a challenge we've been working on for well over a decade. It's something where I think the entire industry underestimated the complexity. And the reality is safety has to be the top priority. It is for us. It is for so many of our partners. And we need to make sure we get it right.
So while these estimates were put out there initially, we realized we underestimated the complexity. And so we're focused on making sure that before we put anything out on the road, that it's tested and validated for every possible scenario.
So this is where another aspect of NVIDIA comes in. We can use simulation technology for the creation of the AI but also for testing and validating that AI and making sure that in all types of lighting conditions, all kinds of weather conditions, many different scenarios, the kinds of things that don't happen very often, it's hard to train for. So we can use AI and create synthetic data to understand what possibly could happen and make sure the car will react appropriately.
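The rare-scenario coverage Shapiro describes can be sketched in a few lines of Python. This is a minimal illustration only, assuming a toy scenario space: the attribute names and values are invented for the example, not taken from NVIDIA's simulation tooling. The point is that synthetic generation lets you sample rare conditions uniformly instead of at their tiny real-world frequencies.

```python
import random

# Toy scenario space: in real data, "stalled vehicle" in fog at night is
# vanishingly rare; in synthetic generation we can sample it as often as
# we like, so the AI sees plenty of hard cases during training and testing.
WEATHER = ["clear", "rain", "fog", "snow"]
LIGHTING = ["day", "dusk", "night", "low sun glare"]
HAZARDS = ["none", "jaywalker", "debris on road", "stalled vehicle"]

def sample_scenarios(n, seed=0):
    """Uniformly sample n synthetic driving scenarios (hypothetical schema)."""
    rng = random.Random(seed)  # seeded for reproducible test suites
    return [
        {
            "weather": rng.choice(WEATHER),
            "lighting": rng.choice(LIGHTING),
            "hazard": rng.choice(HAZARDS),
        }
        for _ in range(n)
    ]

scenarios = sample_scenarios(1000)
# Each hazard type appears roughly 1/4 of the time, regardless of
# how rarely it occurs on real roads.
rare = sum(s["hazard"] == "stalled vehicle" for s in scenarios)
print(rare)
```

A real pipeline would render these parameter combinations into simulated sensor data; the dictionary here only stands in for that scenario description.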
DAN HOWLEY: Is that something similar to the Omniverse, the digital twin kind of idea that NVIDIA has been working on where, yeah, you can build these digital--
DANNY SHAPIRO: Absolutely.
DAN HOWLEY: --versions of objects, factories, things like that?
DANNY SHAPIRO: You're absolutely right. So we really are able to apply this across the entire workflow within the auto industry from the designers that can use Omniverse and create essentially digital versions of the vehicle, right? That's part of the design process. But then using Omniverse, that exact same model becomes part of what goes into the engineering team, then can go into the manufacturing side where we can create a virtual factory, a digital twin of the factory.
It's modeling every aspect of the factory-- the robots, the conveyors, the other employees working inside the factory. We can model all that, optimize it, and make sure it all works before the factory is even built. And then we can even extend that model beyond into the retail side using all that same data to create a virtual retail or showroom experience. People can customize their car; choose different materials, the interior trims, different wheels; and even take it on a virtual test drive.
So all of that simulation then is a very valuable tool throughout that whole workflow. So in addition to testing and validating the AVs, it really applies to all the mechanical, physical, and even sort of retail and service extensions of that whole workflow.
DAN HOWLEY: So I want to ask you about generative AI. Obviously, it was the huge theme of 2023. Still going to be a big theme into 2024. And I want to get your thoughts on how that kind of fits into the automotive side of things for NVIDIA as well.
DANNY SHAPIRO: Yeah, so generative AI, we've just started. And I think the important thing to recognize is it's not just about text in and text out. That's, you know, what ChatGPT kind of started. It has amazing capabilities. There's a lot of room for improvement, of course -- something that's trained on just a vast array of data, some of it real, some of it not, means the results aren't always going to be accurate.
So what we're doing is putting tools in place to be able to curate data and make sure that if you're going to talk to your Mercedes, it has accurate information. So Mercedes can train this large language model with the history of Mercedes vehicles, with all the information about the CLA concept, the manual, the service manual, whatever it is, so that when you have a dialogue with that vehicle, it comes back with the right answer.
But beyond that, we're using generative AI for other data streams that we can put text in and imagery out, text in and video out. Could be video in and text out. So there's so many different ways that generative AI can be used.
Imagine, we have an automated vehicle. The front-facing camera is taking in 30 frames a second of video. We can then use a large language model to convert the pixels in that video into an explanation of what's happening in the scene.
So basically, the car can explain to you why it's making certain driving decisions or tell you what's going on in the scene to improve your trust and confidence in the system, or provide alerts that mean something other than just a beep. So there's so many different ways that generative AI is really helping the auto industry, from a designer that may do a sketch, and the generative AI will create a 3D model and different permutations on that. It becomes a copilot for them, an assistant that is able to make their job and their productivity much better.
So it's not going to take their job away. But it's going to make them more productive and create higher quality results. And in the case of all the safety systems, all these tools are going to increase the safety inside the vehicle.
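The frames-in, explanation-out idea Shapiro describes can be sketched roughly as follows. `Frame`, `SceneCaptioner`, and `narrate` are hypothetical names invented for this illustration; a real system would run a vision-language model over raw camera pixels, whereas this stub maps pre-extracted detections to sentences.

```python
from dataclasses import dataclass, field

@dataclass
class Frame:
    """One camera frame, reduced here to a list of detected object labels."""
    objects: list = field(default_factory=list)

class SceneCaptioner:
    """Placeholder for a vision-language model: detections -> explanation."""
    def describe(self, frame):
        # A real model would generate free text; this stub just branches
        # on a detection to show the shape of the interface.
        if "pedestrian" in frame.objects:
            return "Slowing: pedestrian detected in the crossing ahead."
        return "Lane clear; maintaining speed."

def narrate(frames, model):
    # At 30 fps the car could emit one explanation per frame, turning
    # raw perception into driver-readable alerts instead of a bare beep.
    return [model.describe(f) for f in frames]

frames = [Frame(["car", "lane"]), Frame(["pedestrian", "crossing"])]
print(narrate(frames, SceneCaptioner()))
```

The design point is the interface, not the model: perception output goes in, a human-readable justification comes out, which is what lets the car explain its driving decisions.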
DAN HOWLEY: You know, on the automotive side and almost kind of the automotive consumer side, I know NVIDIA also introduced an automotive configurator. It's kind of the idea of being able to build your car on a company's website but in a more advanced way. So can you just kind of explain that to us?
DANNY SHAPIRO: Sure. So we're leveraging the exact same data that's used to build the car and putting those models into more of a marketing role as opposed to engineering role. We can photorealistically render it.
People can choose all different aspects of their car, kind of create their dream car. Maybe on their PC, they're doing it. Maybe even on a VR headset experience, look all around, see what that vehicle is going to be like to drive. And maybe even using our simulation technology to take it on a virtual test drive in a digital twin of the city that they live in or see what it looks like parked in their driveway.
So generative AI is going to help create all these different kinds of scenes and be able to help the automakers increase, you know, the types of options that are added to the vehicle. Because if people can kind of see it and maybe experience it, we could simulate different features and functions in the car actually working. And people could add that to their cart as they're ordering their vehicle.
DAN HOWLEY: And, you know, one of the things that NVIDIA is highlighting is just the breadth of different automakers that you work with. I guess, how do you see those relationships continuing to grow over time?
DANNY SHAPIRO: Well, you know, we're working with hundreds of automakers, truck makers, robotaxi companies, shuttle companies and then-- so the whole ecosystem, the tier one suppliers, the sensor companies, the mapping companies, a lot of software developers, they're all building on the NVIDIA DRIVE platform. So we've created this open system such that it's not a fixed function but rather a supercomputer that delivers the horsepower, the computing horsepower to run the software that's required today but also with headroom so that it can continue to evolve and develop in the future.
So a year from now, we might be sitting here at CES talking about all kinds of new AI technology that no one's thought of yet. But we'll be able to update the software in the car to add those new features and capabilities.
DAN HOWLEY: Yeah, unfortunately, you can't do that for my '07 Mustang. That's not really very well connected. But Danny Shapiro, VP of automotive at NVIDIA, thank you so much for joining us.
DANNY SHAPIRO: Thanks, Dan. It's great to be with you.

 
  • Like
  • Fire
  • Love
Reactions: 37 users

Damo4

Regular
  • Haha
Reactions: 2 users
^^ speaking of shitposts, @Iseki and @DK6161, I can't recall seeing this picture posted from CES showing both visual wake word and keyword spotting demos - a collaborative effort between BrainChip and Microchip, from here: https://brainchip.com/ces-2024

View attachment 54063
Hi Wilzy
What I find intriguing is the two piles of what look like booklets. I think I can read the titles well enough to say the one in the foreground is devoted to Visual Wake Word; the one further back must then be (Audio) Keyword Spotting. For those who know how this type of demo works: are they takeaways, are they just for walking customers through during the demo, or do visitors scan the barcode on the front and receive some sort of material, just without the secret sauce?
Regards
Fact Finder
 
  • Like
  • Fire
Reactions: 14 users
Hi All
Bit surprised no one has commented on the final podcast for day one of CES 2024 between Nandan and Teksun’s founder:



I thought it was pretty encouraging particularly when he disclosed being partnered with Qualcomm and Renesas and that he is very keen on using AKIDA technology because of its capacity to scale and run multiple sensors such as cameras in automotive and security applications.

If you are bored with podcasts quite a few days to go as day one produced five.

My opinion only DYOR
Fact Finder

Hi @Fact Finder He also explained very eloquently the number of steps to get solutions deployed. It really helped me understand the complexities involved in getting it right which is critical to ongoing commercial success. I thought this one was a standout podcast!
Cheers
Maxanne
 
  • Like
  • Love
  • Fire
Reactions: 17 users

Boab

I wish I could paint like Vincent
  • Haha
  • Like
Reactions: 5 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Haha
  • Fire
Reactions: 15 users

Boab

I wish I could paint like Vincent
Hey @Esq.111, what are the tea leaves saying today with these extremely low levels of trade?
 
  • Like
Reactions: 3 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Haha
  • Like
  • Love
Reactions: 15 users
I think you are reading a lot more into the "Trolls" than you need to. I run a successful business, and if I were to ask my staff to use social media to post about a meeting I had with someone and they complied but put up a photo of an entirely different person, I would be well pissed. Yes, we all make typos and mistakes, but it doesn't mean we don't try to do better next time. You have posted maybe 3 or 4 times in the recent past about this issue, but all have been problems. How many is okay?

Will it stop investors? No. (Does that make it okay? No.)
Will it stop a partnership? No. (Does that make it okay? No.)
Is it the first time of a stupid error? No.

Was the podcast a good endorsement for BRN? Yes.

If I were Sean, I would be having a chat with whoever made the error. Sorry, just not professional.
I know you want a reply to further bury the valuable content from today's CES, but even Blind Freddie will tell you that if you read my whole post you would have read the words: I have informed the company and it is being attended to. Obviously implicit in that statement is a level of disapproval.

As you claim to have read my other posts you will note that I suggested the appropriate course is to note the error on here if you must but send an email bringing the error to the company's attention for correction.

What I am not doing, unlike you, is assuming that the company, which immediately reacted to my notification, is just laughing it off and not taking any action to improve the performance of the employee, cadet or, as I would suspect, the IR contractor.

What I would like to know from you, though, as a person who runs a successful business, is why you have so readily leapt to the conclusion that Brainchip's executive staff are not successful business people who work to improve staff performance and counsel the staff member, contractor or even the cadet from one of the university programs responsible for the error. Have you communicated with Brainchip at an executive level and been informed that they find these errors acceptable and encourage the making thereof by staff?

I must thank you, though, for two things:

1. Confirming as I have said many times typos will not destroy the company nor put at risk partnership relations or frighten away real investors.

2. I have heard it screamed by some, and the majority on HC, that the CEO should be drawn and quartered for letting these typos go out. Now, thanks to you, if they scream out such nonsense again we can refer them to your post, because you have made clear that successful business people like you allow staff to post things without checking every word beforehand and then take corrective action.

My opinion only DYOR
Fact Finder
 
  • Like
  • Fire
  • Haha
Reactions: 28 users

Esq.111

Fascinatingly Intuitive.
Afternoon Boab ,

Presently , I'm baffled to be honest .

Think the algos suppressing us are waiting for the slightest whiff of something ....... anything with $ attached .

Sorry , not much help .



Regards ,
Esq.
 
  • Like
Reactions: 6 users
  • Love
  • Like
Reactions: 2 users

Bravo

If ARM was an arm, BRN would be its biceps💪!

I guess you could always take your concerns to the top and call Daimler Media Centre to get them to retract their false and misleading press releases.

There's a "Contact Us" button on the first link below so you can talk to someone ASAP, because this level of false advertising is just not on IMO. 😝




Originally Posted by Daimler Media Center 4 Jan 2022

EXTRACT ONLY


Screenshot 2024-01-12 at 11.43.16 am.png

Screenshot 2024-01-12 at 11.47.13 am.png

media.mbusa.com


VISION EQXX – taking electric range and efficiency to an entirely new level

Range and efficiency are set to define the electric era. Exceptional range will make electric cars suitable for every journey and help to increase overall adoption.




And
Screenshot 2024-01-12 at 2.51.27 pm.png



And the rubbish they've officially told other media outlets should also be vaporised from the interwebs.😝
Screenshot 2024-01-12 at 2.55.18 pm.png

Screenshot 2024-01-12 at 2.53.28 pm.png


 
  • Like
  • Love
  • Haha
Reactions: 28 users

Diogenese

Top 20
From this photo we can see that they have 2 tables set up for demos, plus the table in @wilzy123's post.
I see a Panda and a toy car but can't make out the 3rd figure??
View attachment 54067
So the left screen is a picture of RT in the Brainchip suite taken by the camera in the centre of the table while RT takes a photo of the table - a bit like a pair of opposed mirrors where the images vanish into infinity, but the main point is - don't much like the curtains!
 
  • Haha
  • Wow
Reactions: 14 users

Diogenese

Top 20
So the left screen is a picture of RT in the Brainchip suite taken by the camera in the centre of the table while RT takes a photo of the table - a bit like a pair of opposed mirrors where the images vanish into infinity, but the main point is - don't much like the curtains!
Here's a thought:

If an object moving at close to light speed passes between a pair of opposed mirrors, would each reflection show the object in different places at one instant?
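A rough back-of-envelope answer, assuming ideal mirrors a distance $d$ apart and ignoring aberration and Doppler effects:

```latex
% Each successive reflection is formed by light that left the object earlier,
% because the n-th image's light has made n extra trips between the mirrors.
% With mirror separation d, that light is delayed by roughly
\Delta t_n \approx \frac{n\,d}{c}
% so at a single instant the n-th image shows the object displaced by about
\Delta x_n \approx v\,\Delta t_n = \frac{n\,v\,d}{c}
% As v approaches c, \Delta x_n approaches n d: yes, the reflections would
% show the object at visibly different positions at the same instant.
```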
 
  • Haha
  • Love
  • Like
Reactions: 4 users

keyeat

Regular
  • Like
Reactions: 3 users

Diogenese

Top 20
  • Haha
  • Fire
  • Thinking
Reactions: 5 users

TECH

Regular
"Hey Akida, Zoom and enhance image"
Just like CSI

Hi All.....It's 100% the little TIGER with its head turned to the left as we see the photo....don't bother Akida, he's too busy with customers
and potential customers, and yes he's a he....named after the father :ROFLMAO::ROFLMAO::ROFLMAO:
 
  • Like
  • Haha
Reactions: 8 users
Here's a thought:

If an object moving at close to light speed passes between a pair of opposed mirrors, would each reflection show the object in different places at one instant?
Would it not still be invisible to the human eye???
 
  • Like
Reactions: 2 users

Worker122

Regular
  • Fire
  • Like
Reactions: 2 users