BRN Discussion Ongoing

wilzy123

Founding Member
I think you are reading a lot more into the "trolls" than you need to. I run a successful business, and if I asked my staff to use social media to post about a meeting I had with someone and they complied but put up a photo of an entirely different person, I would be well pissed. Yes, we all make typos and mistakes, but it doesn't mean we don't try to do better next time. You have posted maybe 3 or 4 times in the recent past about this, and every one of those posts has involved an error. How many is okay?

Will it stop investors? No. (Does that make it okay? No.)
Will it stop a partnership? No. (Does that make it okay? No.)
Is it the first time a stupid error has been made? No.

Was the podcast a good endorsement for BRN? Yes.

If I was Sean, I would be having a chat with whoever made the error. Sorry, just not professional.

TBH, I'd rather read about CES than see another 50 shitposts about an error.
 
Reactions: 23 users

skutza

Regular
Should I rage-sell all of my shares? yes or no
With a stupid comment like that, yes, you should never invest if that's how you think.
 
Reactions: 4 users

skutza

Regular
TBH, I'd rather read about CES than see another 50 shitposts about an error.
Well lucky mine was a good post then, I'd hate to have given you a shit one. 😛
 
Reactions: 3 users

Damo4

Regular
With a stupid comment like that, yes, you should never invest if that's how you think.
Ok, thanks for the financial advice.
Going by the picture.
View attachment 54052
View attachment 54053
I believe we will see a podcast with Spencer Huang from Edge Impulse 😀. Just my speculation 🥴🥴🥴

View attachment 54056

Learning 🪴

Ooh great find.
I think in the last one yesterday, they said it was the final podcast from the 9th (Vegas time).
I presume it takes nearly a day to edit and post to socials, but we could be in for quite a few each day!
 
Reactions: 7 users

Iseki

Regular
Memories light the corners of my mind. 🎶

Originally Posted by Daimler Media Center, 4 Jan 2022

EXTRACT ONLY

View attachment 54057
Funny that, because in the CLA Class it's all about efficiency too.

I wonder what Gerrit Ecke meant when he said the CLA Class carries the genes of the Vision EQXX?

View attachment 54060



View attachment 54059
#15Minutes looks a bit optimistic.
 

skutza

Regular
#15Minutes looks a bit optimistic.
He means 15 minutes of charging gives you 400 km, not that you can travel 400 km in 15 minutes. But I hope you already knew that.
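For anyone wanting to sanity-check that claim, here is a back-of-envelope sketch of what "400 km added in 15 minutes" implies for charging power. The consumption figure (15 kWh per 100 km) is an assumed round number for an efficient EV, not something stated in this thread:

```python
# Back-of-envelope check of "15 minutes of charging adds 400 km of range".
# Assumption (not from the thread): ~15 kWh/100 km consumption,
# i.e. 0.15 kWh per km, typical of an efficient EV.
range_added_km = 400
charge_time_h = 15 / 60            # 15 minutes expressed in hours
consumption_kwh_per_km = 0.15      # assumed efficiency

energy_added_kwh = range_added_km * consumption_kwh_per_km   # energy needed
avg_charge_power_kw = energy_added_kwh / charge_time_h       # implied power

print(f"~{energy_added_kwh:.0f} kWh added at ~{avg_charge_power_kw:.0f} kW average")
```

So the claim implies roughly a 240 kW average charging rate under that assumed efficiency, which is why it needs a high-voltage fast-charging architecture rather than implying anything about driving 400 km in 15 minutes.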
 
Reactions: 3 users
Should I rage-sell all of my shares? yes or no
1705023544113.gif
 
Reactions: 4 users

DK6161

Regular

Iseki

Regular
He means 15 minutes of charging gives you 400k, not you can travel 400km in 15 minutes. But, I hope you already knew that.
D'oh!
Okay, I get it now!
 

wilzy123

Founding Member
^^ speaking of shitposts, @Iseki and @DK6161, I can't recall seeing this picture posted from CES showing both visual wake word and keyword spotting demos, a collaborative effort between BrainChip and Microchip, from here: https://brainchip.com/ces-2024

1.png
 
Reactions: 43 users
Last edited:
Reactions: 8 users

skutza

Regular
I wonder what the first thing is that springs to mind when you see this photo?

View attachment 54065
Nope, my first thought was: looks great, let's hope people in the market think the same and use it!!!!

I'm just not understanding how we have this number of partnerships but are still lagging in sales. Lumpy sales; we've seen the low, here's hoping the next 4C is the big lump we are waiting for.
 
Reactions: 7 users

Boab

I wish I could paint like Vincent
Reactions: 15 users

IloveLamp

Top 20

Screenshot_20240112_124448_LinkedIn~2.jpg
 
Reactions: 7 users

TheFunkMachine

seeds have the potential to become trees.
Looking very promising. Great findings, Bravo! With the comments from the Infineon man in the recent podcast about having worked with BrainChip for over a year, and how excited he is, this makes this theory within the realms of possibility!

I did notice, however, that in the picture below we have SpiNNaker named on the screen, which also focuses on SNN technology. When was this photo taken in comparison with the one-year timeline mentioned by old mate on the podcast?

Could it be that Infineon have been doing research with SpiNNaker but have chosen AKIDA for commercial integration into their radar solution?
 
Last edited:
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Along with my previous post, I'm convinced. I literally CANNOT see NVIDIA doing this without us.





Yahoo Finance Video

Nvidia partners with Mercedes to pursue self-driving cars

Yahoo Finance
Fri, January 12, 2024 at 9:35 AM GMT+11



AI is the talk of CES 2024 (Consumer Electronics Show) in Las Vegas as the top developers and manufacturers unveil what their large-language models can achieve, especially on the road. Yahoo Finance’s Dan Howley sits down with Nvidia Vice President of Automotive Danny Shapiro to discuss the chipmaker's autonomous driving projects.
Nvidia (NVDA) announced a partnership with Mercedes-Benz to hone self-driving capabilities. Starting with an assisted driving feature, Shapiro believes the technology through this partnership will “add greater and greater levels of autonomy” until full self-driving is achieved.
“Safety has got to be the top priority,” Shapiro insists, suggesting that a timeline for self-driving cars is hard to gauge.
Going beyond in-car technology, Shapiro states that the company’s generative AI is “leveraging the exact same data that is used to build the car,” by offering an automotive configurator that allows consumers to get the actual experience of the car before buying.
Click here to view more of Yahoo Finance's CES 2024 coverage this week, or you can watch this full episode of Yahoo Finance Live here.
Editor's note: This article was written by Eyek Ntekim.

Video Transcript

DAN HOWLEY: CES 2024 is in full swing. And right now, we're speaking with NVIDIA VP of Automobiles Danny Shapiro. Danny, thank you so much for joining us. I guess, you know, obviously, the big theme at CES 2024 is AI. So how does that fit into NVIDIA's automotive strategy?
DANNY SHAPIRO: That's a great question. We're super excited to be here in the Mercedes-Benz booth. And what we're showing behind me actually is the new CLA concept, which is going to be the first vehicle from Mercedes-Benz with NVIDIA DRIVE inside.
So that's our AI platform for automated driving, driver assistance, all kinds of convenience features. So we're basically bringing the type of AI from the cloud that we're used to seeing right into the car, processing the sensor data, and making the vehicle much safer to be in.

DAN HOWLEY: Yeah, I think that's one of the interesting things to point out, right, is we all talk about generative AI in general. But self-driving cars-- or self-driving technology only exists because of AI.
DANNY SHAPIRO: That's absolutely right. There's a massive amount of data that's being generated from all the cameras on the car, the radar, now LiDAR on this vehicle. And that has to be processed in real time. So that's where NVIDIA comes in, providing those horsepower to take all that data and make sense of it and understand exactly where the lanes are, where the potential hazards are to be able to read signs, detect the lights.
And so we're bringing that out now to make these vehicles safer, to be an assistance feature for them. But they're software updatable vehicles. So over time, we're developing the software with Mercedes and all of our auto customers to be able to then add greater and greater levels of autonomy. And eventually, we'll get to self-driving.
DAN HOWLEY: I guess when it comes to self-driving, is there a thought on when that might come? I know it's always the, you know, I guess, billion dollar question that everybody's kind of bandying about. And the early prognosticators had said, oh, it'll be here in no time. But, obviously, it's a little bit longer than that probably.
DANNY SHAPIRO: Absolutely. You know, so this is a challenge we've been working on for well over a decade. It's something I think the entire industry underestimated the complexity. And the reality is safety has to be the top priority. It is for us. It is for so many of our partners. And we need to make sure we get it right.
So while these estimates were put out there initially, we realized we underestimated the complexity. And so we're focused on making sure that before we put anything out on the road, that it's tested and validated for every possible scenario.
So this is where another aspect of NVIDIA comes in. We can use simulation technology for the creation of the AI but also for testing and validating that AI and making sure that in all types of lighting conditions, all kinds of weather conditions, many different scenarios, the kinds of things that don't happen very often, it's hard to train for. So we can use AI and create synthetic data to understand what possibly could happen and make sure the car will react appropriately.
DAN HOWLEY: Is that something similar to the Omniverse, the digital twin kind of idea that NVIDIA has been working on where, yeah, you can build these digital--
DANNY SHAPIRO: Absolutely.
DAN HOWLEY: --versions of objects, factories, things like that?
DANNY SHAPIRO: You're absolutely right. So we really are able to apply this across the entire workflow within the auto industry from the designers that can use Omniverse and create essentially digital versions of the vehicle, right? That's part of the design process. But then using Omniverse, that exact same model becomes part of what goes into the engineering team, then can go into the manufacturing side where we can create a virtual factory, a digital twin of the factory.
It's modeling every aspect of the factory-- the robots, the conveyors, the other employees working inside the factory. We can model all that, optimize it, and make sure it all works before the factory is even built. And then we can even extend that model beyond into the retail side using all that same data to create a virtual retail or showroom experience. People can customize their car; choose different materials, the interior trims, different wheels; and even take it on a virtual test drive.
So all of that simulation then is a very valuable tool throughout that whole workflow. So in addition to testing and validating the AVs, it really applies to all the mechanical, physical, and even sort of retail and service extensions of that whole workflow.
DAN HOWLEY: So I want to ask you about generative AI. Obviously, it was the huge theme of 2023. Still going to be a big theme into 2024. And I want to get your thoughts on how that kind of fits into the automotive side of things for NVIDIA as well.
DANNY SHAPIRO: Yeah, so generative AI, we've just started. And I think the important thing to recognize, it's not just about text in and text out. That's, you know, what ChatGPT kind of started. It has amazing capabilities. There's a lot of room for improvement-- of course, something that's trained on, just a vast array of data. Some of it real. Some of it not-- means the results aren't going to always be accurate.
So what we're doing is putting together tools in place to be able to curate data and be able to make sure that if you're going to talk to your Mercedes, you wanted to make sure it has accurate information. So Mercedes can train this large language model with the history of Mercedes vehicles with all the information about the CLA concept, the manual, the service manual, whatever it is so that when you have a dialogue with that vehicle, it comes back with the right answer.
But beyond that, we're using generative AI for other data streams that we can put text in and imagery out, text in and video out. Could be video in and text out. So there's so many different ways that generative AI can be used.
Imagine, we have an automated vehicle. The front-facing camera is taking in 30 frames a second of video. We can then use a large language model to convert the pixels in that video into an explanation of what's happening in the scene.
So basically, the car can explain to you why it's making certain driving decisions or tell you what's going on in the scene to improve your trust and confidence in the system or provide alerts that mean something other than just the beep. So there's so many different ways that generative AI is really helping the auto industry, from a designer that may do a sketch, and the generative AI will create a 3D model and different permutations on that. It becomes a copilot for them, an assistant that is able to make their job and their productivity much better.
So it's not going to take their job away. But it's going to make them more productive and create higher quality results. And in the case of all the safety systems, all these tools are going to increase the safety inside the vehicle.
DAN HOWLEY: You know, on the automotive side and almost kind of the automotive consumer side, I know NVIDIA also introduced an automotive configurator. It's kind of the idea of being able to build your car on a company's website but in a more advanced way. So can you just kind of explain that to us?
DANNY SHAPIRO: Sure. So we're leveraging the exact same data that's used to build the car and putting those models into more of a marketing role as opposed to engineering role. We can photorealistically render it.
People can choose all different aspects of their car, kind of create their dream car. Maybe on their PC, they're doing it. Maybe even on a VR headset experience, look all around, see what that vehicle is going to be like to drive. And maybe even using our simulation technology to take it on a virtual test drive in a digital twin of the city that they live in or see what it looks like parked in their driveway.
So generative AI is going to help create all these different kinds of scenes and be able to help the automakers increase, you know, the types of options that are added to the vehicle. Because if people can kind of see it and maybe experience it, we could simulate different features and functions in the car actually working. And people could add that to their cart as they're ordering their vehicle.
DAN HOWLEY: And, you know, one of the things that NVIDIA is highlighting is just the breadth of different automakers that you work with. I guess, how do you see those relationships continuing to grow over time?
DANNY SHAPIRO: Well, you know, we're working with hundreds of automakers, truck makers, robotaxi companies, shuttle companies and then-- so the whole ecosystem, the tier one suppliers, the sensor companies, the mapping companies, a lot of software developers, they're all building on the NVIDIA DRIVE platform. So we've created this open system such that it's not a fixed function but rather a supercomputer that delivers the horsepower, the computing horsepower to run the software that's required today but also with headroom so that it can continue to evolve and develop in the future.
So a year from now, we might be sitting here at CES talking about all kinds of new AI technology that no one's thought of yet. But we'll be able to update the software in the car to add those new features and capabilities.
DAN HOWLEY: Yeah, unfortunately, you can't do that for my '07 Mustang. That's not really very well connected. But Danny Shapiro, VP of automotive at NVIDIA, thank you so much for joining us.
DANNY SHAPIRO: Thanks, Dan. It's great to be with you.

 
Last edited:
Reactions: 37 users

Damo4

Regular
Reactions: 2 users
^^ speaking of shitposts, @Iseki and @DK6161, I can't recall seeing this picture posted from CES showing both visual wake word and keyword spotting demos, a collaborative effort between BrainChip and Microchip, from here: https://brainchip.com/ces-2024

View attachment 54063
Hi Wilzy
What I find intriguing is the two piles of what look like booklets. I think I can read the titles well enough to say the one in the foreground is devoted to Visual Wake Word; the one further back must then be (Audio) Keyword Spotting. For those who know how this type of demo works: are they takeaways, or just for walking customers through during the demo? Or do customers scan the barcode on the front and get some sort of material, minus the secret sauce?
Regards
Fact Finder
 
Reactions: 14 users
Hi All
Bit surprised no one has commented on the final podcast for day one of CES 2024 between Nandan and Teksun’s founder:



I thought it was pretty encouraging particularly when he disclosed being partnered with Qualcomm and Renesas and that he is very keen on using AKIDA technology because of its capacity to scale and run multiple sensors such as cameras in automotive and security applications.

If you are bored, there are quite a few days of podcasts still to go, as day one alone produced five.

My opinion only DYOR
Fact Finder

Hi @Fact Finder. He also explained very eloquently the number of steps needed to get solutions deployed. It really helped me understand the complexities involved in getting it right, which is critical to ongoing commercial success. I thought this one was a standout podcast!
Cheers
Maxanne
 
Reactions: 17 users