BRN Discussion Ongoing

**Insert dad joke** :ROFLMAO:

Geez... her name might be hard to forget.
Sorry, did I forget to mention someone? 🤡😵‍💫
 
  • Haha
  • Like
Reactions: 4 users
Good morning all,

Inspired by a teardown article yesterday, I had a closer look at the specs of the GoPro 10 camera.

One thing I found interesting is that they removed the "DSP Group DBMD4" that was present in the GoPro 9 camera.
The DBMD4 is an "Ultra Low Power Always-On Voice Activation for Any Device" audio DSP pre-processor.

A customer commented on the GoPro forums that this was due to the camera often accidentally powering on, low user interest, and the chip shortage.


That led me to the DSP Group YouTube channel, and I would like to hear from someone with more expertise about how AKIDA and DSP Group compare on keyword-spotting benchmarks:






By the way, DSP Group has been acquired by Synaptics, which was discussed here as a competitor earlier this year.

All I know is there weren’t any marks on the bench when I finished, so it happened after I left. 😎
 
  • Haha
Reactions: 4 users

stuart888

Regular
Just clicking the YouTube search reveals a lot about life.

Some of the top tech people here probably have a nice list. What ya got over there? Just having some fun.

[Screenshot of YouTube search]
 
  • Haha
  • Like
Reactions: 11 users
In my opinion, BrainChip AKIDA IP isn't in this version but will be in the new MB.OS system in 2024.
Which will really be in 2023.
 
  • Like
Reactions: 7 users
I am not sure if this was for me or Cardpro, but either way, trade secret or not, it makes clear that the ASX can decide that it should be disclosed.

When a company makes an announcement, it takes the risk that the ASX will intervene and require the disclosure of information that the company wants, needs, or has agreed to keep confidential. See section 3.1A.2 of Guidance Note 8.

Of course, even in the absence of an announcement, if the ASX becomes aware from any source that a company may be sitting on material information, it can intervene and force disclosure.

My opinion only DYOR
FF

AKIDA BALLISTA
Though we know the ASX is not very proactive, and it is barely covering its basic duties as it is. A lot of companies are getting away with a lot of different things, unfortunately.
 
  • Like
  • Love
Reactions: 4 users

Slade

Top 20
View attachment 15381
Is that a new whitepaper, "Learning how to learn neuromorphic AI at the edge", released on the BRN website, or did I miss its release in the past?
This is a great read. Thanks for posting @Dozzaman1977

Q: BrainChip customers are already deploying smarter endpoints with independent learning and inference capabilities, faster response times, and a lower power budget. Can you give us some real-world examples of how AKIDA is revolutionizing AI inference at the edge?

Peter:
Automotive is a good place to start. Using AKIDA-powered sensors and accelerators, automotive companies can now design lighter, faster, and more energy efficient in-cabin systems that enable advanced driver verification and customization, sophisticated voice control technology, and next-level gaze estimation and emotion classification capabilities.

Revolutionizing AI inference at the edge

In addition to redefining the automotive in-cabin experience, AKIDA is helping enable new computer vision and LiDAR systems to detect vehicles, pedestrians, bicyclists, street signs, and objects with incredibly high levels of precision. We’re looking forward to seeing how these fast and energy-efficient ADAS systems help automotive companies accelerate the rollout of increasingly advanced assisted-driving capabilities. In the future, we’d also like to power self-driving cars and trucks. But we don’t want to program these vehicles. To achieve true autonomy, cars and trucks must independently learn how to drive in different environmental and topographical conditions such as icy mountain roads with limited visibility, crowded streets, and fast-moving highways.

Enabling advanced LiDAR with AKIDA: Neuromorphic AI inference at the edge

With AKIDA, these driving skills can be easily copied and adapted by millions of self-driving vehicles. This is a particularly important point. AKIDA driving updates will be based on real-world knowledge and skills other cars have learned on the road. We want to avoid boilerplate firmware updates pushed out by engineering teams sitting in cubes. A programmed car lacking advanced learning and inferential capabilities can’t anticipate, understand, or react to new driving scenarios. That’s why BrainChip’s AKIDA focuses on efficiently learning, inferring, and adapting new skills.

Q: Aside from automotive, what are some other multimodal edge use cases AKIDA enables?

Peter:
Smart homes, automated factories and warehouses, vibration monitoring and analysis of industrial equipment, as well as advanced speech and facial recognition applications. AKIDA is also accelerating the design of robots using sophisticated sensors to see, hear, smell, touch, and even taste.​
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 50 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
In my opinion brainchip AKIDA IP isn't in this version but will be in the new MB OS system in 2024.


Hi @TasTroy77, I'd be interested to know a little more about why you're ruling it out completely.

In the information I posted yesterday, it's clear that Mercedes are taking incremental steps with the introduction of the new operating system. It was reported that "From 2022 to 2023, Mercedes-Benz will be equipped with a lightweight version of the MB.OS operating system on the next generation of new E-class models, before the full version of the MB.OS operating system is launched in 2024". Just because they refer to it as "lightweight" does not necessarily mean it won't include Akida IMO.

On the contrary, I think the signs are all pointing to the inclusion of Akida, particularly when there are references made about advanced voice control.

If we just look at the voice control feature in isolation, another example can be found in the EQS SUV where there are five different equipment packages available. One of the 5 packages is called the Interior Assistant and it includes "the latest iteration of voice control capable of recognizing gestures and it even knows which way the driver is looking thanks to an array of cameras."

The way I understand it, "the latest iteration of voice control" is the Vision EQXX's voice control system, which is five to ten times more efficient than conventional voice control thanks to BrainChip's AKIDA.

Would love to hear your thoughts.




EQE with "advanced voice control"


View attachment 15384




EQS with "the latest iteration of voice control"

View attachment 15387



 
  • Like
  • Fire
  • Love
Reactions: 33 users
Though we know the ASX is not very proactive, and it is barely covering its basic duties as it is. A lot of companies are getting away with a lot of different things, unfortunately.
And it stinks. I have seen this type of enforcement across so many areas. When enforcement organisations are failing, they try to disguise their failure by selecting high-visibility targets that get them publicity.

Brainchip has a high profile thanks to the likes of MF and the AFR, so the ASX targets them and other high-visibility new companies while ignoring the rest, including the big boys with massive financial and legal resources to throw at the regulator should it say boo.

Quite a few years ago now, an under-resourced NSW Police Fraud Squad was directed not to investigate any fraud under half a million dollars, a threshold that was increased a couple of years later to one million dollars. Anyone coming to them about a $999,999.99 fraud was told to see their local detectives, who had no resources, training or expertise to investigate such crimes. Needless to say, if you were a fraudster, you sensibly kept your fraud under these limits.

The NSW Police Fraud Squad became known as the departure lounge, because the big accounting firms saw a gap in the fraud market and were head-hunting its experienced officers to carry out investigations for clients when a fraud did not meet these thresholds. The primary objective then became recovery, not prosecution.

The whole ASX regulatory system is flawed.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Wow
  • Fire
Reactions: 33 users

TechGirl

Founding Member
View attachment 15381
Is that a new whitepaper, "Learning how to learn neuromorphic AI at the edge", released on the BRN website, or did I miss its release in the past?

Hi Dozz,

Yes, it's new. Great find! I checked the document properties for the creation date, and it was created on 24 Aug 2022.

See below screenshot.

View attachment 15388
 
  • Like
  • Fire
  • Love
Reactions: 38 users

Dozzaman1977

Regular
Hi Dozz,

Yes, it's new. Great find! I checked the document properties for the creation date, and it was created on 24 Aug 2022.

See below

View attachment 15388

Cheers TechGirl
Chapter 3 is exciting. It's all been stated before, but hopefully the world is moving in the Akida direction!!!!

Q: With AKIDA’s neuromorphic architecture, BrainChip is enabling the semiconductor industry to untether edge AI from cloud data centers. This is quite timely, because conventional AI silicon and cloud-centric inference models aren’t performing efficiently at the edge, even as the number of edge-enabled IoT devices is expected to hit 7.8 billion by 2030. Can you elaborate on the notion of untethering?

Peter: Increasing internet congestion is increasing latency as more edge devices upload their data. The power consumption and heat production of massively parallel Von Neumann type processors is also increasing linearly with the computing power required by AI applications. That’s why untethering edge AI from the cloud with AKIDA is a critical step to designing faster and more environmentally sustainable endpoints.
Chapter 3: Untethering edge AI from cloud data centers

Differentiating intelligent endpoint requirements:

Data Center / Server:
  • Power intensive
  • High latency
  • Huge memory requirement
  • Big data inference
  • High bandwidth
  • Privacy concerns

Edge Endpoint (The Edge):
  • Power efficient
  • Ultra-low latency
  • Small memory requirement
  • Small data inference, one-shot learning
  • Low bandwidth
  • On-chip, in-device
  • Privacy enabling

Data centers hosting cloud-based workloads emitted an estimated 600 megatons of greenhouse gases in 2020 alone, more than the consumption of the entire United Kingdom. Unless something radically changes, data centers will consume over 20% of the world’s energy by 2050! With its on-chip learning and low-power, high-throughput inference capabilities, we believe AKIDA can help reduce data center carbon emissions by 98% by decentralizing AI processing. Intelligently analyzing data on-chip will help put an end to the yottabytes of raw, unprocessed, and mostly irrelevant data sent to cloud data centers by millions of endpoints, solving the impending internet congestion problem.

Using image recognition as an example, we can quantify the power savings enabled by AKIDA’s on-chip capabilities compared to a GPU in today’s data center. Specifically, AKIDA can efficiently analyze and categorize the 1.2 million images of the ImageNet dataset with a minimal power budget of 300 milliwatts. A GPU performing this task consumes up to 300 watts! This huge difference illustrates why simply scaling down conventional AI hardware to meet the unique requirements of edge endpoints is insufficient.
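
To put those two power figures in perspective, here is a minimal back-of-the-envelope sketch in Python. Only the 300 mW, 300 W, and 1.2 million image figures come from the whitepaper; the throughput value is a hypothetical placeholder, since the document quotes power rather than images per second.

```python
# Back-of-the-envelope energy comparison using the whitepaper's power figures.
# Only the power numbers and dataset size come from the document; the
# throughput is a hypothetical placeholder to show how energy-per-task
# would be derived from them.

AKIDA_POWER_W = 0.3      # 300 milliwatts (whitepaper figure)
GPU_POWER_W = 300.0      # up to 300 watts (whitepaper figure)
IMAGES = 1_200_000       # ImageNet dataset size (whitepaper figure)
IMAGES_PER_SECOND = 100  # hypothetical throughput, assumed equal for both

def task_energy_wh(power_w: float) -> float:
    """Energy in watt-hours: power x (images / throughput), seconds -> hours."""
    return power_w * (IMAGES / IMAGES_PER_SECOND) / 3600

print(f"AKIDA: {task_energy_wh(AKIDA_POWER_W):.2f} Wh")
print(f"GPU:   {task_energy_wh(GPU_POWER_W):.2f} Wh")
print(f"Ratio: {GPU_POWER_W / AKIDA_POWER_W:.0f}x")  # 1000x at equal throughput
```

At equal throughput the energy gap is simply the power ratio, 300 W / 0.3 W = 1000x, which is the point the whitepaper is making.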
 
  • Like
  • Love
  • Fire
Reactions: 32 users

Slade

Top 20
This is a great read. Thanks for posting @Dozzaman1977

Q: BrainChip customers are already deploying smarter endpoints with independent learning and inference capabilities, faster response times, and a lower power budget. Can you give us some real-world examples of how AKIDA is revolutionizing AI inference at the edge?

Peter:
Automotive is a good place to start. Using AKIDA-powered sensors and accelerators, automotive companies can now design lighter, faster, and more energy efficient in-cabin systems that enable advanced driver verification and customization, sophisticated voice control technology, and next-level gaze estimation and emotion classification capabilities.

Revolutionizing AI inference at the edge

In addition to redefining the automotive in-cabin experience, AKIDA is helping enable new computer vision and LiDAR systems to detect vehicles, pedestrians, bicyclists, street signs, and objects with incredibly high levels of precision. We’re looking forward to seeing how these fast and energy-efficient ADAS systems help automotive companies accelerate the rollout of increasingly advanced assisted-driving capabilities. In the future, we’d also like to power self-driving cars and trucks. But we don’t want to program these vehicles. To achieve true autonomy, cars and trucks must independently learn how to drive in different environmental and topographical conditions such as icy mountain roads with limited visibility, crowded streets, and fast-moving highways.

Enabling advanced LiDAR with AKIDA: Neuromorphic AI inference at the edge

With AKIDA, these driving skills can be easily copied and adapted by millions of self-driving vehicles. This is a particularly important point. AKIDA driving updates will be based on real-world knowledge and skills other cars have learned on the road. We want to avoid boilerplate firmware updates pushed out by engineering teams sitting in cubes. A programmed car lacking advanced learning and inferential capabilities can’t anticipate, understand, or react to new driving scenarios. That’s why BrainChip’s AKIDA focuses on efficiently learning, inferring, and adapting new skills.

Q: Aside from automotive, what are some other multimodal edge use cases AKIDA enables?

Peter:
Smart homes, automated factories and warehouses, vibration monitoring and analysis of industrial equipment, as well as advanced speech and facial recognition applications. AKIDA is also accelerating the design of robots using sophisticated sensors to see, hear, smell, touch, and even taste.​
Tell me this isn't Valeo's next-generation LiDAR: "Enabling advanced LiDAR with AKIDA: Neuromorphic AI inference at the edge".
 
  • Like
  • Love
  • Fire
Reactions: 27 users
Hi Dozz,

Yes, it's new. Great find! I checked the document properties for the creation date, and it was created on 24 Aug 2022.

See below screenshot.

View attachment 15388
Yes, brand new, and I have reported a fault with the security access via my iPhone which prevents me from getting through to the document. The thing I find funny is that Ken the Robot, with AKIDA, can pass all these select-a-boat-or-bus-or-plane tests they throw up to keep robots out. I have mentioned the humour in this to the company as well. LOL

Regards
FF


AKIDA BALLISTA
 
  • Like
  • Haha
Reactions: 13 users

Dozzaman1977

Regular
Tell me this isn't Valeo's next-generation LiDAR: "Enabling advanced LiDAR with AKIDA: Neuromorphic AI inference at the edge".
You're 100% on the money.
 
  • Like
  • Love
  • Fire
Reactions: 17 users
Tell me this isn't Valeo's next-generation LiDAR: "Enabling advanced LiDAR with AKIDA: Neuromorphic AI inference at the edge".
Hi @Slade

You are a nice person and as much as I would like to comply with your request I am afraid I just cannot bring myself to tell such a terrible lie.

Regards
FF

AKIDA BALLISTA
 
  • Haha
  • Like
Reactions: 23 users

clip

Regular
Exactly. Nintendo currently has only the Switch console in its program. An update is coming in 2023; its technical details were leaked earlier this year, and there is no Akida inside. According to the usually well-informed games scene, a new console will not appear earlier than 2026. When a new console with new in-game functionalities arrives, it first needs new games that can handle the new features. Games have a very long development cycle, and there is no way to speed this up.
So anything about controller updates is much closer to reality than speculation about a brand-new console containing Akida IP coming soon.

I don't think anyone could say 100% that there won't be Akida inside the 2023 update. I understand what you're saying about the fact that new games may have to be developed to utilize Akida to its fullest capacity, and that the development time frame for creating new games is particularly long.

But there's no way of knowing that Akida won't be in the upgrade to assist with other things like AI, self-learning features, noise reduction, etc., whilst new games are being developed. Just a thought?

Agree with both.

Also, for more potential and functionality, the new console in my opinion will need more sensors, like a camera or a microphone, which the 2023 update likely won't have.
And I also think that new games don't necessarily have to be developed with AKIDA from the beginning.
Maybe Nintendo will put AKIDA inside the updated console and then figure out how it can be used for games in the future.
 
  • Like
  • Fire
  • Thinking
Reactions: 8 users

TechGirl

Founding Member
Cheers TechGirl
Chapter 3 is exciting. It's all been stated before, but hopefully the world is moving in the Akida direction!!!!

Q: With AKIDA’s neuromorphic architecture, BrainChip is enabling the semiconductor industry to untether edge AI from cloud data centers. This is quite timely, because conventional AI silicon and cloud-centric inference models aren’t performing efficiently at the edge, even as the number of edge-enabled IoT devices is expected to hit 7.8 billion by 2030. Can you elaborate on the notion of untethering?

Peter: Increasing internet congestion is increasing latency as more edge devices upload their data. The power consumption and heat production of massively parallel Von Neumann type processors is also increasing linearly with the computing power required by AI applications. That’s why untethering edge AI from the cloud with AKIDA is a critical step to designing faster and more environmentally sustainable endpoints.
Chapter 3: Untethering edge AI from cloud data centers

Differentiating intelligent endpoint requirements:

Data Center / Server:
  • Power intensive
  • High latency
  • Huge memory requirement
  • Big data inference
  • High bandwidth
  • Privacy concerns

Edge Endpoint (The Edge):
  • Power efficient
  • Ultra-low latency
  • Small memory requirement
  • Small data inference, one-shot learning
  • Low bandwidth
  • On-chip, in-device
  • Privacy enabling

Data centers hosting cloud-based workloads emitted an estimated 600 megatons of greenhouse gases in 2020 alone, more than the consumption of the entire United Kingdom. Unless something radically changes, data centers will consume over 20% of the world’s energy by 2050! With its on-chip learning and low-power, high-throughput inference capabilities, we believe AKIDA can help reduce data center carbon emissions by 98% by decentralizing AI processing. Intelligently analyzing data on-chip will help put an end to the yottabytes of raw, unprocessed, and mostly irrelevant data sent to cloud data centers by millions of endpoints, solving the impending internet congestion problem.

Using image recognition as an example, we can quantify the power savings enabled by AKIDA’s on-chip capabilities compared to a GPU in today’s data center. Specifically, AKIDA can efficiently analyze and categorize the 1.2 million images of the ImageNet dataset with a minimal power budget of 300 milliwatts. A GPU performing this task consumes up to 300 watts! This huge difference illustrates why simply scaling down conventional AI hardware to meet the unique requirements of edge endpoints is insufficient.

Mind-blowing stuff:

- AKIDA: 300 milliwatts, compared to up to 300 watts for a GPU performing the same task!

- With its on-chip learning and low-power, high-throughput inference capabilities, we believe AKIDA can help reduce data center carbon emissions by 98% by decentralizing AI processing.

WOW

Minions Mic Drop GIF
 
  • Like
  • Fire
  • Love
Reactions: 85 users

Learning

Learning to the Top 🕵‍♂️
Consider the communications possibilities with ARM, MegaChips, and LON systems:

MegaChips


The Smart City Blog​


Smart Building Network Based on HD-PLC​

HD-PLC is the new standard for high-speed wireline networks. Based on the IEEE 1901 and ITU-T G.9905 international standards, HD-PLC offers megabit speeds over up to several kilometers of wiring (AC/DC powerlines, twisted-pair, coax, phone lines, etc.). The combination of high-speed, long-distance communication, and physical media flexibility makes it well suited for a wide variety of building automation systems (BAS).
In this article, we review an example smart building network built using HD-PLC. This network can integrate multiple systems found in modern smart buildings, including IP cameras, environmental sensors, security systems, building comfort systems, lighting, and more. Thanks to HD-PLC’s multi-hop technology, any node can act as a repeater to extend network range and robustness. The network can scale to support up to 1024 nodes, distributed across the building, using any wiring available.
[Illustration: smart building network diagram]

Support High-Bandwidth Applications​

Originally developed for multimedia networks in residential applications, HD-PLC supports blazing-fast megabit data rates (up to 240Mbps, PHY) over powerlines and other physical media. It has since been enhanced with industrial-grade robustness mechanisms to ensure reliable operation in smart building systems. These characteristics make it well-suited for IP camera networks.
The left side of the network diagram shows a standard CCTV surveillance application using HD-PLC CCTV-IP cameras. HD-PLC uses IP-based communications (IPv4 or IPv6 multicasting) to encapsulate packets as UDP messages. This means that IP cameras are able to draw power from the main supply and also use it to transfer high-definition video to the central surveillance station. This simplifies the deployment of IP-based CCTV systems, since you don’t need to install new wiring to support the bandwidth required by these cameras.
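
For anyone curious what that IP encapsulation looks like in practice, below is a minimal sketch of a UDP multicast sender of the kind a camera node might run; HD-PLC is transparent at this layer, so this is ordinary socket code. The multicast group, port, and payload are my own illustrative assumptions, not anything from the MegaChips stack.

```python
# Minimal UDP multicast sender, sketching the IP-based transport the
# article describes. HD-PLC is transparent at this layer; the group,
# port, and payload below are illustrative assumptions only.
import socket

MCAST_GROUP = "239.1.2.3"  # hypothetical IPv4 multicast group
MCAST_PORT = 5004          # hypothetical port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
# Keep multicast traffic on the local segment (TTL of one hop).
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)

frame_chunk = b"\x00" * 1400  # stand-in for a slice of an encoded video frame
sock.sendto(frame_chunk, (MCAST_GROUP, MCAST_PORT))
sock.close()
```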

Develop Faster, Smarter LON Systems​

In 2017, LonMark adopted HD-PLC as the new standard for high-speed wireline networks. Compared to the previous LonWorks solution that was limited to 78kbps over twisted-pair and just 5.4kbps over power line, HD-PLC brings a significant speed advantage to wired control networks. Now system designers can get the cost savings of using existing wiring infrastructure along with the bandwidth they need to build smarter systems.
The right side of the network diagram illustrates a typical LON building automation scenario. The backbone for this network is provided by the main power line, to which a variety of standard LON systems are attached. HD-PLC is fully compatible with LON, and all of the devices are able to behave exactly as they would in any other LON system.
At the top of the diagram is the LON network server (LNS) with SCADA and BMS clients. LonMaker for Windows is another client that one might use in this network to manage installation and maintenance software. These clients are easily connected to the powerline backbone using an HD-PLC/Ethernet adapter.
Connected below the LNS database are HD-PLC LON nodes. These can be connected directly to the powerline, as in Section A. Alternately, they can be connected to an Easylon® HD-PLC router from Gesytec (Sections B and C), which enables seamless translation from HD-PLC LON to other LonWorks channels like the familiar TP/FT-10 physical layer.
In this smart building network example, HD-PLC replaces the existing LON/IP backbones, which require costly Ethernet cabling. By using the same wiring infrastructure used for power distribution, HD-PLC greatly reduces installation cost. It also improves system reliability, since the backbone is now independent of other IT equipment like Ethernet switches.

Simplify Network Deployment and Management​

System integrators appreciate the simplicity of HD-PLC. Its built-in multi-hop technology takes the guesswork out of network planning and design by providing the plug-and-play simplicity of a mesh network. With this technology, the nodes in the network dynamically calculate route cost and select the best route based on link quality. This eliminates bottlenecks and improves robustness, since the network will automatically reroute traffic if any given node fails.
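
To make the route-cost idea concrete, here is a toy least-cost path selection over link-quality weights. It is a generic Dijkstra sketch in the spirit of the multi-hop routing described above, not the actual ITU-T G.9905 algorithm, and the topology and costs are invented for illustration.

```python
# Toy least-cost routing over link-quality costs, illustrating the idea
# behind multi-hop route selection. Topology and costs are invented;
# this is not the actual G.9905 algorithm.
import heapq

# graph[node] = list of (neighbor, link_cost); lower cost = better link
graph = {
    "node": [("repeater_a", 2.0), ("repeater_b", 5.0)],
    "repeater_a": [("repeater_b", 1.0), ("concentrator", 6.0)],
    "repeater_b": [("concentrator", 1.5)],
    "concentrator": [],
}

def best_route(src: str, dst: str) -> tuple[float, list[str]]:
    """Dijkstra: return (total cost, path) of the cheapest route."""
    queue = [(0.0, src, [src])]
    seen: set[str] = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, link_cost in graph[node]:
            if nbr not in seen:
                heapq.heappush(queue, (cost + link_cost, nbr, path + [nbr]))
    return float("inf"), []

# Cheapest route is via both repeaters: cost 4.5, not the 6.5 direct hop.
print(best_route("node", "concentrator"))
```

If the "repeater_b" link fails, removing it from the graph and re-running the search immediately yields the next-best route, which is exactly the rerouting behaviour the article describes.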
MegaChips HD-PLC solution comes with Network Manager software that makes it easy to configure, monitor, and manage complex networks. The bottom of the network diagram shows how the installation laptop can be connected using an HD-PLC/USB adapter to easily manage the network. Simply plug the HD-PLC/USB adapter into any power outlet, and you’re instantly able to connect to the building automation network.

Getting Started with HD-PLC​

It’s easy to upgrade to megabit data rates; build larger, more robust IIoT networks; and meet new cybersecurity demands with MegaChips’ HD-PLC SoC with Multi-hop. Combining a state-of-the-art analog front-end (AFE) with baseband, physical (PHY), and media access control (MAC) layers into a single compact package, the MegaChips solution enables you to reliably deliver high-speed, bidirectional, IP-based communications over any wiring.
MegaChips provides all the design resources and tools you need to quickly develop and deploy products with HD-PLC. Evaluation kits give you the hardware, software, and documentation to get started. An SDK is also provided with sample firmware and command programs, as well as tools for power control, channel monitoring, and more. And when you’re ready to deploy, MegaChips’ Network Manager makes it easy with an intuitive interface for configuring, monitoring, and managing networks.
Need to get to market even faster? MegaChips has partnered with Gesytec, the leading supplier of LON interfaces, to give you a market-ready HD-PLC module complete with Gesytec’s Easylon Protocol Stack (EPOS).
At the heart of the HD-PLC module is a standard MegaChips EVK running the HD-PLC and LON stacks. To this, Gesytec has connected an ARM Cortex M0-based CPU board to provide additional I/O and run the application-specific code. This simple partitioning enables you to run your application without interfering with device communications. It also gives you the flexibility to choose the right CPU for your application.
The module includes a license for Gesytec’s Easylon IP, a full-featured LON protocol stack that makes integration into LonWorks environments a breeze. Thanks to the proven EPOS LON stack, the HD-PLC LON Platform acts as a standard LON device that can be managed by the LON tools you already know and love, including LonMaker for Windows and other third-party software.
Additional infrastructure devices, such as phase couplers and range extenders, are available as off-the-shelf products from Gesytec and other vendors to facilitate the design of advanced HD-PLC smart building networks.

Ready to Upgrade to HD-PLC?​

Order your evaluation kit today—and discover how easy it is to build smarter, faster, more robust smart building networks with MegaChips.

Michael Navid​

VP, Marketing and Business Development, MegaChips
Michael is an accomplished business executive who has spent the last 15 years working to advance the communications technologies needed to build a smarter planet. The original founder of the G3-PLC Alliance, he was a key contributor in the evolution of G3-PLC as the premier communications technology for smart grids. Today, Michael is applying his experience and energy to bringing the benefits of HD-PLC to smart cities and smart buildings. When he’s not driving technology transformation, you’ll likely find him in one of his vintage cars heading down Pacific Coast Highway in Southern California.
Connect with him on LinkedIn.
Hi FF,
To add to your post, this is one of my early posts from February 2022.

[Screenshots of the February 2022 post]

It's great to be a shareholder.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 30 users

Diogenese

Top 20
Hi @TasTroy77, I'd be interested to know a little more about why you're ruling it out completely.

In the information I posted yesterday, it's clear that Mercedes are taking incremental steps with the introduction of the new operating system. It was reported that "From 2022 to 2023, Mercedes-Benz will be equipped with a lightweight version of the MB.OS operating system on the next generation of new E-class models, before the full version of the MB.OS operating system is launched in 2024". Just because they refer to it as "lightweight" does not necessarily mean it won't include Akida IMO.

On the contrary, I think the signs are all pointing to the inclusion of Akida, particularly when there are references made about advanced voice control.

If we just look at the voice control feature in isolation, another example can be found in the EQS SUV where there are five different equipment packages available. One of the 5 packages is called the Interior Assistant and it includes "the latest iteration of voice control capable of recognizing gestures and it even knows which way the driver is looking thanks to an array of cameras."

The way I understand it, "the latest iteration of voice control" is the Vision EQXX's voice control system, which is five to ten times more efficient than conventional voice control thanks to BrainChip's AKIDA.

Would love to hear your thoughts.




EQE with "advanced voice control"

View attachment 15384



EQS with "the latest iteration of voice control"

View attachment 15387


Hi bravo,

I'm with you on this. I think MB will be keen to get as much field experience in using the new technologies as possible.

When EQXX was first announced, they foreshadowed that optional upgrades would be available in next generation vehicles, and this ties in nicely with the Interior Assistant you mention.

If they are putting in the big screen, I can see no reason why they would hold back the voice assistant (other than component supply).
 
  • Like
  • Love
  • Fire
Reactions: 47 users

buena suerte :-)

BOB Bank of Brainchip

Not sure if already posted ... this vid was put up on the website 7 days ago​


BrainChip demonstrates Regression Analysis with Vibration Sensors​

https://www.youtube.com/embed/6bCTCx7EQUc

In this video, BrainChip demonstrates regression analysis using vibration sensors in a live demo, along with numerous use cases for preventative maintenance, safety, and monitoring.
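
For anyone wondering what "regression analysis with vibration sensors" means in practice, here is a minimal generic sketch of the technique: extract a simple feature (RMS amplitude) from windows of vibration samples and fit a linear model to a measured target such as wear. The synthetic data and NumPy least-squares fit are my own illustration under those assumptions, not BrainChip's demo code.

```python
# Minimal sketch of regression on vibration-sensor data: compute an RMS
# feature per window and fit a linear model to a target quantity.
# Synthetic data for illustration only; not BrainChip's demo code.
import numpy as np

rng = np.random.default_rng(0)

# Simulate 50 windows of vibration samples whose amplitude grows with wear.
wear = np.linspace(0.1, 1.0, 50)                       # ground-truth target
signals = [w * rng.standard_normal(1024) for w in wear]

# Feature: root-mean-square amplitude of each window.
rms = np.array([np.sqrt(np.mean(s**2)) for s in signals])

# Least-squares fit: wear ~ a * rms + b.
A = np.column_stack([rms, np.ones_like(rms)])
(a, b), *_ = np.linalg.lstsq(A, wear, rcond=None)

print(f"wear ~ {a:.2f} * RMS + {b:.2f}")
```

In a deployed preventative-maintenance setup, the fitted model would run against live sensor windows so that a rising predicted-wear value can trigger an inspection before failure.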
 
  • Like
  • Love
  • Fire
Reactions: 29 users
Consider the possibilities:

VALEO GROUP | 14 JUN, 2022 | 5 MIN

Valeo’s third generation LiDAR chosen by Stellantis for its level 3 automation capability​



Stellantis has chosen Valeo's third-generation LiDAR to equip multiple models of its different automotive brands from 2024. The Valeo SCALA 3 LiDAR will enable these vehicles to be certified for level 3 automation, allowing drivers to safely take their hands off the steering wheel and their eyes off the road.



As Yves Bonnefont, Chief Software Officer and member of Stellantis’ Top Executive Team, explains: “What sets cars apart from others today is the driving experience they offer. Thanks to our L3 autonomous driving solution leveraging Valeo’s latest generation LiDAR, we will offer a more enjoyable driving experience and give back time to the driver during their journeys.”
Marc Vrecko, President of Valeo’s Comfort & Driving Assistance Systems Business Group, commented: “A new chapter in driving assistance systems is being written with our partners at Stellantis. Level 3 vehicle automation can only be achieved with LiDAR technology. Without it, some objects cannot be detected. Yet at this level of autonomy, the system’s perception capabilities must be extremely precise. Our third-generation Valeo SCALA LiDAR offers a resolution nearly 50 times that of the second-generation device. The technology comes with unique data collection features, allowing Stellantis to pave the way for new vehicle experiences.”
Valeo’s third-generation LiDAR sees everything, even when an object is far ahead, invisible to the human eye. It can identify objects more than 150 meters ahead that neither the human eye, cameras nor radars can, such as a small object with very low reflectivity, for example a tire lying in the road. It recreates a 3D image of the vehicle’s surroundings using a point cloud, with an as yet unparalleled resolution for an automotive system. It can map the ground topology and detect road markings.
[Image: pixelated LiDAR point-cloud view of two cars, two motorcycles and a scooter in the street]

Valeo LiDAR also features embedded high-performance software based on artificial intelligence algorithms, which enables it to steer the vehicle’s trajectory, anticipating obstacle-free zones on the road ahead. It can self-diagnose and trigger its cleaning system when its field of vision is obstructed. Like all Valeo LiDAR, the technology is automotive grade, meaning that the data it generates remain fully reliable and accurate in all usage and weather conditions (from -40 up to +85°C). It stands as the key component in a sensor system enabling vehicles to achieve approval for SAE conditional driving automation (level 3), meeting the legal requirements of UN-R157.
Valeo’s third generation LiDAR makes driving safer and gives time back to the driver in bothersome driving situations, such as when traveling at low or medium speed in heavy traffic. These challenges are central to the partnership between Stellantis and Valeo. Through its data collection capabilities, this LiDAR will enable new services to be offered to Stellantis’ customers.
Valeo is already the world number one in advanced driver assistance systems (ADAS), equipping one in three new cars globally with its technologies. It was the first, and to date remains the only, company to produce an automotive LiDAR scanner on an industrial scale. More than 170,000 units have been produced, and the technology is protected by more than 500 patents. The Group intends to accelerate even further in this field, as announced in February 2022 with the launch of its Move Up plan, the value creation strategy at the heart of the four megatrends disrupting mobility (electrification, ADAS, reinvention of the interior experience, and lighting).
 
  • Like
  • Fire
  • Love
Reactions: 38 users