BRN Discussion Ongoing

Consider the possibilities:

VALEO GROUP | 14 JUN, 2022 | 5 MIN

Valeo’s third-generation LiDAR chosen by Stellantis for its Level 3 automation capability



Stellantis has chosen Valeo's third-generation LiDAR to equip multiple models of its different automotive brands from 2024. The Valeo SCALA 3 LiDAR will enable these vehicles to be certified for level 3 automation, allowing drivers to safely take their hands off the steering wheel and their eyes off the road.



As Yves Bonnefont, Chief Software Officer and member of Stellantis’ Top Executive Team, explains: “What sets cars apart from others today is the driving experience they offer. Thanks to our L3 autonomous driving solution leveraging Valeo’s latest generation LiDAR, we will offer a more enjoyable driving experience and give back time to the driver during their journeys.”
Marc Vrecko, President of Valeo’s Comfort & Driving Assistance Systems Business Group, commented: “A new chapter in driving assistance systems is being written with our partners at Stellantis. Level 3 vehicle automation can only be achieved with LiDAR technology. Without it, some objects cannot be detected. Yet at this level of autonomy, the system’s perception capabilities must be extremely precise. Our third-generation Valeo SCALA LiDAR offers a resolution nearly 50 times that of the second-generation device. The technology comes with unique data collection features, allowing Stellantis to pave the way for new vehicle experiences.”
Valeo’s third-generation LiDAR sees everything, even when an object is far ahead, invisible to the human eye. It can identify objects more than 150 meters ahead that neither the human eye, cameras nor radars can, such as a small object with very low reflectivity, for example a tire lying in the road. It recreates a 3D image of the vehicle’s surroundings using a point cloud, with an as yet unparalleled resolution for an automotive system. It can map the ground topology and detect road markings.
[Image: pixelated LiDAR view of two cars, two motorcycles and a scooter in the street]

Valeo LiDAR also features embedded high performance software based on artificial intelligence algorithms, which enables it to steer the vehicle’s trajectory, anticipating obstacle-free zones on the road ahead. It can self-diagnose and trigger its cleaning system when its field of vision is obstructed. Like all Valeo LiDARs, the technology is automotive grade, meaning that the data it generates remain fully reliable and accurate in all usage and weather conditions (from -40 up to +85°C). It stands as the key component in a sensor system enabling vehicles to achieve approval for SAE conditional driving automation (level 3), meeting the legal requirements of UN-R157.
Valeo’s third-generation LiDAR makes driving safer and gives time back to the driver in bothersome driving situations, such as when traveling at low or medium speed in heavy traffic. These challenges are central to the partnership between Stellantis and Valeo. Through its data collection capabilities, this LiDAR will enable new services to be offered to Stellantis’ customers.
Valeo is already world number one in advanced driver assistance systems (ADAS), equipping one in three new cars globally with its technologies. It was the first, and to date remains the only, company to produce an automotive LiDAR scanner on an industrial scale. More than 170,000 units have been produced and the technology is protected by more than 500 patents. The Group intends to accelerate even further in this field, as announced in February 2022 with the launch of its Move Up plan, the value creation strategy at the heart of the four megatrends disrupting mobility (electrification, ADAS, reinvention of the interior experience and lighting).
 
Reactions: 38 users
The only problem with that interpretation is that Renesas is paying approximately $800,000 and using AKIDA in intelligent MCUs which will, over their lifetime, sell in the billions of units at 20 to 30 cents apiece. One billion units times a 1-cent royalty is $10 million, yet five times $800,000 is only $4 million.
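That back-of-the-envelope comparison can be laid out in a few lines. The unit volume, the 1-cent royalty and the fee multiple are this post's assumptions, not disclosed BrainChip or Renesas figures:

```python
# Back-of-the-envelope: lifetime royalty revenue versus a multiple of a
# one-off licence fee. All inputs are the post's assumptions, not
# disclosed BrainChip/Renesas figures.

def royalty_revenue(units: int, royalty_cents_per_unit: int) -> float:
    """Total royalty revenue in dollars for a given unit volume."""
    return units * royalty_cents_per_unit / 100

licence_fee = 800_000            # approximate Renesas payment, per the post
fee_multiple = 5 * licence_fee   # "five times 800,000" = $4,000,000

revenue = royalty_revenue(1_000_000_000, 1)  # one billion units at 1 cent

print(f"Royalty on 1B units: ${revenue:,.0f}")       # $10,000,000
print(f"5x the licence fee:  ${fee_multiple:,.0f}")  # $4,000,000
```

At billions of units, the royalty stream dwarfs any small multiple of the upfront fee, which is the point being made.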

My opinion only DYOR
FF

AKIDA BALLISTA
mmm.... as always a good point by FF.
I'll take your interpretation.

I love the MegaChips BOSCH venture too.
I must be a little slow, I didn't realise BOSCH were into smartphones.
Sean's comments re we are talking to a telco: could it have been BOSCH who were directed to MegaChips to meet their needs without 'blowing cover'?
Either way; insert name of telco associated with BRN in the future here...... and Falling Knife will be smug as
 
Reactions: 15 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
In this extract, Paul Karazuba, vice president of marketing at Expedera, identifies power consumption as the major issue to resolve in reaching L5 full autonomy. He reckons a vehicle will require between 1,000 and 3,000 TOPS to reach Level 5 full autonomy.

We all know that reducing power consumption is right up Akida's alley. I wonder how many AKIDAs would be required to achieve 1,000 TOPS, thereby solving Paul's problem?



EXTRACT

Putting all of these new pieces together takes time, however. Paul Karazuba, vice president of marketing at Expedera, believes much needs to be done to get to full autonomy, starting with much faster and more energy-efficient processing of an enormous amount of streaming data. “Simply from a hardware perspective, the platforms that would be required to deploy L5 in a car don’t exist, and nor do today’s solutions realistically scale to meet L5,” he said. “L5 is going to require a certain amount of AI processing. It’s probably going to be, depending on who you talk to, between 1,000 and 3,000 tera operations per second (TOPS) of processing power. Today’s most advanced solutions for AI processing that one would find in a car is about 250 TOPS.”


Fig. 2: The role of artificial intelligence in self-driving vehicles. Source: Expedera



At that level, says Karazuba, the processor consumes about 75 watts of power, which creates significant scaling problems. “If you look at the scaling that’s required, those chips will have to scale by a matter of 4 to 12X in performance, which means the associated scaling of power. Seventy-five watts is about as hot as you can run a chip before you actively need to cool it, and automakers have signaled they have no desire to actively cool electronics in cars because it’s expensive and it adds weight.”


Power constraints
Even if it were possible to scale current solutions to 3,000 TOPS, he said, doing so would consume up to 1% of the battery on one of today’s electric cars. Karazuba said one workaround is putting multiple chips next to one another, but doing so creates both space and cost issues.


“With today’s silicon, you’re talking multiple hundreds of dollars — probably approaching $1,000 per chip — and you’d have four of those in a car, or maybe 12 of those in a car. You also can’t take a chip that’s already pretty darn big and grow it by 4X or 12X. It’s going to have to go to disaggregated processing, either moving processing to multiple places within the car or moving to something like a chiplet model.”


That’s only part of the cost. All of these new features and capabilities require power, and in electric vehicles that power comes from batteries. But battery technology is only improving at a rate of about 5% to 6% per year, while demands for processing more data faster are exploding, creating a gap that widens as more autonomy is added into vehicles.
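Karazuba's scaling concern is easy to reproduce. Assuming power scales roughly linearly with TOPS (his implicit assumption), today's 250 TOPS at 75 W extrapolates as follows:

```python
# Linear extrapolation of compute power draw, using Karazuba's figures:
# ~250 TOPS at ~75 W today, with L5 needing 1,000-3,000 TOPS (4x to 12x).
# Real scaling may be better or worse than linear; this is only a sketch.

def scaled_power_w(target_tops: float, base_tops: float = 250.0,
                   base_power_w: float = 75.0) -> float:
    """Naive linear power scaling from today's baseline."""
    return base_power_w * (target_tops / base_tops)

for tops in (1_000, 3_000):
    print(f"{tops} TOPS -> ~{scaled_power_w(tops):.0f} W "
          f"({tops / 250:.0f}x today's 250 TOPS)")
# 1,000 TOPS -> ~300 W; 3,000 TOPS -> ~900 W, far above the ~75 W
# passive-cooling ceiling he describes.
```

Either power scaling breaks linearity or the compute gets disaggregated, which is exactly the chiplet argument he goes on to make.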

 
Reactions: 24 users
“Light Detection and Ranging (LiDAR) sensors are used to detect objects and estimate distances. These sensors are key to the development of automated driving. Valeo was the first automotive player to market a laser LiDAR sensor (Valeo SCALA®), that meets the quality and safety criteria of the auto industry.

“This third-generation LiDAR from Valeo is a major technological step forward for autonomous vehicles. Valeo is currently the only player on the market mass producing an automotive LiDAR system. This advance strengthens Valeo’s technological and industrial leadership position in that market. The primary objective of this system is to save lives on the road,”

says Geoffrey Bouquot, Valeo R&D Director in charge of strategy.

This third-generation LiDAR technology sees what the human eye, cameras, and radar cannot see on the road. This sensor for self-driving cars adapts to all light conditions, even extreme situations, whether there might be too much light or none at all. The LiDAR sensor even evaluates the density of raindrops and calculates the braking distance accordingly.

The Valeo third-generation LiDAR sensor system has an unmatched capacity of perception, scanning the vehicle’s environment 25 times a second. As an innovative autonomous car sensor, it combines long-range detection on the road, covering more than 200 meters, with a wide field of vision.

Valeo’s automotive LiDAR scanner detects, identifies and categorizes all objects around the car. It measures the speed and direction of moving objects. It tracks other vehicles in the surrounding environment, even when they are no longer in the driver’s field of vision. The LiDAR’s algorithms enable it to anticipate the trajectories of moving objects and trigger the necessary safety manoeuvres.

All these features allow LiDAR-equipped self-driving cars to protect both the people in the car and those around it: pedestrians, cyclists and other road users. In addition to feeding information to the vehicle, this LiDAR sensor system will alert other vehicles to road hazards via the cloud so they can benefit from its exceptional perception capabilities.


AUTOMOTIVE LIDAR: THE ESSENTIAL SENSOR FOR AUTONOMOUS DRIVING

Just like we need our five complementary senses, autonomous cars need different types of sensors to drive in complete safety. Valeo offers the broadest range of automotive sensors in the world, and was even the first company to make an automotive LiDAR sensor for series-produced cars.

In 2021, the first two autonomous vehicles in the world authorized to drive at Level 3 autonomy were equipped with Valeo’s first- and second-generation LiDAR sensor systems. Valeo’s LiDAR system can also be used in driverless shuttles, robotaxis and delivery droids….


THE WORLD’S FIRST MASS-PRODUCED 3D LIDAR SYSTEM FOR SELF DRIVING CARS

The Valeo automotive LiDAR sensor is the first 3D laser scanner installed in series-produced vehicles that meets the stringent specifications of auto production. This autonomous vehicle sensor system is protected by 560 patents worldwide.

Since its launch in 2017, more than 150,000 units of Valeo’s laser LiDAR have been produced. Production of Valeo’s second-generation LiDAR technology started in 2021. The Honda Legend, which was the first vehicle in the world to be approved for SAE level 3 automated driving, uses Valeo LiDAR scanners, two frontal cameras and a Valeo data fusion controller. The Mercedes-Benz S-Class, the second level 3-certified car, is also equipped with a laser LiDAR technology, Valeo SCALA® Gen2.


Valeo’s third-generation laser LiDAR technology, which is scheduled to hit the market in 2024, will take autonomous driving even further, making it possible to delegate driving to the vehicle in many situations, including at speeds of up to 130 km/h on the highway. Even at high speeds on the highway, autonomous vehicles equipped with this system are able to manage emergency situations autonomously.

AUTOMOTIVE LIDAR TECHNOLOGY MANUFACTURING BASED IN GERMANY

Valeo designs and manufactures the entire system, from the hardware to the software and the associated artificial intelligence, the “brain” that collates all the information and enables the vehicle to make the right decision instantly. The software automatically adapts to the environment and improves its performance through progressive over-the-air updates.

Valeo’s laser LiDARs are produced at a plant in Wemding, in the German state of Bavaria. At that plant, the assembly of the components is measured down to the micron. The production lines utilize advanced expertise in optics, mechanics, and photonics (the area of physics that deals with the emission and reception of light particles, i.e. photons). Valeo has 300 engineers dedicated solely to this automotive LiDAR technology, for which the company has already filed over 500 patents.



SELF DRIVING CAR SENSORS BASED ON TRIPLE REDUNDANCY FOR SAFER MOBILITY

The automotive industry uses the triple redundancy system to guarantee the safety of using autonomous cars. Every item of information received by a sensor must be confirmed by two other sensors of different types. Valeo offers the broadest range of automotive sensors on the market.

MOBILITY KIT: NEW FORMS OF MOBILITY

In addition to automotive applications, Valeo also offers its Mobility Kits, which combine our capacity to innovate and our technological excellence in a complete range of high-performance plug-and-play systems. These kits, which include the Valeo SCALA® LiDAR and provide developers with turnkey solutions, offer solutions to players in new forms of mobility that meet the demands of the automotive industry.”

“FROM CHAPTER 4: BRAINCHIP REVOLUTIONIZING AI INFERENCING AT THE EDGE. – Q & A WITH PETER VAN DER MADE.

Q:
BrainChip customers are already deploying smarter endpoints with independent learning and inference capabilities, faster response times, and a lower power budget. Can you give us some real-world examples of how AKIDA is revolutionizing AI inference at the edge?

Peter: Automotive is a good place to start. Using AKIDA-powered sensors and accelerators, automotive companies can now design lighter, faster, and more energy efficient in-cabin systems that enable advanced driver verification and customization, sophisticated voice control technology, and next-level gaze estimation and emotion classification capabilities.

In addition to redefining the automotive in-cabin experience, AKIDA is helping enable new computer vision and LiDAR systems to detect vehicles, pedestrians, bicyclists, street signs, and objects with incredibly high levels of precision. We’re looking forward to seeing how these fast and energy efficient ADAS systems help automotive companies accelerate the rollout of increasingly advanced assisted driving capabilities. In the future, we’d also like to power self-driving cars and trucks. But we don’t want to program these vehicles. To achieve true autonomy, cars and trucks must independently learn how to drive in different environmental and topographical conditions such as icy mountain roads with limited visibility, crowded streets, and fast-moving highways.

With AKIDA, these driving skills can be easily copied and adaptable by millions of self-driving vehicles. This is a particularly important point. AKIDA driving updates will be based on real-world knowledge and skills other cars have learned on the road. We want to avoid boilerplate firmware updates pushed out by engineering teams sitting in cubes. A programmed car lacking advanced learning and inferential capabilities can’t anticipate, understand, or react to new driving scenarios. That’s why BrainChip’s AKIDA focuses on efficiently learning, inferring, and adapting new skills.”

MY COMMENTS AND OPINIONS:

Any shareholder who has done even the most rudimentary research will be aware that CNN/SNN offers the most efficient vision processing system, from a compute perspective, presently available. AKIDA's design, incorporating CNN with SNN, allows BrainChip to offer processing of all five human senses, covering vision as well as LiDAR, radar and ultrasound, which, as we know, had Rob Telson extolling AKIDA's capacity to offer sensor fusion, a feature Edge Impulse has also mentioned.

Bearing this in mind, consider the above two extracts and the parts which I have emboldened, and ask yourself: is it just a coincidence that the Valeo SCALA Kit can offer the triple redundancy feature required by the automotive industry, and that Peter van der Made states that “AKIDA is helping enable new computer vision and LiDAR systems to detect vehicles, pedestrians, bicyclists, street signs, and objects with incredibly high levels of precision.” Then consider that Valeo’s SCALA system improves its performance through over-the-air updates, and that Peter van der Made states that “AKIDA driving updates will be based on real-world knowledge and skills other cars have learned on the road.”

Then consider the fact that AKIDA has the processing power of a GPU on the ImageNet dataset at a power budget of 300 milliwatts compared with the GPU’s 300 watts, that AKIDA can efficiently perform up to trillions of operations per second within a minimal power envelope and is already designed to perform sensor fusion, and that Valeo has been an EAP customer since mid-2020 and is listed as an Early Adopter of AKIDA technology. It does seem that even Blind Freddie will be comfortably waiting for the grand release of the Valeo SCALA platform for autonomous Level 3 driving powered by AKIDA technology in 2024 across at least the Mercedes-Benz, Honda Legend and Stellantis vehicle ranges.
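The power figures quoted in that comparison imply a simple ratio. These are this post's round numbers, not independent benchmarks:

```python
# Efficiency ratio implied by the post's round numbers: a GPU at ~300 W
# versus AKIDA at ~300 mW for comparable ImageNet workloads.

gpu_power_w = 300.0
akida_power_w = 0.3  # 300 milliwatts

ratio = gpu_power_w / akida_power_w
print(f"Implied power advantage: {ratio:.0f}x")  # 1000x
```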

Now this is my anonymous opinion only so DYOR and come back here with your contrary argument so that I can avoid Blind Freddie being disappointed.

FF

AKIDA BALLISTA
 
Reactions: 69 users

I am afraid, @Bravo, that despite his claims to have knowledge of every processor on the market, the following statements just show he is totally oblivious to AKIDA. If he asked Peter van der Made, he would tell him, as he told shareholders at the 2019 AGM, that 100 AKD1000 chips could provide all the computing power necessary for full autonomy, which back then, when the cost of AKIDA was expected to be about US$10.00, would have been $1,000 to do it all. AKIDA should not be spoken about in the same sentence as TOPS:

"At that level, says Karazuba, the processor consumes about 75 watts of power, which creates significant scaling problems. “If you look at the scaling that’s required, those chips will have to scale by a matter of 4 to 12X in performance, which means the associated scaling of power. Seventy-five watts is about as hot as you can run a chip before you actively need to cool it, and automakers have signaled they have no desire to actively cool electronics in cars because it’s expensive and it adds weight.”


Power constraints
Even if it were possible to scale current solutions to 3,000 TOPS, he said, doing so would consume up to 1% of the battery on one of today’s electric cars. Karazuba said one workaround is putting multiple chips next to one another, but doing so creates both space and cost issues.


“With today’s silicon, you’re talking multiple hundreds of dollars — probably approaching $1,000 per chip — and you’d have four of those in a car, or maybe 12 of those in a car. You also can’t take a chip that’s already pretty darn big and grow it by 4X or 12X. It’s going to have to go to disaggregated processing, either moving processing to multiple places within the car or moving to something like a chiplet model.”
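The cost comparison drawn from the 2019 AGM figure can be laid out explicitly. All prices here are the posts' figures, not current quotes:

```python
# Cost comparison implied by the posts: Peter van der Made's 2019 AGM
# figure of 100 AKD1000 chips for full autonomy at ~US$10 each, versus
# Karazuba's ~US$1,000 per conventional chip with 4 to 12 chips needed.

akida_total = 100 * 10          # whole AKIDA array: $1,000
conventional_low = 4 * 1_000    # $4,000
conventional_high = 12 * 1_000  # $12,000

print(f"AKIDA array:        ${akida_total:,}")
print(f"Conventional chips: ${conventional_low:,} to ${conventional_high:,}")
```

On those numbers the whole AKIDA array costs about as much as one of Karazuba's conventional chips, before power is even considered.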


My opinion and that of Peter van der Made only so DYOR
FF

AKIDA BALLISTA
 
Reactions: 26 users

Diogenese

Top 20
“Light Detection and Ranging (LiDAR) sensors are used to detect objects and estimate distances. These sensors are key to the development of automated driving. Valeo was the first automotive player to market a laser LiDAR sensor (Valeo SCALA®), that meets the quality and safety criteria of the auto industry.

“This third- generation LiDAR from Valeo is a major technological step forward for autonomous vehicles. Valeo is currently the only player on the market mass producing an automotive LiDAR system. This advance strengthens Valeo’s technological and industrial leadership position in that market. The primary objective of this system is to save lives on the road,”

says Geoffrey Bouquot, Valeo R&D Director in charge of strategy.

This third-generation LiDAR technology sees what the human eye, cameras, and radar cannot see on the road. This sensor for self-driving cars adapts to all light conditions, even extreme situations, whether there might be too much light or none at all. The LiDAR sensor even evaluates the density of raindrops and calculates the braking distance accordingly.

The Valeo third-generation LiDAR sensor system has an unmatched capacity of perception, scanning the vehicle’s environment 25 times a second. As an innovative autonomous car sensor, it combines long-range detection on the road, covering more than 200 meters, with a wide field of vision.

Valeo’s automotive LiDAR scanner detects, identifies and categorizes all objects around the car. It measures the speed and direction of moving objects. It tracks other vehicles in the surrounding environment, even when they are no longer in the driver’s field of vision. The LiDAR’s algorithms enable it to anticipate the trajectories of moving objects and trigger the necessary safety manoeuvres.

All these features allow LiDAR-equipped self-driving cars to protect both the people in the car and those around it: pedestrians, cyclists and other road users. In addition to feeding information to the vehicle, this LiDAR sensor system will alert other vehicles to road hazards via the cloud so they can benefit from its exceptional perception capabilities.


AUTOMOTIVE LIDAR: THE ESSENTIAL SENSOR FOR AUTONOMOUS DRIVING

Just like we need our five complementary senses, autonomous cars need different types of sensors to drive in complete safety. Valeo offers the broadest range of automotive sensors in the world, and was even the first company to make an automotive LiDAR sensor for series-produced cars.

In 2021, the first two autonomous vehicles in the world authorized to drive at Level 3 autonomy were equipped with Valeo’s first- and second-generation LiDAR sensor systems. Valeo’s LiDAR system can also be used in driverless shuttles, robotaxis and delivery droids….


THE WORLD’S FIRST MASS-PRODUCED 3D LIDAR SYSTEM FOR SELF DRIVING CARS

The Valeo automotive LiDAR sensor is the first 3D laser scanner installed in series-produced vehicles that meets the stringent specifications of auto production. This autonomous vehicle sensor system is protected by 560 patents worldwide.

Since its launch in 2017, more than 150,000 units of Valeo’s laser LiDAR have been produced. Production of Valeo’s second-generation LiDAR technology started in 2021. The Honda Legend, which was the first vehicle in the world to be approved for SAE level 3 automated driving, uses Valeo LiDAR scanners, two frontal cameras and a Valeo data fusion controller. The Mercedes-Benz Class S, the second level 3-certified car, is also equipped with a laser LiDAR technology, Valeo SCALA® Gen2.


Valeo’s third-generation laser LiDAR technology, which is scheduled to hit the market in 2024, will take autonomous driving even further, making it possible to delegate driving to the vehicle in many situations, including at speeds of up to 130 km/h on the highway. Even at high speeds on the highway, autonomous vehicles equipped with this system are able to manage emergency situation autonomously.

AUTOMOTIVE LIDAR TECHNOLOGY MANUFACTURING BASED IN GERMANY

Valeo designs and manufactures the entire system, from the hardware to the software and the associated artificial intelligence, the “brain” that collates all the information and enables the vehicle to make the right decision instantly. The software automatically adapts to the environment and improves its performance through progressive over-the-air updates.

Valeo’s laser LiDARs are produced at a plant in Wemding, in the German state of Bavaria. At that plant, the assembly of the components is measured down to the micron. The production lines utilize advanced expertise in optics, mechanics, and photonics (the area of physics that deals with the emission and reception of light particles, i.e. photons). Valeo has 300 engineers dedicated solely to this automotive LiDAR technology, for which the company has already filed over 500 patents.



SELF DRIVING CAR SENSORS BASED ON TRIPLE REDUNDANCY FOR SAFER MOBILITY

The automotive industry uses the triple redundancy system to guarantee the safety of using autonomous cars. Every item of information received by a sensor must be confirmed by two other sensors of different types. Valeo offers the broadest range of automotive sensors on the market
.

MOBILITY KIT: NEW FORMS OF MOBILITY

In addition to automotive applications, Valeo also offers its Mobility Kits, which combine our capacity to innovate and our technological excellence in a complete range of high-performance plug-and-play systems. These kits, which include the Valeo SCALA® LiDAR and provide developers with turnkey solutions, offer solutions to players in new forms of mobility that meet the demands of the automotive industry.”

“FROM CHAPTER 4: BRAINCHIP REVOLUTIONIZING AI INFERENCING AT THE EDGE. – Q & A WITH PETER VAN DER MADE.

Q:
BrainChip customers are already deploying smarter endpoints with independent learning and inference capabilities, faster response times, and a lower power budget. Can you give us some real-world examples of how AKIDA is revolutionizing AI inference at the edge?

Peter: Automotive is a good place to start. Using AKIDA-powered sensors and accelerators, automotive companies can now design lighter, faster, and more energy efficient in-cabin systems that enable advanced driver verification and customization, sophisticated voice control technology, and next-level gaze estimation and emotion classification capabilities.

In addition to redefining the automotive in-cabin experience, AKIDA is helping enable new computer vision and LiDAR systems to detect vehicles, pedestrians, bicyclists, street signs, and objects with incredibly high levels of precision. We’re looking forward to seeing how these fast and energy efficient ADAS systems help automotive companies accelerate the rollout of increasingly advanced assisted driving capabilities. In the future, we’d also like to power self-driving cars and trucks. But we don’t want to program these vehicles. To achieve true autonomy, cars and trucks must independently learn how to drive in different environmental and topographical conditions such as icy mountain roads with limited visibility, crowded streets, and fast-moving highways. Enabling advanced LiDAR with AKIDA 9 Neuromorphic AI inference at the edge

With AKIDA, these driving skills can be easily copied and adaptable by millions of self-driving vehicles. This is a particularly important point. AKIDA driving updates will be based on real-world knowledge and skills other cars have learned on the road. We want to avoid boilerplate firmware updates pushed out by engineering teams sitting in cubes. A programmed car lacking advanced learning and inferential capabilities can’t anticipate, understand, or react to new driving scenarios. That’s why BrainChip’s AKIDA focuses on efficiently learning, inferring, and adapting new skills.”

MY COMMENTS AND OPINIONS:

Any shareholder who has done even the most rudimentary research will be aware that CNN/SNN offers the most efficient vision processing system from a compute perspective presently available. The design of AKIDA in incorporating CNN with SNN allows Brainchip to offer processing of all five human senses which includes vision as well as LiDAR, Radar and Ultrasound which as we know had Rob Telson extolling the AKIDA capacity to offer Sensor Fusion a feature which Edge Impulse have also mentioned.

Bearing this in mind consider the above two extracts and the parts which I have emboldened and ask yourself is it just a coincidence that the Valeo Scala Kit can offer the automotive industry required feature of triple redundancy system and that Peter van der Made states thatAKIDA is helping enable new computer vision and LiDAR systems to detect vehicles, pedestrians, bicyclists, street signs, and objects with incredibly high levels of precision.” Then consider that Valeo’s Scala system improves its performance through over-the-air updates and Peter van der Made states AKIDA driving updates will be based on real-world knowledge and skills other cars have learned on the road.”

Then consider the fact that AKIDA has the processing power of a GPU on the ImageNet dataset at a power budget of 300 milliwatts compared with the GPU’s 300 watts and AKIDA can efficiently perform up to trillions of operations per second within a minimal power envelope and is already designed to perform sensor fusion and Valeo has been an EAP Customer since mid 2020 and is listed as an Early Adopter of AKIDA technology then it does seem that even Blind Freddie will be comfortably waiting for the grand release of the Valeo Scala platform for autonomous Level 3 driving powered by AKIDA technology in 2024 across the Mercedes Benz, Honda Legend and Stellantis automotive vehicle range at least.

Now, this is my anonymous opinion only, so DYOR, and come back here with your contrary argument so that I can avoid Blind Freddie being disappointed.

FF

AKIDA BALLISTA
Hi FF,

Scala:
"The software automatically adapts to the environment and improves its performance through progressive over-the-air updates."
Akida:
"AKIDA driving updates will be based on real-world knowledge and skills other cars have learned on the road."

There are two parameters of the Akida chip which are "programmable" - the image model library and the node configuration, so I'm assuming that the updates will be in the form of image model library updates. The configuration would be modified rarely.
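A minimal Python sketch of that split, purely illustrative: the class and function names are my own assumptions, not BrainChip’s API. Routine over-the-air payloads would swap in a new image model library, while node-configuration changes take a separate, rarely used path.

```python
from dataclasses import dataclass

@dataclass
class AkidaState:
    # Hypothetical view of the two "programmable" parameters described above.
    model_library_version: int = 1   # updated often, over the air
    node_config_version: int = 1     # reconfigured rarely

def apply_ota_update(state: AkidaState, update: dict) -> AkidaState:
    # Routine OTA payloads carry a new image model library learned fleet-wide.
    if "model_library" in update:
        state.model_library_version = update["model_library"]
    # Node-configuration changes are exceptional and separately gated.
    if "node_config" in update and update.get("config_change_approved"):
        state.node_config_version = update["node_config"]
    return state

state = AkidaState()
state = apply_ota_update(state, {"model_library": 2})   # typical fleet update
state = apply_ota_update(state, {"node_config": 2})     # dropped: not approved
print(state)  # AkidaState(model_library_version=2, node_config_version=1)
```

The point of the gate is Diogenese’s observation: model-library updates are routine, while configuration would be modified rarely.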
 
  • Like
  • Fire
  • Love
Reactions: 26 users

uiux

Regular
In this extract, Paul Karazuba, vice president of marketing at Expedera, discusses power consumption as the major issue to resolve in reaching L5 full autonomy. He reckons a vehicle will require between 1,000 and 3,000 TOPS to reach Level 5 full autonomy.

We all know that reducing power consumption is right up Akida’s alley. I wonder how many AKIDAs would be required to achieve 1,000 TOPS, hence solving Paul’s problem?



EXTRACT

Putting all of these new pieces together takes time, however. Paul Karazuba, vice president of marketing at Expedera, believes much needs to be done to get to full autonomy, starting with much faster and more energy-efficient processing of an enormous amount of streaming data. “Simply from a hardware perspective, the platforms that would be required to deploy L5 in a car don’t exist, and nor do today’s solutions realistically scale to meet L5,” he said. “L5 is going to require a certain amount of AI processing. It’s probably going to be, depending on who you talk to, between 1,000 and 3,000 tera operations per second (TOPS) of processing power. Today’s most advanced solutions for AI processing that one would find in a car is about 250 TOPS.”


Fig. 2: The role of artificial intelligence in self-driving vehicles. Source: Expedera


At that level, says Karazuba, the processor consumes about 75 watts of power, which creates significant scaling problems. “If you look at the scaling that’s required, those chips will have to scale by a matter of 4 to 12X in performance, which means the associated scaling of power. Seventy-five watts is about as hot as you can run a chip before you actively need to cool it, and automakers have signaled they have no desire to actively cool electronics in cars because it’s expensive and it adds weight.”


Power constraints
Even if it were possible to scale current solutions to 3,000 TOPS, he said, doing so would consume up to 1% of the battery on one of today’s electric cars. Karazuba said one workaround is putting multiple chips next to one another, but doing so creates both space and cost issues.


“With today’s silicon, you’re talking multiple hundreds of dollars — probably approaching $1,000 per chip — and you’d have four of those in a car, or maybe 12 of those in a car. You also can’t take a chip that’s already pretty darn big and grow it by 4X or 12X. It’s going to have to go to disaggregated processing, either moving processing to multiple places within the car or moving to something like a chiplet model.”


That’s only part of the cost. All of these new features and capabilities require power, and in electric vehicles that power comes from batteries. But battery technology is only improving at a rate of about 5% to 6% per year, while demands for processing more data faster are exploding, creating a gap that widens as more autonomy is added into vehicles.
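Karazuba’s figures above can be sanity-checked with back-of-envelope arithmetic. A rough sketch only; it assumes, as he implies, that power scales linearly with TOPS from today’s 250 TOPS at 75 W.

```python
current_tops = 250   # today's most advanced in-car AI processing
current_watts = 75   # power draw at that level, per Karazuba

for target_tops in (1000, 3000):
    scale = target_tops / current_tops
    watts = current_watts * scale  # linear-scaling assumption
    print(f"{target_tops} TOPS needs {scale:.0f}x scaling, "
          f"~{watts:.0f} W at today's efficiency")
```

The 4x to 12x range matches his quote, and the implied 300 W to 900 W is far beyond the roughly 75 W ceiling he gives for running a chip without active cooling.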



@Bravo


Akida’s NPUs each include eight neural processing engines (NPEs), which run simultaneously. The NPEs perform event-based convolutions, handling 1x1, 3x3, 5x5, and 7x7 kernels. Each can perform the equivalent of four simultaneous multiply-accumulate (MAC) operations, but since they run asynchronously on the basis of sparse spiking events, a comparison with conventional clocked MAC units is inappropriate. Nevertheless, adding up Akida’s 640 NPEs at the maximum 300 MHz clock frequency totals 1.5 trillion operations per second (TOPS), although it will use much less in real-world applications.
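The 1.5 TOPS figure can be reproduced from the numbers in that paragraph, and it also gives a naive answer to uiux’s earlier question of how many devices 1,000 TOPS would take. Illustrative arithmetic only; it ignores sparsity, utilisation, and interconnect, which is exactly why raw TOPS comparisons with Akida are described as inappropriate.

```python
import math

npes = 640          # neural processing engines across the device
macs_per_npe = 4    # equivalent simultaneous MACs each
ops_per_mac = 2     # one multiply plus one accumulate
clock_hz = 300e6    # maximum clock frequency

tops = npes * macs_per_npe * ops_per_mac * clock_hz / 1e12
print(f"peak throughput: {tops:.2f} TOPS")  # peak throughput: 1.54 TOPS

# Naive device count for the 1,000 TOPS lower bound mentioned by Karazuba.
chips = math.ceil(1000 / tops)
print(f"devices for 1,000 TOPS: {chips}")   # devices for 1,000 TOPS: 652
```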
 
  • Like
  • Love
  • Wow
Reactions: 36 users

Slade

Top 20
“Light Detection and Ranging (LiDAR) sensors are used to detect objects and estimate distances. These sensors are key to the development of automated driving. Valeo was the first automotive player to market a laser LiDAR sensor (Valeo SCALA®) that meets the quality and safety criteria of the auto industry.

“This third-generation LiDAR from Valeo is a major technological step forward for autonomous vehicles. Valeo is currently the only player on the market mass producing an automotive LiDAR system. This advance strengthens Valeo’s technological and industrial leadership position in that market. The primary objective of this system is to save lives on the road,”

says Geoffrey Bouquot, Valeo R&D Director in charge of strategy.

This third-generation LiDAR technology sees what the human eye, cameras, and radar cannot see on the road. This sensor for self-driving cars adapts to all light conditions, even extreme situations, whether there might be too much light or none at all. The LiDAR sensor even evaluates the density of raindrops and calculates the braking distance accordingly.

The Valeo third-generation LiDAR sensor system has an unmatched capacity of perception, scanning the vehicle’s environment 25 times a second. As an innovative autonomous car sensor, it combines long-range detection on the road, covering more than 200 meters, with a wide field of vision.

Valeo’s automotive LiDAR scanner detects, identifies and categorizes all objects around the car. It measures the speed and direction of moving objects. It tracks other vehicles in the surrounding environment, even when they are no longer in the driver’s field of vision. The LiDAR’s algorithms enable it to anticipate the trajectories of moving objects and trigger the necessary safety manoeuvres.

All these features allow LiDAR-equipped self-driving cars to protect both the people in the car and those around it: pedestrians, cyclists and other road users. In addition to feeding information to the vehicle, this LiDAR sensor system will alert other vehicles to road hazards via the cloud so they can benefit from its exceptional perception capabilities.


AUTOMOTIVE LIDAR: THE ESSENTIAL SENSOR FOR AUTONOMOUS DRIVING

Just like we need our five complementary senses, autonomous cars need different types of sensors to drive in complete safety. Valeo offers the broadest range of automotive sensors in the world, and was even the first company to make an automotive LiDAR sensor for series-produced cars.

In 2021, the first two autonomous vehicles in the world authorized to drive at Level 3 autonomy were equipped with Valeo’s first- and second-generation LiDAR sensor systems. Valeo’s LiDAR system can also be used in driverless shuttles, robotaxis and delivery droids….


THE WORLD’S FIRST MASS-PRODUCED 3D LIDAR SYSTEM FOR SELF DRIVING CARS

The Valeo automotive LiDAR sensor is the first 3D laser scanner installed in series-produced vehicles that meets the stringent specifications of auto production. This autonomous vehicle sensor system is protected by 560 patents worldwide.

Since its launch in 2017, more than 150,000 units of Valeo’s laser LiDAR have been produced. Production of Valeo’s second-generation LiDAR technology started in 2021. The Honda Legend, which was the first vehicle in the world to be approved for SAE level 3 automated driving, uses Valeo LiDAR scanners, two frontal cameras and a Valeo data fusion controller. The Mercedes-Benz Class S, the second level 3-certified car, is also equipped with a laser LiDAR technology, Valeo SCALA® Gen2.


Valeo’s third-generation laser LiDAR technology, which is scheduled to hit the market in 2024, will take autonomous driving even further, making it possible to delegate driving to the vehicle in many situations, including at speeds of up to 130 km/h on the highway. Even at high speeds on the highway, autonomous vehicles equipped with this system are able to manage emergency situations autonomously.

AUTOMOTIVE LIDAR TECHNOLOGY MANUFACTURING BASED IN GERMANY

Valeo designs and manufactures the entire system, from the hardware to the software and the associated artificial intelligence, the “brain” that collates all the information and enables the vehicle to make the right decision instantly. The software automatically adapts to the environment and improves its performance through progressive over-the-air updates.

Valeo’s laser LiDARs are produced at a plant in Wemding, in the German state of Bavaria. At that plant, the assembly of the components is measured down to the micron. The production lines utilize advanced expertise in optics, mechanics, and photonics (the area of physics that deals with the emission and reception of light particles, i.e. photons). Valeo has 300 engineers dedicated solely to this automotive LiDAR technology, for which the company has already filed over 500 patents.



SELF DRIVING CAR SENSORS BASED ON TRIPLE REDUNDANCY FOR SAFER MOBILITY

The automotive industry uses the triple redundancy system to guarantee the safety of using autonomous cars. Every item of information received by a sensor must be confirmed by two other sensors of different types. Valeo offers the broadest range of automotive sensors on the market.
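The triple-redundancy rule reads naturally as a cross-check in code: a detection only counts once sensors of two other types agree. A minimal illustrative sketch; the sensor names and the boolean agreement test are my own simplifications, not Valeo’s implementation.

```python
def confirmed(detections: dict, reporting_sensor: str) -> bool:
    """Triple redundancy: a detection reported by one sensor must be
    confirmed by at least two other sensors of different types."""
    others = [s for s, seen in detections.items()
              if seen and s != reporting_sensor]
    return bool(detections.get(reporting_sensor)) and len(others) >= 2

frame = {"lidar": True, "camera": True, "radar": True, "ultrasound": False}
print(confirmed(frame, "lidar"))   # True: camera and radar both agree

frame["radar"] = False
print(confirmed(frame, "lidar"))   # False: only the camera agrees
```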

MOBILITY KIT: NEW FORMS OF MOBILITY

In addition to automotive applications, Valeo also offers its Mobility Kits, which combine our capacity to innovate and our technological excellence in a complete range of high-performance plug-and-play systems. These kits, which include the Valeo SCALA® LiDAR and provide developers with turnkey solutions, offer solutions to players in new forms of mobility that meet the demands of the automotive industry.”

“FROM CHAPTER 4: BRAINCHIP REVOLUTIONIZING AI INFERENCING AT THE EDGE. – Q & A WITH PETER VAN DER MADE.

Q:
BrainChip customers are already deploying smarter endpoints with independent learning and inference capabilities, faster response times, and a lower power budget. Can you give us some real-world examples of how AKIDA is revolutionizing AI inference at the edge?

Peter: Automotive is a good place to start. Using AKIDA-powered sensors and accelerators, automotive companies can now design lighter, faster, and more energy efficient in-cabin systems that enable advanced driver verification and customization, sophisticated voice control technology, and next-level gaze estimation and emotion classification capabilities.

In addition to redefining the automotive in-cabin experience, AKIDA is helping enable new computer vision and LiDAR systems to detect vehicles, pedestrians, bicyclists, street signs, and objects with incredibly high levels of precision. We’re looking forward to seeing how these fast and energy efficient ADAS systems help automotive companies accelerate the rollout of increasingly advanced assisted driving capabilities. In the future, we’d also like to power self-driving cars and trucks. But we don’t want to program these vehicles. To achieve true autonomy, cars and trucks must independently learn how to drive in different environmental and topographical conditions such as icy mountain roads with limited visibility, crowded streets, and fast-moving highways.

With AKIDA, these driving skills can be easily copied and adaptable by millions of self-driving vehicles. This is a particularly important point. AKIDA driving updates will be based on real-world knowledge and skills other cars have learned on the road. We want to avoid boilerplate firmware updates pushed out by engineering teams sitting in cubes. A programmed car lacking advanced learning and inferential capabilities can’t anticipate, understand, or react to new driving scenarios. That’s why BrainChip’s AKIDA focuses on efficiently learning, inferring, and adapting new skills.”

And on the front page of BrainChip’s website it states that we are trusted by Valeo. Wouldn’t be much to trust if they weren’t a big customer.
 
  • Like
  • Love
  • Fire
Reactions: 35 users

Proga

Regular
Valeo’s third-generation laser LiDAR technology is scheduled to hit the market in 2024, which means it will be in the 2025 models that start production in 2024.

Question is, which models? Will it only run on the full MB.OS, which at first would limit it to the EQA and EQC?

Hopefully Stellantis will roll it straight out in every model.
 
  • Like
Reactions: 9 users

Diogenese

Top 20
Valeo’s third-generation laser LiDAR technology is scheduled to hit the market in 2024, which means it will be in the 2025 models that start production in 2024.

Question is, which models? Will it only run on the full MB.OS, which at first would limit it to the EQA and EQC?

Hopefully Stellantis will roll it straight out in every model.
Sounds like Achilles and the tortoise.
 
  • Like
  • Haha
  • Love
Reactions: 7 users

Proga

Regular
  • Like
Reactions: 1 users

Makeme 2020

Regular
Off topic.
Chris Dawson has been found guilty of killing his wife 40 years ago; the law finally caught up with him.
As they say, crime doesn't pay...
 
  • Like
  • Love
Reactions: 11 users

mrgds

Regular
Off topic.
Chris Dawson has been found guilty of killing his wife 40 years ago; the law finally caught up with him.
As they say, crime doesn't pay...
I used to frequent the "Timmo" (Time & Tide Hotel) back in the day when the Dawsons started hiring the "nanny".
Goes to show you never know what's happening around you, let alone right under your nose.
Bit like holding BRN stock, I guess... lol

AKIDA BALLISTA
 
Last edited:
  • Like
  • Love
  • Haha
Reactions: 10 users
Hi FF,

Scala:
"The software automatically adapts to the environment and improves its performance through progressive over-the-air updates."
Akida:
"AKIDA driving updates will be based on real-world knowledge and skills other cars have learned on the road."

There are two parameters of the Akida chip which are "programmable" - the image model library and the node configuration, so I'm assuming that the updates will be in the form of image model library updates. The configuration would be modified rarely.
That’s what I like: my broad-brush, paint-by-numbers statements being filled in with solid engineering oil-based pigments.

Many thanks @Diogenese.

FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 15 users

Slade

Top 20
Are we expecting an announcement tomorrow regarding our new patent? Should be a good day if we get it.
 
  • Like
  • Fire
  • Thinking
Reactions: 20 users
Off topic.
Chris Dawson has been found guilty of killing his wife 40 years ago; the law finally caught up with him.
As they say, crime doesn't pay...
Unless you skulk about in the share market regulated by the ASX.
😞😞😞
 
  • Like
  • Fire
  • Love
Reactions: 5 users
I tried to make the point on the weekend that BrainChip's house has many rooms prepared, and in one of those already occupied rooms is Renesas. In another is ARM. (Important to remember as you read the article below.) Renesas is one of those unromantic companies that most Australians, and even Americans, would not recognise by name, even though they have a huge reach around the world and are the worldwide number one supplier of MCUs. As part of my effort to evangelise to all new shareholders, and even the not so new: BrainChip, a bit like Renesas, has (according to the author of the following document) quietly and effectively been building a world-class ecosystem of partners and customers, just as Renesas has over many decades. Take the time to read the following and you will be impressed with BrainChip's involvement with Renesas:


Renesas Rising Beyond Automotive Into IoT And Infrastructure

Patrick Moorhead
Senior Contributor
I write about disruptive companies, technologies and usage models.
Feb 25, 2022, 02:20pm EST


In the world of tech, success takes many forms. Some "it" companies blow up overnight. A lucky few of them manage to parlay the hype into long-term success, eventually finding their sea legs. I've got a particular appreciation for the slower-burn companies, the unsung heroes who toil behind the scenes, amassing impressive portfolios of IP over the years that you've probably used before without realizing.

One such company, in my mind, is Renesas Electronics Corporation, headquartered in Tokyo and founded in 2002 (with operations beginning in 2010). Renesas is the #1 MCU supplier in the world and has been well known for its automotive capabilities over many decades.


Now, under Representative Director, President and CEO Shibata-san’s leadership, and with the mission of “Making People’s Lives Easier,” the company has expanded its scope into the IoT, industrial and infrastructure markets. I think the unsung piece of the Renesas story is the IIBU (IoT and Infrastructure Business Unit). When you combine next-generation cars with the booming industrial IoT space and with smart cities and infrastructure, intelligent endpoints have never been a hotter commodity than at the present moment. And quite the investment opportunity.

I’d like to spend some time talking about the IIBU.

Technology trends driving growth opportunities

Last month I listened in on a talk delivered by Sailesh Chittipeddi, Executive VP and GM of Renesas' IoT and Infrastructure Business Unit, concerning the company's growth in IIoT. Chittipeddi led off by citing a few notable trends aiding growth: migration to the smart factory, a growing emphasis on sustainability, a shift from ASICs towards more standard products and the popularization of brushless DC motors.
IIOT and Industrial Growth A Snapshot 2020 - 2021
RENESAS
Above, you can see Renesas's projected growth in IoT (30% CAGR), Infrastructure (18% CAGR) and Industrial (15% CAGR) segments for the 2020-2021 period. Chittipeddi warned against attributing Renesas's IoT and Industrial growth to a "rising tides raise all ships" scenario. He pointed out that Renesas doesn't operate as a broad-based point product provider, unlike some competitors. Instead, Renesas targets several specific market areas with its industrial automation and motor control offerings.

In the realm of infrastructure, Renesas's growth is coming predominately from datacenter-focused infrastructure power, timing and memory interfaces. In IoT, the company's core microcontroller business is doing the heavy lifting. In the industrial sector, the ASIC and MPU businesses are driving growth. Overall, Renesas's Industrial Automation segment grew approximately 20% from 2020 to 2021, while its Motor Control segment grew by 30% over the same period. As this growth likely connects to the broader digital transformation and modernization trend across industries, I don't expect to see it slow down any time soon. The widespread adoption of 5G, Wi-Fi 6 and 6e connectivity standards should only accelerate these trends further.


Growth and profit in the industrial sector
Looking at the financials for Renesas's industrial segment since 2019 (see slide below), we can see several positive trends worth noting. Revenue in the industrial segment has grown at a 23% CAGR, gross profits have grown at a 31% CAGR and the segment's operating income has shot through the roof at a CAGR of 62%.
IIOT and Industrial Financial Trends At a Glance
RENESAS
Renesas already has an impressive global customer base in industrial automation. This includes companies such as Rockwell Automation and Emerson in the Americas, Siemens, Hilscher and Schneider Electric in Europe, Inovance and Delta in the Asia-Pacific region and heavy hitters Mitsubishi Electric, Fanuc and Yaskawa in Japan.
With these and many other customers under its belt, Renesas touts itself as a leading semiconductor supplier for "Industry 4.0." In this vision for the future, highly integrated smart factories dominate the industrial landscape, promising to significantly enhance productivity, efficiency, and safety by implementing advanced automation, control, monitoring, and analysis capabilities.
[Slide: Industrial Automation Pyramid (Renesas)]
Industrial automation
Renesas illustrates its place in the world of industrial automation with a helpful pyramid graphic (see above). At the bottom, the Access Layer consists of the various technologies that operate on the factory floor. Here, intelligent sensors and actuators connected through a local field network gather data and send it up the stack for control, monitoring and analysis. The second layer, the Control Layer, includes the system or industrial network and its processes: automation, monitoring and control. Lastly, the Analysis Layer at the top of the pyramid is where most computing and data analysis occurs. Current technological limitations relegate most of this work to the cloud.
As you can see, Renesas's portfolio of digital, analog, timing and standard products is well represented across the industrial automation pyramid. Renesas offers MPUs, MCUs, PMICs, ASICs and timing solutions for the Control Layer. In the Access Layer, Renesas provides Smart SSCs, MCUs, sensors for air quality, humidity, position and flow, and interface solutions such as ASICs with IO-Link and AS-i.
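The layer-to-product mapping described above can be written down as a simple structure. The layer roles and product names come from the article itself, but the mapping as code is only an illustration:

```python
# Illustrative model of the industrial automation pyramid, top layer
# first. Product names are the ones the article lists per layer.
PYRAMID = [
    ("Analysis Layer", "computing and data analysis, mostly in the cloud today",
     []),
    ("Control Layer", "automation, monitoring and control over the industrial network",
     ["MPUs", "MCUs", "PMICs", "ASICs", "timing solutions"]),
    ("Access Layer", "sensors and actuators on the factory floor, linked by a field network",
     ["Smart SSCs", "MCUs", "air/humidity/position/flow sensors",
      "IO-Link and AS-i interface ASICs"]),
]

for name, role, products in PYRAMID:
    suffix = f" ({', '.join(products)})" if products else ""
    print(f"{name}: {role}{suffix}")
```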
The company's comprehensive portfolio, in my opinion, makes it uniquely positioned to deliver the levels of integration necessary to make "Industry 4.0" a reality. The company's preexisting relationships with big-name customers should also help with integration. Lastly, it's worth mentioning several recent Renesas acquisitions that have augmented the company's value proposition even further.
Dialog Semiconductor, purchased last August, gave Renesas crucial battery management capabilities, Bluetooth Low Energy, Wi-Fi, flash memory and configurable mixed-signal custom integrated circuits (ICs). This technology, according to Renesas, should help it improve the power efficiency, charge times, performance and productivity of its offerings.
Last October, Renesas also bought Celeno, a company specializing in connectivity solutions. Celeno's Wi-Fi 6 and 6e technology promises to benefit Renesas's industrial automation portfolio by providing higher network capacity, improved signal reliability, lower latency and the ability to mitigate crowded networks.
[Slide: Our SOM Growth (Renesas)]
I believe all of this translates to a compelling growth story. In the figure above, you can see that Renesas anticipates its SOM (serviceable obtainable market) in industrial automation will outpace the growth of the segment's overall SAM (serviceable addressable market). In other words, not only is the market for industrial automation growing at an impressive rate (11% CAGR), but Renesas's market share is simultaneously growing within it and growing more quickly at that. Whereas Renesas claimed 8% of the overall industrial automation market in 2020, it projects it will secure 9% by 2025.
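Those two figures together, an 11% SAM CAGR and a share rising from 8% to 9% over 2020-2025, pin down the implied SOM growth rate. A back-of-the-envelope check in Python, using only the numbers cited above:

```python
def cagr(start, end, years):
    """Compound annual growth rate between two endpoint values."""
    return (end / start) ** (1 / years) - 1

sam_cagr = 0.11                      # cited SAM growth for industrial automation
share_2020, share_2025 = 0.08, 0.09  # cited market-share figures

sam_2020 = 1.0                             # normalized; absolute size cancels out
sam_2025 = sam_2020 * (1 + sam_cagr) ** 5  # SAM after five years of 11% growth
som_cagr = cagr(share_2020 * sam_2020, share_2025 * sam_2025, 5)
print(round(som_cagr, 3))  # ~0.136
```

In other words, the cited share gain implies Renesas's serviceable obtainable market compounds at roughly 13.6% a year, comfortably ahead of the 11% overall market growth.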
Motor control
This brings us to motor control, Renesas's second area of focus in IoT and IIoT. Typically, a motor system consists of gears, a motor and an encoder. Among the different motor types are BLDC motors (air conditioners, refrigerators), AC servo motors (typically seen in industrial automation environments), fan motors (PCs, servers) and stepping motors (printers, cameras). As mentioned briefly earlier, the movement away from AC motors towards BLDCs is a trend Renesas is gainfully riding.
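For a sense of what "motor control" means at the MCU level: BLDC motors are commonly driven by six-step (trapezoidal) commutation, in which the controller energizes two of the three phases at a time based on rotor position. The table below is a generic, illustrative sketch of that scheme, not Renesas code:

```python
# Six-step commutation: Hall-sensor sector (1..6) -> drive state for
# phases (A, B, C), where '+' = high side on, '-' = low side on,
# '0' = phase left floating.
COMMUTATION = {
    1: ('+', '-', '0'),  # current flows A -> B
    2: ('+', '0', '-'),  # A -> C
    3: ('0', '+', '-'),  # B -> C
    4: ('-', '+', '0'),  # B -> A
    5: ('-', '0', '+'),  # C -> A
    6: ('0', '-', '+'),  # C -> B
}

def next_step(step, direction=1):
    """Advance to the next commutation sector, wrapping 1..6."""
    return (step - 1 + direction) % 6 + 1

# One electrical revolution: six distinct phase pairs, then wrap around.
step = 1
for _ in range(6):
    a, b, c = COMMUTATION[step]
    step = next_step(step)
print(step)  # back at sector 1 after six steps
```

An MCU implements this with its PWM timer peripheral, switching the table entries on each Hall-sensor interrupt; the lookup-table structure is why even small microcontrollers handle it comfortably.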
Renesas's preexisting customer base is also an advantage in the motor control area. It has years of engineering know-how and customer relationships with makers of power tools, electric motors and machine tools. Its motor control base includes American household brands such as Whirlpool, GE Appliances and Emerson Electric (acquired by Nidec in Japan), European companies such as Miele, Grundfos and Bosch, APAC companies like Delta, Haier (parent company to GE Appliances) and Midea, and Japanese stalwarts Daikin, Hitachi and Mitsubishi Electric. According to Chittipeddi, these key customers form "a strong base" that is the primary driver behind Renesas's innovation engine.
[Slide: MCU No. 1 Share and Strength For Motor Control (Renesas)]
Renesas's market leadership in motor control has historically come from its strength in MCUs, or microcontrollers, that feature Renesas's proprietary cores. However, over the last several years, Renesas launched its RA family of MCUs, which instead leverage 32-bit Arm cores. The rollout of the RA family, starting with the RA2 in 2018, has been aggressive and impressive in equal measure. Though its leading competitor in the Arm ecosystem had a lengthy head start, having launched its first Arm-based MCU in 2010, Renesas has now effectively closed the gap from both a hardware and a software standpoint. As of 2020, Renesas has Arm-based MCUs that cover the whole range of motor control applications, from the low end (small household and kitchen appliances) to mid-range (large household appliances) to the high end (robotics and other highly advanced applications). With its upcoming RA8 MCUs, Renesas predicts it will finally overtake the competition with the most advanced Arm core on the MCU market.
[Slide: MCU Product For Motor Control Applications (Renesas)]
Wrapping up
Renesas has a busy 2022 on tap and I look forward to following the company more closely than I have in the past. As it currently stands, we'll be hearing more from Renesas at the Embedded World conference in Nuremberg, Germany, in June, the not-to-miss Arm Developer Summit in the Bay Area in October and the Electronica conference in Munich, Germany, in November. At these events, I'll be watching to see if the company's gains in industrial automation and motor control live up to current projections and how Renesas plans to keep the momentum going through 2022 and beyond.
Next week, I will be listening in on the company’s investor call to get the latest and greatest.

MY OPINION ONLY DO DYOR
FF

AKIDA BALLISTA
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 55 users

wilzy123

Founding Member
INCOMING

Global-Dossier-United-States-Patent-and-Trademark-Office (1).jpg


Global-Dossier-United-States-Patent-and-Trademark-Office.jpg



If you want to drill into the details of this filing:

  1. Visit https://globaldossier.uspto.gov/#/
  2. Up top where the search bar is on that page... select "Patent" from the 'type' dropdown... and enter 11429857 into the search field next to it
  3. Then click the "View Dossier" link on the second item in the results
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 114 users

equanimous

Norse clairvoyant shapeshifter goddess
I tried to make the point on the weekend that Brainchip's house has many rooms prepared and in one of those already occupied rooms is Renesas. In another is ARM. (Important to remember as you read the article below.) Renesas is one of those unromantic companies that most Australians and even Americans would not recognise by name even though they have a huge reach around the world and are the world wide number one supplier of MCU's. As part of my effort to evangelise to all new shareholders and even the not so new that Brainchip a bit like Renesas has according to the author of the following document quietly and effectively been building a world class ecosystem of partners and customers just as Renesas has over many decades. Take the time to read the following you will be impressed with Brainchips involvement with Renesas:


Renesas Rising Beyond Automotive Into IoT And Infrastructure

Patrick Moorhead
Senior Contributor
I write about disruptive companies, technologies and usage models.
Feb 25, 2022, 02:20pm EST



In the world of tech, success takes many forms. Some "it" companies blow up overnight. A lucky few of them manage to parlay the hype into long-term success, eventually finding their sea legs. I've got a particular appreciation for the slower-burn companies, the unsung heroes who toil behind the scenes, amassing impressive portfolios of IP over the years that you've probably used before without realizing.

One such company, in my mind, is Renesas Electronics Corporation, headquartered in Tokyo. Founded in 2002 (with operations beginning in 2010), Renesas is the #1 MCU supplier in the world and has been well known for its automotive capabilities for many decades.


Now, under the leadership of Representative Director, President and CEO Shibata-san, and guided by the mission of “Making People’s Lives Easier,” the company has expanded its scope into the IoT, industrial and infrastructure markets. I think the unsung piece of the Renesas story is the IIBU (IoT and Infrastructure Business Unit). When you combine next-generation cars with the booming industrial IoT space and smart cities and infrastructure, intelligent endpoints have never been a hotter commodity than they are right now. And quite the investment opportunity.

I’d like to spend some time talking about the IIBU.

(The full article is quoted in FF's post above.)
Cloud Computing Pyramid Scheme?
 
  • Haha
Reactions: 5 users

equanimous

Norse clairvoyant shapeshifter goddess
 
  • Like
  • Fire
  • Love
Reactions: 32 users