BRN Discussion Ongoing

Harwig

Regular
Hey Harwig, good to see you “Emerged”
Thank you. It's great to be a part of this forum. Lots of wonderful, insightful, thought-provoking comments.
 
  • Like
Reactions: 9 users

BaconLover

Founding Member
Screenshot_20220506-175915_Brave.jpg


Well, because Motley Drools always talk about Brainchip, it's only fair we get to share these around too.

Doesn't surprise me.

Motley Fools. Stay away from them. Cancel the subscription because your dollar is better spent elsewhere.
 
Last edited:
  • Like
  • Haha
  • Love
Reactions: 32 users

HopalongPetrovski

I'm Spartacus!
Someone must know something that we don't.
Or possibly just found out what we already know and thought....."Crikey, I betta get in before anyone else finds out!"

crikey.png
 
  • Like
  • Haha
  • Fire
Reactions: 15 users

equanimous

Norse clairvoyant shapeshifter goddess
Well, because Motley Drools always talk about Brainchip, it's only fair we get to share these around too.

Doesn't surprise me.

Motley Fools. Stay away from them. Cancel the subscription because your dollar is better spent elsewhere.
I wonder how much Motley Fool makes before and after publication.
 
  • Like
  • Fire
Reactions: 10 users

M_C

Founding Member
Here's a very interesting article published 5 hours ago. It's a Q&A with Michael Hurlston, the CEO of Synaptics, a developer of hardware and software used in touchpads in computers, autos and smart home devices. Lots of interesting info about their revenue, R&D spending, share price, difficulties in hiring, etc. TSMC is their largest supplier.

The article states: "In March, Synaptics showcased innovations at tinyML Summit, including edge AI tech with low-power SoCs relying on neural network engines for vision, sound-detection and speech processing."


But what really stood out to me was this section...

Extract Only

FE: How does the demand for lower power in chips affect you?


Hurlston: With almost everything we do, power is an issue and so having lower power makes a difference. With AI at the edge, we know that a big driver of the battery in a phone is the display, so we have a face-detect AI algorithm. So, when the phone is close to the face when someone’s in the act of talking on the phone, we shut down the display. It’s a simple AI algorithm and we lock it in. Our phone manufacturing customers beat it up to death in testing, so they are not having fake shutdowns. They try to trick it a hundred different ways.

Here's another example of AI and machine learning… Today, we can do simple things like read license plates or read meters or count people. We can count people coming out of a room. Instead of passing that task to a data center, you resolve it on a chip, and the advantage is power savings: you don’t need to have a big engine, with passing back to the data center and the latency involved and ultimately the cost.

Take the example of a general purpose, low-power camera handed to a customer with tinyML to do something like identify sick chickens in a chicken coop that are sneezing. That’s a very specific use case, but a general purpose approach takes data and generates an ML model compiled onto a chip.
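To make that people-counting example concrete, here is a rough sketch of the edge-versus-data-centre trade-off Hurlston is describing. It is purely my own illustration; the function names are hypothetical placeholders and nothing here is Synaptics code:

def tiny_person_detector(frame) -> int:
    """Placeholder for a small ML model compiled onto the chip; returns a people count."""
    return 0  # real hardware would run on-chip inference here

def edge_pipeline(frame, report):
    # All inference stays on the device; only a tiny result leaves it.
    count = tiny_person_detector(frame)
    report({"people": count})  # a few bytes upstream, no round trip to a data centre

def cloud_pipeline(frame, upload, await_result):
    # The alternative Hurlston contrasts with: ship the whole frame to a data centre,
    # paying transmission power, latency and hosting cost for every frame.
    upload(frame)
    return await_result()

edge_pipeline(frame=None, report=print)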

@Bravo


SAN JOSE, Calif., March 28, 2022 – Synaptics Incorporated (Nasdaq: SYNA) will demonstrate how its ultra-low-power, edge-based artificial intelligence (AI) SoC solutions can be used to quickly and easily implement vision, sound-event detection (SED), keyword spotting, and speech processing capabilities in a variety of applications at this week’s tinyML Summit (March 28-30). A Platinum sponsor of the gathering for the growing ecosystem of low-power machine learning (ML) developers, Synaptics will feature its Katana and DBM10L system-on-chip (SoC) ICs and development tools. Both processors have dedicated, on-chip neural network inference engines that are key to current and future ML applications and IoT devices.

In addition to the application demonstrations, an on-site poster will detail a power-efficient vision system for AI-based people-counting and occupancy applications.

“Ultra-low-power Edge solutions require localized intelligence that relies upon a unique combination of power-efficient hardware and sophisticated AI algorithms, yet must be readily adopted to new applications,” said Venkat Kodavati, SVP and Chief Product Officer at Synaptics. “We’ve developed example applications that show how effective both our Katana and DBM10L platforms and associated tools are at enabling training models for specific use cases for the IoT. They are well suited to help accelerate the adoption of tinyML for the microwatt era of smart and flexible battery-powered devices across a wide range of industries.”
 
  • Like
  • Fire
  • Love
Reactions: 15 users

chapman89

Founding Member
So are we sure Brainchip isn’t in this?

From the CTO of Mercedes Benz just posted on LinkedIn-


As announced earlier, I am delighted that we finally reached the next milestone in automated driving. The innovative DRIVE PILOT, our Level 3 conditional automated driving system, will be available for the #SClass as well as the fully electric #EQS from May 17th, 2022, on.

For me as Chief Technology Officer at Mercedes-Benz AG, it’s both exciting and a source of pride that we achieve the official German sales launch of this worldwide leading #technology that takes #luxury and #safety to a whole new level.

The activated system controls speed and distance while guiding the vehicle within its lane, considering all the time the course of the route, route events that may occur, as well as traffic signs and signals. The #groundbreaking technology builds on the environment sensor technology of the Driving Assistance package and includes additional sensors such as radar, LiDAR and cameras. In addition, ultrasonic and wetness sensors provide valuable data.

A #digital high-precision map provides a three-dimensional road and environment image with all information on road geometry, route characteristics, traffic signs and special traffic events.
Soon, we at #MercedesBenz want to obtain the official certification for DRIVE PILOT in the United States, provided the legal situation allows it – and of course, we will also be looking at other markets in the future.

So, stay tuned!

#AutomatedDriving #LeadInSoftware #Innovation #FutureOfMobility #DRIVEPILOT
_______
Conditionally automated driving (SAE Level 3): the automated driving function takes over certain driving tasks. However, a driver is still required. The driver must be ready to take control of the vehicle at all times when prompted to intervene by the vehicle. Orders of Mercedes-Benz DRIVE PILOT will start on May 17, 2022 in Germany.

Availability and use of future DRIVE PILOT features on motorways depends on options, countries and relevant laws. The pictures show a test drive on a city highway in a development vehicle with a professional test driver”
 
  • Like
  • Fire
  • Love
Reactions: 57 users

M_C

Founding Member
So are we sure Brainchip isn’t in this?

From the CTO of Mercedes Benz just posted on LinkedIn ...

"So are we sure Brainchip isn’t in this?"


no yes.gif
 
  • Like
  • Haha
  • Fire
Reactions: 29 users

RobjHunt

Regular
I'll ask Stevie Wonder who once received a cheese grater for Xmas. Said it was the most violent book he ever read.
An oldie but a goodie!
 
  • Like
  • Haha
Reactions: 5 users

Dozzaman1977

Regular
Wow the world MCU market is expected to reach 48 billion in sales by 2027...... And Renesas is a major player putting akida IP into a SET of MCU products, not just one product, a SET of products!!!!!!!!
Screenshot_20220506-183230.png
 
  • Like
  • Fire
  • Love
Reactions: 27 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Haha
  • Love
Reactions: 24 users
Wow the world MCU market is expected to reach 48 billion in sales by 2027...... And Renesas is a major player putting akida IP into a SET of MCU products, not just one product, a SET of products!!!!!!!!
I want to know what royalty Brainchip would get for a product?
 
  • Like
Reactions: 1 users

IloveLamp

Top 20
So are we sure Brainchip isn’t in this?

From the CTO of Mercedes Benz just posted on LinkedIn ...
Mercedes is using LUMINAR............what's good enough for the goose..............is good enough for the gander!


As manufacturers try to turn their vehicles into rolling smartphone-like devices, the race will revolve around the next big things in chips that upgrade infotainment and vision systems, as well as the car’s general controls.

“That is really the big change that’s happening in the industry,” Jim Rowan, a former top executive at BlackBerry and Dyson who started as CEO of Volvo Car last week, said Tuesday in an interview.

Capture.PNG
 
  • Like
  • Fire
Reactions: 8 users

M_C

Founding Member
 
  • Like
  • Haha
  • Fire
Reactions: 24 users

RobjHunt

Regular
The only disappointing thing about today's remarkable performance by Brainchip completely against the rest of the ASX is that it gave this group an opening to write an article and lure other unsuspecting investors into their trap.

My opinion only DYOR
FF

AKIDA BALLISTA
In my opinion, this copy-and-paste section, which has been a part of their commentary in past articles, will appear again and again:

Should you invest $1,000 in BrainChip right now?

Before you consider BrainChip, you'll want to hear this.

Motley Fool Investing expert Scott Phillips just revealed what he believes are the 5 best stocks for investors to buy right now... and BrainChip wasn't one of them.
 
  • Love
  • Like
Reactions: 2 users

cosors

👀
I think he or she is on safe ground saying this, as they use entirely different approaches and systems, so they are not compatible with one another. Put'em together and what have you got? A mess. No bipperty bobbity boo.

But @DingoBorat, there have been a number of publications, including from the US Defence Department, pointing out the shortfalls of these two different technologies, and even this morning the post put up by @Rocket577 included a paper out of IMEC which had the following to say:


II. COMPARISON WITH OTHER DIGITAL NEUROMORPHIC PLATFORMS

"To the best of our knowledge, the SpiNNaker architecture [3] is a closest neuromorphic platform to SENeCA. SpiNNaker contains several ARM cores as the processing units connected through an advanced single router star-type multicasting asynchronous packet-switched network. SpiNNaker2 [4] added several accelerated arithmetic processing units and advanced power management techniques in the GF22nm technology node. On the contrary, SENeCA uses one of the smallest open-source RISC-V processors as the controller (not used for event processing) together with optimized accelerators and a low-overhead mesh-type multicasting NoC (with reduced functionality compared to SpiNNaker) for sparse parallel event-based computation. Unlike SpiNNaker which is designed for the simulation of brain-inspired research, the primary purpose of SENeCA is to have both the hardware and software open for optimizations and innovations in the EdgeAI neuromorphic computation.

On the other hand, IBM TrueNorth [5] uses a plain mesh packet-switched network (uni-cast) but with optimized (inflexible) processing cores. Each core in the TrueNorth architecture emulates exactly 256 neurons. Each neuron has 256 input synapses, organized in a crossbar architecture, with a single output axon connected to 256 neurons in another core. This optimized processing core resulted in a power-efficient neuron update (about 26pJ). µBrain [6] goes further in optimized processing core and allows for ultra-low-power application-specific IP (in contrast with the multi-purpose neuromorphic processor).

In Intel Loihi [7], the processing cores are more flexible than TrueNorth, and the interconnect is a simple uni-cast packet-switched mesh. Also, Loihi cores accelerate a bio-inspired learning algorithm. The cost of this flexibility is having a higher neuron update energy (about 80pJ) in comparison with the TrueNorth (while using a better technology node). Loihi2 [8] scaled up the Loihi chip by packing more neurons and synapses in a die, using the Intel4 technology node. Additionally, it introduced programmable neurons with micro-code, a feature also available in SENeCA. Both Loihi chips accelerate a specific kind of bio-inspired learning mechanism on-chip."
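To put those quoted per-update energies (about 26pJ for TrueNorth and about 80pJ for Loihi) into perspective, here is a quick back-of-envelope sketch. The workload of one million neuron updates at 100 Hz is an arbitrary assumption of mine, not a figure from the paper:

PJ = 1e-12  # one picojoule in joules

energy_per_update = {"TrueNorth": 26 * PJ, "Loihi": 80 * PJ}  # figures quoted above

neurons = 1_000_000  # assumed number of neurons updated (illustrative only)
rate_hz = 100        # assumed update rate (illustrative only)

for chip, joules in energy_per_update.items():
    power_mw = joules * neurons * rate_hz * 1e3
    print(f"{chip}: about {power_mw:.1f} mW for {neurons:,} neurons updated at {rate_hz} Hz")

Even under this toy workload the neuron-update figure alone puts both chips in the low-milliwatt range, which is why the per-update energy is the number these papers keep comparing.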

The other thing to note about IBM's TrueNorth is that it makes no claims to having application at the Edge; that is not the territory it is trying to mark out for itself, and it still maintains it is in research.

Intel's Loihi 1 & 2 are consistently described as research chips only, and in the latest release from Intel posted here, Intel stated that it is yet to identify a use case for Loihi and that it may never be produced as a commercial chip and be utilised in the cloud.

If someone wants to use a commercial neuromorphic chip off the shelf at the edge, the three major players SpiNNaker, IBM and Intel have nothing available in their catalogues, so they are compelled to look elsewhere. Brainchip's AKIDA is the undisputed most versatile neuromorphic edge chip on the market today and can also be bought as IP.

Blind Freddie just cannot believe that the sighted people cannot see the bleeding obvious particularly when the following is there in public view for all to see:

1. Nvidia is partnered with Mercedes

2. Brainchip is partnered with Mercedes

3. SiFive is partnered with Nvidia for RISC-V

4. Brainchip is partnered with SiFive to bring Ai to RISC-V

5. Brainchip is partnered with MegaChips for automotive

6. Brainchip’s Rob Telson, when asked about competing with Nvidia, stated that they see Nvidia more as a partner in the future

7. Nviso is partnered with Brainchip

8. Nviso is working in robotics and is partnering with Brainchip specifically for this purpose

9. Nviso is partnered with Panasonic for robotics

10. MegaChips is also partnered with Brainchip for Industrial Robotics.

11. Brainchip is partnered with Valeo

12. Valeo is partnered with Mercedes for LiDAR

13. Valeo is partnered with Honda for LiDAR

Personally I do not believe anyone knows anymore than we do.

I believe they are just starting to catch up.

My anonymous opinion only so DYOR
FF

AKIDA BALLISTA
Thank you very much for your overview and for your tireless work!
I've only been with BRN since November and with the huge flow of information here it's hard for me to keep track of everything, including how things are connected.
Thanks to all of you who contribute so much! I don't miss the other place at all and feel like a fish in water here on TSE. A good friend will soon follow.
My thanks of course also @zeeb0t!
...I have made BRN my strongest position in the last few days ;)
 
  • Like
  • Fire
  • Love
Reactions: 44 users

RobjHunt

Regular
The interesting thing is, every time the Fool does an article on BRN, they always have 5 better buys - given how well BRN has done in the time I've held, I wish I had followed their advice.
Do I detect a hint of a jibe, Dio?
 
  • Like
  • Love
Reactions: 3 users
@Bravo

SAN JOSE, Calif., March 28, 2022 – Synaptics Incorporated (Nasdaq: SYNA) will demonstrate how its ultra-low-power, edge-based artificial intelligence (AI) SoC solutions can be used to quickly and easily implement vision, sound-event detection (SED), keyword spotting, and speech processing capabilities in a variety of applications at this week’s tinyML Summit (March 28-30) ...
I have read pages and pages about the Katana AI Edge platform from Synaptics, and despite the claim that it is low power and can run on a battery, they appear never to have said, since introducing the platform in 2020, what low power actually means.

I also have a suspicion that they use multiple Katana chips on the platform to run more than one function, but again they are a bit vague. It does seem the chips are all pre-trained for a particular function, so if you want to run a security camera it is set up for that purpose and cannot be further trained on-device. Same for voice.

They had a low-power function to count people entering a room, but to achieve this low power (not quantified) they used a low-resolution camera that provided images that would not allow you to identify the people.

Clearly the demonstration by Brainchip, available in the video section of its website, where AKIDA is identifying and naming people walking past a camera in a foyer serviced by lifts, is well beyond them at the sort of power draw of AKIDA.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 27 users

GStocks123

Regular
  • Like
  • Fire
  • Love
Reactions: 19 users
So are we sure Brainchip isn’t in this?

From the CTO of Mercedes Benz just posted on LinkedIn ...
I know what others have posted but here are the actual facts. If, as we all believe, AKIDA is being used by Valeo, then Brainchip is at the heart of this Mercedes advance:

Mercedes-Benz puts Drive Pilot on the road in its top models. (Photo: Mercedes-Benz)

In this half-year the time has come: Mercedes-Benz wants to claim the pioneering role in automated driving and bring Drive Pilot to the street. Shortly before the turn of the year the manufacturer was able to report that certification had been completed, after CEO Ola Källenius had announced it for 2021. With the approval of the Federal Motor Transport Authority (KBA), in accordance with the technical regulation UN-R157, the system is to launch in the new S-Class and then also be used in the fully electric EQS.

After Drive Pilot has already been largely presented to the public, it is now a matter of keeping to the company's own schedule. The market launch in Germany marks an important milestone. A further introduction is then planned for the end of the year, once there is a national legal framework that permits it.

Drive Pilot announces a change of perspective

Drive Pilot takes over the driving task on German autobahns as long as 60 kilometres per hour is not exceeded. According to the manufacturer, a total of 13,200 kilometres of routes has been approved. The driver may turn to secondary activities but must be prepared to take back control at all times. The system does not operate in tunnels and in parts of construction sites, when emergency vehicles are approaching, or in heavy rain.

“We are the first manufacturer to bring conditionally automated driving into series production in Germany,” says Markus Schäfer, Member of the Board of Management of Daimler and Mercedes-Benz AG and Chief Technology Officer. “At this decisive point we are once again demonstrating our pioneering work in automated driving, and we are also introducing a radical change of perspective.”

The new LiDAR from Valeo becomes a key feature

The technical basis of Drive Pilot is a combination of radar, LiDAR, camera and microphones. In addition, the system is supplied with information from HD maps in the background and a positioning system that, according to Mercedes, goes far beyond conventional GPS systems. Redundant steering and braking systems as well as a redundant on-board electrical network are intended to keep the vehicle manoeuvrable even in the event of a system failure.

The heart of the sensor set is Valeo's second-generation SCALA LiDAR, which works in all light conditions, filters out possible interference, such as raindrops, in software, and triggers its own cleaning system if the field of view is blocked by ice or dust. According to the supplier, this LiDAR generation is celebrating its series premiere in the new S-Class.

Vision of driverless parking

Not only driving but also parking is to become driverless. In step with Drive Pilot there is a memory parking assistant at SAE Level 2, which learns a specific parking manoeuvre, and the S-Class and EQS with the Intelligent Park Pilot are prepared for Automated Valet Parking (AVP) at SAE Level 4.

In multi-storey car parks with the necessary infrastructure, the car can be parked and retrieved completely automatically and without a driver via smartphone. The vehicle is dropped off at a designated area, drives itself to a free parking space and, when requested, returns to the pick-up area.

Mercedes-Benz explores the potential of synergies

While autonomous parking continues to be tested together with Bosch, in Stuttgart and Beijing among other places, the two partners have gone separate ways on the development of robotaxis since the summer of 2021. Among the remaining prominent allies, alongside Nvidia, is the American company Luminar, which is expected to bring higher-performing, cheaper LiDARs to series production. Although Markus Schäfer emphasises the importance of collaboration, the group appears to be concentrating on its own competence. Unlike competitors such as Volkswagen, it is explicitly focusing on autonomous driving functions for private vehicles, while always keeping the possibility of cooperation open.

A good example of this is the development partnership with BMW, which was put on ice in the summer of 2020 after just one year. After the “elephant wedding” in the autonomous-driving arena, which according to rumours Audi might also have joined, the German car makers returned to the normal state of affairs, a competitive relationship. Because of the “great effort of a common technological basis” and the “general commercial and economic conditions”, the partnership could not be implemented, it was said at the time. If Drive Pilot now hits the road in the near future, the Stuttgart company will have made a good start in the race against the competition.


My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 41 users

Diogenese

Top 20
I have read pages and pages about the Katana AI Edge platform from Synaptics, and despite the claim that it is low power and can run on a battery, they appear never to have said, since introducing the platform in 2020, what low power actually means ...

This looks like the inner edge ...

Synaptics:
US2021352347A1 ADAPTIVE VIDEO STREAMING SYSTEMS AND METHODS

Systems and method for streaming video content include downscaling video content using a downscaling model to generate downscaled video content and downloading the downscaled video content as a video stream and a corresponding upscaling model to a client device. The client device upscales the video stream using the received upscaling model for display by the client device in real-time. A training system trains the downscaling model to generate the downscaled video content, based on associated metadata identifying a type of video content. The downscaled video content and one or more associated upscaling models is stored for access by an edge server, which downloads a plurality of upscaling models to a client device configured to select an upscaling model for use by the client device. Example systems may include video streaming systems and video conferencing systems.

1651832161761.png


1651832096763.png


[0049] In various embodiments, the client device 550 may be configured to capture the camera stream at a resolution that both end points have determined to be optimal for the conditions, thereby avoiding the need to downscale the stream before transmission. For example, both end points can determine that they can stream at 720p and let the respective artificial intelligence (AI) upscaling models scale the streams to 4K. In other embodiments, peer-to-peer communications may be established without use of an intermediary session manager, for example, by using an application and/or protocol that determines the video resolution for streaming and predetermined upscaling neural network models for processing the incoming video stream(s). It will be appreciated that the video conferencing system may be used with more than two client devices in both the hosted and peer-to-peer implementations.
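As a rough sketch of the flow that abstract and paragraph [0049] describe (downscale on the server, ship the low-resolution stream plus a matching upscaling model, let the client upscale locally), assuming nothing about Synaptics' actual implementation; every class and function name below is a hypothetical placeholder:

from dataclasses import dataclass

@dataclass
class Frame:
    width: int
    height: int
    data: bytes = b""

def downscale(frame: Frame, factor: int) -> Frame:
    # Placeholder for the server-side downscaling model (e.g. 4K down to roughly 720p).
    return Frame(frame.width // factor, frame.height // factor)

def select_upscaling_model(content_type: str) -> str:
    # The patent describes choosing an upscaling model based on metadata that
    # identifies the type of video content; here it is just a label.
    return f"upscaler-for-{content_type}"

def client_upscale(frame: Frame, model: str, factor: int) -> Frame:
    # Placeholder for the client-side AI upscaler running the received model.
    return Frame(frame.width * factor, frame.height * factor)

# Server side: downscale once, pick a matching upscaler, send both to the client.
source = Frame(3840, 2160)            # 4K source frame
stream_frame = downscale(source, 3)   # roughly 720p on the wire
model = select_upscaling_model("video_conference")

# Client side: restore resolution locally, so only the small stream crosses the network.
restored = client_upscale(stream_frame, model, 3)
print(stream_frame, model, restored)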
I know what others have posted but here are the actual facts. If, as we all believe, AKIDA is being used by Valeo, then Brainchip is at the heart of this Mercedes advance ...

... After the “elephant wedding” in the autonomous-driving arena, which according to rumours Audi might also have joined, the German car makers returned to the normal state of affairs, a competitive relationship ...
I would think the elephant wedding in the autonomous-driving arena would create a considerable traffic hazard - both hands on the wheel please.
 
  • Like
  • Love
  • Fire
Reactions: 14 users