Bravo
If ARM was an arm, BRN would be its biceps💪!
Ok, so status quo then and the instos will continue to lend their shares to shorters...... excellent
I think I just had an extremely unpleasant out of body experience.
> Ok, so status quo then and the instos will continue to lend their shares to shorters...... excellent
I don’t want to inform on anyone, but was Mercedes-Benz eavesdropping on the last podcast, where Peter van der Made confessed to having his first version of beneficial artificial general intelligence available for full autonomous driving in approximately 7 years, which some might conclude is about 2030???? Crikey!
Mercedes Says Level 4 Self-Driving Will Happen By 2030
Mar. 02, 2023, 6:32 PM ET, by Jay Traugott
The German automaker has already beaten its rivals to Level 3 approval.
Mercedes-Benz has gone on record stating that Level 4 self-driving is "doable" by decade's end. Speaking to Automotive News, the German automaker's Chief Technology Officer, Markus Schaefer, said, "private-owned Level 4 cars, absolutely. This is something that I see in the future."
Mercedes' Level 3 technology, called Drive Pilot, is the first of its kind in the US and has already been approved for use in Nevada; other states, such as California, are expected to follow suit in the coming months. For now, only the 2024 S-Class and EQS Sedan offer Level 3, both of which will go on sale in the second half of this year. Unlike Level 2+ systems like GM's Super Cruise, Ford's BlueCruise, and Tesla's Autopilot and Full Self-Driving, Drive Pilot can "hand over the dynamic driving task to the vehicle under certain conditions."
The system utilizes a combination of LiDAR, radar, and various sensors to allow for safe highway driving at speeds of up to 40 mph.
Yes, Level 3 makes it possible to text and drive (although we should point out that this is still illegal), but the driver must still be ready to assume control of the vehicle at a moment's notice if the system detects any sort of obstacle. Level 4 takes things up a notch. "Just imagine you are in a big city, and you come from work, and you are sitting for two hours in traffic, and you press the button and go to sleep," Schaefer added. "There will be a demand for that."
One key difference between Level 3 and Level 4 is that, with Level 4, human drivers don't have to keep an eye on the road in most conditions. Level 4 is ideally suited to heavy traffic within cities, but things like extreme weather are a different matter.
Level 5, which requires absolutely no human involvement, is still years away. Companies like Waymo and Cruise are currently testing driverless taxis, but the technology is still far from perfect and remains expensive.
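For reference, here are the levels the article keeps contrasting, paraphrased from its own descriptions (not the formal SAE J3016 wording), as a quick sketch:

```python
# Cheat sheet of the SAE driving-automation levels as this article frames
# them. Paraphrased summary only; consult SAE J3016 for the real definitions.
SAE_LEVELS = {
    2: "Driver assistance (Super Cruise, BlueCruise, Autopilot): the car "
       "helps, but the driver supervises at all times.",
    3: "Conditional automation (Drive Pilot): the car drives itself under "
       "set conditions; the driver must be ready to take back control.",
    4: "High automation: no driver attention needed within an approved "
       "domain such as city traffic; extreme weather is still excluded.",
    5: "Full automation: no human involvement at all; not yet achievable.",
}

for level, description in SAE_LEVELS.items():
    print(f"Level {level}: {description}")
```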
Mercedes is taking responsibility for Drive Pilot's accuracy and safety by assuming liability if the vehicle were to be the cause of a highway crash, for example.
The carmaker's Level 3 head start places it in a prime position to introduce Level 4. Its upcoming Modular Architecture platform, due to launch in the middle of the decade, will come hardwired with Level 4 capability once the technology is ready and approved by government safety regulators. The race to Level 4 brings not only prestige and bragging rights but also significant revenue.
Automakers - especially luxury brands - know that consumers will be more than happy to pay more for the technology because it brings a lot of additional conveniences. But the most difficult task they will face - and Mercedes is no exception - is proving to the public that Level 4 is safe. The introduction of Level 3 Drive Pilot is a significant step in that direction.
Source: carbuzz.com
And, no, I don't mean YAMULKAS, before anyone asks.
Sorry. Too many big words for me.
> Apologies, I can’t seem to shorten it.
I’m a big fan of Tata / TCS; some interesting patent information follows.
Generally speaking, this is above my hourly rate
$2.34
1. EP3913534A4 (Application): SYSTEM AND METHOD FOR REAL-TIME RADAR-BASED ACTION RECOGNITION USING SPIKING NEURAL NETWORK (SNN)
Publication/Patent Number: EP3913534A4 | Publication Date: 2021-11-24 | Application Number: EP20213433 | Filing Date: 2020-12-11 | Inventors: Rani, Smriti; Dey, Sounak; George, Arun; Pal, Arpan; Banerjee, Dighanchal; Chakravarty, Tapas; Chowdhury, Arijit; Mukherjee, Arijit | Assignee: Tata Consultancy Services Limited | IPC: G06K9/00
Abstract: This disclosure relates generally to action recognition, and more particularly to a system and method for real-time radar-based action recognition. The classical machine learning techniques used for learning and inferring human actions from radar images are compute intensive and require volumes of training data, making them unsuitable for deployment on the network edge. The disclosed system utilizes neuromorphic computing and Spiking Neural Networks (SNNs) to learn human actions from radar data captured by radar sensor(s). In an embodiment, the disclosed system includes an SNN model having a data pre-processing layer, Convolutional SNN (CSNN) layers and a classifier layer. The pre-processing layer receives radar data, including Doppler frequencies reflected from the target, and determines a binarized matrix. The CSNN layers extract spatial and temporal features associated with the target's actions based on the binarized matrix. The classifier layer identifies the type of action performed by the target based on those features.

2. EP3913534A1 (Application): same title as above
Publication/Patent Number: EP3913534A1 | Publication Date: 2021-11-24 | Application Number: EP20213433.4 | Filing Date: 2020-12-11 | Inventors: Dey, Sounak; Mukherjee, Arijit; Banerjee, Dighanchal; Rani, Smriti; George, Arun; Chakravarty, Tapas; Chowdhury, Arijit; Pal, Arpan | Assignee: Tata Consultancy Services Limited | IPC: G06K9/46
Abstract: identical to entry 1.

3. US20210365778A1 (Application): same title as above
Publication/Patent Number: US20210365778A1 | Publication Date: 2021-11-25 | Application Number: US17/122,041 | Filing Date: 2020-12-15 | Inventors: Dey, Sounak; Mukherjee, Arijit; Banerjee, Dighanchal; Rani, Smriti; George, Arun; Chakravarty, Tapas; Chowdhury, Arijit; Pal, Arpan | Assignee: Tata Consultancy Services Limited | IPC: G06N3/08
Abstract: identical to entry 1.
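In plain terms, the abstract describes a three-stage pipeline: Doppler radar frames are binarized, convolutional SNN layers extract spatio-temporal features, and a classifier names the action. Here is a minimal toy sketch of that data flow, assuming made-up radar data, a crude leaky integrate-and-fire stand-in for the CSNN layers, and a nearest-centroid classifier; none of this is the actual patented TCS method.

```python
# Toy sketch of the pipeline the abstract describes: Doppler frames ->
# binarized matrix -> spiking feature extraction -> classification.
# Thresholds, the LIF model and the classifier are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def binarize(doppler_frames, threshold=0.5):
    """Pre-processing layer: turn Doppler magnitudes into a 0/1 spike matrix."""
    return (doppler_frames > threshold).astype(np.uint8)

def lif_features(spikes, decay=0.8):
    """Stand-in for the CSNN layers: a leaky integrate-and-fire pass over
    time, yielding one membrane-potential feature per Doppler bin."""
    v = np.zeros(spikes.shape[1])
    for frame in spikes:           # one time step per radar frame
        v = decay * v + frame      # leak, then integrate incoming spikes
    return v

# Fake "radar" data: two action classes with different Doppler signatures.
walk = rng.random((50, 32)) * np.linspace(0.2, 1.0, 32)  # energy in high bins
wave = rng.random((50, 32)) * np.linspace(1.0, 0.2, 32)  # energy in low bins

centroids = {name: lif_features(binarize(x))
             for name, x in [("walk", walk), ("wave", wave)]}

def classify(doppler_frames):
    """Classifier layer: nearest centroid over the extracted features."""
    f = lif_features(binarize(doppler_frames))
    return min(centroids, key=lambda k: np.linalg.norm(centroids[k] - f))

test = rng.random((50, 32)) * np.linspace(0.2, 1.0, 32)  # walk-like signature
print(classify(test))  # expected: walk
```

The shape of the data flow is the point here; the patent's trained convolutional spiking layers would replace the toy LIF loop.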
Me too.
> Sorry. Too many big words for me.
Anytime someone drops the "D" word (Doppler), I start to lose consciousness a little bit.
> Me too.
I just like SNN and neuromorphic.
Agree, Dio has the same effect on me.
> Anytime someone drops the "D" word (Doppler), I just start to lose consciousness a little bit.
Well, if you follow the reasoning of the Federal Court in GetSwift and ASIC, such a conditional agreement with payment conditioned on the first sale would not constitute an agreement that can be announced on the ASX.
> So Teksun is a good example:
> "With BrainChip’s Akida processor, we will be able to deliver next-generation AIoT devices that are faster, more efficient, and more intelligent than ever before and not just meet, but exceed the expectations of our customers."
> I have no doubt that they will have customers with Akida solving their predictive maintenance within the next year.
> As customers progress to adding Akida, they are not each going to take out a licensing agreement, are they? That can get quite involved legally, with individual agreements and Teksun having to contact us each time. Will Teksun take one out on behalf of all their clients? Or, given that we are in partnership with them, do we have a partnership agreement for a royalty share in the future?
> If it is Teksun that takes it out, why not take it out now and write the contract so that payment isn't due until the first products are produced, etc.?
What are they teaching the young ones these days in school? We did the Doppler Effect in first-year science, before computers, mobile phones and Google.
> Anytime someone drops the "D" word (Doppler), I start to lose consciousness a little bit.
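For anyone whose first-year science is rusty: a radar sees a moving target as a frequency shift of the reflected wave, f_d = 2 * v * f_tx / c for the two-way trip. A quick worked example with illustrative numbers (a 77 GHz automotive radar and a walking person; nothing here comes from a BrainChip or TCS spec):

```python
# Two-way Doppler shift for a radar: f_d = 2 * v * f_tx / c.
C = 3e8       # speed of light, m/s
f_tx = 77e9   # assumed automotive radar carrier frequency, Hz
v = 1.5       # assumed radial speed of a walking person, m/s

f_d = 2 * v * f_tx / C
print(f"Doppler shift: {f_d:.0f} Hz")  # ~770 Hz

# Limb motion modulates v over time; stacking these shifts frame by frame
# gives the micro-Doppler pattern an action classifier (SNN or otherwise)
# learns to recognize.
```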
Thanks FF. I agree with you, and this could be true for all our partnerships, Prophesee included. So we could have already come to some terms for income from Qualcomm, for example, with no need to share this as sales are some time off. It would also mean a licence is never needed?
> Well, if you follow the reasoning of the Federal Court in GetSwift and ASIC, such a conditional agreement with payment conditioned on the first sale would not constitute an agreement that can be announced on the ASX.
> Why? Because those sales may never occur, and therefore a defined sum of money cannot be calculated by the Board.
> My opinion only, so take real live legal advice. DYOR.
> FF
> AKIDA BALLISTA
You just gotta take 300 minus the 200, times the actual current position, divided by MC + SOI, times Pi, and obtain the square root...
> Hi all, can someone please explain to me how we stayed in the ASX 200 when we're lucky to be in the top 300? I really don't understand.

I don’t want to inform on anyone, but was Mercedes-Benz eavesdropping on the last podcast, where Peter van der Made confessed to having his first version of beneficial artificial general intelligence available for full autonomous driving in approximately 7 years, which some might conclude is about 2030????
My speculation and giant dot-joining, so DYOR.
FF
AKIDA BALLISTA
With only 30 minutes to go before the close of the week, I found a way to cover my face with egg. Happy to be wrong.
> It's probably the investment funds selling most, if not all, of their BRN holdings ahead of the ASX 200 index rebalance. I expect a few more days of this before the share price can bounce back.
> The question that has me thinking is: if the institutions that are selling down their BRN holdings had also lent their shares out to shorters, wouldn't the shorters have to buy back their shorts as well? Should we expect a few large cross trades in the next few days?
Hi all, can someone please explain to me how we stayed in the ASX 200 when we're lucky to be in the top 300? I really don't understand.

I have no idea either. I mean, I only just this minute learnt from Fact Finder what the Doppler Effect was! However, I do know what a Doppelgänger is, and obviously it's a much bigger word, which I only know on account of having a striking resemblance to Angelina Jolie. Just kidding.
> You just gotta take 300 minus the 200, times the actual current position, divided by MC + SOI, times Pi, and obtain the square root...
Yeah, me neither.
I put my hand up and say I was wrong. I thought we would be out of the ASX 200, going by weighted market cap over the period.
My previously mentioned thoughts on some of the partnerships are that they may be joint ventures where both parties agree to co-develop a product (possibly Socionext, Prophesee, ...). This is not a licence agreement. There is no licence fee involved and there are no royalties payable. The parties share in the revenue from sales in proportion to their contribution to the JV.
> Thanks FF. I agree with you, and this could be true for all our partnerships, Prophesee included. So we could have already come to some terms for income from Qualcomm, for example, with no need to share this as sales are some time off. It would also mean a licence is never needed?
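A toy worked example of that proportional split, with purely hypothetical contribution shares and sales figures:

```python
# JV revenue sharing as described above: no licence fee, no royalties, just
# sales revenue split by contribution. All numbers are hypothetical.
contributions = {"BrainChip": 0.4, "JV partner": 0.6}  # assumed shares
revenue = 1_000_000                                    # assumed sales, $

for party, share in contributions.items():
    print(f"{party}: ${revenue * share:,.0f}")  # BrainChip: $400,000, etc.
```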