BRN Discussion Ongoing

Bravo

If ARM was an arm, BRN would be its biceps💪!
Ok, so status quo then and the instos will continue to lend their shares to shorters...... excellent 🤔

I think I just had an extremely unpleasant out-of-body experience.

 
  • Haha
Reactions: 6 users

Tothemoon24

Top 20
Apologies, I can’t seem to shorten it.

I’m a big fan of Tata / TCS; some interesting patent information below.
Generally speaking, this is above my hourly rate
$2.34


1
EP3913534A4
SYSTEM AND METHOD FOR REAL-TIME RADAR-BASED ACTION RECOGNITION USING SPIKING NEURAL NETWORK(SNN)
Application
Publication/Patent Number: EP3913534A4
Publication Date: 2021-11-24
Application Number: EP20213433
Filing Date: 2020-12-11
Inventors: Rani, Smriti; Dey, Sounak; George, Arun; Pal, Arpan; Banerjee, Dighanchal; Chakravarty, Tapas; Chowdhury, Arijit; Mukherjee, Arijit
Assignee: Tata Consultancy Services Limited
IPC: G06K9/00
Abstract: This disclosure relates generally to action recognition, and more particularly to a system and method for real-time radar-based action recognition. The classical machine learning techniques used for learning and inferring human actions from radar images are compute intensive and require volumes of training data, making them unsuitable for deployment on the network edge. The disclosed system utilizes neuromorphic computing and Spiking Neural Networks (SNNs) to learn human actions from radar data captured by radar sensor(s). In an embodiment, the disclosed system includes an SNN model having a data pre-processing layer, convolutional SNN (CSNN) layers and a classifier layer. The pre-processing layer receives radar data, including Doppler frequencies reflected from the target, and determines a binarized matrix. The CSNN layers extract features (spatial and temporal) associated with the target's actions based on the binarized matrix. The classifier layer identifies the type of action performed by the target based on the features.
2
EP3913534A1
SYSTEM AND METHOD FOR REAL-TIME RADAR-BASED ACTION RECOGNITION USING SPIKING NEURAL NETWORK(SNN)
Application
Publication/Patent Number: EP3913534A1
Publication Date: 2021-11-24
Application Number: EP20213433.4
Filing Date: 2020-12-11
Inventors: Dey, Sounak; Mukherjee, Arijit; Banerjee, Dighanchal; Rani, Smriti; George, Arun; Chakravarty, Tapas; Chowdhury, Arijit; Pal, Arpan
Assignee: Tata Consultancy Services Limited
IPC: G06K9/46
Abstract: This disclosure relates generally to action recognition, and more particularly to a system and method for real-time radar-based action recognition. The classical machine learning techniques used for learning and inferring human actions from radar images are compute intensive and require volumes of training data, making them unsuitable for deployment on the network edge. The disclosed system utilizes neuromorphic computing and Spiking Neural Networks (SNNs) to learn human actions from radar data captured by radar sensor(s). In an embodiment, the disclosed system includes an SNN model having a data pre-processing layer, convolutional SNN (CSNN) layers and a classifier layer. The pre-processing layer receives radar data, including Doppler frequencies reflected from the target, and determines a binarized matrix. The CSNN layers extract features (spatial and temporal) associated with the target's actions based on the binarized matrix. The classifier layer identifies the type of action performed by the target based on the features.
3
US20210365778A1
SYSTEM AND METHOD FOR REAL-TIME RADAR-BASED ACTION RECOGNITION USING SPIKING NEURAL NETWORK(SNN)
Application
Publication/Patent Number: US20210365778A1
Publication Date: 2021-11-25
Application Number: US17/122,041
Filing Date: 2020-12-15
Inventors: Dey, Sounak; Mukherjee, Arijit; Banerjee, Dighanchal; Rani, Smriti; George, Arun; Chakravarty, Tapas; Chowdhury, Arijit; Pal, Arpan
Assignee: Tata Consultancy Services Limited
IPC: G06N3/08
Abstract: This disclosure relates generally to action recognition, and more particularly to a system and method for real-time radar-based action recognition. The classical machine learning techniques used for learning and inferring human actions from radar images are compute intensive and require volumes of training data, making them unsuitable for deployment on the network edge. The disclosed system utilizes neuromorphic computing and Spiking Neural Networks (SNNs) to learn human actions from radar data captured by radar sensor(s). In an embodiment, the disclosed system includes an SNN model having a data pre-processing layer, convolutional SNN (CSNN) layers and a classifier layer. The pre-processing layer receives radar data, including Doppler frequencies reflected from the target, and determines a binarized matrix. The CSNN layers extract features (spatial and temporal) associated with the target's actions based on the binarized matrix. The classifier layer identifies the type of action performed by the target based on the features.
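
Reading between the big words, the pipeline the abstract describes has three stages: binarize a Doppler spectrogram, run it through convolutional spiking layers, and classify from the spike activity. Here is a minimal toy sketch of that flow in Python. To be clear, this is my own illustration and not Tata's actual implementation: the threshold, the kernel, the leaky integrate-and-fire (LIF) parameters and the four example action classes are all assumptions.

```python
# Toy sketch of the pipeline named in the abstract (NOT Tata's implementation).
# All parameters below (threshold, kernel, LIF constants, classes) are assumed.
import numpy as np

rng = np.random.default_rng(0)

# --- Pre-processing layer: binarize a micro-Doppler spectrogram -------------
# Rows = Doppler-frequency bins, columns = time frames (random toy data here).
spectrogram = rng.random((32, 64))
binarized = (spectrogram > 0.5).astype(np.float32)  # assumed fixed threshold

def conv2d_valid(x, k):
    """Plain 'valid' 2-D convolution (no padding), single channel."""
    h = x.shape[0] - k.shape[0] + 1
    w = x.shape[1] - k.shape[1] + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(x[i:i + k.shape[0], j:j + k.shape[1]] * k)
    return out

def lif_spikes(currents, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire over the time axis (columns): a neuron fires
    when its membrane potential crosses threshold, then resets to zero."""
    potential = np.zeros(currents.shape[0])
    spikes = np.zeros_like(currents)
    for t in range(currents.shape[1]):
        potential = leak * potential + currents[:, t]
        fired = potential >= threshold
        spikes[fired, t] = 1.0
        potential[fired] = 0.0
    return spikes

# --- Convolutional SNN layer: conv filter feeding LIF neurons ---------------
kernel = rng.standard_normal((3, 3)) * 0.5   # one illustrative filter
features = conv2d_valid(binarized, kernel)   # spatial features
spike_train = lif_spikes(features)           # temporal (spiking) features

# --- Classifier layer: linear readout over per-neuron spike counts ----------
n_actions = 4                                # e.g. walk / wave / fall / sit
readout_w = rng.standard_normal((n_actions, spike_train.shape[0])) * 0.1
scores = readout_w @ spike_train.sum(axis=1)
print("predicted action class:", int(np.argmax(scores)))
```

The point of doing it with spikes rather than classical ML is the one the abstract makes: events are sparse, so the compute at the edge scales with activity instead of with frame rate.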
 
  • Like
  • Fire
  • Love
Reactions: 9 users
Crikey!

Mercedes Says Level 4 Self-Driving Will Happen By 2030

Mar. 02, 2023, 6:32 PM ET · By Jay Traugott · Technology / 5 Comments
The German automaker has already beaten rivals for Level 3 approval.
Mercedes-Benz has gone on record stating that Level 4 self-driving is "doable" by decade's end. Speaking to Automotive News, the German automaker's Chief Technology Officer, Markus Schaefer, said, "private-owned Level 4 cars, absolutely. This is something that I see in the future."
Mercedes' Level 3 technology, called Drive Pilot, is the first of its kind in the US and has already been approved for use in Nevada; other states such as California are expected to follow suit in the coming months. For now, only the 2024 S-Class and EQS Sedan offer Level 3, both of which will go on sale in the second half of this year. Unlike Level 2+ systems such as GM's Super Cruise, Ford's Blue Cruise, and Tesla's Autopilot and Full Self-Driving, Drive Pilot can "hand over the dynamic driving task to the vehicle under certain conditions."
The system utilizes a combination of LiDAR, radar, and various other sensors to allow for safe highway driving at speeds of up to 40 mph.
[Images: Mercedes-Benz]

Yes, Level 3 makes it possible to text and drive (although we should point out that this is still illegal), but the driver must still be ready to assume control of the vehicle at a moment's notice if the system detects any sort of obstacle. Level 4 takes things up a notch. "Just imagine you are in a big city, and you come from work, and you are sitting for two hours in traffic, and you press the button and go to sleep," Schaefer added. "There will be a demand for that."
One key difference between Level 3 and Level 4 is that human drivers don't have to keep an eye on the road in most conditions. Level 4 is ideally suited for heavy traffic within cities, but things like extreme weather are a different matter.
Level 5, which requires absolutely no human involvement, is still years away from becoming possible. Companies like Waymo and Cruise are currently testing driverless taxis with that tech, but it's still far from perfect and remains expensive.



Mercedes is taking responsibility for Drive Pilot's accuracy and safety by assuming liability if, for example, the vehicle were the cause of a highway crash.
The carmaker's Level 3 headstart places it in a prime position to introduce Level 4. Its upcoming new Modular Architecture platform, due to launch in the middle of the decade, will come hardwired with Level 4 capability once the technology is ready and approved for use by government safety regulators. The race to Level 4 brings not only prestige and bragging rights but also significant revenue.
Automakers - especially luxury brands - know that consumers will be more than happy to pay more for the technology because it brings a lot of additional conveniences. But the most difficult task they will face - and Mercedes is no exception - is proving to the public that Level 4 is safe. The introduction of Level 3 Drive Pilot is a significant step in that direction.

I don’t want to inform on anyone, but was Mercedes-Benz eavesdropping on the last podcast, where Peter van der Made confessed to having his first version of beneficial artificial general intelligence available for full autonomous driving in approximately 7 years, which some might conclude is about 2030????

My speculation and giant dot joining so DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Thinking
Reactions: 22 users


Bravo

If ARM was an arm, BRN would be its biceps💪!
Apologies, I can’t seem to shorten it.

I’m a big fan of Tata / TCS; some interesting patent information. …
Sorry. Too many big words for me.
 
  • Haha
  • Like
Reactions: 4 users



Tothemoon24

Top 20
Anytime someone drops the "D" word (Doppler), I just start to lose consciousness a little bit.
Agree, Dio has the same effect on me ✍️
 
  • Haha
  • Like
  • Sad
Reactions: 6 users
So Teksun is a good example

"With BrainChip’s Akida processor, we will be able to deliver next-generation AIoT devices that are faster, more efficient, and more intelligent than ever before and not just meet, but exceed the expectations of our customers"

I have no doubt that they will have customers using Akida to solve their predictive maintenance within the next year.

As customers progress to adding Akida, they are not each going to take out a licensing agreement, are they? That could get quite involved legally, with individual agreements and Teksun having to contact us each time. Will Teksun take one out on behalf of all their clients? Or, given that we are in partnership with them, do we have a partnership agreement for a royalty share in the future??

If it is Teksun that takes it out, why not take it out now and write the contract so that payment isn't due until the first products are produced, etc.?
Well, if you follow the reasoning of the Federal Court in GetSwift and ASIC, such a conditional agreement, with payment conditioned on the first sale, would not constitute an agreement that can be announced on the ASX.

Why? Because those sales may never occur, and therefore a defined sum of money cannot be calculated by the Board.

My opinion only so take real live legal advice DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 10 users

Diogenese

Top 20
Just picked this up over at Talga:




[Two screenshots attached]
 
  • Like
  • Fire
Reactions: 14 users

Steve7777

Regular
Hi all, can someone please explain to me how we stayed in the ASX 200 when we're lucky to be in the top 300? I really don't understand.





 

  • Like
  • Thinking
Reactions: 5 users
Anytime someone drops the "D" word (Doppler), I start to lose consciousness a little bit.
What are they teaching the young ones these days in school? We did the Doppler effect in first-year science, before computers, mobile phones and Google.

Our science teacher even had us stand on the hockey field while she drove past with her car horn held on, to demonstrate it in practice.
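
For the young ones who missed that class: a moving sound source shifts the observed frequency to f_obs = f × v_sound / (v_sound ∓ v_source), higher on approach and lower going away. A quick back-of-the-envelope version of that hockey-field demo in Python; the 400 Hz horn and 60 km/h drive-past are my assumptions, not anything from the lesson.

```python
# Back-of-the-envelope Doppler shift for the hockey-field demo.
# Assumed values: ~400 Hz car horn, 60 km/h drive-past, 343 m/s speed of sound.
V_SOUND = 343.0    # m/s, speed of sound in air at roughly 20 degrees C
f_source = 400.0   # Hz, typical car horn (assumption)
v_car = 60 / 3.6   # 60 km/h converted to m/s

f_approaching = f_source * V_SOUND / (V_SOUND - v_car)  # source moving toward you
f_receding = f_source * V_SOUND / (V_SOUND + v_car)     # source moving away
print(f"approaching: {f_approaching:.1f} Hz, receding: {f_receding:.1f} Hz")
# ~420 Hz on approach vs ~381 Hz going away: the pitch drop you hear as the
# car passes, and the same kind of shift a radar measures to detect motion.
```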

😕🙁🙁🙁🙁🙁😩😩😫
 
  • Haha
  • Like
Reactions: 12 users

ndefries

Regular
Well if you follow the reasoning of the Federal Court in GetSwift and ASIC such a conditional agreement with payment conditioned on the first sale would not constitute an agreement that can be announced on the ASX.

Why because those sales may never occur and therefore a defined sum of money cannot be calculated by the Board.

My opinion only so take real live legal advice DYOR
FF

AKIDA BALLISTA
Thanks FF. I agree with you, and this could be true for all our partnerships, Prophesee included. So we could have already come to some terms for income from Qualcomm, for example, with no need to disclose this as sales are some time off. Would it also mean a licence is never needed??
 
  • Like
  • Thinking
Reactions: 5 users
Hi all, can someone please explain to me how we stayed in the ASX 200 when we're lucky to be in the top 300? I really don't understand.





You just gotta take 300, minus the 200, times the actual current position, divided by MC plus SOI, times Pi, and obtain the square root.....

Yeah, me neither :ROFLMAO:
 
  • Haha
  • Like
Reactions: 22 users

wilzy123

Founding Member
I don’t want to inform on anyone, but was Mercedes-Benz eavesdropping on the last podcast, where Peter van der Made confessed to having his first version of beneficial artificial general intelligence available for full autonomous driving in approximately 7 years, which some might conclude is about 2030????

My speculation and giant dot joining so DYOR
FF

AKIDA BALLISTA

John Carmack believes we will see fully autonomous vehicles by 2030.

 
  • Like
  • Fire
Reactions: 9 users

VictorG

Member
It's probably the investment funds selling most, if not all, of their BRN holdings ahead of the ASX 200 index rebalance. I expect a few more days of this before the share price can bounce back.

The question that has me thinking is: if the institutions that are selling down their BRN holdings had also lent their shares out to shorters, wouldn't the shorters have to buy back their shorts as well? Should we expect a few large cross trades in the next few days?
With only 30 minutes to go before the close of the week, I found a way to cover my face with egg. Happy to be wrong.
BRN now aiming for the ASX 20.
 
  • Like
  • Haha
Reactions: 17 users

Evermont

Stealth Mode
Hi all, can someone please explain to me how we stayed in the ASX 200 when we're lucky to be in the top 300? I really don't understand.






Because the market makers don't panic based on short-term price movements like some shareholders. 🪁
 
  • Like
  • Fire
  • Haha
Reactions: 25 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
You just gotta take 300, minus the 200, times the actual current position, divided by MC plus SOI, times Pi, and obtain the square root.....

Yeah, me neither :ROFLMAO:
I have no idea either. I mean, I only just this minute learnt from Fact Finder what the Doppler effect was! However, I do know what a Doppelgänger is, and obviously it's a much bigger word, which I only know on account of having a striking resemblance to Angelina Jolie. Just kidding.

 
  • Haha
  • Like
Reactions: 10 users

Dozzaman1977

Regular
I'll put my hand up and say I was wrong. I thought we would be out of the ASX 200 going by weighted market cap over the period.

That's good news that we stay in, mate!!
Interesting that Life360 got added to the ASX 200 with a market cap of only $920 million 🤔
 
Last edited:
  • Like
  • Haha
Reactions: 11 users

Diogenese

Top 20
Thanks FF. I agree with you, and this could be true for all our partnerships, Prophesee included. So we could have already come to some terms for income from Qualcomm, for example, with no need to disclose this as sales are some time off. Would it also mean a licence is never needed??
My previously mentioned thought on some of the partnerships is that they may be joint ventures in which both parties agree to co-develop a product (possibly Socionext, Prophesee, ...). A joint venture is not a licence agreement: there is no licence fee involved and there are no royalties payable. Instead, the parties share the revenue from sales in proportion to their contributions to the JV.

A short time ago someone posted an email from Tony to the effect that partnerships are different from customer licence arrangements.
 
Last edited:
  • Like
  • Fire
  • Thinking
Reactions: 22 users