BRN Discussion Ongoing

Thanks @sonofkong ... can someone please tell me how to download the document? Can't seem to do it on LinkedIn.





Markus Schafer’s neuromorphic article is finally up

 
  • Like
Reactions: 2 users

Moonshot

Regular
The Valeo expert commented on the post, stating "I can confirm that it is a long way to go but very promising". That doesn't fill me with any confidence that we will be in any Valeo products anytime soon.
I didn't see any Valeo person in the comments on Markus' post when I checked just now. Can you send a screenshot?
 
  • Like
Reactions: 1 users

BaconLover

Founding Member
  • Like
Reactions: 5 users
He can't sell the place. It's been for sale for years!

EDIT: the listing says it's been on the market for 1884 days :confused:. If we get taken over for $23, maybe I'll take it off his hands. :cool:
No wonder he can't sell the place, who on Earth would need 15.4 bathrooms, haha
 
  • Haha
Reactions: 5 users

alwaysgreen

Top 20
I didn't see any Valeo person in the comments on Markus' post when I checked just now. Can you send a screenshot?
I think he's talking about this bloke

1674012408393.png


1674012423556.png
 
  • Like
Reactions: 9 users

Kachoo

Regular
I didn't see any Valeo person in the comments on Markus' post when I checked just now. Can you send a screenshot?
Soft Basher I think lol.
 
  • Haha
  • Like
Reactions: 4 users

Diogenese

Top 20
The Valeo expert commented on the post, stating "I can confirm that it is a long way to go but very promising". That doesn't fill me with any confidence that we will be in any Valeo products anytime soon.


https://www.valeo.com/en/valeo-scala-lidar/

Valeo’s third-generation laser LiDAR technology, which is scheduled to hit the market in 2024, will take autonomous driving even further, making it possible to delegate driving to the vehicle in many situations, including at speeds of up to 130 km/h on the highway. Even at high speeds on the highway, autonomous vehicles equipped with this system are able to manage emergency situations autonomously.
...

  • 2018: Valeo was the first company in the world to run an autonomous vehicle in central Paris, in 100% autonomous driving mode, with Valeo Drive4U, equipped exclusively with series-produced sensors
  • 2021: The Honda Legend and the Mercedes-Benz S-Class are the first cars to have reached level 3 automation in the market. Both models are fitted with Valeo’s LiDAR technology.


I guess they will be doing some testing of Scala 3 before it hits the market. Admittedly, hitting the market in 2024 could mean December 2024, but who releases a new car at Christmas?

We haven't seen anything to prove Scala 3 has Akida, but that's where the smart money is.
 
  • Like
  • Fire
  • Love
Reactions: 29 users

MrNick

Regular
  • Like
Reactions: 1 users

Diogenese

Top 20
  • Haha
  • Like
Reactions: 17 users

tjcov87

Member
 
  • Like
  • Fire
  • Love
Reactions: 31 users

Damo4

Regular


Wow. Not sure what else to say, but the identification of the shark species is fantastic.
Don't care whose tech this is, I'm impressed.

It saw the bull shark before I did.
 
  • Like
Reactions: 15 users

Damo4

Regular
Wow. Not sure what else to say, but the identification of the shark species is fantastic.
Don't care whose tech this is, I'm impressed.

It saw the bull shark before I did.

Was MobileNet V1 one of the things listed somewhere very recently in regards to Akida? On either the benchmarking paper or something released recently regarding platforms?


"Over the last five years remotely piloted drones have become the tool of choice to spot potentially dangerous sharks in New South Wales, Australia. They have proven to be a more effective, accessible and cheaper solution compared to crewed aircraft. However, the ability to reliably detect and identify marine fauna is closely tied to pilot skill, experience and level of fatigue. Modern computer vision technology offers the possibility of improving detection reliability and even automating the surveillance process in the future. In this work we investigate the ability of commodity deep learning algorithms to detect marine objects in video footage from drones, with a focus on distinguishing between shark species. This study was enabled by the large archive of video footage gathered during the NSW Department of Primary Industries Drone Trials since 2016. We used this data to train two neural networks, based on the ResNet-50 and MobileNet V1 architectures, to detect and identify ten classes of marine object in 1080p resolution video footage. Both networks are capable of reliably detecting dangerous sharks: 80% accuracy for ResNet-50 and 78% for MobileNet V1 when tested on a challenging external dataset, which compares well to human observers. The object detection models correctly detect and localise most objects, produce few false-positive detections and can successfully distinguish between species of marine fauna in good conditions. We find that shallower network architectures, like MobileNet V1, tend to perform slightly worse on smaller objects, so care is needed when selecting a network to match deployment needs. We show that inherent biases in the training set have the largest effect on reliability. Some of these biases can be mitigated by pre-processing the data prior to training, however, this requires a large store of high resolution images that supports augmentation. 
A key finding is that models need to be carefully tuned for new locations and water conditions. Finally, we built an Android mobile application to run inference on real-time streaming video and demonstrated a working prototype during field trials run in partnership with Surf Life Saving NSW."
 
  • Like
  • Love
  • Fire
Reactions: 18 users

Diogenese

Top 20
Was MobileNet V1 one of the things listed somewhere very recently in regards to Akida? On either the benchmarking paper or something released recently regarding platforms?


"Over the last five years remotely piloted drones have become the tool of choice to spot potentially dangerous sharks in New South Wales, Australia. They have proven to be a more effective, accessible and cheaper solution compared to crewed aircraft. However, the ability to reliably detect and identify marine fauna is closely tied to pilot skill, experience and level of fatigue. Modern computer vision technology offers the possibility of improving detection reliability and even automating the surveillance process in the future. In this work we investigate the ability of commodity deep learning algorithms to detect marine objects in video footage from drones, with a focus on distinguishing between shark species. This study was enabled by the large archive of video footage gathered during the NSW Department of Primary Industries Drone Trials since 2016. We used this data to train two neural networks, based on the ResNet-50 and MobileNet V1 architectures, to detect and identify ten classes of marine object in 1080p resolution video footage. Both networks are capable of reliably detecting dangerous sharks: 80% accuracy for ResNet-50 and 78% for MobileNet V1 when tested on a challenging external dataset, which compares well to human observers. The object detection models correctly detect and localise most objects, produce few false-positive detections and can successfully distinguish between species of marine fauna in good conditions. We find that shallower network architectures, like MobileNet V1, tend to perform slightly worse on smaller objects, so care is needed when selecting a network to match deployment needs. We show that inherent biases in the training set have the largest effect on reliability. Some of these biases can be mitigated by pre-processing the data prior to training, however, this requires a large store of high resolution images that supports augmentation. 
A key finding is that models need to be carefully tuned for new locations and water conditions. Finally, we built an Android mobile application to run inference on real-time streaming video and demonstrated a working prototype during field trials run in partnership with Surf Life Saving NSW."
Hi Damo,

MobileNet is an open-source family of neural network models, widely used to test and train NNs.

One of the stats re Akida you may have seen is the time it takes to classify images with one of these models.

There are various versions adapted for different subject matter.
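To make the MobileNet point a bit more concrete: the reason MobileNet V1 is a popular edge benchmark is that it swaps each standard convolution for a depthwise convolution plus a 1x1 pointwise convolution, which slashes the weight count. The sketch below is my own illustration (not from BrainChip or the shark paper); the layer sizes are arbitrary examples.

```python
# Illustrative sketch: why MobileNet V1-style layers are cheap.
# A standard KxK convolution needs K*K*C_in*C_out weights; a depthwise
# separable layer needs only K*K*C_in (depthwise) + C_in*C_out (pointwise).

def standard_conv_params(k, c_in, c_out):
    """Weight count of one standard KxK convolution (biases ignored)."""
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    """Weight count of a depthwise KxK conv plus a 1x1 pointwise conv."""
    return k * k * c_in + c_in * c_out

if __name__ == "__main__":
    k, c_in, c_out = 3, 32, 64                        # example layer sizes
    std = standard_conv_params(k, c_in, c_out)        # 18432
    sep = depthwise_separable_params(k, c_in, c_out)  # 2336
    print(std, sep, round(std / sep, 1))              # roughly 7.9x fewer weights
```

For 3x3 kernels the separable form uses roughly 8-9x fewer weights per layer, which is why shallower or narrower variants exist for different deployment needs.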
 
Last edited:
  • Like
Reactions: 22 users

Damo4

Regular
Hi Damo,

MobileNet is an open-source family of neural network models, widely used to test and train NNs.

One of the stats re Akida you may have seen is the time it takes to classify images with one of these models.

There are various versions adapted for different subject matter.
Thank you @Diogenese I knew I had seen it somewhere!
 
  • Like
Reactions: 2 users

Diogenese

Top 20
Tough times for tech:

https://www.msn.com/en-au/news/tech...pc=U531&cvid=4d53fcc86fe24e50857948dfedf6763c

Microsoft set to lay off thousands of employees tomorrow

Story by Tom Warren • 8h ago

Microsoft is preparing to announce job cuts tomorrow. Sky News reports that thousands of roles will be cut, with the software giant said to be looking at cutting around 5 percent of its workforce. With more than 220,000 employees at Microsoft, that could mean more than 10,000 layoffs.
...
The cuts also come just weeks after Microsoft CEO Satya Nadella warned of two years of challenges ahead for the tech industry. In an interview with CNBC, Nadella admitted Microsoft wasn’t “immune to the global changes” and spoke of the need for tech companies to be efficient.

“The next two years are probably going to be the most challenging,” said Nadella. “We did have a lot of acceleration during the pandemic, and there’s some amount of normalization of that demand. And on top of it, there is a real recession in some parts of the world.”
 
  • Sad
  • Like
  • Wow
Reactions: 23 users

Rskiff

Regular
Tough times for tech:

https://www.msn.com/en-au/news/tech...pc=U531&cvid=4d53fcc86fe24e50857948dfedf6763c

Microsoft set to lay off thousands of employees tomorrow

Story by Tom Warren • 8h ago

Microsoft is preparing to announce job cuts tomorrow. Sky News reports that thousands of roles will be cut, with the software giant said to be looking at cutting around 5 percent of its workforce. With more than 220,000 employees at Microsoft, that could mean more than 10,000 layoffs.
...
The cuts also come just weeks after Microsoft CEO Satya Nadella warned of two years of challenges ahead for the tech industry. In an interview with CNBC, Nadella admitted Microsoft wasn’t “immune to the global changes” and spoke of the need for tech companies to be efficient.

“The next two years are probably going to be the most challenging,” said Nadella. “We did have a lot of acceleration during the pandemic, and there’s some amount of normalization of that demand. And on top of it, there is a real recession in some parts of the world.”
And BRN is still on the hunt hiring.
 
  • Like
  • Fire
  • Love
Reactions: 30 users

wilzy123

Founding Member
  • Like
  • Fire
  • Haha
Reactions: 24 users
Last edited:
  • Like
  • Love
  • Fire
Reactions: 59 users

Diogenese

Top 20
Tough times for tech:

https://www.msn.com/en-au/news/tech...pc=U531&cvid=4d53fcc86fe24e50857948dfedf6763c

Microsoft set to lay off thousands of employees tomorrow

Story by Tom Warren • 8h ago

Microsoft is preparing to announce job cuts tomorrow. Sky News reports that thousands of roles will be cut, with the software giant said to be looking at cutting around 5 percent of its workforce. With more than 220,000 employees at Microsoft, that could mean more than 10,000 layoffs.
...
The cuts also come just weeks after Microsoft CEO Satya Nadella warned of two years of challenges ahead for the tech industry. In an interview with CNBC, Nadella admitted Microsoft wasn’t “immune to the global changes” and spoke of the need for tech companies to be efficient.

“The next two years are probably going to be the most challenging,” said Nadella. “We did have a lot of acceleration during the pandemic, and there’s some amount of normalization of that demand. And on top of it, there is a real recession in some parts of the world.”
"acceleration during the pandemic" - Looks like they are "right-sizing" after the boost that lockdowns and working from home gave to the interweb business.
 
  • Like
Reactions: 9 users
Talking of Cadence, they're also a sponsor of the upcoming DesignCon happening from Jan 31 to Feb 2:

Also just on the Cadence comments.

I noticed this the other day.

Not so much directly about Akida but interrelationships with partners.

Socionext, who were discussing their Automotive Graphics Display Controller recently, appear to have utilised Cadence's Stratus HLS in the design work.

I'd like to believe that in these interrelationships and tech crossovers, some cross-pollination of ideas sometimes gets discussed at an informal level.

In this Expert Insights video, Socionext’s Tim Papenfuss discusses how and why they used SystemC and Stratus high-level synthesis (HLS) to design their SC1701 automotive graphics display controller. An explanation of Clock Domain Crossing in Stratus HLS is also outlined.
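For anyone unfamiliar with the Clock Domain Crossing (CDC) problem the video mentions: when a signal generated in one clock domain is sampled in another, the classic fix is a two-flip-flop synchronizer. The toy model below is my own Python sketch of that idea (nothing to do with Cadence's actual tooling); real metastability can't be modelled this simply, but the two-register delay structure is the standard pattern.

```python
# Toy model of a two-flip-flop CDC synchronizer: a signal crossing into
# a new clock domain passes through two registers clocked by the
# destination domain, so downstream logic only ever sees a settled value.

class TwoFFSynchronizer:
    def __init__(self):
        self.ff1 = 0  # first stage: may go metastable in real hardware
        self.ff2 = 0  # second stage: presents a settled value

    def tick(self, async_in):
        """One destination-domain clock edge."""
        self.ff2 = self.ff1   # stage 2 captures stage 1's previous value
        self.ff1 = async_in   # stage 1 samples the asynchronous input
        return self.ff2       # output as seen by downstream logic

sync = TwoFFSynchronizer()
outs = [sync.tick(x) for x in [1, 1, 1, 0, 0]]
print(outs)  # the input waveform appears at the output delayed by a clock edge
```

The point is simply that the crossing signal is never used directly: it is re-registered in the destination domain, trading a couple of cycles of latency for reliability, which is the kind of structural detail HLS tools like Stratus have to get right automatically.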


Intel and Qualcomm also utilise Stratus and have some vids.

Screenshot_2023-01-18-14-11-36-85_4641ebc0df1485bf6b47ebd018b5ee76.jpg
 
  • Like
  • Fire
  • Thinking
Reactions: 14 users