BRN Discussion Ongoing

Boab

I wish I could paint like Vincent
Well, not sure how many have been following what has been happening in the UK, but I bet the Conservatives wish they still had Boris as leader.

The PM, Liz Truss, on regional UK radio all but refused to guarantee UK pensions, and this has had an effect on market confidence around the globe similar to a Putin tactical nuclear strike on Ukraine.

While the UK PM’s statement does not in any way reflect on the stability of the Australian economy, her failure to make a clear statement that people’s pensions were safe is an economic nightmare.

So it might be a reasonable time to look away or check down the side of the couch for some additional funds depending on your risk profile and plan.

Fortunately we are looking to buy a new lounge and my wife is not as upset as she might have been about my use of the Stanley knife.

My opinion only DYOR
FF

AKIDA BALLISTA

There are many pension funds that HAD to invest in Government Bonds and the returns have been minuscule.
Tough times ahead.
 
  • Like
  • Sad
Reactions: 4 users
  • Haha
Reactions: 2 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Hey Brain Fam,

This is from an article published 28 September 2022. The author gives a very complimentary description of the MBUX voice recognition system, saying it "remains among the best I've encountered". Another phrase that caught my attention was "everything is run locally or in Mercedes' own cloud". And I also liked the description of the optional "intelligent recuperation" feature, where the car's sensors decide when it's more efficient to regeneratively brake versus coast.

Happy days!



[Screenshots of the article attached]

 
  • Like
  • Love
  • Fire
Reactions: 36 users

HopalongPetrovski

I'm Spartacus!
  • Like
  • Love
Reactions: 5 users

wilzy123

Founding Member

Amazon Introduces the New Blink Wired Floodlight Camera and Blink Mini Pan Tilt—Offering Customers Even More Flexibility in Security Coverage and Peace of Mind​



Blink Wired Floodlight Camera uses Amazon’s AZ2 Neural Edge Processor to capture and process videos locally—and starts at just $99.99

New Blink Mini Pan Tilt mount brings additional functionality to the popular Blink Mini, giving customers the ability to pan and tilt their cameras remotely


SEATTLE, September 28, 2022--(BUSINESS WIRE)--Amazon (NASDAQ: AMZN) today introduced two new additions to the Blink family of devices—the new Blink Wired Floodlight Camera and the new Blink Mini Pan Tilt. At just $99.99, the Blink Wired Floodlight Camera includes a smart security camera and powerful LED lighting all in one, streamlined design, and Amazon’s AZ2 silicon to process videos without going to the cloud. The Blink Mini Pan Tilt is a new mount that works with Blink Mini to enable you to see a wider field of view and remotely pan and tilt to follow motion.


"The Blink Wired Floodlight Camera is our first wired floodlight device, and it adds to the existing lineup of easy-to-use, reliable, and affordable security devices that help customers keep an eye on their homes," said Mike Harris, chief operating officer at Blink. "With an all-in-one security and lighting design, and a price below $100, it offers a mix of performance and value that’s hard to beat. Plus, it leverages the intelligence of Amazon silicon, enabling us to offer features such as computer vision and local video processing for the first time."

Blink Wired Floodlight Camera—Advanced Features at an Affordable Price

The Blink Wired Floodlight Camera is designed to offer high performance for those looking for a hardwired security solution in an affordable package. Support for preferred motion detection zones means you can focus on the areas that are most important to you, and new person detection provides the ability to limit motion alert notifications to only when a person is present. Blink Wired Floodlight Camera’s enhanced motion detection features are built on the capabilities provided by Amazon’s AZ2 Neural Edge Processor, which also enables video content to be processed locally on the edge.

The Blink Wired Floodlight Camera provides 2600 lumens of LED lighting, 1080p HD live view, and crisp two-way audio. Setup is easy using an existing wired connection, and you can easily store video clips locally with a Sync Module 2 via a USB flash drive (sold separately). With a Blink Subscription Plan, you can also store video clips and photos in the cloud.

Blink Mini Pan Tilt—Adding New Functionality and Flexibility for Blink Mini

The new Blink Mini Pan Tilt adds a motorized mount to the Blink Mini to help keep an eye on even more of your home. With Mini Pan Tilt, you instantly gain the ability to remotely pan left and right, and tilt up and down, using the Blink app—getting corner-to-corner, 360-degree coverage of any room. If you already have a Blink Mini, you can easily add just the mount via a micro-USB, and you can place it on a table or countertop, or connect via a tri-pod or wall-mount (sold separately) for additional functionality.
 
  • Like
  • Fire
  • Thinking
Reactions: 25 users

wilzy123

Founding Member


Further to this, BMW have just announced that they will be using Amazon Alexa AI for their in-cabin voice processing. #interesting!

Just thinking back to the "Hey, Akida" demo in the BMW. hmmmmmmm........

 
Last edited:
  • Like
  • Thinking
  • Fire
Reactions: 25 users

Don’t forget NASA referring to using unconnected Alexa in space???

Rob Telson mentioning Alexa by name many times when speaking about AKIDA’s unconnected ability then dropping Alexa from his vocabulary???

Did Alexa become more advanced by itself or did someone or something lend a hand from the future???

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Thinking
  • Love
Reactions: 40 users

equanimous

Norse clairvoyant shapeshifter goddess
BRN holding up really well compared to US stocks
 
  • Like
  • Love
Reactions: 6 users

Learning

Learning to the Top 🕵‍♂️
“Ambient Intelligence.”


It's great to be a shareholder 🏖
 
  • Like
  • Love
Reactions: 22 users
NXP has been floating around without any firm links. What caught my attention here was the ‘real time’ processing and ‘5 nm’. Real-time processing has been a quality of AKIDA since year dot, and Peter van der Made has mentioned 5 nm.

This is not really even a dot but another company to keep a watch on for articles and patents:

“NXP offers system support for S32Z and S32E processors to accelerate a range of designs. These include the co-developed FS86 ASIL D safety system basis chip (SBC) and PF5030 power-management IC (PMIC) with in-vehicle networking support plus Ethernet switches and PHYs and CAN transceivers, along with other analog companion chips such as the GD3160 IGBT/SiC high-voltage inverter gate driver and MC3377x battery-cell controllers.

NXP’s S32Z280 and S32E288, the first two devices in the new families, are sampling now. The company plans to sample the S32Z1 series a bit further down the road. The real-time processors will also scale into the future with 5-nm products. In fact, NXP has already developed a functional, 5-nm real-time processor test chip.”


My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
Reactions: 23 users

SERA2g

Founding Member
You’re referring to me, right?

I am still heartbroken xxx
 
  • Haha
  • Love
  • Like
Reactions: 4 users

Bravo

If ARM was an arm, BRN would be its biceps💪!


It certainly does walk and talk like a duck doesn't it @Learning! 🦆



What you need to know about the Amazon AZ2 Neural Engine​


By Jerry Hildenbrand
last updated February 03, 2022


Amazon AZ2 SoC (Image credit: Amazon)

The Amazon Echo Show 15 not only hangs on your wall but can learn to recognize your face. That's because it has a new piece of Amazon-designed silicon inside dubbed the Amazon AZ2 Neural Engine.

Yes, Amazon custom designs ARM chips. The AZ2 isn't even the first one (hence the 2), but it's a lot more capable than the AZ1, which powers some of the best Alexa speakers and offers something new for Amazon — edge computing.
If you're not sure what edge computing is, this chip and what it does actually makes it easy to understand. All the processing to learn and recognize your face is done using machine learning through the chip itself and nothing needs to be sent across the internet to make that happen.

I still think any computer learning to recognize human faces is pretty creepy but doing it locally instead of through a remote server is pretty cool. Also, you have to opt-in for this feature, so you can still buy Amazon's new Echo Show 15 even if you think it's creepy like I do. But enough about creepy stuff.
Amazon Echo Show 15 Lifestyle (Image credit: Amazon)
What the AZ2 can do — on paper anyway — is pretty impressive. Consider the last-gen AZ1, which was able to recognize your voice without Amazon needing to send that data through the cloud. The new model does that, of course, but it's also capable of performing 22 times the amount of operations each second.

The AZ2 Neural Engine can work 22-times faster than Amazon's last-generation processor.
This means it has plenty of local bandwidth to learn your face as well as your voice. In fact, Amazon says it can process speech and facial recognition simultaneously. A big reason for this is that it's a neural edge processor. Those sound like the kind of words tech companies like to throw around, but they do mean something — the "neural" part means it's a chip used with algorithms for machine learning and the "edge" part means it can do it without calling for backup from some server.
By doing things locally, there is almost zero latency, which means there is virtually zero wait time between operations. We haven't seen how well it actually operates but based on its capabilities, it looks like the perfect chip to put inside something like an Echo Show.
Edge computing is not only better for privacy, but it's faster, too.
Speaking of that, the Echo Show 15 is the only device that will use the new AZ2 Neural Edge chip for now. We expect that to change as Amazon brings its Visual ID feature to other devices. Maybe even drones or robots.
Whether you love Amazon products or hate them, you can't help but be impressed with the new AZ2. It's easy to forget that Amazon is also part of Big Tech, but things like this serve to remind us that some top-level engineers work a lot of hours to build those Echo devices so many people love.
 
  • Like
  • Fire
  • Love
Reactions: 36 users

Cardpro

Regular
This is amazing. If they did that without BrainChip, I might consider selling BrainChip and crying.
 
  • Like
Reactions: 2 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
This latest generation of the Amazon Dot includes a new AZ2 processor, a temperature sensor and new tap gesture controls.

Now all the dots really are starting to join! 😝


 
  • Like
  • Love
  • Fire
Reactions: 27 users

ndefries

Regular
We either have a monstrosity of a competitor that has just appeared, or this is the first dominant sign of our incorporation into all things Amazon. Data centres and edge devices would be the dream.

Would be strange to have such a company on our podcasts if they were about to make an exact replica of our technology.

This is a rather large walking and talking duck. Blind Freddy... what's your take????
 
  • Like
  • Love
  • Thinking
Reactions: 29 users

ndefries

Regular
Could you imagine the sonic boom when it is announced that Akida is being used in all Amazon edge devices. I do think we will find out here before the market, and the popcorn will be ready watching shorters respond.
 
  • Like
  • Haha
Reactions: 20 users

Learning

Learning to the Top 🕵‍♂️
Great minds think alike @Bravo, I was reading the same article 20 minutes ago, but with hands on tools, I can't post it. Lol.
So the AZ2 is based on an ARM chip. If our more knowledgeable members can examine, cross-check, and deep dive into this, it would be Amazon (Amazing).

[Screenshot attached]

Thanks in advance.

It's great to be a shareholder 🏖
 
  • Like
  • Fire
  • Haha
Reactions: 38 users

Learning

Learning to the Top 🕵‍♂️

Amazon Introduces the New Blink Wired Floodlight Camera and Blink Mini Pan Tilt—Offering Customers Even More Flexibility in Security Coverage and Peace of Mind​


7fce4ceb0fbf3f68fdfa5a7ce6ff68cd

Blink Wired Floodlight Camera uses Amazon’s AZ2 Neural Edge Processor to capture and process videos locally—and starts at just $99.99

New Blink Mini Pan Tilt mount brings additional functionality to the popular Blink Mini, giving customers the ability to pan and tilt their cameras remotely


SEATTLE, September 28, 2022--(BUSINESS WIRE)--Amazon (NASDAQ: AMZN) today introduced two new additions to the Blink family of devices—the new Blink Wired Floodlight Camera and the new Blink Mini Pan Tilt. At just $99.99, the Blink Wired Floodlight Camera includes a smart security camera and powerful LED lighting all in one, streamlined design, and Amazon’s AZ2 silicon to process videos without going to the cloud. The Blink Mini Pan Tilt is a new mount that works with Blink Mini to enable you to see a wider field of view and remotely pan and tilt to follow motion.


"The Blink Wired Floodlight Camera is our first wired floodlight device, and it adds to the existing lineup of easy-to-use, reliable, and affordable security devices that help customers keep an eye on their homes," said Mike Harris, chief operating officer at Blink. "With an all-in-one security and lighting design, and a price below $100, it offers a mix of performance and value that’s hard to beat. Plus, it leverages the intelligence of Amazon silicon, enabling us to offer features such as computer vision and local video processing for the first time."

Blink Wired Floodlight Camera—Advanced Features at an Affordable Price

The Blink Wired Floodlight Camera is designed to offer high performance for those looking for a hardwired security solution in an affordable package. Support for preferred motion detection zones means you can focus on the areas that are most important to you, and new person detection provides the ability to limit motion alert notifications to only when a person is present. Blink Wired Floodlight Camera’s enhanced motion detection features are built on the capabilities provided by Amazon’s AZ2 Neural Edge Processor, which also enables video content to be processed locally on the edge.

The Blink Wired Floodlight Camera provides 2600 lumens of LED lighting, 1080p HD live view, and crisp two-way audio. Setup is easy using an existing wired connection, and you can easily store video clips locally with a Sync Module 2 via a USB flash drive (sold separately). With a Blink Subscription Plan, you can also store video clips and photos in the cloud.

Blink Mini Pan Tilt—Adding New Functionality and Flexibility for Blink Mini

The new Blink Mini Pan Tilt adds a motorized mount to the Blink Mini to help keep an eye on even more of your home. With Mini Pan Tilt, you instantly gain the ability to remotely pan left and right, and tilt up and down, using the Blink app—getting corner-to-corner, 360-degree coverage of any room. If you already have a Blink Mini, you can easily add just the mount via a micro-USB, and you can place it on a table or countertop, or connect via a tripod or wall-mount (sold separately) for additional functionality.
Great work @wilzy123

Thanks for sharing.

It's great to be a shareholder 🏖
 

Bravo

If ARM was an arm, BRN would be its biceps💪!
Details from Amazon's website on Amazon's Visual ID...


Visual ID on Echo Show​


Our mission is for Alexa to provide personalised help and delightful experiences, even in shared spaces like the family kitchen. That’s why we created visual ID for Echo Show 8 (2nd Gen), Echo Show 10 (3rd Gen), and the Echo Show 15. When you enroll in visual ID, you can receive personalised information from Alexa simply by being in front of your device. This means that when you’re in the device camera’s field of view, you can automatically receive personalised content at a glance, like calendars and weather, music, and Notes for You.
Visual ID is a smarter way to simplify your day. Imagine it: In the morning, quickly receive your daily schedule and a Notes for You from your partner that reads, “Have a good day at work!” And when someone else in your household, who is also enrolled in visual ID, comes close to the device, they can see information that’s relevant to them, too.
You can enjoy the convenience of visual ID with the peace of mind that your privacy is protected. It’s easy to set up—see below for details—and your visual ID is securely stored on your device, not in the cloud. If you have more than one compatible device, you—and any adult household member who wants to use visual ID—must individually enroll on each device.
If you’re familiar with Alexa’s voice ID feature, you may already have an idea of how visual ID works. Both features let you receive personalised experiences once you’ve been recognised. With voice ID, you teach Alexa your voice by repeating a series of phrases. With visual ID, you can teach Alexa to recognise you by taking a few quick photos of your face from different angles.

How to create your visual ID on an Echo device​

There are three ways to create your visual ID:
  1. Enroll when you set up your device. (You’ll be asked on-screen if you’d like to enroll.)
  2. Enroll by saying the phrase: “Alexa, learn my face.”
  3. Enroll via touchscreen on your device.
    • Open Settings
    • Open Your Profile & Family
    • Select Enable visual ID

Amazon is committed to protecting your privacy and visual ID was designed with your privacy in mind. Here are answers to some questions you might have:

Can I enroll my child in visual ID?​

Families may set up a visual ID for each child that is linked to an Amazon Kids profile. If you enroll your child in visual ID for kids, the home screen of your device will shift to only show age-appropriate content when your child is recognised. To give customers choice and control over their Alexa experience, visual ID and voice ID are not prerequisites of one another, so certain content is still accessible via voice. Enrolling kids in both features offers a robust kid-friendly experience when they walk in front of the device and talk to Alexa.

What happens to the images of my face that the device takes?​

The images of your face that are used to create your visual ID are securely stored on your device, and are not stored in Amazon’s cloud.

Is Amazon collecting and selling my facial data?​

No. Amazon isn’t in the business of selling your personal information. Your visual ID remains securely on your device. Amazon does not have access to your visual ID. You remain in control of whether to keep or delete your visual ID.

How can I temporarily disable visual ID?​

You can either press the mic/camera off button to turn off the camera and microphones or close the camera shutter. To turn visual ID back on, simply press the mic/camera button again or open the camera shutter.

How can I delete my visual ID?​

You can delete your visual ID on your device or the Alexa app.

To delete visual ID in the Alexa app:
• Open Settings
• Open Your Profile & Family
• Select Your Profile
• Select visual ID
• Select Delete visual ID
To delete visual ID using your device:
• Open Settings
• Open Profile & Family
• Select Your Profile
• Select visual ID
• Select Delete visual ID

Personalised convenience​

Visual IDs are a way to make your Alexa experience more personalised and convenient while maintaining the privacy protections that you expect from Amazon. For more information on visual ID, check out the Amazon Science blog post, “The science behind visual ID.”

 

Bravo

Computer vision

The science behind visual ID​


A new opt-in feature for Echo Show and Astro provides more-personalized content and experiences for customers who choose to enroll.​


By The Amazon visual ID teams
September 28, 2021




With every feature and device we build, we challenge ourselves to think about how we can create an immersive, personalized, and proactive experience for our customers. Often, our devices are used by multiple people in our homes, and yet there are times when you want a more personalized experience. That was the inspiration for visual ID.

On the all-new Echo Show 15, Echo Show 8, and Echo Show 10, you and other members of your household will soon be able to enroll in visual ID, so that at a glance you can see personalized content such as calendars and reminders, recently played music, and notes for you.
And with Astro, a new kind of household robot, enrolling in visual ID enables Astro to do things like find you to deliver something, such as a reminder or an item in Astro’s cargo bin.

Creating your visual ID​


Visual ID is opt-in, so you must first enroll in the feature, much as you can enroll in voice ID (formerly Alexa voice profile) today. During enrollment, you will use the camera on your supported Echo Show device or Astro to take a series of headshots at different angles. For visual ID to accurately recognize you, we require five different angles of your face.
During the enrollment process, the device runs algorithms to ensure that each of the images is of high enough quality. For example, if the room is too dark, you will see on-screen instructions to adjust the lighting and try again. You will also see on-screen notifications as an image of each pose is successfully captured.
The images are used to create numeric representations of your facial characteristics. Called vectors (one for each angle of your face), these numeric representations are just that: a string of numbers. The images are also used to revise the vectors in the event of periodic updates to the visual ID model — meaning customers are not required to re-enroll in visual ID every time there is a model update. These images and vectors are securely stored on-device, not in Amazon’s cloud.
Up to 10 members of a household per account can enroll on each compatible Echo Show or Astro to enjoy more-personalized experiences for themselves. Customers with more than one visual-ID-compatible device will need to enroll on each device individually.
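The enrollment flow described above — five per-angle images converted into numeric vectors kept on-device — might look something like this in outline. The embedding function here is a hypothetical placeholder for the on-device network, not Amazon's actual model:

```python
import numpy as np

REQUIRED_ANGLES = ["front", "left", "right", "up", "down"]  # five poses, per the article

def embed_face(image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for the on-device CNN that maps a face
    crop to a fixed-length feature vector."""
    rng = np.random.default_rng(int(image.sum()) % (2**32))
    v = rng.standard_normal(128)
    return v / np.linalg.norm(v)  # unit-normalize for cosine comparison

def enroll(images_by_angle: dict) -> dict:
    """Create a visual ID: one vector per required angle, stored on-device only."""
    missing = [a for a in REQUIRED_ANGLES if a not in images_by_angle]
    if missing:
        raise ValueError(f"need all five angles, missing: {missing}")
    return {angle: embed_face(img) for angle, img in images_by_angle.items()}
```

Keeping the original images alongside the vectors, as the article notes, is what lets the device regenerate the vectors after a model update without asking the customer to re-enroll.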
A screenshot of the enrollment process, during which the device’s camera takes a series of headshots at different angles.

Identifying an enrolled individual​


Once you’ve enrolled in visual ID, your device attempts to match people who walk into the camera’s field of view with the visual IDs of enrolled household members. There are two steps to this process, facial detection and facial recognition, and both are done through local processing using machine learning models called convolutional neural networks.
To recognize a person, the device first uses a convolutional neural network to detect when a face appears in the camera’s field of view. If a person whom the device does not recognize as enrolled in visual ID walks into the camera’s field of view, the device will determine that there are no matches to the stored vectors. The device does not retain images or vectors from unenrolled individuals after processing. All of this happens in fractions of a second and is done securely on-device.
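As a rough illustration of the recognition step — comparing a detected face's vector against each enrolled member's stored per-angle vectors — a cosine-similarity check with a rejection threshold is the standard approach. All names and the threshold value below are illustrative, not Amazon's:

```python
import numpy as np

MATCH_THRESHOLD = 0.75  # illustrative; real systems tune this empirically

def identify(face_vector, enrolled):
    """Return the best-matching enrolled member, or None.

    enrolled: {name: [stored vectors, one per enrolled angle]}
    A face scoring below the threshold is treated as unenrolled, and
    its vector is simply discarded after this call (nothing retained).
    """
    best_name, best_score = None, MATCH_THRESHOLD
    v = face_vector / np.linalg.norm(face_vector)
    for name, vectors in enrolled.items():
        for u in vectors:
            score = float(np.dot(v, u / np.linalg.norm(u)))
            if score > best_score:
                best_name, best_score = name, score
    return best_name
```

Returning `None` for anything under the threshold is what makes an unrecognized visitor a non-event: there is no stored record to match, so no match is produced.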

When your supported Echo Show device recognizes you, your avatar and a personalized greeting will appear in the upper right of the screen.
An example of what Echo Show 15 might show on its screen once an enrolled individual is recognized.
What shows on Astro’s screen will depend on what Astro is doing. For example, if you’ve enrolled in visual ID, and Astro is trying to find you, Astro will display text on its screen — “Looking for [Bob]”, followed by “Found [Bob]” — to acknowledge that it’s recognized you.

Enhancing fairness​


We set a high bar for equity when it came to designing visual ID. To clear that bar, our scientists and engineers built and refined our visual ID models using millions of images — collected in studies with participants’ consent — explicitly representing a diversity of gender, ethnicity, skin tone, age, ability, and other factors. We then set performance targets to ensure the visual ID feature performed well across groups.
In addition to consulting with several Amazon Scholars who specialize in computer vision, we also consulted with an external expert in algorithmic bias, Ayanna Howard, dean of the Ohio State University College of Engineering, to review the steps we took to enhance the fairness of the feature. We’ve implemented feedback from our Scholars and Dr. Howard, and we will solicit and listen to customer feedback and make improvements to ensure the feature continues to improve on behalf of our customers.

Privacy by design​


As with all of our products and services, privacy was foundational to how we built and designed visual ID. As mentioned above, the visual IDs of enrolled household members are securely stored on-device, and both Astro and Echo Show devices use local processing to recognize enrolled customers. You can delete your visual ID from individual devices on which you’ve enrolled through on-device settings and, for Echo Show, through the Alexa app. This will delete the stored enrollment images and associated vectors from your device. We will also automatically delete your visual ID from individual devices if your face is not recognized by that device for 18 months.
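The automatic-deletion policy in the paragraph above reduces to a timestamp check against each ID's last recognition on that device. A minimal sketch, with hypothetical field names and 18 months approximated in days:

```python
from datetime import datetime, timedelta

EXPIRY = timedelta(days=18 * 30)  # ~18 months, per the stated policy

def purge_stale_ids(profiles: dict, now: datetime) -> dict:
    """Drop visual IDs not recognized by this device within ~18 months.

    profiles: {name: datetime of last recognition on this device}
    Returns the profiles that survive the retention check.
    """
    return {name: seen for name, seen in profiles.items()
            if now - seen <= EXPIRY}
```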
It’s still day one for visual ID, Echo Show, and Astro. We look forward to hearing how our customers use visual ID to personalize their experiences with our devices.


 