BRN Discussion Ongoing

Damo4

Regular
 

Esq.111

Fascinatingly Intuitive.
Morning Damo 4 ,

Must have missed this one.... 20 / 5 / 24.???

As an engineer years ago, one of the first things I was taught... measure twice, cut once.

Esq.
 

Damo4

Regular
Morning Damo 4 ,

Must have missed this one.... 20 / 5 / 24.???

As an engineer years ago, one of the first things I was taught... measure twice, cut once.

Esq.
Updated now with the corrected tweet
 

Evermont

Stealth Mode


Dougie54

Regular
I had already noticed in last month’s job ad for a new HR manager (replacing Sheila Sabanal-Lau) that our company recently implemented cost-saving measures by limiting the legendary free lunch to 2-3 times a week! 😂

Apparently now on those 3 days it's all you can eat!
Another nose for the feed bag
 
 
When you put Neuromorphic into the search bar of the Accenture website there are four videos on the topic; here is one:


 
Just a guess, but maybe we'll see a few lines swiped this afternoon.


jtardif999

Regular
Good evening... here we go again, yet another patent, published only 7 days ago, from our mates at:

ACCENTURE GLOBAL SOLUTIONS LTD

Check it out: a research chip is mentioned, but a commercially available chip or IP is available NOW... that's why continuous learning of gesture recognition and AKIDA IS MENTIONED YET AGAIN.



Patents are very powerful tools; researching them is a must!

Tech ;)
Thanks again @TECH - from the description and claims this can only be Akida. Accenture must have some pretty big customer(s) for all these patents.

I particularly like the following description: “if a user corrects the gesture recognition model by, for example, vocally giving a description of the gesture that the model misrecognized through a microphone of the device, a conversational agent of the human machine interface can trigger a learning process of the neuromorphic processor and the neuromorphic processor can update the gesture recognition model based on the correction in real-time, e.g., within seconds, of the correction being detected. This real-time updating reduces the number of mistakes made by a gesture recognition component of the device in the future and results in faster and more accurate learning of the gesture recognition model, which in turn reduces user frustration and reduces the number of errors caused by misrecognizing user gestures that cause other components to perform actions.”
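To make that patent wording concrete, here is a minimal sketch of the correction-triggered, on-device learning loop it describes. It is only an illustration: the GestureModel class, the nearest-prototype logic and the on_user_correction hook are my own hypothetical stand-ins, not BrainChip's or Accenture's actual APIs.

# Illustrative sketch of the correction-triggered learning loop described in the
# patent text above. All class and method names are hypothetical stand-ins; they
# are NOT BrainChip's MetaTF API or Accenture's implementation.

import time


class GestureModel:
    """Toy stand-in for an on-device gesture recognition model."""

    def __init__(self):
        # label -> list of feature vectors seen so far (a crude prototype store)
        self.prototypes = {"swipe_left": [[0.9, 0.1]], "swipe_right": [[0.1, 0.9]]}

    def predict(self, features):
        # nearest-prototype classification (stand-in for SNN inference)
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))

        return min(
            self.prototypes,
            key=lambda label: min(dist(features, p) for p in self.prototypes[label]),
        )

    def learn(self, features, correct_label):
        # one-shot update: the corrected example is stored as a new prototype
        self.prototypes.setdefault(correct_label, []).append(features)


def on_user_correction(model, features, spoken_label):
    """Conversational-agent hook: a voice correction triggers on-device learning."""
    start = time.time()
    model.learn(features, spoken_label)  # update happens locally, no cloud round-trip
    print(f"model updated in {time.time() - start:.4f}s")


if __name__ == "__main__":
    model = GestureModel()
    sample = [0.85, 0.2]  # features of a gesture the model gets wrong
    print("before correction:", model.predict(sample))
    on_user_correction(model, sample, "swipe_right")  # user says "that was a swipe right"
    print("after correction:", model.predict(sample))

The only point of the sketch is that the corrected example is absorbed locally and near-instantly, with no retraining round-trip to the cloud, which is the behaviour the patent text is claiming.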
 

Kachoo

Regular
Thanks again @TECH - from the description and claims this can only be Akida. Accenture must have some pretty big customer(s) for all these patents.

I particularly like the following description: “if a user corrects the gesture recognition model by, for example, vocally giving a description of the gesture that the model misrecognized through a microphone of the device, a conversational agent of the human machine interface can trigger a learning process of the neuromorphic processor and the neuromorphic processor can update the gesture recognition model based on the correction in real-time, e.g., within seconds, of the correction being detected. This real-time updating reduces the number of mistakes made by a gesture recognition component of the device in the future and results in faster and more accurate learning of the gesture recognition model, which in turn reduces user frustration and reduces the number of errors caused by misrecognizing user gestures that cause other components to perform actions.”
The key takeaway from this is that SNN chips are catching everyone's attention. Obviously Intel is in the game and will promote their brand name. I see Loihi 2 moving to a higher-power system. I do not think that, at the edge, Intel can compete on price for cheap, low-power devices.

There is a reason we joined the Intel partner system, and what they plan to use from us only they know. In the end it's all about collaboration and growing revenue.
 

jtardif999

Regular
When you put Neuromorphic into the search bar of the Accenture website there are four videos on the topic; here is one:



Does the website give a date for this video? I suspect it is not current. I remember that Mercedes trialled the use of Loihi before settling on Akida for the Vision EQXX prototype way back when. Even if it is current, Loihi and its variants can't compete with Akida in commercial edge devices on both SWaP and price.
 
Does the website give a date for this video? I suspect it is not current. I remember that Mercedes trialled the use of Loihi before settling on Akida for the Vision EQXX prototype way back when. Even if it is current, Loihi and its variants can't compete with Akida in commercial edge devices on both SWaP and price.
It was back in 2021
 

jtardif999

Regular
The key takeaway from this is that SNN chips are catching everyone's attention. Obviously Intel is in the game and will promote their brand name. I see Loihi 2 moving to a higher-power system. I do not think that, at the edge, Intel can compete on price for cheap, low-power devices.

There is a reason we joined the Intel partner system, and what they plan to use from us only they know. In the end it's all about collaboration and growing revenue.
Loihi and variants can’t learn on the fly like Akida can; the Accenture patents can only be describing Akida capabilities imo. Plus we know Loihi and variants are not commercial chips. The patents are describing commercial applications.
 

7für7

Regular
Loihi and variants can’t learn on the fly like Akida can; the Accenture patents can only be describing Akida capabilities imo. Plus we know Loihi and variants are not commercial chips. The patents are describing commercial applications.

It's an age-old tactic: "If you can't beat them, join them." I bet Intel is absorbing all the knowledge from everyone, and then they'll come up with a product that incorporates something from all their partners, only to drop them once they have something the others don't: the best solution from each competitor in the AI world. BrainChip, be warned! Yes, I've now said what I'm worried about, and I hope I'm wrong.
 

Calsco

Regular
Is it too much to ask for a Friday afternoon trading halt and major announcement haha
 

7für7

Regular
Is it too much to ask for a Friday afternoon trading halt and major announcement haha
Actually, usually just before the day ends the cockroaches suddenly appear and we close red. Just watch!
 
A write-up on Alf's HPM attendance.

I like his bold statement at the end, though we do know what happens when management has made bold statements previously that they haven't quite met... yet.

The other thing: what work is the ANT61 Brain doing with Akida that is a secret? :unsure:



Extending the IoT to Mars

  • May 30, 2024
  • Steve Rogerson
  • Eseye

How far can the IoT go? Further than you think, as Steve Rogerson discovered at this week’s Hardware Pioneers Max show in London.


Alf Kuchenbuch from Brainchip.

This was explained by Alf Kuchenbuch, a vice president at Australian technology company Brainchip (brainchip.com), who told HPM delegates how excited he was that his company’s chips were now doing real edge processing in space.

“Nasa and the ESA are picking up on AI,” he said. “They want to see AI in space. They are nervous, but they are acting with urgency.”

Earlier this month, he attended a workshop in the Netherlands organised by the ESA where he said the general view was that everything that happened on Earth would happen in space in five years’ time.

“Some find that shocking, but it is an inevitable truth,” he said. “Nasa is picking up on this too.”

But he said even satellites in low Earth orbit sometimes hit latency problems. There are also bandwidth difficulties. Satellites sending constant images of the Earth’s surface use a lot of bandwidth, but many of those images are useless because of cloud cover. Applying AI to the images on the satellite can pick out those that show more than just the tops of clouds, and sometimes stitch images together, drastically reducing the amount of data they need to send. And if they are being used, say, to track ships, they don’t need to keep sending pictures of the ship, but just its coordinates.
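As a rough illustration of the bandwidth saving being described here, a minimal sketch of that on-board filtering decision follows. The is_mostly_cloud and detect_ships functions are hypothetical placeholders for whatever models actually run on the satellite, not anything confirmed by BrainChip.

# Sketch of the on-board filtering idea described above: only useful images (or just
# ship coordinates) get downlinked. is_mostly_cloud() and detect_ships() are
# hypothetical placeholders for the real on-satellite models.

def is_mostly_cloud(image, threshold=0.8):
    """Toy classifier: if the fraction of 'cloudy' pixels exceeds the threshold, discard."""
    cloudy = sum(1 for px in image if px > 0.9)  # toy rule: very bright pixel ~ cloud top
    return cloudy / len(image) > threshold


def detect_ships(image):
    """Toy detector returning (x, y) coordinates of ships found in the image."""
    return [(i, 0) for i, px in enumerate(image) if 0.4 < px < 0.5]


def select_downlink(images, tracking_ships=False):
    """Decide what actually gets sent to the ground station."""
    packets = []
    for img in images:
        if is_mostly_cloud(img):
            continue  # drop useless cloud-top images on board, saving bandwidth
        if tracking_ships:
            packets.append({"coords": detect_ships(img)})  # a few bytes, not megapixels
        else:
            packets.append({"image": img})
    return packets


if __name__ == "__main__":
    frames = [
        [0.95] * 100,                                        # fully clouded frame, discarded
        [0.45 if i == 7 else 0.1 for i in range(100)],       # clear frame with one "ship"
    ]
    print(select_downlink(frames, tracking_ships=True))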

Taking a leaf from autonomous vehicles on Earth, similar technology can be used for performing docking manoeuvres in space and, as mentioned, controlling ground vehicles on the Moon or Mars. Another application is debris removal. There is a lot of junk circling the Earth and there are plans to remove it by slowing it down so it falls towards Earth and burns up.

“These are why AI in space is so necessary,” said Alf.

Brainchip is using neuromorphic AI on its chips, which Alf said had a big advantage in that it worked in a similar way to a brain, only processing information when an event happened, lowering the power requirements. The firm’s Akida chip is on SpaceX’s Transporter 10 mission, launched in March.
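For what it's worth, here is a toy sketch of what "only processing information when an event happens" means in practice, compared with processing every frame. The threshold and sensor values are made up purely for illustration and have nothing to do with Akida's real internals.

# Toy illustration of event-driven processing: work is only done when the input
# changes enough to count as an "event", instead of on every frame.

EVENT_THRESHOLD = 0.05


def event_driven(readings):
    """Process a reading only when it differs enough from the last processed one."""
    processed = 0
    last = None
    for value in readings:
        if last is None or abs(value - last) > EVENT_THRESHOLD:
            processed += 1  # this is where real inference would run
            last = value
    return processed


def frame_driven(readings):
    """Conventional approach: every reading is processed."""
    return len(readings)


if __name__ == "__main__":
    sensor = [0.10, 0.10, 0.11, 0.50, 0.50, 0.50, 0.12, 0.12]  # mostly static signal
    print("frame-driven operations:", frame_driven(sensor))    # 8
    print("event-driven operations:", event_driven(sensor))    # 3, far fewer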

“We are waiting for them to turn it on and for it to start doing its work,” he said. He wouldn’t say what that work was, just that: “It is secret.”

Brainchip is also working with Frontgrade Gaisler (www.gaisler.com), a provider of space-grade systems-on-chip, to explore integrating Akida into fault-tolerant, radiation-hardened microprocessors to make space-grade SoCs incorporating AI.

“If this works out, our chip will be going on the Moon landing, or even to Mars,” he said. “Akida is not a dream. It is here today, and it is up there today.”

I was going to end with some joke about the IoT boldly going to the final frontier, but felt the force wouldn’t really be with me, so I didn’t make it so.
 

Leevon

Emerged
53 payloads went into space on this March Transporter 10 mission, including a mix of new and returning SpaceX customers... I wonder which of these companies we are waiting on to "turn it on and start doing the work"? https://spacenews.com/spacex-launches-tenth-transporter-rideshare-mission/
 