BRN Discussion Ongoing

Just a guess, but we might see a few lines swiped this afternoon.

 
  • Like
  • Haha
  • Fire
Reactions: 6 users

jtardif999

Regular
Good evening...here we go again, yet another patent only published 7 days ago from our mates at:

ACCENTURE GLOBAL SOLUTIONS LTD

Check it out: a research chip is mentioned, but a commercially available chip or IP is available NOW... that's why continuous learning of gesture recognition and AKIDA ARE MENTIONED YET AGAIN.



Patents are very powerful tools, researching them is a must !

Tech ;)
Thanks again @TECH - from the description and claims this can only be Akida. Accenture must have some pretty big customer(s) for all these patents.

I particularly like the following description: “if a user corrects the gesture recognition model by, for example, vocally giving a description of the gesture that the model misrecognized through a microphone of the device, a conversational agent of the human machine interface can trigger a learning process of the neuromorphic processor and the neuromorphic processor can update the gesture recognition model based on the correction in real-time, e.g., within seconds, of the correction being detected. This real-time updating reduces the number of mistakes made by a gesture recognition component of the device in the future and results in faster and more accurate learning of the gesture recognition model, which in turn reduces user frustration and reduces the number of errors caused by misrecognizing user gestures that cause other components to perform actions.”
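Purely as an illustration of the flow that passage describes (the class and method names below are made up by me, not taken from the patent or from BrainChip's Akida/MetaTF SDK), the correction-triggered, on-device learning loop might look something like this:

```python
# Illustrative sketch only: the classes and method names here are hypothetical,
# not taken from the Accenture patent or from BrainChip's Akida/MetaTF SDK.

class GestureRecognizer:
    """Stand-in for an on-device gesture model that supports incremental learning."""

    def __init__(self):
        self.known_gestures = {"swipe_left", "swipe_right"}

    def predict(self, sensor_frame):
        # Placeholder inference; a real model would classify the sensor data.
        return "swipe_left"

    def learn(self, sensor_frame, label):
        # On-device update triggered immediately after a correction, so the
        # model improves "within seconds" rather than after a cloud retrain.
        self.known_gestures.add(label)


def handle_gesture(model, sensor_frame, get_voice_correction):
    """Run inference, then let a spoken correction trigger learning right away."""
    prediction = model.predict(sensor_frame)
    correction = get_voice_correction()  # e.g. the user says "no, that was swipe right"
    if correction and correction != prediction:
        # The conversational agent maps the utterance to a label and kicks off
        # the learning pass on the neuromorphic processor.
        model.learn(sensor_frame, correction)
    return prediction


if __name__ == "__main__":
    model = GestureRecognizer()
    handle_gesture(model, sensor_frame=[0.1, 0.7],
                   get_voice_correction=lambda: "swipe_right")
    print(model.known_gestures)  # now includes the corrected gesture
```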
 
  • Like
  • Fire
  • Love
Reactions: 36 users

Kachoo

Regular
Thanks again @TECH - from the description and claims this can only be Akida. Accenture must have some pretty big customer(s) for all these patents.

I particularly like the following description: “if a user corrects the gesture recognition model by, for example, vocally giving a description of the gesture that the model misrecognized through a microphone of the device, a conversational agent of the human machine interface can trigger a learning process of the neuromorphic processor and the neuromorphic processor can update the gesture recognition model based on the correction in real-time, e.g., within seconds, of the correction being detected. This real-time updating reduces the number of mistakes made by a gesture recognition component of the device in the future and results in faster and more accurate learning of the gesture recognition model, which in turn reduces user frustration and reduces the number of errors caused by misrecognizing user gestures that cause other components to perform actions.”
The key takeaway from this is that SNN chips are catching everyone's attention. Obviously Intel is in the game and will display their brand name. I see Loihi 2 moving to a higher-power system, and I don't think Intel can compete on price at the edge for cheap, low-power devices.

There is a reason we joined the Intel partner program, and what they plan to use from us only they know. In the end it's all about collaboration and growing revenue.
 
  • Like
  • Love
Reactions: 18 users

jtardif999

Regular
When putting "neuromorphic" into the search bar of the Accenture website there are four videos on the topic; here is one



Does the website give a date for this video? I suspect it is not current. I remember that Mercedes trialled the use of Loihi before settling on Akida for the Vision EQXX prototype way back when. Even if it is current, Loihi and its variants can't compete with Akida in commercial edge devices for both SWaP and price.
 
  • Like
Reactions: 6 users
Does the website give a date for this video? I suspect it is not current. I remember that Mercedes trialled the use of Loihi before settling on Akida for the Vision EQXX prototype way back when. Even if it is current, Loihi and its variants can't compete with Akida in commercial edge devices for both SWaP and price.
It was back in 2021
 
  • Like
Reactions: 3 users

jtardif999

Regular
The key takeaway from this is that SNN chips are catching everyone's attention. Obviously Intel is in the game and will display their brand name. I see Loihi 2 moving to a higher-power system, and I don't think Intel can compete on price at the edge for cheap, low-power devices.

There is a reason we joined the Intel partner program, and what they plan to use from us only they know. In the end it's all about collaboration and growing revenue.
Loihi and variants can’t learn on the fly like Akida can; the Accenture patents can only be describing Akida capabilities imo. Plus we know Loihi and variants are not commercial chips. The patents are describing commercial applications.
 
  • Like
  • Fire
Reactions: 18 users

7für7

Top 20
Loihi and variants can’t learn on the fly like Akida can; the Accenture patents can only be describing Akida capabilities imo. Plus we know Loihi and variants are not commercial chips. The patents are describing commercial applications.

It's an age-old tactic: "If you can't beat them, join them." I bet Intel is absorbing all the knowledge from everyone, and then they'll come up with a product that incorporates something from all their partners, only to drop them once they have something the others don't: the best solution from each competitor in the AI world. BrainChip, be warned! Yes, I've now said what I'm worried about, and I hope I'm wrong.
 
  • Like
Reactions: 2 users

Calsco

Regular
Is it too much to ask for a Friday afternoon trading halt and major announcement haha
 
  • Like
  • Haha
Reactions: 6 users

7für7

Top 20
Is it too much to ask for a Friday afternoon trading halt and major announcement haha
Actually, usually just before the day ends the cockroaches suddenly appear and we close red. Just watch!
 
  • Like
Reactions: 2 users
A write-up on Alf's HPM attendance.

I like his bold statement at the end, though we do know what happens when management make bold statements they haven't quite met... yet.

The other thing: what work is the ANT61 Brain doing with Akida that is a secret? :unsure:



Extending the IoT to Mars

  • May 30, 2024
  • Steve Rogerson
  • Eseye

How far can the IoT go? Further than you think, as Steve Rogerson discovered at this week’s Hardware Pioneers Max show in London.


Alf Kuchenbuch from Brainchip.

This was explained by Alf Kuchenbuch, a vice president at Australian technology company Brainchip (brainchip.com), who told HPM delegates how excited he was that his company’s chips were now doing real edge processing in space.

“Nasa and the ESA are picking up on AI,” he said. “They want to see AI in space. They are nervous, but they are acting with urgency.”

Earlier this month, he attended a workshop in the Netherlands organised by the ESA where he said the general view was that everything that happened on Earth would happen in space in five years’ time.

“Some find that shocking, but it is an inevitable truth,” he said. “Nasa is picking up on this too.”

But he said even satellites in low Earth orbit sometimes hit latency problems. There are also bandwidth difficulties. Satellites sending constant images of the Earth’s surface use a lot of bandwidth, but many of those images are useless because of cloud cover. Applying AI to the images on the satellite can pick those that show not just the top of clouds, and sometimes they can stitch images together, reducing drastically the amount of data they need to send. And if they are being used, say, to track ships, they don’t need to keep sending pictures of the ship, but just its coordinates.

Taking a leaf from autonomous vehicles on Earth, similar technology can be used for performing docking manoeuvres in space and, as mentioned, controlling ground vehicles on the Moon or Mars. Another application is debris removal. There is a lot of junk circling the Earth and there are plans to remove it by slowing it down so it falls towards Earth and burns up.

“These are why AI in space is so necessary,” said Alf.

Brainchip is using neuromorphic AI on its chips, which Alf said had a big advantage in that it worked in a similar way to a brain, only processing information when an event happened, lowering the power requirements. The firm's Akida chip is on SpaceX's Transporter 10 mission, launched in March.

“We are waiting for them to turn it on and for it to start doing its work,” he said. He wouldn’t say what that work was, just that: “It is secret.”

Brainchip is also working with Frontgrade Gaisler (www.gaisler.com), a provider of space-grade systems-on-chip, to explore integrating Akida into fault-tolerant, radiation-hardened microprocessors to make space-grade SoCs incorporating AI.

“If this works out, our chip will be going on the Moon landing, or even to Mars,” he said. “Akida is not a dream. It is here today, and it is up there today.”

I was going to end with some joke about the IoT boldly going to the final frontier, but felt the force wouldn’t really be with me, so I didn’t make it so.
 
  • Like
  • Love
  • Fire
Reactions: 65 users

Leevon

Member
A write-up on Alf's HPM attendance.

I like his bold statement at the end, though we do know what happens when management make bold statements they haven't quite met... yet.

The other thing: what work is the ANT61 Brain doing with Akida that is a secret? :unsure:



Extending the IoT to Mars

  • May 30, 2024
  • Steve Rogerson
  • Eseye

How far can the IoT go? Further than you think, as Steve Rogerson discovered at this week’s Hardware Pioneers Max show in London.


Alf Kuchenbuch from Brainchip.

This was explained by Alf Kuchenbuch, a vice president at Australian technology company Brainchip (brainchip.com), who told HPM delegates how excited he was that his company’s chips were now doing real edge processing in space.

“Nasa and the ESA are picking up on AI,” he said. “They want to see AI in space. They are nervous, but they are acting with urgency.”

Earlier this month, he attended a workshop in the Netherlands organised by the ESA where he said the general view was that everything that happened on Earth would happen in space in five years’ time.

“Some find that shocking, but it is an inevitable truth,” he said. “Nasa is picking up on this too.”

But he said even satellites in low Earth orbit sometimes hit latency problems. There are also bandwidth difficulties. Satellites sending constant images of the Earth’s surface use a lot of bandwidth, but many of those images are useless because of cloud cover. Applying AI to the images on the satellite can pick those that show not just the top of clouds, and sometimes they can stitch images together, reducing drastically the amount of data they need to send. And if they are being used, say, to track ships, they don’t need to keep sending pictures of the ship, but just its coordinates.

Taking a leaf from autonomous vehicles on Earth, similar technology can be used for performing docking manoeuvres in space and, as mentioned, controlling ground vehicles on the Moon or Mars. Another application is debris removal. There is a lot of junk circling the Earth and there are plans to remove it by slowing it down so it falls towards Earth and burns up.

“These are why AI in space is so necessary,” said Alf.

Brainchip is using neuromorphic AI on its chips, which Alf said had a big advantage in that it worked in a similar way to a brain, only processing information when an event happened, lowering the power requirements. The firm's Akida chip is on SpaceX's Transporter 10 mission, launched in March.

“We are waiting for them to turn it on and for it to start doing its work,” he said. He wouldn’t say what that work was, just that: “It is secret.”

Brainchip is also working with Frontgrade Gaisler (www.gaisler.com), a provider of space-grade systems-on-chip, to explore integrating Akida into fault-tolerant, radiation-hardened microprocessors to make space-grade SoCs incorporating AI.

“If this works out, our chip will be going on the Moon landing, or even to Mars,” he said. “Akida is not a dream. It is here today, and it is up there today.”

I was going to end with some joke about the IoT boldly going to the final frontier, but felt the force wouldn’t really be with me, so I didn’t make it so.
53 payloads went into space on this March Transporter 10 mission, including a mix of new and returning SpaceX customers... I wonder which of these companies we are waiting on to "turn it on and start doing the work"? https://spacenews.com/spacex-launches-tenth-transporter-rideshare-mission/
 
  • Like
Reactions: 9 users
What does the Renesas and MegaChips deal really mean for BRN, when a 5-yr deal can be an instant game changer?
 
53 payloads went into space on this March Transporter 10 mission, including a mix of new and returning SpaceX customers... I wonder which of these companies we are waiting on to "turn it on and start doing the work"? https://spacenews.com/spacex-launches-tenth-transporter-rideshare-mission/
Suspect it should be within this one.

ANT61 launched with the SMC Optimus, and we are with them in their "Brain", yet to be switched on.



Space Machines Company: SMC launched its first Optimus OTV, which carried a suite of customer payloads to space. The 270-kg spacecraft can maneuver around to deposit its riders into specific orbits.

ANT61
29/02/2024

Today, our space hardware product, ANT61 Brain, has been successfully deployed by SpaceX on board the Space Machines Company Optimus-1 spacecraft.
The Brain is the world's first space-grade neuromorphic computer.
In the past, we used GPUs and TPUs for machine learning, but the neuromorphic technology pioneered by BrainChip is up to 5 times more energy-efficient; that's why we chose it for our computer that will analyse footage from engineering cameras, detecting anomalies in spacecraft operation.
The Brain will only be turned on later in the Optimus-1 mission, so stay tuned for future updates on the operation.

We want to take this opportunity to thank people who have been instrumental in making this mission and the ANT61 Brain product possible.
In line with our tradition, your names are written on the flight model that is now in orbit!
 
  • Like
  • Fire
  • Love
Reactions: 30 users

Iseki

Regular
53 payloads went into space on this March Transporter 10 mission, including a mix of new and returning SpaceX customers... I wonder which of these companies we are waiting on to "turn it on and start doing the work"? https://spacenews.com/spacex-launches-tenth-transporter-rideshare-mission/
That will be Space Machines, an Australian startup, with its Optimus spacecraft, which it described as the largest commercial spacecraft built to date in Australia. The spacecraft is designed to demonstrate on-orbit maneuvering and future servicing technologies. The Optimus uses the ANT61 Brain™ computer, which serves as the primary intelligent control for a series of repair and maintenance robots that will be used to remotely repair damaged space vehicles. The ANT61 Brain has an Akida 1000 inside.
 
  • Like
  • Love
  • Fire
Reactions: 19 users

I’m not a techie, but for me there’s something cool in this comparative article

Industry titan AWS on the one hand (cloud-based, data centres, GPUs et al.), versus our chipper holding up the other side of the balance scales: low-power, high-efficiency SNN for processing at the edge……

Love it……

Let's hope that in the near future there's a new familiar name in the realm of titans……. Brainchip!
 
  • Like
  • Fire
Reactions: 23 users
Recent EE Times podcast with SynSense, but there was some conversation/discussion afterwards with Dr. Giulia D'Angelo from the Fortiss research institute in Munich and Professor Ralph Etienne-Cummings of Johns Hopkins University, which I was more interested in.

Couple of excerpts from the tail end and I do hope they manage to organise it for next season.


So I think folks have been thinking about that. And, just to kind of loop this all the way around, Tony recently I believe started with BrainChip—he’s the CTO at BrainChip. And, in fact, I think they announced a new product that they’re selling that is some edge machine-learning device that will, I imagine, take the kind of sensing data and then process it as well. But I don’t think it’s embedded. What I mean by embedded is I don’t think there’s any cameras associated with the chip itself, right? It’s more kind of a distributed system.

.......

SB: I’m hoping to get an interview with Prophesee for the next season when we’ll also be talking to a lot more companies including, hopefully, BrainChip, Rain Neuromorphics, and a few others. Sorry, Giulia.

.......

But that’s the way technology goes. There has to be diversity before you have consolidation. And consolidation is interesting, because we’ve already had consolidation between iniVation and SynSense—which is not surprising, because they have the sort of same founders and are in the same city. I’m wondering how much of that is going to happen, too, within the coming years amongst the others. Innatera and, you know, we’ve got Prophesee doing these deals with Qualcomm, we’ve got BrainChip.
 
  • Like
  • Fire
  • Thinking
Reactions: 31 users

Sirod69

bavarian girl ;-)
Low-Power Image Classification With the BrainChip Akida Edge AI Enablement Platform

 
  • Like
  • Love
  • Fire
Reactions: 23 users

Diogenese

Top 20
Recent EE Times podcast with SynSense, but there was some conversation/discussion afterwards with Dr. Giulia D'Angelo from the Fortiss research institute in Munich and Professor Ralph Etienne-Cummings of Johns Hopkins University, which I was more interested in.

Couple of excerpts from the tail end and I do hope they manage to organise it for next season.


So I think folks have been thinking about that. And, just to kind of loop this all the way around, Tony recently I believe started with BrainChip—he’s the CTO at BrainChip. And, in fact, I think they announced a new product that they’re selling that is some edge machine-learning device that will, I imagine, take the kind of sensing data and then process it as well. But I don’t think it’s embedded. What I mean by embedded is I don’t think there’s any cameras associated with the chip itself, right? It’s more kind of a distributed system.

.......

SB: I’m hoping to get an interview with Prophesee for the next season when we’ll also be talking to a lot more companies including, hopefully, BrainChip, Rain Neuromorphics, and a few others. Sorry, Giulia.

.......

But that’s the way technology goes. There has to be diversity before you have consolidation. And consolidation is interesting, because we’ve already had consolidation between iniVation and SynSense—which is not surprising, because they have the sort of same founders and are in the same city. I’m wondering how much of that is going to happen, too, within the coming years amongst the others. Innatera and, you know, we’ve got Prophesee doing these deals with Qualcomm, we’ve got BrainChip.
For pity sake! Let's hope @Bravo doesn't see this!
 
  • Haha
  • Like
Reactions: 9 users

Sirod69

bavarian girl ;-)
The BrainChip team wrapped up a successful presence at the Hardware Pioneers - Max24 event! Partnering with Edge Impulse, our booth saw a surge of interest in our technology, both from current and new enthusiasts. Alf Kuchenbuch highlighted the clear appeal of neuromorphic AI, emphasizing its orders of magnitude lower power demand. Stay tuned as BrainChip continues to expand into new frontiers, including space and beyond!


BrainChip and neuromorphic AI were celebrated at Hardware Pioneers - Max24!
Partnering with @EdgeImpulse, our booth was buzzing with attention.

 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 42 users