BRN Discussion Ongoing

Diogenese

Top 20
  • Haha
  • Like
  • Thinking
Reactions: 7 users

Zedjack33

Regular
Mary doesn’t need to explain to us Australians what a TOSA is.

 
  • Haha
  • Like
Reactions: 6 users

Bigal7425

Regular
I am wondering why Edge Impulse is promoting this.

AI brain, AI technology, AI algorithms.. A million references to the mysterious AI antidote 🤔 Whether it's a phone, a robot, a car, a drone, a camera, a VR headset or a damn fridge.. I don't care 😳 I am just desperate to buy something with brainchip on board.. I'm sure there's 1,000+ sales right here if they'd just add a little AKIDA to their marketing and propaganda 😊
 
  • Like
  • Love
  • Thinking
Reactions: 27 users

alwaysgreen

Top 20
AI brain, AI technology, AI algorithms.. A million references to the mysterious AI antidote 🤔 Whether it's a phone, a robot, a car, a drone, a camera, a VR headset or a damn fridge.. I don't care 😳 I am just desperate to buy something with brainchip on board.. I'm sure there's 1,000+ sales right here if they'd just add a little AKIDA to their marketing and propaganda 😊
I bought one already. Regardless if it has Akida onboard or not, their cameras are amazing.
 
Last edited:
  • Like
  • Haha
Reactions: 8 users
Yesterday I made a stupid and naive post about Akidas in parallel, but I deleted it again. Nevertheless, the thought won't let me go. The answers to the following questions can probably be found somewhere here, but you know how it is. So I'll just dare to ask:

1.) How many Akidas can be connected in parallel? (At least I assume the connection is parallel.)

2.) Why is the number limited?

3.) I assume that the energy consumption is not the same regardless of what is being calculated. What is the maximum energy consumption per Akida if someone scaled the full program up?

My thinking goes in the direction of an Akida cluster: the maximum that is possible. Like an Akida mainframe.
I am not a techie, so all I can do is remember what Peter van der Made said at the 2019 AGM, after they had finished the design of the original AKD1000: he stated that 1,024 AKD1000 chips could be combined.

He also said that 100 combined AKD1000 chips would provide sufficient processing power for a fully autonomous vehicle.

In 2020 the former CEO, Mr Dinardo, announced that they had settled on configuring AKD1000 to allow 64 chips to be connected, as the target markets would not require any more processing power than this would offer.
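
For illustration only, here is a rough back-of-envelope Python sketch of what those chip counts could mean for an "Akida mainframe", assuming throughput and power simply scale with the number of chips. The per-chip figures are placeholders I made up for the example, not published AKD1000 specifications.

```python
# Back-of-envelope estimate for a hypothetical Akida cluster.
# PER_CHIP_* values are placeholders, NOT published AKD1000 specifications.

PER_CHIP_POWER_W = 0.5   # assumed worst-case power per chip, in watts
PER_CHIP_FPS = 100       # assumed inferences per second per chip

def cluster_estimate(num_chips: int) -> dict:
    """Naive linear-scaling assumption: power and throughput grow with chip count."""
    return {
        "chips": num_chips,
        "peak_power_w": num_chips * PER_CHIP_POWER_W,
        "throughput_fps": num_chips * PER_CHIP_FPS,
    }

# The configurations mentioned above: 64 (current configuration),
# 100 (autonomous vehicle estimate) and 1,024 (the original design limit).
for n in (64, 100, 1024):
    print(cluster_estimate(n))
```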

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 24 users

Sirod69

bavarian girl ;-)
Arm AI Ecosystem Partner Plumerai creates extremely small AND extremely accurate AI models that run #onarm—this is a great overview of how they do it—check it out! 👇👇

A couple of weeks ago I presented a webinar about our fast and accurate people detection. The recording is now available. See how we developed our AI to run on an Arm CPU inside TI’s AM62x Sitara. Live demo starts at 13:34. Thanks for hosting me Texas Instruments!


AND Rob Telson likes it!

As you all always say: it's great to be a shareholder! 😘
 
  • Like
  • Love
  • Fire
Reactions: 29 users

Diogenese

Top 20
Arm AI Ecosystem Partner Plumerai creates extremely small AND extremely accurate AI models that run #onarm—this is a great overview of how they do it—check it out! 👇👇

A couple of weeks ago I presented a webinar about our fast and accurate people detection. The recording is now available. See how we developed our AI to run on an Arm CPU inside TI’s AM62x Sitara. Live demo starts at 13:34. Thanks for hosting me Texas Instruments!


AND Rob Telson likes it!

As you all always say: it's great to be a shareholder! 😘
From about the 20 minute mark, he talks about their model libraries. They use 1-bit weights/activations, so the library may be a good fit for Akida.

The inference engine is software running on Cortex M with sufficiently low power that no fans are required. It can take up to 25 fps, but 5 fps is adequate for tracking.

So maybe, with Akida loaded with the Plumerai model library, Akida doing the object detection (bounding boxes) would take the detection load off the software, and the software could then do the tracking.

Late edition: Using Akida to do the object detection would reduce power consumption and speed up detection. Also I guess it would significantly improve fps because the software detection would be the main bottleneck limiting fps.
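
To make that division of labour concrete, here is a minimal Python sketch of the split: a hardware detector returns bounding boxes each frame, and a cheap centroid tracker keeps the IDs on the host CPU. detect_boxes() is only a stand-in for whatever the Akida/Plumerai model would actually return; none of this is BrainChip or Plumerai code.

```python
# Sketch of the detector-on-accelerator / tracker-on-CPU split described above.
# detect_boxes() is a placeholder for the hardware-accelerated detector.

from math import dist

def detect_boxes(frame):
    """Placeholder: would return a list of (x, y, w, h) boxes for the frame."""
    raise NotImplementedError

class CentroidTracker:
    """Keeps persistent IDs by matching each new centroid to the nearest track."""

    def __init__(self, max_distance=50.0):
        self.next_id = 0
        self.tracks = {}              # track id -> last centroid (cx, cy)
        self.max_distance = max_distance

    def update(self, boxes):
        centroids = [(x + w / 2, y + h / 2) for (x, y, w, h) in boxes]
        assigned, used = {}, set()
        for c in centroids:
            # match to the closest unused existing track, else open a new one
            candidates = [(tid, p) for tid, p in self.tracks.items() if tid not in used]
            best = min(candidates, key=lambda kv: dist(kv[1], c), default=None)
            if best is not None and dist(best[1], c) < self.max_distance:
                track_id = best[0]
            else:
                track_id = self.next_id
                self.next_id += 1
            used.add(track_id)
            assigned[track_id] = c
        self.tracks = assigned
        return assigned

# Per-frame loop would then look like:
#   boxes = detect_boxes(frame)     # heavy lifting on the accelerator
#   ids = tracker.update(boxes)     # cheap bookkeeping in software
```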
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 35 users

Diogenese

Top 20
From about the 20 minute mark, he talks about their model libraries. They use 1-bit weights/activations, so the library may be a good fit for Akida.

The inference engine is software running on Cortex M with sufficiently low power that no fans are required. It can take up to 25 fps, but 5 fps is adequate for tracking.

So maybe, with Akida loaded with the Plumerai model library, Akida doing the object detection (bounding boxes) would take the detection load off the software, and the software could then do the tracking.

Late edition: Using Akida to do the object detection would reduce power consumption and speed up detection. Also I guess it would significantly improve fps because the software detection would be the main bottleneck limiting fps.
When you think about it, Plumerai is like the Akida simulation in MetaTF.

I had previously suggested that BrainChip could commercialize the Akida simulation to reach the part of the market which can't afford to have a SoC made. Well, it looks like Plumerai have done that, developing their own model libraries in the process. Of course, BrainChip has its own proprietary model libraries.
 
  • Like
  • Fire
  • Love
Reactions: 22 users
From about the 20 minute mark, he talks about their model libraries. They use 1-bit weights/activations, so the library may be a good fit for Akida.

The inference engine is software running on Cortex M with sufficiently low power that no fans are required. It can take up to 25 fps, but 5 fps is adequate for tracking.

So maybe, with Akida loaded with the Plumerai model library, Akida doing the object detection (bounding boxes) would take the detection load off the software, and the software could then do the tracking.

Late edition: Using Akida to do the object detection would reduce power consumption and speed up detection. Also I guess it would significantly improve fps because the software detection would be the main bottleneck limiting fps.
Hi @Diogenese
One of their use cases is a smart doorbell security camera, with the idea that it can monitor for people and will not set off false alarms for birds, cats, dogs etc., while at the same time not alerting for people who simply walk past the premises on the public footpath/sidewalk.

From what I have read it does not seem to have the ability to one-shot learn the occupants of the residence, so if I am correct this would be another advantage AKIDA would bring to the table.
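
As a rough illustration of what one-shot learning of the occupants could look like in principle, here is a generic Python sketch: one example embedding is enrolled per occupant and new detections are compared against that gallery. embed() and the threshold are hypothetical; this is not Akida's actual API.

```python
# Generic one-shot enrolment sketch, not Akida- or Plumerai-specific.
import numpy as np

def embed(image) -> np.ndarray:
    """Placeholder: would return an L2-normalised feature vector for a person crop."""
    raise NotImplementedError

class OccupantGallery:
    def __init__(self, threshold=0.8):
        self.gallery = {}           # name -> reference embedding
        self.threshold = threshold  # cosine-similarity cut-off (illustrative value)

    def enrol(self, name, image):
        """A single example is enough to add a new occupant ('one-shot')."""
        self.gallery[name] = embed(image)

    def is_known(self, image) -> bool:
        v = embed(image)
        return any(float(v @ ref) >= self.threshold for ref in self.gallery.values())

# A doorbell loop would then alert only when a detected person is not is_known(...).
```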

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 23 users

Diogenese

Top 20
Hi @Diogenese
One of their use cases is a smart doorbell security camera, with the idea that it can monitor for people and will not set off false alarms for birds, cats, dogs etc., while at the same time not alerting for people who simply walk past the premises on the public footpath/sidewalk.

From what I have read it does not seem to have the ability to one-shot learn the occupants of the residence, so if I am correct this would be another advantage AKIDA would bring to the table.

My opinion only DYOR
FF

AKIDA BALLISTA
Hi FF,

Good point.

Plumerai "tracks people and assigns up to 20 unique IDs". But that is not necessarily the same as learning in the sense of augmenting the model library. It suggests to me that Plumerai can identify 20 different people in one scene and assign temporary labels to them to facilitate tracking.

https://plumerai.com/people-detection

Functionality

  • Detects each person in view, even if partially occluded.
  • Tracks people and assigns up to 20 unique IDs.
  • Indoor, outdoor, NIR lighting.
  • Detection distance of more than 20m / 65ft.
  • Trained with 32 million labeled images.
  • Extensively validated on diverse people and settings.
  • Supports lenses up to 180° FOV.
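
To make the distinction concrete, here is a small Python sketch of how a fixed pool of 20 temporary track labels might work, with IDs recycled when someone leaves the scene rather than being learned into the model. This is purely my interpretation, not Plumerai's implementation.

```python
# Interpretation of "up to 20 unique IDs": a pool of temporary track labels,
# not identities learned into the model.

MAX_IDS = 20

class IdPool:
    def __init__(self):
        self.free = list(range(MAX_IDS))   # labels 0..19 available for assignment
        self.active = {}                   # label -> frame index when last seen

    def assign(self, frame_idx):
        """Give a newly appeared person a temporary label, if one is left."""
        if not self.free:
            return None                    # a 21st person gets no label
        track_id = self.free.pop(0)
        self.active[track_id] = frame_idx
        return track_id

    def release(self, track_id):
        """When a track ends, the label returns to the pool for reuse."""
        if self.active.pop(track_id, None) is not None:
            self.free.append(track_id)
```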
 
  • Like
  • Fire
  • Love
Reactions: 21 users

clip

Regular



I remember Rob Telson saying our Ken the robot might need a companion like a dog.

Loona petbot fits the description. Opinion only, dyor.


Furthermore, with its 3D ToF (time-of-flight) camera, Loona can roam freely using four-wheel and two-wheel motions through its proprietary self-balancing design. Additionally, Loona features a high-performance CPU processor, allowing the chip to perform 54 trillion neural network convolution calculations per second, equating to one-quarter of PC's calculation capability, which is rarely seen in consumer robotics.



Voice based AI: Voice source localization, voice wake-up, voice recognition, and natural language recognition

Visual based AI features: Face recognition, emotion detection, human body recognition, human skeleton-based recognition, gesture recognition, motion detection, edge detection, 3D environment detection, trajectory planning

Emotional Interactive AI model
Loona is equipped with powerful offline processing capabilities. All AI functions are processed locally by Loona, except voice recognition and natural language processing, which are based on the Lex service provided by Amazon. Loona's processing ability is as powerful as your smartphone's; it can perform 5 trillion neural network convolution calculations per second (5 TOPS, at the same level as the iPhone XR).


Edit: I see @clip had mentioned it already a few days ago

And it possibly fits with the AKIDA production timeline.
The first prototype was made in January 2022, the same month in which Mercedes announced their partnership with BrainChip ;)



Like Clicbot, Loona is also programmable.



When you think of the car race demonstration video, you can imagine children drawing courses with chalk on the ground, as one example.

 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 31 users

TheFunkMachine

seeds have the potential to become trees.
With Mercedes we know three things:

1. Hey Mercedes is an example of using AKIDA,

2. There are other uses unspecified,

3. When it is deployed at scale it will revolutionise existing technology in Mercedes vehicles,

thus all of your nominated categories and more are in play for Brainchip to win at Mercedes Benz.

When we listen to the Luca Verre Prophesee podcast: despite having worked with Qualcomm Snapdragon, and then SynSense, it was not until Prophesee worked with BrainChip and used AKIDA with their vision sensor that they were able to reach the full potential of their technology.

Paraphrasing what Rob Telson said in the latest ML presentation: AKIDA is just better than anything else.

My opinion only DYOR
FF

AKIDA BALLISTA
It was fun when Rob said (and I paraphrase as well): "When I'm listening to all of the other presenters, all I can think of is Brainchip." I truly believe he meant that, and that it was off script in the moment. It was not a marketing thing; he is sold.
 
  • Like
  • Love
Reactions: 27 users

charles2

Regular
Slow times for semiconductors... but perhaps increased opportunities for Brainchip to herald its ubiquity.

 
  • Like
Reactions: 8 users

Foxdog

Regular
I bought one already. Regardless if it has Akida onboard or not, their cameras are amazing.
Feel like dismantling it to see what's inside? 🤔😂
 
  • Haha
  • Like
Reactions: 12 users

Krustor

Regular
 
  • Like
  • Wow
Reactions: 4 users

MDhere

Regular
Feel like dismantling it to see what's inside? 🤔😂
Morning fellow brners,

I can't imagine that when products are out, we won't have any idea whether Akida is inside without dismantling them lol

I would, at the very least, expect that products containing Akida technology would have Akida stamped on the outside of the product or box. There's got to be some acknowledgement.

But sure for this experiment fire away and dismantle it @alwaysgreen 🤣
 
  • Like
  • Haha
  • Fire
Reactions: 8 users
It was fun when Rob said (and I paraphrase as well): "When I'm listening to all of the other presenters, all I can think of is Brainchip." I truly believe he meant that, and that it was off script in the moment. It was not a marketing thing; he is sold.
Yes it was a great moment.

Then I thought, who does this remind me of, and then it hit me: could Rob Telson be @MC🐠?

I mean everywhere they both look they see Brainchip???

Not conclusive, I know, as everywhere I look I see Brainchip as well. @Bravo seems to be similarly afflicted.

@MC🐠 has however never been seen in the same room as Rob Telson and he likes to make burgers and did not attend the last AGM in Sydney???

Regards
FF

AKIDA BALLISTA
 
  • Haha
  • Like
  • Fire
Reactions: 41 users

TopCat

Regular
This is a very interesting interview with some of the Sony engineers that worked with Prophesee on their EVS. They sound very confident in the tech. Future looks bright!


Sakai: First of all, I want to devote myself fully to this project until we can achieve mass production and shipment of this product, the IMX636.


Nitta: Generally speaking, I want to focus on having EVS become widespread in the market.

Furukawa: To begin with, I believe we must first properly bring EVS to mass production.

Ihara: In addition to the manufacturing and inspection industries, I would like to expand the application range of EVS to various areas where frame-based image sensors are currently in use.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 19 users

BaconLover

Founding Member
I could not find this company mentioned on TSE in my search, so here goes:

https://www.gensight-biologics.com/


Partnered with Prophesee and got mentioned by Luca in the recent podcast:


 
  • Like
  • Love
  • Fire
Reactions: 25 users