Potential Applications for Akida?

uiux

Regular
Hi all

I believe that I may have identified a previously unknown Early Access Partner - Roborigger (Tensa Equipment)

I haven't looked into this in super detail as work has been pretty busy lately, but have listed my initial research below and would welcome any thoughts or comments.

Last week I attended an eGroup session here in Perth and the guest speaker was Derick Markwell from Roborigger.

"The Roborigger is a wireless load controlling system which uses gyroscopic and inertial forces to accurately rotate and orient crane loads"

1. Let's start with the product and the Perth innovation market itself, as the context here holds some weight in my opinion.

Derick Markwell and Tensa Equipment, the company that owns Roborigger, are based here in Perth. The innovation hub within Perth is relatively small all things considered and so I will often run into the same innovators, founders, industry experts, and investors at Perth-based innovation and technology events [note: I'm an accountant so am generally attending these events for business development].

Regarding development timelines, Roborigger was undergoing research and development in Perth throughout 2015, 2016 and 2017. The first prototype was considered 'market-ready' in 2018. Comparatively, the Brainchip RTO occurred in 2015 so it wouldn't be a stretch to assume that Derick and Peter VDM would have run into each other during those early years if they were attending Perth-based innovation and tech events at that point in time.

2. Let's now look at Roborigger itself​
The Roborigger comes with a cloud-based IoT application. The built-in software collects data from every lift, including the time, weight and location of the loads being lifted. The software is under continual development; I recall Derick mentioning in his presentation that the aim is to keep developing it so that customers have data to understand the productivity of their cranes, of the personnel operating them, and of their operational sites in general.
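The kind of per-crane productivity rollup described there could look something like this (a minimal sketch with made-up field names; this is not Roborigger's actual schema, just an illustration of what "data from every lift" enables):

```python
from statistics import mean

def crane_productivity(lifts):
    """Roll per-lift records up into per-crane productivity stats.

    `lifts` is a list of dicts with hypothetical keys:
    'crane', 'weight_kg', 'duration_s' -- one dict per lift event.
    Returns lift count and average cycle time per crane.
    """
    stats = {}
    for lift in lifts:
        s = stats.setdefault(lift["crane"], {"lifts": 0, "durations": []})
        s["lifts"] += 1
        s["durations"].append(lift["duration_s"])
    return {
        crane: {"lifts": s["lifts"], "avg_cycle_s": mean(s["durations"])}
        for crane, s in stats.items()
    }
```

The same shape of rollup would extend naturally to per-operator or per-site groupings, which is presumably what the continual development is aiming at.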

Somewhat related, Roborigger has also developed logistics software that works hand in hand with the warehouse side to allow for complete end-to-end customer service [e.g. arrival at warehouse -> site -> lift -> installation]. The logistics software is still a work in progress. The plan is to develop additional products that attach to forklifts and can identify packages via object detection and take photos to help create an inventory audit trail from warehouse to installation.

Roborigger is developing a 'personnel detection system' based on an artificial intelligence (AI) model to detect personnel within the fall zone of a suspended load. The prototype system is already working.

Page 2 of this IoT Brochure shows details of the AI image recognition capability, including how it works.

Note, "Future capability to identify and categorize the type of the loads being lifted e.g. a shipping container or a bundle reinforcing steel, etc. by image recognition."

Here is a video of the detection system in action.

I've included a screenshot below of what the object detection system looks like - looks familiar right!?

View attachment 13748

From memory (hopefully to be confirmed), Derick explained that Roborigger can sound an alarm when personnel walk within the fall zone, even if there is no internet connection. The 'event' is then later logged when the Roborigger reconnects to the cloud. [I have messaged Derick to confirm if this is correct given it is an extremely important point].
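If that's right, the behaviour is straightforward to sketch: the alarm fires locally, and events queue up until connectivity returns. All names below are hypothetical; this is just an illustration of edge-side buffering with deferred cloud logging, not Roborigger's code:

```python
import time
from collections import deque

class EdgeSafetyLogger:
    """Sounds an alarm immediately on-device; uploads events once back online."""

    def __init__(self):
        self.pending = deque()  # events buffered while offline

    def on_person_in_fall_zone(self, camera_id, online):
        event = {"camera": camera_id, "ts": time.time(), "type": "fall_zone_breach"}
        self.sound_alarm()              # local action, needs no connectivity
        if online:
            self.upload([event])
        else:
            self.pending.append(event)  # logged later, once reconnected

    def on_reconnect(self):
        # flush the backlog in arrival order
        while self.pending:
            self.upload([self.pending.popleft()])

    def sound_alarm(self):
        pass  # would drive a siren/strobe on the real unit

    def upload(self, events):
        pass  # would POST to the cloud API on the real unit
```

The point being: on-device inference plus a local event queue is exactly what lets the safety function keep working with no internet connection.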

Regarding the timing of the development of the AI image recognition capabilities, I've found the following; note that the dates are important:

15 June 2020 - BrainChip Successfully Launches the Akida Early Access Program

5 October 2020 - Upload date of the [above] Personnel Detection System YouTube video.

4 November 2020 - METS Ignited Sponsors Roborigger To Accelerate IoT Development
"The current Roborigger IoT development roadmap includes AI image recognition capabilities to detect and give warnings when personnel are under the crane loads"

Initial takeaways for me:
- the timing of the product development fits nicely with Akida
- the location of the company and its managing director also fits nicely with BrainChip/PVDM
- the look, smell and feel of the product also fit nicely with Akida

If it looks like a duck, swims like a duck, and quacks like a duck, then it probably is a duck.

That's all for now. Feel free to poke, prod and respond with ogres should they be warranted.

Cheers all.


The screenshot is just a generic object detection network like Yolo

And you are just plotting out coincidental timelines?

Is that the strongest link you found?
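For reference, the raw output of a generic YOLO-style detector is just labelled boxes with confidences, and a fall-zone check on top of that is only a few lines. This is an illustrative sketch of how commodity that layer is, not anyone's actual code:

```python
def boxes_overlap(a, b):
    """Axis-aligned overlap test; boxes are (x1, y1, x2, y2) tuples."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def person_in_fall_zone(detections, zone, min_conf=0.5):
    """detections: list of (label, confidence, box) triples, the generic
    shape a YOLO-style model emits. `zone` is the fall-zone box."""
    return any(
        label == "person" and conf >= min_conf and boxes_overlap(box, zone)
        for label, conf, box in detections
    )
```

Any off-the-shelf detector feeding this logic would produce a screenshot that looks much the same, which is the point about the screenshot alone not proving anything.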
 
  • Like
Reactions: 2 users

SERA2g

Founding Member
The screenshot is just a generic object detection network like Yolo

And you are just plotting out coincidental timelines?

Is that the strongest link you found?
Hi U

The personnel detection system's ability to perform inference on-device without cloud connectivity would be the strongest connection in my mind, no?

No word from Derick yet on whether my memory on that subject is correct but I will update here if I receive a response.

I’m not an industry expert like you though so just doing what I can to piece it all together.

Cheers
 
  • Like
  • Love
  • Fire
Reactions: 10 users

stuart888

Regular
Hi all

I believe that I may have identified a previously unknown Early Access Partner - Roborigger (Tensa Equipment)

...
Have to compliment @SERA2g on the sweet writing and delivery.

It sure makes it easier on the reader: the indentation, italics, highlights, numbering, and the step-by-step flow of your writing. Very well presented!!!

You just open up even more Use-Cases. Thanks so much for the effort.
 
  • Like
Reactions: 10 users

Slymeat

Move on, nothing to see.
A neat video on AI accelerators moving from the cloud server to the edge. A little history mixed in.

Everywhere I go now, I think about sensors. When driving down the road and I see a highway camera, I used to think of pixel-data sensor info. Now I think about neuromorphic event-based cameras, which only emit data when spikes occur, saving energy until a pattern actually needs analysis. Yay to the myriad of tinyML and pattern recognition use cases!
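The core idea of an event-based camera can be sketched in a few lines: instead of full frames, you only get sparse (x, y, polarity) events where brightness changed. A toy frame-differencing illustration of the concept, not any vendor's API:

```python
def to_events(prev_frame, frame, threshold=15):
    """Convert two grayscale frames (lists of rows of pixel values) into
    DVS-style events: (x, y, polarity) wherever brightness changed by
    more than `threshold`. Unchanged pixels produce no output at all,
    which is where the energy saving comes from."""
    events = []
    for y, (prev_row, row) in enumerate(zip(prev_frame, frame)):
        for x, (p, c) in enumerate(zip(prev_row, row)):
            if abs(c - p) > threshold:
                events.append((x, y, 1 if c > p else -1))
    return events
```

A static scene yields an empty event stream, so downstream processing only wakes up when something actually moves.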



Thanks for the video @stuart888, especially considering you preset it to the parts relevant to the edge. I haven't yet watched all of it but felt I needed to comment on something that astounded me: the confession of wasted energy in the "solutions" of others.

Look at the size of those wires on Google’s TPU board. I assume they are for heat dissipation; what an absolutely absurd amount of wasted energy (as heat).

I have drawn the red circle and line on the snapshot below. BrainChip should use this in their marketing to emphasize the huge waste of energy in other, brute-force ways of doing AI.

Overclocking a complex chipset, and then simply dealing with all the heat, is such a dated way of solving a problem - the veritable electronic version of using a sledgehammer to hammer in a nail. The world needs to start using dedicated tools for the job at hand. Akida is such a tool!

1660082934165.png
 
  • Like
  • Fire
Reactions: 8 users

uiux

Regular
Hi U

The personnel detection system's ability to perform inference on-device without cloud connectivity would be the strongest connection in my mind, no?

No word from Derick yet on whether my memory on that subject is correct but I will update here if I receive a response.

I’m not an industry expert like you though so just doing what I can to piece it all together.

Cheers

Hi Sera - these are the questions I would ask if I were investigating it. Sorry for being blunt just trying to be helpful.


My PC has a GPU which can run inference fine without the internet. You'll find you can run inference on many devices without the cloud, but it comes with trade-offs like power requirements or chunky form factors, etc.

Without seeing their set up it's hard to tell
 
  • Like
  • Fire
Reactions: 5 users

SERA2g

Founding Member
Hi Sera - these are the questions I would ask if I were investigating it. Sorry for being blunt just trying to be helpful.


My PC has a GPU which can run inference fine without the internet. You'll find you can run inference on many devices without the cloud, but it comes with trade-offs like power requirements or chunky form factors, etc.

Without seeing their set up it's hard to tell
Hi U

No stress! I asked for all to put the post through its paces so thanks for helping.

I wouldn’t imagine Roborigger would sell many units given the niche and expensive (circa $350k) nature of its product.

I believe they rely more heavily on a licence fee/rental model for their cranes so I don’t think it’d be a huge money spinner for Brainchip even if it were akida onboard.

Cheers!
 
  • Like
Reactions: 5 users

uiux

Regular
Hi U

No stress! I asked for all to put the post through its paces so thanks for helping.

I wouldn’t imagine Roborigger would sell many units given the niche and expensive (circa $350k) nature of its product.

I believe they rely more heavily on a licence fee model for their cranes so I don’t think it’d be a huge money spinner for Brainchip even if it were akida onboard.

Cheers!


The brochure mentions "AI computer" so could just be a PC with GPU?
 
  • Like
Reactions: 2 users

SERA2g

Founding Member
The brochure mentions "AI computer" so could just be a PC with GPU?
You could be right!

The word that follows is “processes” (the video), so I’d interpreted it as a processor rather than taking the literal meaning of a computer.

I have done some high level LinkedIn “research” *cough*stalking of the IoT Product Manager at Roborigger as well as the CEO and have been unable to find any links to Brn staff, whether by connection or likes/comments for what that’s worth to anyone.
 
  • Like
  • Fire
Reactions: 4 users

uiux

Regular
You could be right!

The word that follows is “processes” (the video), so I’d interpreted it as a processor rather than taking the literal meaning of a computer.

I have done some high level LinkedIn “research” *cough*stalking of the IoT Product Manager at Roborigger as well as the CEO and have been unable to find any links to Brn staff, whether by connection or likes/comments for what that’s worth to anyone.

“Processes” could refer to a graphics processing unit, a central processing unit, a neural processing unit or a tensor processing unit


Lol
 
  • Fire
  • Like
Reactions: 2 users

SERA2g

Founding Member
“Processes” could refer to a graphics processing unit, a central processing unit, a neural processing unit or a tensor processing unit


Lol
Not an industry expert as you may recall.

:)
 
  • Like
  • Haha
Reactions: 3 users

uiux

Regular
  • Haha
Reactions: 1 users

SERA2g

Founding Member
  • Like
  • Haha
  • Love
Reactions: 4 users

uiux

Regular
Challenged but not laughed at hahahhaa

I was laughing with


I tried a new direction this time. Might be easier going back to the old method LOL
 
  • Haha
  • Like
Reactions: 5 users

SERA2g

Founding Member
I was laughing with


I tried a new direction this time. Might be easier going back to the old method LOL

I’ve been told at work that I can sometimes be “too” professional and should work on a sandwich-type approach to my emails: warm intro, technical details, warm outro.

Perhaps you could have approached it as below:

“Hi SERA2g,

Fantastic effort, well detailed research, I’ve had a look and here are my thoughts for your consideration:

You’re a fucking moron.

Best wishes and kindest regards

U”

Warm, technical, warm - works 60% of the time, every time!

😂
 
  • Haha
  • Like
  • Love
Reactions: 22 users

uiux

Regular
I’ve been told at work that I can sometimes be “too” professional and should work on a sandwich-type approach to my emails: warm intro, technical details, warm outro.

Perhaps you could have approached it as below:

“Hi SERA2g,

Fantastic effort, well detailed research, I’ve had a look and here are my thoughts for your consideration:

You’re a fucking moron.

Best wishes and kindest regards

U”

Warm, technical, warm - works 60% of the time, every time!

😂

That was my other approach I was backing away from LOL
 
  • Haha
  • Like
Reactions: 7 users

stuart888

Regular
Two-minute smart health use case below. Quick to the point summary.

 
  • Like
  • Fire
  • Love
Reactions: 8 users

equanimous

Norse clairvoyant shapeshifter goddess
  • Like
  • Love
Reactions: 5 users

stuart888

Regular
What about the secret sauce of doing inference and learning via accelerometers? The use cases for vibration-based anomaly detection might be many. I'm just starting to research vibration sensors; sound is another way to do it, and there might be lots more. Just trying to learn.
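A toy baseline for vibration anomaly detection is surprisingly small: compare each accelerometer window's RMS energy against a learned baseline. Illustrative only; real deployments would use spectral features or a trained model rather than a fixed threshold:

```python
import math

def rms(window):
    """Root-mean-square amplitude of one window of accelerometer samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def is_anomalous(window, baseline_rms, factor=3.0):
    """Flag a vibration window whose energy is far above the baseline
    learned during normal operation."""
    return rms(window) > factor * baseline_rms
```

Even this crude version shows why accelerometers are attractive edge sensors: each decision only needs a short local window of samples, no cloud round trip.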

1660863783735.png
 
  • Like
  • Wow
Reactions: 4 users

clip

Regular
When I was thinking about possible use cases for Akida in cameras, I came across this patent from Sony:


 
Last edited:
  • Like
  • Fire
  • Thinking
Reactions: 8 users

clip

Regular
Just found out that voice control is a relatively old feature in GoPro cameras, first introduced in 2016 with the GoPro Hero5.



Since GoPro is continuously improving its camera models (a Hero11 is rumoured to be coming this September), Akida could further help to improve battery life, voice recognition accuracy or image quality.

Besides that, BrainChip and GoPro are both based in California :)
 
  • Like
Reactions: 5 users