Potential Applications for Akida?

I love how many different people we have on here, with immense variety in our knowledge, experience and professions. Individually we can't fathom all the diverse ways Akida could be applied, but our own experience shapes our understanding of what these applications could be. So I'd like to start a conversation here on specific potential applications, from our own viewpoints, given the ability of Akida's neuromorphic architecture to pick up patterns. If we can discuss this in the format Problem ==> Solution with Akida, it will be easier to understand.

Let me start out with mine.

Problem:

I used to work in the banking industry with a portfolio of clients, and one of the things we needed to investigate across our client portfolios was suspicious activity. This could be flagged from dubious counterparties, irregular amounts, or even an excessive number of small transactions. We had a transaction monitoring team that ran programmes and flagged transactions fairly manually, but as you can imagine, these transactions often fly under the radar until it is too late and a major issue occurs; then we'd have to forensically examine the historical transactions.

Solution:

I imagine that models can be designed for suspicious transactions, and the daily live transaction data can be run through Akida to pick up suspicious transactions in real time and flag them for monitoring. This would reduce the negative impact and make the portfolio managers' lives a lot easier.
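To make that concrete, below is a very rough sketch of what such a screening loop could look like in software. The feature set, thresholds and the generic anomaly detector are purely my own assumptions, standing in for whatever model would actually be trained and deployed on Akida silicon.

```python
# Minimal sketch of real-time transaction screening (hypothetical feature set).
# In an Akida deployment the scoring model would run on the neuromorphic chip;
# here a generic anomaly detector stands in to illustrate the data flow.
from sklearn.ensemble import IsolationForest
import numpy as np

# Historical transactions: [amount, counterparty_risk_score, txns_last_24h]
history = np.array([
    [120.0, 0.1, 3],
    [75.5,  0.2, 1],
    [980.0, 0.1, 2],
    [45.0,  0.3, 4],
])

model = IsolationForest(contamination=0.05, random_state=0).fit(history)

def screen(transaction):
    """Return True if the live transaction should be flagged for review."""
    score = model.decision_function([transaction])[0]
    return score < 0  # negative scores indicate likely anomalies

# Live feed: an unusually large amount via a risky counterparty, many recent txns
print(screen([25000.0, 0.9, 40]))  # expected: True, route to the monitoring team
```

In a real deployment the scoring step would happen on-device against the live transaction feed, with only the flagged items escalated to the monitoring team.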
 
  • Like
  • Love
Reactions: 14 users

IloveLamp

Top 20
The possibilities are truly limitless...
 
  • Like
Reactions: 5 users
Anyone want to share?
 

stuart888

Regular
This very knowledgeable 1,000-eyes group has documented many use cases in this forum. Every major cloud and chip manufacturer, even the GPU king Nvidia, is preaching the no-brainer benefits of Edge AI: low latency, reduced bandwidth, improved security, and energy savings.

One key fact: the Akida Spiking Neural Processor has the secret sauce to do this better than other Edge AI architectures.

For example, when the owner walks up to a 2024 Mercedes, the door camera feeds a photo into Akida, which processes it and opens the door immediately. No need for all the time, bandwidth and energy wasted in the AWS cloud; massive overuse of the cloud is happening everywhere, and it is a big money waster. This is just one use case. Many more are coming when MBUX switches to MB.OS (with Akida) and the benefits of true far-Edge AI/ML are realised, topped off by greater customer satisfaction.


 
  • Like
  • Fire
  • Love
Reactions: 16 users

MADX

Regular
Here's an application for FactFinder's sanction (or not).

Hey FF, knowing your wealth of experience in the law, and bearing in mind the influence of peer pressure on human behaviour, do you think the following would improve the driving behavior of mankind?

The idea is that dashcams include Akida to detect bad driving and infringements (my favorite is tossing cigarette butts out the window). The recorded data, which includes the number plate, is reported to the police automatically, e.g. via Wi-Fi.
I guess it would not stand up in court, but an educational letter could then be generated and sent to the owner automatically. If multiple reports are generated for the same owner, automatic letters are sent with sterner wording, and finally a threat of "you are required to attend a driver education class" or a mandatory "please explain before we prosecute", etc.

Reports by the public avoid the "big brother is watching" effect and make it peer pressure instead. Automation minimises the demand on valuable police time.

Of course, because I came up with the idea, I would be exempted ;)
 
  • Haha
  • Like
  • Fire
Reactions: 7 users

MADX

Regular
Similarly to the banking idea above, detection of "sus" internet-transmitted data for cybersecurity etc.
 
  • Like
  • Fire
Reactions: 4 users

MADX

Regular
I actually feel the principles of the dashcam idea could be extended to a convenient peer-pressure system where grumpy old busybodies :rolleyes: could make reports to Crime Stoppers.
 
  • Like
Reactions: 2 users

Perhaps

Regular
Bringing 5G and neuromorphic architecture together will offer a whole new world of applications. That SiFive (Risc V) - Qualcomm (5G) - Brainchip connection could be the most powerful for entering the mass market.
To give an idea of the possibilities here an article about the research of Ericsson:
 
  • Like
  • Love
Reactions: 7 users

Lex555

Regular
Akida used for camera object recognition during supermarket self serve so scan matches bag contents through the one shot learning.
 
  • Like
  • Love
Reactions: 8 users

Labsy

Regular
2 words... "Gun Control"
A little sensor in a handgun or an automatic weapon which only works for the registered owner...
 
  • Like
  • Love
  • Fire
Reactions: 10 users

MADX

Regular
Hey Fact Finder, I see you reacted to my dashcam idea above with a "Haha". I would genuinely value a brief opinion on it sometime, but don't be diverted from your other valuable work on our forum.
 
  • Like
Reactions: 1 users

cosors

👀
When I think of vibration analysis, I think less of bridges and more immediately of the catastrophic accident at Eschede, where 101 people died. After many years the causes were identified, along with the chain of tragic mistakes; I do not want to go into the details here. Nevertheless, I imagine permanent vibration analysis of the axles and wheel flats in all high-speed trains, without a cloud connection. Through machine learning, Akida could perhaps filter the data far better than current sensor technology can. So I'm thinking of a sensor specialist supplying train builders worldwide.
I would add that shafts are very important from an engineering point of view. As with train axles, the vibration analysis of turbine shafts is a very important thing; I think first of all of shafts in power plant turbines, and all of this on an internal network, given the threat of cyber attacks. Then, of course, there are also turbines in aeroplanes, for example, which brings me on to the analysis of airfoils. There are probably a thousand more topics for this use case. I hope a sensor specialist will soon be one of our customers, or one of the customers of system suppliers who use our hardware or IP.
To this topic I add earthquake detection and analysis, for example on site at a volcano, in a self-contained box. Machine learning could save the time spent on manual analysis, and maybe the detection would be better too. This brings to mind the severe earthquake in Italy. A funded earthquake early-detection project of a very different kind is now underway there: it was found that animals became restless before the quakes, so in the study they are fitted with geo-transmitters to warn of a possible earthquake based on the data. With vibration analysis of earthquakes and Akida, many lives could be saved worldwide.
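For anyone wondering what "permanent vibration analysis without a cloud connection" might boil down to, here is a toy sketch of the kind of on-device check I have in mind. The sampling rate, frequency band and threshold factor are invented for illustration; a trained Akida model would replace the hand-tuned comparison.

```python
# Rough sketch of on-device vibration monitoring (axles, shafts, seismic sensors).
# Assumes a stream of accelerometer samples; thresholds and rates are made up.
import numpy as np

SAMPLE_RATE_HZ = 1000
WINDOW = 1024

def band_energy(window, lo_hz, hi_hz):
    """Energy in a frequency band of one vibration window."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / SAMPLE_RATE_HZ)
    mask = (freqs >= lo_hz) & (freqs < hi_hz)
    return spectrum[mask].sum()

def is_anomalous(window, baseline_energy, factor=5.0):
    """Flag a window whose 50-200 Hz energy far exceeds the healthy baseline."""
    return band_energy(window, 50, 200) > factor * baseline_energy

# Healthy windows would be recorded during commissioning; here they are simulated.
healthy_windows = [np.random.normal(0, 0.01, WINDOW) for _ in range(100)]
baseline = np.mean([band_energy(w, 50, 200) for w in healthy_windows])

live = np.random.normal(0, 0.01, WINDOW) + 0.5 * np.sin(
    2 * np.pi * 120 * np.arange(WINDOW) / SAMPLE_RATE_HZ)  # 120 Hz fault tone
print(is_anomalous(live, baseline))  # -> True, raise an alert locally, no cloud
```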

I'm thinking about the wine-testing video in the bar (funny video, I like that kind of thing) and the big incident just recently at Ferrero, when masses of products were recalled and production was shut down. The only thing to blame was a single filter contaminated with salmonella. I imagine sensor technology that is trained with Akida and then trains itself further through machine learning, automatically detecting the smallest fluctuations, for example when salmonella contaminates the product and slightly changes its composition. Sensor technology in the food industry, for the analysis and detection of production fluctuations. When I think of the breath Covid test, a sensor in the food industry should also be able to detect bacteria or other impurities.

Akida in FEM (Finite Element Method), to increase the efficiency of simulation technology through machine learning and AI. I have only read about this in theory so far. Data in industry is usually highly sensitive and must be protected within R&D departments, so a link to a cloud is a no-go.

And Akida for automated penetration testing or side-channel analysis in IT security, to test systems or devices and find vulnerabilities more efficiently with machine learning. And all that without connecting to a cloud, for similar reasons as above.

A bit of daydreaming: agile advertising. In some sci-fi films and cities of the future, advertising switches dynamically. How would it be if large displays in the major cities of the world used Akida's very good recognition to show the advertising most interesting to the viewer? If a child stands in front of it, toys are shown, distinguishing between boys and girls; if a man stands in front of it, different advertising is shown than for a woman; and different advertising again when a family, stroller, dog, skateboard or bicycle is recognised. And to top it off, Akida with Nviso recognises from the viewer's facial expression whether the advertisement should be enthusiastic or whether something else should be suggested.

Some may be familiar with the hassle of paying and waiting in line at toll stations on the highway: Akida's recognition of the vehicle and automatic billing. And another topic: speed traps for speeding and recognition of the driver, self-sufficient in a single unit.

Autonomous cleaning vehicles, larger or smaller robots depending on the application. At first only on company premises or private property, until people get used to them. Akida's detection of things on the ground categorises them correctly and acts accordingly.

Adapted from Perhaps' post: for me, the biggest strength of 5G is the minimal latency and real-time automation of factory floors, Industry 4.0. I've been to a few symposiums on the topic, and there was a lot of imagination in them. But with Akida's recognition I can imagine tremendous progress in autonomous warehousing: no scanning with a handheld scanner or manual declaring. One shot and the conveyor robot knows what to do, whether to sort into the warehouse or transport directly to the robot cell, where Akida's recognition takes over the parts. What used to be fantasy I can now imagine in reality with Akida.

Another bit of fun: Akida at the poker table could certainly be an exciting experience for one or the other. Surely all electronics will be banned at the tables as soon as someone has made his money with Akida. Besides, I imagine Akida IP in every non-Apple and non-Google phone anyway. Google immediately makes me think of Google Lens, but that's going too far.

I'm driving past a container terminal right now. Fully automated, autonomous loading and unloading at the world's container terminals: Akida automatically detects the position and type of container and recognises the various means of transport, no matter what type.
The container ships lead me directly to the next topic: driving-assistance systems for small ships. Boris Herrmann would not have had a crash with Akida and would certainly have won the Vendée Globe. And I don't just mean a warning system; they have enough of those. Rather, a pure emergency-assistance system to avoid accidents and collisions. It was also reported how big a challenge it is to recognise objects floating in the sea, and this is where Akida could probably be very useful.
And tied to that, another theme for ships. The EQXX gives me the idea: supporting systems with machine learning for ships to save fuel, depending on the situation, weather data, position, wind and waves.

Next topic: audio recognition. Sooner or later Akida will be in smartphones, and then this thought will be standard or obsolete. Until then, I imagine a display device with Akida for deaf people: they would only have to hold the device and read live on the display what is being said or announced at stations. As with the EQXX, machine learning also enables the recognition of dialects and filters the clearest voice out of the babble of speech, depending on the volume level or frequency, perhaps even letting the user select which voice via filters or settings.
I'll just keep going here until it becomes a possible use-case book. Is there a character limit here? 🤔 I'm threatening to use this as my notebook.
 
  • Like
  • Fire
Reactions: 9 users

Fulltank

Emerged
Hi all,

My son works in the TV caper. He's on camera filming, and also works in the production/control room.
Mainly sports (horse racing, soccer, football).

I spoke to him about the use of AI/ML in the sporting area and when he mentioned it to his work mates, he got blank looks.

Can I get the think tank here to throw up some ideas for where AKIDA could be used in the sporting world?
It'll be interesting to see how viewing sports could change with the implementation of AKIDA.

Thanks,
 
  • Like
Reactions: 3 users

Townyj

Ermahgerd

I can definitely see robotic camera operators popping up some day. Imagine having a robot on wheels following a footy game around the boundary, or drones flying above following the play. Some cool stuff to look forward to!
 
  • Like
Reactions: 4 users

MADX

Regular
I wonder if BRN tech, assisted by appropriate "partners", could be used to detect cues from body language, e.g. during police interviews, where it could prompt appropriate questioning?
It may also be used as an adjunct to improve the accuracy of lie detectors.
 
  • Like
  • Thinking
Reactions: 3 users

stuart888

Regular
A neat video on AI Accelerators moving to the Edge from the Cloud Server. A little history mixed in.

Everywhere I go now, I think about sensors. When driving down the road and seeing a highway camera, I used to think of pixel-data sensor info. Now I think about neuromorphic event-based cameras, which only produce data when spikes occur, saving energy for the moments when a pattern actually needs analysis. Yay to the myriad of tinyML and pattern-recognition use cases!
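For the uninitiated, here is a toy illustration of why event-based sensing saves so much work compared with streaming full frames. The threshold and frame sizes are made up, and a real event camera does this in the pixel circuitry itself rather than in software.

```python
# Toy illustration of event-based sensing: instead of streaming every frame,
# only pixel changes above a threshold produce "events" that need processing.
import numpy as np

THRESHOLD = 15  # brightness change needed to emit an event (arbitrary)

def events_between(prev_frame, next_frame):
    """Return (row, col, polarity) for pixels that changed significantly."""
    diff = next_frame.astype(int) - prev_frame.astype(int)
    rows, cols = np.where(np.abs(diff) > THRESHOLD)
    return [(r, c, int(np.sign(diff[r, c]))) for r, c in zip(rows, cols)]

prev = np.full((4, 4), 100, dtype=np.uint8)   # static scene
nxt = prev.copy()
nxt[1, 2] = 160                               # something moved in one spot
print(events_between(prev, nxt))              # -> [(1, 2, 1)]: one event, not 16 pixels
```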


 
  • Like
  • Fire
Reactions: 10 users

MADX

Regular
Same here. Sense the presence of a sensor, think Akida tech. I can hardly think of anything else.

Is it the case that, where algorithms use the sensor data to cause an action, Akida tech could cause more accurate actions, because a conventional algorithm is limited to what the programmer has built into it and, for instance, could not cope with the unexpected?
 
  • Like
Reactions: 2 users

equanimous

Norse clairvoyant shapeshifter goddess

Researchers Use IoT for Cancer Diagnosis

Artificial neurons and an AI system are being used to assess benign or malignant tumors
  • Written by Scarlett Evans
  • 14th July 2022

Researchers from the Korea Institute of Science and Technology (KIST) have developed a novel cancer diagnosis technology; a simple but accurate method that uses tactile neuron devices combined with AI technology.
Typically, a non-invasive method of diagnosis is ultrasound elastography; however, interpretation of the results can vary. The new method identifies and measures the stiffness and distribution of a tumor, allowing for accurate cancer diagnosis.

The KIST team developed this alternative method to improve accuracy and speed up the time of prognosis. For their experiments, the team combined tactile neuron devices with artificial neural network learning methods, applying pressure to a potentially cancerous site, with the pressing force generating electrical spikes that increase or decrease depending on the stiffness of the object encountered.

The method falls under the category of “neuromorphic technology,” a data processing technology that has become increasingly popular given its compatibility with AI, IoT and autonomous technologies. It seeks to emulate the human brain’s method of processing vast amounts of information using minimal energy, with neurons receiving external stimuli through sensory receptors which are then converted into electrical spike signals.
Deploying this for disease diagnosis, the team used elastography images of malignant and benign breast tumors in combination with a spiking neural network learning method. The pixels from the color-coded ultrasound elastography image correlated to the stiffness of the object encountered and were converted to a frequency value to train the AI.
Following this process, the team reported a breast tumor diagnosis accuracy of 95.8%, saying the developed artificial tactile neuron technology is capable of “detecting and learning mechanical properties with a simple structure and method.”
The team also anticipated the device could be used in low-power and high-accuracy disease diagnosis and applications such as robotic surgery, where a surgical site needs to be quickly determined with minimal to no human interaction.
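To illustrate the rate-coding step the article describes (stiffer, color-coded pixels mapping to higher spike frequencies before SNN training), here is a loose sketch. The scaling and the Poisson spike generation are my own simplification, not the KIST team's actual method.

```python
# Loose sketch of rate coding: stiffer (higher-valued) elastography pixels map
# to higher spike frequencies, which a spiking neural network can then learn from.
import numpy as np

MAX_RATE_HZ = 100.0  # assumed ceiling for the spike frequency

def pixels_to_rates(stiffness_pixels):
    """Map 0-255 stiffness-coded pixel values to spike rates in Hz."""
    return (np.asarray(stiffness_pixels, dtype=float) / 255.0) * MAX_RATE_HZ

def poisson_spike_train(rate_hz, duration_s=1.0, dt=0.001, rng=np.random):
    """Generate a spike train whose average frequency matches the rate."""
    steps = int(duration_s / dt)
    return rng.random(steps) < rate_hz * dt  # boolean spike/no-spike per step

pixels = [30, 128, 250]  # soft, medium, stiff regions of the image
for p, rate in zip(pixels, pixels_to_rates(pixels)):
    spikes = poisson_spike_train(rate)
    print(p, round(rate, 1), int(spikes.sum()))  # stiffer pixel -> more spikes
```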
 
  • Like
  • Love
  • Thinking
Reactions: 13 users


SERA2g

Founding Member
Hi all

I believe that I may have identified a previously unknown Early Access Partner - Roborigger (Tensa Equipment)

I haven't looked into this in super detail as work has been pretty busy lately, but have listed my initial research below and would welcome any thoughts or comments.

Last week I attended an eGroup session here in Perth and the guest speaker was Derick Markwell from Roborigger.

"The Roborigger is a wireless load controlling system which uses gyroscopic and inertial forces to accurately rotate and orient crane loads"

1. Let's start with the product and the Perth innovation market itself, as the context here holds some weight in my opinion.

Derick Markwell and Tensa Equipment, the company that owns Roborigger, are based here in Perth. The innovation hub within Perth is relatively small all things considered and so I will often run into the same innovators, founders, industry experts, and investors at Perth-based innovation and technology events [note: I'm an accountant so am generally attending these events for business development].

Regarding development timelines, Roborigger was undergoing research and development in Perth throughout 2015, 2016 and 2017. The first prototype was considered 'market-ready' in 2018. Comparatively, the Brainchip RTO occurred in 2015 so it wouldn't be a stretch to assume that Derick and Peter VDM would have run into each other during those early years if they were attending Perth-based innovation and tech events at that point in time.

2. Let's now look at Roborigger itself​
The Roborigger comes with a cloud-based IoT application. The built-in software collects data from every lift, including time, weight, and the location of the loads being lifted. The software is undergoing continual development, and I recall Derick mentioning in his presentation that the aim is to keep developing it so that customers have data to understand the productivity of their cranes, of the personnel operating them, and of their operational sites in general.

Somewhat related, Roborigger has also developed logistics software that'll work hand in hand from a warehouse point of view to allow for complete end-to-end customer service [arrival at warehouse -> site -> lift -> installation as an example]. The logistics software is still a work-in-progress. The plan is to develop additional products that'll be attached to forklifts and can identify packages via object detection and take photos that'll help to create an inventory audit trail from warehouse to installation.

Roborigger is developing a 'personnel detection system' based on an artificial intelligence (AI) model to detect personnel within the fall zone of a suspended load. The prototype system is already working.

Page 2 of this IoT Brochure shows details of the AI image recognition capability, including how it works.

Note, "Future capability to identify and categorize the type of the loads being lifted e.g. a shipping container or a bundle reinforcing steel, etc. by image recognition."

Here is a video of the detection system in action.

I've included a screenshot below of what the object detection system looks like - looks familiar right!?

[Screenshot: Roborigger's personnel/object detection overlay]


From memory (hopefully to be confirmed), Derick explained that Roborigger can sound an alarm when personnel walk within the fall zone, even if there is no internet connection. The 'event' is then logged later, when the Roborigger reconnects to the cloud. [I have messaged Derick to confirm whether this is correct, given it is an extremely important point.]
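If that is right, the pattern is the classic offline-first edge design: act locally, queue the event, sync when the link comes back. A hypothetical sketch, with class and method names that are mine rather than Roborigger's:

```python
# Sketch of the offline-first pattern described above: detect and alarm locally,
# buffer the event, and sync to the cloud only when a connection returns.
import time
from collections import deque

class FallZoneMonitor:
    def __init__(self, cloud_client=None):
        self.pending = deque()          # events logged while offline
        self.cloud = cloud_client       # hypothetical IoT client

    def on_person_detected(self, camera_id):
        self.sound_alarm()              # safety action needs no connectivity
        self.pending.append({"camera": camera_id, "ts": time.time()})
        self.try_sync()

    def sound_alarm(self):
        print("ALARM: personnel in fall zone")

    def try_sync(self):
        while self.pending and self.cloud and self.cloud.is_connected():
            self.cloud.upload(self.pending.popleft())

# monitor = FallZoneMonitor(cloud_client=my_iot_client)  # hypothetical client
# monitor.on_person_detected("crane_cam_1")
```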

In regards to the timing of the development of the AI image recognition capabilities, I've found the following, noting that the dates are important:

15 June 2020 - BrainChip Successfully Launches the Akida Early Access Program

5 October 2020 - Upload date of the [above] Personnel Detection System YouTube video.

4 November 2020 - METS Ignited Sponsors Roborigger To Accelerate IoT Development
"The current Roborigger IoT development roadmap includes AI image recognition capabilities to detect and give warnings when personnel are under the crane loads"

Initial takeaways for me:
- the timing of the product development fits nicely with akida
- the location of the company and managing director also fit nicely with brainchip/pvdm
- the look, smell and feel of the product also fit nicely with akida

If it looks like a duck, swims like a duck, and quacks like a duck, then it probably is a duck.

That's all for now. Feel free to poke, prod and respond with ogres should they be warranted.

Cheers all.
 
  • Like
  • Fire
  • Love
Reactions: 25 users