Asynchronously Coded Electronic Skin (ACES) platform

Presume this has possibly been covered before (anyone know if we're working with them?), but I only did a cursory search and couldn't see anything, so if it has... apologies.

Also posted elsewhere.




[Attached images: TAC 1.jpg – TAC 4.jpg]
 


xdragon

Member
Very cool! Neuromorphic processors could provide robots with five senses and make them human-like.
 

uiux

Regular
Nice find, thanks

BrainChip has actually published some work based on this tech and research.

Based on this dataset:


ST-MNIST -- The Spiking Tactile MNIST Neuromorphic Dataset

Tactile sensing is an essential modality for smart robots as it enables them to interact flexibly with physical objects in their environment. Recent advancements in electronic skins have led to the development of data-driven machine learning methods that exploit this important sensory modality. However, current datasets used to train such algorithms are limited to standard synchronous tactile sensors. There is a dearth of neuromorphic event-based tactile datasets, principally due to the scarcity of large-scale event-based tactile sensors. Having such datasets is crucial for the development and evaluation of new algorithms that process spatio-temporal event-based data. For example, evaluating spiking neural networks on conventional frame-based datasets is considered sub-optimal. Here, we debut a novel neuromorphic Spiking Tactile MNIST (ST-MNIST) dataset, which comprises handwritten digits obtained by human participants writing on a neuromorphic tactile sensor array. We also describe an initial effort to evaluate our ST-MNIST dataset using existing artificial and spiking neural network models. The classification accuracies provided herein can serve as performance benchmarks for future work. We anticipate that our ST-MNIST dataset will be of interest and useful to the neuromorphic and robotics research communities.
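The abstract notes that evaluating spiking networks on frame-based data is sub-optimal, and that the dataset was also benchmarked with conventional ANN models. As a rough illustration of how asynchronous spike events get binned into frames for such an ANN baseline, here is a minimal Python sketch; the event format, 10x10 taxel grid, and 50 ms bin width are illustrative assumptions, not the actual ST-MNIST file layout.

```python
# Hedged sketch: binning asynchronous tactile spike events into fixed-rate
# frames so a conventional ANN can consume them. Event format, taxel grid
# size (10x10 here), and bin width are illustrative assumptions only.

def events_to_frames(events, grid=10, t_end=1.0, bin_ms=50.0):
    """events: list of (x, y, t_seconds) spikes from the taxel array.
    Returns a list of grid x grid spike-count frames, one per time bin."""
    n_bins = int(t_end * 1000.0 / bin_ms)
    frames = [[[0] * grid for _ in range(grid)] for _ in range(n_bins)]
    for x, y, t in events:
        b = min(int(t * 1000.0 / bin_ms), n_bins - 1)  # clamp final event
        frames[b][y][x] += 1
    return frames

# Example: two spikes from the same taxel land in different time bins.
frames = events_to_frames([(2, 3, 0.010), (2, 3, 0.300)])
```

Note the trade-off this illustrates: binning recovers a frame tensor that standard ANNs understand, but it discards the precise spike timing that spiking networks can exploit directly.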
What I like is: if these guys get off the ground in units moved, then whose chip is the buyer gonna need... 2 choices ;)
 
Good short article on the tech.



ACES: Asynchronous Coded Electronic Skin

BY DYLAN ROCHE

MAY 14, 2021

10:10 A.M.

[Image: Arm.jpg]



It’s not sci-fi, it’s a medical breakthrough.

For generations past, robotic prosthetics were something you heard about only in science fiction. But it turns out that science fiction is inspiring some major medical breakthroughs—specifically, that scene at the conclusion of The Empire Strikes Back where Luke Skywalker receives a replacement robotic hand. So, if you’re someone who likes to spend May 4 wishing your fellow Jedis, “May the Fourth be with you,” here’s a fun fact you can share with them this year.

Over on the other side of the world, researchers at the National University of Singapore have developed what’s known as asynchronous coded electronic skin, called ACES for short. It’s a type of electronic skin that uses 100 small sensors to recreate the human sense of touch—including texture, pain, heat, cold, and other sensations. Specifically, these sensors can distinguish up to 30 textures and even read Braille. As this technology develops, researchers expect to apply ACES to prosthetic limbs and allow people to regain their sense of touch.

Another potential use for ACES? It could be used on robots that would benefit from an artificial sense of touch when performing tasks that are too dangerous or too mundane for humans. Additionally, a sense of touch could be beneficial for robots used to perform surgery—a huge step forward for the medical community!

So, one school of thought on these guys, given they are looking to ship their first sensors in early 2022.

They were in the EAP, I'd say, as I'm not sure they could otherwise develop and suitably test their sensors prior to release if those sensors are designed around utilising Akida or Intel.

From what I can see, you can't get Loihi 2 unless you're part of Intel's research community, whereas Akida is available now.

So the other question is: will they ship their sensors with Akida onboard? If the final product was designed to work with Akida, then clients will want a working product.

Hmmmm... we wait.
 

matt

Member
Thanks for sharing, mate. The fact that only one of them is commercially available to pair with makes things fairly simple, you’d think.
 

uiux

Regular
Thanks for sharing, mate. The fact that only one of them is commercially available to pair with makes things fairly simple, you’d think.

Only one can do edge learning and one-shot learning, too
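For the curious, here's a toy illustration of what one-shot learning means: a single labelled example per class becomes that class's prototype, and new inputs are classified by nearest prototype. This is a concept sketch only (the function names and feature vectors are made up for illustration), not BrainChip's actual on-chip learning algorithm.

```python
# Hedged sketch of one-shot learning in the nearest-prototype style:
# learn each class from a single example, then classify new inputs by
# closest prototype. Concept illustration only, not Akida's algorithm.

def fit_one_shot(examples):
    """examples: {label: feature_vector} -- one example per class."""
    return dict(examples)  # each single example *is* the class prototype

def classify(prototypes, x):
    def dist(p):
        return sum((a - b) ** 2 for a, b in zip(p, x))  # squared distance
    return min(prototypes, key=lambda label: dist(prototypes[label]))

# Two tactile "texture" classes, one example each; a new reading is
# assigned to whichever stored prototype it sits closest to.
protos = fit_one_shot({"smooth": [0.1, 0.2], "rough": [0.9, 0.8]})
label = classify(protos, [0.85, 0.75])
```

The appeal for edge devices is that "training" is just storing one vector per class, so new classes can be added on-device without retraining a full network.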
 