TechGirl
Founding Member
Maybe an add to the iceberg
Tac-01 Sensors | TacniQ (tacniq.ai)
TECHNOLOGY
Tac-01 Sensors
Our Tac-01 sensors are ideal for a wide range of use cases, from raw data collection to higher order tactile interpretations.
Asynchronous Coded Electronic Skin (ACES)
Drawing inspiration from the human sensory nervous system, we have developed an advanced artificial skin known as Asynchronous Coded Electronic Skin (ACES), an event-based neuro-mimetic architecture that enables asynchronous transmission of tactile information.
ACES can detect touch more than 1,000 times faster than the human sensory nervous system. For example, it can differentiate physical contact between different sensors in less than 60 nanoseconds, the fastest ever achieved for an electronic skin technology, even with large numbers of sensors. ACES-enabled skin can also accurately identify the shape, texture and hardness of objects within 10 milliseconds, ten times faster than the blink of an eye. This is enabled by the high fidelity and capture speed of the ACES system.
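To make the event-based idea concrete, here is a minimal Python sketch of an asynchronous tactile event stream with nanosecond timestamps. The record fields and the helper function are illustrative assumptions, not the actual ACES encoding.

```python
# Minimal sketch of an asynchronous tactile event stream.
# Field names and encoding are assumptions, not the ACES wire format.
from dataclasses import dataclass

@dataclass
class TactileEvent:
    taxel_id: int      # which tactile pixel fired
    timestamp_ns: int  # nanosecond-resolution event time
    delta: float       # signed pressure change at that taxel

def first_contact(events: list[TactileEvent]) -> int:
    """Return the taxel that registered contact first; nanosecond
    timestamps let contacts only tens of nanoseconds apart be ordered."""
    return min(events, key=lambda e: e.timestamp_ns).taxel_id

# Two near-simultaneous contacts, 60 ns apart.
stream = [
    TactileEvent(taxel_id=12, timestamp_ns=1_000_060, delta=0.8),
    TactileEvent(taxel_id=3, timestamp_ns=1_000_000, delta=0.5),
]
print(first_contact(stream))  # -> 3
```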
The ACES platform can also be designed for high robustness to physical damage, an important property for electronic skins because they come into frequent physical contact with the environment. Unlike the interconnection schemes used in existing electronic skins, all the sensors in ACES can be connected to a common electrical conductor, with each sensor operating independently. This allows ACES-enabled electronic skins to keep functioning as long as at least one connection remains between a sensor and the conductor, making them less vulnerable to damage.
Neuromorphic Technology
[Figure: Event-driven visual-tactile perception system, comprising a novel biologically inspired tactile sensor and multi-modal spike-based learning.]
To break new ground in robotic perception, we are exploring neuromorphic technology – an area of computing that emulates the neural structure and operation of the human brain – to process sensory data from ACES.
We are developing a sensory-integrated artificial brain system that mimics biological neural networks and can run on a power-efficient neuromorphic processor, such as Intel's Loihi chip or BrainChip's Akida neural processor. This novel system integrates ACES and vision sensors, equipping robots with the ability to draw accurate conclusions about the objects they are grasping from the sensor data in real time, while operating at a power level efficient enough to be deployed directly inside the robot.
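For intuition about spike-based processing, the sketch below implements a textbook leaky integrate-and-fire neuron, the basic unit of the spiking models that chips like Loihi and Akida execute. It is a generic teaching example, not TacniQ's network or either vendor's API.

```python
# Toy leaky integrate-and-fire (LIF) neuron: a generic illustration of
# spike-based computation, not TacniQ's model or any vendor API.
def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Integrate input currents; emit a spike (1) when the membrane
    potential crosses threshold, then reset to zero."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0                     # reset after spiking
        else:
            spikes.append(0)
    return spikes

# A burst of tactile activity drives the neuron to spike.
print(lif_neuron([0.3, 0.4, 0.5, 0.1, 0.0, 0.6, 0.7]))  # [0, 0, 1, 0, 0, 0, 1]
```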
Intelligent Sensing
[Figure: Confusion matrix on the texture classification task; average accuracy 94.3% ± 5.3%.]
[Figure: Confusion matrix on the food identification task; overall accuracy 90%.]
Most of today’s robots operate solely based on visual processing, which limits their capabilities. In order to perform more complex tasks, robots have to be equipped with an exceptional sense of touch and the ability to process sensory information quickly and intelligently.
To improve the robot’s perception capabilities, we have developed machine-learning models for tactile perception and inference. Some of these models are synergized with vision sensing to achieve better performance.
Our models' capabilities (a usage sketch follows this list):
- determining the right amount of force to grasp an object without letting it slip (proprietary, unpublished)
- detecting slip
- classifying objects and their weight
- classifying texture
- extending the robot's tactile perception through tools and grasped objects (e.g., food classification through a fork grasped by a robotic arm)
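As a usage sketch, here is how slip detection might look through a Python interface. Every name here (the tacniq package, Sensor, Gripper, SlipDetector, and their methods) is a hypothetical stand-in; the SDK may expose a different interface.

```python
# Hypothetical usage sketch; package, class, and method names are
# stand-ins, not the published SDK.
import tacniq

sensor = tacniq.Sensor.open("tac01-left")        # hypothetical constructor
gripper = tacniq.Gripper.connect("robotiq-2f")   # hypothetical gripper handle
detector = tacniq.SlipDetector(sensor)           # hypothetical model wrapper

def on_slip(event):
    # React to incipient slip by tightening the grip slightly.
    gripper.increase_force(delta=0.1)

detector.on_slip(on_slip)  # register the callback
detector.run()             # stream tactile data and fire callbacks
```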
Integrations
Our sensors include various hardware and software integrations that allow you to easily incorporate our tactile intelligence into your application.
Hardware
Robotic Grippers
The sensors have been designed for seamless integration with commercial off-the-shelf robotic grippers. We currently support Robotiq 2F grippers with our own custom low-latency C++ drivers.
Software
C++/Python Support
We provide C++/Python APIs to interface with our sensors, robotic grippers and our tactile intelligence algorithms.
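Here is a short sketch of commanding a supported gripper alongside the sensors from Python; again, the package and method names are assumptions rather than the documented interface.

```python
# Hypothetical sketch; names are assumptions, not the documented API.
import tacniq

gripper = tacniq.Gripper.connect("robotiq-2f")  # hypothetical driver handle
gripper.move(width_mm=40, force_n=10)           # hypothetical: close to 40 mm at 10 N
frame = gripper.sensors.read_frame()            # hypothetical: fingertip tactile frame
```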
Middleware
ROS/ROS2
Support for ROS is built right into the SDK. We provide ROS nodes and RViz plugins to enable you to integrate the sensors into your robotic applications.
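A minimal ROS (Python) subscriber for tactile data might look like the sketch below. The topic name and message type are assumptions; the SDK's ROS nodes define the real ones.

```python
# Hypothetical ROS subscriber; topic name and message type are assumptions.
import rospy
from std_msgs.msg import Float32MultiArray  # stand-in for the SDK's tactile message

def on_tactile(msg):
    # msg.data would hold per-taxel pressure values (40 per sensor).
    rospy.loginfo("max taxel pressure: %.3f", max(msg.data))

rospy.init_node("tactile_listener")
rospy.Subscriber("/tac01/tactile", Float32MultiArray, on_tactile)  # hypothetical topic
rospy.spin()
```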
Awesome, thanks Rocket
A bit more info
TACNIQ
tacniq.ai
ARRIVING EARLY 2022
Hello, meet the new Tac-01
The Tac-01 provides robots with the sense of touch via a synergistic combination of tactile sensors and artificial intelligence for intelligent robot manipulation. The sensors and our tactile intelligence algorithms will be available soon as a development kit for purchase.
APPLICATIONS
Sensors to enable the next generation of perception
Our next-generation AI-enabled, low-latency tactile sensors for robotic and health applications, built upon a decade of research and testing here in sunny Singapore.
Delicate Grasping · Tactile Visualisation · Slip Detection · Pick and Place
Our sensors enable robots to grasp delicate and fragile objects without any prior knowledge of the items.
TECHNOLOGY
Neuromorphic sensors for a new class of tactile intelligence
We build upon know-how from the fields of bioengineering and computer science to engineer a sensor that mimics the human sensory system.
Tactile Acumen
Our Asynchronously Coded Electronic Skin (ACES) platform, an event-based neuro-mimetic architecture, enables responsive, asynchronous transmission of tactile information for dexterous manipulation tasks that require rapid detection of object slippage and object hardness.
High-Frequency, Low-Latency Data
Tactile stimuli can be transmitted at up to 4 kHz, faster than the human sensory nervous system.
40 Taxels per Sensor
Each sensor has 40 taxels (tactile pixels) arranged in an 8x5 grid to provide good spatial resolution.
Event-based or Synchronous Tactile Data
Work with data in either synchronous (frame-based) or event-based form.
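To illustrate the two data views, the sketch below builds a synchronous 8x5 frame and derives a simple event list from it with numpy. The flat 40-value layout, row-major ordering, and thresholding are assumptions for illustration only.

```python
# Illustration of synchronous (frame) vs. event-based views of 40 taxels.
# Layout and thresholding are assumptions, not the SDK's actual formats.
import numpy as np

flat = np.random.rand(40)   # stand-in for one synchronous sensor reading
frame = flat.reshape(8, 5)  # the 8x5 taxel grid

# Event-based view: report only taxels whose pressure changed enough.
previous = np.zeros(40)
changed = np.abs(flat - previous) > 0.5
events = [(int(i), float(flat[i])) for i in np.flatnonzero(changed)]
print(frame.shape, len(events))
```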
Tactile Intelligence
Our sensors come integrated with our novel machine learning models, which use data from ACES for fast and accurate tactile perception.
Smart Self-Calibration
A built-in self-calibration algorithm allows the user to calibrate the sensors easily, even after many cycles of wear and tear.
Slip Detection
The slip detection algorithm recognises when the robot's dynamics cause a grasped object to slip, and reacts accordingly.
GraspForceNet
Our novel GraspForceNet AI model ensures that the forces applied by the gripper do not deform or damage the object.
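As a rough illustration of force-bounded grasping, here is a toy closed-loop controller that nudges the commanded grip force toward a model-predicted safe limit and never exceeds it. This is a generic sketch; GraspForceNet's actual inputs and architecture are unpublished.

```python
# Toy force-bounded grip controller; a generic sketch, not GraspForceNet.
def regulate_grip(measured_force, predicted_safe_force, current_cmd, gain=0.2):
    """Move the commanded force toward the predicted safe force,
    hard-capped so the object is never squeezed beyond that limit."""
    error = predicted_safe_force - measured_force
    new_cmd = current_cmd + gain * error
    return min(new_cmd, predicted_safe_force)

print(regulate_grip(measured_force=2.0, predicted_safe_force=5.0, current_cmd=2.5))  # 3.1
```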
TAC-01 DEVELOPMENT KIT
Everything you need to get started
Each development kit comes with a pair of the latest Tac-01 sensors and access to our tactile intelligence APIs, with seamless integration with various robotic grippers.