Stable Genius
Regular
Add in Stellantis and it covers most of them! The others will fall into line quickly enough to keep up with the competition!
"If we have Valeo and Continental, how many cars is that?"

Fair post Rocket. AND you did not call her a Shorter, which is an interesting comment coming back from her. Why this word?
Fair play, but I never mentioned you being a shorter, and I truly hope you and your mum enjoy the rest of your time together, if true.
If we have Valeo and Continental, how many cars is that? Dio
Sorry about this people, but I couldn't help myself here. Over at the wonderful world that is HC, I came across possibly the weirdest post I have ever seen, sent by the one and only 'The Dean'.
Not only is The Dean somehow trying to defend Shareman; spending time and effort trying to understand this post is probably not a good idea, i.e. you'll probably lose your mind. However, enjoy, haha.
The Dean
11,169 Posts.
1047
23/06/22
14:25
Post #: 62163417
Drawing inspiration from the human sensory nervous system, we have developed an advanced artificial skin known as Asynchronous Coded Electronic Skin (ACES) - an event-based neuro-mimetic architecture that enables asynchronous transmission of tactile information. This novel sensor can detect touch more than 1,000 times faster than the human sensory nervous system and identify the shape, texture and hardness of objects 10 times faster than the blink of an eye.
To break new ground in robotic perception, we are exploring neuromorphic technology – an area of computing that emulates the neural structure and operation of the human brain – to process sensory data from ACES. Intel’s Loihi chip processed the sensory data 21% faster than a top performing graphics processing unit (GPU), while using more than 45 times less power.
Hi Bacon.............I would suspect that last lot of Shorts (2.7 Mil) taken out yesterday (Wed) were all dumped in the morning before midday. If not, then those particular Shorts would currently be at BREAKEVEN pricing on the trade.
5.2 million shares traded today.
Does anyone know if this includes the short sales as well? Recently we have been seeing 3 million short sales each day, so I am just wondering if the total volume includes this too, because if it does, this proves shares are even more tightly held and they'll need some extraordinary news to bring the price further down.
Maybe an add to the iceberg
Tac-01 Sensors | TacniQ
Our Tac-01 sensors are ideal for a wide range of use cases, from raw data collection to higher order tactile interpretations.
tacniq.ai
TECHNOLOGY
Tac-01 Sensors
Our Tac-01 sensors are ideal for a wide range of use cases, from raw data collection to higher order tactile interpretations.
Asynchronous Coded Electronic Skin (ACES)
Drawing inspiration from the human sensory nervous system, we have developed an advanced artificial skin known as Asynchronous Coded Electronic Skin (ACES) - an event-based neuro-mimetic architecture that enables asynchronous transmission of tactile information.
ACES can detect touch more than 1,000 times faster than the human sensory nervous system. For example, it is capable of differentiating physical contact between different sensors in less than 60 nanoseconds — the fastest ever achieved for an electronic skin technology — even with large numbers of sensors. ACES-enabled skin can also accurately identify the shape, texture and hardness of objects within 10 milliseconds, ten times faster than the blinking of an eye. This is enabled by the high fidelity and capture speed of the ACES system.
The ACES platform can also be designed to achieve high robustness to physical damage, an important property for electronic skins because they come into frequent physical contact with the environment. Unlike the current system used to interconnect sensors in existing electronic skins, all the sensors in ACES can be connected to a common electrical conductor, with each sensor operating independently. This allows ACES-enabled electronic skins to continue functioning as long as there is one connection between a sensor and the conductor, making them less vulnerable to damage.
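To make the event-based idea concrete, here is a minimal Python sketch of how an address-event style tactile encoding in the spirit of ACES might work: each taxel emits an independent, timestamped event only when its reading changes, instead of being polled in a fixed scan cycle. The class, field names and threshold are illustrative assumptions, not TacniQ's actual format.

```python
# Hypothetical sketch of event-based (address-event) tactile encoding.
# Each taxel fires an independent, timestamped event only when its
# pressure changes; quiet taxels emit nothing, which keeps the channel
# asynchronous and sparse. Names and thresholds are illustrative only.
from dataclasses import dataclass
import time

@dataclass
class TactileEvent:
    taxel_id: int      # which of the 40 taxels (8x5 grid) fired
    timestamp_ns: int  # nanosecond-resolution event time
    delta: float       # signed pressure change that triggered the event

def encode_events(prev, curr, threshold=0.05):
    """Emit an event for every taxel whose reading changed by more than
    `threshold` since the previous sample."""
    now = time.monotonic_ns()
    return [
        TactileEvent(i, now, c - p)
        for i, (p, c) in enumerate(zip(prev, curr))
        if abs(c - p) > threshold
    ]
```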
Neuromorphic Technology
Event-driven visual-tactile perception system, comprising a novel biologically-inspired tactile sensor and multi-modal spike-based learning.
To break new ground in robotic perception, we are exploring neuromorphic technology – an area of computing that emulates the neural structure and operation of the human brain – to process sensory data from ACES.
We are developing a sensory-integrated artificial brain system that mimics biological neural networks, which can run on a power-efficient neuromorphic processor, such as Intel’s Loihi chip or BrainChip’s Akida neural processor. This novel system integrates ACES and vision sensors, equipping robots with the ability to draw accurate conclusions about the objects they are grasping based on the data captured by the sensors in real time - while operating at a power level efficient enough to be deployed directly inside the robot.
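For a concrete picture of how asynchronous tactile events could be packaged for a spiking processor such as Loihi or Akida, here is a minimal sketch that bins events (using the TactileEvent shape from the earlier sketch) into a dense spike raster. The 1 ms bins and 50 ms window are assumptions for illustration, not how either chip actually ingests data.

```python
import numpy as np

# Illustrative conversion of asynchronous tactile events into a dense
# spike raster (taxels x time bins), a common input format for spiking
# neural networks. Bin width, window and taxel count are assumptions.
def events_to_spike_raster(events, n_taxels=40, window_ms=50, bin_ms=1):
    n_bins = window_ms // bin_ms
    raster = np.zeros((n_taxels, n_bins), dtype=np.uint8)
    if not events:
        return raster
    t0 = min(e.timestamp_ns for e in events)
    for e in events:
        b = int((e.timestamp_ns - t0) // (bin_ms * 1_000_000))
        if b < n_bins:
            raster[e.taxel_id, b] = 1  # a spike: this taxel fired in this bin
    return raster
```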
Intelligent Sensing
Confusion Matrix on the Texture Classification Task. Average accuracy was 94.3%±5.3%.
Confusion Matrix on the Food Identification Task. Overall accuracy was 90%.
Most of today’s robots operate solely based on visual processing, which limits their capabilities. In order to perform more complex tasks, robots have to be equipped with an exceptional sense of touch and the ability to process sensory information quickly and intelligently.
To improve the robot’s perception capabilities, we have developed machine-learning models for tactile perception and inference. Some of these models are synergized with vision sensing to achieve better performance.
Models' capabilities (a toy classifier sketch follows this list):
- determining the right amount of force needed to grasp an object without letting it slip (proprietary - unpublished)
- detecting slip
- classifying object and weight
- classifying texture
- extending robot’s tactile perception through tools and grasped objects (e.g., food classification through a fork grasped by a robotic arm)
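As flagged above, here is a toy nearest-neighbour texture classifier standing in for the (unpublished) models just listed. Real tactile perception models are far more sophisticated; this only shows the shape of the problem: a window of taxel readings in, a texture label out. All data here is random stand-in data.

```python
import numpy as np

# Toy 1-nearest-neighbour texture classifier over taxel time-series.
# A (40 taxels x T samples) window goes in, a texture label comes out.
def classify_texture(window, train_windows, train_labels):
    """window: (40, T) array; train_windows: (N, 40, T); train_labels: (N,)."""
    x = window.ravel()
    flat = train_windows.reshape(len(train_windows), -1)
    dists = np.linalg.norm(flat - x, axis=1)
    return train_labels[int(np.argmin(dists))]

# Usage with random stand-in data:
rng = np.random.default_rng(0)
train = rng.normal(size=(10, 40, 100))
labels = np.array(["rough", "smooth"] * 5)
print(classify_texture(rng.normal(size=(40, 100)), train, labels))
```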
Integrations
Our sensors include various hardware and software integrations that allow you to easily include our tactile intelligence in your application.
Hardware
Robotic Grippers
The sensors have been designed to provide seamless integration with commercial off-the-shelf robotic grippers. We currently support Robotiq 2F grippers with our own custom low-latency C++ drivers.
Software
C++/Python Support
We provide C++/Python APIs to interface with our sensors, robotic grippers and our tactile intelligence algorithms.
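The actual TacniQ API is not shown in this post, so here is only a hypothetical usage sketch: the class and method names below are invented stand-ins that illustrate what a sensor/gripper API of this shape typically looks like, not the real SDK.

```python
# Hypothetical usage sketch only -- class and method names are invented.
class Tac01Sensor:                 # stand-in for the real sensor class
    def read_frame(self):
        """Return one synchronous frame of 40 taxel pressures (8x5 grid)."""
        return [0.0] * 40

class Gripper:                     # stand-in for a Robotiq 2F wrapper
    def close_until(self, max_force):
        print(f"closing with force cap {max_force}")

sensor, gripper = Tac01Sensor(), Gripper()
frame = sensor.read_frame()        # one synchronous taxel frame
if max(frame) < 0.01:              # nothing in the jaws yet
    gripper.close_until(max_force=0.5)
```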
Middleware
ROS/ROS2
Support for ROS is built right into the SDK. We provide ROS nodes and RViz plugins to enable you to integrate the sensors into your robotic applications.
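A minimal ROS 2 (rclpy) subscriber gives a feel for consuming such a node's output. The topic name and message type are assumptions; the post only says ROS nodes and RViz plugins are provided, not what they publish.

```python
# Minimal ROS 2 (rclpy) subscriber sketch. '/tac01/taxels' and the
# Float32MultiArray message type are assumptions for illustration.
import rclpy
from rclpy.node import Node
from std_msgs.msg import Float32MultiArray

class TaxelListener(Node):
    def __init__(self):
        super().__init__('taxel_listener')
        self.create_subscription(
            Float32MultiArray, '/tac01/taxels', self.on_frame, 10)

    def on_frame(self, msg):
        # msg.data would carry the 40 taxel pressures of one frame
        peak = max(msg.data, default=0.0)
        self.get_logger().info(f'peak pressure: {peak:.3f}')

def main():
    rclpy.init()
    rclpy.spin(TaxelListener())

if __name__ == '__main__':
    main()
```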
Get
The sensors and our tactile intelligence algorithms will be available soon as a development kit for purchase.
You let me see it for seconds; I actually didn't want to give them a click. I saw that first and my thought was: why are they advertising themselves with four year old numbers?
HA HA ...............ok, I have just been over on Holy Crapper doing my thing, reporting the shitheads and hopefully getting shareman modded some more! Love it, as I have not posted for ages.
AND I could not help posting an excerpt from FF's post just before (below).
................................................
"BrainChip – Annual General Meeting CEO and Chairman’s Address Sydney 24 May 2022: BrainChip Holdings Ltd (ASX:BRN), appends the Chairman’s address, Chief Executive Officer’s address and presentation to the Annual General Meeting, in accordance with the ASX Listing Rules.
This announcement is authorised for release by the BRN Board of Directors."
CEO Sean Hehir's statement regarding the due diligence he performed before accepting his appointment -
"Being a Silicon Valley based executive I had easy access to some of the world’s best technical minds who I engaged to evaluate the core technology.
The overwhelming feedback was the technology is visionary in its design, unparalleled in flexibility, and transformative in performance.
The Market is moving to the edge, and we are already here...
..........................
Ok above says it all................
Yak52
................................................................................
Waiting to see if it gets MODDED now! lol
Yak52
Short data is out from last Friday; short positions increased to 74 million.
Hopefully I will find a few more talks with Renesas at Embedded World 2022.
Awesome, thanks Rocket.
A bit more info
TACNIQ
tacniq.ai
ARRIVING EARLY 2022
Hello, meet the new Tac-01
The Tac-01 provides robots with the sense of touch via a synergistic combination of tactile sensors and artificial intelligence for intelligent robot manipulation.
APPLICATIONS
Sensors to enable the next generation of perception
Our next generation AI-enabled low-latency tactile sensors for robotic and health applications, built upon a decade of research and testing here in sunny Singapore.
Delicate Grasping · Tactile Visualisation · Slip Detection · Pick and Place
Our sensors enable robots to grasp delicate and fragile objects, without any prior knowledge of the items.
TECHNOLOGY
Neuromorphic sensors for a new class of tactile intelligence
We build upon know-how from the fields of bioengineering and computer science to engineer a sensor that mimics the human sensory system.
Tactile Acumen
Our Asynchronously Coded Electronic Skin (ACES) platform, an event-based neuro-mimetic architecture, enables responsive, asynchronous transmission of tactile information for dexterous manipulation tasks that require rapid detection of object slippage and object hardness.
High-Frequency, Low-Latency Data
Tactile stimuli can be transmitted at up to 4 kHz, faster than the human sensory nervous system.
40 Taxels per Sensor
Each sensor has 40 taxels (tactile pixels) arranged in an 8x5 grid to provide good spatial resolution.
Event-based or Synchronous-based Tactile Data
Work with data in either synchronous or event-based form.
Taxel representation of right and left fingers, grasping a tomato
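To show the two data forms side by side, here is a minimal sketch under the stated 8x5 grid and 4 kHz figures: a synchronous form (dense frames at a fixed rate) converted into an event-based form (only taxels that changed). The threshold and the event tuple layout are assumptions for illustration.

```python
import numpy as np

# Sketch of the two data forms for the 8x5 taxel grid: synchronous
# (dense frames at a fixed rate, here 4 kHz) and event-based (only
# taxels that changed). Shapes and threshold are illustrative.
FRAME_RATE_HZ = 4000          # "up to 4 kHz" per the spec above
GRID_SHAPE = (8, 5)           # 40 taxels

def frames_to_events(frames, threshold=0.05):
    """frames: (T, 8, 5) ndarray -> list of (time_s, row, col, delta)."""
    events = []
    for t in range(1, len(frames)):
        delta = frames[t] - frames[t - 1]
        for r, c in zip(*np.nonzero(np.abs(delta) > threshold)):
            events.append((t / FRAME_RATE_HZ, int(r), int(c),
                           float(delta[r, c])))
    return events
```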
Tactile Intelligence
Our sensors come integrated with our novel machine learning models, which use data from ACES for fast and accurate tactile perception.
Smart Self-Calibration
A built-in self-calibration algorithm allows the user to calibrate the sensors easily, even after many cycles of wear and tear. Our smart calibration routine uses three object types to run the calculations: soft, hard and intermediate.
Slip Detection
The slip detection algorithm is able to recognise when the dynamics of the robot cause the object to slip from the grasp, and to react accordingly.
GraspForceNet
Our novel GraspForceNet AI model is used to ensure that the forces applied by the gripper do not deform or damage the objects.
Tofu grasping without GraspForceNet (left) and with GraspForceNet (right)
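Since slip detection and GraspForceNet are proprietary, here is only a generic stand-in heuristic showing the control idea: treat rapid taxel fluctuation as slip and tighten the grip in small steps, capped so a soft object (tofu!) isn't crushed. All thresholds and the interface are invented for illustration.

```python
import numpy as np

# Generic heuristic standing in for the proprietary slip detection and
# GraspForceNet models. Rapid frame-to-frame taxel fluctuation is
# treated as slip; the grip tightens in small steps under a damage cap.
def is_slipping(recent_frames, vibration_threshold=0.2):
    """recent_frames: (T, 40) taxel history over the last few milliseconds."""
    vibration = np.abs(np.diff(recent_frames, axis=0)).mean()
    return vibration > vibration_threshold

def adjust_grip(force, slipping, step=0.05, max_force=1.0):
    """Tighten slightly on slip; never exceed the damage cap."""
    return min(force + step, max_force) if slipping else force
```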
TAC-01 DEVELOPMENT KIT
Everything you need to get started
Each set of our development kit comes with a pair of the latest Tac-01 sensors, and access to our tactile intelligence APIs with seamless integrations with various robotic grippers.
Next Generation Hardware
Plug-and-play tactile sensors, with an intelligent built-in self-calibration algorithm to last the lifetime of your toughest applications.
Software SDK
Easy-to-use software SDK with visualisers, tactile intelligence APIs and C++, Python, ROS and ROS2 integrations. Supported on Windows, Mac and Linux.
Systems Integration
Out-of-the-box integration with Robotiq 2F grippers with our custom drivers for more responsive control. Support for more types of grippers coming soon.
PS: In future, when you ask someone to hold something for a second and they say no, you will understand why.
One of the references to the primary source document is:
“A. Vanarse, A. Osseiran, A. Rassau, A review of current neuromorphic approaches for vision, auditory, and olfactory sensors. Front. Neurosci. 10, 115 (2016).”
So it seems clear they have at least heard of BrainChip. The full paper is behind a paywall.
No wonder dumb people are clumsy. Who would have thought so much intelligence, science and mathematics went into picking up tofu?
I really need one of these robots to help me take eggs out of the carton without breaking the first 2 or 3.
My opinion only DYOR
FF
AKIDA BALLISTA