BRN Discussion Ongoing

TechGirl

Founding Member
Maybe an add to the iceberg



TECHNOLOGY​

Tac-01 Sensors
Our Tac-01 sensors are ideal for a wide range of use cases, from raw data collection to higher-order tactile interpretation.

Asynchronous Coded Electronic Skin (ACES)​

sensor_aces.JPG

Asynchronous Coded Electronic Skin (ACES)

Drawing inspiration from the human sensory nervous system, we have developed an advanced artificial skin known as Asynchronous Coded Electronic Skin (ACES), an event-based neuro-mimetic architecture that enables asynchronous transmission of tactile information.
ACES can detect touch more than 1,000 times faster than the human sensory nervous system. For example, it is capable of differentiating physical contact between different sensors in less than 60 nanoseconds, the fastest ever achieved for an electronic skin technology, even with large numbers of sensors. ACES-enabled skin can also accurately identify the shape, texture and hardness of objects within 10 milliseconds, ten times faster than the blink of an eye. This is enabled by the high fidelity and capture speed of the ACES system.
The ACES platform can also be designed for high robustness to physical damage, an important property for electronic skins because they come into frequent physical contact with the environment. Unlike the interconnection schemes used in existing electronic skins, all the sensors in ACES can be connected to a common electrical conductor, with each sensor operating independently. This allows ACES-enabled electronic skins to keep functioning as long as there is one connection between the sensor and the conductor, making them less vulnerable to damage.
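As a minimal sketch of how many independent sensors could share one conductor, assume each taxel emits a unique signature onto the common line and a decoder identifies firing taxels by correlating against the known signatures. The orthogonal signatures and threshold below are illustrative assumptions, not the published ACES encoding.

```python
# Minimal sketch (NOT the actual ACES encoding): each taxel superimposes a
# unique signature on one shared line; the decoder recovers which taxels
# fired by correlating the line signal against every known signature.
import numpy as np

N_TAXELS, SIG_LEN = 40, 64
rng = np.random.default_rng(0)

# Orthonormal signatures make this toy decoder exact; a real encoding would
# have to tolerate noise and partially overlapping pulses.
q, _ = np.linalg.qr(rng.standard_normal((SIG_LEN, SIG_LEN)))
signatures = q[:N_TAXELS]                  # one signature (row) per taxel

def transmit(active):
    """All firing taxels superimpose their signatures on the common line."""
    return signatures[active].sum(axis=0)

def decode(line):
    """Correlation is 1 for firing taxels, 0 for the rest (orthonormal rows)."""
    return np.flatnonzero(signatures @ line > 0.5)

print(decode(transmit([3, 17, 29])))       # -> [ 3 17 29]
```

Because every sensor only ever writes to the one shared line, cutting any other trace leaves it functional, which is the robustness property described above.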

Neuromorphic Technology​

sensor_vtsnn.png

Event-driven visual-tactile perception system, comprising a novel biologically-inspired tactile sensor and multi-modal spike-based learning.

To break new ground in robotic perception, we are exploring neuromorphic technology – an area of computing that emulates the neural structure and operation of the human brain – to process sensory data from ACES.
We are developing a sensory-integrated artificial brain system that mimics biological neural networks and can run on a power-efficient neuromorphic processor, such as Intel’s Loihi chip or BrainChip’s Akida neural processor. This novel system integrates ACES and vision sensors, equipping robots with the ability to draw accurate conclusions about the objects they are grasping from the sensor data in real time, while operating at a power level efficient enough to be deployed directly inside the robot.
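For intuition only, here is a toy leaky integrate-and-fire neuron fusing a tactile and a visual event stream in plain Python. It is not the actual visual-tactile spiking network, and nothing here targets Loihi or Akida; all rates and weights are invented.

```python
# Toy multi-modal spike fusion: one leaky integrate-and-fire (LIF) neuron
# driven by a tactile and a visual spike train. Illustrative only.
import numpy as np

def lif(spike_inputs, weights, tau=0.9, v_th=1.0):
    """Leaky integration of weighted input spikes; emit 1 on threshold."""
    v, out = 0.0, []
    for s in spike_inputs:            # s: spikes from each modality at one step
        v = tau * v + float(weights @ s)
        if v >= v_th:
            out.append(1)
            v = 0.0                   # reset membrane after firing
        else:
            out.append(0)
    return out

T = 20
rng = np.random.default_rng(1)
tactile = rng.binomial(1, 0.3, size=(T, 1))   # made-up taxel event train
vision  = rng.binomial(1, 0.2, size=(T, 1))   # made-up DVS event train
print(lif(np.hstack([tactile, vision]), weights=np.array([0.6, 0.5])))
```

The appeal of this style of computation is that the neuron does work only when events arrive, which is what makes processors like Loihi and Akida power-efficient hosts for it.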

Intelligent Sensing​

sensor_textureclassification.png

Confusion Matrix on the Texture Classification Task. Average accuracy was 94.3%±5.3%.
sensor_foodclassification.png

Confusion Matrix on the Food Identification Task. Overall accuracy was 90%.
Most of today’s robots operate solely on visual processing, which limits their capabilities. To perform more complex tasks, robots need an exceptional sense of touch and the ability to process sensory information quickly and intelligently.
To improve a robot’s perception capabilities, we have developed machine-learning models for tactile perception and inference. Some of these models are combined with vision sensing to achieve better performance.
The models' capabilities include the texture classification and food identification tasks shown in the confusion matrices above.
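As a hedged illustration of the texture classification capability, here is a minimal sketch on synthetic stand-in data; the vendor's actual models, features and training data are not described above.

```python
# Sketch: texture classification from taxel recordings. The data is
# synthetic (each "texture" vibrates the array at its own frequency) and
# the feature/classifier choices are illustrative guesses.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
N_PER_CLASS, T, TAXELS, CLASSES = 50, 100, 40, 4

X, y = [], []
for c in range(CLASSES):
    for _ in range(N_PER_CLASS):
        t = np.arange(T)
        sig = np.sin(2 * np.pi * (c + 1) * t / T)[:, None] \
              + 0.3 * rng.standard_normal((T, TAXELS))
        # Feature: magnitude spectrum of the taxel-averaged signal.
        X.append(np.abs(np.fft.rfft(sig.mean(axis=1))))
        y.append(c)

Xtr, Xte, ytr, yte = train_test_split(np.array(X), np.array(y),
                                      stratify=y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
print(f"held-out accuracy: {clf.score(Xte, yte):.2f}")
```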

Integrations​

Our sensors include various hardware and software integrations that let you easily add our tactile intelligence to your application.
sensor_integrations_gripper.JPG

Hardware
Robotic Grippers
The sensors have been designed to provide seamless integrations with commercial off-the-shelf robotic grippers. We currently support Robotiq 2F grippers with our own custom low-latency C++ drivers.
sensor_integrations_python_code.png

Software
C++/Python Support
We provide C++/Python APIs to interface with our sensors, robotic grippers and our tactile intelligence algorithms.
sensor_integrations_ros.png

Middleware
ROS/ROS2
Support for ROS is built right into the SDK. We provide ROS nodes and RViz plugins to enable you to integrate the sensors into your robotic applications.
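For a flavour of what consuming that stream might look like, here is a minimal ROS 1 (rospy) subscriber. The topic name "/tac01/taxels" and the Float32MultiArray layout are assumptions; the SDK's actual node and topic names are not shown here.

```python
#!/usr/bin/env python3
# Hypothetical ROS consumer of the tactile stream; topic and layout assumed.
import rospy
from std_msgs.msg import Float32MultiArray

def on_taxels(msg):
    # Assume 40 pressure values in an 8x5 row-major layout.
    rospy.loginfo("peak taxel pressure: %.3f", max(msg.data))

if __name__ == "__main__":
    rospy.init_node("tac01_listener")
    rospy.Subscriber("/tac01/taxels", Float32MultiArray, on_taxels)
    rospy.spin()  # process callbacks until shutdown
```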

Get​


Awesome, thanks Rocket

A bit more info




ARRIVING EARLY 2022

Hello, meet the new Tac-01​

The Tac-01 provides robots with the sense of touch via a synergistic combination of tactile sensors and artificial intelligence for intelligent robot manipulation.


The sensors and our tactile intelligence algorithms will be available soon as a development kit for purchase.
Cover Photo

APPLICATIONS​

Sensors to enable the next generation of perception

Our next generation AI-enabled low-latency tactile sensors for robotic and health applications, built upon a decade of research and testing here in sunny Singapore.

Delicate Grasping · Tactile Visualisation · Slip Detection · Pick and Place

Our sensors enable grippers to grasp delicate and fragile objects without any prior knowledge of the items.

TECHNOLOGY​

Neuromorphic sensors for a new class of tactile intelligence​

We build upon know-how from the fields of bioengineering and computer science to engineer a sensor that mimics the human sensory system.

Tactile Acumen​

Our Asynchronously Coded Electronic Skin (ACES) platform, an event-based neuro-mimetic architecture, enables responsive, asynchronous transmission of tactile information for dexterous manipulation tasks that require rapid detection of object slippage and object hardness.

High-Frequency, Low-Latency Data
Tactile stimuli can be transmitted at up to 4 kHz, faster than the human sense of touch.

40 Taxels per Sensor
Each sensor has 40 taxels (tactile pixels) arranged in an 8x5 grid to provide good spatial resolution.

Event-Based or Synchronous Tactile Data
Work with the data in either synchronous (frame-based) or event-based form (a conversion sketch follows the image below).
index_left_right_spatial_view.png
Taxel representation of right and left fingers, grasping a tomato
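A minimal sketch of how the two forms relate, converting synchronous 8x5 frames into change events; the threshold, the 4 kHz timestamping and the (t, row, col, sign) event format are assumptions for illustration, not the SDK's actual format.

```python
# Sketch: derive an event stream from synchronous 8x5 taxel frames by
# emitting an event whenever a taxel changes by more than a threshold.
import numpy as np

def frames_to_events(frames, threshold=0.05, rate_hz=4000):
    """Return (t, row, col, sign) tuples for every sufficiently large change."""
    events, prev = [], frames[0]
    for i, frame in enumerate(frames[1:], start=1):
        delta = frame - prev
        for r, c in zip(*np.nonzero(np.abs(delta) > threshold)):
            events.append((i / rate_hz, int(r), int(c),
                           int(np.sign(delta[r, c]))))
        prev = frame
    return events

rng = np.random.default_rng(0)
frames = rng.random((8, 8, 5)) * 0.02    # 8 near-static frames of 8x5 taxels
frames[5, 3, 2] += 0.5                   # a touch lands at row 3, col 2
print(frames_to_events(frames))          # on/off event pair at (3, 2)
```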

Tactile Intelligence​

Our sensors come integrated with our novel machine learning models, which use data from ACES for fast and accurate tactile perception.

Smart Self-Calibration
A built-in self-calibration algorithm allows the user to calibrate the sensors easily, even after many cycles of wear and tear.

Slip Detection
The slip detection algorithm recognises when the dynamics of the robot cause the object to slip from the grasp, and reacts accordingly (a toy version is sketched after the image below).

GraspForceNet
Our novel GraspForceNet AI model ensures that the forces applied by the gripper do not deform or damage the objects.

index_graspforce.png
Tofu grasping without GraspForceNet (left) and with GraspForceNet (right)
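The toy slip detector mentioned above: it flags slip when the high-frequency vibration energy across the taxel array jumps, a common heuristic in the tactile-sensing literature. The feature and threshold are illustrative assumptions, not the vendor's algorithm.

```python
# Toy slip detector: slip tends to show up as broadband micro-vibration
# across the taxels, so we monitor the energy of frame-to-frame changes.
import numpy as np

def slip_score(window):
    """window: (T, 40) recent taxel readings; return vibration energy."""
    hf = np.diff(window, axis=0)      # crude high-pass: frame deltas
    return float(np.mean(hf ** 2))

def is_slipping(window, threshold=1e-3):
    return slip_score(window) > threshold

rng = np.random.default_rng(0)
stable   = 0.5 + 0.001 * rng.standard_normal((32, 40))  # steady grasp
slipping = 0.5 + 0.05 * rng.standard_normal((32, 40))   # object vibrating
print(is_slipping(stable), is_slipping(slipping))       # False True
```

A real controller would follow a positive detection by increasing grip force, which is presumably where a model like GraspForceNet bounds the correction so the object is not crushed.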

TAC-01 DEVELOPMENT KIT​

Everything you need to get started
Each development kit comes with a pair of the latest Tac-01 sensors and access to our tactile intelligence APIs, with seamless integration with various robotic grippers.

Next Generation Hardware​

Plug-and-play tactile sensors with an intelligent built-in self-calibration algorithm, designed to last the lifetime of your toughest applications.

Software SDK​

Easy-to-use software SDK with visualisers, tactile intelligence APIs, and C++, Python, ROS and ROS2 integrations. Supported on Windows, Mac and Linux.

Systems Integration​

Out-of-the-box integration with Robotiq 2F grippers via our custom drivers for more responsive control. Support for more types of grippers coming soon.
 
  • Like
  • Love
  • Fire
Reactions: 24 users

jtardif999

Regular
Thinking about one of our ‘unfair’ advantages, being able to run multiple modalities in parallel (as pointed to in the response to the nVisio webinar Q&A): this could enable the best kind of security, in use cases where visually identifying the owner of a device stops voice activation alone from being spoofed. In this way Akida technology is like no other, as potentially an independent, remote security check of an owner's credentials. Multiple-modality use cases will eventually proliferate imo; everything will probably have a form of multi-dimensional AI, and Akida is the only way it can currently be achieved.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 13 users

TechGirl

Founding Member

[Same Tac-01 “ARRIVING EARLY 2022” page as in the post above, with two details that post did not include:]

index_sensor_hw.JPG
Raw sensor layer, below the protective "skin"

Our smart calibration routine uses three object types to run the calculations: soft, hard and intermediate.

Hahaha too funny, great minds think alike
 
  • Like
Reactions: 6 users

TechGirl

Founding Member
Maybe an add to the iceberg



[The same Tac-01 “TECHNOLOGY” page as in the first post above.]



https://tacniq.ai/videos/systems_pickandplace.mp4
 
  • Like
  • Love
  • Fire
Reactions: 7 users

cosors

👀
HA HA ...............ok I have just been over on Holy Crapper doing my thing reporting the shitheads and hopefully getting shareman modded some more! Love it as I have not posted for ages.

AND I could not help posting an excerpt from FF's post just before (below)
................................................

"BrainChip – Annual General Meeting CEO and Chairman’s Address Sydney 24 May 2022: BrainChip Holdings Ltd (ASX:BRN), appends the Chairman’s address, Chief Executive Officer’s address and presentation to the Annual General Meeting, in accordance with the ASX Listing Rules.

This announcement is authorised for release by the BRN Board of Directors."

CEO Sean Hehir's statement regarding the due diligence process he undertook before accepting his appointment -
"Being a Silicon Valley based executive I had easy access to some of the world’s best technical minds who I engaged to evaluate the core technology.


The overwhelming feedback was the technology is visionary in its design, unparalleled in flexibility, and transformative in performance.

The Market is moving to the edge, and we are already here...

..........................

Ok above says it all................
Yak52
................................................................................

Waiting to see if it gets MODDED now! lol

Yak52 :cool:
You let me see in for a few seconds; I actually didn't want to give them a click. I saw that first, and my thought was: why are they advertising themselves with four-year-old numbers? :unsure:
Screenshot_2022-06-23-07-18-15-87_40deb401b9ffe8e1df2f1cc5ba480b12.jpg


__________
How many remain if you delete the non-unique ones, i.e. the multi-avatars?
 
Last edited:
  • Like
  • Haha
Reactions: 8 users

Yak52

Regular
Short data is out from last Friday; short positions increased to 74 million.

Hi mkg6R

As I have posted recently, I will add it again and update your post a bit. I suggest you get the info direct from the ASX website, not from third parties.

ASX DATA for Tuesday 21st June 2022. TOTAL = 93,761,681 Covered/closed = 52,869,733
AsX Short positions for Tuesday 21 June 2022.jpg




Yak52
 
  • Like
  • Fire
Reactions: 13 users
Hopefully I will find a few more talks with Renesas at Embedded World 2022


Hmmm. When discussing voice activation, the engineer did mention the word “spike”.
It is not written in any of the specs, but it was verbally articulated. Now I wonder...
 
  • Like
  • Fire
  • Thinking
Reactions: 17 users
[Quote: the “Awesome, thanks Rocket” Tac-01 post from above.]
One of the references to the primary source document is:


A. Vanarse, A. Osseiran, A. Rassau, “A review of current neuromorphic approaches for vision, auditory, and olfactory sensors,” Front. Neurosci. 10, 115 (2016).

So it seems clear they have at least heard of BrainChip. The full paper is behind a paywall.

No wonder dumb people are clumsy. Who would have thought so much intelligence, science and mathematics went into picking up tofu?

I really need one of these robots to help me take eggs out of the carton without breaking the first 2 or 3. 😂🤣😎

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Haha
  • Like
Reactions: 24 users
[Quote of the post above.]
PS: In future, when you say to someone “can you hold this for a second?” and they say no, you will understand why.

So be sympathetic; it’s not that they are lazy or selfish. 🧐😂🤣😂
FF

AKIDA BALLISTA
 
  • Haha
  • Like
  • Love
Reactions: 11 users

TechGirl

Founding Member
[Quote of FF’s post above about the Vanarse, Osseiran and Rassau reference.]

Yes, I noticed that too. I was actually looking for PVDM's name in the references, but once I came across these BrainChip guys' names instead I was somewhat satisfied :ROFLMAO:

Robot Reaction GIF
 
  • Like
  • Haha
  • Love
Reactions: 22 users
Hmmm. When discussing voice activation, the engineer did mention the word “spike”.
It is not written in any of the specs, but it was verbally articulated. Now I wonder...


I really hope Akida is in it. It looks like the product will have multiple uses and would be a nice earner.

Waiting for the like from Rob and AM! 😂
 
  • Like
  • Fire
  • Love
Reactions: 18 users

Diogenese

Top 20
Now here's a thing:

We have 40k-odd shareholders.

We have 500k+ unique users of MetaTF.

So, uiux aside, who's not on board?
 
  • Like
  • Haha
  • Fire
Reactions: 17 users

TechGirl

Founding Member
hmmmmm Prophesee liking Lucid

We’re seeing some great crowds at #CVPR2022 in New Orleans this year! Stop by our booth to see our newest machine vision cameras, including our event-based Triton® camera, featuring PROPHESEE’s Metavision® sensor, and the new Triton Edge all-in-one edge computing camera.


prolike.jpg
 
  • Like
  • Fire
  • Love
Reactions: 31 users

Diogenese

Top 20
Now here's a thing:

We have 40k-odd shareholders.

We have 500k+ unique users of MetaTF.

So, uiux aside, who's not on board?
We're gonna need a bigger bus!
 
  • Haha
  • Like
  • Fire
Reactions: 26 users
It looks exactly the same as the Continental website. Continental might be very big for us.

@Reuben I posted the below a while back. It was optimistic dreaming at the time. However, with the recent potential Sony links via Prophesee, and you (and others) mentioning Continental, I’m beginning to dream again


https://thestockexchange.com.au/threads/brn-discussion-2022.1/post-30963


I saw back in the ASX Announcement: Update for the March 2019 Quarter that both GM and Ford were mentioned. It looks like things were looser back then, as the companies listed are not confirmed as being associated with BrainChip. However, some of the other companies listed are now directly involved with BrainChip - well, Valeo and Safran anyway, along with Ford

* How good would it be if the companies named in the ASX announcement were the actual list - and BrainChip are checking them off one by one

Other companies on the list discussed in this forum:
1647381928406.png
 

Attachments

  • 0C67149B-1CD6-4EAD-AEAE-460A242408DD.jpeg (655.8 KB)
Last edited:
  • Like
  • Fire
  • Love
Reactions: 43 users

Yak52

Regular
[Quote: Yak52’s “HA HA” post, shown in cosors’ reply above.]

I also posted the info below in response to another poster over "there" on Holy Crapper.

I waited to see how long before being MODDED! And yes, it's happened. lol
---------------------------------------------------------------------------------------------------------------------------
(crapper post)
You would not be aware of this info, being over here in this forum, BUT................

BRN has plans, which are nearly complete, to list on the NASDAQ in the future; this has been mentioned in BRN announcements.

Apparently it's just a matter of timing, as all or most requirements have been completed to date.

Also, they have cash in hand to see out next year, and then still have the LDA facility available (2024-25) for more, which is unlikely to be needed considering the revenue expected before then.

Yak52
-----------------------------------------------------------------------------------------------------------------------------

SO..........PREDICTABLE!
Yak52 :cool:
 
  • Haha
  • Like
  • Fire
Reactions: 13 users

ndefries

Regular
We have never been shorted as much as we are now

1655964765634.png
 
  • Sad
  • Like
  • Wow
Reactions: 18 users

robsmark

Regular
  • Like
Reactions: 5 users

Deena

Regular
  • Like
Reactions: 3 users

ndefries

Regular
But that is dated 17 June. What is the situation now, ndefries?
This is the latest available. There is always a four-trading-day reporting delay.
 
  • Like
Reactions: 5 users