BRN Discussion Ongoing

Learning

Learning to the Top 🕵‍♂️
Anyone care to speculate why the change in office address?
To be a Tech Power House,

You need to be in the Central Business District of Australia!😉😎🥳

It's great to be a shareholder 🏖
 
Reactions: 13 users

TechGirl

Founding Member
Reactions: 35 users

MDhere

Top 20
Who on earth hacked my car radio on my way home yesterday??? FF... do you have superpowers?
[Attachment: 20221023_175106.jpg]
 
Reactions: 18 users

buena suerte :-)

BOB Bank of Brainchip
Reactions: 13 users
Reactions: 20 users
Ok Nintendo possible but not established.

MegaChips in the BRN bag.

I have watched the video and expect this is something that would require an intelligent sensor with a processor to achieve.

But is there something more that a non-gamer like me is not picking up, or is this just the starting point for 1,000 Eye research?

Regards
FF

AKIDA BALLISTA
 
Reactions: 19 users
I just received the following unsolicited email from Tony Dawe. I can only assume he has had quite a few (possibly hundreds) of enquiries regarding today's announcement. I was pretty close with my guess, except it was Boardroom which had its lease expire.


Tony Dawe

2:26 PM (6 minutes ago)
to me
Hi Fact Finder

Our share registrar, Boardroom Limited, changed address and BrainChip’s Registered Office is located at Boardroom Limited’s address.

It’s as simple as that.

Regards


Tony Dawe
Investor Relations Manager
+61 (0)405 989 743
 
Reactions: 48 users
This looks like a Nintendo Switch controller to me. Not sure if it's directly from Nintendo or from someone independent.



It's great to be a shareholder 🏖

I'm getting "Ready Player One" vibes off this.
 
Reactions: 10 users
[Quoting Fact Finder's earlier post, above, with Tony Dawe's email explaining that the Registered Office moved because share registrar Boardroom Limited changed address.]
I think some must have been taking Tech seriously about Perth being the head of all things BrainChip. Unfortunately the law is the law: the Registered Office of BrainChip has been Sydney and continues to be Sydney, and under the ASX rules the AGM of a publicly listed company has to be held where its Registered Office is located. During Covid there were special exemptions, but Covid no longer has an influence and the law is the law.

Not an opinion, just the way it is, but DYOR.
FF

AKIDA BALLISTA
 
Reactions: 25 users

Dang Son

Regular
Because 210 George Street is pretty swish!!
Keeping up Appearances
 
Reactions: 7 users
Ok Nintendo established.

MegaChips in the BRN bag.

I have watched the video and expect this is something that would require an intelligent sensor with processor to achieve.

But is there something more that a non gamer like me is not picking up or is this just the starting point for 1,000 Eye research.

Regards
FF

AKIDA BALLISTA
I truly hope it is Nintendo. Controllers would be a great starting point!

As a gamer though, I have dabbled with third party accessories. There are a lot of third party accessories that use the phrase “joy con” so it could just be some tech geek who has been messing around with their own controller. Any clues as to whether the poster of the video works for Nintendo?
 
Reactions: 8 users
[Quoting the post above about third-party "joy con" accessories.]
Thank you. This is the value of having such a diverse and knowledgeable group here at TSEx.
Will amend my post.

Regards
FF

AKIDA BALLISTA
 
Reactions: 7 users
Well, the McKinsey Aug 22 report is just catching on haha

Haven't read, just keyword searched.

Tech Trends Outlook


[Attachment: IMG_20221024_121359.jpg]
 
Reactions: 23 users
IS BRAINCHIP AKIDA TECHNOLOGY THE NEXT BIG BANG IN AI???

IN MY OPINION IT ALREADY IS. BUT WHAT WOULD I KNOW? EVERYONE I READ WHO HAS THE CREDENTIALS TO KNOW IS SCREAMING AT THE TOP OF THEIR LUNGS THAT POWER AND HEAT HAVE REACHED UNSUSTAINABLE LEVELS IN THE TECHNOLOGY WORLD, AND AKIDA SOLVES BOTH ISSUES BEFORE IT EVEN STARTS TO DISPLAY ITS REVOLUTIONARY, BEAST-LIKE PROCESSING ABILITIES, INCLUDING AS WE ALL KNOW ONE-SHOT/SEVERAL-SHOT/INCREMENTAL LEARNING, ON-CHIP CONVOLUTION, CNN-TO-SNN CONVERSION, AND THE CAPACITY TO OPERATE COMPLETELY SELF-CONTAINED WITHOUT A CLOUD CONNECTION WHILE REMAINING PROCESS AND SENSOR AGNOSTIC:


Has There Been A Second AI Big Bang?

Calum Chace
Contributor
"The AI guy"
Oct 18, 2022, 05:30am EDT


Aleksa Gordic, an AI researcher with DeepMind


The first Big Bang in 2012

The Big Bang in artificial intelligence (AI) refers to the breakthrough in 2012, when a team of researchers led by Geoff Hinton managed to train an artificial neural network (known as a deep learning system) to win an image classification competition by a surprising margin. Prior to that, AI had performed some remarkable feats, but it had never made much money. Since 2012, AI has helped the big technology companies to generate enormous wealth, not least from advertising.


A second Big Bang in 2017?

Has there been a new Big Bang in AI, since the arrival of Transformers in 2017? In episodes 5 and 6 of the London Futurist podcast, Aleksa Gordic explored this question, and explained how today’s cutting-edge AI systems work. Aleksa is an AI researcher at DeepMind, and previously worked in Microsoft’s Hololens team. Remarkably, his AI expertise is self-taught – so there is hope for all of us yet!

Transformers

Transformers are deep learning models which process inputs expressed in natural language and produce outputs like translations, or summaries of texts. Their arrival was announced in 2017 with the publication by Google researchers of a paper titled “Attention is All You Need”. The title referred to the fact that Transformers can “pay attention” to a large corpus of text simultaneously, whereas their predecessors, Recurrent Neural Networks, could only pay attention to the symbols on either side of the segment of text being processed.


Transformers work by splitting text into small units, called tokens, and mapping them onto vectors in high-dimensional spaces - often thousands of dimensions. We humans cannot envisage this. The space we inhabit is defined by three numbers – or four, if you include time – and we simply cannot imagine a space with thousands of dimensions. Researchers suggest that we shouldn’t even try.
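The tokenise-then-embed step described above can be sketched in a few lines of Python. Everything here is invented for illustration: real Transformers use learned subword tokenisers and embeddings with thousands of dimensions, not a whitespace split and a hand-written 4-dimensional table.

```python
# Toy illustration of tokenisation + embedding lookup.
# The vocabulary and 4-dimensional vectors below are made up;
# real models learn these from data.

EMBEDDINGS = {
    "the":     (0.1, 0.3, -0.2, 0.5),
    "queen":   (0.9, 0.1,  0.8, 0.2),
    "wears":   (0.0, 0.7, -0.1, 0.4),
    "a":       (0.2, 0.2,  0.1, 0.0),
    "slipper": (0.8, -0.3, 0.6, 0.1),
}

def tokenise(text):
    # Crude whitespace tokeniser, standing in for a subword tokeniser.
    return text.lower().split()

def embed(tokens):
    # Map each token to its vector; unknown tokens get a zero vector.
    dim = 4
    return [EMBEDDINGS.get(t, (0.0,) * dim) for t in tokens]

tokens = tokenise("The queen wears a slipper")
vectors = embed(tokens)
print(tokens)        # ['the', 'queen', 'wears', 'a', 'slipper']
print(len(vectors))  # 5 — one vector per token
```

From here, a real model's attention layers operate on these vectors rather than on the raw text.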

Dimensions and vectors

For Transformer models, words and tokens have dimensions. We might think of them as properties, or relationships. For instance, “man” is to “king” as “woman” is to “queen”. These concepts can be expressed as vectors, like arrows in three-dimensional space. The model will attribute a probability to a particular token being associated with a particular vector. For instance, a princess is more likely to be associated with the vector which denotes “wearing a slipper” than to the vector that denotes “wearing a dog”.
There are various ways in which machines can discover the relationships, or vectors, between tokens. In supervised learning, they are given enough labelled data to indicate all the relevant vectors. In self-supervised learning, they are not given labelled data, and they have to find the relationships on their own. This means the relationships they discover are not necessarily discoverable by humans. They are black boxes. Researchers are investigating how machines handle these dimensions, but it is not certain that the most powerful systems will ever be truly transparent.
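The king/queen relationship mentioned above is often demonstrated with vector arithmetic: subtracting the "man" vector from "king" and adding "woman" should land closest to "queen". A minimal sketch, with 3-dimensional vectors hand-picked purely so the analogy works (real embeddings are learned and far larger):

```python
# Toy word vectors chosen so that king - man + woman ≈ queen.
import math

VEC = {
    "king":  (0.9, 0.8, 0.1),
    "man":   (0.5, 0.1, 0.1),
    "woman": (0.5, 0.1, 0.9),
    "queen": (0.9, 0.8, 0.9),
}

def add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cosine(a, b):
    # Cosine similarity: 1.0 means the vectors point the same way.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

analogy = add(sub(VEC["king"], VEC["man"]), VEC["woman"])
best = max(VEC, key=lambda w: cosine(VEC[w], analogy))
print(best)  # queen
```

In this contrived setup the analogy vector equals "queen" exactly; in trained embeddings it only lands *near* it, which is why the nearest-neighbour lookup matters.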

Parameters and synapses

The size of a Transformer model is normally measured by the number of parameters it has. A parameter is analogous to a synapse in a human brain, which is the point where the tendrils (axons and dendrites) of our neurons meet. The first Transformer models had a hundred million or so parameters, and now the largest ones have trillions. This is still smaller than the number of synapses in the human brain, and human neurons are far more complex and powerful creatures than artificial ones.


Not by text alone

A surprising discovery made a couple of years after the arrival of Transformers was that they are able to tokenise not just text, but images too. Google released the first vision Transformer in late 2020, and since then people around the world have marvelled at the output of Dall-E, MidJourney, and others.
The first of these image-generation models were Generative Adversarial Networks, or GANs. These were pairs of models, with one (the generator) creating imagery designed to fool the other into accepting it as original, and the second system (the discriminator) rejecting attempts which were not good enough. GANs have now been surpassed by Diffusion models, whose approach is to peel noise away from the desired signal. The first Diffusion model was actually described as long ago as 2015, but the paper was almost completely ignored. They were re-discovered in 2020.
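The "peel noise away" idea behind Diffusion models can be caricatured in a few lines. This sketch cheats: the reverse pass reuses the exact noise recorded during the forward pass, whereas a real Diffusion model trains a network to *predict* that noise from the noisy signal alone.

```python
# Caricature of diffusion: corrupt a signal with Gaussian noise step by
# step (forward process), then recover it by subtracting the noise in
# reverse order. A real model learns the noise instead of recording it.
import random

random.seed(0)
signal = [1.0, -1.0, 0.5, 0.0]   # a tiny "image"
steps, scale = 5, 0.3

noised, noises = list(signal), []
for _ in range(steps):            # forward process: add noise
    eps = [random.gauss(0, scale) for _ in noised]
    noises.append(eps)
    noised = [x + e for x, e in zip(noised, eps)]

recovered = list(noised)
for eps in reversed(noises):      # reverse process: peel noise away
    recovered = [x - e for x, e in zip(recovered, eps)]

print(max(abs(a - b) for a, b in zip(signal, recovered)))  # ~0.0
```

The interesting part in practice is the learned denoiser; once it can predict noise well, starting from *pure* noise and peeling repeatedly yields new images.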

Energy gluttons

Transformers are gluttons for compute power and for energy, and this has led to concerns that they might represent a dead end for AI research. It is already hard for academic institutions to fund research into the latest models, and it was feared that even the tech giants might soon find them unaffordable. The human brain points to a way forward. It is not only larger than the latest Transformer models (at around 80 billion neurons, each with around 10,000 synapses, it is 1,000 times larger). It is also a far more efficient consumer of energy - mainly because we only need to activate a small portion of our synapses to make a given calculation, whereas AI systems activate all of their artificial neurons all of the time. Neuromorphic chips, which mimic the brain more closely than classic chips, may help.
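The article's size comparison is easy to check as back-of-envelope arithmetic, taking "trillions of parameters" as roughly 10^12:

```python
# ~80 billion neurons x ~10,000 synapses each, versus a Transformer
# with on the order of a trillion parameters.
brain_synapses = 80e9 * 10_000        # 8e14 connections
model_params = 1e12                   # "trillions", taken as 1e12
print(brain_synapses / model_params)  # 800.0 — the article's "1,000 times"
```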

Unsurprising surprises

Aleksa is frequently surprised by what the latest models are able to do, but this is not itself surprising. “If I wasn’t surprised, it would mean I could predict the future, which I can’t.” He derives pleasure from the fact that the research community is like a hive mind: you never know where the next idea will come from. The next big thing could come from a couple of students at a university, and a researcher called Ian Goodfellow famously created the first GAN by playing around at home after a brainstorming session over a couple of beers.
 
Reactions: 52 users

Iseki

Regular
Processing.org is an easy-to-learn programming language that runs on the Raspberry Pi. Not on a Nintendo Switch? Then again, the Switch runs an Arm chip, so it could run Linux, so maybe...
 
Reactions: 5 users
[Quoting the post above about Processing.org running on the Raspberry Pi.]

[GIF: Jimmy Fallon “I Had No Idea”, The Tonight Show]
 
Reactions: 7 users
[Quoting Fact Finder's post above in full, including the Forbes article "Has There Been A Second AI Big Bang?"]
DARPA MAY THINK SO???


"The past few years have seen an explosion of interest in a sub-field of AI dubbed machine learning that applies statistical and probabilistic methods to large data sets to create generalized representations that can be applied to future samples. Foremost among these approaches are deep learning (artificial) neural networks that can be trained to perform a variety of classification and prediction tasks when adequate historical data is available. Therein lies the rub, however, as the task of collecting, labelling, and vetting data on which to train such “second wave” AI techniques is prohibitively costly and time-consuming.

DARPA envisions a future in which machines are more than just tools that execute human-programmed rules or generalize from human-curated data sets. Rather, the machines DARPA envisions will function more as colleagues than as tools. Towards this end, DARPA research and development in human-machine symbiosis sets a goal to partner with machines. Enabling computing systems in this manner is of critical importance because sensor, information, and communication systems generate data at rates beyond which humans can assimilate, understand, and act. Incorporating these technologies in military systems that collaborate with warfighters will facilitate better decisions in complex, time-critical, battlefield environments; enable a shared understanding of massive, incomplete, and contradictory information; and empower unmanned systems to perform critical missions safely and with high degrees of autonomy. DARPA is focusing its investments on a third wave of AI that brings forth machines that understand and reason in context."

Program Announcement for Artificial Intelligence Exploration (AIE)


ACTIVE
Contract Opportunity

Notice ID
DARPA-PA-22-02
Related Notice
Department/Ind. Agency
DEPT OF DEFENSE
Sub-tier
DEFENSE ADVANCED RESEARCH PROJECTS AGENCY (DARPA)
Office
DEF ADVANCED RESEARCH PROJECTS AGCY



General Information

  • Contract Opportunity Type: Presolicitation (Original)
  • All Dates/Times are: (UTC-04:00) EASTERN STANDARD TIME, NEW YORK, USA
  • Original Published Date: Aug 19, 2022 02:41 pm EDT
  • Original Response Date: Aug 18, 2023 04:00 pm EDT
  • Inactive Policy: Manual
  • Original Inactive Date: Sep 17, 2023
  • Initiative:
    • None

Classification

  • Original Set Aside:
  • Product Service Code: AC11 - National Defense R&D Services; Department of Defense - Military; Basic Research
  • NAICS Code:
    • 541715 - Research and Development in the Physical, Engineering, and Life Sciences (except Nanotechnology and Biotechnology)
  • Place of Performance:

Description

The mission of the Defense Advanced Research Projects Agency (DARPA) is to make strategic, early investments in science and technology that will have long-term positive impact on our Nation’s security. In support of this mission, DARPA has pioneered groundbreaking research and development (R&D) in Artificial Intelligence (AI) for more than five decades. Today, DARPA continues to lead innovation in AI research through a large, diverse portfolio of fundamental and applied R&D AI programs aimed at shaping a future for AI technology where machines may serve as trusted and collaborative partners in solving problems of importance to national security. The AI Exploration (AIE) program is one key element of DARPA’s broader AI investment strategy that will help ensure the U.S. maintains a technological advantage in this critical area.
 
Reactions: 29 users

ndefries

Regular
No idea if the following is correct, but if you have a subscription to the AFR you might like to confirm:

"from today’s AFR

’Company secretary Ben Cohen said AVZ would only respond further with its lawyers present due to unspecified allegations on social media that Financial Review reporter Tom Richardson is connected with a short selling group named Boatman Capital, which chose to publish mistruths.’


No opinion either way at this stage; needing to Do More Research.
FF

AKIDA BALLISTA
This is that article:

Battle over ‘world’s largest lithium deposit’ drags in China

Tom Richardson, Markets reporter and commentator
Updated Oct 24, 2022 – 8.15am, first published at 5.00am
Suspended lithium play AVZ Minerals has conceded a Congo court ordered its payments to acquire an additional 15 per cent stake in the Manono Project be suspended, after it was questioned about the September 20 court ruling by The Australian Financial Review.
The ruling suggests AVZ and Congo’s Dathomir face more arbitration over the deal’s validity and sale price, while extending a list of disputes AVZ faces over its ownership rights to the Manono hard rock lithium deposit in the Congo.

AVZ Minerals CEO Nigel Ferguson is disputing claims that a Chinese company owns 15 per cent of AVZ’s lithium project in the Democratic Republic of the Congo. David Rowe
On Thursday last week, AVZ told the market only a properly constituted arbitration tribunal has jurisdiction to overturn an acquisition it says it executed with Dathomir in August 2021 for $US21 million.
The Perth-headquartered explorer rode the lithium boom and excitement over claims Manono is the world’s largest lithium deposit to reach a $2.7 billion sharemarket valuation last May, before shares were suspended over its mining licence’s approval by the Congo government.
Despite the ruling suspending AVZ’s payments for the 15 per cent stake, AVZ said it still had legal title to 75 per cent ownership of Manono.


In September 2021, AVZ announced a separate agreement with Chinese investor Suzhou CATH Energy Technologies to sell a 24 per cent equity interest in Manono for $US240 million, which by proxy valued the project at $US1 billion.
This compares to the $US21 million AVZ said it paid Dathomir for a 15 per cent stake just a month before in August 2021 in a transaction the Congolese mining group has now had a court rule as suspended.
On May 10, AVZ told investors its future Manono stake would fall from 75 per cent to 51 per cent after the deal to sell a 24 per cent stake to CATH completed, with the right to negotiate to buy another 15 per cent from Congo government’s Cominiere potentially taking its stake back to a final 66 per cent.
However, in May 2022 another Chinese mining group named Zijin Mining announced it had signed a separate legal deal with the Congo government’s Cominiere to acquire a 15 per cent stake in the Manono Project for $US33.4 million.
Fortune-500 company Zijin said the deal was struck in September 2021 and the Commercial Court of Lubumbashi rejected AVZ’s attempt to have the deal thrown out in November 2021 and January 2022.
AVZ didn’t disclose the Zijin legal dispute or court rulings to the market until May 4, 2022, as shares raced to $1.30 in April 2022 on a $4.5 billion valuation. Its stock was suspended on May 9.

In May, Zijin also applied to the International Court of Arbitration (ICC) to force AVZ to recognise its claim to 15 per cent in the Manono project.
“Zijin Mining confidently looks forward to the ICC hearing in April 2023 and expects AVZ’s abuse of Zijin Mining’s 15 per cent stake in the Manono project as alleged will be addressed,” Zijin’s in-house legal counsel Sun Kuiyuan said on Thursday. “Zijin Mining is disappointed with AVZ’s lack of cooperation with the minority shareholders of the Manono project.
“Zijin is surprised that AVZ has not supported its participation in the Manono project, which brings substantial capital and expertise to the benefit of all stakeholders and the DRC.”

Zijin refused to comment further. It previously said the separate sale between Dathomir and AVZ had been legally terminated to mean AVZ only had legal ownership of 60 per cent of the project, with Zijin at 15 per cent, Dathomir at 15 per cent, and the Congo government under Cominiere at 10 per cent.
It also said if AVZ proceeded with the 24 per cent sale to CATH Technologies its stake would drop from 60 per cent to 36 per cent to mean AVZ “will no longer have an absolute controlling interest” in Manono.

AVZ refused to comment on questions around the September 20 court ruling and a separate December 2021 court document related to the Dathomir deal.
Company secretary Ben Cohen said AVZ would only respond further with its lawyers present due to unspecified allegations on social media that Financial Review reporter Tom Richardson is connected with a short selling group named Boatman Capital, which chose to publish mistruths.
London-based short side research firm Boatman Capital has followed Zijin in claiming AVZ’s stake will fall from 60 per cent to 36 per cent if the deal to sell to CATH goes ahead, which theoretically means AVZ would own less of the project than Chinese interests.
On September 9, AVZ issued a statement attacking Boatman for publishing alleged mistruths around the Congo courtroom barneys and reiterated its position that it still owns 75 per cent of Manono.
In response, Boatman instructed lawyers Grosvenor Law to demand both the ASX and ASIC investigate AVZ for allegedly misleading the market and failing to follow its disclosure obligations.
Boatman has also submitted court documents and made unreported allegations to Australia’s regulators it said supported its statements.
 
Reactions: 10 users

Xray1

Regular
Reactions: 21 users
Reactions: 27 users