BRN Discussion Ongoing

mrgds

Regular
Hi Krustor

As @VictorG posted, a company must publish on the ASX full details of any event that a reasonable person would expect to have a material effect on the share price.

The event can be positive or negative. For example, if the company is making a product to sell, the successful or unsuccessful production of that product could be a material event and require an announcement on the ASX.

The successful production of AKD1000 is a good example.

When a company signs a contract or enters a partnership it has to decide whether that contract or partnership is material, and the guidance from the ASX is that if it does not have a dollar value that can be calculated with certainty, it does not meet the definition of material and cannot be announced on the ASX.

So MegaChips was able to be announced because they could calculate the value of the licence fee. If there had been no upfront licence fee and just the hope of future royalties, Brainchip would not have been allowed to announce it on the ASX.

The partnership with ARM could not be announced on the ASX because neither Brainchip nor ARM could put a dollar amount on the sales this partnership was likely to generate in the future.

The real problem for companies is that they have to decide whether the event is price sensitive and whether or not to publish it on the ASX.

The ASX only decides if the company did the right thing afterwards. If the ASX decides the company made the wrong choice, it can penalise and suspend the company and its Directors.

It is like having a highway where there are no speed signs to tell you what the speed limit is and it is up to you to guess what speed you are allowed to travel.

You are not allowed to ask the police on duty what speed is legal, but if you make the wrong choice the police jump out, take away your car and fine you for travelling too fast or too slow.

This is how the ASX works.

Brainchip intends to list on the Nasdaq one day, and because its good character as a company will be a consideration, it is taking a very cautious approach to the ASX Rules so that it does not have adverse notations from the ASX to explain to the Nasdaq.

The non-disclosure agreements are a separate issue, as these are between Brainchip and third parties and have nothing to do with the ASX.

If Brainchip enters a material contract with a company, then even if they have a non-disclosure agreement the material agreement has to be announced in accordance with the ASX Rules.

The non-disclosure agreement does not override the ASX Rules.

I hope this helps your understanding.

My opinion only DYOR
FF

AKIDA BALLISTA
Good morning FF,
Would like to ask you, with regard to the "Nasa Phase 2" post you shared yesterday: you did say you had sent the document to BRN. I'm wondering if you have heard anything back from BRN.................... with thanks if you are able to share anything.

Akida Ballista
 
  • Like
  • Love
Reactions: 13 users

alwaysgreen

Top 20
It is like having a highway where there are no speed signs to tell you what the speed limit is and it is up to you to guess what speed you are allowed to travel.

You are not allowed to ask the police on duty what speed is legal but if you make the wrong choice the police jump out and takeaway your car and fine you for travelling too fast or too slow.

This is how the ASX works.
What a spot on explanation of the ridiculous workings of the ASX! :ROFLMAO:

It's why companies need to waste money and resources on compliance officers just to make sure they are toeing the ASX line. What a stressful occupation that would be!
 
  • Like
  • Love
  • Haha
Reactions: 12 users
Good morning FF,
Would like to ask you, with regard to the "Nasa Phase 2" post you shared yesterday: you did say you had sent the document to BRN. I'm wondering if you have heard anything back from BRN.................... with thanks if you are able to share anything.

Akida Ballista
Just checked my emails and nothing at this point in time. I sent it to Perth and to the US. The time difference can come into play but I suspect it has taken them by surprise. 😂

Will report when and if I hear anything. They may choose to ignore me for a little while. 😎

Regards
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 41 users
We have long been looking for official confirmation of a successful NASA Vorago Brainchip Phase 1 project.
Well I have found it and it is copied with a link below.

Note:

1. Vorago is Silicon Space Technology Corporation for newer shareholders.
2. Vorago used the term CNN RNN to describe AKIDA, not SCNN.

As you read through the extracts you will note the following:

A. Vorago met all of the Phase 1 objectives.
B. Vorago has five letters in support of continuing to the next Phase 2; importantly/interestingly, two of these letters offer funding for Phase 2 independent of NASA (I personally am thinking large aerospace companies jumping on board).
C. Vorago has modelling which shows AKIDA will allow NASA to have autonomous rovers that achieve speeds of up to 20 km/h, compared with a present speed of 4 centimetres a second.
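To put item C in perspective, here is a quick back-of-envelope calculation of my own (only the 4 cm/s and 20 km/h figures come from the NASA material below; the rest is simple arithmetic):

```python
# Rough speed-up implied by the NASA Phase 1 figures quoted below.
current_speed_cm_s = 4.0                     # current rover routing limit: 4 cm/s
target_speed_cm_s = 20 * 100_000 / 3600      # 20 km/h expressed in cm/s (~556 cm/s)

print(f"Target speed: {target_speed_cm_s:.0f} cm/s")
print(f"Implied speed-up: ~{target_speed_cm_s / current_speed_cm_s:.0f}x")  # ~139x
```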

There is in my opinion no other company on this planet with technology that can compete with the creation of Peter van der Made and Anil Mankar.


The Original Phase 1:
"The ultimate goal of this project is to create a radiation-hardened Neural Network suitable for Ede use. Neural Networks operating at the Edge will need to perform Continuous Learning and Few-shot/One-shot Learning with very low energy requirements, as will NN operation. Spiking Neural Networks (SNNs) provide the architectural framework to enable Edge operation and Continuous Learning. SNNs are event-driven and represent events as a spike or a train of spikes. Because of the sparsity of their data representation, the amount of processing Neural Networks need to do for the same stimulus can be significantly less than conventional Convolutional Neural Networks (CNNs), much like a human brain. To function in Space and in other extreme Edge environments, Neural Networks, including SNNs, must be made rad-hard.Brainchip’s Akida Event Domain Neural Processor (www.brainchipinc.com) offers native support for SNNs. Brainchip has been able to drive power consumption down to about 3 pJ per synaptic operation in their 28nm Si implementation. The Akida Development Environment (ADE) uses industry-standard development tools Tensorflow and Keras to allow easy simulation of its IP.Phase I is the first step towards creating radiation-hardened Edge AI capability. We plan to use the Akida Neural Processor architecture and, in Phase I, will: Understand the operation of Brainchip’s IP Understand 28nm instantiation of that IP (Akida) Evaluate radiation vulnerability of different parts of the IP through the Akida Development Environment Define architecture of target IC Define how HARDSIL® will be used to harden each chosen IP block Choose a target CMOS node (likely 28nm) and create a plan to design and fabricate the IC in that node, including defining the HARDSIL® process modules for this baseline process Define the radiation testing plan to establish the radiation robustness of the ICSuccessfully accomplishing these objectives:Establishes the feasibility of creating a useful, radiation-hardened product IC with embedded NPU and already-existing supporting software ecosystem to allow rapid adoption and productive use within NASA and the Space community.\n\n\n\n\t Creates the basis for an executable Phase II proposal and path towards fabrication of the processor."

CNN RNN Processor
FIRM: SILICON SPACE TECHNOLOGY CORPORATION
PI: Jim Carlquist
Proposal #: H6.22-4509
NON-PROPRIETARY DATA
Objectives:

The goal of this project is the creation of a radiation-hardened Spiking Neural Network (SNN) SoC based on the BrainChip Akida Neuron Fabric IP. Akida is a member of a small set of existing SNN architectures structured to more closely emulate computation in a human brain. The rationale for using a Spiking Neural Network (SNN) for Edge AI Computing is its efficiency. The neuromorphic approach used in the Akida architecture takes fewer MACs per operation since it creates and uses sparsity of both weights and activations by its event-based model. In addition, Akida reduces memory consumption by quantizing and compressing network parameters. This also helps to reduce power consumption and die size while maintaining performance.
Spiking Neural Network Block Diagram


ACCOMPLISHMENTS

Notable Deliverables Provided:
• Design and Manufacturing Plans
• Radiation Testing Plan (included in Final report)
• Technical final report


Key Milestones Met
• Understand Akida Architecture
• Understand 28nm Implementation
• Evaluate Radiation Vulnerability of the IP Through the Akida Development Environment
• Define Architecture of Target IC
• Define how HARDSIL® will be used in Target IC
• Create Design and Manufacturing Plans
• Define the Radiation Testing Plan to Establish the Radiation Robustness of the IC


FUTURE PLANNED DEVELOPMENTS

Planned Post-Phase II Partners


We received five Letters of Support for this project.

Two of these will provide capital infusion to keep the project going, one offers aid in radiation testing, and the final two are for use in future space flights.

Planned/Possible Mission Infusion

NASA is keen to increase the performance of its autonomous rovers to allow for greater speeds.

Current routing methodologies limit speeds to 4cm/sec while NASA has a goal to be able to have autonomous rovers traverse at speeds up to 20km/hr.


Early calculations show the potential for this device to process several of the required neural network algorithms fast enough to meet this goal.

Planned/Possible Mission Commercialization

A detailed plan is included in the Phase I final submittal to commercialize a RADHARD, flight-ready QML SNN SoC to be available for NASA and commercial use.

This plan will include a Phase II plus extensions to reach the commercialization goals we are seeking.

CONTRACT (CENTER): 80NSSC20C0365 (ARC)
SUBTOPIC: H6.22 Deep Neural Net and Neuromorphic Processors for In-Space Autonomy and Cognition
SOLICITATION-PHASE: SBIR 2020-I
TA: 4.5.0 Autonomy


My opinion only DYOR
FF

AKIDA BALLISTA
For those who might not read here over the weekend and missed it:

https://techport.nasa.gov/file/141775

We live in “exciting times” where AI at the Edge will become ubiquitous.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 53 users

TECH

Regular
I'm not aware if this has been posted already, but there are certainly a rather large number of semiconductor/tech companies partnering up as we move into the next phase in the AI space.

We know of both these companies, and we are linked with them... see the attached press release out of India overnight... Tech

 
  • Like
  • Fire
  • Love
Reactions: 57 users

stuart888

Regular
from @Bravo 's post:

That data all comes from a variety of sensors around the vehicle, a few of which will be new to future S-Class vehicles that have been ordered with the new Drive Pilot system. While the company wouldn’t disclose specific costs of the system, representatives did say that it will cost as much as their top-of-the-line Burmester audio system. That audio system on the S-Class is a $6,700 option alone, yet requires the addition of a separate $3,800 package, bringing the rough total to around $10,500. That’s getting close to the cost of Tesla’s “Full-Self Driving” system, which currently is a $12,000 option.

The conditional Level 3 Drive Pilot system builds on the hardware and software used by Mercedes’ Level 2 ADAS system known as Distronic. It adds a handful of additional advanced sensors as well as software to support the features. Key hardware systems that will be added to future S-Class vehicles configured with the Drive Pilot upgrade include an advanced lidar system developed by Valeo SA, a wetness sensor in the wheel well to determine moisture on the road, rear-facing cameras and microphones to detect emergency vehicles, and a special antenna array located at the rear of the sunroof to help with precise GPS location.

The Valeo lidar system is more advanced than what is on the current generation of S-Class, in that it scans at a rate of 25 times per second at a range of 200 meters (approximately 650 feet). This is the second generation of the system, according to the Valeo spokesperson at the event. The system sends out lasers, which then create points in space to help the AI classify the type of object in and around the path of the vehicle, whether it’s human, animal, vehicle, tree, or building. From there, the AI uses data from the other sensors around the car to determine more than 400 different projected paths for itself and the potential paths for the vehicles, pedestrians, and motorcyclists around it, and chooses the safest route through.

We previously discussed the "advanced lidar system developed by Valeo" and found a patent for a LiDAR zoom feature using an NN, but there was no indication that Valeo developed the NN. We are friends with Valeo, and we have a sweet spot for LiDAR. So we've been drawing a reasonable inference in the absence of express evidence.

US2021166090A1 DRIVING ASSISTANCE FOR THE LONGITUDINAL AND/OR LATERAL CONTROL OF A MOTOR VEHICLE



View attachment 10414



The invention relates to a driving assistance system (3) for the longitudinal and/or lateral control of a motor vehicle, comprising an image processing device (31a) trained beforehand using a learning algorithm and configured so as to generate, at output, a control instruction (Scom1) for the motor vehicle from an image (Im1) provided at input and captured by an on-board digital camera (2); a digital image processing module (32) configured so as to provide at least one additional image (Im2) at input of an additional device (31b), identical to the device (31a), for parallel processing of the image (Im1) captured by the camera (2) and said at least one additional image (Im2), such that said additional device (31b) generates at least one additional control instruction (Scom2) for the motor vehicle, said additional image (Im2) resulting from at least one geometric and/or radiometric transformation performed on said captured image (Im1), and a digital fusion module (33) configured so as to generate a resultant control instruction (Scom) on the basis of said control instruction (Scom1) and of said a
Sensor fusion seems like another way of saying our Eco-System of Partners. I started thinking about the Park Assist functionality. It is going to use the same camera, lidar and sensors, but might not be active until the car is in reverse or going less than, say, 3 mph. Park Assist by Akida would be basically dormant energy-wise in an EV, until the sensor fusion logic shifts priority to Park Assist by Akida.

The point is the sensor fusion: the data only needs to get sent to whatever functions do the work given the driving situation. The rain-alert auto windshield wiper stuff is the same, perfect for the Spiking Neural Network and low-power EV brain tasks. A rough sketch of that gating idea follows.
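A minimal sketch of that gating idea, in Python (my own illustration; the gear/speed thresholds and function names are assumptions, not anything Mercedes, Valeo or BrainChip have published):

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    gear: str          # e.g. "D", "R", "P"
    speed_mph: float   # fused speed estimate from the vehicle's sensors

def active_functions(state: VehicleState) -> list:
    """Decide which downstream functions receive the fused sensor data.

    Illustrative gating only: Park Assist stays dormant until the car is in
    reverse or creeping below roughly 3 mph, so it draws essentially no
    compute power the rest of the time.
    """
    functions = ["lane_keeping", "collision_warning"]   # assumed always-on examples
    if state.gear == "R" or state.speed_mph < 3.0:
        functions.append("park_assist")
    return functions

# Cruising at 40 mph -> park assist dormant; reversing at 1 mph -> active.
print(active_functions(VehicleState(gear="D", speed_mph=40.0)))
print(active_functions(VehicleState(gear="R", speed_mph=1.0)))
```

In a real stack the fused state would come from the sensor-fusion layer, and a dormant function's accelerator could simply sit idle until it is needed.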

The use cases are so extensive in the low-battery-usage EV segment. Same for drones, toys, and smart-fabric medical devices. Go Brainchip team, employees, you can win this industry!
 
  • Like
  • Fire
  • Love
Reactions: 32 users
Was gonna say I'm a bit surprised there was no non-sensitive Ann this morning, but then again I'm not, haha.

Would've thought, given the original Vorago Ann outlined the use case (eg NASA Ph I etc), that the outcome of said use would at least be advised via a non-sensitive Ann, especially as FF already found it in the public domain.

Imo it's kinda like a biotech saying oh... we're partnering in a clinical trial and there's no commercial outcome yet, but if you want results go find them on the clinical trials database.
 
  • Like
  • Love
Reactions: 15 users

Boab

I wish I could paint like Vincent
Big discrepancy between buyers and sellers today.

Screen Shot 2022-07-04 at 8.16.18 am.png
 
  • Like
  • Love
Reactions: 23 users
I'm not aware if this has been posted already, but there are certainly a rather large number of semiconductor/tech companies partnering up as we move into the next phase in the AI space.

We know of both these companies, and we are linked with them... see the attached press release out of India overnight... Tech

Mr. One Percent is back.

Just one tiny percent of the worldwide semiconductor market by 2030:

“Meanwhile, the worldwide semiconductor market is expected to reach $1 trillion by 2030, up from $440 billion in 2020”
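For scale, a one-line calculation of my own (the $1 trillion figure is from the article; the 1% share is just the illustrative target):

```python
market_2030_usd = 1_000_000_000_000           # $1 trillion forecast quoted above
one_percent_usd = 0.01 * market_2030_usd
print(f"1% of the 2030 market: ${one_percent_usd / 1e9:.0f} billion per year")  # -> $10 billion
```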

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Haha
Reactions: 45 users

Shadow59

Regular
Mr. One Percent is back.

Just one tiny percent of the worldwide semiconductor market by 2030:

“Meanwhile, the worldwide semiconductor market is expected to reach $1 trillion by 2030, up from $440 billion in 2020”

My opinion only DYOR
FF

AKIDA BALLISTA
FF 🤣🤣🤣we are talking Mr BIG 1%;)
 
  • Like
  • Love
  • Haha
Reactions: 9 users

Slymeat

Move on, nothing to see.
F/F, not only does Mr Bean look more intelligent, it appears that he is. GLTA
Since Rowan Atkinson (AKA Mr Bean) has a degree in Electrical and Electronic Engineering and an MSc in Electrical Engineering, I expect he does have a good chance of understanding Akida.

Mr Bean is hence quite an appropriate picture to use.
 
  • Like
  • Love
  • Wow
Reactions: 16 users
Quick TA thought.

Personally I'd like to see a break through the mid 90s to confirm a possible small double bottom, then ~1.02/1.04 to clear the previous support area (now resistance).

Failing that, a potential revisit of support in the mid/high 70s and some ranging.

BRN D BAR 4.7.22 (AM).jpg



BRN W BAR 4.7.22 (AM).jpg
 
  • Like
Reactions: 19 users

LexLuther77

Regular
SP action feels weak. Red days we get nailed, green days we hardly move :unsure:... I wonder if it's a reflection of the upcoming 4c?
 
  • Like
  • Thinking
Reactions: 7 users

Mt09

Regular
SP action feels weak. Red days we get nailed, green days we hardly move :unsure:... I wonder if it's a reflection of the upcoming 4c?
I have no expectations for the next 2 4c’s, after that we need to start seeing some runs on the board.
 
  • Like
  • Fire
Reactions: 17 users

Bravo

If ARM was an arm, BRN would be its biceps💪!

CDAO, Air Force Test AI-Enabled ‘Smart Sensor’ Autonomous Tech on UAS; Lt. Gen. Michael Groen Quoted


Perhaps the next article is something for our specialists?


Wow @Sirod69! Great find!

The first article says that the Department of Defense "partnered with the Johns Hopkins University Applied Physics Laboratory to work on the software package, dubbed Smart Sensor “Brain,” and General Atomics on the platform autonomy framework using open architectures".

This caused me to remember reading about Johns Hopkins University's longstanding connections with NASA. In September 2020, NASA and the DOD announced an agreement to collaborate more closely in space. The agreement states "While advancing plans for unprecedented lunar exploration under the Artemis program, NASA also is building on a longstanding partnership with the Defense Department according to a memorandum of understanding announced today by Space Force Gen. John W. "Jay" Raymond and NASA Administrator Jim Bridenstine."

It also says "The agreement commits DOD and NASA to broad collaboration in areas that include human spaceflight, space policy, space transportation, standards and best practices for safe operations in space, scientific research and planetary defense."

In relation to technology it says "But besides operating together in the space domain, the two organizations share the space industrial base, research and development and science and technology that benefit both, he said".

There's a link in your first article that says "Smart Sensor", and if you click on it, it takes you to this document which I think makes for some fascinating reading, PARTICULARLY in light of the connections that have been ongoing since NASA first started taking a good, hard look at AKIDA:
  • BrainChip + NASA
  • NASA + Johns Hopkins Uni
  • NASA + DOD
I'll be a monkey's uncle if AKIDA is not involved somehow in the “Smart Sensor” unmanned aerial system (UAS) and artificial intelligence (AI)-enabled autonomy capability, or in the smart sensors/AI they are referring to in this document.


 
  • Like
  • Fire
  • Love
Reactions: 54 users

VictorG

Member
I have no expectations for the next 2 4c’s, after that we need to start seeing some runs on the board.
I don't expect royalty revenue in this 4c and possibly the next; however, I am expecting a significant increase in tech support revenue.
I theorise that royalty revenue in any future 4c's should be expected on the back of an increase in tech support revenue in prior 4c's.
 
  • Like
  • Love
  • Thinking
Reactions: 24 users
SP action feels weak. Red days we get nailed, green days we hardly move :unsure:... I wonder if it's a reflection of the upcoming 4c?
Do not forget that the atmosphere around all things economic is toxic according to the press around the world.

According to the press we will not be able to afford fuel for our cars, eat lettuce in winter, pay our mortgages, or vote for an honest intelligent politician ever, ever again.

Almost forgot the most devastating event: bananas are in short supply, and Puto’s war that is not a war will last forever. And if that’s not enough, China is about to invade and enslave our children.

Then we have floods and houses washed away that were built in locations where Governor Macquarie directed houses should not be built because of the flooding he observed two centuries ago and which we in our wisdom ignored.

The point being there is very little retail confidence. We have even seen it on these threads.

Market players rely upon sentiment. They use good sentiment to cause prices to run too far up and bad sentiment to cause prices to run down too far. They trade on human frailty.

This however is not a new thing as the following highlights:

SAID HANRAHAN by John O'Brien​

"We'll all be rooned," said Hanrahan,
In accents most forlorn,
Outside the church, ere Mass began,
One frosty Sunday morn.


The congregation stood about,
Coat-collars to the ears,
And talked of stock, and crops, and drought,
As it had done for years.


"It's looking crook," said Daniel Croke;
"Bedad, it's cruke, me lad,
For never since the banks went broke
Has seasons been so bad."


"It's dry, all right," said young O'Neil,
With which astute remark
He squatted down upon his heel
And chewed a piece of bark.


And so around the chorus ran
"It's keepin' dry, no doubt."
"We'll all be rooned," said Hanrahan,
"Before the year is out."


"The crops are done; ye'll have your work
To save one bag of grain;
From here way out to Back-o'-Bourke
They're singin' out for rain.


"They're singin' out for rain," he said,
"And all the tanks are dry."
The congregation scratched its head,
And gazed around the sky.


"There won't be grass, in any case,
Enough to feed an ass;
There's not a blade on Casey's place
As I came down to Mass."


"If rain don't come this month," said Dan,
And cleared his throat to speak -
"We'll all be rooned," said Hanrahan,
"If rain don't come this week."


A heavy silence seemed to steal
On all at this remark;
And each man squatted on his heel,
And chewed a piece of bark.


"We want an inch of rain, we do,"
O'Neil observed at last;
But Croke "maintained" we wanted two
To put the danger past.


"If we don't get three inches, man,
Or four to break this drought,
We'll all be rooned," said Hanrahan,
"Before the year is out."


In God's good time down came the rain;
And all the afternoon
On iron roof and window-pane
It drummed a homely tune.


And through the night it pattered still,
And lightsome, gladsome elves
On dripping spout and window-sill
Kept talking to themselves.


It pelted, pelted all day long,
A-singing at its work,
Till every heart took up the song
Way out to Back-o'-Bourke.


And every creek a banker ran,
And dams filled overtop;
"We'll all be rooned," said Hanrahan,
"If this rain doesn't stop."


And stop it did, in God's good time;
And spring came in to fold
A mantle o'er the hills sublime
Of green and pink and gold.


And days went by on dancing feet,
With harvest-hopes immense,
And laughing eyes beheld the wheat
Nid-nodding o'er the fence.


And, oh, the smiles on every face,
As happy lad and lass
Through grass knee-deep on Casey's place
Went riding down to Mass.


While round the church in clothes genteel
Discoursed the men of mark,
And each man squatted on his heel,
And chewed his piece of bark.


"There'll be bush-fires for sure, me man,
There will, without a doubt;
We'll all be rooned," said Hanrahan,
"Before the year is out."

Around the Boree Log and Other Verses, 1921
 
  • Like
  • Love
  • Fire
Reactions: 44 users
Hey Dingo, I think Sean also said that the 7/8 million was staff costs per year, not per quarter (about 2 million per quarter), and that when they ramp up to 100 staff the cost would be about 12 million, or 1 million per month (3 million per quarter), and that's where they see their requirements for the foreseeable future.

That's an average of 120 grand per employee, so sounds about right 👍

Or an average of 73 grand per new employee (if 4 million for 55).
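A quick back-of-envelope check of those averages (the $8M/$12M and 55-new-hire figures are as discussed above; treating current headcount as roughly 45 is my own reading of "4 million for 55"):

```python
# Back-of-envelope check of the staff-cost averages discussed above.
current_cost_m, target_cost_m = 8.0, 12.0   # ~$M per year now vs at 100 staff
target_headcount = 100
new_hires = 55                              # 100 staff minus ~45 today (assumed)

avg_per_employee = target_cost_m * 1e6 / target_headcount
avg_per_new_hire = (target_cost_m - current_cost_m) * 1e6 / new_hires

print(f"Average per employee at 100 staff: ${avg_per_employee / 1e3:.0f}k")  # ~$120k
print(f"Average per new hire:              ${avg_per_new_hire / 1e3:.0f}k")  # ~$73k
```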
 
Last edited:
  • Like
  • Love
Reactions: 10 users
The only near term announcement that we have been told about is that the next generation IP will become commercially available - CEO Sean Hehir 2022 AGM.

My crazy what if theory.

What if the next generation IP is in testing with a major automobile manufacturer or OEM, and the commercial release of the IP is being timed to coincide with an announcement that a major automotive company or OEM has purchased a licence.

The only evidence I have for this wild theory is that the AKD1000 IP was released to select customers well before its full commercial release. It was hypothesised at the time that Valeo was at least one of these early access customers in 2018/19.

My wild speculation only so DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 57 users
Army Technology: Research Reports
May 2, 2022 (updated 23 May 2022, 4:34 am)

BrainChip Advances AI and ML to Edge Computing​


Concept: American technology startup BrainChip and semiconductor startup SiFive have partnered to combine their technologies to offer chip designers optimized AI and ML for edge computing. BrainChip’s Akida technology and SiFive’s multi-core capable RISC-V processors have been combined to create a highly efficient solution for integrated edge AI computation.
Nature of Disruption: With high performance, ultra-low power, and on-chip learning, BrainChip’s Akida is an advanced neural networking processor architecture that takes AI to the edge. SiFive Intelligence solutions combine software and hardware to accelerate AI or ML applications with its highly configurable multi-core, multi-cluster capable design. For AI and ML workloads, SiFive Intelligence-based processors can provide industry-leading performance and efficiency. The highly programmable multi-core, multi-cluster capable design can be used for a range of applications requiring high-throughput, single-thread performance while operating within the most stringent power and area limitations. Akida acts like a human brain, analyzing only the most important sensor inputs at the time of acquisition and processing data with unmatched efficiency, precision, and energy efficiency. BrainChip’s technology is based on its SNAP (spiking neuron adaptive processor) technology, which it licenses to other companies. RISC-V is an open instruction-set computing architecture based on well-known RISC ideas. It provides the high data processing speed that all new and heavier applications require.
Outlook: The duo aims to help companies looking to seamlessly integrate an optimized processor with dedicated ML accelerators, which are required for the demanding requirements of edge AI computing. They plan to use Akida, BrainChip’s specialized, differentiated AI engine, in conjunction with high-performance RISC-V processors like the SiFive Intelligence Series to achieve this. For organizations looking to enter the neuromorphic semiconductor chip market, SNAP provides a development option. It is a key feature of neuromorphic semiconductor circuits that allows for a variety of applications, including cybersecurity, gaming, robotics, and stock market forecasting.
 
  • Like
  • Love
  • Fire
Reactions: 43 users