BRN Discussion Ongoing

I have a feeling that's an old one that pops up under a current date every couple of months?

Yeah, I compared the one from January 2022 to this one. They are quite similar.

The main difference is that the wording appears to say they have moved from the EAP to a partnership, which, if accurate, is pretty awesome.

Still no $$$$ attached so I don’t expect any ASX announcement.

:)
 
  • Like
  • Love
  • Fire
Reactions: 32 users

Diogenese

Top 20

BrainChip selected by U.S. Air Force Research Laboratory to develop AI-based radar​

2023-02-11 16:21 HKT

https://inf.news/en/science/791bf988657af0e134c475ca748067c6.html

BrainChip, the world's first commercial producer of neuromorphic artificial intelligence chips and IP, today announced that Information Systems Laboratories (ISL) is developing an AI-based radar research solution for the U.S. Air Force Research Laboratory (AFRL) based on its Akida™ neural network processor.

ISL is a specialist in expert research and complex analysis, software and systems engineering, advanced hardware design and development, and high-quality manufacturing for a variety of clients worldwide.

ISL focuses on areas such as advanced signal processing, space exploration, subsea technology, surveillance and tracking, cybersecurity, advanced radar systems and energy independence. As a member of the BrainChip Early Partnership Program (EAP), ISL will be able to evaluate boards for Akida devices, software and hardware support, and dedicated engineering resources.
I'm having trouble getting my head around this:

https://www.islinc.com/isl-selected-in-first-round-of-armys-xtechsearch7

ISL was selected as a winner in the first round of the Army’s xTechSearch7 (https://www.arl.army.mil/xtechsearch/competitions/xtechsearch7.html ). We proposed a novel Unmanned Air System (UAS) that leverages neuromorphic computing to autonomously search and track particular ground vehicles using minimal RF communications. In the Army CONOPS, this UAS can also be used to autonomously detect and track military vehicles employing Concealment, Camouflage, Deception (CCD). The associated commercial UAS system is intended for first responders for a variety of applications including wide area search in an Amber Alert scenario.



BrainChip selected by U.S. Air Force Research Laboratory to develop AI-based radar​

2023-02-11 16:21 HKT

One for the Army and one for the Air force????
 
  • Like
  • Fire
  • Haha
Reactions: 26 users
I'm having trouble getting my head around this:

https://www.islinc.com/isl-selected-in-first-round-of-armys-xtechsearch7

ISL was selected as a winner in the first round of the Army’s xTechSearch7 (https://www.arl.army.mil/xtechsearch/competitions/xtechsearch7.html ). We proposed a novel Unmanned Air System (UAS) that leverages neuromorphic computing to autonomously search and track particular ground vehicles using minimal RF communications. In the Army CONOPS, this UAS can also be used to autonomously detect and track military vehicles employing Concealment, Camouflage, Deception (CCD). The associated commercial UAS system is intended for first responders for a variety of applications including wide area search in an Amber Alert scenario.



BrainChip selected by U.S. Air Force Research Laboratory to develop AI-based radar​

2023-02-11 16:21 HKT

One for the Army and one for the Air force????

Nothing like working as a team!

That’s gold, LOL

You might as well throw the Navy in there as well via BascomHunter Technologies:

I understand BH are working with the Navy and also doing work with Akida and radars.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 17 users

Evermont

Stealth Mode
Here is the original Business Wire.

Information Systems Labs Joins BrainChip Early Access Program

January 09, 2022 05:30 PM Eastern Standard Time
LAGUNA HILLS, Calif.--(BUSINESS WIRE)--BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), a leading provider of ultra-low power, high performance artificial intelligence technology and the world’s first commercial producer of neuromorphic AI chips and IP, today announced that Information Systems Laboratories, Inc. (ISL) is developing an AI-based radar research solution for the Air Force Research Laboratory (AFRL) based on its Akida™ neural networking processor.
ISL is an employee-owned technology development corporation that performs expert research and complex analysis, software and systems engineering, advanced hardware design and development, and high-quality specialty fabrication for a variety of customers worldwide. ISL specializes in the areas of advanced signal processing, space exploration, undersea technologies, surveillance and tracking, cyber security, advanced radar systems, and energy independence. As a member of BrainChip’s Early Access Program, ISL will be able to evaluate boards with the Akida device, software and hardware support and dedicated engineering resources.

“As part of BrainChip’s EAP, we’ve had the opportunity to evaluate firsthand the capabilities that Akida provides to the AI ecosystem,” said Jamie Bergin, Senior VP, Manager of Research, Development and Engineering Solutions Division at ISL.

BrainChip brings AI to the edge in a way that existing technologies are not capable. The Akida processor is ultra-low power with high performance, supporting the growth of edge AI technology by using a neuromorphic architecture, a type of artificial intelligence that is inspired by the biology of the human brain. Devices currently available to BrainChip’s EAP customers provide partners with capabilities to realize significant gains in power consumption, design flexibility and true learning at the Edge.

“ISL has decided to use Akida and Edge-based learning as a tool to incorporate into their portfolio of research engineering and engineering solutions in large part due to our innovative capabilities and production-ready status that provides go-to-market advantages,” said Sean Hehir, BrainChip CEO. “We are pleased to be included as the AI- and Edge-based learning component of ISL’s research sponsored by AFRL. We feel that the combination of technologies will help expedite its deployment into the field.”

Akida is currently available now to be licensed as IP, as well as available for orders for production release in silicon. Its focus is on low power and high-performance, enabling sensory processing, for applications in Beneficial AI, as well as applications including Smart Healthcare, Smart Cities, Smart Transportation and Smart Home. Those interested in learning how BrainChip has solved the problems inherent in moving AI out of the data center to the Edge where data is created can visit https://brainchipinc.com/technology/ for more information.

About BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY)
BrainChip is a global technology company that is producing a groundbreaking neuromorphic processor that brings artificial intelligence to the edge in a way that is beyond the capabilities of other products. The chip is high performance, small, ultra-low power and enables a wide array of edge capabilities that include on-chip training, learning and inference. The event-based neural network processor is inspired by the spiking nature of the human brain and is implemented in an industry standard digital process. By mimicking brain processing BrainChip has pioneered a processing architecture, called Akida™, which is both scalable and flexible to address the requirements in edge devices. At the edge, sensor inputs are analyzed at the point of acquisition rather than through transmission via the cloud to a data center. Akida is designed to provide a complete ultra-low power and fast AI Edge Network for vision, audio, olfactory and smart transducer applications. The reduction in system latency provides faster response and a more power efficient system that can reduce the large carbon footprint of data centers.

About AFRL and AFWERX
AFRL and AFWERX have partnered to streamline the Small Business Innovation Research process in an attempt to speed up the experience, broaden the pool of potential applicants and decrease bureaucratic overhead. Beginning in SBIR 18.2, and now in J203-CS01, the Air Force has begun offering 'The Open Topic' SBIR/STTR program that is faster, leaner and open to a broader range of innovations.

Additional information is available at https://www.brainchipinc.com
Follow BrainChip on Twitter: https://www.twitter.com/BrainChip_inc
Follow BrainChip on LinkedIn: https://www.linkedin.com/company/7792006
Follow BrainChip on YouTube: BrainChipInc

Contacts​

Media Contact
Mark Smith
JPR Communications
818-398-1424
marks@jprcom.com
Investor Contact
Mark Komonoski
Integrous Communications
Direct: 877-255-8483
Mobile: 403-470-8384
mkomonoski@integcom.us

 
  • Like
  • Fire
  • Love
Reactions: 18 users

Evermont

Stealth Mode
I think ISL was a quiet entry into Defense.

Creeping Best Friends GIF by Mickey Mouse
 
  • Haha
  • Like
  • Love
Reactions: 13 users
Nothing like working as a team!

That’s gold, LOL

You might as well throw the Navy in there as well via BascomHunter Technologies:

I understand BH are working with the Navy and also doing work with Akida and radars.


Whilst on the topic of BH, I can’t remember if this info about acquiring telgaas was posted on the forum:


And an interesting radar patent which appears when searching for telgaas patents:

 
  • Like
  • Fire
  • Love
Reactions: 12 users

equanimous

Norse clairvoyant shapeshifter goddess

BrainChip selected by U.S. Air Force Research Laboratory to develop AI-based radar​

2023-02-11 16:21 HKT

https://inf.news/en/science/791bf988657af0e134c475ca748067c6.html

BrainChip, the world's first commercial producer of neuromorphic artificial intelligence chips and IP, today announced that Information Systems Laboratories (ISL) is developing an AI-based radar research solution for the U.S. Air Force Research Laboratory (AFRL) based on its Akida™ neural network processor.

ISL is a specialist in expert research and complex analysis, software and systems engineering, advanced hardware design and development, and high-quality manufacturing for a variety of clients worldwide.

ISL focuses on areas such as advanced signal processing, space exploration, subsea technology, surveillance and tracking, cybersecurity, advanced radar systems and energy independence. As a member of the BrainChip Early Partnership Program (EAP), ISL will be able to evaluate boards for Akida devices, software and hardware support, and dedicated engineering resources.
 
  • Like
  • Fire
Reactions: 9 users

equanimous

Norse clairvoyant shapeshifter goddess
I remember Vietnam being associated with SNNs.

A low-power, high-accuracy with fully on-chip ternary weight hardware architecture for Deep Spiking Neural Networks​

Author: Duy-Anh Nguyen

Acknowledgment​

This work is partly supported by Vietnam National University, Hanoi (VNU) through research project “Investigate and develop a secure IoT platform” (Secu-IoT).


Duy-Anh Nguyen is currently a Ph.D. student at VNU University of Engineering and Technology and the Joint Technology Innovation and Research Centre between Vietnam National University Hanoi (VNU) and the University of Technology Sydney. He received his Bachelor's degree from Nanyang Technological University, Singapore, and his Master's degree from Western Jiao Tong University, China. His research interests include system-on-chip design and hardware accelerators for artificial intelligence.

 
  • Like
  • Fire
Reactions: 8 users

BaconLover

Founding Member

Application of SNN in Vehicle Field​

2023-02-11 17:59 HKT

Automakers are using neuromorphic technologies to implement AI functions such as keyword recognition, driver attention monitoring, and passenger behavior monitoring.
Mimicking biological brain processes is tantalizing because it promises advanced functionality without a significant increase in power consumption, which is EV-friendly. Neuromorphic computing and sensing are also expected to offer extremely low latency, enabling real-time decision-making in some cases. This combination of low latency and high energy efficiency is very attractive.
Spike Network
The truth is, there is still a lot we don't know about how the human brain works. However, cutting-edge research has shown that neurons communicate by sending each other electrical signals called spikes, and that the sequence and timing of the spikes (rather than their size) are the key factors. Mathematical models of how neurons respond to these spikes are still being studied, but many scientists agree that if multiple spikes arrive at a neuron at the same time (or in very rapid succession), the information those spikes represent is correlated, and this causes the neuron to fire a spike of its own.
This is in contrast to artificial neural networks based on deep learning (the mainstream AI today), where information travels through the network in a regular rhythm; that is, the information entering each neuron is represented as a numerical value rather than being encoded in time.
Making a spike-based artificial system is not easy. Besides the fact that we don't fully understand how neurons work, there is no consensus on the best way to train spiking neural networks. Backpropagation requires computing derivatives, which is not directly possible with spikes. Some companies approximate the derivative of the spike in order to use backpropagation (SynSense does this), while others use a technique called STDP (spike-timing-dependent plasticity), which is closer to how biological brains function but is not yet mature (BrainChip uses this method for one-shot learning at the edge). It is also possible to take a deep-learning CNN, trained by backpropagation in the normal way, and convert it to run in the spike domain (another technique used by BrainChip).
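As a rough illustration of the STDP rule mentioned above, here is a minimal pair-based weight-update sketch in Python; the exponential windows and the parameter values are textbook-style assumptions for illustration, not SynSense's or BrainChip's actual implementations.

```python
import numpy as np

# Pair-based STDP: if the presynaptic spike precedes the postsynaptic spike,
# strengthen the synapse; if it follows, weaken it. Parameters are illustrative.
A_PLUS, A_MINUS = 0.01, 0.012     # learning rates for potentiation / depression
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # exponential window time constants (ms)

def stdp_dw(t_pre: float, t_post: float) -> float:
    """Weight change for a single pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt > 0:
        return A_PLUS * np.exp(-dt / TAU_PLUS)    # pre before post -> potentiate
    return -A_MINUS * np.exp(dt / TAU_MINUS)      # post before pre -> depress

# Pre fires at 10 ms, post at 15 ms: a small positive (strengthening) update.
print(stdp_dw(10.0, 15.0))
```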
SynSense Speck
SynSense is working with BMW to advance the integration of neuromorphic chips in smart cockpits and to explore related areas together. BMW will evaluate SynSense's Speck SoC, which combines SynSense's neuromorphic vision processor with iniVation's 128x128-pixel event camera. The module can be used to capture visual information in real time, identify and detect objects, and perform other vision-based detection and interaction functions.
Dylan Muir, vice president of global research operations at SynSense, said: "When BMW replaces RGB cameras with Speck modules for visual perception, they can replace not only sensors, but also a lot of GPU or CPU computing required to process standard RGB visual streams."
Using event-based cameras provides a higher dynamic range than standard cameras, which is beneficial for use in extreme lighting conditions inside and outside the vehicle.
BMW will explore the use of neuromorphic technology in cars, including monitoring driver attention and passenger behavior through the Speck module.
"In the coming months, we will explore more applications inside and outside the vehicle," Muir said.
SynSense's neuromorphic vision processors have a fully asynchronous digital architecture. Each neuron uses integer logic with 8-bit synaptic weights, 16-bit neuron states, 16-bit thresholds, and unit input-output spikes. Neurons use a simple integrate-and-fire model: input spikes are weighted by the neuron's synaptic weights and accumulated until a threshold is reached, at which point the neuron fires a simple 1-bit spike. Overall, the design is a balance between complexity and computational efficiency, Muir said.
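As a rough sketch of the integrate-and-fire scheme described above (integer arithmetic, 8-bit weights, a 16-bit membrane state, a threshold, 1-bit output spikes), here is a minimal single-neuron timestep in Python; the reset-to-zero behaviour and the specific numbers are illustrative assumptions, not SynSense's actual design.

```python
import numpy as np

# One timestep of a simple integrate-and-fire neuron using integer arithmetic.
# In the hardware described above the membrane state and threshold would be
# 16-bit registers and the weights 8-bit; here plain Python ints stand in.
def if_step(v: int, in_spikes: np.ndarray, weights: np.ndarray,
            threshold: int = 1000) -> tuple[int, int]:
    """Accumulate weighted input spikes; fire a 1-bit spike and reset at threshold."""
    v = v + int(np.dot(in_spikes.astype(np.int32), weights.astype(np.int32)))
    if v >= threshold:
        return 0, 1        # reset membrane state, emit a spike
    return v, 0            # no spike this timestep

weights = np.array([40, -12, 25, 90], dtype=np.int8)  # 8-bit synaptic weights
spikes  = np.array([1, 0, 1, 1], dtype=np.uint8)      # unit (binary) input spikes
print(if_step(900, spikes, weights))                  # -> (0, 1): the neuron fired
```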
[Image: SynSense's electronic neurons are based on the integrate-and-fire model]
SynSense's digital chips are designed to process event-based CNNs, with each layer processed by a different core. The cores run asynchronously and independently; the entire processing pipeline is event-driven.
"Our Speck modules run in real time with low latency," Muir said. "We can manage effective inference rates above 20 Hz at less than 5 mW of power consumption. This is much faster than using traditional low-power computing on standard RGB video streams."
While SynSense and BMW will initially explore neuromorphic use cases in smart cockpits, the technology has potential for other automotive applications as well.
"First, we'll explore non-safety-critical use cases, and we're planning future versions of Speck with higher resolution, as well as improvements to our DynapCNN vision processor, which will interface with high-resolution sensors," Muir said. "We plan for these future technologies to support advanced automotive applications such as autonomous driving and emergency braking."
[Image: SynSense and iniVation Speck module, an event camera-based module containing sensors and processors]
BrainChip Akida
Mercedes-Benz's EQXX concept car, which debuted at CES earlier this year, uses BrainChip's Akida neuromorphic processor for in-vehicle keyword recognition. Billed as "the most efficient car Mercedes has ever made," the car utilizes neuromorphic technology that consumes less power than a deep learning-based keyword spotting system. That's crucial for a car with a range of 620 miles, or 167 miles more than Mercedes' flagship electric car, the EQS.
Mercedes said at the time that BrainChip's solution was five to 10 times more efficient than traditional voice controls at recognizing the wake word "Hey Mercedes."
Mercedes said, "Although neuromorphic computing is still in its infancy, systems like these will be on the market within a few years. When applied at scale throughout a vehicle, they have the potential to radically reduce the power consumption required to run the latest AI technologies."
BrainChip's CMO Jerome Nadel said: "Mercedes is focused on big issues like battery management and transmission, but every milliwatt counts, and when you think about energy efficiency, even the most basic inference, like spotting a keyword, matters."
A typical car could have as many as 70 different sensors by 2022, Nadel said. For cockpit applications, these sensors can enable face detection, gaze assessment, emotion classification, and more.
He said: "From a system-architecture perspective, we can do a 1:1 approach where there is a sensor that will do some preprocessing and then forward the data. The AI will do inference near the sensor... instead of the full array of data from the sensors, only the inference metadata is passed forward."
The idea is to minimize the size and complexity of the packets sent to AI accelerators, while reducing latency and minimizing power consumption. Each vehicle could have 70 Akida chips or Akida-enabled sensors, each of which would be a "low-cost part that you won't notice at all," Nadel said. He noted that attention still needs to be paid to the BOM of all these sensors.
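As a back-of-envelope illustration of why forwarding inference metadata rather than raw sensor data matters, here is a quick comparison; the frame size, frame rate, and metadata packet size are assumed numbers for illustration, not figures from BrainChip or the article.

```python
# Raw sensor stream vs. inference metadata forwarded from a sensor-side AI node.
FRAME_W, FRAME_H, BYTES_PER_PIXEL, FPS = 1280, 720, 3, 30
raw_bytes_per_s = FRAME_W * FRAME_H * BYTES_PER_PIXEL * FPS   # ~83 MB/s of raw video

# Hypothetical metadata packet: class id, confidence, bounding box, timestamp.
METADATA_BYTES, EVENTS_PER_S = 32, 30
meta_bytes_per_s = METADATA_BYTES * EVENTS_PER_S              # ~1 kB/s of metadata

print(f"raw:  {raw_bytes_per_s / 1e6:.1f} MB/s")
print(f"meta: {meta_bytes_per_s / 1e3:.2f} kB/s")
print(f"reduction: ~{raw_bytes_per_s / meta_bytes_per_s:,.0f}x")
```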
[Image: BrainChip expects to have its neuromorphic processor next to every sensor on the vehicle]
Going forward, Nadel said, neuromorphic processing will also be used in ADAS and autonomous driving systems. This has the potential to reduce the need for other types of power-hungry AI accelerators.
"If every sensor could have Akida configured on one or two nodes, it would do adequate inference, and the data passed would be an order of magnitude less, because that would be inference metadata...that would affect the servers you need," he said. power."
BrainChip's Akida chip accelerates SNNs (spike neural networks) and CNNs (by converting to SNNs). It's not tailored for any specific use case or sensor, so it can be paired with visual sensing for face recognition or people detection, or other audio applications like speaker ID. BrainChip also demonstrated Akida's smell and taste sensors, although it's hard to imagine how these could be used in cars (perhaps to detect air pollution or fuel quality through smell and taste).
Akida is set up to handle SNNs or deep learning CNNs that have been transformed into SNNs. Unlike the native spike network, the transformed CNN preserves some spike-level information, so it may require 2 or 4 bits of computation. However, this approach allows exploiting the properties of CNNs, including their ability to extract features from large datasets. Both types of networks can be updated at the edge using STDP. In the case of Mercedes-Benz, this might mean retraining the network after deployment to discover more or different keywords.
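To make the CNN-to-SNN conversion idea above a little more concrete, here is a minimal sketch of quantising one layer's ReLU activations into low-bit spike counts; the 4-bit depth and the per-layer max scaling are illustrative assumptions, not BrainChip's actual conversion flow.

```python
import numpy as np

# Map a trained CNN layer's non-negative ReLU activations onto small integer
# spike counts (low-precision "spike magnitudes") for event-style processing.
def relu_to_spike_counts(acts: np.ndarray, bits: int = 4) -> np.ndarray:
    """Quantise activations to integer spike counts in [0, 2**bits - 1]."""
    levels = 2 ** bits - 1
    scale = acts.max() if acts.max() > 0 else 1.0   # per-layer scaling assumption
    return np.clip(np.round(acts / scale * levels), 0, levels).astype(np.uint8)

acts = np.array([0.0, 0.2, 0.7, 1.3])       # example ReLU outputs from one layer
print(relu_to_spike_counts(acts, bits=4))   # -> [ 0  2  8 15]
```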
According to Autocar, Mercedes-Benz confirmed that "many innovations" from the EQXX concept car, including "specific components and technologies," will be used in the production model. There's no word yet on whether new Mercedes-Benz models will feature artificial brains.
 
  • Like
  • Fire
  • Love
Reactions: 38 users

equanimous

Norse clairvoyant shapeshifter goddess
I'm having trouble getting my head around this:

https://www.islinc.com/isl-selected-in-first-round-of-armys-xtechsearch7

ISL was selected as a winner in the first round of the Army’s xTechSearch7 (https://www.arl.army.mil/xtechsearch/competitions/xtechsearch7.html ). We proposed a novel Unmanned Air System (UAS) that leverages neuromorphic computing to autonomously search and track particular ground vehicles using minimal RF communications. In the Army CONOPS, this UAS can also be used to autonomously detect and track military vehicles employing Concealment, Camouflage, Deception (CCD). The associated commercial UAS system is intended for first responders for a variety of applications including wide area search in an Amber Alert scenario.



BrainChip selected by U.S. Air Force Research Laboratory to develop AI-based radar​

2023-02-11 16:21 HKT

One for the Army and one for the Air force????
Concealment, Camouflage, Deception (CCD)

 
  • Wow
  • Like
  • Fire
Reactions: 9 users

Diogenese

Top 20
Concealment, Camouflage, Deception (CCD)




WO2019161478A1 DISPLAY SYSTEM



The present invention relates to display systems that use materials made from various arrangements of lenses and other optical materials. Careful design and use of these materials can be used to achieve display systems with many desirable visual effects having applicability in image and video displays, virtual reality, immersive environments, as well as in architecture, art, entertainment, and interactive systems.
 
  • Like
  • Love
Reactions: 11 users

Dhm

Regular
French speaking clues of love, maybe Brainchip too! In the middle of this fancy meeting, the guy sends it to France!!! Just does not happen in the USA. Big Clue of love.

From 33:58 French, very unique. Yeah Brainchip the smarts behind.



I have argued this point to death over the last month, thinking that we are going to deliver, but as @chapman89 recently relayed, we are not in the latest Prophecy products. And it kills me to say it.


 
Last edited:
  • Like
  • Sad
  • Love
Reactions: 13 users

MDhere

Regular
I'm trying to organise another couple of trips to Sydney and Melbourne. Does anyone know the exact date of the next AGM? And are any Sydneysiders free on the evening of 20 March, or Melburnians from 21 to 24 March?
 
  • Love
  • Like
Reactions: 2 users

Diogenese

Top 20
I'm having trouble getting my head around this:

https://www.islinc.com/isl-selected-in-first-round-of-armys-xtechsearch7

ISL was selected as a winner in the first round of the Army’s xTechSearch7 (https://www.arl.army.mil/xtechsearch/competitions/xtechsearch7.html ). We proposed a novel Unmanned Air System (UAS) that leverages neuromorphic computing to autonomously search and track particular ground vehicles using minimal RF communications. In the Army CONOPS, this UAS can also be used to autonomously detect and track military vehicles employing Concealment, Camouflage, Deception (CCD). The associated commercial UAS system is intended for first responders for a variety of applications including wide area search in an Amber Alert scenario.



BrainChip selected by U.S. Air Force Research Laboratory to develop AI-based radar​

2023-02-11 16:21 HKT

One for the Army and one for the Air force????
The timelines are a little fuzzy.

The submissions for the Army project closed in November 2022 ...

https://www.arl.army.mil/xtechsearch/competitions/xtechsearch7.html

Total Money Offered: Prize Money: Up to $800K; Phase I SBIR Award: Up to $2.5M
Challenge Topic: Open-Topic; Army Modernization
Partner Agency: The Office of the United States Assistant Secretary of the Army (Acquisition, Logistics, and Technology) (ASA(ALT)
Submission Dates: September 27, 2022 — November 6, 2022
Winner Announced: June 30, 2023
Who Can Submit: United States-based sole proprietors and small businesses


... but the BrainChip/ISL association was not announced until 20220109 in relation to the AFRL.

https://brainchip.com/information-systems-labs-early-access-program/

Information Systems Labs Joins BrainChip Early Access Program​

Laguna Hills, Calif. – January 9, 2022 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), a leading provider of ultra-low power, high performance artificial intelligence technology and the world’s first commercial producer of neuromorphic AI chips and IP, today announced that Information Systems Laboratories, Inc. (ISL) is developing an AI-based radar research solution for the Air Force Research Laboratory (AFRL) based on its Akida™ neural networking processor.

But this announcement uses the present tense:
"(ISL) is developing an AI-based radar research solution for the Air Force Research Laboratory (AFRL) based on Akida",
which indicates a previously established relationship as of 20220109.

So, while the Army/ISL announcement does not expressly refer to Akida, it is possible, even likely, that ISL was involved with BRN at the time they entered the Army programme, as ISL had joined the EAP and begun the AFRL work some time before 20220109, well before Army entries closed in November 2022.
 
  • Like
  • Fire
Reactions: 13 users

Diogenese

Top 20
I have a feeling thats an old one that pops up under a current date every couple of months ?
Hi Terroni,

You are right, the Hong Kong timestamp indicates it is a rehash. However, the concurrent ISL/Army announcement may well involve Akida.
 
  • Like
  • Fire
Reactions: 14 users

Diogenese

Top 20
Concealment, Camouflage, Deception (CCD)


Thanks equanimous,

The penny dropped - ISL were flying under the radar, under the EAP NDA cloak of invisibility, until 20220109, so we did not get a mention re the Army project.
 
  • Like
  • Haha
Reactions: 11 users
D

Deleted member 118

Guest
Maybe one of our 1st lines of true revenue

 
  • Like
  • Fire
Reactions: 9 users

Glen

Regular
I'm having trouble getting my head around this:

https://www.islinc.com/isl-selected-in-first-round-of-armys-xtechsearch7

ISL was selected as a winner in the first round of the Army’s xTechSearch7 (https://www.arl.army.mil/xtechsearch/competitions/xtechsearch7.html ). We proposed a novel Unmanned Air System (UAS) that leverages neuromorphic computing to autonomously search and track particular ground vehicles using minimal RF communications. In the Army CONOPS, this UAS can also be used to autonomously detect and track military vehicles employing Concealment, Camouflage, Deception (CCD). The associated commercial UAS system is intended for first responders for a variety of applications including wide area search in an Amber Alert scenario.



BrainChip selected by U.S. Air Force Research Laboratory to develop AI-based radar​

2023-02-11 16:21 HKT

One for the Army and one for the Air force????
They need to implement Akida fast so they can track all those Balloons 🎈
 
  • Like
  • Haha
Reactions: 15 users


Pmel

Regular
 
  • Like
  • Fire
  • Love
Reactions: 21 users