BRN Discussion Ongoing

jtardif999

Regular
I don’t think Akida has anything to do with this supercomputer. And if so, it’s nothing you can sell in mass production. It would again be a pump, a Christmas present for shorters
What if it is Akida and has the potential to up-end data centre computing, then it would be a present we will all enjoy next year. 🤓
 
Last edited:
  • Like
  • Fire
Reactions: 7 users

Bravo

If ARM was an arm, BRN would be its biceps💪!

How the Cars of Tomorrow Will “See”: The Future of Lidar and Radar

Emerging lidar technology and tried-and-true radar are working together to help cars navigate without humans.



Tim Stevens
Dec 13, 2023
We are in the middle of the most radical technological transformation ever seen in the history of the automotive industry. Not only are we powering through a wholesale shift from internal combustion power to electrification, but every new-generation car that hits dealerships is also that much better at essentially driving itself than those that came before.


Yes, we're well behind the curve toward true autonomous driving that many were predicting five or six years ago. But still, we've reached a point where roughly 70 percent of new cars sold offer some form of advanced driver assistance system, or ADAS, according to IDTechEx. Those systems are the subject of massive research and investment by every major auto manufacturer on the planet, with the suppliers for those companies working just as hard to keep us moving forward to a safer—and eventually driverless—future.

While software development, especially neural networks and other aspects of machine learning, tends to get the biggest headlines, the evolving ADAS systems to come will require ever more comprehensive sensors to see the world around them. Integrated radar sensors, only found in high-end cars a few years ago, are now nearly omnipresent at every price point. Soon, they'll be supplemented by lidar sensing and other technologies, enabling more cars to see more things more often from even farther away.

What can we expect from these sensors in tomorrow's cars? We talked to some of the industry's biggest suppliers and an automaker leading in the field to find out.

More Sensors in More Places

It's always difficult to predict the future, but one trend is so obvious that it's easy to project that line forward: Tomorrow's cars will offer more sensors than today's, and not by a small margin.
Every supplier we spoke with echoed this trend. "The expansion is crazy," Bhavana Chakraborty said. She's the engineering director for driving systems at Bosch, which has been producing radar sensors for 20 years. The adoption of basic ADAS, she said, like automatic emergency braking and adaptive cruise, is exploding: "If you look at 2016 to 2028, it is going from 10 percent to 100 percent installation for these features. And radar is one of the key components for it."
Tier 1 supplier Magna recently shipped its 75 millionth radar sensor. "If you go back five, 10 years, it was more of a niche product for luxury vehicles," Steven Jenkins, Magna's vice president of product line, ASI, and technology strategy, said. "With Level 3 cars, you start to get into five radars, and you get into potentially a five-camera solution plus one forward-looking lidar."

Volvo is an excellent example of this evolution. Volvo's current top-end XC90 luxury SUV has three radar sensors: one in the nose and two more at the rear, each pointing diagonally backward. The company's upcoming EX30, a small electric crossover SUV that starts under $35,000, raises the bar to five radar sensors.
The company's next large SUV, the electric EX90, has even more: five radar sensors plus eight cameras and, for the first time for Volvo, lidar. But Thomas Broberg, Volvo's senior technical adviser for safety, told us the intent isn't to use these sensors to shove even more information into the face of drivers.
"Our philosophy there is that we should support in such a way that it's not disturbing," Broberg said. "It should be there when you need it so that you don't even notice."
More sensors, in this case, means not only detecting more things but also being smarter about identifying real threats, filtering out false positives, and creating a safer, less distracting experience.

Next-Gen Radar

When it comes to the now-humble radar sensors, again, to predict the future, it's nice to look at the past. How have radar sensors improved since they first appeared on vehicles decades ago? "Size, resolution, accuracy, everything," Bosch's Chakraborty said.
The company's radar sensors have dropped from 600 grams to just 75 grams in weight. Meanwhile, their range has extended from 150 meters to 300, and their field of view has expanded from 6 degrees to 60. "It's almost like we used to look through a keyhole, and now we open the whole window," she said.
All these steps forward are happening while costs are decreasing, a huge reason why manufacturers can put so many radar sensors on a reasonably affordable car like the Volvo EX30. Those additional sensors are not only enabling smarter driver assistance and, eventually, autonomy but also adding more practical features.
Bosch's Chakraborty said the company is working on a rear-facing radar system that can detect the presence and size of trailers being towed and automatically extend the tow vehicle's blind-spot warnings to match.


However, one area that might not need further improvement: range. "Range-wise, we've seen some significant improvements over the past five to 10 years," Magna's Jenkins said. "The range is pretty optimal given the road scenarios we want to achieve today."
Radar fidelity, though, will continue to improve, with the most significant step forward coming from something called 4D, or imaging radar.
"Imaging radar is a really interesting thing," Jenkins said. "One of the fundamental problems with radar is objects that are close to each other, reflective surfaces." Imaging radar will allow for higher fidelity. "You can start to differentiate between bridges and fence posts."
Bosch is working on this technology, as well. "We are able to have separation of one degree," Chakraborty said. "So if you have a huge truck and a motorcycle, both driving very close to each other, you can still reliably separate them." She said a system like this could even detect and separately track a person falling from that motorcycle.
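A rough back-of-the-envelope sketch (not from the article) of what one degree of angular separation buys: the lateral spacing two targets need before the radar can resolve them grows linearly with range.

```python
import math

def min_separable_spacing(range_m: float, resolution_deg: float = 1.0) -> float:
    """Approximate lateral spacing (m) needed to resolve two targets
    at a given range with a given angular resolution."""
    return range_m * math.tan(math.radians(resolution_deg))

for r in (50, 100, 200):
    print(f"{r} m range: ~{min_separable_spacing(r):.2f} m of spacing needed")
```

At 100 m, roughly 1.7 m of spacing suffices, comfortably less than the gap between adjacent lane centres, which is consistent with the truck-and-motorcycle example Chakraborty describes.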
But despite these improvements, electrical consumption has not increased. "Power surprisingly has been the same; we're just able to do a lot more," Chakraborty said. As cars start to get more powerful, with ECUs capable of processing ever more raw data, the sensors can potentially simplify their own processing, which could get consumption down to as little as four watts per sensor, less than half the consumption of a household LED light bulb.
More advanced sensor logic will likely deliver an even more significant increase in efficiency, Jenkins said. "Keep adding additional sensors in the vehicle, and you just end up draining the battery," he said. But by being smarter about sensor use, for example perhaps only enabling the full suite of sensors when driving on a dense city street, power demands could be cut significantly.
Efficiency only becomes more important as we shift toward electric vehicles, which will be more power-sensitive than their internal combustion counterparts.
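To make the power argument concrete, here's a hypothetical budget sketch using the article's four-watt figure and the five-radar configuration mentioned earlier for Level 3 cars; the duty-cycle values are assumptions, not from the article.

```python
SENSOR_POWER_W = 4.0   # per-sensor draw cited in the article
NUM_SENSORS = 5        # five-radar setup mentioned for Level 3 cars

def radar_power(duty_cycle: float) -> float:
    """Total radar draw (W) if sensors are only active a fraction of the time."""
    return NUM_SENSORS * SENSOR_POWER_W * duty_cycle

print(radar_power(1.0))   # always-on suite: 20 W
print(radar_power(0.3))   # assumed 30% duty cycle on an open highway: ~6 W
```

Even the always-on figure is modest, but on a battery-electric vehicle every watt saved by duty-cycling translates directly into range.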
There's another place where low-power radar will come into play: within the cabin. Lower-frequency, 60GHz radar systems have been approved for use within the cabin, where they'll initially be used for things like occupant detection, ensuring that nobody is accidentally left inside a locked vehicle.

Volvo will be the first to deploy an internal radar system on a car with the EX90. "For the EX90, we're focused on no one left behind," Volvo's Broberg said. "We know that, for a fact, it's easy when you have a busy day that you forget, you're stressed, and we also know that the worst can sometimes happen. And of course, we want to try to avoid that."
But the potential is much higher. A big focus for Volvo, Broberg said, is not just avoiding accidents but preventing situations that could result in them. This involves better detecting when drivers aren't in an optimal state for driving, perhaps due to stress or intoxication. An internal radar system might help extend the company's existing driver monitoring sensors, better identifying things like slouching posture or other signs of reduced functionality.

This information might also help a next-gen airbag system fire more appropriately based on a person's position during the crash.

Next-Gen Lidar

Radar is a tried-and-true technology on the automotive front these days, with numerous suppliers offering sensors of varying sizes, costs, and capabilities, all hardened to survive the life of a vehicle. Lidar, on the other hand, is a newer technology from an automotive standpoint.
But that's changing. Early lidar applications for autonomy testing and development "were not considered what is called automotive-grade," Innoviz's chief business officer Elad Hofstetter said. Today, the company is developing sensors that, per Hofstetter, can survive 15 years of automotive exposure.
Innoviz is a global supplier of lidar technology to automakers including BMW, whose present focus is largely on extending the operational design domain, or ODD, of its sensors.

This means future sensors that can see through weather conditions that flummoxed many earlier lidar sensors. This limitation is a key reason why autonomous test vehicles traditionally run for home when the forecast turns sour. "It's one thing in good weather conditions to have the lidar work nicely, but life is not always like this," Hofstetter said. "We'll have rain, we'll have snow, we'll have mud, and so on. You want to ensure that your lidar is available almost all the time as much as possible."
Innoviz isn't just improving its sensors to better see through conditions like this. It also seeks to identify areas of the sensor that are blocked by snow or mud or other muck, preventing potential false positive sensor readings.
Dirt and grime on sensors is an increasing problem and a topic of continued research. Additional washer nozzles are a straightforward solution, but technologies like ActaJet's compressed air and even muck-resistant oleophobic coatings will play a part.
Still, there's a significant focus on reducing lidar's cost and increasing its quality. Lidar, being much newer to the automotive world, is still seeing rapid improvements. Hofstetter said early lidar sensors, the giant units installed on prototype self-driving machines, often cost thousands of dollars.
"Obviously, you cannot put such a lidar on a consumer vehicle," he said. "Since then, a little bit like Moore's Law, [cost has] been reduced year over year, year over year significantly." Nobody we spoke to wanted to talk about current sensor pricing in detail, but he said today's sensors are priced in the hundreds of dollars, and they'll only get cheaper.

And better. With each generation of its lidar sensors, Innoviz has seen costs drop by more than half while raising effective performance (resolution and field of view) significantly.
That cost reduction will lead to more sensors in more cars. Hofstetter said Innoviz expects to see as many as four or five lidar sensors embedded in future autonomous vehicles. "Eventually, you want a coverage of 360 [degrees] in some cases," he said, "[and] also redundancy because there are different occlusions and sometimes different aspects for performance."
That presents a new challenge: design integration.
Where radar sensors can often be hidden behind body panels and positioned low to the ground, lidar sensors need to see the world around them and, ideally, be positioned higher up. Hofstetter likened initial, roof-mounted lidar sensors to "a KFC bucket." Future lidars, he said, will be smaller, lighter, and even quieter, enabling them to be installed closer to or even within the passenger cabin. Automotive designers have "huge weight" in deciding sensor placement. "They don't design the vehicle around the lidar," Hofstetter said. "It goes the other direction."
And, just like we're seeing low-power radar sensors used in the cabin for occupant detection, lidar systems will start pointing inward as well. Systems like Gentex's In-Cabin Monitoring System can detect people left inside the cabin and even monitor their health.

Better Sensing Without Sensors?

While sensors continue to improve, one of the more compelling shifts for in-vehicle sensing might not require any new sensors at all. Instead, this trend has more to do with increasing communication speed between cars. With vehicle-to-vehicle (V2V) communication becoming more commonplace and with the bandwidth of that connection increasing thanks to technologies like 5G LTE, cars can potentially start sending far more comprehensive information to each other.
That means one car could provide detailed sensor data for other vehicles still miles away, enabling them to "see" obstacles and issues, even trends in vehicle speed, well beyond the reach of even the most advanced future sensors. Magna's Jenkins calls these "virtual sensors."
"Sensors in two different vehicles in two different places that communicate with each other via the cloud," he said.
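As an illustration only, a "virtual sensor" exchange could be as simple as cars publishing small structured hazard reports to the cloud. The `HazardReport` fields below are hypothetical, not taken from any real V2V message standard.

```python
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class HazardReport:
    """Hypothetical minimal payload one car might share via the cloud."""
    lat: float
    lon: float
    hazard: str        # e.g. "stopped_vehicle", "ice"
    speed_kph: float   # observed traffic speed at the hazard
    ts: float          # unix timestamp of the observation

# Car A observes a stopped vehicle and publishes a report.
report = HazardReport(lat=48.137, lon=11.575, hazard="stopped_vehicle",
                      speed_kph=12.0, ts=time.time())
payload = json.dumps(asdict(report))

# Car B, miles away, receives it long before its own sensors could see it.
received = HazardReport(**json.loads(payload))
print(received.hazard)
```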

After decades of promises, this aspect of V2V technology is finally becoming a reality. Chakraborty said Bosch has already captured over 2 billion kilometers of this sort of data, part of a system it calls Connected Map Services. Bosch has partnered with one major OEM to begin deployment in Europe soon.

Solutions like this could mean safer, smarter cars without the additional cost or burden of onboard sensors. It's a rare engineering win-win. More of these wins are sure to come as ADAS, and the lidar and radar systems it relies on, continue to develop.

 
  • Like
  • Love
  • Fire
Reactions: 30 users

IloveLamp

Top 20
"Huge array of new applications "

Screenshot_20231214_123612_Chrome.jpg
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 17 users

Bravo

If ARM was an arm, BRN would be its biceps💪!


Published 10 hours ago.

In this episode, Xan Fredericks is joined by Josh Nimetz and Cindy Thatcher to continue the conversation about what's new and now in the realm of remote sensing, specifically lidar. This discussion includes data suitability, increased efficiencies, considerations for accuracy, topographic lidar, bathymetric lidar and topobathymetric integrations, accuracy in integrated datasets, and what's next.
 
  • Like
  • Fire
Reactions: 4 users

IloveLamp

Top 20
  • Like
  • Fire
Reactions: 10 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Fire
  • Love
Reactions: 21 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Here are a couple of extracts from an older article discussing Mobileye and Valeo's partnership to deliver best-in-class software-defined radars. The radar technology has to be coupled with the sensors for it all to work.

Which IMO means coupling with Scala 3 lidar (AKIDA inside)


Screen Shot 2023-12-14 at 2.08.36 pm.png

Screen Shot 2023-12-14 at 1.57.12 pm.png

 
  • Like
  • Fire
  • Love
Reactions: 19 users

buena suerte :-)

BOB Bank of Brainchip
  • Like
  • Love
  • Haha
Reactions: 38 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Wonder if we're involved in this in some way, given the project is sponsored by AFRL. A detect-and-avoid system that leverages radar and sensor fusion is right down our alley.


Vigilant Aerospace secures military contract to develop detect-and-avoid system

November 8, 2023 - By Jesse Khalil

Est. reading time: 1 minute
Image: Vigilant Aerospace

Vigilant Aerospace Systems, a provider of multi-sensor detect-and-avoid safety systems for UAVs and advanced air mobility (AAM), has been awarded a contract by the U.S. Air Force to develop a detect-and-avoid system for the Air Force’s new long-endurance UAV.
According to the published project description, the objective is to “integrate a mature detect and avoid capability on an existing long-endurance, Group V UAS platform, for increased aircraft and pilot-in-the-loop operational awareness that leverages new and evolving C-SWaP sensors and sensor fusion software.”
The project is sponsored by the Air Force Research Lab (AFRL) and is a Small Business Innovation Research (SBIR) Phase II project through the SBIR program. The program is designed to bring dual-use technologies, which can help both civilian and military users, into the military, with a focus on high-impact, near-term implementations.
FlightHorizon is detect-and-avoid and airspace management software that combines data from aircraft transponders, radar, UAV autopilots and live Federal Aviation Administration (FAA) data to create a single picture of the airspace around a UAV. The software displays air traffic, predicts trajectories and provides avoidance commands to the remote pilot or autopilot. The system can be used on the ground or onboard the UAV and can be configured for any size of aircraft.
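As a toy illustration of the sensor-fusion idea (not Vigilant Aerospace's actual algorithm), combining transponder tracks with radar detections can start as simple nearest-neighbour association; real detect-and-avoid fusion is far more involved.

```python
import math

def fuse_tracks(transponder, radar, max_gap_m=200.0):
    """Merge radar detections into transponder tracks when they fall
    within max_gap_m of each other (toy nearest-neighbour association)."""
    fused = []
    for t in transponder:
        best = min(radar, key=lambda r: math.dist(t["pos"], r["pos"]), default=None)
        if best is not None and math.dist(t["pos"], best["pos"]) <= max_gap_m:
            # Radar confirms the transponder track; keep the fresher radar fix.
            fused.append({"id": t["id"], "pos": best["pos"], "source": "both"})
        else:
            fused.append({**t, "source": "transponder"})
    return fused

transponder = [{"id": "N123AB", "pos": (1000.0, 500.0)}]
radar = [{"pos": (1050.0, 520.0)}]
result = fuse_tracks(transponder, radar)
print(result)
```

The 53 m gap between the two fixes is within the assumed association gate, so the track is reported as confirmed by both sources.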

The software is based on two licensed NASA patents and the company has completed contracts with NASA, the FAA and a project with the USAF’s 49th Operating Group’s MQ-9 Reaper fleet to track training flights. It is designed to meet industry technical standards and to help UAS operators fly beyond visual line-of-sight (BVLOS).
The new Air Force project incorporates prior research and development by the company in solving the automatic self-separation and collision avoidance problem for UAVs. To evaluate sensors and algorithms and establish standards-compliance and risk ratios, the company has completed hundreds of hours of flight tests with the system and thousands of simulated aircraft encounters inside the software’s built-in simulation engine, according to Vigilant Aerospace Systems.

Screen Shot 2023-10-15 at 10.43.39 am (1).png
 
  • Like
  • Fire
  • Wow
Reactions: 10 users

Terroni2105

Founding Member
Please bring Rob back for the interviews… Nandan is painful. Talks in circles, condescending tone and highlights limitations. Couldn’t even listen to end.
I disagree. I thought Nandan answered the questions that were asked of him appropriately, and Sally let him talk freely because he was giving a lot of information and insight; it is clear he is a very knowledgeable fellow. I guess it depends on the listener, as I like the way he explains things.
 
  • Like
  • Love
  • Fire
Reactions: 38 users

TECH

Regular
What if Dr T Lewis's appointment as CTO was not only to replace Peter in his semi-retirement phase, but to further advance the development of BRN's Akida "Cortical Columns" research and product range, which Peter has been working on for some considerable time now ............ especially given, imo, the fact that Akida 2 (E, S & P) has already been launched and Akida 3.0 is in its final countdown mode, so imo there's probably very little involvement that will be or would be required from Dr T Lewis at this stage .. IMO, Dr T Lewis would have required some rather substantial technologically advanced development to lure him away from his last highly held position..... Accordingly, I believe that Dr T Lewis and Peter have their eyes well and truly set on the next biggest technological change, that being the further creation, refinement and deployment of Peter's own BrainChip Cortical Columns technology.

That's one of your better posts......you are right on the money in my view Xray1. I know for a fact that Peter not only chose Tony but has done some
work alongside him, and spoke in glowing terms of how Alan Harvey (SAB Perth) is doing some great work on Cortical Columns.

Peter is a visionary, his work will continue, a lot in his own private time, I know that he's comfortable taking a step back, but the vision and the
never give up attitude is still strong....the dream is still alive, and so should it be !!

CES will prove to be exciting for Brainchip, has anybody ever checked out Nextchip Co Ltd out of South Korea ?

💞 Brainchip.....Tech
 
  • Like
  • Love
  • Fire
Reactions: 29 users
What made you bring them up TECH?


Just the kind of "dark horse" that BrainChip would suddenly partner with..

In answer to the question of our Korean connection/s..

"It's Samsung"
"It's Hyundai"
"It's LG"..

BrainChip Announces partnership/IP deal with Nextchip

"Nextchip?? Who the hell are they??"
 
  • Like
  • Haha
  • Fire
Reactions: 10 users

Tothemoon24

Top 20
IMG_7950.jpeg



The ⌚ Hyfe CoughMonitor ⌚ is an AI-powered wrist-wearable that monitors cough passively and in real-time. It runs the Hyfe CoughMonitor software, counting 90% of coughs with just 1 false positive per hour, whilst patients go about their normal daily activities.

The watch is slim, comfortable and light enough to be worn for days, weeks or even months at a time. It preserves privacy by processing entirely on device so no sound is uploaded to the cloud, whilst cough frequency data is visible immediately via a dashboard on the web. The CoughMonitor is being used by thousands of patients & researchers around the world.
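To see what the quoted figures imply in practice, here's a small sketch: the 90% sensitivity and one false positive per hour come from the text above, while the patient's true cough rate is an assumption for illustration.

```python
def expected_counts(true_coughs_per_hr: float, hours: float,
                    sensitivity: float = 0.9, fp_per_hr: float = 1.0):
    """Expected true detections and false positives for the cited figures."""
    true_detected = true_coughs_per_hr * hours * sensitivity
    false_pos = fp_per_hr * hours
    return true_detected, false_pos

# e.g. a chronic-cough patient coughing 20x/hr, monitored for 24 h:
td, fp = expected_counts(20, 24)
print(td, fp)  # ~432 true detections vs 24 false positives (~5% of events)
```

For frequent coughers the false-positive contribution is small; for someone who rarely coughs, the same 1/hour rate would dominate the count, which is one reason trend-over-time metrics matter.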

For more information about how continuous cough-monitoring could help your research or clinical trials, schedule a 15 min call here:
 
  • Thinking
  • Like
  • Fire
Reactions: 10 users

Tothemoon24

Top 20
RELEASE
12 December 2023

New Hyfe AI Study Validates the Accuracy of AI-Powered Cough Detection Against Human Annotators

6578abb0e787adda2fba983f_labeler%20intra%20inter%20agreement.png

Wilmington, Del., December 12, 2023 -- Hyfe, Inc., the global leader in AI-powered cough detection, tracking, and classification, today announces breakthrough results from a performance evaluation study comparing human cough annotators to emerging automated technologies using AI and machine learning. The BMJ Open Respiratory Research study demonstrates Hyfe’s cough detection AI technology is nearly as accurate as the human ear in analyzing the amount and duration of cough in a real-world environment.
The study analyzed 40 hours of audio randomly collected from participants wearing audio recording smartwatches throughout an average day. The audio samples were manually reviewed and annotated twice by one expert human annotator and some samples of the 40 hours were annotated a third time by six of Hyfe’s expert annotators. Key insights from the study include:
  • After evaluating several ways to track cough, researchers found that tracking cough by cough seconds, which are defined as any second of time that contains at least one cough, rather than counting individual coughs, decreased annotator discrepancy by 50 percent.
  • Compared to counting individual coughs, the study proposes that using cough rate over time is a more clinically relevant and reproducible performance metric for evaluating automatic systems.
  • Hyfe’s labeling software was reported by labelers to be easy to use, an improvement over Audacity, and likely to lead to fewer errors in data management.
  • The study provides guidance for researchers and developers working on these technologies and has the potential to lead to more reliable and consistent automatic cough monitoring tools.
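The "cough second" metric described above is straightforward to sketch, assuming the detector emits a timestamp per detected cough (a hypothetical interface, not Hyfe's actual API):

```python
def cough_seconds(timestamps: list[float]) -> int:
    """Count distinct seconds containing at least one cough:
    the 'cough second' metric described in the study."""
    return len({int(t) for t in timestamps})

# Three rapid coughs inside the same second count once, so annotators
# no longer disagree about whether a burst was one cough or three.
events = [10.1, 10.4, 10.9, 12.3]
print(cough_seconds(events))  # 2 (individual-cough counting would say 4)
```

Collapsing bursts this way is exactly why the study reports a 50 percent drop in annotator discrepancy: the ambiguous within-second boundaries disappear from the metric.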
Notably, the study is also the first to observe sex differences in cough, finding that the duration of cough sounds and epoch size differed between male and female participants. The study found that women tend to have shorter cough sounds but more coughs in each episode compared to men. These observed differences in cough characteristics could have important implications for the development of cough monitoring tools and will be key to better understanding how diseases spread, diagnosing illnesses, and how people seek medical help.
“We’re thrilled that the results of this study confirm the accuracy of our robust automated cough monitoring technology,” said Joe Brew, CEO of Hyfe AI. “These findings pave the way for more accurate, reliable, and clinically relevant methods for cough tracking. They provide valuable insights that could significantly impact the much-needed development for monitoring coughs in healthcare and clinical trial settings. These are breakthrough discoveries in the chronic cough space and we look forward to taking an even deeper look into cough in future studies like this.”
Hyfe is the leader in longitudinal cough monitoring with over 700 million sounds in its cough database. It can track and detect cough with 90%+ accuracy on any mic-enabled device with no need for patient intervention while preserving patient privacy.

For more information, visit hyfe.ai.

About Hyfe
Hyfe, Inc. is the global leader in AI-powered cough detection and classification that provides insight into cough patterns and correlations and is being widely used to help patients gain a better understanding of their cough and have more informed conversations with their providers. With more than 700 million samples, Hyfe maintains the largest cough dataset in the world enabling the building of powerful models to track, manage and diagnose respiratory illnesses. Hyfe provides platforms and data for pharmaceutical companies, medical researchers, government agencies, health care providers and patients and has partnerships with leading academic institutions including Johns Hopkins University and the University of California at San Francisco. The company was founded in 2020 and is headquartered in Wilmington, Delaware. More information is available at Hyfe.ai, on social media @hyfeapp and LinkedIn at /hyfe.
 
  • Fire
  • Like
Reactions: 3 users
I must be missing something here..

Am I now in an alternate reality, where people don't know whether they are coughing or not 🤔..
 
  • Haha
  • Like
  • Thinking
Reactions: 7 users

Diogenese

Top 20
I must be missing something here..

Am I now in an alternate reality, where people don't know whether they are coughing or not 🤔..
If I'm all alone in a forest ... and a tree falls on me ...
 
  • Haha
  • Like
Reactions: 16 users

IloveLamp

Top 20
  • Like
  • Love
  • Fire
Reactions: 13 users

IloveLamp

Top 20
Screenshot_20231214_220841_LinkedIn.jpg
 
  • Like
  • Fire
Reactions: 11 users
Top Bottom