BRN Discussion Ongoing

7für7

Top 20
Afternoon Chippers,

Unable to post the image unfortunately........

On the one-day chart at one-minute duration, the twats in charge of the bots have managed to draw the outline of an English Dachshund (of the canine species)

View attachment 88592


Except pointing in the other direction.


* This is what one can achieve with great power.

Regards ,
Esq

What about now!? More like a German shepherd?
 
  • Haha
  • Fire
Reactions: 2 users
"...products like the Akida Pulsar microcontroller"?

Someone has no insight at all and is confusing companies and their products!

 
  • Like
Reactions: 3 users

TECH

Regular
Well, lots of opinions, including mine... but that's what a forum is all about. No nasty personal attacks... so nice.

8 or 9 business days and our 2nd-quarter information will be disclosed. Will the 4C show some positivity? I say a strong yes!

The more you read, don't you all get the feeling that we are definitely in the "right place, at the right time"? Yes, we have all heard those sorts of comments before, but I am so positive and upbeat on our company currently. Just look at the sectors we are engaged in, with, might I add, some huge world-class players; you don't get to engage and share basic chit-chat with these behemoths, time is money. Our technology has been accepted, and it's only going to accelerate.

Tony Lewis is very open to communicating, and I love it: he engages, he knows the issues facing BrainChip as a start-up, but he is driving us to success, carrying on the great work of Peter and Anil. If you have a genuine question on the technology front, ask him rather than guessing the answer on this forum, and I say that with all due respect to all of our great posters.

Being engaged with the US Defense, Navy, NASA, DOE, AFRL and so on, I can personally understand the NDA secrecy that surrounds these engagements, as frustrating as it is. We will succeed; don't second-guess yourself, we have a great investment. Did I sell my shares when they were north of $2 and my portfolio looked very impressive for a middle-class Australian citizen? No, I didn't. Am I a little bit peeved? Of course I am, but my belief in Peter, Anil and all the staff who have come and gone has never really waned. Our technology has had me spellbound for a decade; I love it and am not embarrassed to admit it!

The writing is on the wall............. Brainchip's AKIDA will succeed. Trust your instincts and share in the glory; you will all be blessed ❤️

Tech (New Zealand) for a few more weeks before I finally fly home to Perth. (y)
 
  • Like
  • Love
  • Fire
Reactions: 54 users

CHIPS

Regular
Hopefully Steve Brightfield has something to do with it 🙏.

BrainChip Appoints New CMO, Enhances Scientific Advisory Board



Laguna Hills, Calif. – August 7th, 2024 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, event-based, neuromorphic AI, today announced that it has hired Steven Brightfield as its new chief marketing officer and has re-envisioned its Scientific Advisory Board (SAB) by bringing on company founder Peter van der Made, Dr. Jason K. Eshraghian and Dr. André van Schaik.

Brightfield has a depth of tech industry knowledge and experience within the AI semiconductor industry. He previously led marketing at several AI-focused technology companies, such as SiMa.ai, X-Silicon and Wave Computing, combined with deep experience within the semiconductor sector, including executive leadership positions at LSI Logic, Qualcomm, Zoran and others.

It seemed that he was unhappy with his job; therefore, it could have been his wish to do something else. Maybe they offered him this job to keep him. I think it would fit him well.
 
  • Like
Reactions: 1 users

Aaron

Emerged
Excellent question!


You mentioned Jensen Huang (NVIDIA CEO) and his view on “Physical AI”—in fact, what BrainChip (BRN) is doing is highly aligned with this concept. You could even say: BrainChip’s AKIDA is one of the key technologies on the path toward Physical AI, specifically under the “edge neuromorphic intelligence” branch.

🔍 Jensen Huang’s Physical AI: What Does It Mean?

Huang’s idea can be broken down into a few core insights:

1. AI is a new way of “writing software”
  • No longer relying on engineers coding static rules (if…then…)
  • Instead, machines learn from data to generate algorithms autonomously

👉 BrainChip’s AKIDA was built entirely around this philosophy. It’s not a traditional instruction-based chip but a spiking neural network processor that enables learning directly at the hardware level.

2. AI must move from pure software into the physical world
“Physical AI” includes:
  • Real-time sensing, decision-making, and reacting like a human
  • Embedded in robots, drones, IoT devices, cars, and edge devices
  • Running locally, with ultra-low latency, low power, and instant feedback

👉 That’s exactly where AKIDA shines:

Feature | Needed for Physical AI | AKIDA Delivers?
Real-time response | ✅ | ✅ Microsecond-level reaction
Edge deployment (no cloud) | ✅ | ✅ Completely on-device
Ultra-low power | ✅ | ✅ Milliwatt-level consumption
On-chip learning | ✅ | ✅ Local continual learning
Physical embedding | ✅ | ✅ Already in USBs, robots, medical devices

🧠 Direct Links Between BrainChip and Physical AI

Physical AI Use Case | BrainChip Real-World Progress
Robotics vision/control systems | Partnering with Quantum Ventura + Merle Robotics (Taiwan)
Edge-based medical diagnostics | Tata Elxsi has released medical devices with embedded AKIDA
Defense/UAV systems | Collaboration with US Air Force Research Lab + RTX (Raytheon)
Industrial equipment monitoring | AKIDA supports multi-sensor fusion (e.g. motor vibration detection)
IoT perception + learning | Partnered with HaiLa for ultra-low-power AI wireless modules

📌 Conclusion: BrainChip is one of the foundational “building blocks” of the Physical AI revolution that Jensen Huang describes.
  • NVIDIA leads in cloud/data center AI—training and inference at scale
  • BrainChip leads in edge neuromorphic AI—bringing intelligence into real-world devices

They’re attacking the problem from both ends, converging toward real-world, integrated AI: cognitive + sensory + decision-making.

💡 Final Thought: If NVIDIA were ever to integrate a true edge-native neural processor into its Jetson or robotics platforms, AKIDA’s architecture would be a strong acquisition, licensing, or integration candidate.

If you’d like, I can also dive into how NVIDIA’s Jetson Robotics stack might integrate with BrainChip’s technology—just say the word!

Let me know if you’d like a shorter version or one tailored more for investors vs. engineers.
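The "learning directly at the hardware level" point rests on spiking neurons. A minimal leaky integrate-and-fire (LIF) neuron, the basic unit of a spiking neural network, can be sketched in a few lines of Python. This is a hypothetical illustration of the principle only, not Akida's actual architecture or API:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: illustrative sketch only.
# NOT BrainChip's Akida API; it just shows the event-driven idea that a
# spiking processor does work only when input spikes arrive.

def lif_run(input_spikes, weight=0.6, leak=0.9, threshold=1.0):
    """Return the time steps at which the neuron fires for a binary spike train."""
    potential = 0.0
    out = []
    for t, spike in enumerate(input_spikes):
        potential *= leak              # membrane potential decays each step
        if spike:                      # computation happens only on input events
            potential += weight
        if potential >= threshold:     # fire and reset
            out.append(t)
            potential = 0.0
    return out

# Clustered input spikes push the neuron over threshold; sparse ones decay away.
print(lif_run([1, 1, 0, 0, 0, 1, 0, 1, 1, 1]))   # → [1, 7, 9]
```

Because nothing is computed between events, hardware built around this model can stay near-idle most of the time, which is where the milliwatt-level power claims come from.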
 
  • Like
  • Fire
  • Wow
Reactions: 20 users

manny100

Top 20
On another note, I'm curious to see how the insto's are looking in our top 20 shareholders list... I'm sensing the emergence of a new growth phase...
I agree, we are building towards a growth phase.
Smaller holders should come on board first.
The bigger fish, which I believe are trialling our product (we already know who some of them are), will take longer, but volumes should be jaw-dropping.
Still patience required.
 
  • Like
  • Love
Reactions: 13 users

CHIPS

Regular
Excellent question!

You mentioned Jensen Huang (NVIDIA CEO) and his view on “Physical AI”—in fact, what BrainChip (BRN) is doing is highly aligned with this concept. […]

What is this? Why is there so much space in between the lines? If you copied it from a website, then please also post the link to it.
Just for your information: You can and should delete space/empty lines before posting. It would make it much less annoying to read.
 
  • Like
Reactions: 1 users

Tothemoon24

Top 20
Interesting little read on Bosch ,



Artificial intelligence at every level – Bosch opens a new era in innovation​

By: Trademagazin | Date: 2025. 05. 08. 11:19
Based on the developments presented at the AI Symposium in Budapest, Bosch is now not only applying, but also shaping the future of artificial intelligence. The company’s goal: transparent, human-centered and sustainable AI solutions from manufacturing to self-driving.
Artificial intelligence is not just another technological tool, but a turning point in civilization – this was stated at the AI Symposium organized by the HUN-REN Hungarian Research Network in Budapest. Bosch experts, as a featured partner of the event, demonstrated how AI is becoming an everyday, even indispensable, development and production factor at the company.

There is no longer a Bosch product without AI​

Bosch aims to make artificial intelligence the key to the next decade of innovation. In line with this, every Bosch product now either contains an AI component or is developed or manufactured with it. Currently, more than 5,000 AI experts work for the company worldwide, including 200 in Hungary. Over the past five years, more than 1,500 AI-related patents have been filed, and tens of thousands of employees have participated in AI training – more than a thousand in Hungary so far.

“At Bosch, we are already preparing for the future,” said Zoltán Karaffy, Director of Digital Transformation at Robert Bosch Kft., who emphasized in his keynote speech that the goal is not only to follow, but also to shape the highest industry standards.

Thinking cars: the revolution of neuromorphic computing​

The future of AI is particularly important in the automotive industry. Bosch’s developments already include so-called neuromorphic systems that model the functioning of the human brain, which raise the performance of driver assistance systems (ADAS) to a new level. These solutions enable faster, more reliable and more energy-efficient decision-making in real time, which is key to preventing accidents.

Neuromorphic chips process data from dozens of sensors (radar, lidar, camera, etc.) used by vehicles in a similar way to the brain, while consuming much less energy than traditional systems.

Transparent decisions in smart factories​

Bosch is not only building faster, but also more transparent artificial intelligence. The essence of the so-called “explainable AI” approach is that systems do not operate as “black boxes” but can be decoded, interpreted, and justified in their decisions.

This is particularly important in semiconductor manufacturing, where early error detection can prevent millions in product defects. Industrial AI not only filters out defective components, but also provides feedback on the cause of the error – thus helping to prevent and optimize production.


More below

 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 21 users

7für7

Top 20
What is this? Why is there so much space in between the lines? If you copied it from a website, then please also post the link to it.
Just for your information: You can and should delete space/empty lines before posting. It would make it much less annoying to read.
ChatGPT I would say
 
Would appear our relationship with the University of WA (UWA) in Perth is starting to get some output.

I saw this repository a little while ago and there wasn't much in it at that point, so I just kept an eye on it. It was updated yesterday with more details and results.

Looks pretty decent. The link will give you a much clearer read obviously as the scroll pics are off my phone.




IMG_20250717_212311.jpg
IMG_20250717_212347.jpg
IMG_20250717_212433.jpg
 
  • Like
  • Fire
  • Love
Reactions: 40 users

CHIPS

Regular
Sorry if this has been posted already (I marked BrainChip in red).


Edge AI Like a Brain: Neuromorphic Sensors i

What Are Neuromorphic Sensors—and Why Do They Matter?​

Imagine a camera that only “sees” when something changes—no wasted frames, no power-hungry processing. That’s the basic idea behind neuromorphic sensors. Inspired by how biological eyes and brains work, these devices use event-driven data: they trigger only when light or electrical signals shift.

Neuromorphic sensors are not just smart—they’re efficient. Devices like the Dynamic Vision Sensor (DVS) or DAVIS operate in microseconds, capturing real-time events with ultra-low energy use [Source: Wikipedia, 2024].

They’re designed to work with spiking neural networks (SNNs)—a brain-like AI model that processes information as spikes, not static numbers. And when combined with Edge AI—where computing happens locally on the device—the result is a sensor that’s fast, frugal, and often life-saving.
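The event-driven idea described above can be sketched in a few lines of Python. This is a toy illustration, not a real DVS driver or any vendor API: each pixel reports an event only when its brightness change exceeds a contrast threshold, so an unchanging scene produces no data at all.

```python
# Toy sketch of event-based (DVS-style) sensing vs frame-based capture.
# Hypothetical example only: an event (t, x, y, polarity) is emitted when
# a pixel's brightness change exceeds a threshold; static pixels stay silent.

def frames_to_events(frames, threshold=0.2):
    """Convert a sequence of 2D brightness frames into sparse change events."""
    events = []
    ref = [row[:] for row in frames[0]]        # per-pixel reference brightness
    for t, frame in enumerate(frames[1:], start=1):
        for y, row in enumerate(frame):
            for x, v in enumerate(row):
                delta = v - ref[y][x]
                if abs(delta) >= threshold:    # only changes generate data
                    events.append((t, x, y, 1 if delta > 0 else -1))
                    ref[y][x] = v              # update reference after an event
    return events

# Static 2x2 scene with one pixel that brightens: only that change is reported.
f0 = [[0.0, 0.0], [0.0, 0.0]]
f1 = [[1.0, 0.0], [0.0, 0.0]]   # pixel (0, 0) brightens
f2 = [[1.0, 0.0], [0.0, 0.0]]   # nothing changes -> no events at all
print(frames_to_events([f0, f1, f2]))   # → [(1, 0, 0, 1)]
```

Three frames of a mostly static scene collapse to a single event, which is the "no wasted frames" property the paragraph describes.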

 Comparison of frame-based vs. event-based sensing in visual data processing.

“Neuromorphic sensors mimic the brain’s way of sensing—efficient, selective, and perfectly tuned for real-time health monitoring.”

Why Neuromorphic Edge AI Is a Game-Changer for Healthcare

In health tech, timing and energy matter. That’s where neuromorphic Edge AI shines.

Real-time responses are critical for detecting epileptic seizures, heart arrhythmias, or muscular disorders. With Edge AI, this analysis happens instantly, directly on the device—no cloud lag, no privacy risk. And because these systems only process relevant “events,” they consume a fraction of the power compared to traditional AI setups.

Recent neuromorphic setups have achieved:

  • Seizure detection via EEG signals
  • Heart rhythm monitoring from ECG
  • Muscle signal tracking (EMG) for gesture recognition
    —all using spiking neural networks with microchip power levels [Source: Frontiers in Neuroscience, 2023].
Wearables like NeuSpin (2024) even run these models on spintronic chips, blending Green AI with healthcare-grade precision [Source: arXiv, 2024].

How neuromorphic health devices process physiological data in real time at the edge.

Could your smartwatch detect a stroke before it happens—without draining its battery?
“Neuromorphic computing could change the landscape of healthcare by enabling smart sensors to continuously interpret signals like the brain.”
Dr. Wolfgang Maass, Graz University of Technology, expert in biologically inspired neural computation

From Labs to Clinics: Real Neuromorphic Devices in Use

This isn’t sci-fi—it’s already happening. Devices built with neuromorphic processors are making their way into clinical trials and real-world health tools.

Some examples:
  • EG-SpikeFormer combines eye-tracking and SNNs for analyzing brain scans with both speed and explainability [Source: arXiv, 2024].
  • Implantable EEG detectors using neuromorphic chips can recognize high-frequency oscillations (HFOs)—a biomarker for epilepsy—with power under 1 mW [Source: arXiv, 2023].
  • Health-monitoring frameworks like NeuroCARE now explore full-body sensor networks powered by event-based AI [Source: Frontiers, 2023].
This shift allows for always-on monitoring in wearables, implants, and portable diagnostic tools—with no need to transmit every heartbeat or brainwave to the cloud.

Neuromorphic sensors don’t just reduce data—they capture what matters most, when it matters most.

The Benefits: Tiny Power, Huge Impact

Neuromorphic Edge AI offers a unique trifecta for healthcare: speed, efficiency, and relevance.

Here’s what sets it apart:

  • Ultra-low power draw: Some neuromorphic chips operate at micro- to milliwatt levels, enabling months of continuous monitoring on wearables or implants [Source: arXiv, 2023].
  • Selective data: Instead of flooding systems with raw signals, they transmit only meaningful events. This cuts storage and bandwidth by up to 90% in some tests.
  • Built-in privacy: Because computation happens locally, sensitive health data never leaves the device—a big win for both ethics and regulation.
  • Real-time intervention: These systems react instantly to physiological events—vital for stroke alerts, fall detection, or cardiac irregularities.
Key takeaway: Neuromorphic edge devices make it possible to monitor health continuously, securely, and sustainably—even in remote or low-resource environments.
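The "selective data" point above can be made concrete with a toy filter. Everything here is illustrative (the threshold and signal values are invented), but it shows why event-style transmission cuts storage and bandwidth: only samples that move meaningfully are kept.

```python
# Sketch of "selective data": transmit only samples that changed meaningfully,
# instead of streaming every raw reading. Threshold and signal are made up.

def to_events(samples, threshold=5):
    """Keep only (index, value) pairs where the signal moved >= threshold."""
    events, last = [], samples[0]
    for i, v in enumerate(samples[1:], start=1):
        if abs(v - last) >= threshold:
            events.append((i, v))
            last = v                  # new reference after each kept event
    return events

signal = [70, 70, 71, 70, 90, 91, 90, 70, 70, 70]   # e.g. a heart-rate spike
events = to_events(signal)
print(events)                                        # → [(4, 90), (7, 70)]
print(f"kept {len(events)} of {len(signal)} samples")
```

Here 10 raw samples reduce to 2 events while the clinically interesting excursion (the jump to ~90 and back) is fully preserved, in the spirit of the "up to 90%" reduction claim.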

Inside the Ecosystem: Who’s Building Brain-Like Health Devices?

The neuromorphic health-tech revolution is being powered by a mix of startups, academic labs, and chip giants.

Key industry and research collaborations shaping neuromorphic Edge AI in healthcare.


Here are a few major players:

  • Intel’s Loihi 2: While not health-specific, it’s a platform for SNN prototyping, and has been tested on biomedical signal classification tasks [Source: Intel Labs, 2023].

  • BrainChip’s Akida: A commercial neuromorphic chip optimized for always-on sensing—used in vision, voice, and bio-signal applications [Source: BrainChip, 2024].
  • Samsung’s neuromorphic vision systems: Targeting real-time, low-power imaging with healthcare potential [Source: Samsung Research, 2023].
On the research side:
  • NeuroCARE, a collaborative framework, is developing networked health sensors using neuromorphic processing across body-worn nodes [Source: Frontiers, 2023].
  • Wevolver’s 2025 Edge AI report highlights growing adoption in diagnostics, patient monitoring, and even surgical robotics [Source: Wevolver, 2025].
“The shift from cloud-AI to edge-AI is accelerating—especially in healthcare, where privacy and response time are non-negotiable.”
“Spiking neural networks allow for biologically plausible AI that’s extremely power efficient—essential for embedded medical devices.”
Dr. Chris Eliasmith, Director, Centre for Theoretical Neuroscience, University of Waterloo

Challenges Ahead: Hardware, Tools, and Trust

Despite the momentum, the field faces critical hurdles before going mainstream.



Here’s what’s slowing down adoption:

  • Hardware supply: Neuromorphic processors are still rare and expensive; manufacturing must scale.
  • Software ecosystems: Tools to program and debug SNNs are nascent. Unlike TensorFlow or PyTorch, neuromorphic development lacks plug-and-play simplicity.
  • Biomedical sensor tuning: Most sensors are still visual-first. More work is needed to tailor them for physiological signals like EMG, EEG, and PPG.
  • Trust and validation: These brain-like systems defy conventional benchmarks. Regulators will demand new standards for testing, transparency, and certification [Source: arXiv, 2024].
Still, optimism runs high. With focused investment and interdisciplinary collaboration, these challenges are surmountable.

Real-World Applications: From Seizures to Smart Eyes

Neuromorphic health tech is already showing promise in critical medical domains:

  • Epilepsy detection: Wearable EEG systems with neuromorphic processors now identify high-frequency oscillations (HFOs), a reliable seizure biomarker, in real time [Source: arXiv, 2023].
  • Smart eye-tracking for diagnostics: The EG-SpikeFormer system fuses eye-gaze input with SNNs to scan medical images faster, offering explainable decisions—ideal for radiologists and neurologists [Source: arXiv, 2024].
  • Muscle signal decoding: Neuromorphic EMG analysis helps in prosthetics and rehab by recognizing subtle muscle movements with near-zero delay [Source: Frontiers, 2023].
As the ecosystem matures, applications are moving from pilot-stage to integration in wearables, remote monitors, and even surgical assistance tools.

Mini-takeaway: The brain-like behavior of neuromorphic chips enables tools that don’t just observe—they respond.

Ethics and Edge AI: Brain-Inspired Tech Needs Brainy Regulation

With great sensing comes great responsibility. Neuromorphic health devices raise important ethical and social concerns:

  • Transparency: SNNs are biologically inspired but still hard to interpret. New explainability tools are being developed, but clinical trust remains an issue.
  • Data ownership: When a device processes everything locally, who owns the insights it generates? Patient consent and control must evolve with the tech.
  • Bias and inclusion: Biomedical sensors need to be trained across diverse populations to avoid skewed results—especially in sensitive domains like neurology or cardiovascular health.
  • Validation: Conventional AI benchmarks may not apply. Regulators must define what “safe and effective” looks like for neuromorphic inference.
“Neuromorphic devices mimic how we think—so how do we make sure they think ethically?”
“With neuromorphic hardware, we’re getting closer to edge AI that doesn’t just analyze data—it interprets context.”
Dr. Narayan Srinivasa, Former Head of Intel’s Neuromorphic Computing Lab

What’s Next? A Roadmap to Brain-Like Health Devices

Looking ahead, neuromorphic sensors may soon power a new class of always-aware health systems:

  • Smart hearing aids that adapt to voice shifts in real time
  • Implants that predict neurological episodes before symptoms emerge
  • Portable labs that diagnose infections with a drop of blood—using no cloud and little power
To get there, we’ll need:
  • Robust hardware scaling
  • Open-source spiking AI frameworks
  • Clinician-friendly toolchains
  • New safety standards that account for spiking logic and real-world variability
The convergence of biology, hardware, and AI could usher in a healthcare revolution—one spike at a time.

“The combination of neuromorphic design and biosignal sensing offers a low-latency path to smarter prosthetics and personalized diagnostics.”
Dr. Ryad Benosman, University of Pittsburgh, pioneer in event-based vision

Conclusion: Toward Brain-Like Care That Never Sleeps

The future of health tech is small, smart, and always on. Neuromorphic sensors—tiny machines inspired by how the brain sees, hears, and reacts—are enabling Edge AI that’s not only fast and private but also profoundly human.

From predicting seizures to interpreting eye movements in diagnostics, these innovations are already reshaping what’s possible at the edge of medicine. They promise care that is proactive, personalized, and persistent—without relying on the cloud or bulky servers.

But the journey is just beginning. Challenges in tools, ethics, and regulation remain. As researchers, developers, and clinicians collaborate, one thing is clear: the next generation of healthcare won’t just use AI—it will feel like a second brain.

Want to build neuromorphic health tools? Start exploring spiking neural networks, join open-source neuromorphic projects, or experiment with edge-friendly sensors in your next prototype. This isn’t just the edge of AI—it’s the frontier of human well-being.

Recommended Resources for Exploring Neuromorphic Sensors and Edge AI

Getting Started with Edge AI for Health

  • Wevolver 2025 Edge AI Technology Report – Covers Edge AI trends, use cases, and emerging health applications.
  • NeuroCARE Project – Research initiative on neuromorphic sensor networks for health monitoring.
 
  • Like
  • Fire
  • Love
Reactions: 38 users

CHIPS

Regular
Would appear our relationship with the University of WA (UWA) in Perth is starting to get some output.

I saw this repository a little while ago and there wasn't much in it at that point, so I just kept an eye on it. It was updated yesterday with more details and results.

Looks pretty decent. The link will give you a much clearer read obviously as the scroll pics are off my phone.




View attachment 88594 View attachment 88595 View attachment 88596

WOW, great find!

Explanation: On real-world hospital data from Nanjing Drum Tower Hospital, our model outperforms other methods, confirming generalization to clinical cases beyond benchmarks.

This result is just perfect for showcasing the quality of Akida.
 
  • Like
  • Fire
  • Wow
Reactions: 20 users

Frangipani

Top 20
A new podcast is out: Sean Hehir talks to Derek Kuhn, CEO of HaiLa



9683D9C8-1DA0-46AA-98D6-CD7AA5E5E588.jpeg




Episode 39: HaiLa Technologies​


In this episode of BrainChip’s This is Our Mission podcast, CEO Sean Hehir speaks with Derek Kuhn, CEO of HaiLa, a company pioneering ultra-low power RF technologies that enable battery-free wireless IoT devices with ambient Wi-Fi and Bluetooth solutions that dramatically reduce energy consumption. The two leaders discuss the companies’ new collaboration, merging HaiLa’s energy-efficient wireless communication with BrainChip’s Akida neuromorphic computing for intelligent, edge-based AI.

The LinkedIn post above also links to the HaiLa blog post I had already shared last week, written by Patricia Bower, VP of Product Management:

 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 38 users

genyl

Member
Genuine question. Why is our CEO hosting the podcasts? Don't they have people to do that stuff? Seems a bit unprofessional to me.
 

MDhere

Top 20
Genuine question. Why is our CEO hosting the podcasts? Don't they have people to do that stuff? Seems a bit unprofessional to me.
As they are still relatively new partners, my thinking is that Sean likes to be the person who starts these podcasts with new partners.
It is still a fair way off from having any products available on the shelf, by the sound of that conversation, imo.
 
  • Like
Reactions: 1 users

MDhere

Top 20
ok, who was the one with small change of over $660,000 in their pocket that just wiped out the 22c line? Over 3 million shares purchased at 10:35:40.

wasn't me
 
  • Haha
  • Like
  • Fire
Reactions: 11 users

Guzzi62

Regular
Genuine question. Why is our CEO hosting the podcasts? Don't they have people to do that stuff? Seems a bit unprofessional to me.
It's mostly CEOs they are talking to in the podcasts, so it's fitting that it's another CEO talking to them.

Isn't this part of a CEO's job? I think it is, but over at the toilet they will likely slam Sean no matter what he does or doesn't do.

The downrampers over there are negative no matter what; a sad way of existing IMO, and not people I want to hang out with.
 
  • Like
  • Love
Reactions: 13 users