BRN Discussion Ongoing

Frangipani

Top 20
A paper by three researchers from Norway titled “Using neuromorphic computing in prediction of GABA* concentration - a pilot study” was published a week ago.
*GABA = gamma-aminobutyric acid

While the authors, from the University of Oslo’s Department of Physics and Oslo University Hospital’s Department of Clinical and Biomedical Engineering respectively, were not convinced by the AKD1000, they are still hoping to do future work with Akida 2.0:

“Discussion
The CNN architecture performed well on the regression task while the Akida CNN2SNN exhibited significantly worsened performance due to dataset characteristics and hardware limitations. The Akida platform could only convert CNNs to SNNs and has a 4-bit precision limit for input layers, constraining convolutional analysis of permittivity data across GABA concentrations.
These findings underscore the significance of thoughtfully selecting network architecture and optimization methods to attain peak performance in SNNs.
Future work should explore the enhanced capabilities of the Akida 2 generation, including higher precision, programmable activations, and support for temporal and event-based neural networks, to better accommodate complex data.
Akida’s FPGA-based design limits network size and scalability [13], and the absence of standardized benchmarks complicates performance evaluation. Nevertheless, it remains promising for edge and real-time applications.

[…]

The domain of neuromorphic computing poses a substantial challenge for neophytes, necessitating an in-depth comprehension of various interconnected domains, including neuroscience, nano-electronics, and computer science. Nonetheless, with persistent research and development efforts, the Akida technology holds the potential to ascend as a preeminent neuromorphic computing paradigm, proffering more efficient solutions for real-time processing applications.

[…]

Conclusion
Our experiment revealed that CNN architecture, optimized structurally, showed the best performance in predicting GABA concentration from permittivity data. Akida’s CNN to SNN framework underperformed due to dataset characteristics and Akida’s limitations. Despite challenges in applying neuromorphic computing to impedance-based sensory systems, these technologies hold promise for revolutionizing various applications, including real-time processing of extensive physiological data.”
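
To illustrate why the 4-bit input limit the authors mention is so punishing for continuous permittivity data, here is a minimal numpy sketch of my own (not from the paper; the array shape and value range are made up) comparing uniform 4-bit and 8-bit quantisation of hypothetical permittivity spectra:

# Illustrative only: shows how a 4-bit input constraint coarsens continuous data.
# Shapes and value ranges are hypothetical, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
permittivity = rng.uniform(2.0, 80.0, size=(8, 128))  # 8 made-up spectra, 128 points each

def quantize_to_bits(x, bits):
    """Uniformly quantize x to 2**bits levels over its observed range."""
    levels = 2 ** bits
    lo, hi = x.min(), x.max()
    step = (hi - lo) / (levels - 1)
    return np.round((x - lo) / step) * step + lo

q4 = quantize_to_bits(permittivity, 4)   # 16 distinct input levels
q8 = quantize_to_bits(permittivity, 8)   # 256 distinct input levels
print("4-bit mean abs error:", np.mean(np.abs(permittivity - q4)))
print("8-bit mean abs error:", np.mean(np.abs(permittivity - q8)))

Concentration-dependent shifts in permittivity can easily be smaller than a single 4-bit step, which would be consistent with the degradation the authors report after CNN2SNN conversion.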




 
  • Like
  • Fire
  • Love
Reactions: 21 users

Gazzafish

Regular
So yesterday we had super low shares traded. Approx 3m for the entire day. Today we have 7.6m shares traded in the first 30 minutes … hmmmm 🤔
 
  • Like
  • Haha
  • Fire
Reactions: 10 users

HopalongPetrovski

I'm Spartacus!
Looks like our kapellmeisters are looking to give us a little run this morning? 🤣
 
  • Haha
  • Fire
  • Like
Reactions: 7 users

Cardpro

Regular
Long post ahead - if you’re not interested in what I found out about Steve Harbour’s invention The Living Processor™, whose concept fundamentally differs from fully digital neuromorphic chips such as Loihi and Akida, simply scroll on to the Steve Harbour quote marked in green - I am pretty sure you will enjoy reading that part… 😊

Steve Harbour - formerly with SwRI (Southwest Research Institute) in Beavercreek, OH (a suburb of Dayton) and since June 2024 Director of AI Hardware Research at Parallax Advanced Research, also in Beavercreek, OH - has been working with Loihi for years, but has also been utilising Akida in his neuromorphic research, at least since joining Parallax. He has repeatedly spoken favourably about both Loihi and Akida and revealed in an article published on 29 April 2025 ("Parallax Advanced Research and the Future of Neuromorphic Artificial Intelligence in Electronic Warfare") 👆🏻 that Parallax were partnered with both Intel and BrainChip, something that to date is still not reflected on the BrainChip "Partners" webpage.

A paper on Martian flight [= flight of a mini helicopter/drone in the extremely thin atmosphere of Mars] published in July 2025* that he co-authored with his son David Harbour (first author) and other researchers from Parallax Advanced Research, University of Dayton, University of Cincinnati and Sinclair College mentions that the neuromorphic system they developed - dubbed VelocitySNN-Fuzzy AI - “supports autonomous flight with interpretable output and deployment on neuromorphic hardware such as Loihi and Akida”. Funding for this research came from NASA.
*“Event-Driven Spiking Neural Network and Fuzzy Logic AI System for Velocity Determination in Martian Flight”



View attachment 94127

View attachment 94128


However, over the past few months it has become increasingly clear that fully digital neuromorphic processors like Loihi and Akida are not the holy grail in Steve Harbour’s eyes.

In October, I shared a podcast titled “Thinking Like the Brain: Neuromorphic AI and the Future of Defense Computing”, in which he was interviewed by Arun Seraphin, the Executive Director of the NDIA Emerging Technologies Institute (ETI) and a former Professional Staff Member of the United States Senate Committee on Armed Services:



At around the 23:30 min mark, Arun Seraphin asks "How mature is the technology, you know, on a TRL scale? Can I go to Costco and buy a laptop with a neuromorphic processor on it, yet, or Best Buy? Or is it something that I see at someone's lab bench, and it's all sort of all wired up right now, and they're promising me in five years?"
TRL= Technology Readiness Level

Steve Harbour replies that “it’s in a stage that I think we are beyond just a lab curiosity” and then briefly talks about both Intel and BrainChip, but also shares his belief that the future of neuromorphic chips will ultimately belong to analog neuromorphic processors, adding that he is currently working on a novel processor that will be largely analog:

From 24:56 min
“... BrainChip Akida - probably has, if not the first commercialised neuromorphic processor, it's among the first commercialised. And you can buy that today. And then you get support with it.

Again, it’s a digital-based neuromorphic processor with a neuromorphic mesh, if you will. Full-blown neuromorphics will involve something that's very analog-like, okay, like the mind. And we’re dev, we're working on that. In fact I’m heavily involved in a specific program right now doing that.”


At 31:33 min, he reiterates this assessment:
"And at the end of the day we're gonna see, after the long haul, that the analog neuromorphic-like processor is going to win out."


From 42:33 min, Steve Harbour reveals a little bit more about the Living Processor project he alluded to minutes earlier:

"I think besides the military on the edge, in space on the edge, you know, I think we're gonna see it with wearable devices in the medical community, whether they are on or inside the human. I think also, I think the data centers are going to be a huge driver. One of the, so, erm, I've been an innovator for something that Parallax called the Living Processor, and I'm working on that with some universities on doing research. And it's a processor that could be used for the edge or for the data center.

And so, I think that that's gonna be - I don't know which is gonna, you know, we know that in the military, it's gonna be the edge device, it's just gonna happen. In space, it's gonna be the device of choice. I believe that it's gonna be, I don't know which is gonna happen first, either the, you know, the smartphone with a neuromorphic processor or if it's gonna be the data center. You know, there's gonna be, again, some challenges with the data center, you know, it's a little different way of computing, with LLMs, but, you know, I know we can breakthrough that one, too. So I think it's gonna be, who's gonna be first in the civilian market? Is it gonna be cell phone or data center? But all that's gonna find its way in the personal computer, and you know, and become more the norm."



At 48:49 min, the podcast host asks

"Is there work going on between this Neuromorphic Computing AI community and those people who really do study legitimately the brain - to develop more optimal systems?"

SH: "... NIH is a pivotal piece of this process. You know, for example, I work with Dr. Rishma [he means Rashmi] Jha at UC [= University of Cincinnati] and folks at Penn State. We currently have an NSF Grant for a DNA compute layer - I said DNA compute layer - to add to work that we're currently doing with this Living Processor."





Steve Harbour also mentions being the “Inventor of the Living Processor™” in his LinkedIn profile:


“Innovation in National Security AI: Invented the Living Microprocessor™ — a hybrid neuromorphic architecture enabling self-healing, ultra-low power, and event-driven computing directly relevant to DOE/LANL’s scalable secure AI mission.”

LANL = Los Alamos National Laboratory (https://www.lanl.gov)





On 26 September 2025, Parallax Advanced Research published a blog post called "Toward a Living Microprocessor™: Dr. Steve Harbour’s Vision for Bio-Integrated Intelligence"

https://parallaxresearch.org/news/b...e-harbours-vision-bio-integrated-intelligence

“The Living Microprocessor™ integrates material innovations, both organic and inorganic, into a single architecture. Together, these materials create a processor that functions like a cognitive organism with near-zero latency and ultra-low power consumption. Unlike digital neuromorphic platforms, Harbour’s design introduces organic intelligence into the hardware itself. In practical terms, this means a processor that not only maintains function under noise, radiation, or physical damage but actually improves in adverse conditions.

The implications extend far beyond laboratory performance. Harbour envisions these processors as the backbone of next-generation human-machine teaming. This innovation could also apply to RF classification and improve data center scalability. Because the Living Processor is robust to radiation and electromagnetic interference, it is also ideally suited for space and electronic warfare environments. The Living Processor is a collaborative effort among Parallax, the University of Dayton, the University of Cincinnati, Rochester Institute of Technology, Pennsylvania State University, and Brisk Computing.”


Brisk Computing LLC (https://briskcomputing.com), which has won four Phase I SBIR and two Phase II SBIR awards (5x NASA, 1x DoD) since 2020 that were all related to neuromorphic computing, is registered at the residential address of Tarek M. Taha, one of the University of Dayton neuromorphic researchers who has been collaborating with Steve Harbour for years:

https://www.sbir.gov/portfolio/1676181

https://www.ohioresidentdatabase.com/person/OH0020935927/taha-tarek




The same day the Parallax blog post came out, Steve Harbour gave a presentation at an event called “Cybersecurity in the era of AI”, organised by Ohio State University’s Institute for Cybersecurity & Digital Trust. A video of this presentation titled “Brains over Bots: How Bio-Inspired Computing Transforms Cyber Defense” was uploaded to YouTube on 2 October 2025.




View attachment 94129


Unfortunately, the audio quality is not very good, and there were quite a few parts that were unintelligible to me. I marked them as [x] in my quotes below. Not 100% sure I got the rest right, but as always I’d advise anyone to listen to interviews/podcasts themselves anyway rather than simply rely on what other forum members post here and elsewhere. Also, it’s about noticing intonation, emphasis, sentences being rephrased - subtle hints that may not be picked up when reading a transcript, even a verbatim one.


Steve Harbour refers to his work in progress on the Living Processor in one of the presentation slides (from 10:48 min) as well as in the Q&A section. He clearly differentiates it from Loihi and Akida: the Living Processor will have a hybrid architecture, partly digital but mostly analog, and unlike Loihi or Akida it is not a CMOS chip but a mix of organic and inorganic materials, although Steve Harbour clarifies they are not using "fleshy" brain tissue as the organic material.

While he refers to both Loihi and Akida as really good neuromorphic chips for edge applications, he evidently doesn’t consider them suitable for use in big data centers - something that he claims differentiates them from his own invention, which he predicts will ultimately play a dual role, both at the edge and in the data centre. [I suppose Intel Labs and BrainChip would disagree with him here, as they see their own offerings as eventually scalable.]



From 19:27 min:
Question from the audience: “Are you guys manufacturing this at scale, yet, or is it still mostly [doesn’t finish the sentence]?”

SH: “Yeah, not at scale, yet.
So, erm, I've done a lot of work with Mike Davies and Intel, San Jose in [and?] neuromorphic processors, Loihi 1, Loihi 2, and they're real good at edge - they’re research chips, you can't go buy these - they're real good at edge computing. They're not [x, possibly scalable?]. That's a problem.

Erm, BrainChip. Ever heard of BrainChip? Check them out. BrainChip, they have an AKD1000 that you can buy. They just put it in commercial availability to buy it. And they are around 500 [US]$ and you can start working with it, playing with it, okay. It's not a full neuromorphic chip, it's still digital. It has a neuromorphic mesh, so it behaves kinda [x], but it uses a lot less energy and it has very good accuracy when trained, you know when [x] final result [x] deploy the algorithm, but again, it's edge, [x, possibly fit or for?] the edge, which is really good.

So, the Living Processor that I put up there, that one will be, can be dual role - at the edge, but then also in the data center. The memory capability - can't say what it is, but it is, it will blow your mind, it wholly will blow your mind. And it’s, it’s with the memory, computing memory and also storage within the layer, yeah, you can replace [x] racks of GPUs in the data center.


So, how fast is that gonna happen? Is it gonna happen in a year? You know, it's gonna be a little bit of time, but we have to get there as fast as we can, because we're getting more and more data centers. Everyone’s now doing the shopping list with ChatGPT apparently. [x] “It’s a great tool. I think it’s [x, sounds like ‘dual on steroids’?]”.



Next up is the question “Are the big GPU providers that are out there already leaning in this direction? Have you seen evidence of that?”, to which Steve Harbour replies: “So, I have not seen evidence of that [goes on to briefly talk about NVIDIA]… I have no indication they are, but, you know, my sixth sense is they’re probably going ‘You know what? We don’t wanna be the dodo or the dinosaur.’ Right? Some companies have done that and then they get bad, right? So, but I’m not seeing if they are [x].

IBM has. IBM has a couple of, has NorthPole, which is, again, it’s a digitally-based neuromorphic mesh chip, in research, so, it’s, it’s, you know, starting to see that.

Raytheon has - I'm not a business guy, but I've been told that Raytheon’s bought some stock, [x] stock in BrainChip.

So, you know, it's starting to happen, momentum [x], everyone is seeing the writing on the wall, [x]."


Q: “So are these Loihi chips from Intel, are they, like, truly analog, and also if they aren't, erm...”

SH: “Yeah, they are analog, unfortunately. So, [corrects himself] they're digital.
The Living Processor is mainly analog, some digital. So for an absolute pure neuromorphic, it's analog, right? And - I know, I know: [copies other people who will say in disbelief] 'What? Is this not like going backwards? ' [...]”


At the end of the Q&A, somebody in the audience asks about the computational medium they are using and specifically whether they have considered using mushrooms (no kidding, research into fungal computing has been around for a while and mycelium as a computing substrate is said to be promising; cf this recent research into organic memristors made of shiitake and button mushrooms conducted at Ohio State, which happened to host the event: https://news.osu.edu/powered-by-mushrooms-living-computers-are-on-the-rise/):

SH replies: "Mycelium?

So, erm, I can't tell you about the composition, I'm sorry [...] and DARPA is involved. So. But it’s a great question, and, it’s a very, it’s a good question, by the way. One I’d ask.

Mycelium? So, the answer is, I’m doing research in mycelium […] Under our feet, the trees are talking, okay? So, everybody’s got trees, we have stuff underneath our feet, right? There’s communication going on, guys. It’s going on, okay? And it’s going on in a spiking fashion, like our brains. And there may be some language going on, too. So yes, the answer is ‘yes’ to your question. I was just [?] with DARPA very much on this specific thing. So, that I can share.”



So at 22:50 min, Steve Harbour reveals a piece of information that is new to us BRN shareholders, when he claims: “I've been told that Raytheon bought some stock, [x] stock in BrainChip”, namely that Raytheon/RTX have apparently put their money where their mouth is.

That said, they are yet to sign an actual IP license.

While this stock purchase info could be just a rumour, as long as we do not have official confirmation from BrainChip, Raytheon/RTX or via publicised stock market documents, I personally believe Steve Harbour would likely have heard about it from somebody in the know, not merely on the grapevine.

So provided this investment info is accurate, it appears that Raytheon not only believes in BrainChip's technology, but also in our present or future management's ability to translate this technology into meaningful revenue at some point.

Thank you Frangipani!!!
 
  • Like
  • Love
Reactions: 6 users

7für7

Top 20
Wohooo 18.5 just “wiped out” like nothing 😂
Jan Böhmermann Anführungszeichen GIF by ZDF Magazin Royale


But let’s be honest with ourselves… at the end of the day we will have 18.0 again…
 
Last edited:
  • Like
  • Haha
Reactions: 8 users

7für7

Top 20

I’m sure some genuine shareholders sold some of their shares to avoid a speeding ticket! Thank you! You guys are thinking outside the box! 👍
 
  • Haha
Reactions: 1 users

TheFunkMachine

seeds have the potential to become trees.
ChatGPT deep dive into potential/likely customer engagements.

(Many of you will be familiar with this but it is an interesting summary)


Below is an evidence-based “dot-joined” map of the most credible real-world product integrations and near-term commercialization paths for BrainChip’s Akida (AKD1000 / AKD1500 / Akida 2 / Akida IP)—focused on who is building with it, where it fits in their product lines, and what that implies for likely downstream customers.



I am only using public, attributable material (company releases, partner product pages, program documents). Where I infer “likely next customers,” I label it clearly as inference—not fact.







Highest-signal, real-world product engagements (already in a product, or in formal product evaluation)



1) Frontgrade Gaisler: GR801 / GRAIN space-grade SoC line with Akida integrated (IP)

• What’s real: Frontgrade Gaisler publicly states it is developing the GR801 and that it “includes the Akida brain-inspired neural network technology”.

• Why it matters: Frontgrade Gaisler is not a “demo partner”—they sell space-grade processors/SoCs into flight programs. A GR801-class part becoming a catalog item effectively creates a distribution channel into spacecraft primes and their supply chain.

• Near-future customer set (inference): If GR801 enters broader adoption, the natural buyers are space integrators and primes already buying Gaisler/Frontgrade processor IP and rad-tolerant SoCs (e.g., payload autonomy, onboard ISR/EO processing, comms autonomy). This is “quiet” because it shows up first as processor selection on mission BOMs, not in flashy PR.



2) ESA-linked space deployment pathway is explicit (Frontgrade license + ESA evaluation)

• BrainChip’s quarterly report explicitly ties the Frontgrade license to ESA: Akida 1.0 IP incorporation into space-grade SoCs and “paves the way… to be deployed in space by the European Space Agency (ESA)” and references joint evaluation work with ESA + Frontgrade + BrainChip.

• This is one of the strongest “non-obvious” dots because ESA/space work typically appears as program deliverables before it appears as “customer wins.”



3) Bascom Hunter: AKD1000 is already inside a rugged defense VPX card (SNAP Card)

• What’s real (product page): Bascom Hunter’s 3U OpenVPX SNAP Card states it combines an RFSoC FPGA with five BrainChip AKD1000 processors.

• Why it matters: VPX/SOSA cards are a standard procurement object in defense. This is not “maybe”—it’s a shipping defense electronics form factor with Akida inside.



4) Bascom Hunter: formal AKD1500 “full scale evaluation of commercial products” contract

• BrainChip reports a US$100k contract with Bascom Hunter for AKD1500 chips for full scale evaluation of commercial products, and notes Bascom Hunter integrates third-party tech for Defense and Intelligence applications.

• Dot-join insight: This is a classic path: AKD1000 in current card → AKD1500 eval → next card spin / module upgrade (inference). If Bascom’s VPX line moves to AKD1500, you often see it first as an updated module datasheet, not a market announcement.



5) Parsons: explicit strategic agreement for defense systems + “designed into end solutions”

• BrainChip has a strategic agreement with Parsons to accelerate edge AI defense systems (as referenced in BrainChip materials and related posts).

• Separately, BrainChip states the AKD1500 “has been delivered and designed into several end solutions… including Parsons, Bascom Hunter and Onsor Technologies.”

• Why it matters: Parsons is a prime contractor / systems integrator; “designed into” strongly implies the engagement is beyond evaluation.



6) Onsor Technologies: AKD1500 in a commercial medical device (seizure-detecting smart glasses)

• BrainChip positions Onsor’s seizure-detection glasses as a customer success using AKD1500.

• Why it matters: This is the cleanest example of AKD1500 as a component in a real product in healthcare—useful as a reference design for other wearables/medtech OEMs (inference).



7) Arquimea (Spain): drone + Prophesee event camera + Akida for water-safety detection

• BrainChip states Arquimea demonstrated Akida with a Prophesee Metavision event-based camera on a low-power drone to detect distressed swimmers/surfers.

• Dot-join insight: This is a practical proof that event cameras + neuromorphic compute can run at the edge on a UAV. The “quiet” next step is usually maritime safety agencies / coastal surveillance integrators (inference), especially where endurance/latency matters.



8) Intellisense Systems: Akida selected for cognitive radio; NASA use case named

• BrainChip states Intellisense selected Akida for SWaP-constrained platforms (including spacecraft/robotics) and explicitly references a NECR device where Intellisense says it provides NASA applications, planned for a Phase II prototype integrating Akida.

• Why it matters: “Phase II prototype” language is a meaningful maturity marker in US government-adjacent development cycles.







Strong “pipeline” signals (credible pathways to new customers, but not yet confirmed end-customer design wins)



9) Blue Ridge Envisioneering (BRE): tactical edge collaboration backed by a Naval Air Warfare Center contract

• BrainChip’s BRE announcement includes a Naval Air Warfare Center contract number and states BRE will integrate Akida processors into tactical devices.

• Dot-join insight (inference): BRE/NAWC-backed work typically translates into program-specific prototypes that can later be pulled into broader platforms if the performance/power story holds.



10) Lorser Industries: neuromorphic SDR devices (system-level manufacturing/integration)

• BrainChip and Lorser state they will use Akida to deliver neuromorphic solutions for software-defined radio devices, emphasizing SDR tasks like signal classification, modulation/demodulation, encryption/decryption, and anomaly detection.

• Dot-join insight (inference): If Lorser turns this into a sellable SDR module/line item, the downstream customer universe becomes defense comms, aerospace comms, navigation, and SIGINT-adjacent integrators.



11) HaiLa: AKD1500 + ultra-low-power Wi-Fi backscatter for “leading OEMs” in multiple domains

• BrainChip and HaiLa state they’re working with leading OEMs and ecosystem partners across medical, environmental, and infrastructure monitoring, pairing AKD1500 with HaiLa’s BSC2000 to enable microwatt-class connected sensors.

• Dot-join insight (inference): The real commercialization vector here is module makers and sensor OEMs who can bundle connectivity + edge inference into a single reference design.



12) Andes Technology (RISC-V): AKD1500 integrated on QiLai Voyager platform

• BrainChip describes integrating Akida with Andes’ QiLai platform and highlights this as a path for Akida to be used in production SoCs.

• Andes describes its footprint as “over 16 billion Andes-powered SoCs.”

• Dot-join insight: This is less about one customer and more about distribution leverage: if Akida becomes a “known-good” companion IP in the Andes ecosystem, it can surface inside many unnamed SoC design houses that already license Andes cores (inference).



13) Akida 2: “no hardware” Akida Cloud and commercialization emphasis

• BrainChip is explicitly pushing Akida 2 access via “Akida Cloud… live demos on Akida 2 with no hardware requirements.”

• Why it matters: This is often how “quiet” design-ins start: developers validate performance without hardware lead times, then move into evaluation silicon/modules (inference).







RTX / Raytheon angle: what’s supportable from public evidence

• BrainChip’s December 2024 quarterly report describes an SBIR project focused on micro-Doppler radar signature analysis, and states that BrainChip was negotiating with a major defence industry contractor, as a subcontractor, to complete the contract award.

• Some secondary commentary explicitly names Raytheon/RTX, but that is not consistently corroborated by primary documentation; the most defensible statement is that a major defense contractor is involved per BrainChip’s own report.







Practical “dot-joining” watchlist (where the next customer names typically appear first)



If you want to surface non-obvious, near-term customer engagements early, the highest-yield places to monitor are:

• Defense module catalogs (VPX/SOSA cards, rugged edge servers): look for refreshed datasheets that swap AKD1000 → AKD1500 (Bascom Hunter is the clearest example path).

• Space-grade processor roadmaps and ESA program deliverables tied to Frontgrade Gaisler GR801 / GRAIN.

• Cognitive radio / space comms prototypes (Intellisense NECR Phase II prototype).

• Edge drone autonomy demos moving into pilots (Arquimea + Prophesee + Akida).

• RISC-V platform partner announcements where Akida becomes a standard add-on in an ecosystem (Andes).

• Ultra-low-power sensor OEM announcements in medical/infrastructure monitoring that mention always-on anomaly detection + coin-cell longevity (HaiLa + AKD1500 pairing is directly positioned at this).







What I did not do (by design)

• I did not claim any unnamed company is a “customer” unless there is a public statement supporting it.

• I did not treat generic partnership announcements as “design wins” unless there is evidence of integration, evaluation for commercial products, a Phase II prototype, or a product page.



 
  • Like
  • Fire
  • Love
Reactions: 24 users
May have been posted previously, not searched, but good to see the authors of the below paper are all part of, or working with, Parallax Advanced Research, Uni Dayton, Uni Cincinnati and Sinclair College.

They state their Martian drone flight development supports deployment on Akida (& Loihi).



A VelocitySNN-Fuzzy AI Architecture: Neuromorphic Event-Driven Spiking Neural Network and Fuzzy Logic AI System for 3D Velocity Determination in Martian Flight


Abstract:

We introduce VelocitySNN-Fuzzy AI, a novel neuromorphic system for real-time determination of three-dimensional velocity vectors (Vx, Vy, Vz) during Martian drone flight. Unlike Earth-Based UAVs, Mars operations cannot rely on GPS or conventional visual odometry due to the lack of infrastructure and the extreme lighting and terrain conditions. Our approach integrates a bio-inspired event-based camera, a Spiking Neural Network (SNN) developed using SpikingJelly, and an explainable Fuzzy AI Logic layer to produce reliable and interpretable velocity determination in challenging Mars-Like environments. Validated using publicly available Mars-Style drone datasets, the architecture demonstrates low-latency, high-dynamic-range perception suitable for small, power-constrained UAV platforms. VelocitySNN-Fuzzy AI determines 3D velocity for Martian drones using event-based vision, spiking neural networks, and fuzzy logic. Designed for GPS-denied, low-light environments, it enables real-time, low-power navigation. Validated on FPV drone datasets, it supports autonomous flight with interpretable outputs and deployment on neuromorphic hardware like Loihi and Akida
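
For anyone curious what an event-frame SNN of this kind looks like in SpikingJelly (the library named in the abstract), below is a minimal sketch of my own - it is not the authors’ VelocitySNN; the layer sizes, the rate-averaged readout and the name TinyVelocitySNN are assumptions, and the fuzzy-logic layer is omitted entirely:

# Minimal sketch only, assuming pre-binned event frames of shape (T, B, 2, H, W);
# not the authors' architecture.
import torch
import torch.nn as nn
from spikingjelly.activation_based import neuron, functional, layer

class TinyVelocitySNN(nn.Module):  # hypothetical name, not from the paper
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            layer.Conv2d(2, 16, kernel_size=3, stride=2, padding=1),  # 2 event polarities in
            neuron.LIFNode(),
            layer.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
            neuron.LIFNode(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, 3),                      # regression head: Vx, Vy, Vz
        )

    def forward(self, event_frames):               # event_frames: (T, B, 2, H, W)
        functional.reset_net(self)                 # clear membrane state between samples
        per_step = [self.net(frame) for frame in event_frames]
        return torch.stack(per_step).mean(0)       # average readout over the T time bins

x = torch.rand(8, 1, 2, 64, 64)                    # 8 time bins, batch of 1, 64x64 sensor
print(TinyVelocitySNN()(x).shape)                  # torch.Size([1, 3])

Actual deployment on Loihi or Akida would additionally require mapping and quantising such a network with the respective vendor toolchains, which this sketch does not attempt.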
 
  • Like
  • Fire
  • Love
Reactions: 19 users

7für7

Top 20
  • Like
  • Love
Reactions: 4 users

7für7

Top 20
  • Like
  • Love
Reactions: 5 users

MegaportX

Regular
So yesterday we had super low shares traded. Approx 3m for the entire day. Today we have 7.6m shares traded in the first 30 minutes … hmmmm 🤔
Reckon it could have been the Fried rice gang. They’re impressed.
 
  • Like
Reactions: 1 users

MegaportX

Regular
Chop Suey Man GIF
 
  • Haha
  • Thinking
Reactions: 2 users

7für7

Top 20
  • Thinking
Reactions: 1 users

walderamaa

Emerged
Are you racist?
"Shouldn’t you have disappeared a long time ago? How can someone post nonsense here every single day? Do you need attention? It’s annoying!!!"
 
  • Like
  • Fire
  • Haha
Reactions: 6 users

IloveLamp

Top 20
Last edited:
  • Like
  • Fire
  • Haha
Reactions: 14 users

TECH

Regular
Hi All,

Some nice posts coming through, but once again, it's still too early, yes, I know my comment pisses many off, BUT it's the truth!

I know that our company has a lot of work to do throughout 2026. Yes, a comment has been mentioned to me that 2027 will "be the year", with a definite change appearing during the second half of 2026, so hang tight. We have the technology, we have an ever-growing ecosystem, we have an "ever-expanding customer register", we have a number of NDA agreements, we have other companies’ attention - even Dr. Steve Harbour "liked" our company’s photos on LinkedIn yesterday. Is analog the go, is "digital isn't really neuromorphic" true? Well, I intend asking Peter for a few comments on what was posted by our great poster, Frangipani (lots of respect for her research and her time), but in the end, I personally trust Peter and his digital creation that we are all hanging our hat on to deliver a worldwide advancement.

Time, we all seem to be controlled by it, but the more you learn about the universe, the ever-expanding universe, the size of which humans can never, ever understand, it seems that our lifetimes of basically under 100 human years are so minuscule in the overall "plan" of things, so why do we hang so much weight on "time"?.......Maybe we live over and over again through a consciousness that never actually "dies".

Sorry, I have lost myself in thought, but still believe we are all backing a great, future leading company, God bless all.

Cheers......Tech.
 
Last edited:
  • Like
  • Love
  • Haha
Reactions: 21 users

itsol4605

Regular
 
  • Like
Reactions: 3 users

Guzzi62

Regular
"This is not a prediction, but a computational physics certainty"

View attachment 94139 View attachment 94140
What a space cadet this Jerry dude is, LOL.

I have no idea what he is on about, but it must be good shit he has been smoking.
 
  • Like
  • Thinking
Reactions: 2 users

Diogenese

Top 20
Not so much a word salad as a buzzword jungle.

The message I took away was in Slide 4, where we see AGI disappearing up its own fundament.

I do like the idea of harvesting entropy - it's sort of like a big sleep-in for everyone.

The important thing is that he got the right answer.
 
  • Like
  • Haha
Reactions: 5 users