BRN Discussion Ongoing

7für7

Top 20
Ten year horizon, yeah me too. Difference is I've already been in 8 years and heard management make many statements (J-curve sales, explosion of sales, watch the financials, etc.) that have not resulted in any demonstrable commercial progress…
Oh, you mean you have only 2 years until you sell? No, I mean I've already been in as long as you, and will wait a minimum of 10 more years. 👍
 
  • Fire
  • Like
Reactions: 2 users

TopCat

Regular
Starting tomorrow, BRN is presenting at Embedded World with NVISO and VVDN:


Embeddedworld - BrainChip


Gain firsthand insights into the future of Space Technology and BrainChip’s pivotal role in shaping it.




As BrainChip’s product engineering and manufacturing services partner, VVDN is thrilled about the prospects of 2024. The year promises unprecedented growth in intelligent products, many of which will leverage the power of BrainChip’s Akida Edge AI Box. From niche offerings to broader technological advancements, our collaboration with BrainChip ensures that cutting-edge solutions are at the forefront. Together, we anticipate significant expansion in our customer base, driven by the seamless integration of Akida’s capabilities into various fields, paving the way for a new era of innovation.

Kalpeshkumar Chauhan

VP, Technology Evangelist At VVDN Technologies


View attachment 60471
I guess our NVISO presentation will be a reprise of CES 2024:

https://brainchip.com/brainchip-and...nabled-human-behavioral-analysis-at-ces-2024/

BrainChip and NVISO Group Demonstrate AI-Enabled Human Behavioral Analysis at CES 2024​


Laguna Hills, Calif. – January 5, 2024 BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, event-based, neuromorphic AI IP, and NVISO Group Ltd, a global leader in human behavior AI software, will showcase a joint system that enables more advanced, more capable and more accurate AI on consumer products at the Consumer Electronics Show (CES) 2024 in Las Vegas, Nevada.



...and our friends at GlobalFoundries have not forgotten us:


View attachment 60470


So it is happening, it's just that it feels like Achilles and the tortoise ...



Akida - Gateway to the Cyberverse


“Learn how our strategic partnerships are expediting deployments for Intelligent Retail and Security solutions.”

Not sure I have ever seen intelligent retail mentioned before.
 
  • Like
  • Fire
Reactions: 14 users

Diogenese

Top 20
TopCat said:
Starting tomorrow, BRN is presenting at Embedded World with NVISO and VVDN … [full post quoted above]


Just a reminder of NVISO/Akida performance from mid-2022:

https://nviso.ai/news-and-media/pre...lestone-with-brainchip-akida-neuromorphic-ip/

NVISO announces it has reached a key interoperability milestone with BrainChip Akida Neuromorphic IP​


NVISO has successfully completed full interoperability of four of its AI Apps from its Human Behavioural AI catalogue to the BrainChip Akida neuromorphic processor achieving a blazingly fast average model throughput of more than 1000 FPS and average model storage of less than 140 KB.


Lausanne, Switzerland – July 19th, 2022 – nViso SA (NVISO), the leading Human Behavioural Analytics AI company, is pleased to announce it has released an Evaluation Kit (EVK) for its Human Behavioural AI SDK running on the BrainChip Akida™ neuromorphic processing platform. NVISO will commercially release its Human Behaviour AI as both an Evaluation Kit (EVK) and a Software Development Kit (SDK) optimized for neuromorphic processors, targeting innovators looking to adopt AI-driven human-machine interfaces to detect human behaviour in real time at the edge. Both companies will jointly promote the EVK and SDK, and a first evaluation with a semiconductor manufacturer in Japan has started. This deployment of NVISO’s Human Behavioural AI exploits the superior performance capabilities of BrainChip Akida neuromorphic processor IP, with latencies under 10 ms and model storage requirements under 1 MB for a complete target solution. Targeting the next generation of low-power SoC devices with embedded neuromorphic capabilities, target applications include Robotics, Automotive, Telecommunication, Infotainment, and Gaming.​
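To put those numbers in perspective, here is a quick back-of-the-envelope sketch. Throughput and latency aren't strictly interchangeable (pipelining can hide latency), but taking the quoted averages at face value:

```python
# Rough sanity check of the figures quoted above; purely illustrative.
avg_fps = 1000        # "more than 1000 FPS" average model throughput (quoted)
avg_model_kb = 140    # "less than 140 KB" average model storage (quoted)

per_frame_ms = 1000 / avg_fps  # implied time budget per inference, in ms
print(f"Implied per-frame budget: {per_frame_ms:.1f} ms")          # -> 1.0 ms
print(f"Headroom vs the 10 ms latency figure: {10 - per_frame_ms:.1f} ms")
print(f"Headroom vs the 1 MB storage figure: {1024 - avg_model_kb} KB")
```

In other words, the quoted averages sit comfortably inside the "under 10 ms / under 1 MB" envelope claimed for the complete target solution.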


Could the Japanese semiconductor manufacturer be Socionext?

NVISO does eye tracking:
https://nviso.ai/eye-tracking-technology-for-human-machine-interfaces/

This has several uses including driver monitoring systems.

... and remember NVIDIA are more like partners than competitors ...



NVISO presents Automotive Interior Monitoring System (IMS) AI Solution with Neuromorphic Computing with BrainChip Akida​

2 years ago

NVISO presents a complete Interior Monitoring Software (IMS) solution running on an NVIDIA NPU and BrainChip Akida neuromorphic processor.

Driver Monitoring System (DMS) will be a key feature for OEMs to meet Euro NCAP 2025. It can detect distracted and drowsy drivers by accurately measuring eye and head position, attention and fatigue.

The Interior Monitoring System (IMS), meanwhile, uses machine learning to enable in-vehicle systems to sense their occupants’ emotional states and gestures, providing personalized experiences in the transition to automated driving.
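As an aside on how "measuring eye and head position, attention and fatigue" can become a drowsiness signal: a common proxy in the DMS literature is PERCLOS, the percentage of eye closure over a rolling window. A minimal sketch, with illustrative thresholds that are my assumptions, not NVISO's published method:

```python
from collections import deque

# Minimal PERCLOS-style drowsiness proxy: the fraction of recent frames in
# which the eyes are judged closed. All thresholds are illustrative only.
EYE_CLOSED_BELOW = 0.2   # eye-openness score below this counts as "closed" (assumed)
PERCLOS_ALARM = 0.15     # >15% closure over the window flags drowsiness (assumed)
WINDOW_FRAMES = 30 * 60  # one-minute window at 30 FPS

closed_flags = deque(maxlen=WINDOW_FRAMES)

def update(eye_openness: float) -> bool:
    """Feed one per-frame eye-openness score (0 = fully closed, 1 = fully open);
    returns True once the rolling PERCLOS measure crosses the alarm level."""
    closed_flags.append(eye_openness < EYE_CLOSED_BELOW)
    perclos = sum(closed_flags) / len(closed_flags)
    return perclos > PERCLOS_ALARM
```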

Now, is this meant to refer to a combination of Akida and Nvidia, or to a comparison? Because around the same time, NVISO published these benchmark graphs (Tales from the NDA - so many questions, so few answers):

(benchmark graphs attached)


https://nviso.ai/news-and-media/pre...lestone-with-brainchip-akida-neuromorphic-ip/

“NVISO’s first neuromorphic optimised EVK was achieved in record time and exceeded all expectations”, said Tim Llewellynn, CEO of NVISO. “BrainChip has delivered an excellent development environment for AI software specialists like NVISO, and the maturity of their tools really shows why they are the first commercial neuromorphic processor IP to market. Deployment of the combined technologies of NVISO AI Apps together with embedded neuromorphic processing can provide game-changing performance improvements for our most demanding customers and a wide range of use cases. We are really excited that the neuromorphic revolution is now upon us and could finally deliver on the promise of wide-scale deployment of human-centric technologies, from consumer products through to medical devices and automotive ADAS systems, that will have a profound impact on our lives.”

“We are delighted with the progress of our partner NVISO reinforcing the performance and efficiency gains of BrainChip’s Akida neuromorphic processor design with compelling results demonstrated with NVISO’s behavioural AI software”, said Jerome Nadel, CMO at BrainChip. “Our uniquely differentiated AI acceleration provides equivalent performance benefits to any edge AI software; high inference performance, low latency, and ultra-low power consumption.”
 
  • Like
  • Fire
  • Love
Reactions: 66 users

toasty

Regular
Oh, you mean you have only 2 years until you sell? No, I mean I've already been in as long as you, and will wait a minimum of 10 more years. 👍
You're obviously younger than me… :D
 
  • Like
  • Love
Reactions: 3 users

7für7

Top 20
You're obviously younger than me… :D
Doesn’t matter how old. I will not sell before we reach a price which is acceptable! If I don't reach that goal, the shares will go to my family, and so on. I'm not in this to sell for pennies!
 
  • Like
  • Fire
  • Love
Reactions: 11 users

IloveLamp

Top 20
It’s that time and my super will soon be available - please, please, please, no announcements!

 
Last edited:
  • Haha
  • Like
Reactions: 9 users

Esq.111

Fascinatingly Intuitive.
Good Evening Chippers,

Hard to believe, but one of our AKIDA chips is experiencing such views.



Do us proud, little feller.

* With or without sound, the view is humbling.

Regards,
Esq.
 
  • Like
  • Fire
  • Love
Reactions: 27 users

Makeme 2020

Regular
Good Evening Chippers,

Hard to believe, but one of our AKIDA chips is experiencing such views.



Do us proud, little feller.

* With or without sound, the view is humbling.

Regards,
Esq.

BRN's Akida seems to be in space, so are we just giving the technology away?
Where's the money, honey?
No REVENUE and No IP License.
Can't wait for 2030.
 
  • Like
  • Love
Reactions: 5 users

Esq.111

Fascinatingly Intuitive.
BRN's Akida seems to be in space, so are we just giving the technology away?
Where's the money, honey?
No REVENUE and No IP License.
Can't wait for 2030.
Evening Makeme 2020,

110% agree - the only ones profiting so far are a handful of directors and some well-paid employees doing the actual work.

Regards,
Esq.
 
  • Like
  • Fire
  • Love
Reactions: 12 users
BRN's Akida seems to be in space, so are we just giving the technology away?
Where's the money, honey?
No REVENUE and No IP License.
Can't wait for 2030.
They haven't even turned it on yet to see if it works. If it is a partnership, I am thinking we get a percentage of services rendered. A way to go yet. From memory, it will be turned on about 4 weeks after the payload is put into orbit, then testing, then hopefully they have some jobs lined up. IMO.

SC
 
  • Like
  • Love
Reactions: 5 users

Frangipani

Regular
A new Brains & Machines podcast episode is out - the latest guest was Amirreza Yousefzadeh, who co-developed the digital neuromorphic processor SENECA at imec The Netherlands in Eindhoven before joining the University of Twente as an assistant professor in February 2024.




Here is the link to the podcast and its transcript:



BrainChip is mentioned a couple of times.

For all of 2017, towards the end of his PhD, Yousefzadeh collaborated with our company as an external consultant. His PhD supervisor in Sevilla was Bernabé Linares-Barranco, who co-founded both GrAI Matter Labs and Prophesee. Yousefzadeh was one of the co-inventors (alongside Simon Thorpe) of the two JAST patents that were licensed to BrainChip (Bernabé Linares-Barranco was a co-inventor of one of them, too).

https://thestockexchange.com.au/threads/all-roads-lead-to-jast.1098/




Here are some interesting excerpts from the podcast transcript, some of them mentioning BrainChip:

(transcript excerpts attached as screenshots)



(Please note that the original interview took place some time ago, when Amirreza Yousefzadeh was still at imec full-time, i.e. before Feb 2024 - podcast host Sunny Bains mentions this at the beginning of the discussion with Giulia D’Angelo and Ralph Etienne-Cummings, and again towards the end of the podcast, when there is another short interview update with him.)


(screenshot attached)



Sounds like we can finally expect a future podcast episode featuring someone from BrainChip? 👍🏻
(As for the comment that “they also work on car tasks with Mercedes, I think”, I wouldn’t take that as confirmation of any present collaboration, though…)


(screenshot attached)



Interesting comment on prospective customers not enamoured of the term “neuromorphic”… 👆🏻



(further screenshots attached)
 
  • Like
  • Love
  • Fire
Reactions: 40 users

Mt09

Regular
Hopefully ESA/Frontgrade will share/sell their hardware (that's due to tape out later this year) for class 4/5 missions; it should fit the bill for ANT61's requirements.

Plus it looks like we'll be included on some future hardware for class 1 (safety-critical) missions, according to Laurent Hili.

47 minutes in:



Well worth a read on defining safety-critical systems.

 
  • Like
  • Love
Reactions: 16 users

Makeme 2020

Regular
They haven't even turned it on yet to see if it works. If it is a partnership, I am thinking we get a percentage of services rendered. A way to go yet. From memory, it will be turned on about 4 weeks after the payload is put into orbit, then testing, then hopefully they have some jobs lined up. IMO.

SC
So BRN MANAGEMENT is saying to customers: if it works you pay, but if it doesn't, you don't have to pay?
 
  • Like
Reactions: 1 users

Frangipani

Regular
They haven't even turned it on yet to see if it works. If it is a partnership, I am thinking we get a percentage of services rendered. A way to go yet. From memory, it will be turned on about 4 weeks after the payload is put into orbit, then testing, then hopefully they have some jobs lined up. IMO.

SC



(screenshot attached)



Teaching us the virtue of patience is definitely something space and the stock market have in common… 😉
 
  • Like
  • Fire
  • Love
Reactions: 18 users
Doesn’t matter how old. I will not sell before we reach a price which is acceptable! If I don't reach that goal, the shares will go to my family, and so on. I'm not in this to sell for pennies!
That's the spirit 7! Cheers! 👍

 
  • Haha
  • Like
Reactions: 8 users

Frangipani

Regular
Frangipani said:
A new Brains & Machines podcast episode is out … [full post quoted above]

And as for patience required regarding product timelines:

(screenshots attached)
 
  • Like
  • Fire
Reactions: 16 users
Interesting group and workshop.

Haven't checked if posted already. Apols if it has.

We're being utilised in the workshop as well, as are Prophesee, iniVation and some others.

Equipment is apparently being provided by Neurobus & WSU as organisers....they obviously have access to Akida...wonder what else they've been doing with it outside of the workshop :unsure:



SPA24: Neuromorphic systems for space applications​



Topic leaders​

  • Gregory Cohen | WSU
  • Gregor Lenz | Neurobus

Co-organizers​

  • Alexandre Marcireau | WSU
  • Giulia D’Angelo | fortiss
  • Jens Egholm | KTH

Invited Speakers​

  • Prof Matthew McHarg - US Air Force Academy
  • Dr Paul Kirkland - University of Strathclyde
  • Dr Andrew Wabnitz - DSTG

Goals​

The use of neuromorphic engineering for applications in space is one of the most promising avenues to successful commercialisation of this technology. Our topic area focuses on the use of neuromorphic technologies to acquire and process space data captured from the ground and from space, and the exploration and exploitation of neuromorphic algorithms for space situational awareness and navigation. The project combines the academic expertise of Western Sydney University in Australia (Misha Mahowald Prize 2019) and industry knowledge of Neurobus, a European company specialising in neuromorphic space applications. Neuromorphic computing is a particularly good fit for this domain due to its novel sensor capabilities, low energy consumption, its potential for online adaptation, and algorithmic resilience. Successful demonstrations and projects will substantially boost the visibility of the neuromorphic community as the domain is connected to prestigious projects around satellites, off-earth rovers, and space stations. Our goal is to expose the participants to a range of challenging real-world applications and provide them with the tools and knowledge to apply their techniques where neuromorphic solutions can shine.

Projects​

  • Algorithms for processing space-based data: The organisers will make a real-world space dataset available that was recorded from the ISS, specifically for the purpose of this workshop. In addition, data can be recorded with WSU's remote-controlled observatory network. There are exciting discoveries to be made using this unexplored data, especially when combined with neuromorphic algorithmic approaches.
  • Processing space-based data using neuromorphic computing hardware: Using real-world data, from both space and terrestrial sensors, we will explore algorithms for star tracking, stabilisation, feature detection, and motion compensation on neuromorphic platforms such as Loihi, SpiNNaker, BrainChip, and Dynap. Given that the organisers are involved in multiple future space missions, the outcomes of this project may have a significant impact on a future space mission (and could even be flown to space!)
  • Orbital collision simulator: The growing number of satellites in the Earth’s orbit (around 25,000 large objects) makes it increasingly likely that objects will collide, despite our growing Space Situational Awareness capabilities to monitor artificial satellites. The fast and chaotic cloud of flying debris created during such collisions can threaten more satellites and is very hard to track with conventional sensors. By smashing Lego “cubesats” in front of a neuromorphic camera, we can emulate satellite collisions and assess the ability of the sensor to track the pieces and predict where they will land. We will bring Lego kits to the workshop. Participants will need to design a “collider”, build cubesats with Lego, record data, measure the position of fallen pieces, and write algorithms to process the data.
  • High-speed closed-loop tracking with neuromorphic sensors: Our motorised telescope can move at up to 50° per second and can be controlled by a simple API. The low latency of event cameras makes it possible to dynamically control the motors using visual feedback to keep a moving object (bird, drone, plane, satellite...) in the centre of the field of view (see the sketch after this list). The algorithm can be tested remotely with the Astrosite observatory (located in South Australia) or with the telescope that we will bring to the workshop.
  • Navigation and landing: Prophesee’s GenX320 can be attached to a portable ST board and powered with a small battery. To simulate a landing of a probe on an extraterrestrial body, we attach the camera to an off-the-shelf drone for the exploration of ventral landing, optical flow and feature tracking scenarios, as well as predicting the distance to the ground to avoid dangerous landmarks.
  • High-speed object avoidance: The goal is to work on an ultra-low-latency vision pipeline to avoid incoming objects in real-time, simulating threats in the form of orbital debris. This will involve a robotic element added to the orbital collision simulator.
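For anyone wondering what the closed-loop part of the high-speed tracking project might look like in practice, here is a minimal sketch of a proportional controller. The API names and gain are invented for the sketch; the post only says the telescope has "a simple API" and a 50°/s limit, so this is not the organisers' actual code:

```python
# Toy proportional controller for event-based closed-loop tracking.
# control_step() and the gain KP are hypothetical; only the 50 deg/s
# slew-rate limit comes from the workshop description.
MAX_RATE_DEG_S = 50.0  # stated maximum slew rate of the motorised telescope
KP = 4.0               # proportional gain in (deg/s) per degree of error (assumed)

def clamp(value: float, limit: float) -> float:
    return max(-limit, min(limit, value))

def control_step(offset_x_deg: float, offset_y_deg: float) -> tuple[float, float]:
    """Map the target's angular offset from the image centre (e.g. from an
    event-camera blob tracker) to clamped motor rate commands."""
    rate_x = clamp(KP * offset_x_deg, MAX_RATE_DEG_S)
    rate_y = clamp(KP * offset_y_deg, MAX_RATE_DEG_S)
    return rate_x, rate_y  # pass these to the mount's rate-command API

# Example: target drifts 2° right and 0.5° up -> slew at (8.0, 2.0) deg/s.
print(control_step(2.0, 0.5))
```

The appeal of event cameras here is that the offset estimate can be refreshed at kilohertz rates, so even a simple proportional loop like this can keep fast movers centred.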


Materials, Equipment, and Tutorials:

We're going to make available several pieces of equipment, including telescopes to record the night sky, different event cameras from Prophesee and iniVation, a Phantom high-frame-rate camera for comparison, and neuromorphic hardware such as BrainChip's Akida and SynSense Speck. ICNS will also provide access to their Astrosite network of remote telescopes, as well as their new DeepSouth cluster.

We will run hands-on sessions on neuromorphic sensing and processing in space, building on successful tutorials from space conferences, providing code and examples for projects, and training with neuromorphic hardware. Experts in space imaging, lightning, and high-speed phenomena detection will give talks, focusing on neuromorphic hardware's potential to address current shortcomings. The workshop will feature unique data from the International Space Station, provided by WSU and USAFA, marking its first public release, allowing participants to develop new algorithms for space applications and explore neuromorphic hardware's effectiveness in processing this data for future space missions. Additionally, various data collection systems will be available, including telescope observation equipment, long-range lenses, tripods, a Phantom high-speed camera, and WSU's Astrosite system for space observations. Neurobus will make neuromorphic hardware available on site. This setup facilitates experiments, specific data collection for applications, experimenting with closed-loop neuromorphic systems, and online participation in space observation topics due to the time difference.
 
  • Like
  • Fire
  • Love
Reactions: 45 users

Frangipani

Regular
Interesting video buddy.
They mention in the video.
BRAINSHIP AND AKITA.
Love your work.

You forgot Brainshift! 🤣

(screenshot attached)



That Japanese dog Akita

Speaking of Akita: While the image of that puppy (or is it an adult dog?) featured on BrainChip’s website is most likely supposed to be a picture of an Akita Inu for obvious reasons…

(screenshot of the BrainChip website attached)


… I reckon they may have mixed up the dog breeds - looks more like a Shiba Inu (right) to me - the smaller cousin of the Akita Inu (left).

(photo: Akita Inu left, Shiba Inu right)


Then again, I hardly know anything about canines, let alone Japanese breeds, unlike the curvaceous host of Animal Watch, “YouTube’s most popular Wolf and Alpha Dog Breed Channel” (I wonder how many of her 1.18 million followers are genuinely interested in the four-legged animals featured… 🤣):





HI (human Intelligence) created this video😂🙈

This is why the world needs AI 😂

On the other hand, the world still needs human intelligence to fact-check and proofread the AI’s creations:

(screenshot of the LinkedIn post attached)



I first came across the above LinkedIn post in mid-March, and incredibly this blunder has still not been corrected by the GenAI-enamoured authors, one month to the day after they uploaded their case report on March 8! 😱
How utterly embarrassing for BOTH the authors AND those entrusted with the scientific peer review…

Go check it out for yourselves, it is still there for the world to see:


(screenshots attached)

Is that so?! 🤭
 
  • Haha
  • Wow
  • Like
Reactions: 8 users

Frangipani

Regular
Interesting group and workshop. … [full SPA24 workshop post quoted above]


Apology accepted… 😉

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-415663
 
  • Haha
  • Like
Reactions: 4 users