BRN Discussion Ongoing

A new Brains & Machines podcast episode is out - the latest guest was Amirreza Yousefzadeh, who co-developed the digital neuromorphic processor SENECA at imec The Netherlands in Eindhoven before joining the University of Twente as an assistant professor in February 2024.




Here is the link to the podcast and its transcript:



BrainChip is mentioned a couple of times.

Throughout 2017, towards the end of his PhD, Yousefzadeh collaborated with our company as an external consultant. His PhD supervisor in Sevilla was Bernabé Linares-Barranco, who co-founded both GrAI Matter Labs and Prophesee. Yousefzadeh was one of the co-inventors (alongside Simon Thorpe) of the two JAST patents that were licensed to BrainChip (Bernabé Linares-Barranco was a co-inventor of one of them, too).

https://thestockexchange.com.au/threads/all-roads-lead-to-jast.1098/

View attachment 60489


Here are some interesting excerpts from the podcast transcript, some of them mentioning BrainChip:

View attachment 60487
View attachment 60490

View attachment 60491


(Please note that the original interview took place some time ago, when Amirreza Yousefzadeh was still at imec full-time, i.e. before February 2024 - podcast host Sunny Bains mentions this at the beginning of the discussion with Giulia D’Angelo and Ralph Etienne-Cummings, and again towards the end of the podcast, when there is another short interview update with him.)


View attachment 60492


Sounds like we can finally expect a future podcast episode featuring someone from BrainChip? 👍🏻
(As for the comment that “they also work on car tasks with Mercedes, I think”, I wouldn’t take that as confirmation of any present collaboration, though…)


View attachment 60493


Interesting comment on prospective customers not enamoured of the term “neuromorphic”… 👆🏻



View attachment 60494
View attachment 60495


Really a great episode. I listened to it yesterday but still haven’t had enough time to copy my favourite quotes from the transcript.

One aspect that has become more pronounced over the last few episodes is that the three hosts (Sunny Bains, Giulia D'Angelo & Ralph Etienne-Cummings) seem to enjoy how the interviews are shifting from a more scientific, historical view of (mostly strictly analog) neuromorphic computing to a more hands-on, commercial look at the current state of neuromorphic hardware.

In November '23 I emailed Sunny Bains asking whether she had plans for, or might consider doing, an episode about the fields and contexts in which neuromorphic or event-based computing currently has a specific advantage, and also about what this technology might make possible that wasn't possible before.

In her response she couldn't give further details about what potential future episodes of "Brains & Machines" might cover, but she mentioned that she was currently writing a book about "neuromorphic engineering, but unfortunately that will not come out until the end of next year or even 2025".
 

Frangipani

Regular
One aspect that has become more pronounced over the last few episodes is that the three hosts (Sunny Bains, Giulia D'Angelo & Ralph Etienne-Cummings) seem to enjoy how the interviews are shifting from a more scientific, historical view of (mostly strictly analog) neuromorphic computing to a more hands-on, commercial look at the current state of neuromorphic hardware.

It’s a good thing that Giulia D’Angelo will soon get to witness AKD1000 in action (if she hasn’t done so already), as she has been added as one of the co-organisers for the Telluride 2024 Topic Area Neuromorphic Systems for Space Applications:


B1421045-F61C-4BAC-B418-B77078BC8226.jpeg


She recently moved from Genoa to Munich to work as a Senior Researcher at fortiss.

953BD308-503B-464B-8FF8-A0D66A0BBC4A.jpeg




The fortiss Neuromorphic Lab was one of more than 100 partner institutions in the EU-funded Human Brain Project (> SpiNNaker) that ran from 2013 to 2023. It has also been doing lots of projects based on Loihi in recent years (as well as collaborating with IBM on at least one project). While I have yet to come across any proof of fortiss researchers utilising Akida, I noticed they have at least been aware of BrainChip’s tech since 2020, as evidenced by their 2020 Annual Report:



A01DE0E3-3F0C-4819-93AC-81FAA0B52516.jpeg
 

Tothemoon24

Top 20

IMG_8759.jpeg

Revolutionizing Space Infrastructure: ANT61 Smart Robots​


April 8, 2024 – Mikhail Asavkin
Space infrastructure plays a key role in maintaining civilization on Earth. From moving goods around the globe and monitoring the effects of global warming, such as forest fires and floods, to everyday transactions, such as buying a donut at your favorite cafe, our everyday lives depend on satellites in orbit.
At ANT61, we want to make sure these satellites are still there when you need them. As humanity evolves, the importance of space infrastructure will only grow, and it’s about time we started using the same approach to maintaining it. On Earth, if something breaks down in a factory or a power plant, we don’t build a replacement. Instead, we send a repair team with spare parts; better yet, we monitor the components’ wear and replace them before they break down.
At ANT61, we create technology that enables us to use the same common-sense approach for space infrastructure.
Historically, there have been two obstacles to applying this approach in space. First, when something breaks down in orbit, it’s extremely difficult to send a crew out to understand what went wrong, and dead satellites can’t call home to explain what happened. Second, it’s far too expensive to send humans to repair something in space. The notable exceptions to this rule were the several multi-billion-dollar Space Shuttle missions to refurbish the Hubble Space Telescope.
What we do
ANT61 is providing solutions for both: our Beacon product allows satellite operators to understand what went wrong with their satellites and restore them to operation. For larger and more expensive satellites, we are building robots that will dock, refuel and, in the future, refurbish satellites, prolonging their useful life. At the core of these robots lies the ANT61 Brain™, an innovative device that combines machine vision and decision-making technology, enabling the autonomy of these maintenance robots. Autonomy is very important because, due to speed-of-light limitations, it won’t be possible to control every movement of these robots remotely from Earth.
The first generation of the ANT61 Brain uses the BrainChip Akida™ chip for power-efficient AI and is currently on board the Optimus-1 satellite, which was recently deployed by the SpaceX Transporter-10 mission. We will test the ANT61 Brain later this year and perform training and evaluation of various neural networks that we will use for future in-orbit servicing technology demonstrations.
We chose to partner with BrainChip because we believe that neuromorphic technology will bring the same exponential improvement in AI as the shift from CPU to GPU did 20 years ago. That shift opened the door to deep neural network applications, which are at the core of all AI technologies today. Neuromorphic technology is also perfect for space: the lower power consumption means less heat dissipation, and we can get up to five times more computational power for the same electrical power budget.
Our vision for the future
Humanity’s expansion to the cosmos requires infrastructure that can only be built by robots. With the in-orbit servicing experience, ANT61 will become the main supplier of the robotic workforce for Moon and Mars installations, enabling companies from Earth to project their ambition to space, providing valuable services and resources for future off-world factories and cities.
We believe that in 20 years, 10 billion people on Earth will be able to look up and see the city lights on the night side of the lunar crescent. The space industry will transform from a place for the elite few to one open to everyone.
If you are coming out of college or pondering a career change, now is a great time to join a space company.
 


Tothemoon24

Top 20
This is a great listen; a little snip of the discussion is below.

Click on link for full transcript

ML on the Edge with Zach Shelby Ep. 7 — EE Times' Sally Ward-Foxton on the Evolution of AI

ARTIFICIAL INTELLIGENCE
By Mike Senese · Apr 8, 2024

In this episode of Machine Learning on the Edge, host Zach Shelby is joined by EE Times senior reporter Sally Ward-Foxton, an electrical engineer with a rich background in technology journalism. Live from GTC 2024, they cover the ways that NVIDIA is impacting the AI space with its GPU innovation, and explore the latest technologies that are shaping the future of AI at the edge and beyond.

Watch or listen to the interview here or on Spotify. A full transcript can be read below.







Sally Ward-Foxton
Yeah, so let's say we have an accelerator from BrainChip or Syntiant or somebody, where it's a separate chip and it's alongside your CPU or your microcontroller. I think eventually a lot of that will go down the same route that you're talking about for cryptography: we'll go onto the SoC. It will be a block on the SoC, because that makes the most sense for embedded use cases, for power efficiency and integration and so on. We will get there eventually. Not today, but eventually.

Zach Shelby
Very interesting. We have a good example of what's happened in the industry now with this little box - this is just an AI-powered camera reference design that we helped build, with a bunch of different partner and customer cases. One of those is nature conservation. It turns out that a lot of nature conservation cases - endangered species, poaching, human-wildlife conflict, like elephants in Africa - really can make use of computer vision, with AI, in the field. Deep in the forest, in the jungle, it needs to be left there for months at a time, and it's very hard to go collect data. This has no acceleration. This is an STM32H7, so a high-end Cortex-M7 with a lot of external memory, so we can put models that range from 10 to 20 megabytes in size, even, into this. And with software techniques - quantization, compression, re-architecting some of the ways that we do object detection - we can do fairly real-time object detection on this. And because it's a microcontroller architecture, we can do that with a year of battery life, with a certain number of PIR events per day where we capture images. And that's with no acceleration. So it's really interesting: as we get acceleration into these types of existing SoCs - say the next generation of ST microcontrollers has an accelerator - where's that going to get us? What's the kind of optimization we should be thinking about from a device manufacturer's point of view? Like: all right, I've got a camera, we don't have acceleration, we're going to do a little bit of AI. Now we're going to want to add acceleration for the next generation.

Sally Ward-Foxton
Yeah, I mean, if you're looking at reducing power, doing more efficient ML, you definitely need acceleration today. I guess it's a balance of whether your application can handle the extra cost that you're going to face.
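
For anyone wondering what the "software techniques" Shelby mentions (quantization, compression) typically look like in practice, here is a minimal sketch of post-training int8 quantization with TensorFlow Lite, a common route for squeezing a model onto an MCU like the STM32H7. The model path, input size and calibration data below are hypothetical placeholders, not details from the actual reference design:

```python
import tensorflow as tf

# Minimal sketch: post-training int8 quantization with TensorFlow Lite.
# "detector_savedmodel" and the 96x96x3 input shape are assumptions.
converter = tf.lite.TFLiteConverter.from_saved_model("detector_savedmodel")
converter.optimizations = [tf.lite.Optimize.DEFAULT]

def representative_data():
    # Calibration samples let the converter estimate activation ranges;
    # in practice you would yield ~100 real camera frames, not random data.
    for _ in range(100):
        yield [tf.random.uniform((1, 96, 96, 3), minval=0.0, maxval=1.0)]

converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8   # match the camera's raw bytes
converter.inference_output_type = tf.uint8

# The resulting flatbuffer is roughly 4x smaller than the float32 model
# and runs on integer-only kernels, which is what makes MCU targets viable.
with open("detector_int8.tflite", "wb") as f:
    f.write(converter.convert())
```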
 

Frangipani

Regular

Contextual Computing Requires an AI-First Approach​

OPINION​

By Vikram Gupta 04.08.2024

The infusion of AI into internet of things (IoT) edge devices has the potential to realize the vision of intelligent, contextually aware computing that intuitively acts on our behalf based on seemingly a priori knowledge. As we move through this world, our engagement with technology is frictionless, fluid, productive and trusted. Unprompted, home automation systems know users’ habits and needs; factories know when maintenance is needed; emergency services deliver medical care promptly; agricultural lands have optimal yields; and ecologies are sustained—a pathway to a better world.

This is what’s possible when IoT islands are knit together intelligently, securely, reliably and cost effectively. We’re not there yet, but with an edge computing market set to grow rapidly in the next two years, the industry is accelerating in that direction.

AI-infused IoT

The promise of the IoT as a transformative force remains intact, but it has been slowed by several challenges: fragmented hardware and software ecosystems, user privacy concerns, a cloud-centric data processing model with slow response times, and unreliable connectivity. The infusion of AI at the edge addresses two of these issues by allowing decisions to be made quickly in situ, with no need to upload user data. This tackles the latency and privacy issues while making better use of available bandwidth and lowering power consumption by reducing the number of transmissions.

Given this, solutions that skillfully handle edge IoT data while ensuring the seamless integration of AI for enhanced contextual awareness and improved IoT features will surely gain popularity. This is why so many companies have been seeking to incorporate the value of AI as they deploy smarter IoT products across consumer, industrial and automotive markets.

Interest in doing so has only spiked with the rapid emergence of large language models (LLMs). While AI and ML have been evolving quickly within the context of the IoT, an exciting overlap between LLMs and edge AI is the emergence of small language models (SLMs) that will aid in the hyper-personalization of the end user and drive more compute to the edge.

Meanwhile, practical applications of AI-infused edge IoT are already gaining traction in areas like feature enhancement for home video devices. Edge AI can also optimize crowd flow, ensure security, and enhance user experiences in retail, public transportation, and entertainment venues through people counting and behavioral analysis.

AI paralysis

While the opportunities are compelling, companies’ ability to capitalize on them varies. Some product companies have data but don’t have AI models or even know where to begin developing them. Others are more sophisticated and have models but can’t deploy them effectively across unproven hardware and ever-shifting, incompatible tool suites. Others remain paralyzed in the face of a technological revolution that will up-end their business if they don’t act quickly.

While it makes devices more useful, edge AI adds complexity and further frustrates developers, customers and users. They all recognize the potential of intelligent, connected nodes to enhance the user experience but lack the know-how, tools, and infrastructure to capitalize upon this relatively new and exciting technology. The spike in edge AI interest has resulted in sporadic ad hoc solutions hitched to legacy hardware and software tools with development environments that don’t efficiently capitalize upon AI’s potential to address customer demand for AI enablement for applications they’ve yet to clarify.

This situation is untenable for developers and end users, and the issue comes into stark relief against a backdrop of AI compute being increasingly pushed from the data center to the edge in applications like healthcare and finance, where security and response time are paramount. Clearly, more needs to be done by the industry to improve the customer journey to enable intelligent edge products.

Close the AI gap: Concept to deployment

The edge AI train has already left the station, but will different nodes have different shades of intelligence? Logic would dictate that everything should be intelligent, yet the degree of intelligence depends on the application. Some intelligence might be externally visible as a product feature; other intelligence may not be.

Regardless, if everything is going to be intelligent, it would follow that AI shouldn’t be a bolt-on “feature,” but inherent in every IoT product. Getting customers from ideation to real-world deployment requires shifting from the currently fragmented ecosystem to a cohesive, AI-first approach to IoT edge device design. This will require several elements: scalable AI-native hardware, unified software, more adaptive frameworks, a partnership-based ecosystem and fully optimized connectivity. This is the only way developers can deploy AI at the edge at the requisite speed, power, performance, reliability, security and cost point required to take part in a future that is coming…fast.

Many necessary elements are already available thanks to work done over the years on applying AI to vision, audio, voice and time series data. However, processor scalability and multi-modal capability need more attention to enable cost-effective, contextually aware sensing across increasingly diverse applications. While current microcontrollers and microprocessors are each highly capable in their own right, a gap still exists for the right IoT processors with the right mix of power, performance and processing flexibility to ensure the right compute at each edge AI node.

These suitable IoT processors, combined with compatible wireless connectivity and supported by a cohesive infrastructure, software, development tools and a truly “AI first” approach to ease the customer journey, will unlock a number of intelligent IoT products to help improve our world.

—Vikram Gupta is SVP and GM of IoT processors and Chief Product Officer at Synaptics.
 

RobjHunt

Regular
Yep, got close to an ounce in nuggets over the weekend. A BRN nugget to end this week would be great. Just gotta clean all the ironstone off the specimens.
Great bloody darts mate!
 

Fredsnugget

Regular
Great bloody darts mate!
Yep, that little lot will get me another 8800 BRN shares. I always knew gold would pay off as a great investment :)
 


IloveLamp

Top 20
1000014867.jpg
 

Frangipani

Regular
This newly published doctoral thesis, titled Avancées en vision neuromorphique: représentation événementielle, réseaux de neurones impulsionnels supervisés et pré-entraînement auto-supervisé (Advancements in neuromorphic vision: event representation, supervised spiking neural networks, and self-supervised pretraining), mentions Akida, albeit briefly.

It is yet another piece of evidence that BrainChip is slowly but surely gaining the recognition it deserves in academic circles, where in the past it had all too often been ignored as a serious contender among neuromorphic processors.



4571CAF6-182B-4241-9D78-232355AA56E1.jpeg

7B376E57-084F-40B9-8338-154A70A692C3.jpeg

36D02015-06AE-4FA2-9AE5-1260B1CBD010.jpeg


The author of this doctoral thesis, Sami Barchid, has meanwhile landed his first job as an AI Engineer with mintt, a company dealing with solutions around fall detection and prevention. This, along with other forms of remote monitoring, is an interesting use case for neuromorphic event cameras and 3D sensors.

While this particular system is cloud-based and not on-device, it is nevertheless an interesting website to browse:


“At Mintt - fall detection, analysis & prevention, we are committed to improve the well-being of our seniors and offering them greater independence. We truly believe everyone has the right to spend their golden years in a safe and sound environment.”

DCD4BCD4-40C7-4BE1-B9CA-D2E7BB82D38B.jpeg
 


Terroni2105

Founding Member
1712629136758.jpeg
 

cllrabbit

Emerged


Don't think it has been posted: AMD's CEO mentions edge AI.
 

7für7

Top 20
Everyone is waiting for the BOOooOOM
 


toasty

Regular
All of a sudden the buy/sell ratio blows out to double........
 

Esq.111

Fascinatingly Intuitive.
Afternoon Harry,

Certainly a start. Also, something to bear in mind..... today is International Unicorn Day..... so all things being equal our share price should rise to $0.5525 per unit, which in turn will put us back in the unicorn company space.

Market cap over $1,000,000,000.00 = Unicorn.
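(Back-of-the-envelope: unicorn price = market cap ÷ shares on issue, and $1,000,000,000.00 ÷ $0.5525 implies roughly 1.81 billion shares on issue.)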

Regards,
Esq.
 

7für7

Top 20
Afternoon Harry,

Certainly a start. Also, something to bear in mind..... today is International Unicorn Day..... so all things being equal our share price should rise to $0.5525 per unit, which in turn will put us back in the unicorn company space.

Market cap over $1,000,000,000.00 = Unicorn.

Regards,
Esq.
I ran a check on your calculations! Somehow my result is 0.55251 🤔 explanation?
 

MegaportX

Regular
The market appears to be picking up steam, looking for a push to the upside this week. Let's see...
 