BRN Discussion Ongoing

New Software Engineer comes from Continental Engineering


[Attached images: IMG_6517.jpeg, IMG_6518.jpeg, IMG_6519.jpeg]
 
  • Like
  • Fire
  • Love
Reactions: 41 users

cosors

👀
probably already known, but here it is again:

"EDGX, BrainChip unite to create data processing units for space​

Oct 10, 2023
EDGX, a Belgian SpaceTech company, and BrainChip, a neuromorphic computing device provider, have partnered to develop data processing units for challenging environments.
The pair signed a non-binding Memorandum of Understanding at the European Space Agency’s (ESA) EDHPC Conference in Juan-les-Pins, France.

Both companies note that space infrastructure is increasingly crucial in our daily lives as it supports essential aspects such as navigation, weather forecasting and secure communications.
Nick Destrycker, the founder and CEO of EDGX, envisions a future where interconnected satellites, fueled by advanced AI, provide real-time data connectivity for humans and machines.
“This is where BrainChip’s Akida technology comes into play, the event-based accelerator for AI at the edge, which coincides perfectly with our vision in developing brain-inspired data processing units for space,” explains Destrycker.
The space industry is seeing a rise in satellite launches as it works toward self-sustainability. Recognizing this demand, EDGX is currently developing a data processing unit that combines AI acceleration with neuromorphic computing, enhancing the power efficiency and autonomous learning capabilities of next-generation satellites.
Laurent Hili, an engineer with expertise in microelectronics and data handling at ESA, emphasizes Akida’s unique capability to operate in two distinct modes. The first mode is more conventional, ensuring compatibility with established neural networks. The second mode is event-based, which reduces power consumption while maintaining performance.
“Power is at a premium in the space environment,” adds Hili. “Event-based processing may easily open a door into a new realm of processing in space to do things which were not possible before.”
According to Peter van der Made, the founder and CTO at BrainChip, the Akida roadmap was designed to provide high-performance, ultra-low power processing without constant connection to the cloud, making it suitable for constrained environments like space.
“BrainChip’s focus on combining neuromorphic principles with the benefits of the deep learning ecosystem, customization and continuous learning matches EDGX’s requirements for disruptive technology. We are excited to partner with them to truly push the boundaries of what is possible,” says van der Made.
Last month, BrainChip entered into a strategic partnership with VVDN Technologies to collaborate on the development of the Edge Box. Designed for processing data and performing computations at the network edge, this edge device harnesses the power of neuromorphic technology, drawing inspiration from the human brain. The result is a hardware platform that not only enhances power efficiency but also delivers outstanding performance for a wide range of edge applications, the company says."
https://www.edgeir.com/edgx-brainchip-unite-to-create-data-processing-units-for-space-20231010
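
The two operating modes Hili describes come down to activation sparsity: in event-based mode, zero activations ("non-events") are simply never processed. The sketch below is purely illustrative and assumes nothing about BrainChip's actual hardware or API; it is just a plain NumPy comparison of dense versus event-driven accumulation, to show where the operation savings come from.

```python
# Toy comparison of dense vs. event-driven (activation-sparse) processing.
# Illustrative only -- NOT BrainChip's Akida implementation or API.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 1024, 256
weights = rng.standard_normal((n_in, n_out)).astype(np.float32)

# A typical post-ReLU activation vector is mostly zeros (here ~90% sparse).
acts = np.maximum(rng.standard_normal(n_in).astype(np.float32) - 1.3, 0.0)

# "Conventional" mode: dense matrix-vector product, n_in * n_out MACs.
dense_out = acts @ weights
dense_macs = n_in * n_out

# "Event-driven" mode: only nonzero activations (events) contribute,
# so we accumulate one weight row per event.
events = np.flatnonzero(acts)
event_out = np.zeros(n_out, dtype=np.float32)
for i in events:
    event_out += acts[i] * weights[i]
event_macs = len(events) * n_out

assert np.allclose(dense_out, event_out, atol=1e-3)
print(f"events: {len(events)}/{n_in} activations")
print(f"MACs: dense={dense_macs}, event-driven={event_macs} "
      f"({event_macs / dense_macs:.1%} of dense)")
```

With roughly 90% of the activations at zero, the event-driven path performs about a tenth of the multiply-accumulate operations while producing the same output vector.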

and


"BrainChip teams up with Belgium's EDGX to place its Akida neuromorphic chip in space
Published on 9 October 2023
Australian-born US company BrainChip, developer of the Akida neuromorphic edge processor, has signed a technology agreement with Belgian start-up EDGX, founded in 2023 to build disruptive intelligent space systems, to develop data processing units for highly demanding environments. The Leuven-based start-up says it is developing a new generation of space computers that combine traditional artificial intelligence (AI) acceleration with neuromorphic processing to reduce the power dissipation of on-board AI, increase payload adaptability and pave the way for autonomous on-board learning within next-generation satellites.

BrainChip, for its part, has developed with Akida a fully digital, event-driven AI processor that uses neuromorphic principles mimicking the human brain and analyses only the essential data detected at the point of acquisition. The idea is to process data efficiently and accurately while limiting power consumption. In addition, the Akida chip, which can also be implemented as an IP block in a system-on-chip (SoC) or ASIC, enables learning to take place locally, independently of the cloud, which, BrainChip says, guarantees reduced latency and improves data confidentiality and security.

"We built the Akida technology roadmap with the objective of providing autonomous high-performance processing with very low power consumption that does not require a permanent connection to the cloud, which is essential for constrained environments such as satellites," explains Peter van der Made, founder and CTO of BrainChip.

EDGX and BrainChip formalised their collaboration by signing a non-binding memorandum of understanding at the European Space Agency (ESA) EDHPC conference held from 2 to 6 October 2023 in Juan-les-Pins (France).

"What fascinates us most about Akida technology is its ability to operate in two different modes," says Laurent Hili, a microelectronics and data processing engineer at ESA. On the one hand, the Akida processor can execute conventional convolutional neural networks (CNNs), making it compatible with many tried and tested CNNs. But on the other hand, in event-driven mode, it is able to reduce power demand by several orders of magnitude. Bearing in mind that consumption is a key concept in the space environment, this type of operation could open the door to a new area of processing in space to achieve things that weren't possible before." 🚀

It should be noted that, in early October, BrainChip announced the early availability of the second-generation Akida platform, which targets applications as diverse as the connected home, smart cities, industry and automotive, and is designed for highly energy-efficient processing of complex neural network models on edge devices.

According to BrainChip, support for 8-bit weights and activations, as well as long-range skip connections, extends the range of models that can be fully hardware-accelerated on the Akida platform. The second generation of the platform also incorporates TENN (Temporal Event-based Neural Networks) spatiotemporal convolutions, a new class of lightweight, resource-efficient neural networks particularly well suited to applications that require continuous processing of raw data containing significant temporal information. Examples include video analysis, target tracking, audio classification, MRI and CT scan analysis for vital-sign prediction, and time-series analysis for predictive maintenance.

This technology, combined with hardware acceleration of Vision Transformer (ViT) models, paves the way for edge devices capable of running advanced vision and video applications while consuming, according to the company, only a few milliwatts at the sensor level (and a few microwatts for audio and similar applications).

"Processing generative AI and large LLM language models at the edge is essential for intelligent situational awareness in verticals ranging from manufacturing to healthcare to defence," says Jean-Luc Chatelain**, managing director of Verax Capital Advisors. Disruptive innovations such as BrainChip's TENN technology supporting Vision Tranformers based on neuromorphic principles can provide compelling solutions in ultra-low power, small form factor devices without compromising accuracy."

https://www.lembarque.com/article/b...cer-sa-puce-neuromorphique-akida-dans-lespace

**Episode 23 – BrainChip Talks AI Innovation with Accenture’s Managing Director and Global Chief Technology Officer of Accenture Applied Intelligence Jean-Luc Chatelain
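
A side note on the TENN paragraph above: the general idea of continuously processing raw data with significant temporal information can be illustrated with a streaming causal convolution. The snippet below is only a generic sketch and is not BrainChip's TENN architecture or code; it shows a 1-D temporal convolution evaluated sample by sample with a constant-size state buffer, so the incoming stream never has to be re-buffered in full.

```python
# Streaming causal 1-D convolution: generic sketch of processing a signal
# sample by sample with constant memory. NOT BrainChip's TENN implementation.
import numpy as np

rng = np.random.default_rng(0)

kernel_len = 16
kernel = rng.standard_normal(kernel_len).astype(np.float32)
flipped = kernel[::-1]                      # newest sample aligns with kernel[0]

signal = rng.standard_normal(1000).astype(np.float32)   # e.g. an audio stream

# Offline reference: full causal convolution over the buffered signal.
padded = np.concatenate([np.zeros(kernel_len - 1, dtype=np.float32), signal])
reference = np.array([padded[t:t + kernel_len] @ flipped
                      for t in range(len(signal))])

# Streaming version: keep only the last `kernel_len` samples as state.
state = np.zeros(kernel_len, dtype=np.float32)
streamed = np.empty_like(signal)
for t, x in enumerate(signal):
    state = np.roll(state, -1)
    state[-1] = x                  # newest sample at the end of the buffer
    streamed[t] = state @ flipped  # same dot product, constant memory

assert np.allclose(reference, streamed, atol=1e-4)
print("streaming output matches offline convolution")
```

Stacking such temporal convolutions with spatial ones is the broad family the article is gesturing at; the actual TENN formulation is BrainChip's own and is not reproduced here.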
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 37 users

cosors

👀

ASX Winners And Losers Of Today – 10-10-23​

1696924091693.png
 
  • Like
  • Love
  • Fire
Reactions: 25 users

cosors

👀
  • Like
  • Fire
  • Love
Reactions: 10 users

skutza

Regular
Likely you're all aware anyway, but 1+1 = 2

1696924519829.png
 

Attachments

  • 1696924449136.png (497.2 KB)
  • Like
  • Fire
  • Love
Reactions: 27 users

TheFunkMachine

seeds have the potential to become trees.
Has this been posted yet? Go to the 51-minute mark; they thank BrainChip for the board, and there is a lot more regarding BrainChip. I haven't had time to go through the whole video yet.
View attachment 46724


View attachment 46725

I don’t want to claim anything, but I sent Tony an email months ago telling him about this tiny ML Hackathon challenge. I thought it sounded too close to home for Brainchip not to present anything. He said he would pass it on.
 
  • Like
  • Fire
  • Love
Reactions: 32 users

wilzy123

Founding Member
Bring on the new Nintendo Switch

So, apart from your comment being the equivalent of the Nigerian prince email I just sent to spam, why should everyone currently invested in BRN be waiting with bated breath for the new Nintendo Switch?
 
  • Like
  • Haha
  • Sad
Reactions: 7 users

cosors

👀

"This new robot is taking its first intuitive steps​

6-Oct-2023

Pitt researchers receive more than $1.6 million from the NSF to develop miniature robots that can navigate complex terrains using neuroscience concepts


There is actually a video of the robot learning to walk, but I can't embed it here, so follow the link below.

Newswise — When walking on the sidewalk, a person is able to avoid puddles, other walkers, and cracks in the pavement. It may seem intuitive – and that's because it is.

There’s actually a biological component that allows humans and other mammals to navigate our complex environments. Central Pattern Generators (CPG) are neural networks that produce rhythmic patterns of control signals for limbs using simple environmental cues. When we quickly step away to avoid something blocking our path, that’s our CPGs doing their job.

Rajkumar Kubendran, principal investigator and assistant professor of electrical and computer engineering at the University of Pittsburgh, received a $1,606,454 award from the National Science Foundation to lead a two-year project to engineer these neural networks in robots. Feng Xiong, associate professor of electrical and computer engineering at Pitt, and M.P. Anantram, professor of photonics and nano devices at the University of Washington, will serve as co-principal investigators.

“While these networks are natural for us, there is currently no efficient way to replicate them using electronic devices and computers,” Kubendran explained. “Agile robots that can explore unknown and treacherous terrains have the potential to enable autonomous navigation for commercial transport, enhance disaster response during floods and earthquakes or to remote and unsafe areas like malfunctioning nuclear plants or space exploration.”

A Robot on the Move

Building bio-inspired robots isn’t something new, but modeling, designing and implementing neuromorphic networks with synapses and neurons inside miniature robots is a novel step forward – or, in the robot’s case, also backwards, left and right.

Neuromorphic engineering – computing inspired by the human brain – will be key to achieving efficient, adaptive sensorimotor control in these robots.

“We aim to demonstrate a fully functional quadropod or hexapod robot that can learn to move, using principles informed by neuroscience, leading to biomimetic sensorimotor control for energy-efficient locomotion, using learning algorithms running on bio-realistic neural networks, built by using semiconductor technology that expands beyond its normal limits,” Kubendran said.

These robots will be able to avoid a puddle just as intuitively as humans by constantly learning about movement. The group will take inspiration from neural circuitry found in biology that supports agile movement control. The team plans to incorporate non-linear temporal dynamics in mixed-feedback systems to build bio-inspired neural networks and implement them on scalable energy-efficient hardware.

To meet the challenging demands of neuromorphic engineering, the team is also developing the NeuRoBots* educational consortium. The consortium will train a new generation of engineers and researchers through evidence-based best practices to prepare them for the rapidly evolving needs of the industry.

“The breadth of skill sets that are required to effectively train a new cadre of workforce in neuromorphic computing for robotics makes curriculum design and integration with existing frameworks challenging,” Kubendran said. “We need help to prepare our engineers for this changing technical environment.”

The project, “Bio-inspired sensorimotor control for robotic locomotion with neuromorphic architectures using beyond-CMOS materials and devices,” is set to begin in 2024 and is part of a larger, $45 million initiative by the NSF to invest in the future of semiconductors."
https://www.newswise.com/articles/this-new-robot-is-taking-its-first-intuitive-steps

*There is no mention of this here on TSE yet.
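
For anyone unfamiliar with the Central Pattern Generator idea mentioned in the article, here is a toy sketch (my own illustration, not the Pitt/NSF project's model or code): four coupled phase oscillators settle into a trot-like rhythm in which diagonal legs move in phase, producing rhythmic limb commands from nothing more than a base frequency and a coupling rule.

```python
# Toy central pattern generator (CPG): four phase oscillators coupled so that
# diagonal legs lock in phase, giving a trot-like rhythm. Illustrative only --
# not the Pitt/NSF project's model.
import numpy as np

n_legs = 4                    # order: front-left, front-right, hind-left, hind-right
freq_hz = 1.5                 # base stepping frequency
coupling = 2.0                # coupling strength between oscillators
# Desired offsets for a trot: FL & HR in phase, FR & HL half a cycle later.
target_phase = np.array([0.0, np.pi, np.pi, 0.0])

dt, steps = 0.01, 500
phase = np.random.default_rng(1).uniform(0, 2 * np.pi, n_legs)

for _ in range(steps):
    # Each oscillator advances at its base frequency and is pulled toward
    # the desired phase relationship with every other oscillator.
    dphase = 2 * np.pi * freq_hz * np.ones(n_legs)
    for i in range(n_legs):
        for j in range(n_legs):
            desired = target_phase[j] - target_phase[i]
            dphase[i] += coupling * np.sin(phase[j] - phase[i] - desired)
    phase = (phase + dphase * dt) % (2 * np.pi)

# A simple readout: sin(phase) > 0 could mean "swing", < 0 "stance".
leg_commands = np.sin(phase)
wrap = lambda x: abs(float(np.angle(np.exp(1j * x))))   # wrap phase gap to [0, pi]
print("leg commands:", np.round(leg_commands, 2))
print("FL-HR phase gap (should be ~0): ", round(wrap(phase[0] - phase[3]), 2))
print("FL-FR phase gap (should be ~pi):", round(wrap(phase[0] - phase[1]), 2))
```

Real CPG controllers add sensory feedback and adaptation on top of this, but the core point stands: simple coupled oscillators generate stable rhythmic control signals for limbs.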
 
Last edited:
  • Like
  • Love
  • Thinking
Reactions: 5 users

Frangipani

Regular
  • Like
  • Love
  • Fire
Reactions: 5 users

Sam

Nothing changes if nothing changes
So, apart from your comment being the equivalent of the Nigerian prince email I just sent to spam, why should everyone currently invested in BRN be waiting with bated breath for the new Nintendo Switch?
Speak for yourself willy
 
  • Like
  • Haha
Reactions: 5 users

Sam

Nothing changes if nothing changes
  • Haha
Reactions: 3 users

wilzy123

Founding Member
Speak for yourself willy

You are literally on par with champ with respect to the quality of your contribution here.
 
  • Haha
  • Like
Reactions: 4 users

Sam

Nothing changes if nothing changes

Attachments

  • 6A4F1EB2-64E9-4964-9445-448FAC66CE56.gif (790.5 KB)
  • Haha
  • Like
  • Love
Reactions: 13 users

Sam

Nothing changes if nothing changes
As good as your nonsense
 
  • Like
  • Fire
Reactions: 4 users

Sam

Nothing changes if nothing changes
I respect a lot of the posters here. They aren't time-poor like me; your contribution includes bullying people to inflate your flaccid wilzy!
Sorry another typo
 
  • Like
  • Haha
  • Fire
Reactions: 5 users

greatlake

Regular
  • Like
  • Fire
Reactions: 4 users

wilzy123

Founding Member
Sorry another typo

Maybe take your low value bar talk to the BRN bar chat https://thestockexchange.com.au/thr...ndly-banter-anything-goes-all-welcome.170934/

It's basically a great outlet for enthusiasm like yours. I'd hate for that to go to waste, so I thought I'd share. Also, what you contribute in this thread isn't exactly helpful, useful or relevant. You're basically just cluttering up this entire thread with BS and triggering the need for a new wave of muppets to be removed from this forum.

Appreciate your help with this.
 
  • Like
  • Thinking
  • Sad
Reactions: 7 users