BRN Discussion Ongoing

Well, on the ADAS AI front from a few days ago: they are predominantly looking for Python and CNN skills in India.

Unless they're using Akida's CNN2SNN toolchain, I'd suggest this could be a push re Akida on this side at least. Though they do leave the door slightly ajar with the... "other machine learning based models" comment.
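
For anyone who hasn't seen the CNN2SNN flow, here's a minimal sketch of what "using CNN2SNN" would mean in practice, going off BrainChip's MetaTF docs from memory (exact function names and arguments vary between MetaTF releases, so treat the details as assumptions):

```python
# Minimal sketch of BrainChip's CNN-to-SNN workflow (MetaTF).
# Function names/arguments are from memory and may differ between versions.
from cnn2snn import quantize, convert
from akida_models import akidanet_imagenet  # an Akida-compatible Keras CNN

# 1. Start from an Akida-compatible Keras CNN.
keras_model = akidanet_imagenet(input_shape=(224, 224, 3))

# 2. Quantize weights and activations to a low bit-width (4-bit here);
#    newer MetaTF releases moved this step into the quantizeml package.
quantized = quantize(keras_model, weight_quantization=4, activ_quantization=4)

# 3. Convert the quantized CNN to an event-based Akida model, runnable on
#    Akida silicon or in the software simulator.
akida_model = convert(quantized)
akida_model.summary()
```

The point being: if MB were taking this path, "Python + CNN" is still exactly the skill set you'd advertise for, since the SNN conversion only happens at the end of the pipeline.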

Infotainment / HMI?


April 2023
Mercedes-Benz Research and Development India Private Limited

ADAS - AI/ML, Python Developer, Algorithm Evaluation (SE)

Tasks

- Responsible for creating evaluation pipelines and KPIs for CNN-based SW modules to evaluate SW performance.
- Responsible for interacting with multiple stakeholders to understand the requirements and prepare KPIs to evaluate the SW module.
- Strong understanding of object-oriented programming in Python and C++.
- Experience with Convolutional Neural Networks (CNNs) or other machine learning based models.
- Has a good understanding of the training, and especially the evaluation, of CNN models.
- Ideally experience/knowledge of models for object detection, sensor fusion, prediction, planning & control, ...
- Can identify problems with specific architectures of CNNs.
- A plus is automotive experience, ideally in the field of "Advanced Driver Assistance Systems" (ADAS).
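
To make the "evaluation pipelines and KPIs" line concrete, here's a hypothetical minimal sketch of the kind of KPI such a pipeline computes for a detection module (every name below is invented for illustration, nothing is from the ad):

```python
# Hypothetical sketch of one KPI in a CNN-detection evaluation pipeline.
from dataclasses import dataclass

@dataclass
class Box:
    x1: float
    y1: float
    x2: float
    y2: float

def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a.x1, b.x1), max(a.y1, b.y1)
    ix2, iy2 = min(a.x2, b.x2), min(a.y2, b.y2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r.x2 - r.x1) * (r.y2 - r.y1)
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def recall_kpi(gt: list, pred: list, thr: float = 0.5) -> float:
    """KPI: fraction of ground-truth boxes matched by a prediction at IoU >= thr."""
    matched = sum(1 for g in gt if any(iou(g, p) >= thr for p in pred))
    return matched / len(gt) if gt else 1.0

# One ground-truth object, one roughly overlapping detection from the SW module.
print(f"recall@0.5 = {recall_kpi([Box(10, 10, 50, 50)], [Box(12, 8, 52, 48)]):.2f}")
```

A real pipeline would run this over thousands of logged frames and report the KPI per scenario, but the shape of the job is the same.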
 
  • Like
  • Fire
Reactions: 8 users


suss

Regular

Nice to see someone from Woolworths asking if Akida is used... /s

Fascinating development from Mercedes.
 
  • Haha
  • Like
Reactions: 11 users

suss

Regular
Reminds me of the predictive maintenance that BrainChip promotes a lot.
 
  • Like
Reactions: 4 users
This enables them to implement artificial neural networks (ANN) in series-production processors. Now patented, this workflow opens up all sorts of possible applications in a wide range of areas, including powertrain.

Powertrain?

 
  • Like
  • Fire
  • Love
Reactions: 103 users


Diogenese

Top 20
Sorry, mea culpa.
I'm off on a road trip in NZ with my mad 80-year-old brother.
Bit like this, actually, so I'm letting the threads look after themselves atm.



Looking forward to meeting some of you at the AGM. :ROFLMAO:

... but what are you doing in the afternoon?
 
  • Haha
Reactions: 3 users

Learning

Learning to the Top 🕵‍♂️
Powertrain?




Thanks for sharing, Top Bloke.

What a coincidence that BrainChip's investor presentation was in 2019, with different use cases for Akida, as per thelitteshort's post.

And Markus Schafer says in his latest post: "Back in 2019, we defined a set of clear principles for how we work with AI to provide us with an operational framework."



Learning 🏖
 
  • Like
  • Love
  • Fire
Reactions: 56 users

GDJR69

Regular

Surely this has to be Akida given the backdrop of the EQXX, the fact they were looking at it for other applications, and the fact that we are the only neuromorphic chip provider on the market. It seems pretty unlikely that they have just come up with a new MB Akida themselves. Open to discussion, but to me this sounds like potentially very good news!
 
  • Like
  • Fire
  • Thinking
Reactions: 28 users



wilzy123

Founding Member
I agree that is what Sean said. There were a number of us dissecting what he said afterwards, whether it meant growth percentages or a dollar-figure amount, but you are correct: he never mentioned break-even.

The whole economy pooping itself and Putin happened, which is what some (incl. myself) assumed was the reason it didn't eventuate. I hope we get some clarification or explanation from Sean at the AGM as to why it didn't eventuate.
Oh great yes so much yes good post
 
  • Haha
  • Like
Reactions: 8 users

GDJR69

Regular
Further to my post above, another subtlety about the MB post that I didn't notice initially, but which is interesting and to my mind unlikely to be a coincidence, is the language he uses around beneficial AI. Sound familiar?

Back in 2019, we defined a set of clear principles for how we work with AI to provide us with an operational framework. The four guiding notions under which we develop and use AI are: “responsible use”, “ease of explanation”, “privacy protection” and “safety and reliability”. I am very excited by this progress and at the same time acutely aware of our responsibilities as leaders in our field. By pushing innovation while at the same time adhering to our principles, I believe we can help unleash the true benefits of AI in a sustainable way.

Now, is it any coincidence that sometime after this, one of the BRN slogans was 'beneficial AI'? This suggests to me that MB and BrainChip have been working closely together for a long time.
 
  • Like
  • Fire
  • Thinking
Reactions: 33 users

Foxdog

Regular
QUOTE="wilzy123, post: 288795, member: 9"]
Oh great yes so much yes good post
View attachment 35504
[/QUOTE]
Man my hip flexors feel better just watching this little guy in action, not to mention the hammies - what a legend 😂
 
  • Haha
  • Like
Reactions: 5 users

Tothemoon24

Top 20
  • Like
  • Fire
  • Thinking
Reactions: 17 users
Evening team, can't remember seeing this.

TCS using Akida for spacetech, 'cloud cover detection' and prob 'lossless image compression' on small satellites.

 
  • Like
  • Fire
  • Love
Reactions: 133 users

Diogenese

Top 20
Evening team, can't remember seeing this.

TCS using Akida for spacetech, 'cloud cover detection' and prob 'lossless image compression' on small satellites.

Two great indicators:

1. Gilles Bezard (BrainChip) is an author on the cloud cover satellite work;
2. Arindam Basu, a professor at City University of Hong Kong, asked about BrainChip for the SNN in the edge satellite, and Airijit Mukherjee replied: "Yes. The cloud cover detection work is entirely on Akida."

It would be nice to see the BrainChip blog/Linkedin ... give this achievement a bit of publicity when the papers are published.
 
  • Like
  • Love
  • Fire
Reactions: 59 users

Tothemoon24

Top 20
FIRST SPACE-BASED NEUROMORPHIC CAMERA
WHAT IS IT?

Falcon Neuro is an experiment flying on the International Space Station (ISS), designed and built by cadets and faculty at the United States Air Force Academy (USAFA). Falcon Neuro demonstrates, for the first time, the use of biologically inspired event-based, or neuromorphic, cameras in space.
Falcon Neuro is small, but mighty! It contains two neuromorphic cameras that were modified for use in space by Western Sydney University (WSU) in Sydney, Australia.
One camera looks down (Nadir) and one camera looks forward (Ram). Both cameras are controlled by electronics developed in the Space Physics and Atmospheric Research Center (SPARC) in the Department of Physics and Meteorology at USAFA.
Falcon Neuro got a ride to the ISS from the DoD Space Test Program (STP); it started operations in January 2022 and will continue until January 2024. Falcon Neuro is run from a dedicated ground station at USAFA and performs operations on most weekdays. The two cameras have captured more than 200 recordings to date.
Flight unit of Falcon Neuro showing the Nadir and Ram cameras (left) and CAD model (right). Photo credit: USAFA SPARC
FACTS ABOUT FALCON NEURO
  • First demonstration of neuromorphic cameras in space
  • Hyper-temporal observations with low bandwidth
  • Full motion video — day or night
  • Watch what is important — Detect what changes in a scene, not the background that stays the same
HOW DOES IT WORK?
Event-based, or neuromorphic, cameras work more like the human eye than a regular camera such as the one in a cell phone. Using complex circuitry, each pixel in Neuro records an event when the brightness at that pixel changes. Event cameras are high-speed, tens to hundreds of times faster than a standard video camera, and their unique circuitry delivers this speed while transmitting a much smaller amount of data to the ground.
An event is just the position of the pixel (row and column), the time, and whether the brightness increased or decreased (polarity). Operators can send commands that change 21 different biases in the circuitry, allowing Neuro to record brighter or dimmer events at the expense of more data to bring to the ground.
The data from Falcon Neuro is a long list of events that are post-processed on the ground by cadets and faculty at USAFA. The data is rendered at a frame rate chosen on the ground, allowing maximum flexibility in visualizing the data.
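Since the downlink is just that event list, the ground-side rendering step is easy to picture. Here's a minimal illustrative sketch of binning events into frames at a frame rate chosen after the fact (my own illustration, not USAFA's actual code; the sensor resolution is assumed):

```python
# Sketch: render a downlinked event list into frames at a chosen frame rate.
# Illustrative only; resolution and field layout are assumptions.
import numpy as np

# Each event: pixel row, pixel column, timestamp (microseconds), polarity +/-1.
events = np.array(
    [(120, 64, 1_000, +1), (121, 64, 1_500, -1), (120, 65, 12_000, +1)],
    dtype=[("row", int), ("col", int), ("t_us", int), ("pol", int)],
)

def render(events, shape=(180, 240), fps=100):
    """Accumulate event polarities into frames, each spanning 1/fps seconds."""
    frame_us = 1_000_000 // fps
    n_frames = int(events["t_us"].max() // frame_us) + 1
    frames = np.zeros((n_frames, *shape), dtype=np.int32)
    for e in events:
        frames[e["t_us"] // frame_us, e["row"], e["col"]] += e["pol"]
    return frames

# The frame rate is picked after the fact -- the flexibility described above.
print(render(events, fps=100).shape)   # (2, 180, 240)
print(render(events, fps=1000).shape)  # (13, 180, 240)
```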
WHY IS IT IMPORTANT?
Neuromorphic cameras are new tools for space-based imagery. Since they read out much less data, much more quickly, they allow hyper-temporal (faster than real-time) recording. This is important for detecting challenging threats such as hypersonic re-entry vehicles, missiles or fast-moving aircraft. Future Air and Space Force leaders training at USAFA are learning about space through first-hand experience. Falcon Neuro has allowed cadets from different backgrounds and departments to participate in this new cutting-edge experiment. Cadets from the Departments of Physics and Meteorology, Astronautics, Electrical, Mechanical, Computer, Aero and Military Strategic Studies have all learned about space by helping build, test and fly Neuro.
Comparison of a NASA ISS HD camera image and a Neuro motion-compensated image of Honduras. Photo credit: United States Air Force Academy and Western Sydney University
ADDITIONAL FACTS
Falcon Neuro has proven so successful that a follow-on experiment called Falcon ODIN (Optical Defense and Intelligence through Neuromorphics) is planned. Falcon ODIN will contain greatly improved optics provided by AFRL's Space Vehicles Directorate and a new, more sensitive focal plane array. Falcon ODIN is under construction in SPARC, and testing of the flight unit will start in the fall of 2023.
Captain Hayden Richards and Cadet Madison Yates (Mechanical Engineering) with the newly finished Neuro. Photo credit: United States Air Force Academy's Space Physics and Atmospheric Research Center
Falcon ODIN will be delivered to the DoD Space Test Program in late 2023 and fly to the ISS to continue research in neuromorphic technology in 2025.
Data from both experiments are freely available to government agencies and their contractors. These data are recorded in a wide variety of background lighting conditions and provide a unique “space-truth” that will be important for the validation of sensor modeling currently underway at USAFA, AFRL, WSU and the Air Force Institute of Technology.
Source: Air Force Research Laboratory fact sheet
 
  • Like
  • Fire
  • Thinking
Reactions: 24 users
