BRN Discussion Ongoing

Boab

I wish I could paint like Vincent
The very favourable Forbes article is now on the BrainChip website under PR.
Enjoy if you haven't already had a chance to see it.
Gotta love this bit.
"Additionally, there are no other off-the-shelf solutions that can provide comparable low-power, high-accuracy, and real-time AI processing".
 
Reactions: 48 users

mcm

Regular
Had a sneaky look back on HC. Still a cesspit!... But regardless, it's getting harder every day to defend the position here that we're not at any risk of being overtaken. Right now that seems about as likely as a contract or some actual bloody revenue being announced. Where is the management? :LOL: IMO
Pretty much the only posts on the Crapper worth reading are those by Fact Finder and rayz. I much appreciate their efforts.
 
Reactions: 30 users

Tothemoon24

Top 20

Northrop Grumman Australia and Spiral Blue team up for missile defence

6 August 2024



A new iLAuNCH Trailblazer project aims to strengthen defence research in Australia by maturing a full stack missile detection capability.

Northrop Grumman Australia and Spiral Blue are working together on the project.




"We are bringing together Australian knowhow to build an advanced space surveillance system for missile defence tailored for deployment on small satellites,” said iLAuNCH Trailblazer Executive Director, Darin Lovett.

This mission aspires to cultivate a new sovereign capability for hypersonic missile detection through the fusion of cutting-edge infrared detection technology and industry-leading space-based artificial intelligence (AI) and machine learning (ML) data processing techniques.

“Through mission engineering, we plan to rapidly progress promising future operational concepts up the TRL ladder. We will catalyse the expertise in hypersonic vehicle signatures, space-based detection technologies, and the use of AI/ML techniques for threat identification from our different project partners to inform the development of a discriminating capability to the end user,” said Northrop Grumman Australia’s Technology Outreach Manager, Dr Dushy Tissa.

Anticipated outcomes encompass hypersonic vehicle radiance and trajectory modelling, along with the comprehensive design and rigorous analysis of an infrared electro-optical system that leverages state-of-the-art infrared detectors and space-edge AI/ML hardware and algorithms for on-board event detection.
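
For readers wondering what "on-board event detection" means in practice, here is a minimal, purely hypothetical sketch (not from this project) of temporal-contrast thresholding on successive infrared frames, the kind of sparse, event-style pre-processing that space edge AI/ML hardware is typically fed. The frame size, threshold and function name are made up for illustration.

import numpy as np

def detect_events(prev_frame: np.ndarray, frame: np.ndarray, threshold: float = 0.15):
    """Report pixels whose infrared radiance changed by more than `threshold`
    between two consecutive frames (a crude temporal-contrast event detector).

    Generic illustration only; the actual iLAuNCH payload design, sensor
    format and detection algorithm are not public.
    """
    delta = frame.astype(np.float32) - prev_frame.astype(np.float32)
    ys, xs = np.nonzero(np.abs(delta) > threshold)
    polarity = np.sign(delta[ys, xs]).astype(np.int64)   # +1 brighter, -1 dimmer
    return np.stack([ys, xs, polarity], axis=1)          # sparse "events" for the AI stage

# Two synthetic 64x64 IR frames with one hot, moving target.
prev = np.zeros((64, 64), dtype=np.float32)
curr = np.zeros((64, 64), dtype=np.float32)
prev[30, 30] = 1.0   # target position at t0
curr[30, 32] = 1.0   # target has moved by t1
print(detect_events(prev, curr))   # only the changed pixels are reported downstream

The point of an event-style front end is that only changed pixels reach the downstream AI/ML hardware, which is what keeps on-board power budgets small.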
 
Reactions: 15 users

7für7

Top 20
What do you think about this Sony/Honda car? I know they collaborate with Microsoft Azure.. but you know? Since it's just a concept car and they plan to bring it to market by 2026… there is still time for some changes, right?

Sony? Do you read this? Smart changes… cost-effective changes.. both cloud-free and cloud-based options available, and so on… 🙌

 
Reactions: 2 users
Nice find Tech,

I like this bit ...

[0109] The processes and logic flows described in this specification can be performed by one or more programmable computers executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array), an ASIC (application specific integrated circuit) a neuromorphic research chip, such as Intel's Loihi chip, or a neural network processor, such as BrainChip's Akida™ chip.
Snap.

Accenture just had another patent published in May too, for gesture recognition via neuromorphic computing, although it doesn't have Gallo on this one.



Reactions: 71 users


Job description

Who We Are:

Join Our Team! The Tactical Aerospace team is a premier supplier of avionics and aerospace technology for new and legacy DoD systems. If you like avionics, radar systems, or even supporting EW & SIGINT systems, Tactical Aerospace is the place to be. Come Join Us! This position has the option of being fully remote in the following states only: AL, AZ, AR, District of Columbia, FL, GA, MA, MD, MI, MN, NH, NJ, NC, OH, OK, OR, TX, UT, VA, WV. The alternate work location is San Antonio, TX.

Objectives of this Role:

  • This role is intended to lead the Neuromorphic/Cognitive AI research and development team and will drive strategies and implementations of our AI solutions to meet our customers’ expectations.
  • Lead the development, machine learning (ML), and test of AI as applied to Systems, UAS, Avionics, EW, and/or aerospace subsystems.
  • Lead the AI team to create and implement AI technologies/functionality and deployment strategies.
  • Direct staff in the performance of literature reviews, interface with academic institutions, lead the development of proposals, and lead the implementation and deployment of those systems.
  • Lead a development team in code development (Python, C), provide AI training to the internal SwRI staff, lead the AI test group, implement algorithms, and perform various analyses.
  • Lead the AI team in the implementation of Spiking Neural Network (SNN) techniques as well as Generation 2 AI.
Daily and Monthly Responsibilities:

  • Develop Solutions for AI systems and embedded aerospace/avionics systems and subsystems.
  • Will work on 2nd and 3rd Gen AI systems (Cognitive & Neuromorphic AI).
  • Develop Solutions for neuromorphic systems, EW, SigInt, Situational Awareness, Drones (UAS/UAV), Avionics, AI/ML sensor correlation/fusion, etc.
  • Perform Data Science, Data Flow/Analysis duties, provide simulations, and integrate onto hardware.
  • Support business development activities.
  • Will also support non-AI programs.

 
Reactions: 9 users

davidfitz

Regular
Interesting news lately about Brainchip :giggle:

Reactions: 7 users

7für7

Top 20
Interesting news lately about Brainchip :giggle:

It’s not his fault actually… the media is just too incompetent to call it what it should be called: NEURALINK!! Elon, Neuralink and BrainChip… what is that? This is not even helpful for BrainChip, to be honest.
 
Reactions: 5 users

Gazzafish

Regular

Extract:-

Tier 1: Pure-Play Neuromorphic Computing Stocks

The pure-play neuromorphic computing stocks represent the cutting edge of this nascent industry. These companies stake their entire business model on the potential of these brain-inspired chips. While this focus creates higher risk, it also offers the most direct exposure for investors bullish on neuromorphic computing. With only one public company currently in this tier, it underscores just how early we are in the neuromorphic computing market.

BrainChip Holdings (ASX: BRN)

BrainChip Holdings (ASX: BRN) is a first-mover in commercial neuromorphic computing, with a focus on energy-efficient edge AI.

Australia-based BrainChip is a pioneer in commercializing neuromorphic computing, focusing on edge AI solutions. The company has developed an Edge AI platform that combines innovative silicon IP, software, and machine learning. This platform includes the Akida neuromorphic processor. Akida is designed to process information in a way that mimics the human brain from a fundamental hardware level. This “imitation” goes beyond the deep neural networks used in today’s AI models.

BrainChip enjoys first-mover advantage in commercial neuromorphic computing. The company’s technology has several unique features, including microwatt power consumption and on-chip learning, while being able to support standard machine learning workflows. In fact, it offers a claimed 5-10x improvement in performance-per-watt over traditional AI accelerators. This would make the Akida chip ideal for battery-powered devices, edge computing, and in-sensor intelligence.

The company is pursuing a flexible business model centered on high-margin IP licensing. This strategy involves upfront license fees and ongoing royalties, which could provide steady revenue as adoption grows. BrainChip’s intellectual property portfolio includes 17 granted patents and 30 pending patents. The company’s team consists of 80% engineers, with 15% holding PhDs from leading AI research programs. BrainChip is also building partnerships with system integrators, including MegaChips, Prophesee, and SiFive.”
 
Reactions: 40 users

Evermont

Stealth Mode

Intel Foundry Achieves Major Milestones

Intel 18A powered on and healthy, on track for next-gen client and server chip production next year.

BrainChip is an IP Partner of IFS. Worth reading the second link as well.




 
Reactions: 27 users
Regarding this Intel 18A platform, if I am reading this correctly it will be the customer who chooses their particular design and whether it is to include BRN's Akida. Is that correct?
 
Reactions: 1 users

Diogenese

Top 20

Extract:-

Tier 1: Pure-Play Neuromorphic Computing Stocks

The pure-play neuromorphic computing stocks represent the cutting edge of this nascent industry. These companies stake their entire business model on the potential of these brain-inspired chips. While this focus creates higher risk, it also offers the most direct exposure for investors bullish on neuromorphic computing. With only one public company currently in this tier, it underscores just how early we are in the neuromorphic computing market.

BrainChip Holdings (ASX: BRN)

BrainChip Holdings (ASX: BRN) is a first-mover in commercial neuromorphic computing, with a focus on energy-efficient edge AI.

Australia-based BrainChip is a pioneer in commercializing neuromorphic computing, focusing on edge AI solutions. The company has developed an Edge AI platform that combines innovative silicon IP, software, and machine learning. This platform includes the Akida neuromorphic processor. Akida is designed to process information in a way that mimics the human brain from a fundamental hardware level. This “imitation” goes beyond the deep neural networks used in today’s AI models.

BrainChip enjoys first-mover advantage in commercial neuromorphic computing. The company’s technology has several unique features, including microwatt power consumption and on-chip learning, while being able to support standard machine learning workflows. In fact, it offers a claimed 5-10x improvement in performance-per-watt over traditional AI accelerators. This would make the Akida chip ideal for battery-powered devices, edge computing, and in-sensor intelligence.

The company is pursuing a flexible business model centered on high-margin IP licensing. This strategy involves upfront license fees and ongoing royalties, which could provide steady revenue as adoption grows. BrainChip’s intellectual property portfolio includes 17 granted patents and 30 pending patents. The company’s team consists of 80% engineers, with 15% holding PhDs from leading AI research programs. BrainChip is also building partnerships with system integrators, including MegaChips, Prophesee, and SiFive.”
Hi Gazza,

Thanks for this. We've seen a couple of articles along this line, which allow me to maintain my illusion about software as a product.

As you may or may not know, I've been posting about the possibility of our EAPs using Akida simulation software, particularly after the emergence of Akida 2 + TeNNs more than 30 months ago, with Valeo and MB using software for signal processing. This sentence from the article again adds more grist to that rumor mill:

"The company has developed an Edge AI platform that combines innovative silicon IP, software, and machine learning."

To repeat myself, no potential user would commit to Akida 2 in silicon while the tech was in a state of flux. The use of software AI is not so problematic in ICEs as it is in EVs, but from what we've heard about TeNNs, the power and latency could be tolerated in EVs using TeNNs in software. Of course, the software would sensibly include the full Akida 2 simulation including TeNNs, or TeNNs could be used on its own. Software can be readily updated as new developments are implemented, whereas silicon is set in stone.
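
To make the "Akida 2 in software" idea concrete, here is a minimal, purely illustrative sketch of a leaky integrate-and-fire layer stepped in plain Python/NumPy. This is not BrainChip's MetaTF simulator and not the TeNNs architecture; every parameter and name below is made up. It only shows the general point that an event-driven model can run, and be retrained or restructured, entirely as software before anything is committed to silicon.

import numpy as np

class LIFLayer:
    """A toy leaky integrate-and-fire layer simulated in software.

    Illustrative only: the weights, leak and threshold are arbitrary, and this
    is not the Akida 2 / TeNNs design, just the software-simulation concept.
    """
    def __init__(self, n_in: int, n_out: int, leak: float = 0.9, threshold: float = 1.0):
        rng = np.random.default_rng(0)
        self.w = rng.normal(scale=0.5, size=(n_in, n_out))   # synaptic weights
        self.leak = leak
        self.threshold = threshold
        self.v = np.zeros(n_out)                             # membrane potentials

    def step(self, in_spikes: np.ndarray) -> np.ndarray:
        # Integrate weighted input events, apply leak, then fire and reset.
        self.v = self.leak * self.v + in_spikes @ self.w
        out_spikes = (self.v >= self.threshold).astype(np.float32)
        self.v = np.where(out_spikes > 0, 0.0, self.v)       # reset neurons that fired
        return out_spikes

layer = LIFLayer(n_in=16, n_out=4)
spike_train = (np.random.default_rng(1).random((10, 16)) < 0.2).astype(np.float32)
for t, events in enumerate(spike_train):
    print(t, layer.step(events))

A model like this can be changed with a software update, which is exactly the contrast with taped-out silicon being drawn above.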

It's been several months since Anil announced the proposed tapeout of Akida 2, which suggests that the development had reached a satisfactory plateau of stability sufficient for the company to commit to silicon. The tapeout was subsequently "delegated" to a mysterious "other" - the rest is silence.

The SPP talks about developing a cloud-based FPGA demonstration setup, again, not a tapeout. This would be a cheaper way to obtain customer feedback than taping out and making a batch of "engineering sample" chips.

Presumably the mysterious "other" would want to be in on the results of the cloud feedback before going to silicon.

Does this mean that we need to wait for the cloud FPGA venture to provide meaningful results before the tapeout can be implemented? - sigh!

Does it follow that BRN will become a software provider, at least in the short term?
 
Reactions: 43 users
Hi Gazza,

Thanks for this. We've seen a couple of articles along this line, which allow me to maintain my illusion about software as a product.

As you may or may not know, I've been posting about the possibility of our EAPs using Akida simulation software, particularly after the emergence of Akida 2 + TeNNs more than 30 months ago, with Valeo and MB using software for signal processing. This sentence from the article again adds more grist to that rumor mill:

"The company has developed an Edge AI platform that combines innovative silicon IP, software, and machine learning."

To repeat myself, no potential user would commit to Akida 2 in silicon while the tech was in a state of flux. The use of software AI is not so problematic in ICEs as it is in EVs, but from what we've heard about TeNNs, the power and latency could be tolerated in EVs using TeNNs in software. Of course, the software would sensibly include the full Akida 2 simulation including TeNNs, or TeNNs could be used on its own. Software can be readily updated as new developments are implemented, whereas silicon is set in stone.

It's been several months since Anil announced the proposed tapeout of Akida 2, which suggests that the development had reached a satisfactory plateau of stability sufficient for the company to commit to silicon. The tapeout was subsequently "delegated" to a mysterious "other" - the rest is silence.

The SPP talks about developing a cloud-based FPGA demonstration setup, again, not a tapeout. This would be a cheaper way to obtain customer feedback than taping out and making a batch of "engineering sample" chips.

Presumably the mysterious "other" would want to be in on the results of the cloud feedback before going to silicon.

Does this mean that we need to wait for the cloud FPGA venture to provide meaningful results before the tapeout can be implemented? - sigh!

Does it follow that BRN will become a software provider, at least in the short term?

@Diogenese, what do you mean by "The tapeout was subsequently 'delegated' to a mysterious 'other'"?
 

Frangipani

Regular
Reactions: 36 users

FJ-215

Regular
Hi Gazza,

Thanks for this. We've seen a couple of articles along this line, which allow me to maintain my illusion about software as a product.

As you may or may not know, I've been posting about the possibility of our EAPs using Akida simulation software, particularly after the emergence of Akida 2 + TeNNs more than 30 months ago, with Valeo and MB using software for signal processing. This sentence from the article again adds more grist to that rumor mill:

"The company has developed an Edge AI platform that combines innovative silicon IP, software, and machine learning."

To repeat myself, no potential user would commit to Akida 2 in silicon while the tech was in a state of flux. The use of software AI is not so problematic in ICEs as it is in EVs, but from what we've heard about TeNNs, the power and latency could be tolerated in EVs using TeNNs in software. Of course, the software would sensibly include the full Akida 2 simulation including TeNNs, or TeNNs could be used on its own. Software can be readily updated as new developments are implemented, whereas silicon is set in stone.

It's been several months since Anil announced the proposed tapeout of Akida 2, which suggests that the development had reached a satisfactory plateau of stability sufficient for the company to commit to silicon. The tapeout was subsequently "delegated" to a mysterious "other" - the rest is silence.

The SPP talks about developing a cloud-based FPGA demonstration setup, again, not a tapeout. This would be a cheaper way to obtain customer feedback than taping out and making a batch of "engineering sample" chips.

Presumably the mysterious "other" would want to be in on the results of the cloud feedback before going to silicon.

Does this mean that we need to wait for the cloud FPGA venture to provide meaningful results before the tapeout can be implemented? - sigh!

Does it follow that BRN will become a software provider, at least in the short term?
Hi @Diogenese,

One question I would have for Sean is: how do we provide yield numbers for Akida 2 without proving it out in silicon? Yes, we have AKD 1000/1500 from two different foundries, but no commercial runs to speak of.

If I remember the ARM history correctly, they were on their 4th or 5th run of commercial chips before going down the IP route that BRN is trying to copy.
 
Reactions: 1 users

FiveBucks

Regular
Hi @Diogenese,

One question I would have for Sean is: how do we provide yield numbers for Akida 2 without proving it out in silicon? Yes, we have AKD 1000/1500 from two different foundries, but no commercial runs to speak of.

If I remember the ARM history correctly, they were on their 4th or 5th run of commercial chips before going down the IP route that BRN is trying to copy.
Did we jump the gun by going the IP route?

Or are we ahead of the game?
 
Reactions: 1 users

FJ-215

Regular
Reactions: 1 users

Frangipani

Regular
Researchers at UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL), led by Jeffrey Krichmar, have been experimenting with AKD1000:








[attached images]

This is the paper I linked in my previous post, co-authored by Lars Niedermeier, a Zurich-based IT consultant, and the above-mentioned Jeff Krichmar from UC Irvine.


[attached image]


The two of them co-authored three papers in recent years, including one in 2022 with another UC Irvine professor and member of the CARL team, Nikil Dutt (https://ics.uci.edu/~dutt/) as well as Anup Das from Drexel University, whose endorsement of Akida is quoted on the BrainChip website:

[attached images]


Lars Niedermeier’s and Jeff Krichmar’s April 2024 publication on CARLsim++ (which does not mention Akida) ends with the following conclusion and the acknowledgement that their work was supported by the Air Force Office of Scientific Research (the funding has been going on at least since 2022) and by a UCI Beall Applied Innovation Proof of Product Award (https://innovation.uci.edu/pop/); they also thank the regional NSF I-Corps (= Innovation Corps) for valuable insights.

[attached images]



Their use of an E-Puck robot (https://en.m.wikipedia.org/wiki/E-puck_mobile_robot) for their work reminded me of our CTO’s address at the AGM in May, during which he envisioned the following object (from 22:44 min):

“Imagine a compact device similar in size to a hockey puck that combines speech recognition, LLMs and an intelligent agent capable of controlling your home’s lighting, assisting with home repairs and much more. All without needing constant connectivity or having to worry about privacy and security concerns, a major barrier to adaptation, particularly in industrial settings.”

Possibly something in the works here?

The version the two authors were envisioning in their April 2024 paper is, however, conceptualised as being available as a cloud service:

“We plan a hybrid approach to large language models available as cloud service for processing of voice and text to speech.”
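
For what it's worth, here is a purely hypothetical sketch of the kind of hybrid split being described: simple, fixed commands handled on the device itself, with only open-ended requests escalated to a cloud-hosted LLM. Every name in it (LOCAL_INTENTS, cloud_llm, handle_utterance) is invented for illustration; it is neither the authors' design nor a BrainChip product.

from typing import Callable

# Hypothetical on-device intent table: small, fixed commands stay local.
LOCAL_INTENTS: dict[str, Callable[[], str]] = {
    "lights on":  lambda: "ok, lights on",
    "lights off": lambda: "ok, lights off",
}

def cloud_llm(prompt: str) -> str:
    """Placeholder for a call to a cloud-hosted LLM (network, auth, etc. omitted)."""
    return f"[cloud answer to: {prompt!r}]"

def handle_utterance(text: str) -> str:
    """Route an utterance: local keyword match first, cloud fallback second."""
    key = text.strip().lower()
    if key in LOCAL_INTENTS:          # handled on the device, no connectivity needed
        return LOCAL_INTENTS[key]()
    return cloud_llm(text)            # open-ended request goes to the cloud service

print(handle_utterance("lights on"))
print(handle_utterance("how do I fix a leaking tap?"))

The split mirrors the privacy and connectivity point in the CTO's quote above: the more that can be resolved locally, the less has to leave the device.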


The authors gave a tutorial on CARLsim++ at NICE 2024, where our CTO Tony Lewis was also presenting. Maybe they had a fruitful discussion at that conference in La Jolla, which resulted in UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL) team experimenting with AKD1000, as evidenced in the video uploaded a couple of hours ago that I shared in my previous post?





[attached images]
 

Reactions: 31 users