BRN Discussion Ongoing

IloveLamp

Top 20
Well well well........

View attachment 61358 View attachment 61359

Speaking to investors, he said: "I think Tesla is best positioned of any humanoid robot maker to be able to reach volume production with efficient inference on the robot itself."
 
  • Like
  • Wow
  • Love
Reactions: 9 users
  • Like
  • Fire
  • Love
Reactions: 35 users

Iseki

Regular
  • Haha
  • Like
Reactions: 9 users
LinkedIn = Tinder for board members, nothing more.
Well, the more company board members swiping right on us, the better! 😉

20240425_004734.jpg
20240425_004706.jpg
20240425_004634.jpg
392107620_SWIPE_RIGHT_400px.gif
 
  • Like
  • Haha
  • Fire
Reactions: 20 users

Damo4

Regular
 
  • Like
  • Love
  • Fire
Reactions: 26 users

Tothemoon24

Top 20
⬇️⬇️⬇️⬇️⬇️🔜🚀
IMG_8832.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 31 users

TECH

Regular
Hi Tech,

Yes, the CNN-to-SNN conversion is particularly telling:


View attachment 61433

Conventional gesture detection approaches demand large memory and computation power to run efficiently, thus limiting their use in power and memory constrained edge devices. Present application/disclosure provides a Spiking Neural Network based system which is a robust low power edge compatible ultrasound-based gesture detection system. The system uses a plurality of speakers and microphones that mimics a Multi Input Multi Output (MIMO) setup thus providing requisite diversity to effectively address fading. The system also makes use of distinctive Channel Impulse Response (CIR) estimated by imposing sparsity prior for robust gesture detection. A multi-layer Convolutional Neural Network (CNN) has been trained on these distinctive CIR images and the trained CNN model is converted into an equivalent Spiking Neural Network (SNN) via an ANN (Artificial Neural Network)-to-SNN conversion mechanism. The SNN is further configured to detect/classify gestures performed by user(s).
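
To make the "ANN (Artificial Neural Network)-to-SNN conversion mechanism" in that abstract a bit more concrete, here is a minimal, purely illustrative sketch (my own toy code, not BrainChip's or TCS's implementation; the layer sizes, gesture count and firing threshold are made-up assumptions): a small CNN of the kind one might train on CIR images, whose trained weights are then reused with integrate-and-fire neurons under rate coding.

```python
# Purely illustrative toy sketch (NOT BrainChip's/TCS's code): a small CNN
# trained on Channel Impulse Response (CIR) "images" is run as a rate-coded
# spiking network by reusing its trained weights with integrate-and-fire
# neurons. Sizes, class count and threshold are assumptions.
import torch
import torch.nn as nn

N_CLASSES = 5            # hypothetical number of gestures
CIR_SHAPE = (1, 32, 32)  # hypothetical CIR image shape (channels, H, W)

class GestureCNN(nn.Module):
    """Small CNN of the kind the abstract describes training on CIR images."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 8, 3, padding=1, bias=False)
        self.conv2 = nn.Conv2d(8, 16, 3, padding=1, bias=False)
        self.pool = nn.AvgPool2d(2)
        self.fc = nn.Linear(16 * 8 * 8, N_CLASSES, bias=False)

    def forward(self, x):  # used during ordinary ANN training
        x = self.pool(torch.relu(self.conv1(x)))
        x = self.pool(torch.relu(self.conv2(x)))
        return self.fc(x.flatten(1))

def snn_inference(cnn, x, timesteps=100, v_thresh=1.0):
    """Run the trained CNN as an SNN: each ReLU becomes an integrate-and-fire
    neuron, the input is presented as a constant current (rate coding), and
    the class is read out from the accumulated output activity."""
    batch = x.shape[0]
    v1 = torch.zeros(batch, 8, 32, 32)    # membrane potentials, conv1 feature map
    v2 = torch.zeros(batch, 16, 16, 16)   # membrane potentials, conv2 feature map
    out_accum = torch.zeros(batch, N_CLASSES)
    with torch.no_grad():
        for _ in range(timesteps):
            v1 += cnn.conv1(x)                    # inject input current every step
            s1 = (v1 >= v_thresh).float()         # binary spikes
            v1 -= s1 * v_thresh                   # reset by subtraction
            v2 += cnn.conv2(cnn.pool(s1))
            s2 = (v2 >= v_thresh).float()
            v2 -= s2 * v_thresh
            out_accum += cnn.fc(cnn.pool(s2).flatten(1))
    return out_accum.argmax(dim=1)

cnn = GestureCNN()             # in practice, trained on labelled CIR images first
x = torch.rand(4, *CIR_SHAPE)  # stand-in for four pre-processed CIR images
print(snn_inference(cnn, x))   # predicted gesture class per sample
```

Real conversion pipelines also rebalance per-layer thresholds and weights so that spike rates track the original ReLU activations, and the patent doesn't disclose exactly which mechanism TCS used; this is only to make the jargon tangible.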


The granting of this patent on 16 April 2024 is rather revealing: it was first filed on 14 December 2022 and published on 12 October 2023. Why do I mention this? Because, if you reread the link below, it clearly establishes a time frame... roughly 3 years, including through the Covid pandemic.

When we presented alongside Tata Consultancy Services back in Vancouver in December 2019, with the joint venture specializing in gesture movement technology in robotics, it seems that things had been underway between us for some time, protected by an NDA.

So what I'm suggesting, yes it's speculative, but with this very recent patent being granted, things appear to be fitting in with the timeline we are all aware of... an IP block within Tata clients' products appears to be closer than we may realize.

Tech's personal opinion... Happy ANZAC DAY... God Bless all those young men who fought for our freedom all those years ago... ☮️❤️

 
  • Like
  • Love
  • Fire
Reactions: 70 users

Tothemoon24

Top 20

April 2024 Newsletter

April has been buzzing with innovation and discovery here at BrainChip. We recently spent a whirlwind week at Embedded World 2024, where the team demonstrated Akida in real-life use cases and engaged in extensive conversations around AI, Edge processing, and Akida neuromorphic technology. But the journey doesn't stop there. Join us as we voyage beyond Earth's biosphere, exploring the cosmos through our recently launched podcast and guest blog post, focused on SpaceTech Robotics, featuring ANT61's CEO and Founder Mikhail Asavkin. We also had the pleasure of collaborating with our partners at Edge Impulse as we delved into the intricacies of neuromorphic processing; we’ve included their latest blog post below. With upcoming events on the horizon, where our leadership team will lead the charge, there's never been a more exciting time to be part of the BrainChip community. Thanks for joining us on this journey.
564206dd-7739-fb5f-2bd6-86ab157db57a.jpg
What an incredible experience witnessing the widespread recognition of the BrainChip brand and the enthusiastic engagement in discussions about AI, Edge processing, and Akida™ neuromorphic technology at Embedded World 2024.
Throughout the event, we had the privilege of diving deep into conversations with both new connections and long-standing partners, reinforcing BrainChip's impact in the market.
To those who joined us in person, it was fantastic to meet you face-to-face. And to those who couldn't make it, we're grateful for the opportunity to connect virtually and schedule meetings for deeper discussions.
See highlights from our time in Nuremberg, Germany:

 
  • Like
  • Fire
  • Love
Reactions: 48 users

IloveLamp

Top 20
I'm sure it's been posted previously, but I don't recall.


1000015376.jpg
 
  • Like
  • Fire
  • Love
Reactions: 29 users

Quiltman

Regular
The granting of this patent on 16 April 2024 is rather revealing: it was first filed on 14 December 2022 and published on 12 October 2023. Why do I mention this? Because, if you reread the link below, it clearly establishes a time frame... roughly 3 years, including through the Covid pandemic.

When we presented alongside Tata Consultancy Services back in Vancouver in December 2019, with the joint venture specializing in gesture movement technology in robotics, it seems that things had been underway between us for some time, protected by an NDA.

So what I'm suggesting, yes it's speculative, but with this very recent patent being granted, things appear to be fitting in with the timeline we are all aware of... an IP block within Tata clients' products appears to be closer than we may realize.

Tech's personal opinion... Happy ANZAC DAY... God Bless all those young men who fought for our freedom all those years ago... ☮️❤️


And the teams continue to support each other at the highest levels.

BBE1CC38-4D37-4F1F-B9A3-1AFCB59F3348.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 40 users

MegaportX

Regular
  • Like
  • Love
  • Fire
Reactions: 14 users

HopalongPetrovski

I'm Spartacus!
  • Haha
  • Like
  • Wow
Reactions: 6 users

Tothemoon24

Top 20



This captured data is then processed locally using edge computing or sent to a centralized server for analysis. Advanced software algorithms such as artificial intelligence (AI) and machine learning (ML) analyze this data, drawing on facial recognition algorithms that can identify a viewer’s age, gender, and mood.
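
Purely to make that pipeline concrete, here's a rough hypothetical sketch (every function and field name below is an invented placeholder, not anything from the article or a vendor API) of an edge device analysing a captured frame locally and only falling back to a central server when the on-device result isn't confident enough:

```python
# Hypothetical sketch of the edge-vs-server flow described above; all names
# and thresholds are invented placeholders, not any vendor's actual API.
from dataclasses import dataclass

@dataclass
class ViewerProfile:
    age: int
    gender: str
    mood: str
    confidence: float

def analyze_on_device(frame: bytes) -> ViewerProfile:
    # Stand-in for an on-device face-analysis model (age / gender / mood).
    return ViewerProfile(age=34, gender="female", mood="neutral", confidence=0.62)

def send_to_server(frame: bytes) -> ViewerProfile:
    # Stand-in for uploading the frame to a centralized analytics service.
    return ViewerProfile(age=34, gender="female", mood="happy", confidence=0.95)

def process_frame(frame: bytes, min_confidence: float = 0.8) -> ViewerProfile:
    """Prefer local (edge) analysis; escalate to the server only when the
    on-device result is not confident enough."""
    result = analyze_on_device(frame)
    return result if result.confidence >= min_confidence else send_to_server(frame)

print(process_frame(b"\x00" * 10))
```

The trade-off being described is the usual one: keep inference on the edge device where possible for latency and privacy, and escalate to the centralized server only when more compute or accuracy is needed.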
 
  • Like
  • Fire
Reactions: 5 users

Diogenese

Top 20
  • Like
  • Love
  • Fire
Reactions: 17 users

Diogenese

Top 20
  • Haha
  • Like
Reactions: 9 users
"We were told BRN pulled out of poducing an Akida 2 SoC ..."

Apologies I've not been here for a while... anyone can tell me where this statement is from?

-MMH

We were told BRN pulled out of producing an Akida 2 SoC because we didn't want to step on "someone's" toes.

Now we are doing a pas de deux with Nviso.

Out of 17 business-related items on Nviso's News page, 4 relate to BRN, starting from 19/4/2022:

https://nviso.ai/news/

NVISO and BrainChip partner on Human Behavioral Analytics in automotive and edge AI devices

April 19, 2022
Lausanne, Switzerland & Laguna Hills, Calif. – April 19, 2022 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of neuromorphic AI



NVISO advances its Human Behaviour AI SDK for neuromorphic computing using the BrainChip Akida platform

May 11, 2022
Lausanne, Switzerland – May 11, 2022 – nViso SA (NVISO), the leading Human Behavioural Analytics AI company, today announced at the AI Expo Japan new capabilities


NVISO announces it has reached a key interoperability milestone with BrainChip Akida Neuromorphic IP

July 18, 2022
To learn more about the BrainChip Akida interoperability results, please click here to register for the results presentation on 20th July 2022 at 7pm AEST. NVISO has


NVISO’s latest Neuro SDK to be demonstrated running on the BrainChip Akida fully digital neuromorphic processor platform at CES 2023

January 2, 2023
Following the recent NVISO Neuro SDK milestone release, including two new high performance AI Apps from its Human Behavior AI App catalogue, Gaze and Action


The key interoperability milestone was announced 3 months after the Nviso/Brainchip partnership was announced. That is an indication that the technologies fit like a hand in a glove.


There are several other Nviso partnerships mentioned:

Panasonic, Interior Monitoring Systems, Siemens, Unith, Privately SA.

I'm guessing Nviso is handling these partnerships with kid gloves.

Nviso lists Healthcare, Consumer Robots, Automotive Interiors, and Gaming as their fields of interest.

Nviso's patent relates to a software-implemented CNN method, so Akida would have been a revelation.

US11048921B2 Image processing system for extracting a behavioral profile from images of an individual specific to an event 20180509

View attachment 61211


So did BRN hold off on an Akida 2 SoC at Nviso's behest? Highly improbable, because Nviso is software-based.

So who(m?)?
 
  • Like
Reactions: 2 users

Frangipani

Regular
AKD 1000, AKD 1500, AKD 2000 and, moving ever so closer... AKD 3000... 3 of which are 100% available now and setting the benchmark; the company believes their IP will end up in everything, everywhere... (that, I'd suggest, is the target, NOT a proven fact to date).

Hi TECH,

just an attempt to sort out the terminology…

The following is how I understand BrainChip’s nomenclature, but I could be wrong, so everybody please feel free to brainstorm and chip in…

In early 2022, shortly after Sean Hehir had joined BrainChip as CEO, “AKD1000” was still used as an umbrella term to describe everything our company had for sale at the time (chip, IP, PCIe board), as evidenced by the following investor conference presentation slides:

7CF83133-83B4-4BD3-97DE-EA0CBC24FA44.jpeg



These days, however, a distinction appears to be made between the generational iterations of the Akida processor technology platform (categorised as akida, akida 2.0…) as opposed to the physical reference chips (so far AKD1000 and AKD1500).

The AKD1000 and AKD1500 SoCs are both silicon implementations of the Akida technology embodied in BrainChip’s 1st Generation Edge AI neuromorphic processor platform akida (technically speaking akida 1.0).

AKD1000 was implemented with TSMC at 28nm, whereas AKD1500 was taped out on GlobalFoundries’ MCU-friendly 22nm fully depleted silicon-on-insulator (FD-SOI) process, aka GF’s 22FDX technology. As reference chips, they were primarily meant to target prospective IP licensees (as a proof of concept), but at the same time the AKD1000 (on a PCIe board or inside a Dev Kit) benefitted individual developers (professional hardware engineers in companies or academic settings as well as advanced hobbyists) who were not interested in mass production of edge devices and the signing of an IP licence, but instead may have only required a single PCIe Board or Dev Kit for their projects or research (take note that it says on the BrainChip shop website in bold capital letters that development kits are not intended to be used for production purposes). Meanwhile, AKD1000 chips have also been integrated into the VVDN Edge AI Boxes, and the Unigen Cupcake Edge AI Server will soon be offered with a new configuration based on the AKD1500 (?) as an AI option.

Akida 2.0, the neural processing system’s enhanced 2nd Generation, was announced and introduced last year, but purely as an IP offering, productised in three different variations: akida-E, akida-S and akida-P, depending on where in the Edge AI spectrum (sensor edge < server edge) its prowess is required.

AKD2000, however, doesn’t exist - at least not yet. The way I understand it, AKD2000 would be the name of BrainChip’s (hypothetical) reference chip based on Akida Gen 2, which may or may not materialise.

Let’s recall what was said earlier this year:
In an interview with Jim McGregor from TIRIAS Research during CES 2024 (January 9-12), Todd Vierra replied the following to his interview partner’s comment “And correct me if I’m wrong, but this is the Akida 2?”
Todd: “This is actually all ran [sic] on Akida 1 hardware. Akida 2, erm - we are in the process of taping out and we’ll get that silicon back a little bit later, but these are all just Gen 1...” (from 9:26 min). His statement about an imminent tapeout seemed to confirm what (according to FF) Sean Hehir had told select shareholders in the November 2023 Sydney “secret meeting”.

Surprisingly, a mere seven weeks later, during the Virtual Investor Roadshow (February 27), neither Sean Hehir nor Tony Lewis mentioned anything at all about a tape-out in progress, but instead argued that a second-generation reference chip as a proof of concept was unnecessary, while at the same time not totally excluding a potential future tape-out complementing their core IP business. However, our CTO made it clear that it is definitely not their intention to manufacture chips on a large scale:


From 43:17
Roger Manning’s question: “Given BrainChip’s business model is to largely sell its akida IP, will it be necessary to prove each new version of akida in silicon?”

Tony: “So I think the big question was, will this event-based paradigm yield, erm can it be done and will it yield some benefits to customers, and I think we achieved that by taping out our earlier generation of products, and so we’ve already achieved that. And it’s my belief that there is only marginal benefit in taping out the next generation, we already proved the main points of it. And clearly we don’t want to start to manufacture chips on a large scale. We’d be competing with our customers and that would really break our business model right now.”

Sean: “Yeah, and to be clear, a lot of work has gone on with our ability to simulate workloads in Generation 2 as well. As Tony said, we certainly have reference chips in Generation 1, and the typical engagement course that we work with IP license prospects is we allow them to run models on there and/or simulate them in our simulation tools that we have for Generation 2, so Roger, I would say stay tuned, we may or may not, erm, but right now, there is no need for us to do that.”




Now the way I understood the “We’d be competing with our customers” comment is not for fear of treading on their customers’ or a specific customer’s toes (as other TSE users have interpreted it), but because they would shoot themselves in the foot by doing that: after all, it would be less profitable for our company when customers could just buy those chips off the shelves to utilise them in their in-house development (and hence save a lot of money, as there won’t be any follow-up costs for them) rather than having to pay an initial IP license fee and future royalties.

Does my interpretation make sense or am I overlooking something here?



And as for the next iteration of the Akida processor family, we don’t even know yet whether it will be called akida 3.0 or akida 2.X…

From 47:41 min:

Sean Hehir: “… but round Generation 3, if you will: You know, I mentioned earlier in my slide that when I talked about our product planning and our execution cycle, if you will, you could see we are always planning on this product and we are always looking for improvements on our IP offering right there. Now whether we call it formally “Generation 3” or “2 something x” - but yes, we are in the middle of a planning cycle right now to make some changes and we’ll make announcements over time.”


So what we can say with confidence (provided we believe our CEO’s words) is that the advent of the Akida neural processor family’s 3rd generation is approaching… Whether there will ever be an AKD3000, though, is uncertain.

Regards
Frangipani


P.S.: I still can’t get my head around Todd Vierra’s statement, though. Would they really have backed out at the last minute, e.g. due to financial constraints?
Or was he possibly referring to a tapeout not of AKD2000 as a reference chip, but to a tape-out of a SoC by a specific customer that includes Akida 2.0 IP, along the lines of what DingoBorat said?

To me, it's saying there's an in the flesh, integrated circuit, customer custom SOC, with our IP in it...

But then again, would he really say “…we’ll get that silicon back a little later”?
I suppose only if it was a joint development, such as possibly one with Socionext or Tata Elxsi (which could explain the different colour of the mysterious Custom SoC on that presentation slide)? As otherwise, wouldn’t such a SoC be taped out by the customer itself rather than by / in collaboration with BrainChip? I am a bit confused here. 🤔

It seems too early as a mere placeholder for the planned integration of Akida IP into the Frontgrade Gaisler SoC, for which, according to ESA’s Laurent Hili, “We aim to tape out ideally before the end of the year, beginning of next year” (from 47:25 min, shortly before the end of the mid-March BrainChip podcast Episode 31). After all, they could have labelled it “prospective Customer Custom SoC”, but the way it is presented on the slide, I agree with DingoBorat that it does appear to be an already existing physical implementation, indeed. Mmmmh… 🤔
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 32 users