BRN Discussion Ongoing

itsol4605

Regular
Probably already posted, but it's the first I've seen of it.

Nice simple positive summary.


7 Semiconductor Startups Revolutionizing Edge AI


Introduction

Edge AI opens up a whole new market for affordable chips that can perform artificial intelligence and machine learning tasks, costing less than $10 each. This is a big change from the expensive chips, such as TPUs, GPUs and DLAs (deep learning accelerators), currently required for cloud AI, which cost tens of thousands of dollars each. We have compiled a list of 7 semiconductor startups that are making their mark with their AI chipsets.

“Edge AI gives wings (of inference) to thin clients, as it puts 32-bit MCUs on steroids”


2. BrainChip


Although the company is still in the early stages of deployment in a few niche markets, we believe it has the potential to go big. BrainChip is involved in edge AI on-chip processing and learning.

The company's first-to-market neuromorphic processor, Akida™, mimics the human brain to analyze only essential sensor inputs at the point of acquisition, processing data with unparalleled efficiency, precision and economy of energy. Moreover, keeping machine learning local to the chip, without the need to access the cloud, dramatically reduces latency while improving privacy and data security.
"Akida's performance per microwatt can be orders of magnitude better than the current solutions on the market."
In enabling effective edge compute to be universally deployable across real-world applications, such as connected cars, consumer electronics and industrial IoT, BrainChip is proving that on-chip AI close to the sensor is the future for customers' products as well as the planet.

Akida is the best product and solution on the market!
 
  • Like
Reactions: 13 users
Our Chief Technology Officer, Dr Tony Lewis, was formerly Senior Director of Technology at Qualcomm and the creator of Qualcomm's Zeroth neural processing unit and its software API, which is so AMAZING since the cognitive computing abilities developed through the Zeroth program were subsequently incorporated into the SNAPDRAGON processor.

That being said, wouldn't it be fantastic to get Tony to do a podcast or two to hear his detailed thoughts on Zeroth (Snapdragon) and AKIDA, and all of the various complementary aspects of each technology that, when combined, are guaranteed to blow everyone's socks off?

While we wait for these podcasts to be produced, we can entertain ourselves by watching this video from Tony when he was with Qualcomm.



Tony also mentioned when starting at BRN that his whole working life has led to this opportunity. We are very lucky to have him on board, an amazing talent.
 
  • Like
  • Fire
  • Love
Reactions: 39 users

Bravo

If ARM were an arm, BRN would be its biceps 💪!
This article from 2015 predicted we'd be in Snapdragon.




 
  • Haha
  • Like
  • Love
Reactions: 15 users

Bravo

If ARM were an arm, BRN would be its biceps 💪!




When do you think there'll be more to come? I only pray it happens sooner rather than later!



 
  • Haha
  • Like
  • Fire
Reactions: 14 users

IloveLamp

Top 20
  • Like
  • Love
  • Fire
Reactions: 46 users

Esq.111

Fascinatingly Intuitive.
Good Afternoon ILoveLamp & Fellow Chippers,

I'm a little perplexed....... we know Tata Consultancy Services are a BrainChip partner, but PURDUE UNIVERSITY?????

Does one add Purdue University to The BrainChip Scroll???

I think this is the first time I can recall Purdue University being associated with us, or is it still too indirect?

Regards,
Esq.
 
  • Like
  • Thinking
  • Love
Reactions: 11 users

Mccabe84

Regular
  • Like
  • Thinking
Reactions: 4 users

mrgds

Regular
Good Afternoon ILoveLamp & Fellow Chippers,

I'm a little perplexed....... we know Tata Consultancy Services are a BrainChip partner, but PURDUE UNIVERSITY?????

Does one add Purdue University to The BrainChip Scroll???

I think this is the first time I can recall Purdue University being associated with us, or is it still too indirect?

Regards,
Esq.
G'day @Esq.111
If you think it pertinent then please do.
 
  • Haha
  • Fire
Reactions: 8 users

Esq.111

Fascinatingly Intuitive.
Afternoon Mrgds,

Personally, I need a dash more clarification before I add them.

If/when @Bravo or @Frangipani get time, maybe bend this chap over the nearest park bench & see if they can extract anything more substantial.

Thank you in advance.

Regards,
Esq.
 
  • Haha
  • Like
  • Love
Reactions: 7 users

Iseki

Regular
Next podcast: "This is our mission, so why haven't you signed with us yet?"
Featuring recipes for the Samsung Slow Cooker, complete with emotion-detection (hungry, starving, annoyed, disinterested) control.
A moveable feast you can almost smell. Tell your friends! Okay, maybe not that last bit.
 
  • Haha
  • Like
Reactions: 13 users

Tothemoon24

Top 20




Edge Impulse has partnerships with a great range of silicon vendors and chipmakers, providing native software support for devices from these companies. One notable Edge Impulse partner is BrainChip, the Australia-based company that has created a proprietary neuromorphic processor, called Akida™, using spiking neural networks that replicate human brain activity. The nature of neuromorphic processing makes it especially well suited for artificial intelligence applications, which makes BrainChip and Edge Impulse a great combination.

With the close connection between both teams, Edge Impulse’s Solutions Engineering Manager Joshua Buck sat down with BrainChip CMO Nandan Nayampally to talk through the underlying technology that they’re creating, the areas that the company is focusing on, some of the latest developments, and more.

Josh Buck: Hello Nandan. I'd like to hear about the newly released Edge AI Box that features the Akida AKD1000 neuromorphic processor. But first, what is BrainChip’s chief differentiator in the market? Why do you guys exist and what are you hoping to accomplish?

Nandan Nayampally: Josh, thank you for having me. Part of the reason we are doing this is to accelerate what BrainChip believes is our key differentiator: extremely low-power processing for more complex neural networks out at the edge, without the need to connect to the cloud.

BrainChip, as the name suggests, is inspired by the brain, which is one of the most efficient computational, inferential, and predictive engines known. The neuromorphic approach relies on the behavior of neurons, which only fire when needed, and hence is more efficient with the computation itself and with how much of it is needed. That's where BrainChip derives its inspiration. We call it neuromorphic processing, but it is really event-based processing. And the minor difference here is that traditional neuromorphic design aims to be analog, which is great, but at the same time is much more difficult to achieve and not as portable. BrainChip has taken the approach of building an event-based, purely digital version, which can actually be sold or licensed as IP to go into lots of different edge solutions.
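To make the "fire only when needed" idea concrete, here is a minimal, generic sketch of digital event-based computation in Python (illustrative only, not BrainChip's implementation): compute is spent only on inputs that actually produced an event, and neurons emit events of their own once a threshold is crossed.

```python
# Minimal sketch of an event-based (integrate-and-fire) layer.
# Only the weight rows whose inputs fired are touched, which is
# where the compute/power saving comes from.
import numpy as np

def event_based_layer(events, weights, membrane, threshold=1.0):
    """events: binary input spikes, shape (n_in,)
    weights: synapse matrix, shape (n_in, n_out)
    membrane: persistent neuron potentials, shape (n_out,)"""
    for i in np.flatnonzero(events):     # skip silent inputs entirely
        membrane += weights[i]           # accumulate per incoming event
    out_events = membrane >= threshold   # neurons fire on threshold crossing
    membrane[out_events] = 0.0           # reset neurons that fired
    return out_events.astype(np.float32), membrane
```

With, say, 5% input activity the loop touches 5% of the weight rows; a conventional dense layer would pay for all of them every time.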

The other advantage that you have with BrainChip: while neuromorphic computing and what are traditionally called spiking neural nets, which truly mimic brain function, are cool, they are not yet production-ready and not as prevalent in the marketplace. So what BrainChip has done with its event-based processor is make it much easier to take today's models (convolutional networks, transformers, etc.) and run them on hardware that is much more event-based. The key advantages are that it can be extremely power-efficient on complex models, and that you can run them in battery-operated or very low-thermal environments, completely disconnected from the cloud if needed. And that really changes how you can bring more compelling, intelligent applications to the edge.
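As a sketch of what "take today's models and run them event-based" can look like in practice: BrainChip's MetaTF tooling includes a cnn2snn package for quantizing and converting Keras models. The function names and signatures below follow older cnn2snn releases and may differ in current MetaTF versions, so treat the API details as assumptions rather than a definitive recipe.

```python
# Hedged sketch: ordinary Keras CNN -> quantized model -> event-based Akida model.
from tensorflow import keras
from cnn2snn import quantize, convert  # API as in older MetaTF releases

model = keras.models.load_model("my_cnn.h5")  # a trained, conventional CNN

# Quantize weights and activations to 4 bits (generation 2 also supports 8-bit)
q_model = quantize(model, weight_quantization=4, activ_quantization=4)

# Convert the quantized network into a model runnable on Akida hardware
akida_model = convert(q_model)
akida_model.summary()
```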

Josh Buck: What are the key challenges that we're trying to solve for customers in this space? You've already mentioned a few, like vision applications, but what about the Edge Box in particular? And how does the partnership with Edge Impulse supporting this Edge Box help?

Nandan Nayampally: Edge Impulse has taken on the mantle of being a development platform for AI at the edge, which, when it started a few years ago, I thought was a very big undertaking, because it's a very fragmented space. And what you guys have done in terms of building a developer community, with the projects that people are creating, is an amazing validation of the fact that there is real hunger to do things at the edge. And the reason, of course, is that the demands on the cloud are overwhelming; if anything, generative AI has made them an order of magnitude more complex, and the cloud can't keep up. So you do need to pull a lot of that to the edge.

What you guys have done in terms of building a developer community, with the projects that people are creating, is an amazing validation of the fact that there is real hunger to do things at the edge.
Now "edge" means a lot of things to a lot of people, as you know. BrainChip's technology spans from what we call the far, far edge or sensor edge (sub-watt or sub-milliwatt applications, very close to the sensor, integrated as IP) all the way up to the network edge, where it's almost a server-type application, but not quite. And so the Edge Box from VVDN sits closer to the right-hand side, if you will, closer to the network edge. But the goal is to have a very compact, portable, scalable, cost-effective box that can enable lots of interesting applications. For example, if you have multiple video streams coming in for retail, where you're managing the store, that can be computed on the box itself. The box will have connectivity via Ethernet, Wi-Fi, etc., but the main point is that it's going to do the heavy lifting of the compute in real time. One of the key aspects of the edge is that you want these things to be real time, independent of how good your connection to the cloud is. So the really critical part needs to be done close to where the data is.
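As a rough illustration of that retail scenario, several camera streams handled locally with no cloud round-trip in the critical path; the stream URLs and the inference hook below are hypothetical placeholders:

```python
# Illustrative multi-stream loop for an on-premises edge box (sketch only).
import cv2

STREAMS = ["rtsp://cam1/stream", "rtsp://cam2/stream"]  # hypothetical cameras

def run_local_inference(frame):
    """Placeholder for on-box, accelerator-backed inference."""
    return []  # detections would be returned here

caps = [cv2.VideoCapture(url) for url in STREAMS]
while True:
    for cap in caps:
        ok, frame = cap.read()
        if not ok:
            continue
        detections = run_local_inference(frame)  # real-time, on device
        # act on detections immediately; only summaries ever need an uplink
```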

Edge Box, featuring BrainChip's Akida AKD1000 processor
Josh Buck: Absolutely, I see that a lot too. In my role at Edge Impulse, I work directly with customers both on the far edge and, like you said, the server edge. In both scenarios, and even the ones in between, it's a lot about making sure that we can prove that the data is valuable, and that you don't have to ship it off to the cloud. You can process it on site and get real-time insights directly there, whether or not you have cloud connectivity. My experience with the AKD1000, which is inside this Edge Box, has been that it can run some pretty sophisticated models. We can do object detection, we can do image classification, and also some sensor-based applications.

What I've been really excited to explore with this device is how we can enable customers to show that the data they have today, whether it's in the cloud or not, is worth something, and that they can run models on all these devices very quickly and efficiently. With some of the integrations that we already have with BrainChip, and with what's coming out in the future, they'll be able to test out their models, test out their data, and get performance insights. And then when these devices are available, they'll be able to actually run them and validate that the data they have works well, and they don't have to ship it off to the cloud just to get the inferencing that they need.

Nandan Nayampally: That's well put, Josh. I think the real benefit we see is this: completely integrating AI into very, very small sensors will take time, but it is probably where things should go, because you can act in real time, minimize the amount of communication needed, and minimize context-free computation. At every stage you go further up the network, you lose some context, and you have to put more brute force into it. So the truly distributed, hybrid approach is where it will gravitate, but you will find those intermediate points. What the Edge Box does is give you a point that is on the edge side of the network edge, while still giving you the computation.

The Edge Box has an Arm-based processing unit with a GPU from NXP, an i.MX 8-based part. So you have computation, user experience, as well as some meaningful intelligence built into your solution. For example, in the retail space, can you actually understand if your stacking of shelves is appropriate? Can you understand customer behaviors? Can you understand potential threats? Those are all things that people would like to see. But the cost of getting a cloud service for that is pretty high, and the response may not be timely enough. You see the same thing in healthcare: can you actually manage healthcare networks, especially remotely? That's another area where the Edge Box comes in handy. We see that for smart cities, for automotive and transportation, etc.

I think this enables a lot of different types of smaller businesses to bring this to market in small-volume configurations and make it still cost effective versus trying a much heavier cloud or network edge solution.

In the retail space, can you actually understand if your stacking of shelves is appropriate? Can you understand customer behaviors? Can you understand potential threats? Those are all things that people would like to see.
Josh Buck: Tell me more about the specifics of the Edge Box. You mentioned it's an i.MX device; I assume it has Linux on it, and it will have the AKD1000. Can you share some of the production-ready specs that enable it to go out into the field, or are those still to be released?

Nandan Nayampally: We released some of those specs this past February, with the launch of the pre-sale. These are at a datasheet level, if you will. In terms of actual performance metrics on types of models, we would like our partner VVDN to present those, because they're the ones delivering the QoS for it, but we expect to deliver more and more information as we go along. There's a lot of excitement around it, and I think most users and developers will look at it as a great way to do POCs and then go into small-volume production before they get to the next step of higher-volume solutions.

Josh Buck: Edge Impulse wants to be right there on that journey with you. The things we already provide for the Akida platform show off some of the models on BrainChip's website, and even some of the Edge Impulse applications on there. There have been a number of developments with FOMO, Edge Impulse's model for object detection, which works very well on the BrainChip AKD1000. And with that enabled on the Edge Box now, we're just one more step from bringing this to production, so that people can literally drop it into place, get started very fast, go to that kind of small-volume production with a data set they may already have, and get it into the field and into production quickly. I'm really excited about that.
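For reference, running an Edge Impulse-deployed model (such as a FOMO object detector) on a Linux device looks roughly like this. The calls follow the Edge Impulse Linux Python SDK as shown in its public examples; the model path is a placeholder, and exact result fields may vary by SDK version.

```python
# Hedged sketch: local inference with an Edge Impulse .eim model file.
import cv2
from edge_impulse_linux.image import ImageImpulseRunner

MODEL_PATH = "modelfile.eim"  # placeholder: your downloaded deployment

with ImageImpulseRunner(MODEL_PATH) as runner:
    runner.init()                                   # load model metadata
    img = cv2.imread("frame.jpg")                   # one captured frame
    features, cropped = runner.get_features_from_image(img)
    result = runner.classify(features)              # runs fully on device
    for box in result["result"].get("bounding_boxes", []):
        print(box["label"], box["value"], box["x"], box["y"])
```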

Nandan Nayampally: Edge Impulse, I feel, is an important cog in the wheel. We're happy to be the first IP partner that Edge Impulse has worked with, and we've worked very closely on that. Having this for the Box gives developers a spectrum: they can use the Edge Box, they can use our development boards, and beyond the boards, when our partners come to market with integrated Akida solutions, they have a common development platform for all three. That gives a developer continuity and usage familiarity, which is important because, as everyone knows, it's a pretty complex environment.

When we started, we assumed that everyone would be a super user using TensorFlow or PyTorch. That's only a small section of the developer community, and our MetaTF software supports it. For the people that are doing low-code/no-code, like yourselves, I think Edge Impulse has been a great platform for us to integrate our compilation technology tools in, so that they can build on a much more stable environment and build out less intense development cycles, with a ready set of widgets for them to choose from and optimize thereafter. And then, of course, we're also working with application vendors such as NVISO and Ai Labs, which are solving [vehicle] in-cabin or factory-maintenance problems for direct users with direct applications.

Edge Impulse has been a great platform for us to integrate our compilation technology tools in, so that they can build on a much more stable environment and build out less intense development cycles, [with] a ready set of widgets to choose from
Josh Buck: It's good to hear that you have a whole ecosystem and a zero-to-100 solution. Not only can you start at the IP level and integrate it into the chips and sensors you need, but you also have development boxes, like the enablement platforms you have today, the new Edge Box for production, and software support from MetaTF, which is what you provide for the coding environment. People can build on top of that, like we did at Edge Impulse, for production solutions. And then you have partners as well, to target specific applications, maybe as a system integrator, to bring a whole solution together if the customers themselves can't put all the pieces together.

Nandan Nayampally: Exactly. And in fact, if you think about VVDN and the Edge Box, it is us working with the ecosystem, yourselves included; you could put VVDN somewhere between an ODM (original device manufacturer) and a system integrator. This is not the business that we're in. BrainChip is not an Edge Box builder; BrainChip is enabling a lot of Edge Box vendors to come up with meaningful, clear solutions that they can then take to their channel to proliferate efficient edge AI processing.

Josh Buck: So, what's next for BrainChip? When do you expect the next announcement for Edge Box, upcoming new releases that you're able to talk about, any conferences that we may be at?

Nandan Nayampally: I think one of the key things to understand is that the Edge Box we're talking about today is still based on our first-generation technology: currently the AKD1000, and it could be extended to the AKD1500. Those are really viable solutions.

What we announced and delivered at the end of 2023 is the second generation, which is a lot more capable. It incorporates some of the things we learned from the market ("hey, we need 8-bit processing for sure"), even though neuromorphic can be very efficient and capable with 4 bits or even 2 bits.
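To illustrate numerically what the 8-bit versus 4-bit or 2-bit trade-off means, here is a self-contained sketch of uniform symmetric quantization (a generic scheme, not necessarily BrainChip's): fewer bits shrink the model and cheapen the arithmetic, at the cost of precision.

```python
# Generic uniform symmetric quantization of weights to n bits.
import numpy as np

def quantize_symmetric(w, n_bits):
    qmax = 2 ** (n_bits - 1) - 1        # 127 for 8-bit, 7 for 4-bit, 1 for 2-bit
    scale = np.max(np.abs(w)) / qmax    # map the largest weight to qmax
    q = np.clip(np.round(w / scale), -qmax - 1, qmax)
    return q.astype(np.int8), scale     # dequantize later as q * scale

w = np.random.default_rng(0).standard_normal(1000)
for n in (8, 4, 2):
    q, s = quantize_symmetric(w, n)
    print(f"{n}-bit reconstruction MSE: {np.mean((w - q * s) ** 2):.5f}")
```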

We've added the ability to do more complex models like ResNet-50, but really we had two big new additions. We can do temporal event-based neural nets. That's a new step from what I traditionally call state-based structured models. For traditional recurrent or even transformer-type models where attention is needed, you can build models that are much more compact and much more efficient, yet equally accurate and sometimes, as in this case, more accurate. If you can reduce video object detection by 50x in model size and 50x in the number of operations needed while maintaining accuracy, that's a boon for edge computing.
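The temporal event-based nets' internals aren't detailed here, but as a generic illustration of why state-based temporal models can be so much smaller than attention over a full window: a causal temporal convolution carries a fixed-size rolling state, so per-step cost and memory stay constant no matter how long the stream runs.

```python
# Generic state-based sequence processing (not BrainChip's design):
# constant work and memory per time step, versus attention whose cost
# grows with the window it attends over.
import numpy as np

class CausalTemporalConv:
    def __init__(self, channels, kernel=5, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.standard_normal((kernel, channels)) * 0.1
        self.state = np.zeros((kernel, channels))  # fixed-size temporal buffer

    def step(self, x):
        """Process one time step x of shape (channels,); O(kernel) work."""
        self.state = np.roll(self.state, -1, axis=0)
        self.state[-1] = x
        return np.maximum((self.state * self.w).sum(axis=0), 0.0)  # ReLU
```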

We have also added a vision transformer encoder in hardware, which again accelerates a lot of this capability for vision object detection. We see the second generation being very compelling for vision and video, obviously, because that's where a large part of the market is, and for sequence prediction, such as predictive maintenance, time-series data, and multi-dimensional, multimodal data. Can it start helping with things like gen AI? Can it start helping with multimodal AI? Absolutely. Obviously it's a huge space with potentially very complex models, but we're looking at what parts of it we can accelerate for the edge: speech-to-text, text-to-speech, some level of automatic speech recognition. All of those things are possible with generation two, and we'll be talking more about it as more details come out.

Josh Buck: So as new technologies come out, BrainChip and Edge Impulse will definitely work together to exhibit them on Edge Impulse and make sure that customers have access to the tools they need to get to production quickly, as with the VVDN box and other devices that may come out.

Nandan Nayampally: One of the things that we didn't touch on, which I believe is a unique capability now coming into its own, is the ability to learn on device. People confuse it with training; this is not training. It is about taking characteristics and features extracted during training, which are now in the inference model, and being able to customize them. So a face-detection capability can be customized on device to say, "hey, this is Josh's face." What I saw at CES this year was that this kind of customization, this personalization, is really important as AI comes closer and closer to the edge. People are worried about privacy, people are worried about security; this is going to be important.

Josh Buck: Because when you're customizing the model on device, nothing has to go back to the cloud. That's the privacy and security aspect: the data stays where it is, and you still get the personalization by doing this learning on device. Enter a mode, give it a few samples, and there, it's now personalized for that particular face or voice or sensor application, without the data ever having the potential to get outside of that box. That's great.
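A minimal sketch of that enrollment idea, assuming a frozen feature extractor and per-class prototype embeddings stored on the device (illustrative only; Akida implements its one-shot learning in hardware, and embed_fn here is a hypothetical stand-in for the inference model's feature output):

```python
# On-device personalization without retraining: enroll a few samples,
# store mean embeddings, classify by cosine similarity. Nothing leaves
# the device.
import numpy as np

class OnDevicePersonalizer:
    def __init__(self, embed_fn):
        self.embed_fn = embed_fn   # frozen feature extractor (hypothetical)
        self.prototypes = {}       # label -> mean embedding

    def enroll(self, label, samples):
        embs = np.stack([self.embed_fn(s) for s in samples])
        self.prototypes[label] = embs.mean(axis=0)

    def identify(self, sample, threshold=0.7):
        e = self.embed_fn(sample)
        best, score = "unknown", threshold
        for label, p in self.prototypes.items():
            cos = e @ p / (np.linalg.norm(e) * np.linalg.norm(p) + 1e-9)
            if cos >= score:
                best, score = label, cos
        return best
```

Enrollment is just `p.enroll("josh", a_few_face_crops)`; afterwards `p.identify(new_crop)` answers locally.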

Nandan Nayampally: Yeah. If you were at CES, you would have noticed that AI was pretty much everywhere, from intelligent toilets to robots doing anything you want; edge AI was everywhere. Now, that doesn't mean that everything will have an accelerator in it. But the fact is that models, requirements, and the need for intelligence at the edge are all increasing. And from Edge Impulse's perspective, it's perfect, because you're independent of hardware: your developer models can scale to partner A's platform and partner B's platform, which is an excellent thing for how you take it to market. We are excited because it says there will be more demand for intelligent, very efficient compute, which starts pulling these things in.

Josh Buck: Any conferences coming up? We're going to Hardware Pioneers and Embedded World in April.

Nandan Nayampally: I believe we're jointly going to the Hardware Pioneers conference. We obviously will be attending tinyML, and we will be at Embedded World, as we should. Naturally, the Edge AI and Vision Alliance summit in May, and there's more, but I think for the next two to three months we have a pretty full slate of events where Edge Impulse and BrainChip can show customers what's been done.

Josh Buck: Thank you Nandan, I appreciate your time.
 
  • Like
  • Love
  • Fire
Reactions: 61 users

mrgds

Regular
IMO ................ MAXIPAD FRIDAY = ABSORPTION ............... EVEN THE COLOUR IS RIGHT TODAY ! :eek:

RED BABY ......................... YEAH ! :cool: 😍


 
  • Haha
  • Sad
  • Like
Reactions: 9 users

Easytiger

Regular
The BIG question is.......

Can any company currently focused on disrupting the market at the "edge", or even the far edge, do on-chip learning, with all the benefits that come with the architecture Peter first patented in 2008, and that Anil and our brilliant, highly intelligent team have since added to and refined through the design phase, and really compete?

What I mean by that statement is this: not only do we have proof of concept in silicon with two chips (AKD1000 and AKD1500), and second-generation Akida possibly in chip form already (not disclosed to date), as a 100% commercial proposition available now, with superior engineering teams available to assist clients in the very specialized field of neuromorphic engineering and design, BUT we also hold (I believe) 19 patents worldwide, with around 22 others pending. The others (you already know who they are) are all talk, lab coats and possibly misleading tech articles that don't reveal the true picture, in my view.

Our company will deliver; you'd be an absolute fool to think otherwise, in my opinion.

Tech 202

Our Chief Technology Officer, Dr Tony Lewis, was formerly Senior Director of Technology at Qualcomm and the creator of Qualcomm's Zeroth neural processing unit and its software API, which is so AMAZING since the cognitive computing abilities developed through the Zeroth program were subsequently incorporated into the SNAPDRAGON processor.

That being said, wouldn't it be fantastic to get Tony to do a podcast or two to hear his detailed thoughts on Zeroth (Snapdragon) and AKIDA, and all of the various complementary aspects of each technology that, when combined, are guaranteed to blow everyone's socks off?

While we wait for these podcasts to be produced, we can entertain ourselves by watching this video from Tony when he was with Qualcomm.



Agreed, and I would also love the CTO to do a podcast on how the Akida silicon solution will be a massive differentiator for the PC and mobile phone companies, and how early adopters with Akida IP will have a market-leading competitive advantage.
 
  • Like
  • Fire
  • Love
Reactions: 13 users

Esq.111

Fascinatingly Intuitive.
I have no idea what this thing does, but apparently it has a Neural DSP thing in it.

Popped up whilst in deep sound saturation.


Esq.
 
  • Like
  • Thinking
Reactions: 2 users

TECH

Regular
Agreed, and I would also love the CTO to do a podcast on how the Akida silicon solution will be a massive differentiator for the PC and mobile phone companies, and how early adopters with Akida IP will have a market-leading competitive advantage.

Great question... maybe send it to Tony and ask that Dr. Lewis address it. I like how Dr. Tony is fronting these podcasts; as I mentioned a while ago, his personality is different from Peter's, but I still love our founder. He is not only a genius, but an extremely generous, kind soul.

I also believe that with AI being introduced into our favourite device, the "beam me up, Scottie" smartphone, itself an edge device, we at Brainchip, with the Akida suite of products on the near horizon, will most certainly end up as an IP block within some company's smartphone, maybe Samsung or Ericsson, or even an Apple, some time soon!

Have a great weekend. Back in Perth in a month and looking forward to coming home; my jail sentence is finally over... joke :ROFLMAO::ROFLMAO::ROFLMAO: Tech x
 
  • Like
  • Love
  • Fire
Reactions: 20 users

Frangipani

Regular
Good Afternoon ILoveLamp & Fellow Chippers,

I'm a little perplexed....... we know Tata Consultancy Services are a BrainChip partner, but PURDUE UNIVERSITY?????

Does one add Purdue University to The BrainChip Scroll???

I think this is the first time I can recall Purdue University being associated with us, or is it still too indirect?

Regards,
Esq.
Afternoon Mrgds,

Personally, I need a dash more clarification before I add them.

If/when @Bravo or @Frangipani get time, maybe bend this chap over the nearest park bench & see if they can extract anything more substantial.

Thank you in advance.

Regards,
Esq.

Good morning from Germany, Esq.!

First of all, I need to somewhat curb your enthusiasm, as Purdue University is not Tamal Acharya's workplace. As I wrote in another post two days ago, he works for both Tata Consultancy Services and NeuroCortex.AI. Instead, the university name next to his place of residence (Bengaluru, formerly known as Bangalore, India) signifies that he is an alumnus.

In his LinkedIn bio, we can see he pursued a postgraduate program at Purdue, a one-year intensive course to be precise. It sounds to me as if this was done online (see the reference to Simplilearn, which offers online courses, and also note that this was 2020-2021, when the US closed its borders to non-US residents due to the COVID-19 pandemic…)

[Screenshot of Tamal Acharya's LinkedIn profile]




While I haven't yet come across any indication that Purdue University is collaborating with BrainChip, there are certainly some intriguing connections, the most obvious being that Purdue is Kristofor Carlson's alma mater!




What we do know is that there is neuromorphic research being pursued at Purdue, e.g. at C-BRIC, the "Center for Brain-inspired Computing Enabling Autonomous Intelligence", which is "a five-year project supported by $32 million in funding from the Semiconductor Research Corp. (SRC) via its Joint University Microelectronics Program, which provides funding from a consortium of industrial sponsors as well as from the Defense Advanced Research Projects Agency."



[Attached screenshots, with comments in green]


The comments in green are just off the top of my head. Some day, we will probably have dot-joined all of these universities! 😂


Then, there is also a tenuous link via GlobalFoundries:

[Attached screenshot]


So my advice would be to definitely keep Purdue on our watchlist, but to hold off on adding it to your majestic scroll for the time being… 😊
 
  • Like
  • Love
  • Fire
Reactions: 17 users

chapman89

Founding Member
Hi, can I ask a favour of those in Germany and beyond?

Is this gentleman the CEO of the German fund that was/is accumulating shares? I can't remember the name of the fund. Is somebody able to post a recent photo of their holdings in BrainChip? Also, does a fund have voting rights at AGMs?

[Attached photo]
 
  • Like
  • Love
  • Thinking
Reactions: 23 users
 