BRN Discussion Ongoing

dippY22

Regular
I really hope you are right...

What I see is, the Akida 1000 / SoC was the product that we've been waiting on to generate revenues after AI Accelerator/ FPGA... (after moving on from Brainchip Studio)

But the above only landed two IP contracts. My guess is that the tech was not compelling enough for the EAPs to adopt / move to a new tech; perhaps they would rather settle for inferior tech due to uncertainty, etc.

But that being said, I don't really have any technical background and only know basic terms which I learnt mostly from researching and from this website or HC... my understanding was that this tech is revolutionary and can drive massive change and lead the Edge AI field... but the lack of announcements, lack of new IP contracts, revenue, cash inflows and previous failed products, along with the fked up share price, made me feel that there are multiple things that don't really add up... if we were able to sell 2 IP contracts, how come we weren't able to sell more? Why is there such a big gap between our first one and second one and ... the third one (hopefully soon)... and why is our revenue so small when we are supposed to be paid for providing engineering support to all these EAPs, and wtf is the sales team doing... or the support engineers doing, why are they paying bonuses, etc...

And there is pretty much no info on the previously announced partnerships... wtf happened to the companies developing new tech utilising BrainChip: Ford, Valeo, Nanose, etc. Are they truly actively engaged? If yes, why did we only earn 40k (don't try to educate me on revenue vs cash inflows, fk off plz)...

I bought heaps more (from my perspective, although the amount might be much smaller / bigger than some of you here) when I saw royalties from MegaChips, but yeah... I am getting more and more negative...

I don't know if I am being delusional but I truly thought this would've been super massive by now... or at least on a clear track... hopefully it is on track behind the scenes..

Can I try to answer some of these questions for you from my perspective?

You wrote,....But that being said, I don't really have any technical background and only know basic terms which I learnt mostly from researching and from this website or HC... my understanding was that this tech is revolutionary and can drive massive change and lead the Edge AI field... but the lack of announcements, lack of new IP contracts, revenue, cash inflows and previous failed products, along with the fked up share price, made me feel that there are multiple things that don't really add up... if we were able to sell 2 IP contracts, how come we weren't able to sell more? See below response ***

Why is there such a big gap between our first one and second one (I don't think the gap is big, but you do,....so?) and ... the third one (hopefully soon)... and why is our revenue so small when we are supposed to be paid for providing engineering support to all these EAPs (I believe "all" the EAPs are not really all that many, so that's why) and wtf is the sales team doing (then ...in early 2020, ...or now? If you mean now, I would have you ask Chris Stevens but advise you not to open your conversation with "wtf is the sales team doing") ... or the support engineers doing, (again, I'm sure Anil Mankar or PVDM would be happy to spend their time talking to you about what engineering is doing, if you have the chutzpah to ask) why are they paying bonuses, etc...

*** Our stock investment is in what was an R&D company for about 8 years, with "maybe" 35 employees in the late 2018-2019 period. By 2020 +/- (the Louis DiNardo years as CEO), mainly due to their limited size and human resource constraints, they reached out to a number of select target markets and specific companies for any interest in testing their developed solution, Akida. Right? This targeted early adopter program resulted in some BIG early success, namely the Renesas IP deal, ...positive feedback from Socionext, ...the MegaChips IP deal, ...interest from NASA, ...and the obvious (we now know) testing of our technology by Mercedes, which they proudly announced as part of their EQXX concept car. This in part should answer your question of why we weren't able to sell more. I'm not sure they were even trying to sell much more, because they couldn't support much more with their small staff numbers.

In summary, I believe BrainChip selected who they wanted to spend their extremely limited engineering and design tech resources on, and I think this small number of companies was VERY satisfied with what BrainChip was developing; likewise, I believe BrainChip has been very satisfied with the results of the early development program. The strategic imperative to build up their ecosystem partner network will propel earnings and revenue in the future, but it will not happen overnight. Let's revisit this again in late 2024 or even 2025,....okay?

I'm not sure that it will, but I hope I have helped address some of your frustration, if I may call it that.
Yours, ...dippY

My speculation and opinions only
 
This post put it all in perspective for me... while I questioned my investment decision, BRN was getting on with grinding out a place in the market... do you think any of these guys give a shit about the share price..?

just as the world continues turning without us.. Brainchip will be successful... do you think that if the money runs out in 3 quarters that Akida will be parked in the back of a cupboard to be forgotten?

The tech is part of our future now.. Akida is part of something... it will be part of more..... I'm still in...
"just as the world continues turning without us.. Brainchip will be successful... do you think that if the money runs out in 3 quarters that Akida will be parked in the back of a cupboard to be forgotten?"

We actually have a lot more than 3 quarters of funding left.

That figure was based on this quarter's higher-than-usual outflow and doesn't include the USD 12.2 million from LDA Capital.

We also still have over another 9 million shares for LDA Capital to sell (plus up to another 10 million, which I won't count) before the end of the year, which would fund an additional quarter's spend (assuming it is done at a similar level as just completed).

So even at the current higher spend rate, which may be required to gain traction here, we will have a minimum of 6 quarters of funding left, or a year and a half.

More than enough to get cash-flow positive, and a whole lot more hopefully 😉
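The runway arithmetic above can be sketched as a quick back-of-envelope check. The cash balance and quarterly spend figures below are placeholders, not BrainChip's actual accounts; only the USD 12.2 million LDA amount comes from the post:

```python
# Back-of-envelope funding runway. The cash balance and quarterly spend
# are hypothetical placeholders; only the 12.2M LDA proceeds figure
# comes from the post above.

def runway_quarters(cash_on_hand, expected_inflows, quarterly_spend):
    """Quarters of funding left, assuming a constant spend rate."""
    return (cash_on_hand + expected_inflows) / quarterly_spend

cash = 20.0          # USD millions -- hypothetical
lda_proceeds = 12.2  # USD millions -- LDA Capital figure cited above
spend = 5.4          # USD millions per quarter -- hypothetical

print(f"~{runway_quarters(cash, lda_proceeds, spend):.1f} quarters of runway")
```

The only point of the exercise: adding expected inflows to cash before dividing by the spend rate is what turns "3 quarters" into a longer runway.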

I remain hugely positive going forward and, although a little disappointed at the lengthening timeline to success and to becoming a multi-millionaire, I will continue to add to my position, as I see current levels as an opportunity.

I'm still trying to get that car together @Shadow59 🙄..

I'm going all in here and I don't see it as putting everything on red...



This ain't Roulette or some juiced up nag..

This is AKIDA!
 

SERA2g

Founding Member
Going back a few weeks ago now ........... I don't recall if anyone could actually confirm for certain whether Peter will or will not be attending this upcoming AGM.
Howdy. I have it on good authority that Peter will be at the AGM.
 
I'm hoping that the kick-ass announcement may come from the ARM webinar on May 9. ARM sold something like 4 billion of the Cortex-M85s a few years ago. Recognition is great, but ARM stating Akida can be embedded within, and an explanation of the advantages of what we can bring to the chip, will be brilliant.
Don't forget we are also exhibiting, with Nandan presenting on the 23rd of May at the Embedded Vision Summit.

Guess he won't be at the AGM, huh.



Enabling Ultra-Low Power Edge Inference and On-Device Learning with Akida


Date: Tuesday, May 23
Start Time: 16:50
End Time: 17:20

The AIoT industry is expected to reach $1T by 2030—but that will happen only if edge devices rapidly become more intelligent. In this presentation, we’ll show how BrainChip’s Akida IP solution enables improved edge ML accuracy and on-device learning with extreme energy efficiency. Akida is a fully digital, neuromorphic, event-based AI engine that offers unique on-device learning abilities, minimizing the need for cloud retraining. We will demonstrate Akida’s compelling performance and extreme energy efficiency on complex models and explain how Akida executes spatial-temporal convolutions using innovative handling of 3D and 1D data. We’ll also show how Akida supports low-power implementation of vision transformers, and we’ll introduce the Akida developer ecosystem, which enables both AI experts and newcomers to quickly deploy disruptive edge AI applications that weren’t possible before.
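As a concept-level illustration of what "on-device learning" means here (a toy sketch only, not Akida's actual edge-learning mechanism, whose internals are proprietary), a device can extend a classifier with a new class from a handful of local examples, with no cloud retraining:

```python
import numpy as np

# Toy on-device learning: a nearest-class-mean classifier that learns a
# new class from a few local feature vectors. Purely illustrative -- this
# is not how Akida implements edge learning internally.

class EdgeClassifier:
    def __init__(self):
        self.prototypes = {}  # class label -> mean feature vector

    def learn(self, label, examples):
        """Add or update a class from a few on-device examples."""
        self.prototypes[label] = np.mean(examples, axis=0)

    def predict(self, feature):
        """Return the label whose prototype is closest to the feature."""
        return min(self.prototypes,
                   key=lambda lbl: np.linalg.norm(self.prototypes[lbl] - feature))

clf = EdgeClassifier()
clf.learn("cat", np.array([[1.0, 0.0], [0.9, 0.1]]))
clf.learn("dog", np.array([[0.0, 1.0], [0.1, 0.9]]))
print(clf.predict(np.array([0.95, 0.05])))  # -> cat
```

The design point the abstract is making is exactly this shape: the update step runs locally on a few samples, so no round trip to the cloud is needed to add a class.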
 

Rach2512

Regular
I am not a day trader nor a professional investor. I do not care for the changes in the financial industry that play games with people's money. I invest in companies that I believe have produced something following trends with great potential in the consumer market.

My degree is in Electrical Engineering, but I am a Software Engineer by profession. I acquired BrainChip's PCIe hardware to play with AI as a hobby, using it to perform classification from sensor input. I know the technology exists and that it works. What I do not have in front of me is competing technology for comparative benchmarking.

My holding in BrainChip is the largest of any stock I have ever held in a company. Yes, it is a gamble, but I believe in where the markets are heading and the product that BrainChip has produced to get us there. I firmly believe in my investment. If it fails, that is on me.

The company does not, and should not, control the stock price. There are too many factors beyond the company's control for it to do this. What I do want the company to do is to stay motivated to continue innovating the technology, securing patents, marketing the technology aggressively, and ensuring that they hire the best talent in the field.

Akida 1.0 was not a failure and is still a commercially viable product. I believe the decision to offer IP was wise, giving the company a way to make recurring revenue without the overhead cost of producing silicon which companies are not yet ready to adopt, especially in this economic climate. I believe the company would be in a worse-off position than it is today had it gone that route.

BrainChip has accomplished getting kits into the hands of companies that do want to embrace change and evaluate the technology for their business. Additionally, they have introduced curriculum into universities to be at the forefront of those young minds choosing a career path in AI. This is not unlike the same move that Apple made in its early years by donating its hardware to schools. I know this because my High School had an Apple IIe lab full of donated machines, which I enjoyed using often.

I have chosen to reward management for making decisions that were best for the company's continued survival in the current market, securing patents to protect its technological advantage, and acquiring individuals with the contacts that are needed to get the technology exposed to an industry that has yet to see its value. The company has also listened to those who have evaluated the technology and decided to incorporate that input into Akida 2.0.

I believe that in good times, the share price will follow the company's survival and what I believe to be its eventual success. BrainChip cannot force other companies to adopt a new way of doing things, but there will be companies that will lead by example. When the benefits of using this technology are realized, others will follow or be left out. There is always the risk that a better mousetrap will evolve, but over the past few years, even the juggernauts such as IBM and Intel have yet to materialize this. What's interesting is that in neuromorphic news, BrainChip's name appears alongside them.

Feel free to pay a visit if you must, as I would not mind discussing why I believe rewarding what I feel are the choices that will benefit the company, not the day traders. I think, in turn, this will be of benefit to the long-term shareholders. These are my views and opinion, not investment advice. As I have said, I will continue to hold and reassess this investment in 2025.
Thank you for this great post. Also, can I add: many here were expecting a lot more revenue NOW based on Sean's comments at the last AGM. Since then, the business direction changed from chips to IP. Now, I am no technical expert, far from it, but surely anyone with an ounce of common sense would realise that this change would take time to implement and so would delay revenue. Maybe the explosive sales expected at the end of 2022 are still in the pipeline and are only now about ready to explode.

On another note, I'm actually quite happy with today's result as it provides another opportunity to buy more shares at a bargain. Someone mentioned that the sp might have hit $1 if 10 million revenue or thereabouts had been reported; how many of you would have sold? I know I wouldn't have, but I know I'd be paying a higher price per ticket! While the price is really low I won't be tempted to sell, only to buy more; if it starts getting around the $2/$3 mark I might be tempted to sell some, but then I could miss out on a fortune. I'd rather the price went from zero to hero in a shorter space of time so I'm not tempted during that increase. I feel the SNN snowball is about to go MENTAL, which may have the effect of going from zero to hero in the not too distant future.

Also, I've lost count of how many I've put on ignore in the last week or so. I've had one too many glasses of red!

Sorry I know I'm waffling now...

I am totally happy with the ENTIRE BrainChip team, go you beautiful thing, you have my vote on ALL accounts. What's not to LOVE. I've got the Power 🔋 💪 🙌 😍 ✨ 👏

 

Cardpro

Regular
Can I try to answer some of these questions for you from my perspective?

You wrote,....But that being said, I don't really have any technical background and only know basic terms which I learnt mostly from researching and from this website or HC... my understanding was that this tech is revolutionary and can drive massive change and lead the Edge AI field... but the lack of announcements, lack of new IP contracts, revenue, cash inflows and previous failed products, along with the fked up share price, made me feel that there are multiple things that don't really add up... if we were able to sell 2 IP contracts, how come we weren't able to sell more? See below response ***

Why is there such a big gap between our first one and second one (I don't think the gap is big, but you do,....so?) and ... the third one (hopefully soon)... and why is our revenue so small when we are supposed to be paid for providing engineering support to all these EAPs (I believe "all" the EAPs are not really all that many, so that's why) and wtf is the sales team doing (then ...in early 2020, ...or now? If you mean now, I would have you ask Chris Stevens but advise you not to open your conversation with "wtf is the sales team doing") ... or the support engineers doing, (again, I'm sure Anil Mankar or PVDM would be happy to spend their time talking to you about what engineering is doing, if you have the chutzpah to ask) why are they paying bonuses, etc...

*** Our stock investment is in what was an R&D company for about 8 years, with "maybe" 35 employees in the late 2018-2019 period. By 2020 +/- (the Louis DiNardo years as CEO), mainly due to their limited size and human resource constraints, they reached out to a number of select target markets and specific companies for any interest in testing their developed solution, Akida. Right? This targeted early adopter program resulted in some BIG early success, namely the Renesas IP deal, ...positive feedback from Socionext, ...the MegaChips IP deal, ...interest from NASA, ...and the obvious (we now know) testing of our technology by Mercedes, which they proudly announced as part of their EQXX concept car. This in part should answer your question of why we weren't able to sell more. I'm not sure they were even trying to sell much more, because they couldn't support much more with their small staff numbers.

In summary, I believe BrainChip selected who they wanted to spend their extremely limited engineering and design tech resources on, and I think this small number of companies was VERY satisfied with what BrainChip was developing; likewise, I believe BrainChip has been very satisfied with the results of the early development program. The strategic imperative to build up their ecosystem partner network will propel earnings and revenue in the future, but it will not happen overnight. Let's revisit this again in late 2024 or even 2025,....okay?

I'm not sure that it will, but I hope I have helped address some of your frustration, if I may call it that.
Yours, ...dippY

My speculation and opinions only
Thanks for trying, but sorry, it wasn't too helpful at all.

We have more than 15 EAPs and we received 40k from the customers despite having multiple support engineers... that's like 13k per month... which probably just covers one or two employees' salaries... for whatever reason, it's unacceptable IMO... they should provide an explanation...

I give up, there is no point complaining here anyway. No one has answers and only time will tell. I haven't sold any and I will probably give it another year or two...

I truly hope Valeo is using us for Scala 3 (it would explain why they are focusing on Akida 2)...

Byeeee gltah dyor
 

The Pope

Regular
Going back a few weeks ago now ........... I don't recall if anyone could actually confirm for certain whether Peter will or will not be attending this upcoming AGM.
My understanding, from an email reply Tony Dawe sent to me nearly two weeks ago, is that he stated the following:

All our directors will be in attendance at the AGM. Also attending will be CEO Sean Hehir, CFO Ken Scarince and myself

Based on the current BRN directors listed on the BRN website, Peter should be there, plus others.
 


Diogenese

Top 20
I'm hoping that the kick-ass announcement may come from the ARM webinar on May 9. ARM sold something like 4 billion of the Cortex-M85s a few years ago. Recognition is great, but ARM stating Akida can be embedded within, and an explanation of the advantages of what we can bring to the chip, will be brilliant.
That is also my hope.

As I said before, the recent flurry of ARM announcements (Akida compatibility with all ARM processors, ARM's "more advanced" chip manufacture (IFS?)), and the fact that SiFive (an up-and-coming competitor of ARM) and Akida are cozy, lead me to hope that the ARM/BrainChip presentation will reveal that the new ARM chip will incorporate Akida as its AI block.

Some supporting reasons:

1. ARM presently has an in-house AI block called Helium available with its processor IP. Helium is lightweight AI compared to Akida, so replacing Helium with Akida would make the ARM chip "more advanced";

2. SiFive and Akida are a good fit and would give SiFive an advantage over present ARM processors, and ARM will need to swallow any "not-invented-here" attitude it may have if it is to keep up with SiFive's more efficient RISC-V architecture;

3. BrainChip has joined the ARM partnership group;

4. BrainChip and ARM have both joined the Intel Foundry Services (IFS) fellowship;

5. Why would ARM be doing a presentation for a company they barely know?

Of course, the counter-argument is that, since RISC-V is open-source, ARM is bringing out its own RISC-V processor, which would qualify as "more advanced".

Then again, an ARM RISC-V processor could be mated with Akida. That would be very advanced.
 

Learning

Learning to the Top 🕵‍♂️
After a few beers,

The day seems a bit better. 😊

But here is a thought. Or conversation.

Why was this quarterly so low in cash receipts? And where are the engineering fees?

It's obvious that Brainchip hasn't signed any new customer directly since MegaChips.

But Brainchip had partnered with.......
We all know who they are. For the ones who don't, refer to Brainchip.com. 😊🍺😎

So in a partnership, for example;

Brainchip works with company A.
Brainchip will provide the technology, the engineering assistance and co-development with Company A. Company A provides the development funding, product marketing and sales. Company A sells the product. Brainchip will receive a % of the sales of products.

So in essence, Brainchip is providing the technology and engineering expertise and co-developing the product with Company A.

Hence, Brainchip IS NOT receiving engineering fees from Company A. Why? Because it's a Partnership. Both companies are working together to develop a certain product. When the product is completed and in the hands of customers = REVENUE & ROYALTIES.

So if this theory is on point, revenue and royalties will arrive from all directions.
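That revenue mechanic can be sketched in a couple of lines; the royalty rate, unit price and volume below are entirely made up for illustration:

```python
# Illustrative partnership economics: the partner funds development and
# sells the product; BrainChip receives a royalty percentage of sales.
# All numbers here are hypothetical.

def royalty_revenue(units_sold, unit_price, royalty_rate):
    """Royalty income as a share of the partner's product sales."""
    return units_sold * unit_price * royalty_rate

# e.g. a partner ships 1M units at $10 with a 2% royalty
income = royalty_revenue(1_000_000, 10.0, 0.02)
print(f"${income:,.0f}")  # prints $200,000
```

Which is the whole point of the theory above: no engineering fee shows up during co-development, but once partner products ship, the royalty line scales with their sales.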

Learning 🏖
(Just my intoxicated opinion. DYOR.)
 
Nothing new but I see our Gen 2 got an article recently in a French publication.



BrainChip steps up a gear with its neuromorphic chip tailored for ultra-low-power embedded devices
Posted on 04-19-2023 by Pierrick Arlot


The Australian company BrainChip, which has developed, in the form of IP, a 100% digital neuromorphic processor intended to bring artificial intelligence (AI) to SoCs at the edge of the network, at performance and power levels difficult for other approaches to achieve, has announced the availability of the second generation of its Akida platform for the third quarter of 2023. The key: efficient 8-bit processing which, according to the company, opens the way to advanced capabilities such as time-domain convolutions and Vision Transformer acceleration, aiming to deliver an unprecedented level of performance in edge products consuming less than 1 W. These features will be able to give such products cognitive abilities, going beyond "simple" perception.
In detail, the second generation of the Akida platform now integrates spatiotemporal convolutions of the TENN (Temporal Event-based Neural Networks) type, a new class of neural networks distinguished by their lightness and resource efficiency, and particularly well suited to applications that must continuously process raw data carrying important temporal information: video analysis, target tracking, audio classification, analysis of MRIs and CT scans for the prediction of vital signs, or the time-series analysis used in predictive maintenance.
According to BrainChip, such capabilities are needed in areas like industrial, automotive, digital health, smart home and smart city.
TENNs should thus allow simpler implementations in silicon because they can consume raw data directly from sensor signals, which, BrainChip adds, significantly reduces model size and the number of operations to be performed, while maintaining very high accuracy.
The other important addition to the second generation of the Akida platform is the acceleration of Vision Transformers (ViT), neural networks that have proven to perform extremely well on various machine vision tasks, such as image classification, object detection and semantic segmentation. This acceleration, combined with the Akida platform's ability to process multiple layers simultaneously and to support residual (skip) connections, allows it to run complex networks like ResNet-50 entirely within the neural processor without host CPU intervention, minimizing overall system overhead, the company claims.
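As a rough, generic illustration of the kind of temporal processing described here (not BrainChip's TENN implementation, whose internals are not public), a causal 1D convolution over a raw sensor stream can be sketched as:

```python
import numpy as np

# Minimal causal 1D convolution over a raw sensor stream -- a toy stand-in
# for the temporal half of a spatiotemporal network. The kernel and data
# are made up; this is not BrainChip's TENN implementation.

def causal_conv1d(signal, kernel):
    """Each output sample depends only on current and past input samples."""
    k = len(kernel)
    padded = np.concatenate([np.zeros(k - 1), signal])  # left-pad: causality
    return np.array([padded[t:t + k] @ kernel[::-1] for t in range(len(signal))])

signal = np.array([0.0, 1.0, 0.0, 0.0])   # an impulse at t=1
kernel = np.array([0.5, 0.3, 0.2])        # toy temporal filter
out = causal_conv1d(signal, kernel)
print(out)  # the impulse response: the kernel values appear starting at t=1
```

The "consume raw data directly" claim maps onto this shape: the filter runs straight over the incoming sample stream, with no separate pre-processing stage in front of it.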
Among the products particularly targeted by the 2nd generation of the Akida platform, BrainChip cites compact, secure devices such as hearing aids or wearables that accept raw audio data as input, or health devices capable of monitoring heart and respiratory rates and other vital signs while consuming only a few microwatts. The company also has its sights set on battery-operated or fanless HD-resolution vision solutions for monitoring and industrial production management applications.
"We are seeing a growing demand for real-time intelligence embedded in AI applications powered by our microcontrollers, and the need to make sensors smarter in industrial and IoT equipment," notes Roger Wendelken, VP of IoT and Infrastructure at Renesas. "We have licensed Akida neural processors because of their unique neuromorphic approach that brings ultra-efficient acceleration to the AI models currently most prevalent at the edge of the network. With the addition of advanced temporal convolutions and vision transformer accelerators, we anticipate that low-power microcontrollers will revolutionize vision, perception and predictive applications in a wide variety of markets, such as industrial IoT, consumer devices and personalized healthcare, to name a few."
"The next generation of the Akida platform enables designers and developers to do things that weren't possible before in a low-power edge device," said Sean Hehir, CEO of BrainChip. "By providing learning and inference capabilities from raw sensor data, which eliminates the need for digital signal pre-processing, we are taking an important step towards delivering a cloud-agnostic Edge AI experience."
To simplify the development, optimization and deployment of AI solutions and services, BrainChip additionally offers an efficient runtime engine that autonomously manages model acceleration transparently for the developer, as well as MetaTF software that developers can use with their favorite AI framework, such as TensorFlow/Keras, or with an AI development platform like the one offered by Edge Impulse.
Among its technology partners, BrainChip counts companies such as Arm, Intel, Prophesee and SiFive, joined in the field of AI/ML development by Ai Labs, Edge Impulse, Emotion3D and nViso.
Note that the 2nd-generation Akida IP will be offered in three product classes. The extremely energy-efficient Akida-E will target operation very close to, or even integrated within, sensors. Akida-S will be able to be embedded in microcontrollers or other general-purpose platforms used in a wide variety of sensor-related applications. Finally, Akida-P will target mid- to high-end configurations with optional vision transformers for very high performance while remaining energy-efficient.

 
Don't recall seeing this posted previously, but it was from a May 2022 paper, so quite possibly I just missed it.

We get a write-up in it as "another relevant approach".

Source Doc

Also hadn't read this article previously. Posted only a few weeks back, and Nandan has a section in the interview.

Some may have seen it...or not.

Found it interesting that another interviewee was Synaptics, given their field / products, and that their founders (30 years ago) had ties to neuromorphic computing, e.g. Carver Mead was one of them.


Musing if there's any potential (future) relationship opportunity here :unsure:



How smart will the edge get?

By Gary Hilson | Apr 6, 2023 10:00am
Local, as in edge, machine learning is more feasible now than ever, but not every application works that way. (Getty Images)
What goes around comes around: after two decades of moving data and applications to a central cloud to be accessed by “dumb terminals,” the tide has turned. The edge is getting smarter, but how smart can it get?


Intelligence at the edge could be as simple as running analytics on data without having to send it back to a central data center, or even artificial intelligence (AI) to do simple inference. But until recently, the best practice was that the AI training and machine learning would be done in the cloud – now low power hardware is making it feasible for some of that machine learning to be done on a local edge device.

Being able to do more at the edge has some obvious advantages – you don’t need to consume energy and network capacity to send the data back and forth, nor do you have to worry about securing the data while in transit. And with some AI functionality becoming somewhat ubiquitous and affordable for most businesses, it appears inevitable that the edge will keep getting smarter.

Automation drives smart edge adoption

Embedded systems and devices are slowly taking steps to replace people, ITTIA founder Sasan Montaseri said in an interview with Fierce Electronics. "We allow them to capture and digest as much data as possible and process that really rapidly." This shift, he said, is exemplified by autonomous cars and robotics in factories – embedded systems and devices can handle and store more data to aid with tasks such as preventative maintenance. It is accomplished by looking at patterns of data, which makes the edge more intelligent.

Montaseri said one factor driving intelligence at the edge is that connectivity is not a hundred percent available, which means delays in getting information back from the cloud. Another is that microprocessors and microcontrollers are becoming more powerful. This enables the necessary data management at the edge, he said, and allows devices to quickly analyze and make sense of data.

ITTIA is focused on providing the software necessary for edge data streaming, analysis and management for embedded systems and IoT devices – robust data management is foundational for doing AI and machine learning in embedded systems at the edge, Montaseri said.


ITTIA provides software for edge data streaming, analysis and management for embedded and IoT for uses such as transportation when it's not feasible to depend on a central cloud. (ITTIA)
Reliability is critical for smart edge devices, he added, whether it’s for industrial robotics, medical or transportation applications. “You want to make sure that they don't go down.”

What’s also becoming apparent is that not all edges are created equal – some will be smarter sooner than others depending on the use case and industry, such as robotics and medical devices. Montaseri said today’s embedded systems that gain intelligence through IoT deployments will be doing the jobs needed for the next generation of computing. “The nature of everything is changing,” he said. “We are seeing more security, more safety, and more functionality, like the ingestion rate and the query rate. Our focus is safety, security, and reliability.”


Not all edges are created equal

What makes the smart edge murky is the definition of edge, which means different things to different people, Nandan Nayampally, CMO at BrainChip, said in an interview with Fierce Electronics. He was previously at ARM for more than 15 years when the edge was viewed as primarily sensor driven. “That's how IoT kind of cropped up,” he said. “IoT is a sensor plus connectivity plus processing.” While a Dell or an Intel might think of the smart edge as another giant box that’s now smaller, the right starting point to him is IoT with AI.

AI on the edge is a step forward from a sensor just doing one function, with devices now having more storage, memory, and processing power. Nayampally said this battle between cloud and the edge has been going on for a while, going back to the days of a terminal connected to a mainframe before the move to a client/server model. “What you realize is however much we think that latency to cloud or connectivity to cloud is guaranteed, and the bandwidth assured, it's never going to be the case,” he said. “You need that intelligence and computational power at the edge.”


BrainChip's Akida processor can learn at the edge to address security and privacy while limiting network congestion. (BrainChip)

Having the smarts at the edge is beneficial for preventative maintenance in factories and patient monitoring, Nayampally said, both in terms of latency and privacy. “Anytime you send raw data or sensitive data out, you are obviously going to have challenges.” Privacy and security have become especially important to the general public, he added. BrainChip was started with the idea that edge computing was necessary and that any approach to AI at the edge had to be different from the cloud. “The cloud kind of assumes almost infinite resources and infinite compute.”

While compute resources at the edge are rather finite, more AI is possible due to advances with low power hardware including memory and systems on chip (SoC), which means not all training and machine learning need be shipped back to the cloud. Nayampally said it’s a matter of scaling, with neuromorphic computing offering inspiration for how to deliver low-power intelligence at the edge. “Let's try to emulate the efficiency of it and start from there.”

Machine learning will increasingly happen at the edge both because of inherent capability and out of necessity. Nayampally said some applications that require a real-time response can’t afford the delay between the edge and the cloud, or the power. “Any time you use radio and connectivity, especially to cloud, that burns a lot of power,” he said. “Radios are the most expensive parts of devices.” Smaller, more cost-effective devices may not be able to afford to have connectivity and need to do more compute locally.

Nayampally said the neuromorphic nature of BrainChip’s Akida platform allows it to learn at the edge, which also addresses security and privacy and reduces network congestion – today’s autonomous vehicles can generate a terabyte of data per day, he noted, so it makes sense to be selective about how much data needs to travel the network.

For the smart edge, simple is better, and BrainChip’s processor delivers that simplicity from a computational standpoint as well as from a development and deployment standpoint, Nayampally said. “It's almost like a self-contained processor.” It is neuromorphic and event driven, so it only processes data when needed, and only communicates when needed, he said.
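The event-driven idea described above can be sketched in a few lines. This is purely illustrative (it is not BrainChip's actual API or Akida's internal mechanism): the expensive inference step runs only when the input changes enough to count as an event, which is where the power saving comes from.

```python
# Illustrative sketch of event-driven processing (hypothetical, not Akida's API):
# run the expensive step only when the input changes enough to be an "event".

def events(frames, threshold=10):
    """Yield only frames that differ enough from the last processed one."""
    last = None
    for frame in frames:
        if last is None or abs(frame - last) >= threshold:
            last = frame
            yield frame

def expensive_inference(frame):
    # Stand-in for a real model; in an event-driven system this is the
    # work (and power) saved by skipping near-identical inputs.
    return frame * 2

readings = [100, 101, 102, 150, 151, 90]   # simulated sensor stream
processed = [expensive_inference(f) for f in events(readings)]
print(processed)  # only 3 of the 6 readings triggered inference
```

Here only the frames at 100, 150 and 90 cross the change threshold, so the model runs three times instead of six.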

Being event driven is an excellent example of how basic machine learning may express itself in a single device for the user or the environment, which is what Synaptics is calling the “edge of the edge,” said Elad Baram, director of product marketing for low-power edge AI. The company has a portfolio of low power AI offerings operating at a milliwatt scale, enabling machine learning with minimal processing and memory – an initiative in line with the philosophy of the tinyML Foundation, he said. While an ADAS uses gigabytes of memory, Synaptics is using megabytes of memory.

Baram’s expertise is in computer vision, and Synaptics sees a lot of potential around any kind of sensing and doing the compute right next to where the data is coming from. Moving data requires power and increases latency and creates privacy issues. Organizations like tinyML are an indicator of how smart the edge could get. “We are at an inflection point within this year or next year,” he said. “This inflection point is where it's booming.”


Synaptics has a context-aware Smart Home SoC with an AI accelerator for the 'edge of the edge'. (Synaptics)

Context-aware edge remains application specific

Baram said just as the GPU boom occurred in the cloud five years ago, the same evolution is happening with TinyML. Workloads at the edge that previously required an Nvidia processor, such as detection and recognition, can now be done on a microcontroller. “There is a natural progression.”

Sound event detection is already relatively mature, Baram said, starting with Alexa and Siri and progressing to detecting glass breaking or a baby crying. “We are seeing a lot of use cases in smart home and home security around the audio space.” In the vision space, he said, Synaptics is supporting “context awareness” for laptops so they can detect whether a user is present or not, and to ensure privacy, any imaging stays on the on-camera module – it’s never stored on the computer’s main processor.

Power, of course, is important for context awareness applications, Baram said. “You don't want the power to the battery to drain too fast.” But having this awareness actually extends battery life, he said, because now the system understands if the user is engaged with the device and its content and can respond accordingly. “You approach the laptop, it's turned on and you log in and it's like magic. The machine just knows what you want to do, what you are doing, and it can adapt itself.”

Similarly, an HVAC system could adapt based on the number of occupants in a room, or a dishwasher could let you know how full it is. Baram said a fridge could be smart enough so you can know whether or not you need to pick up milk on the way home. Aside from smart laptops and home appliances, there are many safety applications in construction, manufacturing and agriculture that could benefit from context awareness. “The amount of use cases out there in the world is pretty amazing.”

Baram said the hardware is coming together to enable the smart edge, including machine learning, while algorithms and networking are also improving significantly. “The neural networks are way more efficient than they were a decade ago.” As compute capabilities advance, devices will be able to have more general purposes, but for now processor and algorithm constraints mean smart edge devices will have targeted applications.

In the long run, making the edge smarter is ultimately contingent on successfully pulling all these elements together, which requires an AI framework, Baram said, such as TensorFlow, an open source ML platform. “Those frameworks make it much easier to deploy a neural network into edge devices.”
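A core part of what those frameworks do when targeting an edge device is quantization: shrinking float32 weights to int8 so the model fits in kilobytes and runs fast on a microcontroller. The sketch below shows the idea conceptually with a simple symmetric scale (real toolchains such as TensorFlow Lite use more elaborate per-channel affine schemes); the weight values are made up for illustration.

```python
# Conceptual sketch of edge-deployment quantization (values illustrative):
# map float32 weights onto int8 [-127, 127] with a single symmetric scale.

def quantize(weights):
    """Return int8-range integers plus the scale needed to recover them."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Approximate reconstruction of the original float weights."""
    return [v * scale for v in q]

weights = [0.5, -1.27, 0.02, 1.0]
q, scale = quantize(weights)
restored = dequantize(q, scale)
print(q)  # integers in [-127, 127], 1 byte each instead of 4
```

Each stored value drops from 4 bytes to 1, and the reconstruction error is bounded by the scale, which is why quantized networks usually lose very little accuracy.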
 
  • Like
  • Fire
  • Love
Reactions: 35 users

Beebo

Regular
That is also my hope.

As I said before, the recent flurry of ARM announcements (Akida compatibility with all ARM processors, ARM "more advanced" chip manufacture (IFS?)), plus the fact that SiFive (an up-and-coming competitor of ARM) and Akida are cozy, lead me to hope that the ARM/BrainChip presentation will be that the new ARM chip will incorporate Akida as its AI block.

Some supporting reasons:

1. ARM presently has an in-house AI block called Helium available with its processor IP. Helium is lightweight AI compared to Akida, so replacing Helium with Akida would make the ARM chip "more advanced";

2. SiFive and Akida are a good fit and would give SiFive an advantage over present ARM processors, and ARM will need to swallow any "not-invented-here" attitude they may have if they are to attempt to keep up with SiFive's more efficient RISC-V architecture;

3. BrainChip has joined the ARM partnership group;

4. BrainChip and ARM have both joined the Intel Foundry Services (IFS) fellowship;

5. Why would ARM be doing a presentation for a company they barely know?

Of course, the counter-argument is that, since RISC-V is open-source, ARM is bringing out its own RISC-V processor, which would qualify as "more advanced".

Then again, an ARM RISC-V processor could be mated with Akida. That would be very advanced.
I’ve always pondered a theory where ARM initially wanted exclusive rights to Akida, and BrainChip declined. BrainChip then got cozy with SiFive, and that tipped the balance for ARM to deal with BrainChip non-exclusively.

I expect both ARM and SiFive to license IP in the 4th Quarter, if not sooner!
 
  • Like
  • Fire
  • Thinking
Reactions: 37 users

Beebo

Regular
I’ve always pondered a theory where ARM initially wanted exclusive rights to Akida, and BrainChip declined. BrainChip then got cozy with SiFive, and that tipped the balance for ARM to deal with BrainChip non-exclusively.

I expect both ARM and SiFive to license IP in the 4th Quarter, if not sooner!
Akida is potentially secret sauce to ARM when it comes to competing with QCOM.

Onward and upward!
 
  • Like
  • Fire
  • Wow
Reactions: 33 users
I’ve always pondered a theory where ARM initially wanted exclusive rights to Akida, and BrainChip declined. BrainChip then got cozy with SiFive, and that tipped the balance for ARM to deal with BrainChip non-exclusively.

I expect both ARM and SiFive to license IP in the 4th Quarter, if not sooner!
Funny you just raised that part about SiFive / Risc-V.

Was just reading this recent article and below is a snip from it haha

Link to full article bottom of post.

Inside Arm’s vision for the ‘software-defined vehicle’ of the future

The chip giant is betting big on cars


April 11, 2023 - 12:33 pm


“One executive I was talking to said: ‘The best negotiating strategy when Arm comes in is to have a RISC-V brochure sitting on my desk’,” Jim Feldhan, the president of semiconductor consultancy Semico Research, said last year. “It’s a threat. Arm is just not going to have its super dominant position in five or 20 years.”


 
  • Like
  • Fire
  • Haha
Reactions: 26 users

IloveLamp

Top 20
Screenshot_20230429_043643_LinkedIn.jpg
 
  • Like
  • Fire
  • Love
Reactions: 38 users

IloveLamp

Top 20
BMW feeling left out of the party.....how will they "compete" ........

I think i know how..........dyor

Screenshot_20230429_051421_LinkedIn.jpg
 
  • Like
  • Fire
  • Thinking
Reactions: 13 users
It's hard to know if we should punish the directors for simply making a mistake that we all agreed to, at the time.

We all fell for the "let's just sell IP" line - oh, revenue will be virtually all profit! In reality this meant that we were going to hand over the heavy sales lifting to Renesas and MegaChips. Surprise surprise, that didn't work.
But it was worth a try.
So now we're going back to selling the actual MCU that engineers can just buy off the shelf.
And we've wasted a year.
We made this mistake because the product is 100% digital, so it can be scaled and made cheaply at any of the fab companies. We were unlucky in that no-one took a punt on us and is selling a product containing the Akida IP. If they had, we would be in a much better position - others would have had to follow suit. But they didn't.

At any rate we are back to the "let's produce the MCU" tack, a decision which I think we took quickly.
So I'll just be abstaining, worried that any push back will just add to our problems.
I’m not sure you actually have a good grasp of what’s going on. To say that the deals with Renesas and MegaChips have failed is complete rubbish. Renesas taped out their chip containing our IP in December last year. We signed with MegaChips six months later than Renesas, so it’s not surprising there hasn’t been any revenue from them yet.
 
  • Like
  • Love
  • Fire
Reactions: 21 users

Foxdog

Regular
  • Like
Reactions: 3 users

Frangipani

Regular
So is that it

Hi @Makeme 2020,

honestly, what kind of reply did you expect to your provocative rhetorical 🎺question?

In case you didn’t get the hint - I’d actually say my choice of album and song title was a pretty accurate reflection of how a lot of posters in this forum have been feeling today:
Kind of Blue (ranging from disappointment to disbelief with the 4C), yet a defiant
So What? (conviction that Brainchip’s future remains bright despite the 4C seemingly suggesting lack of interest from potential customers, confidence that revenue will eventually come, alas later than hoped for & let’s cross our fingers for some surprise reveals before/at the AGM)

Trumpet players occasionally use mutes to purposely change their instrument’s timbre (tone colour) or lower its volume. So if their sound is a little muffled at times, it doesn‘t mean they‘ve stopped playing altogether. And once in a while they need to take their instrument down and empty the spit valves - but don‘t worry, it‘s mostly water (condensation of the player’s warm moist breath to be precise) and very little actual spit. Also, playing the trumpet can be quite taxing on your lips and you may therefore find it necessary to remove the mouthpiece from your lips from time to time and rest. And last but not least there are those kind of rests that are part of your score and thus intended. The composer may even have chosen to write in a general pause - the absence of sound in all instruments as a powerful means of expression. Sometimes we forget how important silence is in music.

For what it’s worth. The German word for an instrumental mute is “Dämpfer” - this word can also be used metaphorically in the sense of “putting a damper on something“. So while the mood may have been a little subdued today, you should soon be hearing that familiar brilliant sound of trumpets once again, if you choose not to leave the concert hall early, which would indeed be a shame and waste of money in my eyes. And guess what - you are very welcome to join the brass ensemble on stage, playing the trombone or even the tuba, if you prefer that kind of sound over that of a trumpet, as long as your bass line contribution is mostly harmonious - some disharmony is fine, though, and in fact at times even desirable and refreshing:

“Despite their differences, consonance and dissonance tend to work well together in music. Like a good story, tonal music needs conflict to generate tension to drive the story. Dissonance creates that tension in the musical story. The conflict can be but is not required to be, resolved with consonance. Essentially, the composer creates a sense of movement in music by creating tension using dissonant sounds and then releases that tension by returning to consonant sounds.“

Wouldn’t it be gratifying if we all ended up making wonderful music together? After all, aren’t we all in awe of this masterpiece of a composition?

P.S.: Interesting trivia: “Trumpet-like instruments have historically been used as signalling devices in battle or hunting, with examples dating back to at least 1500 BC. They began to be used as musical instruments only in the late 14th or early 15th century.” (Wikipedia)

Doesn’t “Akida Ballista!” sound just like a rousing fanfare? 🎺🎺🎺

I guess there is more than just one way to turn lemons into lemonade.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 14 users
Top Bottom