BRN Discussion Ongoing

Bravo

If ARM was an arm, BRN would be its biceps💪!
Qualcomm's Durga Malladi, right, talks edge AI during CES 2025.

Making the case for edge AI


By Sean Kinney, Editor in Chief
January 13, 2025




The laws of physics, economics and countries suggest edge AI inference—where your device is the far edge—just makes sense​

In the race to build out distributed infrastructure for artificial intelligence (AI), there's a lot more glitz and glam around the applications and devices than around the cooling, rack space and semiconductors doing the heavy lifting in hyperscaler clouds. While that's partly a function of what most people find interesting, it's also not misplaced. For AI to live up to the world-changing hype it's riding on, distributing workload processing makes a lot of sense: running AI inference on a device reduces latency, improving the user experience; it saves the time and cost of piping data back to the cloud for processing; and it helps safeguard personal or otherwise private data.
To say that another way, when you read an announcement for a new class of AI-enabled PCs or smartphones, don't think of it as just another product launch. Think of it as an essential piece of the connected edge-to-cloud continuum that AI needs in order to scale rapidly.
During a panel discussion last week at the Consumer Electronics Show (CES) in Las Vegas, Nevada, Qualcomm’s Durga Malladi, senior vice president and general manager of technology planning and edge solutions, not only defined the edge, but made the case for why it’s a key piece in the larger AI puzzle. “Our definition of the edge is practically every single device we use in our daily lives.” That includes PCs, smartphones and other devices you keep on your person as well as the Wi-Fi access points and enterprise servers that are one hop away from those devices.
“The definition of the edge is not just devices but absolutely something close to it,” Malladi continued. “But the question is why?” Why edge AI inference? First, “Because we can. The computational power we have in the devices today is significantly more than what we’ve seen in the last five years.” Second is the immediacy and responsiveness that come from low latency (or the absence of network latency altogether) when inferencing is done on-device. Third, contextual data enhances AI outcomes while also enhancing privacy.
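Malladi's latency point can be made concrete with back-of-the-envelope arithmetic. The numbers in this sketch are purely illustrative assumptions, not measurements from Qualcomm or anyone else: a cloud round trip pays for two network legs that on-device inference skips entirely.

```python
# Back-of-the-envelope latency budget: on-device vs. cloud inference.
# All figures below are illustrative assumptions, not measurements.

def cloud_latency_ms(uplink_ms: float, inference_ms: float, downlink_ms: float) -> float:
    """Time to ship a request to the cloud and receive the result."""
    return uplink_ms + inference_ms + downlink_ms

def on_device_latency_ms(inference_ms: float) -> float:
    """On-device inference pays no network cost at all."""
    return inference_ms

# Hypothetical numbers: 40 ms each way over a mobile network, 30 ms of
# fast cloud inference vs. 80 ms of slower on-device inference.
cloud = cloud_latency_ms(uplink_ms=40, inference_ms=30, downlink_ms=40)
local = on_device_latency_ms(inference_ms=80)
print(f"cloud round trip: {cloud} ms, on-device: {local} ms")
# Even with a slower local model, the device wins once the network
# legs exceed the inference-speed gap.
```

With these made-up figures the cloud path costs 110 ms against 80 ms locally, and the gap widens on congested or high-latency links.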



He expanded on the privacy point. From a consumer perspective, Malladi explained the contextual nature of AI and gave the example of asking an on-device assistant when your next doctor’s appointment is. For schedule planning, it’d be great for the AI assistant to know about your medical appointments but perhaps concerning if that data leaves your device; but it doesn’t have to. In the enterprise context, Malladi talked about how enterprises fine-tune AI models by loading in proprietary corporate data to, again, contextualize the information and improve the outcome. “There’s a lot of reasons why privacy becomes not just a consumer-centric topic,” he said.

AI is the new UI​

As the conversation expanded, Malladi got into an area of thought that, I think, he debuted last year at the Snapdragon Summit, an annual Qualcomm-hosted event. The idea is that on-device AI agents will access your apps on your behalf, connecting various dots in service of your request and delivering an outcome not tied to one particular application. In this paradigm, the user interface of a smartphone changes; as he put it, “AI is the new UI.”
He tracked computing from command line interfaces to graphical interfaces accessible with a mouse. “Today we live in an app-centric world…It’s a very tactile thing…The truth is that for the longest period of time, as humans, we’ve been learning the language of computers.” AI changes that; when the input mechanism is something natural like your voice, the UI can now transform using AI to become more custom and personal. “The front-end is dominated by an AI agent…that’s the transformation that we’re talking of from a UI perspective.”
He also talked through how a local AI agent will co-evolve with its user. “Over time there is a personal knowledge graph that evolves. It defines you as you, not as someone else.” Localized context, made possible by on-device AI, or edge AI more broadly, will improve agentic outcomes over time. “Lots of work to be done in that space though,” Malladi acknowledged. “And that’s a space where I think, from the tech industry standpoint, we have a lot of work to do.”

https://www.rcrwireless.com/20250113/devices/making-the-case-for-edge-ai
 
  • Like
  • Fire
  • Love
Reactions: 36 users

MrRomper

Regular
  • Like
  • Fire
  • Love
Reactions: 55 users

7für7

Top 20
Looks like a short seller got annoyed by my post and reported it. That makes me even happier 🤩 BrainChip has left its bad days behind. The news is becoming increasingly positive, and the signs of rapid growth are intensifying! Have fun sweating it out 😂 GO BRAINCHIP
 
  • Like
  • Fire
Reactions: 16 users
Just waiting for the next confirmation on the AFRL contract.

We have our suspicions of who it might be, but the initial Ann did state that, whilst there are no other material considerations restricting release of a binding contract, confirmation of the subcontractor agreement appears to be the only one outstanding.

With milestone payments to commence in Jan, I expect we should have the subcontractor Ann anytime now....well, we should imo.

That should also provide further confidence in the mkt that it's off & running...to me, it's not about the $, this is about proving it up further and potentially opening the DoD procurement channel.

Something I found interesting though when looking at the original solicitation was the highlighted bit below.

To me, that reads that the POC (to offset a Phase I requirement) essentially had to be undertaken outside of any previous or ongoing SBIR/STTR engagements. The work was most likely done independently between the incoming subcontractor (algorithms) and BRN, hence the condition of R&D payments to the subcontractor for their ongoing component of work.



PHASE I: As this is a Direct-to-Phase-II (D2P2) topic, no Phase I awards will be made as a result of this topic. To qualify for this D2P2 topic, the Government expects the applicant(s) to demonstrate feasibility by means of a prior “Phase I-type” effort that does not constitute work undertaken as part of a prior or ongoing SBIR/STTR funding agreement. The required feasibility demonstration must include successfully developing advanced AI-based radio frequency (RF) algorithms and successfully porting them to a neuromorphic chip, with the final chip performing very well.
 
  • Like
  • Fire
  • Love
Reactions: 25 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
This post on Linkedin does not specifically mention Brainchip. It does state that 'Neuromorphics will be the Achilles Heel for NVIDIA'. Go Brainchip.
https://www.linkedin.com/posts/bria...9r?utm_source=share&utm_medium=member_desktop
View attachment 75910


"Neuromorphics are the real threat to NVIDIA's dominance in AI". Now that's a statement and a half! 🥳

I remember when Fact Finder expressed similar sentiments in his post and subsequent online blog "Can NVIDIA survive the 4th Industrial Revolution" and sadly there were some posters on here and The Crapper who derided him for it.

It's interesting to note that Brian Anderson has been focussing his Project Phasor efforts at AMD as you can see from this Linkedin post from about a month ago.

Back in June 2024, I posted the below PC World article in which Lisa Su (CEO of AMD) seemed open to the possibility that AMD might develop a neuromorphic chip, so maybe Brian knows something we don't?




Screenshot 2025-01-14 at 11.59.16 am.png






EXTRACT from the PC World article, quoting Lisa Su (CEO of AMD)

But that’s not to say we could see just an NPU, either. In response to another question about whether AMD could develop a neuromorphic chip like Intel’s Loihi, Su seemed open to the possibility. “I think as we go forward, we always look at some specific types of what’s called new acceleration technologies,” she said. “I think we could see some of these going forward.”





 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 25 users

Guzzi62

Regular

Tiny AI chip modeled on the human brain set to boost battery life in smart devices — with lifespan in some rising up to 6 times​

Story by Keumars Afifi-Sabet
• 12h • 2 min read

LAS VEGAS — The world's first "neuromorphic chip" will be on shelves by next year — and it will extend smart devices' battery life. The chip, which mimics the human brain's architecture, is meant to enable artificial intelligence (AI) capabilities on power-limited smart devices.


"Smart" devices like lightbulbs, doorbells or smoke alarms that are Wi-Fi connected are built with sensors that make detections and send data to the cloud for processing.

But this process is power-hungry, Sumeet Kumar, CEO of processor company Innatera Nanosystems, told Live Science in an interview at CES 2025. And any AI processing these devices perform also requires an internet connection.

But the Spiking Neural Processor T1 should drastically slash the power consumption of future smart devices.

It works by analyzing sensor data in real time to identify patterns and potentially clean up the data coming out of the sensors — and no internet connection would be required.

Mimicking the brain​


The device is a neuromorphic processor — meaning its architecture is arranged to mimic the brain's pattern-recognition mechanisms. To draw an analogy, when you sense something — whether it's a smell or a sound — different collections of neurons fire to identify it.

Similarly, in the chip, different groups of artificial neurons register spikes. The underlying principle is the spiking neural network (SNN) — where a neural network is a collection of machine learning algorithms and the spikes it produces are akin to the signals produced by brain cells.

Related: Intel unveils largest-ever AI 'neuromorphic computer' that mimics the human brain

SNN algorithms also tend to be around 100 times smaller in terms of file size than conventional deep neural networks used in large language models.
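For readers curious what "spiking" means in practice, the behaviour described above can be sketched in a few lines. This is a generic leaky integrate-and-fire model, a textbook abstraction rather than Innatera's actual circuit: the neuron accumulates input current, leaks potential each step, and fires when it crosses a threshold.

```python
def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Leaky integrate-and-fire: returns a 0/1 spike train.

    Each step, the membrane potential decays by `leak`, adds the
    input current, and fires (then resets) on crossing `threshold`.
    """
    v, spikes = 0.0, []
    for current in inputs:
        v = v * leak + current
        if v >= threshold:
            spikes.append(1)
            v = 0.0          # reset after firing
        else:
            spikes.append(0)
    return spikes

# A burst of strong input produces a spike; weak input does not.
print(lif_neuron([0.2, 0.2, 0.9, 0.9, 0.1, 0.0]))
# → [0, 0, 1, 0, 0, 0]
```

Because downstream computation happens only when spikes occur, activity (and hence power) is driven by the data rather than a fixed clock, which is the intuition behind the milliwatt-scale figures quoted below.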

Layers of computation​


There are three fundamental layers in the T1 chip. The first is the SNN-based computing engine, which records a power dissipation of less than 1 milliwatt and latency, or delay, that's typically under 1 millisecond for most applications, Kumar said. The second layer includes conventional deep neural networks, while the third layer includes a standard processor that handles how the system functions.


The T1, or similar chips, would increase battery life up to sixfold in some smart devices and scenarios, Kumar said. For example, a prototype of a smart doorbell built with the T1 processor that could detect the presence of a person using radar technology lasted 18 to 20 hours, versus one or two hours in a conventional Wi-Fi-based product that sends image and video data to servers.
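The doorbell figures can be sanity-checked directly. Assuming the same battery capacity in both products, average power draw scales inversely with runtime, so 18–20 hours versus 1–2 hours implies roughly a 9x to 20x reduction in average power; the "sixfold" headline figure is evidently the more conservative general-case claim.

```python
def implied_power_ratio(new_hours: float, old_hours: float) -> float:
    """Same battery capacity: average power scales as 1 / runtime."""
    return new_hours / old_hours

# The article's smart-doorbell prototype figures (illustrative check):
best = implied_power_ratio(20, 1)   # best case: 20x lower average draw
worst = implied_power_ratio(18, 2)  # worst case: 9x lower average draw
print(f"implied power reduction: {worst:.0f}x to {best:.0f}x")
```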


Applications include smart lighting, any kind of detectors for people-counting, door-opening systems, and even earbuds — in which the T1 chip can theoretically isolate different sounds for noise cancellation. When used for any sound-based applications, the company claims there is an 80 to 100 times reduction in energy consumption as well as a 70 times reduction in latency.

The chip is being readied for mass production this year, with samples shipping to device manufacturers. Kumar expects the first products with the T1 neuromorphic chip to hit the shelves by 2026.

Really! The first neuromorphic chip?! Nope, but it certainly looks like a serious competitor.

Edit: The T1 seems to be analog; Akida, as we know, is digital!

Link below:


 
Last edited:
  • Like
  • Thinking
  • Fire
Reactions: 19 users

manny100

Regular
They must have had 11 on hand because I managed to order one.

Oddly enough, I was having a problem with it accepting my address, so I sent an e-mail to sales who then routed it to someone working on their storefront.

Before I received a reply from sales though, I got back on a couple of hours later and made the purchase. About an hour ago, I received an e-mail from sales asking if I wanted an E Key or B+M Key device, so I wonder if they're soldering them up on demand.

Anyhow, the site noted receipt within seven days of receiving an order, and if I remember correctly, it was $15 shipping in the US. That's quite a hefty shipping/handling charge for a stick of gum. Hopefully, it doesn't get stuck to the bottom of someone's shoe.

I haven't received an answer yet, but I've also inquired if the M.2 and the PCIe devices can occupy the same PC.
With luck the early M.2's will be collectors items some time in the future.:)
Did you pick the E key or the other B+M?
 
  • Like
  • Thinking
Reactions: 5 users
Interesting as some have already posted, our shop was updated a few days ago with current product available.

"Sold Out" has since been removed, and the fact it states estimated delivery in 7 days would strongly indicate that stock
is on hand, as in the AKD1000 NSoC.......it has been reinforced by Sean numerous times that we aren't in the business
of becoming a retailer selling over the counter to customers....BUT in saying that, if we want to get these products into
the hands of developers worldwide, I'd suggest they had better make sure that we have a steady supply of NSoCs
available, because my gut is telling me this is going to explode; every "kid" on the block wants the new toy. :ROFLMAO::ROFLMAO:

I've got this great feeling about 2025/2026.....subject to no WW3 or massive market corrections, which ultimately
take every man and his dog southwards....c'mon Sean... let's pump up the volume!!

Good morning all..........Tech.
The imagery for the clip is even apt, Tech, surprised you didn't include it 😛




Even Esky will like this one..
 
  • Love
  • Like
Reactions: 4 users

TECH

Regular
"Neuromorphics are the real threat to NVIDIA's dominance in AI". Now that's a statement and a half! 🥳

AND....we all know how Jensen can leap frog the mob, PLUS with his dominance at the other end (GPU) Nvidia would turn into
an absolute world dominating monster, good or bad ? depends if he coughs up 20 Billion US + for Brainchip....some major, well
cashed up company must be quietly observing Brainchip's rise....the writing is on the wall in my opinion.

As they say in footy....one week at a time :ROFLMAO::ROFLMAO:
 
  • Like
  • Love
  • Fire
Reactions: 19 users

Diogenese

Top 20
Not so sure about short term but medium/long term its looking promising now.
The weekly chart with 250-day Donchian channels clearly shows the long-term downtrend has stopped and a range has developed. This range will eventually morph into an uptrend as the pace of good news increases. Long term looks very promising.
Deals take ages to stitch up, but engagements have been on foot for ages now, so they should be finalised soon.
The trend line since Sept'24 drawn on the chart shows that the SP has wandered a fair way from it and may come back a bit - short term.

View attachment 75906
Hi manny,

Are there different charts for pre-revenue companies and going concerns?

Also for different industrial sectors, eg, mining and tech?
 
  • Like
Reactions: 3 users

Diogenese

Top 20
They must have had 11 on hand because I managed to order one.

Oddly enough, I was having a problem with it accepting my address, so I sent an e-mail to sales who then routed it to someone working on their storefront.

Before I received a reply from sales though, I got back on a couple of hours later and made the purchase. About an hour ago, I received an e-mail from sales asking if I wanted an E Key or B+M Key device, so I wonder if they're soldering them up on demand.

Anyhow, the site noted receipt within seven days of receiving an order, and if I remember correctly, it was $15 shipping in the US. That's quite a hefty shipping/handling charge for a stick of gum. Hopefully, it doesn't get stuck to the bottom of someone's shoe.

I haven't received an answer yet, but I've also inquired if the M.2 and the PCIe devices can occupy the same PC.
The expense is all in the bubble wrap.
 
  • Haha
  • Like
  • Thinking
Reactions: 9 users

jtardif999

Regular
I was actually drafting a follow up response to dingo to raise this very problem as an additional issue that would need to be considered.

In any case, every dog has its day and I am positive the brainchip shareholder base dog will have its day in the not too distant future.

It'll then have its day again and again and again over the next 30 years as the company grows and fast becomes the world's leading provider of edge AI technology.... unless of course NVIDIA throw a few billion at us before we get there.
@SERA2g it would need to be more than a few to interest anyone with a reasonable holding imo.
 
  • Like
Reactions: 4 users

jtardif999

Regular
You would have thought with the people we have employed they would have been fully aware how long these things take?
Though no amount of experience helps when the market is nascent imo.
 
  • Like
Reactions: 2 users

manny100

Regular
Hi manny,

Are there different charts for pre-revenue companies and going concerns?

Also for different industrial sectors, eg, mining and tech?
Hi Diogenese, my comments are largely BRN specific.
In our case, IMO the key is pre- and post-deal rather than pre- and post-revenue, because deals create expectations rather than cash.
The cash can take a while to roll in from a signed deal even though we know it's coming.
The recent LDA arrangement shows we have the ability to draw up to A$140 million. If deals are complex, cash inflow is slow and we want rapid growth, including maybe a small relevant acquisition or two, we may need to access a chunk of the LDA cash.
IMO the uptrend will commence on decent deals, and dilution will not be much of an issue as the call notice/share will be in the dollars, not cents.
Summary of chart:
The long-term downtrend started in Jan '22 and has pretty much run out of puff. Trends do not last forever.
The Donchian 250-day channel shows that long-term lower highs have ceased, as well as lower lows.
So IMO we are in a range after the downtrend.
Ranges do not last forever either, so we will eventually see a break to either the upside or the downside.
We are pre-revenue. We are pre-deal and closing in on deal completion, and good news on that front should see a break to the upside from the range.
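For anyone unfamiliar with the indicator being described: a Donchian channel is simply the rolling highest high and lowest low over a trailing lookback window (250 days in the chart above). A minimal sketch with made-up prices, not BRN data:

```python
def donchian(prices, window):
    """Rolling (lowest low, highest high) over the trailing `window` bars.

    Flat upper and lower bands are the 'range' pattern described above;
    a close beyond either band signals a possible new trend leg.
    """
    channel = []
    for i in range(window - 1, len(prices)):
        bars = prices[i - window + 1 : i + 1]
        channel.append((min(bars), max(bars)))
    return channel

# Toy series with a 5-bar window (the post uses 250 days):
prices = [10, 9, 8, 8, 9, 9, 8, 9, 10, 11]
print(donchian(prices, 5))
# → [(8, 10), (8, 9), (8, 9), (8, 9), (8, 10), (8, 11)]
```

In the toy series, the lower band holds flat at 8 while the upper band turns up at the end, which is exactly the "range resolving to the upside" shape the post anticipates.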

brn 14thJAN25.png
 
Last edited:
  • Like
  • Fire
Reactions: 13 users

Diogenese

Top 20

Tiny AI chip modeled on the human brain set to boost battery life in smart devices — with lifespan in some rising up to 6 times​

Really! The first neuromorphic chip!! Nope, but surely seems like a serious competitor.

Edit: Seems to be analog, AK is as we know digital!

Innatera are probably hair-splitting with a pedantic definition of "neuromorphic" as including only analog designs, since analog is a much closer replication of the neuron architecture. Remember Jason Eshraghian, whose background was also in analog, was quoted as saying of Akida, "I don't know what it is, but it is not neuromorphic," or words to that effect.

It's an analog SNN maker's thing.
 
  • Like
  • Thinking
Reactions: 9 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
In terms of the below info, you'd have to think it would be highly likely that Sailesh Kottapalli would have some knowledge of Intel's neuromorphic computing initiatives, given his extensive 28-year tenure at Intel and his role as a senior fellow and chief architect for Xeon processors.





Extract 1
Screenshot 2025-01-14 at 3.04.37 pm.png


Extract 2
Screenshot 2025-01-14 at 3.05.15 pm.png
 
  • Like
  • Wow
Reactions: 5 users

JB49

Regular
Interesting that no CES podcasts are out yet. Last year they released each podcast the day after it was recorded.
 
Last edited:
  • Like
Reactions: 5 users

Esq.111

Fascinatingly Intuitive.
Good Afternoon Chippers,

Found the second page rather titillating.

RTX 2023 Annual Report... big download, so it may require a savvy Chipper to copy said page.
Thankyou in advance.
On my phone and can't for the life of me figure out how... 😄

Be thinking my price indicator, RockerRothsGettyFellerChild, last price estimate of $7.117 per BRN share needs to be SUBSTANTIALLY UPGRADED.


Remember, we only need to clip 0.25% ( give or take , preferably more take ) of gross net sales to be extremely happy.

Thought this should also be added , though perhaps start on the cheaper bubbles first.
A lot of people get hospitalised annually from champagne-related injuries, so don't f%#@ around, serious stuff.



* Tech shows his prowess towards the end of this vid 😁, knew golf had a useful application.

&



Regards,
Esq.

😁
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 18 users

7für7

Top 20
Short sellers literally trying to keep the share price down

1736829014879.gif
 
  • Haha
  • Fire
Reactions: 6 users