BRN Discussion Ongoing

And sorry to bang on about the Manic Street Preachers.....but there is at least an Australian connection....

1) It's the title of one of their songs......

2) They wrote some songs for Kylie Minogue..

This one made the charts ... (lead singer for MSP in the clip)

And kinda BRN related if you hold shares.......



I'll shut up now.....

A 67/68 Camaro and Kylie Minogue?
Who cares about the song?..

Actually had a crush on her little sister..

Hey I'm from that Era okay 😛..

My favourite song with Kylie (for the song) is with Nick Cave.



Very dark, but hauntingly beautiful.


And.. AKIDA TREBUCHET!!


Gotta catch on sooner or later..
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 6 users

BrainShit

Regular
Geez, Sold out. I hope their presale quantity was at least 100,000 M.2 cards...........

View attachment 75899

They pressed 10 more out...

M2_more.png
 
  • Haha
  • Like
Reactions: 13 users

TECH

Regular
Interesting as some have already posted, our shop was updated a few days ago with current product available.

Sold Out has since been removed, the fact it states estimated delivery 7 days only would strongly indicate that stock
is on hand, as in the AKD 1000 NSoC.......it has been reinforced by Sean numerous times that we aren't in the business
of becoming a retailer of over-the-counter customer supplies....BUT in saying that, if we want to get these products into
the hands of developers worldwide, I'd suggest they had better make sure that we have a steady supply of NSoCs
available, because my gut is telling me this is going to explode, every "kid" on the block wants the new toy. :ROFLMAO::ROFLMAO:

I've got this great feeling about 2025/2026.....subject to no WW3 or massive market corrections, which ultimately
take every man and his dog southwards....c'mon Sean...let's pump up the volume!!

Good morning all..........Tech.
 
  • Like
  • Fire
  • Love
Reactions: 34 users

itsol4605

Regular
Does TI implement Akida?

Edge AI Innovations
 
  • Fire
  • Like
  • Thinking
Reactions: 6 users

BrainShit

Regular
Does TI implement Akida?

Edge AI Innovations


The AWR6843 from Texas Instruments does not contain a neuromorphic architecture... it is a single-chip mmWave sensor based on FMCW (Frequency-Modulated Continuous Wave) radar technology.

The device features:

  1. An Arm Cortex-R4F microcontroller for object detection and interface control
  2. A C674x DSP for advanced signal processing
  3. A hardware accelerator for FFT, filtering, and CFAR processing
These components are traditional digital processing architectures and do not include neuromorphic or Akida-based designs.

Source: https://www.ti.com/product/AWR6843/part-details/AWR6843AQGABLQ1
Source: https://www.ti.com/product/AWR6843/part-details/AWR6843AQGABLRQ1
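
For anyone wondering what "FFT, filtering, and CFAR processing" means in practice on a conventional pipeline like the one above, here's a minimal sketch of a generic cell-averaging CFAR detector run over an FFT magnitude spectrum. Purely illustrative Python, not TI code; the window sizes and threshold factor are made-up values:

```python
import numpy as np

def ca_cfar(fft_magnitudes, guard=2, train=8, scale=3.0):
    # Generic cell-averaging CFAR: estimate the local noise floor from the
    # surrounding training cells (skipping the guard cells next to the cell
    # under test) and flag a detection when the cell exceeds scale * noise.
    n = len(fft_magnitudes)
    detections = np.zeros(n, dtype=bool)
    for i in range(train + guard, n - train - guard):
        leading = fft_magnitudes[i - guard - train : i - guard]
        trailing = fft_magnitudes[i + guard + 1 : i + guard + train + 1]
        noise_floor = np.mean(np.concatenate([leading, trailing]))
        detections[i] = fft_magnitudes[i] > scale * noise_floor
    return detections

# Toy example: a noisy magnitude spectrum with two synthetic "targets".
rng = np.random.default_rng(0)
spectrum = rng.rayleigh(1.0, 256)
spectrum[60] += 12.0
spectrum[180] += 9.0
print(np.flatnonzero(ca_cfar(spectrum)))  # indices flagged as detections
```

The point is simply that this kind of threshold-over-noise-floor logic is classic DSP, a very different beast from event-driven neuromorphic processing.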
 
Last edited:
  • Like
  • Fire
Reactions: 11 users
NDAs have been holding back the SP, sure. The company has said, and I am paraphrasing here, that timeframes to deals and to market are longer than anticipated.
You would have thought that, with the people we have employed, they would have been fully aware of how long these things take?
 
  • Like
  • Fire
Reactions: 7 users

manny100

Regular
Interesting as some have already posted, our shop was updated a few days ago with current product available.

Sold Out has since been removed, the fact it states estimated delivery 7 days only would strongly indicate that stock
is on hand, as in the AKD 1000 NSoC.......it has been reinforced by Sean numerous times that we aren't in the business
of becoming a retailer of over-the-counter customer supplies....BUT in saying that, if we want to get these products into
the hands of developers worldwide, I'd suggest they had better make sure that we have a steady supply of NSoCs
available, because my gut is telling me this is going to explode, every "kid" on the block wants the new toy. :ROFLMAO::ROFLMAO:

I've got this great feeling about 2025/2026.....subject to no WW3 or massive market corrections, which ultimately
take every man and his dog southwards....c'mon Sean...let's pump up the volume!!

Good morning all..........Tech.
Not so sure about the short term, but medium/long term it's looking promising now.
The weekly chart with 250-day Donchian channels clearly shows the long-term downtrend has stopped and a range has developed. This range will eventually morph into an uptrend as the pace of good news increases. Long term looks very promising.
Deals take ages to stitch up, but engagements have been on foot for ages, so they should be finalised soon.
The trend line since Sept '24 drawn on the chart shows that the SP has wandered a fair way from it and may come back a bit - short term.

brn 14thJAN25.png
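
For anyone not familiar with the Donchian channels on the chart above: the channel is just the rolling highest high and lowest low over a lookback window (250 trading days in this case), and a flattening upper band is one sign that a downtrend has stalled into a range. A rough sketch in Python/pandas, assuming a hypothetical daily OHLC DataFrame with 'High' and 'Low' columns:

```python
import pandas as pd

def donchian_channel(ohlc: pd.DataFrame, window: int = 250) -> pd.DataFrame:
    # Rolling highest high / lowest low over `window` bars, plus the midline.
    # Column names 'High'/'Low' are assumptions for this illustration.
    channel = pd.DataFrame(index=ohlc.index)
    channel["upper"] = ohlc["High"].rolling(window).max()
    channel["lower"] = ohlc["Low"].rolling(window).min()
    channel["middle"] = (channel["upper"] + channel["lower"]) / 2
    return channel

# Usage, assuming `prices` holds BRN daily bars:
# channels = donchian_channel(prices, window=250)
# print(channels.tail())
```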
 
  • Like
Reactions: 12 users

itsol4605

Regular
  • Like
  • Fire
Reactions: 14 users
  • Haha
  • Like
Reactions: 6 users

JDelekto

Regular
Nah...I bought all 10 for posterity...that's why they said on the site...for more than 10 contact them...cause they only had 10 :LOL::ROFLMAO:
They must have had 11 on hand because I managed to order one.

Oddly enough, I was having a problem with it accepting my address, so I sent an e-mail to sales who then routed it to someone working on their storefront.

Before I received a reply from sales though, I got back on a couple of hours later and made the purchase. About an hour ago, I received an e-mail from sales asking if I wanted an E Key or B+M Key device, so I wonder if they're soldering them up on demand.

Anyhow, the site noted receipt within seven days of receiving an order, and if I remember correctly, it was $15 shipping in the US. That's quite a hefty shipping/handling charge for a stick of gum. Hopefully, it doesn't get stuck to the bottom of someone's shoe.

I haven't received an answer yet, but I've also inquired if the M.2 and the PCIe devices can occupy the same PC.
 
  • Like
  • Haha
  • Thinking
Reactions: 20 users

AARONASX

Holding onto what I've got
  • Like
  • Love
  • Fire
Reactions: 18 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Qualcomm's Durga Malladi, right, talks edge AI during CES 2025.

Making the case for edge AI

By Sean Kinney, Editor in Chief
January 13, 2025




The laws of physics, economics and countries suggest edge AI inference—where your device is the far edge—just makes sense​

In the race to build out distributed infrastructure for artificial intelligence (AI), there's a lot more glitz and glam around the applications and devices than around the cooling, rackspace and semiconductors doing the heavy lifting in hyperscaler clouds. While that's maybe a function of what most people find interesting, it's also not misplaced. For AI to live up to the world-changing hype it's riding high upon, distributing workload processing for AI makes a lot of sense—in fact, running AI inferencing on a device reduces latency, leading to an improved user experience; it saves the time and cost of piping data back to the cloud for processing; and it helps safeguard personal or otherwise private data.
To say that another way, when you read an announcement for a new class of AI-enabled PCs or smartphones, don't think of it as just another product launch. Think of it as an essential piece of building out the connected edge-to-cloud continuum that AI needs to rapidly scale.
During a panel discussion last week at the Consumer Electronics Show (CES) in Las Vegas, Nevada, Qualcomm’s Durga Malladi, senior vice president and general manager of technology planning and edge solutions, not only defined the edge, but made the case for why it’s a key piece in the larger AI puzzle. “Our definition of the edge is practically every single device we use in our daily lives.” That includes PCs, smartphones and other devices you keep on your person as well as the Wi-Fi access points and enterprise servers that are one hop away from those devices.
“The definition of the edge is not just devices but absolutely something close to it,” Malladi continued. “But the question is why?” Why edge AI inference? First, “Because we can. The computational power we have in the devices today is significantly more than what we’ve seen in the last five years.” Second is immediacy and responsiveness derived from latency (or the lack thereof) when inferencing is done on-device. Third is around contextual data enhancing AI outcomes while also enhancing privacy.



He expanded on the privacy point. From a consumer perspective, Malladi explained the contextual nature of AI and gave the example of asking an on-device assistant when your next doctor’s appointment is. For schedule planning, it’d be great for the AI assistant to know about your medical appointments but perhaps concerning if that data leaves your device; but it doesn’t have to. In the enterprise context, Malladi talked about how enterprises fine-tune AI models by loading in proprietary corporate data to, again, contextualize the information and improve the outcome. “There’s a lot of reasons why privacy becomes not just a consumer-centric topic,” he said.

AI is the new UI​

As the conversation expanded, Malladi got into an area of thought that he, I think, debuted last year at the Snapdragon Summit, an annual Qualcomm-hosted event. The idea is that on-device AI agents will access your apps on your behalf, connecting various dots in service of your request and delivering an outcome not tied to one particular application. In this paradigm, the user interface of a smart phone changes; as he put it, “AI is the new UI.”
He tracked computing from command line interfaces to graphical interfaces accessible with a mouse. “Today we live in an app-centric world…It’s a very tactile thing…The truth is that for the longest period of time, as humans, we’ve been learning the language of computers.” AI changes that; when the input mechanism is something natural like your voice, the UI can now transform using AI to become more custom and personal. “The front-end is dominated by an AI agent…that’s the transformation that we’re talking of from a UI perspective.”
He also talked through how a local AI agent will co-evolve with its user. “Over time there is a personal knowledge graph that evolves. It defines you as you, not as someone else.” Localized context, made possible by on-device AI, or edge AI more broadly, will improve agentic outcomes over time. “Lots of work to be done in that space though,” Malladi acknowledged. “And that’s a space where I think, from the tech industry standpoint, we have a lot of work to do.”

https://www.rcrwireless.com/20250113/devices/making-the-case-for-edge-ai
 
  • Like
  • Fire
  • Love
Reactions: 31 users

MrRomper

Regular
  • Like
  • Fire
  • Love
Reactions: 45 users

7für7

Top 20
Looks like a short seller got annoyed by my post and reported it. That makes me even happier 🤩 BrainChip has left its bad days behind. The news is becoming increasingly positive, and the signs of rapid growth are intensifying! Have fun sweating it out 😂 GO BRAINCHIP
 
  • Like
  • Fire
Reactions: 14 users
Just waiting for the next confirmation on the AFRL contract.

We have our suspicions of who it might be, but the initial Ann did state that, whilst there are no other material considerations restricting release of a binding contract, confirmation of the subcontractor agreement appears to be the only outstanding item.

With milestone payments to commence in Jan, I expect we should have the subcontractor Ann anytime now....well, we should imo.

That should also provide further confidence in the mkt that it's off & running...to me, it's not about the $, this is about proving it up further and potentially opening the DoD procurement channel.

Something I found interesting though when looking at the original solicitation was the highlighted bit below.

To me that reads that the POC (to offset a Phase I requirement) essentially had to be undertaken by parties outside of any previous or ongoing SBIR/STTR engagements. The work was most likely done independently between the incoming subcontractor (algos) and BRN, hence the condition of R&D payments to the subcontractor for their ongoing component of work.



PHASE I: As this is a Direct-to-Phase-II (D2P2) topic, no Phase I awards will be made as a result of this topic. To qualify for this D2P2 topic, the Government expects the applicant(s) to demonstrate feasibility by means of a prior “Phase I-type” effort that does not constitute work undertaken as part of a prior or ongoing SBIR/STTR funding agreement. The required feasibility demonstration must include successfully developing advanced AI-based radio frequency (RF) algorithms and successfully porting them to a neuromorphic chip, with the final chip performing very well.
 
  • Like
  • Fire
  • Love
Reactions: 20 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
This post on Linkedin does not specifically mention Brainchip. It does state that 'Neuromorphics will be the Achilles Heel for NVIDIA'. Go Brainchip.
https://www.linkedin.com/posts/bria...9r?utm_source=share&utm_medium=member_desktop
View attachment 75910


"Neuromorphics are the real threat to NVIDIA's dominance in AI". Now that's a statement and a half! 🥳

I remember when Fact Finder expressed similar sentiments in his post and subsequent online blog "Can NVIDIA survive the 4th Industrial Revolution" and sadly there were some posters on here and The Crapper who derided him for it.

It's interesting to note that Brian Anderson has been focussing his Project Phasor efforts at AMD as you can see from this Linkedin post from about a month ago.

Back in June 2024, I posted the below PC World article in which Lisa Su (CEO of AMD) seemed open to the possibility that AMD might develop a neuromorphic chip, so maybe Brian knows something we don't?




Screenshot 2025-01-14 at 11.59.16 am.png






EXTRACT - Quoting Lisa Su (CEO of AMD) from the PC World article

But that’s not to say we could see just an NPU, either. In response to another question about whether AMD could develop a neuromorphic chip like Intel’s Loihi, Su seemed open to the possibility. “I think as we go forward, we always look at some specific types of what’s called new acceleration technologies,” she said. “I think we could see some of these going forward.”





 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 21 users

Guzzi62

Regular

Tiny AI chip modeled on the human brain set to boost battery life in smart devices — with lifespan in some rising up to 6 times​

Story by Keumars Afifi-Sabet

LAS VEGAS — The world's first "neuromorphic chip" will be on shelves by next year — and it will extend smart devices' battery life. The chip, which mimics the human brain's architecture, is meant to enable artificial intelligence (AI) capabilities on power-limited smart devices.


"Smart" devices like lightbulbs, doorbells or smoke alarms that are Wi-Fi connected are built with sensors that make detections and send data to the cloud for processing.

But this process is power-hungry, Sumeet Kumar, CEO of processor company Innatera Nanosystems, told Live Science in an interview at CES 2025. And any AI processing these devices perform also requires an internet connection.

But the Spiking Neural Processor T1 should drastically slash the power consumption of future smart devices.

It works by analyzing sensor data in real time to identify patterns and potentially clean up the data coming out of the sensors — and no internet connection would be required.

Mimicking the brain​


The device is a neuromorphic processor — meaning its architecture is arranged to mimic the brain's pattern-recognition mechanisms. To draw an analogy, when you sense something — whether it's a smell or a sound — different collections of neurons fire to identify it.

Similarly, in the chip, different groups of artificial neurons register spikes. The underlying principle is the spiking neural network (SNN) — where a neural network is a collection of machine learning algorithms and the spikes it produces are akin to the signals produced by brain cells.

Related: Intel unveils largest-ever AI 'neuromorphic computer' that mimics the human brain

SNN algorithms also tend to be around 100 times smaller in terms of file size than conventional deep neural networks used in large language models.
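
(As a rough illustration of the spiking idea described above, and nothing to do with Innatera's or BrainChip's actual silicon: a leaky integrate-and-fire neuron only emits a spike when its accumulated input crosses a threshold, which is why spiking networks can sit nearly idle on quiet sensor data. All parameter values in the sketch below are arbitrary.)

```python
import numpy as np

def lif_neuron(input_current, threshold=1.0, leak=0.95, reset=0.0):
    # Leaky integrate-and-fire: the membrane potential decays each step,
    # integrates the incoming current, and a spike (1) is emitted whenever
    # the potential crosses the threshold, after which it is reset.
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = leak * potential + current
        if potential >= threshold:
            spikes.append(1)
            potential = reset
        else:
            spikes.append(0)
    return spikes

# The quiet stretches produce no spikes (and so almost no downstream work);
# a short burst of activity produces only a handful of spikes.
quiet = np.full(50, 0.01)
burst = np.concatenate([quiet, np.full(10, 0.6), quiet])
print(sum(lif_neuron(burst)), "spikes over", len(burst), "time steps")
```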

Layers of computation​


There are three fundamental layers in the T1 chip. The first is the SNN-based computing engine, which records a power dissipation of less than 1 milliwatt and latency, or delay, that's typically under 1 millisecond for most applications, Kumar said. The second layer includes conventional deep neural networks, while the third layer includes a standard processor that handles how the system functions.


The T1, or similar chips, would increase battery life up to sixfold in some smart devices and scenarios, Kumar said. For example, a prototype of a smart doorbell built with the T1 processor that could detect the presence of a person using radar technology lasted 18 to 20 hours, versus one or two hours in a conventional Wi-Fi-based product that sends image and video data to servers.


Applications include smart lighting, any kind of detectors for people-counting, door-opening systems, and even earbuds — in which the T1 chip can theoretically isolate different sounds for noise cancellation. When used for any sound-based applications, the company claims there is an 80 to 100 times reduction in energy consumption as well as a 70 times reduction in latency.

The chip is being readied for mass production this year, with samples shipping to device manufacturers. Kumar expects the first products with the T1 neuromorphic chip to hit the shelves by 2026.

Really! The first neuromorphic chip!! Nope, but surely seems like a serious competitor.

Edit: Seems to be analog; Akida, as we know, is digital!

Link below:


 
Last edited:
  • Like
  • Thinking
  • Fire
Reactions: 16 users

manny100

Regular
They must have had 11 on hand because I managed to order one.

Oddly enough, I was having a problem with it accepting my address, so I sent an e-mail to sales who then routed it to someone working on their storefront.

Before I received a reply from sales though, I got back on a couple of hours later and made the purchase. About an hour ago, I received an e-mail from sales asking if I wanted an E Key or B+M Key device, so I wonder if they're soldering them up on demand.

Anyhow, the site noted receipt within seven days of receiving an order, and if I remember correctly, it was $15 shipping in the US. That's quite a hefty shipping/handling charge for a stick of gum. Hopefully, it doesn't get stuck to the bottom of someone's shoe.

I haven't received an answer yet, but I've also inquired if the M.2 and the PCIe devices can occupy the same PC.
With luck, the early M.2's will be collectors' items some time in the future. :)
Did you pick the E key or the other B+M?
 
  • Like
  • Thinking
Reactions: 5 users