BRN Discussion Ongoing

IloveLamp

Top 20
(image attachment)
  • Like
Reactions: 13 users

mrgds

Regular
  • Like
  • Fire
  • Love
Reactions: 24 users

(image attachments)
  • Like
  • Fire
  • Love
Reactions: 45 users

7für7

Top 20
  • Haha
  • Sad
  • Like
Reactions: 4 users

TECH

Regular
Uiux is of the male species, having talked with him via mobile phone conversation..Bravo is of the female species.

Uiux and Fact Finder had a personality clash..both very intelligent, both top class researchers...a very similar scenario between Bravo and Frangapani, who are both fantastic contributors to our forum, both have different approaches in the way they communicate and share their research..I'd suggest that being Australian, we can relate to Bravo's style in a little more relaxed Aussie way, if that makes sense.

The key thing to remember is that we are all different, we all have an individual style in how we communicate, that's what makes us unique..just like Peter's brilliant original SNAP64...out of which the AKIDA suite of products is now being realized.

Accept others' posts without trying to be so judgemental...ultimately we ALL want to see Brainchip succeed, with or without any input from Nvidia 🤣🤣

God Bless...Tech x
 
  • Like
  • Love
  • Fire
Reactions: 55 users

7für7

Top 20
Sir? Sir??? One question…. Just to confirm…”Accept others posts” or “except other posts”…because there is T&J waiting for further clarification …. I just want to be sure… don’t think I could accept something from an individual like him! For the rest of your post thumbs up

(gif attachment)
  • Like
Reactions: 1 users
(gif attachment)
  • Haha
  • Like
Reactions: 5 users

Gazzafish

Regular
Not sure if anyone could dig a little deeper. Running on an Arm Cortex-A53 with an in-house NPU… could it be Akida?

https://www.hackster.io/news/m5stac...ngs-with-the-3-2-tops-llm-module-f0a4e061f0de

Article:

M5Stack Adds Large Language Model Support to Its Offerings with the 3.2 TOPS LLM Module

Get your M5Stack-powered project a little offline AI with the company's Axera AX630C-powered edge AI LLM host.


Gareth Halfacree
1 day ago • Machine Learning & AI / HW101

Modular electronics specialist M5Stack has announced its latest hardware release, a module that aims to add large language model (LLM) artificial intelligence (AI) to your builds: the sensibly-named M5Stack LLM Module.
"[The LLM Module] is an integrated offline large language model (LLM) inference module designed for terminal devices that require efficient and intelligent interaction," M5Stack says of the hardware in question. "Whether for smart homes, voice assistants, or industrial control, Module LLM provides a smooth and natural AI experience without relying on the cloud, ensuring privacy and stability."
M5Stack is adding LLM-powered edge AI to its Core family of modular development boards with the new LLM Module. (📹: M5Stack)
Aiming to ride the current boom in LLM technology, in which user queries are broken down into "tokens" and the most likely tokens returned in response to create an answer-shaped object that may, if everything goes well, also be correct, the LLM Module — brought to our attention by Linux Gizmos — is powered by an Axera AX630C system-on-chip. This combines two Arm Cortex-A53 cores running at up to 1.2GHz with an in-house neural processing unit (NPU) delivering 3.2 tera-operations per second (TOPS) of compute at INT8 precision rising as high as 12.8 TOPS if you drop to INT4 precision.
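The token-by-token generation described above can be sketched with a toy greedy decoding loop. This is purely illustrative (a hard-coded bigram table stands in for a real neural network; none of these names come from M5Stack's or Axera's software):

```python
# Toy greedy decoding loop illustrating token-by-token LLM generation.
# The "model" is a hand-written bigram probability table, not a real network.

TOY_BIGRAMS = {
    "the": {"cat": 0.6, "dog": 0.4},
    "cat": {"sat": 0.9, "ran": 0.1},
    "sat": {"down": 1.0},
}

def next_token(prev: str):
    """Return the most likely next token given the previous one, or None."""
    dist = TOY_BIGRAMS.get(prev)
    if not dist:
        return None  # no known continuation: stop generating
    return max(dist, key=dist.get)  # greedy: pick the highest-probability token

def generate(prompt: str, max_tokens: int = 8):
    """Repeatedly append the most likely next token, as an LLM runtime does."""
    tokens = prompt.split()
    for _ in range(max_tokens):
        tok = next_token(tokens[-1])
        if tok is None:
            break
        tokens.append(tok)
    return tokens

print(generate("the"))  # ['the', 'cat', 'sat', 'down']
```

A real LLM replaces the lookup table with a forward pass through the network, and usually samples from the distribution rather than always taking the maximum, but the outer loop is the same shape.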
This, combined with 3GB of memory dedicated to the NPU with 1GB left for the operating system installed on a 32GB eMMC module, is enough to run smaller large language models entirely on-device — while drawing, the company claims, as little as 1.5W. The module includes an integrated microphone with wake-word and speech recognition models pre-loaded, and a speaker to serve as an output through an integrated text-to-speech model. The eMMC comes with an unspecified version of Canonical's Ubuntu Linux pre-loaded, and can be upgraded through a microSD Card slot.
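A quick back-of-the-envelope check (my own arithmetic, not from the article) shows why a roughly 0.5-billion-parameter model fits comfortably in the 3GB NPU allocation: weight memory is approximately parameter count times bytes per parameter.

```python
# Approximate weight memory for a quantized model.
# Parameter count and precisions are assumptions for illustration.

def weight_bytes(params: int, bits_per_param: int) -> int:
    """Rough weight storage: params x bits, converted to bytes."""
    return params * bits_per_param // 8

HALF_B = 500_000_000  # Qwen2.5-0.5B: roughly 0.5 billion parameters

int8_gb = weight_bytes(HALF_B, 8) / 1e9   # ~0.5 GB at INT8
int4_gb = weight_bytes(HALF_B, 4) / 1e9   # ~0.25 GB at INT4
print(f"INT8: {int8_gb:.2f} GB, INT4: {int4_gb:.2f} GB")
```

Even allowing for activations and KV cache on top of the weights, that leaves headroom within 3GB, and suggests the promised 1.5B-parameter models should still fit once quantized.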


On the software front, M5Stack says the module is compatible with multiple large language models — featuring the Qwen2.5-0.5B model out-of-the-box, a compact LLM with around 500 million parameters tweaked for edge AI operations. The company has promised that future updates will bring support for the more capable Qwen2.5-1.5B model, three times the size of the launch model, as well as Llama3.2-1B and InternVL2-1B. If connected to a USB camera, the module also supports computer vision models including CLIP and YoloWorld at launch with DepthAnything, SegmentAnything, "and other advanced models" to follow in future updates.
The M5Stack LLM Module has been listed on the M5Stack store at $49.90, though at the time of writing was showing as out-of-stock; the company has also announced an "LLM debugging kit" that adds a Fast Ethernet network port and a dedicated kernel serial port, pricing for which has not yet been announced. The module is compatible with the company's Core, Core2, CoreS3, and Core MP135 development boards.
 
  • Like
  • Wow
Reactions: 3 users

buena suerte :-)

BOB Bank of Brainchip
Better action today :) :) :) 0.255 & 0.26 wiped

(chart attachments)
  • Like
  • Love
  • Fire
Reactions: 21 users


itsol4605

Regular
Up 12%... nice!! Why? Any fundamental reason?
 
  • Like
Reactions: 1 users

charles2

Regular
Yeah, action looking pretty good today.

A few biggish trades going through so far, so maybe someone with a bit of cash is ready to buy in. Wondering if they're keen to get in before the US election.

Could be an interesting few days ahead.
Could be an interesting few months, years.....
 
  • Like
  • Love
  • Fire
Reactions: 13 users
 

Slade

Top 20
Trump and Elon are good for BrainChip
 
  • Like
  • Fire
  • Sad
Reactions: 12 users