BRN Discussion Ongoing

buena suerte :-)

BOB Bank of Brainchip
What a few weeks it’s been to be alive: first becoming a grandad, then we just had an offer accepted on a new house and get the keys in 3 weeks, plus work hasn’t been too bad either and I picked up a few decent contracts worth a few million as well. Then BrainChip, after so much pain the last few years, seems like it’s just about to turn a corner ❤️❤️❤️
Reactions: 16 users

DK6161

Regular
What a few weeks it’s been to be alive: first becoming a grandad, then we just had an offer accepted on a new house and get the keys in 3 weeks, plus work hasn’t been too bad either and I picked up a few decent contracts worth a few million as well. Then BrainChip, after so much pain the last few years, seems like it’s just about to turn a corner ❤️❤️❤️

I’ve always believed in karma, and maybe karma is a thing 😁
Great to hear, Pop! Hope the good news keeps rolling in till the end of the year.
I bloody well hope I am finally in the green by Christmas with this one.
 
Reactions: 1 user
Great to hear, Pop! Hope the good news keeps rolling in till the end of the year.
I bloody well hope I am finally in the green by Christmas with this one.
Thanks DK even if I don’t always agree with what you say 😂
 
Reactions: 6 users

Mea culpa

prəmɪskjuəs
What a few great weeks it’s been to be alive: first becoming a grandad, then we just had an offer accepted on a new house and get the keys in 3 weeks, plus work hasn’t been too bad either and I picked up a few decent contracts worth a few million as well. Then BrainChip, after so much pain the last few years, seems like it’s just about to turn a corner ❤️❤️❤️

I’ve always believed in karma, and maybe karma is a thing 😁
Jeez Pom, I posted something earlier for you on the profile status list to check out; it may still be there. However, the Dreddbot thing took umbrage, gave me a lecture about questionable services and binned my post. Anyway, well done.
 
Reactions: 6 users

Cgc516

Regular
I'll give you a tip. It doesn't just sound stupid ... it is.
I don’t see the tips. Hopefully, you are not!
 

7für7

Top 20
What a few great weeks it’s been to be alive: first becoming a grandad, then we just had an offer accepted on a new house and get the keys in 3 weeks, plus work hasn’t been too bad either and I picked up a few decent contracts worth a few million as well. Then BrainChip, after so much pain the last few years, seems like it’s just about to turn a corner ❤️❤️❤️

I’ve always believed in karma, and maybe karma is a thing 😁
Congrats man!!! What fantastic news! 👍☺️
 
Reactions: 5 users

MDhere

Regular
I'm not really jealous ...

...

I'm really, really, really jealous.
I know, I know, I will take plenty of pics and will chat with my French friends and of course Valeo :)
I paid for the trip and the ticket "to represent all my fellow brners" :)
 
Reactions: 39 users

7für7

Top 20
Some day traders made a profit at the end… but it still closed green.
 
Reactions: 2 users

Terroni2105

Founding Member
Reactions: 57 users

CHIPS

Regular

I can beat that 😂! 445,491 shares is a high amount for Germany less than an hour after market opening, and that is only for the Tradegate marketplace.



This is the chart of the trades on all German markets for one hour. It adds up to roughly 651,349. That is a lot for Germany!

 
Reactions: 12 users

Xray1

Regular
Reactions: 14 users
Could Akida by any chance be the integrated AI accelerator used in the newly launched Raspberry Pi AI Camera, jointly developed by Raspberry Pi and Sony Semiconductor Solutions (SSS)? (For the sake of completeness, I should, however, add that while Sony’s IMX500 Camera Module is being promoted as having on-chip AI image processing, there is no specific mention of this involving neuromorphic technology.)

I am afraid I can’t answer the legitimate question regarding potential revenue, though. It is a fact that Sony Semiconductor Solutions has not signed a license with us, so it would have to be a license through either Megachips or Renesas (both of which also happen to be Japanese companies).
I guess - as usual - we will have to resort to watching the financials, unless we find out sooner one way or the other…

So here goes my train of thought: Yesterday (Sept 30) was the Raspberry Pi AI Camera’s official launch - coincidentally (or not?), this happened to be the day when the BRN share price soared without any official news as a catalyst… Could there have been some kind of leak, though?

Also, BrainChip has promised “exciting demos, including our Temporal Event-based Neural Networks (TENNs) and the Raspberry Pi 5 with Face and Edge Learning” for the Embedded World North America, taking place from October 8-10, 2024, at the Convention Center in Austin, TX.

The now sold-out Akida Raspberry Pi Dev Kit was based on the Raspberry Pi 4, so they won’t be using one of those for the announced Raspberry Pi 5 demo. Since we are nowadays primarily an IP company, would manufacturing and releasing a new Akida Dev Kit based on a Raspberry Pi 5 make sense? Not really. How about demonstrating an affordable AI camera using Akida technology manufactured by someone else (the mysterious Custom Customer SoC?)… 🤔

The Raspberry Pi AI Kit that came out in June and uses a Hailo AI acceleration module will only work with a Raspberry Pi 5 (introduced in October 2023), whereas the new Raspberry Pi AI Camera will work with all Raspberry Pi models. So while it may perform best on the latest Raspberry Pi 5, it will still be useful for developers with older RPI models as well.

Maybe one of the resident TSE techies will be able to tell us right away that this is a dead end or a pie in the sky, but until then I’ll keep my fingers crossed…




Sony Semiconductor Solutions and Raspberry Pi Launch the Raspberry Pi AI Camera​

Accelerating the development of edge AI solutions​

Sony Semiconductor Solutions Corporation
Raspberry Pi Ltd.
Atsugi, Japan and Cambridge, UK — Sony Semiconductor Solutions Corporation (SSS) and Raspberry Pi Ltd today announced that they are launching a jointly developed AI camera. The Raspberry Pi AI Camera, which is compatible with Raspberry Pi’s range of single-board computers, will accelerate the development of AI solutions which process visual data at the edge. Starting from September 30, the product will be available for purchase from Raspberry Pi’s network of Approved Resellers, for a suggested retail price of $70.00*.
* Not including any applicable local taxes.

[Image: Raspberry Pi AI Camera]

In April 2023, it was announced that SSS would make a minority investment in Raspberry Pi Ltd. Since then, the companies have been working to develop an edge AI platform for the community of Raspberry Pi developers, based on SSS technology. The AI Camera is powered by SSS’s IMX500 intelligent vision sensor, which is capable of on-chip AI image processing, and enables Raspberry Pi users around the world to easily and efficiently develop edge AI solutions that process visual data.
AI camera features
  • Because vision data is normally massive, using it to develop AI solutions can require a graphics processing unit (GPU), an accelerator, and a variety of other components in addition to a camera. The new Raspberry Pi AI Camera, however, is equipped with the IMX500 intelligent vision sensor which handles AI processing, making it easy to develop edge AI solutions with just a Raspberry Pi and the AI Camera.
  • The new AI Camera is compatible with all Raspberry Pi single-board computers, including the latest Raspberry Pi 5. This enables users to develop solutions with familiar hardware and software, taking advantage of the widely used and powerful libcamera and Picamera2 software libraries.
“SSS and Raspberry Pi Ltd aim to provide Raspberry Pi users and the development community with a unique development experience,” said Eita Yanagisawa, General Manager, System Solutions Division, Sony Semiconductor Solutions Corporation. “I’m very excited to share SSS edge AI sensing technology with the world’s largest development community as the first fruits of our strategic partnership. We look forward to further collaboration with Raspberry Pi using our AITRIOS™ edge AI solution development and operations platform. We aim to make the most of AI cameras equipped with our image sensors in our collaborative efforts with Raspberry Pi.”

“AI-based image processing is becoming an attractive tool for developers around the world,” said Eben Upton, CEO, Raspberry Pi Ltd. “Together with our longstanding image sensor partner Sony Semiconductor Solutions, we have developed the Raspberry Pi AI Camera, incorporating Sony’s image sensor expertise. We look forward to seeing what our community members are able to achieve using the power of the Raspberry Pi AI Camera.”

Specifications
  • Sensor model: SSS's approx. 12.3 effective megapixel IMX500 intelligent vision sensor with a powerful neural network accelerator
  • Sensor modes: 4,056(H) x 3,040(V) at 10 fps / 2,028(H) x 1,520(V) at 40 fps​
  • Unit cell size: 1.55 µm x 1.55 µm​
  • 76 degree FoV with manual/mechanical adjustable focus​
  • Integrated RP2040 for neural network firmware management​
  • Works with all Raspberry Pi models using only Raspberry Pi standard camera connector cable​
  • Pre-loaded with MobileNetSSD model​
  • Fully integrated with libcamera​
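
As a rough illustration of what working with the camera might look like from the Pi side, here is a minimal Picamera2 sketch. It assumes only the standard Picamera2 capture API; the metadata field used for the on-sensor inference results ("ai_outputs" below) is a placeholder, so check the Raspberry Pi AI Camera documentation for the real interface.

```python
# Minimal sketch (not verified on hardware): grabbing frames and per-frame
# metadata from a Raspberry Pi camera with Picamera2. On the AI Camera the
# IMX500 runs the neural network on-sensor, so inference results are expected
# to arrive as metadata alongside each frame rather than being computed on
# the Pi itself.
from picamera2 import Picamera2

picam2 = Picamera2()
picam2.configure(picam2.create_preview_configuration())
picam2.start()

for _ in range(100):
    frame = picam2.capture_array()        # RGB frame from the sensor
    metadata = picam2.capture_metadata()  # per-frame metadata dictionary
    # NOTE: the key used for on-sensor inference output is an assumption here;
    # consult the Raspberry Pi AI Camera documentation for the actual field.
    detections = metadata.get("ai_outputs")
    if detections:
        print(detections)

picam2.stop()
```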
About Sony Semiconductor Solutions Corporation
Sony Semiconductor Solutions Corporation is a wholly owned subsidiary of Sony Group Corporation and the global leader in image sensors. It operates in the semiconductor business, which includes image sensors and other products. The company strives to provide advanced imaging technologies that bring greater convenience and fun. In addition, it also works to develop and bring to market new kinds of sensing technologies with the aim of offering various solutions that will take the visual and recognition capabilities of both humans and machines to greater heights.
For more information, please visit https://www.sony-semicon.com/en/index.html.

About Raspberry Pi Ltd
Raspberry Pi is on a mission to put high-performance, low-cost, general-purpose computing platforms in the hands of enthusiasts and engineers all over the world. Since 2012, we’ve been designing single-board and modular computers, built on the Arm architecture, and running the Linux operating system. Whether you’re an educator looking to excite the next generation of computer scientists; an enthusiast searching for inspiration for your next project; or an OEM who needs a proven rock-solid foundation for your next generation of smart products, there’s a Raspberry Pi computer for you.
Note: AITRIOS is the registered trademark or trademark of Sony Group Corporation or its affiliates.






After reading this post early this morning, I'm a little surprised that there hasn't been further discussion of the AKIDA Pico and Raspberry Pi.

When I read about the "AKIDA Pico" this morning, I immediately thought of the Raspberry Pi board known as the "Raspberry Pi Pico". This is a board whose use cases are typically battery powered, so power draw needs to be kept to a minimum.

What are people's thoughts on the AKIDA Pico and Raspberry Pi Pico being linked somehow? (Keep in mind that the Raspberry Pi Pico 2 has just been released.)
 
Reactions: 25 users

manny100

Regular
According to Simply Wall Sxxxt, insider ownership of BRN is 16.7%.
The employee share scheme holds 0.0445%.
That is an indication of faith by people who know.
No wonder we have had several enhancements in a short period of time.
 
Reactions: 11 users

CHIPS

Regular
I am in love ... with PICO :love:

Just imagine you had sold your BRN stock a few days ago to make some extra money, planning to rebuy at a lower price :eek:. Who was it again who had to sell some of their stock to avoid an unhappy wife? :unsure:
I feel very sorry for your wife now 😂😂😂

 
Reactions: 4 users

GStocks123

Regular
Good to see us listed as partners on the Neurobus website. Looks like we may already be embedded in some of their products.

 
Reactions: 22 users

TECH

Regular
"Single neural processing engine
Minimal core for TENNS
"

I had assumed that the Akida NN would need at least 2 NPEs, but TENNS can run in a single NPE???!!!!

That is truly astonishing.

... and it does not need a microprocessor????!!!!!

https://www.epdtonthenet.net/articl...ion-for-Resource-Constrained-Deployments.aspx

This IP relies on the Akida2 event-based computing platform configuration engine as its foundation, meaning that the data quantities needing to be dealt with are kept to a minimum. Consequently only a small logic die area is required (0.18mm x 0.18mm on a 22nm semiconductor process with 50kBytes of SRAM memory incorporated), plus the associated power budget remains low (with <1mW operation being comfortably achieved). It can serve as either a standalone device (without requiring a microcontroller) or alternatively become a co-processor.

It's a self-contained package that sets the benchmark for low power.

Proving that we ARE the benchmark leaders at the "far edge"... PICO... Pico BrainChip. When you want first-mover advantage, stop procrastinating and sign up; our bus feels like it's warming its engines up again.

Fancy that, the company making a nice announcement on 1 October, my birthday, 66, clickety click... thanks for sharing the love back in the US.

Is it just me, or has the trading pattern changed somewhat?

Regards to all....Tech.
 
Reactions: 37 users

Tothemoon24

Top 20








BrainChip’s Akida NPU: Redefining AI Processing with Event-Based Architecture​



Embedded Staff
BrainChip has launched the Akida Pico, enabling the development of compact, ultra-low power, intelligent devices for applications in wearables, healthcare, IoT, defense, and wake-up systems, integrating AI into various sensor-based technologies. According to BrainChip, Akida Pico offers the lowest power standalone NPU core (less than 1mW), supports power islands for minimal standby power, and operates within an industry-standard development environment. Its very small logic die area and configurable data buffer and model parameter memory help optimize the overall die size.

AI era​

In today's sophisticated artificial intelligence (AI) era, building smart technology into consumer items is usually associated with cloud services, complicated infrastructure, and high expense. Computational power and energy efficiency are often in conflict in edge AI. Designed for deep learning workloads, traditional neural processing units (NPUs) require significant power, so they are less suited to always-on, ultra-low-power applications such as sensor monitoring, keyword detection, and other extreme edge AI uses. BrainChip is offering a fresh approach to this challenge.
BrainChip’s solution addresses one of the major challenges in edge AI: how to perform continuous AI processing without draining power. Traditional microcontroller-based AI solutions can manage low-power requirements but often lack the processing capability for complex AI tasks.


2014 saw the launch of BrainChip, which took its inspiration from Peter Van Der Made's work on neuromorphic computing concepts. Using spiking neural networks (SNNs), this technique replicates how the brain manages information, a fundamentally different approach from traditional convolutional neural networks (CNNs). BrainChip's SNN-based systems compute only when triggered by events rather than performing continuous calculations, which optimizes power efficiency.

In an interview with Embedded, Steve Brightfield, CMO at BrainChip, talked about how this new method will change the game for ultra-low-power AI apps, showing big steps forward in the field. Brightfield said that this new technology makes it possible for common things like drills, hand tools, and other consumer products to have smart features without costing a lot more. “Today, a battery with a built-in tester can show how healthy it is with a simple color code: green means it’s good, red means it needs to be replaced. Providing a similar indicator, AI in these products can tell you when parts are wearing out before they break. BrainChip’s low-power, low-maintenance AI works in the background without being noticed, so advanced tests can be used by anyone without needing to know a lot about them,” Brightfield said.

Traditional NPUs vs. Event-Based Computing​

Brightfield claimed that ordinary NPUs, including those with multiplier-accumulator arrays, run on fixed pipelines, processing every input whether or not it is useful. Particularly with sparse data, a typical occurrence in AI applications where most input values have little impact on the final outcome, this inefficiency often results in wasted calculations. By using an event-based computing architecture, BrainChip saves computational resources and electricity, activating calculations only when relevant data is present.

“Most NPUs keep calculating all data values, even for sparse data,” Brightfield remarked. “We schedule computations dynamically using our event-based architecture, so cutting out unnecessary processing.”

The Influence of Sparsity​

BrainChip's main benefit comes from exploiting the sparsity of both data and neural weights. Traditional NPU architectures can take advantage of weight sparsity at compile time, benefiting from model weight pruning, but they cannot dynamically schedule for data sparsity; they must process all of the inputs.

By processing data only when needed, BrainChip's SNN technology can drastically lower power usage depending on the degree of sparsity in the data. In audio-based edge applications such as gunshot recognition or keyword detection, for instance, BrainChip's Akida NPU could execute only when the sensor detects a significant signal, conserving energy in the absence of meaningful data.
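
As a rough numerical illustration of that sparsity argument (a toy sketch, not BrainChip's implementation), an event-driven layer only spends multiply-accumulates on the non-zero inputs, so the work scales with the number of events rather than the input size:

```python
# Toy illustration of the data-sparsity argument (not BrainChip's implementation):
# a dense layer does one multiply-accumulate per input per neuron, while an
# event-driven version only touches the non-zero ("event") inputs.
import numpy as np

rng = np.random.default_rng(0)
weights = rng.standard_normal((256, 64))     # 256 inputs -> 64 neurons
x = rng.standard_normal(256)
x[rng.random(256) < 0.9] = 0.0               # ~90% of inputs carry no event

dense_out = x @ weights                      # 256 * 64 MACs regardless of sparsity

events = np.flatnonzero(x)                   # indices of non-zero inputs only
event_out = np.zeros(64)
for i in events:                             # work proportional to number of events
    event_out += x[i] * weights[i]

print(np.allclose(dense_out, event_out))     # True: same result, far fewer MACs
print(f"MACs: dense={x.size * 64}, event-driven={events.size * 64}")
```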

Akida Pico Block Diagram (Source: Brainchip)

Introducing the Akida Pico: Ultra-Low Power NPU for Extreme Edge AI​

Designed around a spiking neural network (SNN) architecture, BrainChip's Akida Pico processor is built for event-based computing. Unlike conventional artificial intelligence models that demand constant processing capability, Akida runs only in response to particular events. This makes it well suited to always-on uses like anomaly detection or keyword identification, where power economy is vital. The latest innovation from BrainChip is built on the Akida2 event-based computing platform configuration engine and can run at less than a single milliwatt, making it suitable for battery-powered operation.

The Akida Pico is well suited to wearables, IoT devices, and industrial sensors, jobs that call for continual awareness without draining the battery. Operating in the microwatt to milliwatt power range, this NPU is among the most efficient available; it surpasses even microcontrollers in several artificial intelligence applications.

For some always-on artificial intelligence uses, “the Akida Pico can be lower power than microcontrollers,” Brightfield said. “Every microamp counts in extreme battery-powered use cases, depending on how long it is intended to perform.”

The Akida Pico can stay always-on without significantly affecting battery life, whereas microcontroller-based AI systems often require duty cycling, turning the CPU off and on in bursts to save power. For edge AI devices that must run constantly while keeping power consumption low, this advantage is vital.

BrainChip's MetaTF software flow allows developers to compile and optimize Temporal Event-based Neural Networks (TENNs) on the Akida Pico. Supporting models created with TensorFlow/Keras and PyTorch, MetaTF eliminates the need to learn a new machine learning framework, facilitating rapid AI application development for the edge.
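
A minimal sketch of that flow might look like the following; the conversion call (cnn2snn.convert) reflects one reading of the MetaTF tooling and, along with the omitted quantization step, should be checked against BrainChip's current documentation:

```python
# Sketch of the workflow the article describes: build and train a small Keras
# model, then convert it for the Akida runtime with MetaTF. The cnn2snn.convert
# call and akida_model.summary() are assumed from MetaTF's documented tooling;
# the quantization step normally required before conversion is omitted here.
import tensorflow as tf
from cnn2snn import convert   # part of BrainChip's MetaTF tooling (assumed API)

# An ordinary Keras model, e.g. a tiny keyword-spotting classifier.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(49, 10, 1)),        # e.g. MFCC frames
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(12, activation="softmax"),  # 12 keywords
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# ... train with model.fit(...) on your keyword dataset ...

# Convert the (quantized) Keras model into an Akida-executable model.
akida_model = convert(model)
akida_model.summary()
```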

Akida Pico die area versus process (mm2) (Source: Brainchip)

Standalone Operation Without a Microcontroller​

Another remarkable feature of the Akida Pico is its ability to function standalone, that is, without a host microcontroller to manage its tasks. Where a typical NPU relies on a microcontroller to start, regulate, and halt its operations, the Akida Pico includes an integrated micro-sequencer that manages the full neural network execution on its own. This architecture reduces total system complexity, latency, and power consumption.
For applications needing a microcontroller, the Akida Pico is a rather useful co-processor for offloading AI tasks and lowering power requirements. From battery-powered wearables to industrial monitoring tools, this flexibility appeals to a wide range of edge artificial intelligence applications.

Targeting Key Edge AI Applications​

The ultra-low power characteristics of the Akida Pico benefit medical devices that need continuous monitoring, such as glucose sensors or wearable heart rate monitors.

Likewise, speech recognition tasks such as voice-activated assistants or security systems listening for keywords are good candidates for this technology. Edge artificial intelligence's toughest obstacle is balancing compute requirements against power consumption. In markets where battery life is crucial, the Akida Pico can scale performance while staying within limited power budgets.


One of the most notable uses of BrainChip's artificial intelligence, according to Brightfield, is anomaly detection for motors and other mechanical systems. Traditional methods monitor and diagnose equipment health using cloud-based infrastructure and edge servers, which are both costly and power-intensive. BrainChip flips this concept on its head by embedding artificial intelligence directly within the motor or device.

BrainChip's ultra-efficient Akida Neural Processing Unit (NPU), for example, may continually examine vibration data from a motor. Should an abnormality such as an odd vibration be found, the system sets off a basic alert, akin to turning on an LED. Rather than depending on remote servers or sophisticated diagnostic sites, this "dumb and simple" option warns maintenance staff that the motor needs attention, without internet access or a detailed examination.

"In the field, a maintenance technician could only glance at the motor," Brightfield said. "They know it's time to replace the motor before it fails if they spot a red light." This method eliminates the need for costly software upgrades or cloud access, benefiting equipment in remote areas where connectivity may be restricted.
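
A minimal sketch of that "red light" pattern, with placeholder thresholds and a stand-in for the GPIO write (this is the generic idea, not BrainChip's method), could look like this:

```python
# Minimal sketch of the "red light" anomaly pattern described above: learn a
# baseline for a vibration feature (here RMS amplitude), then latch an alert
# when readings drift well outside it. Thresholds, window sizes and set_led()
# are placeholders, not BrainChip's method.
from collections import deque
import math

def rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def set_led(red: bool):
    print("LED:", "RED" if red else "green")   # stand-in for a GPIO write

baseline = deque(maxlen=1000)   # rolling history of "healthy" readings
alert = False

def process_window(samples):
    """Call with each new window of accelerometer samples."""
    global alert
    level = rms(samples)
    if len(baseline) > 100:                      # enough history to judge
        mean = sum(baseline) / len(baseline)
        var = sum((b - mean) ** 2 for b in baseline) / len(baseline)
        if level > mean + 4 * math.sqrt(var):    # crude anomaly rule
            alert = True                         # latch until serviced
    if not alert:
        baseline.append(level)                   # only learn from normal data
    set_led(alert)
```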

For keyword detection, BrainChip has built artificial intelligence right into the device. According to Brightfield, using raw audio data and modern algorithms, the Akida Pico delivers 4-5% better accuracy than historical methods while using just under 2 milliwatts of power. Temporal Event-based Neural Networks (TENNs), a novel architecture built from state space models that permits high-quality performance without the need for power-hungry microcontrollers, enable this achievement.
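
For readers unfamiliar with state space models, the basic recurrence they are built on can be sketched in a few lines; this is a generic linear SSM illustration with arbitrary matrices, not BrainChip's TENN architecture:

```python
# Generic linear state-space recurrence of the kind TENNs are said to build on
# (an illustration of the math, not BrainChip's architecture):
#   x[t+1] = A @ x[t] + B @ u[t]
#   y[t]   = C @ x[t]
# The model processes one input sample at a time with a fixed-size state, which
# is what makes streaming, low-memory temporal processing possible.
import numpy as np

rng = np.random.default_rng(1)
state_dim, in_dim, out_dim = 8, 1, 1
A = 0.9 * np.eye(state_dim) + 0.01 * rng.standard_normal((state_dim, state_dim))
B = rng.standard_normal((state_dim, in_dim))
C = rng.standard_normal((out_dim, state_dim))

x = np.zeros(state_dim)                 # hidden state carried across time steps
for t in range(100):
    u = np.array([np.sin(0.1 * t)])     # one new input sample per step
    y = C @ x                           # output depends only on the current state
    x = A @ x + B @ u                   # constant-memory state update
    if t % 25 == 0:
        print(f"t={t:3d}  y={y[0]:+.3f}")
```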

As demand for edge AI grows, BrainChip’s advancements in neuromorphic computing and event-based processing are poised to contribute significantly to the development of ultra-efficient, always-on AI systems, providing flexible solutions for various applications.
 
Reactions: 50 users

FJ-215

Regular
"Single neural processing engine
Minimal core for TENNS
"

I had assumed that the Akida NN would need at least 2 NPEs, but TENNS can run in a single NPE???!!!!

That is truly astonishing.

... and it does not need a microprocessor????!!!!!

https://www.epdtonthenet.net/articl...ion-for-Resource-Constrained-Deployments.aspx

This IP relies on the Akida2 event-based computing platform configuration engine as its foundation, meaning that the data quantities needing to be dealt with are kept to a minimum. Consequently only a small logic die area is required (0.18mm x 0.18mm on a 22nm semiconductor process with 50kBytes of SRAM memory incorporated), plus the associated power budget remains low (with <1mW operation being comfortably achieved). It can serve as either a standalone device (without requiring a microcontroller) or alternatively become a co-processor.

It's a self-contained package that sets the benchmark for low power.
Hi Dio,

A little left field, but can you see a way for the JAST learning rules to be implemented into TENNs? (orthogonal polynomials)

From memory the JAST rules only took up something like 64K lines of code.

Way, way, way outside of my pay grade.
 
Reactions: 7 users