BRN Discussion Ongoing

Gearitup

Member

Brainchip ..... GOLD Sponsors!​

Sessions 22nd -- 24th May​


Check out the incredible sessions for the 2023 Summit!​

.........Tuesday May 23rd........


  • Nandan Nayampally

    4:50pm - 5:20pm

    Enabling Ultra-Low Power Edge Inference and On-Device Learning with Akida

    By Nandan Nayampally, Chief Marketing Officer, BrainChip
Gold Sponsors are 4th on the list....
 
  • Like
Reactions: 7 users

Unfortunately no. Xperi has a competing NCU chip (Perceive Ergo) that's not exactly truly neuromorphic in architecture (it uses MAC functions, at least in the first version, and there's no info on the second version to suggest a change). @Diogenese I believe had a look at Xperi about 2 years ago on that other place where the grass is browner.
 
Last edited:
  • Like
  • Haha
Reactions: 12 users

cosors

👀
I didn't know that there are now also German articles about us. It's already a month old, but we like Akida here!



"These are the top articles of the month on all-electronics​

Which articles on all-electronics.de have the readers been most interested in over the past month? We have put together the top 10 for you.


The top articles from March 2023​

10th place: Why the EU Machinery Regulation now belongs on the agenda

9th place: How Mathworks intends to systematize development

8th place: ZVEI guest comment: Bidirectional charging – AC or DC?

7th place: Pi Day: Everything there is to know about the circle constant Pi

6th place: Tesla's plans with rare earths and SiC - against the trend

5th place: Highlights and impressions from Embedded World 2023

4th place: "For me, sustainability is THE economic factor of the future"

3rd place: Why the Extremely Large Telescope is also a triumph of automation

2nd place: Fraunhofer technology makes quantum computers suitable for industry




and finally ;)


1st place: What Brainchip's 2nd-generation Akida platform delivers
The manufacturer of digital neuromorphic AI IP announces the second generation of its Akida platform. This should enable efficient and intelligent edge devices for AIoT (Artificial Intelligence of Things). Here is a look at the technologies that make this possible.
Vision Transformer acceleration comes to the second Akida generation (Image: Brainchip)"
See the full article below

https://www.all-electronics.de/markt/das-sind-die-top-artikel-des-monats-auf-all-electronics-812.html

________________________
And because it's so good, here it is again in full:

translated from German:

"AIoT: Vision Transformer and Spatial-Temporal Convolution​

This is what Brainchip's 2nd generation Akida platform does​

The manufacturer of digital neuromorphic AI IP announces the second generation of its Akida platform. This should enable efficient and intelligent edge devices for AIoT (Artificial Intelligence of Things). Here is a look at the technologies that make this possible.

The second generation of Akida now includes Temporal Event Based Neural Nets (TENN) Spatial-Temporal Convolutions for processing continuous-time raw data streams. (Image: Brainchip)

The very efficient yet powerful neural processing system developed by Brainchip for embedded edge AI applications now offers 8-bit processing along with features such as time-domain convolutions and Vision Transformer acceleration. This enables high levels of performance in sub-watt devices, taking them from perception to cognition.
The second generation of Akida now includes Temporal Event-Based Neural Nets (TENNs) with spatial-temporal convolutions that allow the processing of continuous-time raw data streams for applications such as video analysis, target tracking, audio classification, analysis of MRI and CT scans for prediction of vital signs, and time-series analysis for forecasting and predictive maintenance. These capabilities are in high demand in industrial, automotive, digital healthcare, smart home and smart city settings. TENNs allow for significantly simpler implementations by taking the raw data directly from the sensors, drastically reducing model size and the number of operations performed while maintaining very high accuracy.
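For readers wondering what a "spatial-temporal convolution" over a raw sensor stream looks like in practice, here is a minimal conceptual sketch in Keras. To be clear, this is not BrainChip's TENN implementation (which is event-based and proprietary); it only illustrates the general idea of factoring the convolution into a temporal pass and a spatial pass over a (time, height, width, channels) stream.

```python
# Conceptual sketch only - NOT BrainChip's TENN implementation, which is
# proprietary and event-based. It just illustrates a factored spatial-temporal
# convolution: one convolution along the time axis, one across the spatial
# axes, applied to a raw (time, H, W, channels) stream.
import tensorflow as tf
from tensorflow.keras import layers, Model

def spatio_temporal_block(x, filters, t_kernel=5, s_kernel=3):
    # Temporal convolution: the kernel spans time only (t_kernel x 1 x 1).
    x = layers.Conv3D(filters, (t_kernel, 1, 1), padding="same", activation="relu")(x)
    # Spatial convolution: the kernel spans height/width only (1 x k x k).
    x = layers.Conv3D(filters, (1, s_kernel, s_kernel), padding="same", activation="relu")(x)
    return x

# Example: classify short clips of raw 64x64 sensor frames, 16 time steps.
inputs = layers.Input(shape=(16, 64, 64, 1))          # (time, H, W, channels)
x = spatio_temporal_block(inputs, filters=16)
x = spatio_temporal_block(x, filters=32)
x = layers.GlobalAveragePooling3D()(x)
outputs = layers.Dense(10, activation="softmax")(x)   # 10 hypothetical classes
model = Model(inputs, outputs)
model.summary()
```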


The Akida platform at a glance: From the Akida-E for maximum efficiency, through the Akida-S (sensor balanced) to the Akida-P with maximum performance. (Image: Brainchip)


Due to the high demand from the automotive, edge vision and factory automation sectors, Brainchip has started an early access program for the neural SoC Akida.

Neural SoC Akida: Brainchip starts early access program

With the early access program for the neural SoC Akida, Brainchip responded to the high demand for development systems after announcing the start of production for autumn 2020. You can find out more about the early access program here.

Vision Transformer acceleration comes to the second Akida generation​

Another addition to the second Akida generation is acceleration for Vision Transformers (ViT), a class of neural networks that has proven extremely powerful in various computer vision tasks such as image classification, object recognition, and semantic segmentation. This acceleration, combined with Akida's ability to process multiple layers simultaneously and hardware support for skip connections, allows complex networks like ResNet-50 to be executed entirely in the neural processor itself without CPU intervention, minimizing system load.
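As a reminder of what "skip connections" means in this context: residual networks such as ResNet-50 add a block's input back onto its output, and it is this structure that the article says Akida can now handle in hardware. A generic Keras residual block (not an Akida-specific construct) looks like this:

```python
# Generic residual ("skip connection") block in Keras, of the kind used in
# ResNet-50. Shown only to illustrate the structure the article refers to;
# it assumes x already has `filters` channels so the Add() shapes match.
from tensorflow.keras import layers

def residual_block(x, filters, kernel_size=3):
    shortcut = x                                     # the skip connection
    y = layers.Conv2D(filters, kernel_size, padding="same")(x)
    y = layers.BatchNormalization()(y)
    y = layers.ReLU()(y)
    y = layers.Conv2D(filters, kernel_size, padding="same")(y)
    y = layers.BatchNormalization()(y)
    y = layers.Add()([shortcut, y])                  # input added back onto output
    return layers.ReLU()(y)
```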


The Vision Transformer technology is particularly powerful for tasks such as image classification and object recognition. (Image: Brainchip)



The Akida IP platform has the ability to learn on the device, enabling continuous improvement and dataless customization that enhances security and privacy. This, combined with the available efficiency and performance, allows for highly differentiated solutions that were previously not possible. These include secure small form factor devices such as hearing aids and wearables that process raw audio data, as well as medical devices that monitor heart and breathing rates and other vital signs while consuming only 1 µW of power. This scales up to HD-resolution vision solutions delivered in high-quality battery-powered or fanless devices.
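To make "learning on the device" more concrete, the common pattern is to keep a pre-trained feature extractor frozen and adapt only a small final layer on the handful of samples captured locally. The sketch below shows that pattern in plain Keras as a conceptual illustration only; Akida's actual edge learning happens in its neuromorphic fabric and is exposed through BrainChip's own tooling, not this API.

```python
# Conceptual illustration of on-device (edge) learning: freeze a pre-trained
# feature extractor and adapt only the final classification head on a few
# locally captured samples. Plain Keras, NOT Akida's native edge-learning API.
import tensorflow as tf
from tensorflow.keras import layers, Model

base = tf.keras.applications.MobileNetV2(input_shape=(96, 96, 3),
                                          include_top=False, weights="imagenet")
base.trainable = False                      # nothing upstream is retrained on-device

inputs = layers.Input(shape=(96, 96, 3))
features = layers.GlobalAveragePooling2D()(base(inputs, training=False))
outputs = layers.Dense(4, activation="softmax")(features)   # 4 device-specific classes
model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# 'x_local' and 'y_local' stand in for the handful of samples captured on-device.
# model.fit(x_local, y_local, epochs=3, batch_size=8)
```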



TENNs allow for significantly easier implementation by retrieving raw data directly from the sensors, reducing model size and operations performed. (Image: Brainchip)



SNN Neuromorphic System-on-Chip Akida as of 2019

Brainchip Holdings Ltd., which specializes in neuromorphic computing, was the first company to bring a spiking neural network (SNN) architecture to the mass market: the Akida neuromorphic system-on-chip (NSoC) device, initially launched as an FPGA solution. Read the background to the presentation of the first Akida generation here.

Brainchip CEO Sean Hehir on the second generation Akida​

"Our customers wanted us to enable advanced predictive intelligence, target tracking, object detection, scene segmentation, and advanced image processing capabilities. This new generation of Akida enables designers and developers to do things previously not possible in a low-power edge device "By inferring and learning from raw sensor data without the need for pre-processing of digital signals, we're taking a big step towards a cloudless edge AI experience."

Akida's software and tooling​

Akida's software and tooling further simplify the development and delivery of solutions and services with these features:
  • An efficient runtime engine that manages model acceleration autonomously and completely transparently to the developer.
  • MetaTF software that developers can use with their preferred framework, such as TensorFlow/Keras, or their development platform, such as Edge Impulse, to easily build, fine-tune, and deploy AI solutions (a minimal workflow sketch follows this list).
  • Support for all types of Convolutional Neural Networks (CNN), Deep Learning Networks (DNN), Vision Transformer networks (ViT), as well as Spiking Neural Networks (SNN), making it future-proof as models become more advanced.
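As an illustration of the MetaTF flow mentioned above (build in Keras, quantize, convert for Akida), here is a minimal sketch. The cnn2snn function names and signatures are an assumption based on BrainChip's public documentation and may differ between MetaTF releases, so treat this as a rough outline rather than a verified recipe.

```python
# Hedged sketch of the MetaTF flow: train in Keras, quantize, then convert for
# the Akida runtime. Function names/signatures below are ASSUMPTIONS based on
# BrainChip's public cnn2snn docs and may differ between MetaTF releases.
import tensorflow as tf
from cnn2snn import quantize, convert     # assumed MetaTF entry points

# 1. Build/train any compatible Keras model (a tiny CNN here for brevity).
keras_model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(32, 32, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# 2. Quantize weights/activations to low bit-widths for the neuromorphic fabric.
quantized = quantize(keras_model, weight_quantization=4, activ_quantization=4)

# 3. Convert to an Akida model that can run on hardware or in simulation.
akida_model = convert(quantized)
akida_model.summary()
```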
Akida has a model zoo and a growing ecosystem of software, tool and model providers as well as IP, SoC, foundry and system-integrator partners. BrainChip is working with early adopters on the second-generation IP platform; general availability is expected in Q3 2023.

The author: Dr.-Ing. Nicole Ahner​

Her enthusiasm for physics and materials development led her to find her true calling during her electrical engineering studies, which she then placed at the center of her professional work: microelectronics and semiconductor manufacturing. After years in semiconductor research, she now researches and writes with deep specialist knowledge about electronic components. Her special interests are wide-bandgap semiconductors, batteries, the technologies behind electromobility, materials research and electronics in space.
More articles from Nicole Ahner"
https://www.all-electronics.de/elektronik-entwicklung/das-leistet-die-akida-plattform-der-2-generation-von-brainchip-526.html
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 83 users

Diogenese

Top 20
  • Haha
  • Like
Reactions: 18 users

Sirod69

bavarian girl ;-)
I'm also curious about what's going on at our partner Teksun

Teksun Inc


Teksun Inc's cognitive services use AI and ML to solve complex problems, improve decision-making, and automate processes.

Our services include natural language processing, computer vision, predictive analytics, and chatbots. With our advanced tools, you can transform your data into insights and drive business growth.

Contact us to power your business with AI-driven insights today!
 
  • Like
  • Fire
  • Love
Reactions: 21 users

Sirod69

bavarian girl ;-)
  • Haha
  • Like
  • Sad
Reactions: 14 users

Frangipani

Top 20
Hi Labsy....very pleased for you being in your 40's and holding Brainchip stock...such a great decision you have made!

Interesting how "Disney" got a mention...they are working with robotics in the AI space, also video streaming etc...such a good fit with the release of AKD 2.0....Were Sean and Geoff talking about our engagements in private prior to the podcast, meaning Geoff's subconscious mind was coughing up the words Mercedes, Disney and Tesla? :ROFLMAO::ROFLMAO::ROFLMAO: Purely speculative of course.

I am truly "hoping" to hear that another 2 companies have signed an IP License by year's end...which 2, any 2!!

Tech ;)

Look what I’ve found…

“DisneyResearch|Studios in Zurich, Switzerland, focuses on exploring the scientific frontiers in a variety of domains in service to the technical and creative filmmaking process. Our world-class research talent in visual computing, machine learning, and artificial intelligence shapes early-stage ideas into technological innovations that revolutionize the way we produce movies and create media content (…)
To complement our considerable research talent, DisneyResearch|Studios maintains a close academic partnership with ETH Zürich—supporting joint research programs and PhD students—but also collaborates with the best of academia and industry from all over the world.”

So yes, maybe there is indeed more to Disney than just a fleeting mention of the name as a random example of a world-class company during the podcast. I wonder if Geoffrey knows Moore than we do. 😄

Mind you, this is mere speculation, I didn’t find any direct links to Brainchip.
And keep in mind that Zürich is also home to the renowned Institute of Neuroinformatics (INI), which was established at the University of Zürich and ETH Zürich at the end of 1995. With its focus on neuromorphic engineering, INI has been a fertile breeding ground for spin-offs such as IniLabs, iniVation (the company that has been collaborating with WSU's International Centre for Neuromorphic Systems and with the RAAF & UNSW Canberra Space and the US Air Force, respectively, on neuromorphic event cameras in space - let's keep our fingers crossed that Brainchip will be involved in Falcon Neuro's follow-on experiment Falcon ODIN, planned for later this year) and SynSense.
Then again, Disney could of course be collaborating with several competing companies to find out which ones best suit their needs.

To quote Disney’s Aladdin:
🎼A whole new world
A new fantastic point of view
No one to tell us “no“, or where to go
Or say we‘re only dreaming…”




Welcome to DisneyResearch|Studios!​

For over 12 years Disney Research has been at the forefront of technological innovation, pushing the boundaries of what is possible to help the Walt Disney Company differentiate its entertainment products, services, and content.
Our mission is rooted in the long Disney tradition of inventing new technologies to contribute to the magic of the stories we tell and the characters we all love.

Markus Gross​

Chief Scientist
DisneyResearch|Studios in Zurich, Switzerland, focuses on exploring the scientific frontiers in a variety of domains in service to the technical and creative filmmaking process. Our world-class research talent in visual computing, machine learning, and artificial intelligence shapes early-stage ideas into technological innovations that revolutionize the way we produce movies and create media content.
Our inventions are used in almost every Disney feature film production and have dazzled hundreds of millions of people in audiences worldwide. DisneyResearch|Studios is part of a wider innovation ecosystem operating in close partnership with our technology units at the Walt Disney Animation Studios, Pixar Animation Studios, Lucasfilm/ILM, Marvel Studios, and Walt Disney Pictures.
To complement our considerable research talent, DisneyResearch|Studios maintains a close academic partnership with ETH Zürich—supporting joint research programs and PhD students—but also collaborates with the best of academia and industry from all over the world. With such a strong academic and creative grounding, DisneyResearch|Studios is a lab like no other—with the unique mission of bringing the magic to life.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 39 users

goodvibes

Regular
Prophesee…sounds like Akida inside…

We’re headed to #AutomateShow in Detroit 22-25 May!
Secure a meeting with our experts at booth #3645 to discuss how Event-Based vision is impacting the future of industrial automation.

We’ll be showing demos for various applications in Industry 4.0 including ultra high-speed counting, vibration monitoring and frequency analysis for predictive maintenance, batch homogeneity & gauging and more.

Book your meeting today 👉 https://lnkd.in/dpV8eWhA

A3 - Association for Advancing Automation
#Automate2023 #machinevision #industrialautomation #industry4_0

 
  • Like
  • Fire
  • Love
Reactions: 17 users

goodvibes

Regular
Does anyone know about the EU project REBECCA?


Our MISSION​

The mission of REBECCA is to develop efficient and secure edge-AI systems using open CPU architecture, to enhance European strategic autonomy and sovereignty.

Bonseyes is one of 24 partners…Bonseyes linked to Nviso…


 
  • Like
Reactions: 5 users

Sirod69

bavarian girl ;-)

A LOOK AT THE TOP HOLDERS OF BRAINCHIP SHARES​

(An impressive list of big funds that are taking Brainchip seriously and are invested. For example, Citicorp owns nearly 1 in 10 shares of Brainchip, and Merrill Lynch (Australia) 1 in 20 shares. Personally, I find this reassuring that we are on the right track. The professional big-boy investors have their hat in the ring with us, believe we are on to something, and want in too.)
According to the company, BrainChip’s top 20 shareholders are as follows:
  1. Citicorp, with 9.15% of all outstanding shares
  2. Mr Peter Adrien van der Made, with 8.87%
  3. Merrill Lynch, with 4.88%
  4. BNP Paribas, with 4.75%
  5. HSBC, with 4.44%
  6. JPMorgan, with 2.82%
  7. BNP Paribas (DRP), with 2.53%
  8. HSBC (customer accounts), with 1.17%
  9. National Nominees, with 0.67%
  10. LDA Capital, with 0.52%
  11. BNP Paribas (Retail Clients), with 0.47%
  12. Mrs Rebecca Ossieran-Moisson, with 0.45%
  13. Crossfield Intech (Liebskind Family), with 0.4%
  14. Certane CT Pty Ltd (BrainChip’s unallocated long-term incentive plan), with 0.4%
  15. Mr Paul Glendon Hunter, with 0.35%
  16. Certane CT Pty Ltd (BrainChip’s allocated long-term incentive plan), with 0.35%
  17. Mr Louis Dinardo, with 0.34%
  18. Mr Jeffrey Brian Wilton, with 0.31%
  19. Mr David James Evans, with 0.31%
  20. Superhero Securities (Client Accounts), with 0.3%

 
  • Like
  • Fire
Reactions: 13 users

Sirod69

bavarian girl ;-)
Exciting news from Plumerai! 🔥 You can now test Plumerai’s People Detection right here in your browser. 😱 Click below and witness the accuracy of our tiny AI model running with your webcam. Rest assured, your privacy is protected. We do not capture any images and everything stays on your PC. How does it work? Normally we compile our models for CPUs and NPUs, but here we’ve compiled them for WebAssembly, which runs in the browser. Give it a try and see for yourself what kind of accuracy we can achieve with our tiny models! 🚀

Plumerai People Detection will run locally in your browser, with no involvement from the cloud. This is how we preserve your privacy.
Your videos or images are not transmitted, not stored, and not shared with Plumerai. For full details, see our privacy policy.

This exact same AI model runs on tiny chips.​

And that’s why we can deploy AI in devices where others can’t.
We are running an extremely tiny AI model in your browser. There's no involvement from the cloud, so your privacy is preserved. It's so small and so efficient that we can run the exact same AI model on tiny and low-cost chips. That's how we enable our customers to run Plumerai People Detection on nearly any device, while providing the same highly accurate detections that you are seeing here in your browser.

SINGLE-CORE ARM CORTEX-A72 @ 1.5 GHz: 29 frames/s

WITH A TINY FOOTPRINT: 2.3 MB
Plumerai People Detection runs on Arm Cortex-A, x86, and RISC-V CPUs and on $1 Arm Cortex-M and ESP32-S3 microcontrollers. It can also easily be adapted to leverage AI accelerators.
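Plumerai's toolchain is proprietary, but the general recipe for getting a detection model down to a few megabytes (or less) for microcontroller-class chips is well known: post-training integer quantization plus a lean runtime. Purely as a generic illustration, not Plumerai's actual pipeline, this is what the standard TensorFlow Lite route looks like:

```python
# Generic illustration of shrinking a Keras model for MCU-class deployment via
# post-training int8 quantization. This is NOT Plumerai's proprietary pipeline,
# just the standard TensorFlow Lite route to a small on-device model file.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation="relu", input_shape=(96, 96, 1)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # person / no-person
])

def representative_data():
    # A handful of representative inputs drives the int8 calibration.
    for _ in range(100):
        yield [np.random.rand(1, 96, 96, 1).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
print(f"int8 model size: {len(tflite_model) / 1024:.1f} KiB")
```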


 
  • Like
  • Love
  • Fire
Reactions: 36 users

TheDon

Regular
BRN is making noise and it's getting louder and louder!
 
  • Like
Reactions: 22 users
  • Like
  • Fire
Reactions: 12 users

MrRomper

Regular
Unfortunately no. Xperi has a competing NCU chip (Perceive Ergo) that's not exactly truly neuromorphic in architecture (it uses MAC functions, at least in the first version, and there's no info on the second version to suggest a change). @Diogenese I believe had a look at Xperi about 2 years ago on that other place where the grass is browner.
Yes, I understand exactly what you are saying in regards to Xperi.
I was essentially referencing part of the post where it states 'powered by PROPHESEE Event-Based Metavision® sensor.'

When looking further into Event-Based Metavision you get the following:
One for Akida.
https://www.linkedin.com/feed/updat...date:(V2,urn:li:activity:6944573360253059072)
For balance. One for Snapdragon (in smartphones)
https://www.linkedin.com/feed/updat...date:(V2,urn:li:activity:7036327512653594624)

Does it have Akida? Ultimately, until it is definitively answered, it's still speculation.
 
  • Like
Reactions: 2 users
  • Like
  • Fire
Reactions: 16 users

RobjHunt

Regular
Would it be too far from the realm of possibility that Elon hasn't yet released his fantastical Pi Phone because he's in cahoots with our little nipper (Akida version??), wanting it to do the things that he envisages it to do?

Now my own speculation is getting me excited possums ;)

Pantene Peeps!
 
  • Like
  • Fire
Reactions: 15 users

RobjHunt

Regular
Would it be too far from the realm of possibility that Elon hasn't yet released his fantastical Pi Phone because he's in cahoots with our little nipper (Akida version??), wanting it to do the things that he envisages it to do?

Now my own speculation is getting me excited possums ;)

Pantene Peeps!
With no disrespect to Barry, Dame Edna or Les. God rest their wonderful souls!
 
Last edited:
  • Like
  • Haha
  • Love
Reactions: 13 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I love this part @Tothemoon24!

According to Arm, more than 90% of in-vehicle infotainment (IVI) systems use the company’s chip designs. The architectures are also found in various under-the-hood applications, including meter clusters, e-mirrors, and heating, ventilation, and air conditioning (HVAC) control.

If I were Arm, I would be incorporating AKIDA not just into the Cortex-M based MCUs but into the A- and R-based processors as well, just to cover all bases.🏏


https://armkeil.blob.core.windows.n...ide-to-arm-processing-power-in-automotive.pdf


Continuing on from the above ramblings: if Arm were to incorporate AKIDA 1500 in all of its Cortex-M based MCUs, it would tie in nicely with Renesas' plans for the 22nm RA family, which is being sampled right now with select customers, with general availability planned towards the end of the year. That seems to marry up nicely with the tape-out timing of the GlobalFoundries 22nm AKIDA 1500.

We know AKIDA is compatible with all of Arm's product families, so it wouldn't make sense to incorporate it only with the Cortex-M85, would it?

Why stop there?

IMO.


Renesas Makes the Jump to 22nm with a New RA-Class MCU with Software-Defined Radio, Sampling Now​

Offering Bluetooth 5.3 Low Energy (BLE) at launch, this cutting-edge Arm Cortex-M33 microcontroller can be upgraded for future releases.


Gareth Halfacree • 22 days ago • HW101 / Internet of Things / Communication

Renesas Electronics has announced sampling of its first microcontroller to be built on a 22nm semiconductor process node — an RA-family 32-bit Arm Cortex-M33-based chip with Bluetooth 5.3 Low Energy (BLE) provided via an on-board software-defined radio (SDR).
"Renesas' MCU [Microcontroller Unit] leadership is based on a wide array of products and manufacturing process technologies," boasts Renesas' Roger Wendelken of the sampling. "We are pleased to announce the first 22nm product development in the RA MCU family which will pave the way for next generation devices that will help customers to future proof their design while ensuring long term availability. We are committed to providing the best performance, ease-of-use, and the latest features on the market. This advancement is only the beginning."
Renesas has announced a new RA-class microcontroller with SDR-powered Bluetooth 5.3 Low Energy (BLE) support, built on a 22nm process node. (📷: Renesas)

Modern semiconductor manufacturing processes are measured, after a fashion, in nanometers — once the size of a given feature, then the smallest gap between features, and now a somewhat hand-wavy way of differentiating a next-generation process node from a previous one. While bleeding-edge high-frequency application class processors, like those from Intel or AMD, are now playing with single-digit nanometer process nodes, traditionally microcontrollers — needing to pack in far fewer transistors than high-performance application processors — have stuck with proven, and more affordable, double- or triple-digit process nodes.
That's key to why Renesas' announcement of a part built on a 22nm process node, a node which Intel began using back in 2012 for its Ivy Bridge family of chips before moving to 14nm for Broadwell in 2014, is notable: for microcontrollers, 22nm is an advanced node indeed. It allows the company to pack more components into a given area, and Renesas has taken full advantage of that extra capacity by fitting the chip with a software-defined radio (SDR) — powering Bluetooth 5.3 Low Energy (BLE) connectivity with direction-finding and low-power audio capabilities at launch, but upgradeable post-release to support new radio protocols and standards as-required.
The new microcontroller enters the RA family, alongside the recently-launched entry-line RA4E2. (📷: Renesas)

The shift to a 22nm node will also bring with it an overall reduction in part size and gains in efficiency which can be exploited as either increased performance for the same power draw or a lower power draw for the same performance — or a balanced combination of the two. Renesas has not, however, yet shared full specifications for the part, including frequency and power requirements.
Renesas is now sampling the 22nm RA-family chips to "select customers," with plans for general availability towards the end of the year. Parties interested in requesting a sample should contact their local sales office for more details.

 
  • Like
  • Love
  • Fire
Reactions: 38 users

Diogenese

Top 20
With no disrespect to Barry, Dame Edna or Les. God rest their wonderful souls!
... and sadly, shares in Gladioli Growers Pty Ltd may never recover.
 
  • Haha
  • Sad
  • Like
Reactions: 8 users