BRN Discussion Ongoing

Diogenese

Top 20
This BRN document compares Akida 1 with Nvidia and Google, as well as STM, Renesas and Silicon Labs.

https://brainchip.com/wp-content/uploads/2023/01/BrainChip_Benchmarking-Edge-AI-Inference-1.pdf

[Benchmark comparison charts from the linked PDF]



Looking forward to seeing how TENNs stacks up in the Akida 2 version.
 
  • Like
  • Love
  • Fire
Reactions: 70 users

Mt09

Regular
Show me how it's done oh great one, or should I take notes from all your high-quality offerings? Rhetorical question.

How about you contribute something before criticising others that try?

In fact, let's revisit my aim in 18 months; my arrows are still in the air for the most part.
Can’t take a joke?
 
  • Like
Reactions: 1 users

IloveLamp

Top 20
  • Haha
  • Like
Reactions: 2 users
Latest GitHub info & updates from early Sept. Lots of nice 2.0 stuff :) I've added a couple of rough usage sketches of my own under the relevant sections below.


Sep 5 - ktsiknos-brainchip - tag 2.4.0-doc-1 - commit cb33735 (Latest release)
Upgrade to QuantizeML 0.5.3, Akida/CNN2SNN 2.4.0 and Akida models 1.2.0


Update QuantizeML to version 0.5.3

  • "quantize" (both method and CLI) will now also perform calibration and cross-layer equalization
  • Changed default quantization scheme to 8 bits (from 4) for both weights and activations
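To picture what the QuantizeML changes above mean in practice, here's a rough sketch of a quantization call with the new defaults. This is my own toy example based on the notes and the docs; the QuantizationParams name and the samples argument are how I understand the API, so treat it as illustrative rather than verified:

import numpy as np
from tensorflow import keras
from quantizeml.models import quantize, QuantizationParams

# A small float Keras model standing in for a real one.
float_model = keras.Sequential([
    keras.Input(shape=(32, 32, 3)),
    keras.layers.Conv2D(8, 3, activation="relu"),
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(10),
])

# A handful of representative inputs for the built-in calibration step.
calibration_samples = np.random.rand(64, 32, 32, 3).astype("float32")

# 8-bit weights and activations are now the default scheme (previously 4-bit).
qparams = QuantizationParams(weight_bits=8, activation_bits=8)

# quantize() now also performs calibration and cross-layer equalization in one call.
quantized_model = quantize(float_model, qparams=qparams, samples=calibration_samples)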

Update Akida and CNN2SNN to version 2.4.0

New features

  • [Akida] Updated compatibility with python 3.11, dropped support for python 3.7
  • [Akida] Support for unbounded ReLU activation by default
  • [Akida] C++ helper added on CLI to allow testing Akida engine from a host PC
  • [Akida] Prevent user from mixing V1 and V2 Layers
  • [Akida] Add fixtures for the DepthwiseConv2D
  • [Akida] Add AKD1500 virtual device
  • [Akida] Default buffer_bitwidth for all layers is now 32.
  • [Akida] InputConv2D parameters and Stem convolution parameters take the same parameters
  • [Akida] Estimated bit width of variables added to json serialised model
  • [Akida] Added Akida 1500 PCIe driver support
  • [Akida] Shifts are now uint8 instead of uint4
  • [Akida] Bias variables are now int8
  • [Akida] Support of Vision Transformer inference
  • [Akida] Model.predict now supports Akida 2.0 models
  • [Akida] Add an Akida 2.0 ExtractToken layer
  • [Akida] Add an Akida 2.0 Conv2D layer
  • [Akida] Add an Akida 2.0 Dense1D layer
  • [Akida] Add an Akida 2.0 DepthwiseConv2D layer
  • [Akida] Add an Akida 2.0 DepthwiseConv2DTranspose layer
  • [Akida] Add an Akida Dequantizer layer
  • [Akida] Support the conversion of QuantizeML CNN models into Akida 1.0 models
  • [Akida] Support the conversion of QuantizeML CNN models into Akida 2.0 models
  • [Akida] Support Dequantizer and Softmax on conversion of a QuantizeML model
  • [Akida] Model metrics now include configuration clocks
  • [Akida] Pretty-print serialized JSON model
  • [Akida] Include AKD1000 tests when deploying engine
  • [Akida/infra] Add first official AKD1500 PCIe driver support
  • [CNN2SNN] Updated dependency to QuantizeML 0.5.0
  • [CNN2SNN] Updated compatibility with tensorflow 2.12
  • [CNN2SNN] Provide a better solution to match the block pattern with the right conversion function
  • [CNN2SNN] Implement DenseBlockConverterVX
  • [CNN2SNN] GAP output quantizer can be signed
  • [CNN2SNN] removed input_is_image from convert API, now deduced by input channels
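To make the conversion-related bullets above a bit more concrete, this is roughly the flow as I understand it now that input_is_image has been dropped from convert(). Again just a sketch; whether this toy model satisfies all of the Akida conversion constraints is an assumption on my part:

import numpy as np
from tensorflow import keras
from quantizeml.models import quantize
from cnn2snn import convert

# Tiny stand-in model, quantized with the new 8-bit defaults.
float_model = keras.Sequential([
    keras.Input(shape=(32, 32, 3)),
    keras.layers.Conv2D(8, 3, activation="relu"),
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(10),
])
quantized_model = quantize(float_model)

# convert() no longer takes input_is_image; it is deduced from the input channels.
akida_model = convert(quantized_model)
akida_model.summary()

# Model.predict now supports Akida 2.0 models; with no hardware mapped,
# this runs on the software backend.
inputs = np.random.randint(0, 256, (1, 32, 32, 3), dtype=np.uint8)
print(akida_model.predict(inputs).shape)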

Bug fixes:

  • [Akida] Fixed wrong buffer size in update_learn_mem, leading to handling of bigger buffers than required
  • [Akida] Fixed issue in matmul operation leading to an overflow in corner cases
  • [Akida] Akida models could not be created by a list of layers starting from InputConv2D
  • [Akida] Increasing batch size between two forward did not work
  • [Akida] Fix variables shape check failure
  • [engine] Optimize output potentials parsing
  • [CNN2SNN] Fixed conversion issue when converting QuantizeML model with Reshape + Dense
  • [CNN2SNN] Convert with input_is_image=False raises an exception if the first layer is a Stem or InputConv2D
Note that version 2.3.7 is the last Akida and CNN2SNN drop supporting Python 3.7 (EOL end of June 2023).
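And for the AKD1500 items (the virtual device and the PCIe driver), my guess at the usage, based only on the existing AKD1000 pattern. The akida.AKD1500() factory name below is an assumption on my part, not something I've run:

import akida

# List any PCIe-attached Akida devices (an AKD1000, or an AKD1500 via the new driver).
for dev in akida.devices():
    print(dev)

# Virtual device for testing without hardware; akida.AKD1000() is the documented
# equivalent for AKD1000, and the AKD1500 name here just mirrors it (unverified).
virtual_dev = akida.AKD1500()

# A converted model would then be mapped onto the device before running predict():
# akida_model.map(virtual_dev)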

Update Akida models to 1.2.0

  • Updated CNN2SNN minimal required version to 2.4.0 and QuantizeML to 0.5.2
  • Pruned the zoo from several models: Imagenette, cats_vs_dogs, melanoma classification, both ocular disease, ECG classification, CWRU fault detection, VGG, face verification
  • Added load_model/save_models utils
  • Added a 'fused' option to separable layer block
  • Added a helper to unfuse SeparableConvolutional2D layers
  • Added a 'post_relu_gap' option to layer blocks
  • Stride 2 is now the default for MobileNet models
  • Training scripts will now always save the model after tuning/calibration/rescaling
  • Reworked GXNOR/MNIST pipeline to get rid of distillation
  • Removed the renaming module
  • Data server with pretrained models reorganized in preparation for Akida 2.0 models
  • Legacy 1.0 models have been updated towards 2.0, providing both a compatible architecture and a pretrained model
  • 2.0 models now also come with a pretrained 8bit helper (ViT, DeiT, CenterNet, AkidaNet18 and AkidaUNet)
  • ReLU max value is now configurable in layer_blocks module
  • It is now possible to build ‘unfused’ separable layer blocks
  • Legacy quantization parameters removed from model creation APIs
  • Added an extract.py module that allows samples extraction for model calibration
  • Dropped pruning tools support
  • Added Conv3D blocks

Bug fixes:

  • Removed duplicate DVS builders in create CLI
  • Silenced unexpected verbosity in detection models evaluation pipeline

Known issues:

  • Pretrained helpers will fail downloading models on Windows
  • Edge models are not available for 2.0 yet

Documentation update

  • Large rework of the documentation to integrate changes for 2.0
  • Added QuantizeML user guide, reference API and examples
  • Introduced a segmentation example
  • Introduced a vision transformer example
  • Introduce a tutorial to upgrade 1.0 to 2.0
  • Updated zoo performance page with 2.0 models
  • Aligned overall theme with Brainchip website
  • Fixed a menu display issue in the example section
Don't know if all the vision transformer models are included; I presume so?

We have seen "ViT" from BRN in 2.0 and I noticed in the GitHub update...DeiT.

Wondered what that was.

  • 2.0 models now also come with a pretrained 8bit helper (ViT, DeiT, CenterNet, AkidaNet18 and AkidaUNet)
Snip from Medium explaining it; looks pretty handy re training data and time taken. Nice to see we're onto it off the bat imo.

  • The prior Vision Transformer (ViT) needs to be pre-trained with hundreds of millions of images using external data. ViT does not generalize well when trained on insufficient amounts of data.
  • The Data-Efficient Image Transformer (DeiT) is proposed. While the architecture is mostly the same as ViT, it is trained on ImageNet only, using a single computer, in less than 3 days, with no external data.

 
  • Like
  • Fire
  • Love
Reactions: 30 users

IloveLamp

Top 20
  • Like
  • Fire
  • Love
Reactions: 33 users
Thought I'd check in on Numem and see if there were any updates. They closed out Phase I earlier this year, and we're waiting to see, once Ames has done its assessment etc., whether we move to a Phase II.

Nothing just yet.


[screenshots attached]
 
  • Like
  • Fire
  • Love
Reactions: 31 users

Galaxycar

Regular
On an upbeat note, on BrainChip's website both the Shuttle PC and the Raspberry Pi are sold out. Now the question is how many were originally made; at least we will have 10k in the next 4C. Yahoooooo!
 

Diogenese

Top 20
This emphasizes the significance of the announcement that Akida was compatible across the range of ARM processors.

My interpretation is that, basically, the hard work of designing the integration of Akida IP into all ARM processor IP has been done: there are "ready-made" circuit designs which incorporate Akida with all ARM processors and which can be ordered virtually off-the-shelf. This does not mean that Akida will be in all ARM processors, but it means that, if Akida's functionality is needed in the end product, the work of integrating the Akida IP into the ARM IP has already been done.
 
  • Like
  • Fire
  • Love
Reactions: 89 users

Gies

Regular
[attached image]
 
  • Like
  • Fire
  • Love
Reactions: 33 users

jtardif999

Regular

WHAT IS BRAINCHIP USED FOR?​


Understanding the Applications of Brainchip: Revolutionizing Artificial Intelligence​

In recent years, the field of artificial intelligence (AI) has witnessed remarkable advancements, pushing the boundaries of what was once thought possible. One such groundbreaking technology that has emerged is Brainchip, a neuromorphic computing platform that mimics the functioning of the human brain. With its unique architecture and capabilities, Brainchip is being utilized in a multitude of applications, revolutionizing various industries.
At its core, Brainchip is designed to process vast amounts of data in real-time, enabling rapid decision-making and pattern recognition. This innovative technology utilizes spiking neural networks (SNNs), which are inspired by the way neurons communicate in the human brain. By leveraging SNNs, Brainchip can efficiently process and analyze complex data sets, making it ideal for applications that require high-speed and low-power processing.

One of the primary applications of Brainchip lies in the field of surveillance and security. Traditional video surveillance systems often struggle to analyze and interpret large volumes of video footage in real-time. However, Brainchip’s advanced capabilities allow it to process video streams in real-time, detecting and identifying objects, faces, and even abnormal behavior. This technology has the potential to revolutionize the way security systems operate, enhancing public safety and reducing response times.
Another significant application of Brainchip is in the field of autonomous vehicles. Self-driving cars rely heavily on AI algorithms to navigate and make split-second decisions. Brainchip’s ability to process data in real-time and recognize patterns makes it an ideal solution for autonomous vehicles. By leveraging Brainchip’s capabilities, self-driving cars can analyze sensor data, detect obstacles, and make informed decisions, ultimately improving safety and efficiency on the roads.
Furthermore, Brainchip is also being utilized in the healthcare industry. Medical professionals are constantly faced with vast amounts of patient data that need to be analyzed accurately and efficiently. Brainchip’s high-speed processing and pattern recognition capabilities enable healthcare providers to quickly analyze medical images, detect anomalies, and diagnose diseases. This technology has the potential to revolutionize medical diagnostics, leading to faster and more accurate diagnoses, ultimately saving lives.
It is important to note that Brainchip is not limited to these applications alone. Its versatility and adaptability make it suitable for a wide range of industries, including robotics, industrial automation, and even gaming. As the technology continues to evolve, we can expect to see Brainchip being integrated into various sectors, transforming the way we live and work.

In conclusion, Brainchip is a revolutionary technology that is transforming the field of artificial intelligence. With its ability to process vast amounts of data in real-time and recognize patterns, Brainchip is being utilized in various applications, including surveillance and security, autonomous vehicles, healthcare, and more. As this technology continues to advance, it holds the potential to reshape industries and improve our daily lives in ways we never thought possible.
Sources:
– “Brainchip: The World’s First Neuromorphic Computing Platform” – Brainchip Holdings Ltd.
– “Brainchip Technology: A New Era of AI” – Analytics Insight
– “How Brainchip Works: The Future of AI” – TechRadar
To me unfortunately this reeks of being composed by generative AI 🙄
 
  • Like
  • Thinking
Reactions: 8 users

Tezza

Regular
Hoping for a gobsmacking 4C! If not, I am ready to pounce on what I believe will become a ridiculous price.
 
  • Like
  • Love
  • Thinking
Reactions: 18 users

HopalongPetrovski

I'm Spartacus!
Hoping for a gobsmacking 4C! If not, I am ready to pounce on what I believe will become a ridiculous price.
Yeah, no doubt a few of us have been keeping some powder dry for this potentiality.
GLTAH
 
  • Like
  • Love
  • Thinking
Reactions: 14 users

Xray1

Regular
Yeah, no doubt a few of us have been keeping some powder dry for this potentiality.
GLTAH

From the last 4C:

" Cash inflow from customers in the current quarter of $0.83M was higher than the prior quarter (US$0.04M)."

Now it's just a wait-and-see situation for how "LUMPY" things can get.
 
  • Like
Reactions: 5 users

toasty

Regular
Interesting buy/sell ratio this morning........... ;)
 
  • Like
Reactions: 1 users

Xray1

Regular
From the last 4C:

" Cash inflow from customers in the current quarter of $0.83M was higher than the prior quarter (US$0.04M)."

Now it's just a wait-and-see situation for how "LUMPY" things can get.

However, I thought that the following historical fact may also be an eye-opening refresher on our Co's 12-year journey thus far.

Brainchip Holdings Ltd (BRN) floated on the Australian Securities Exchange (ASX) on Wednesday, 9 November 2011.
BRN's current share price of $0.18 is a $0.07, or 28%, discount to its original offer price of $0.25.

I hope this upcoming 4C and the next two 4Cs before the AGM show significant financial improvement ... otherwise, imo, the "Strike 2" scenario will play out with disgruntled S/Holders.
 
  • Like
  • Thinking
Reactions: 8 users

HopalongPetrovski

I'm Spartacus!
From the last 4C:

" Cash inflow from customers in the current quarter of $0.83M was higher than the prior quarter (US$0.04M)."

Now it's just a wait-and-see situation for how "LUMPY" things can get.
Yeah. Hopefully it won't be Brown and Sticky.
You know, like a stick. 🤣
 
  • Haha
  • Like
Reactions: 4 users

7für7

Top 20
Interesting buy/sell ratio this morning........... ;)
Good morning! Can I ask you where you watch the buy/sell ratio? Thanks
 
  • Like
Reactions: 2 users

jla

Regular
Good morning! Can I ask you where you watch the buy/sell ratio? Thanks
I have a Westpac account; it shows up there.
 
  • Like
Reactions: 2 users

jla

Regular
I have a Westpac account; it shows up there.
That is a trading account, but it is on the other banks too.
 
  • Like
Reactions: 1 users

HarryCool1

Regular
Chippers,
America's Love of 'Yellowstone' Helps Launch Bull Riding as a Team Sport - WSJ


HANG ON..... more hand chalk may be required....... giddy up

Regards ,
Esq
And less chalk hands wouldn't go astray either.
 
  • Like
Reactions: 1 users