wilzy123
Founding Member
Couldn't recall if I'd posted this previously and, tbh, didn't search.
Wondering how the mid-December update re Transformers fits in.
Is it to do with interfacing third-party models, with ours, a combination, or...?
Might need @Diogenese's thoughts if not already provided before.
Upgrade to akida/cnn2snn 2.2.6 and akida_models 1.1.8
ktsiknos-brainchip released this on Dec 14, 2022
2.2.6-doc-1
d334eea
Update akida and cnn2snn to version 2.2.6
New features
- [akida] Upgrade to quantizeml 0.0.13
- [akida] Attention layer
- [akida] Identify AKD500 devices
- [engine] Move mesh scan to host library
API changes
- [engine] toggle_learn must be called instead of program(p, learn_enabled)
- [engine] set_batch_size allows inputs to be preallocated
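To make the engine API change above concrete, here is a toy sketch of the before/after call pattern. Note that `EngineModel` is a stand-in stub written for illustration only, not the real akida engine class, and the actual signatures may differ; the only things taken from the release notes are the names `program`, `toggle_learn`, and `set_batch_size` and the memory-growth bug they relate to.

```python
# Toy stub illustrating the 2.2.6 engine API change.
# "EngineModel" is hypothetical; the real akida engine API may differ.

class EngineModel:
    def __init__(self):
        self.learn_enabled = False
        self.batch_size = 1
        self._input_buffers = []

    def program(self, program_bytes):
        # Pre-2.2.6 the call was program(p, learn_enabled); in 2.2.6
        # programming no longer takes the learning flag.
        self._program = program_bytes

    def toggle_learn(self, enabled):
        # New in 2.2.6: learning is switched on/off separately from program()
        self.learn_enabled = enabled

    def set_batch_size(self, n):
        # New in 2.2.6: preallocate input slots up front, which bounds memory
        # when inputs are queued faster than they are processed
        self.batch_size = n
        self._input_buffers = [None] * n

model = EngineModel()
model.program(b"\x00\x01")   # program the device (no learn flag anymore)
model.set_batch_size(8)      # preallocate 8 input slots
model.toggle_learn(True)     # enable learning as a separate step
print(model.learn_enabled, model.batch_size)  # True 8
```

The point of the split is that learning can now be flipped without reprogramming, and the preallocated batch addresses the queueing bug fixed in the same release.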
Bug fixes
- [engine] Memory can grow indefinitely if queueing is faster than processing
Update akida_models to 1.1.8
- Updated the minimal required CNN2SNN version to 2.2.6 and QuantizeML to 0.0.13
- VWW model and training pipeline refactored and aligned with TinyML
- Layer names in almost all models have been updated in preparation for quantization with QuantizeML
- Tabular data models and tools have been removed from the package
- Transformers pretrained models updated to 4-bit
- Introduced calibration utils in training toolset
- KWS and ImageNet training scripts now offer a "calibrate" CLI action
- ImageNet training script will now automatically restore the best weights after training
These GitHub commits are definitely worth watching. They are public and have to genuinely reflect changes in the source code, so they hold legitimate clues about what's going on at BRN for those who know what they're reading.