Wags
Thank you @Diogenese for the reply. The 8-bit floating point discussion relates to the data type used to store the model libraries for AI/NNs - that is, each value is a byte, a string of eight ones and zeros, e.g. 10010011.
The first bit is the sign, 1 for minus, zero for plus.
The next 4 bits are the exponent (the power to which 2 is raised), adjusted to accommodate fractional numbers (negative exponents).
The last 3 bits are the mantissa (plus the implied 1 to the left of the 3 bits). [Thrifty lot, these engineers, aren't they?]*
As I was going down the stair,
I met a man(tissa) who wasn't there.
He wasn't there again today ...
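To make the layout concrete, here is a minimal Python sketch that decodes a byte under the layout described above (1 sign bit, 4 exponent bits, 3 mantissa bits with an implied leading 1). The exponent bias of 7 and the subnormal handling are assumptions based on the common E4M3 convention, not something stated above.

```python
def decode_fp8_e4m3(byte):
    """Decode an 8-bit float: 1 sign bit, 4 exponent bits (assumed bias 7),
    3 mantissa bits with an implied leading 1 for normal numbers.
    Special values (NaN) are ignored in this sketch."""
    sign = -1.0 if (byte >> 7) & 1 else 1.0
    exponent = (byte >> 3) & 0b1111
    mantissa = byte & 0b111
    if exponent == 0:
        # Subnormal: no implied leading 1, exponent fixed at 1 - bias
        return sign * (mantissa / 8.0) * 2.0 ** (1 - 7)
    # Normal: implied 1 to the left of the 3 mantissa bits
    return sign * (1.0 + mantissa / 8.0) * 2.0 ** (exponent - 7)

# The example byte from above: sign 1, exponent 0010, mantissa 011
print(decode_fp8_e4m3(0b10010011))  # -(1 + 3/8) * 2**(2 - 7) = -0.04296875
```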
BrainChip has developed a number of model libraries adapted to Akida's 1-to-4 bit format.
BrainChip's CNN2SNN toolkit is able to convert CNN models into Akida's format.
https://doc.brainchipinc.com/user_guide/akida_models.html
Overview
The Brainchip akida_models package is a model zoo that offers a set of pre-built Akida-compatible models (e.g. MobileNet, VGG or AkidaNet), pretrained weights for those models and training scripts.
See the model zoo API reference for a complete list of the available models.
akida_models also contains a set of quantization blocks and layer blocks that are used to define the above models.
Command-line interface for model creation
In addition to the programming API, the akida_models toolkit provides a command-line interface to instantiate and save models from the zoo.
Instantiating models using the CLI makes use of the model definitions from the programming interface with default values. To quantize a given model, the CNN2SNN quantize CLI should be used.
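To illustrate the flow the documentation describes, here is a rough Python sketch using the programming API rather than the CLI. The helper name mobilenet_imagenet and the quantize/convert argument names are assumptions based on my reading of the linked docs - check the API reference for the exact signatures before relying on them.

```python
# Sketch of the zoo -> quantize -> convert flow (names assumed from the docs).
from akida_models import mobilenet_imagenet
from cnn2snn import quantize, convert

# 1. Instantiate a Keras model from the akida_models zoo.
keras_model = mobilenet_imagenet(input_shape=(224, 224, 3))

# 2. Quantize it down to Akida's low-bit format, e.g. 4-bit weights and
#    activations with 8-bit weights in the input layer.
quantized_model = quantize(keras_model,
                           input_weight_quantization=8,
                           weight_quantization=4,
                           activ_quantization=4)

# 3. Convert the quantized Keras model into an Akida (SNN) model.
akida_model = convert(quantized_model)
akida_model.summary()
```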
The beauty of Akida is that it can learn on the job (on-device) without having to retrain on the whole dataset.
*FN: This thrift with silicon real estate originates from well before USB sticks were $1 per megabit.
PS: The 8-bit FP model libraries are designed for use with CPU/GPU-based processors - MetaTF can convert those model libraries for use on Akida.
It's good and comforting to have smart people like you to lean on with this highly specialised and technical info, thank you.
I sort of get the difference between the 4-bit and 8-bit scenarios. It just amplifies how clever and ahead of the game Peter and Anil are, and have been for many years.
Like I said earlier, it seems to me Nvidia and Intel, by way of this "industry specification" with Arm, have run the white flag up the pole in terms of technical ability. Simply put, they can't match Akida in performance.
OK, how else can we stop this Aussie mob from taking a major chunk of our market share? Use our combined size and horsepower to control the market, create a specification that they can't comply with, and spread the word - tell the world BrainChip doesn't comply with the industry specification. That should slow them down for a while, while we figure out what their secret sauce is and how to get around it.
I have absolutely no doubt in my mind who has the better product.
Hopefully BrainChip's goal of creating the de facto standard, based on performance, cost and ease of adoption, and supported by good morals, the research centre in Perth, and the sponsorships of universities, prevails.
Akida Ballista