Hi DB,
You've sent me on a trip down memory lane:
BrainChip Announces: Unsupervised Visual Learning Achieved (2016)
https://brainchip.com/brainchip-announces-unsupervised-visual-learning-achieved/
ALISO VIEJO, CA — (Marketwired) — 02/23/16 —
BrainChip Holdings Limited (ASX: BRN), developer of a revolutionary new Spiking Neuron Adaptive Processor (SNAP) technology that has the ability to learn autonomously, evolve and associate information just like the human brain, is pleased to report that it has achieved a further significant advancement of its artificial intelligence technology.
The R&D team in Southern California has completed the development of an Autonomous Visual Feature Extraction system (AVFE), an advancement of the recently achieved and announced Autonomous Feature Extraction (AFE) system. The AVFE system was developed and interfaced with the DAVIS artificial retina purchased from its developer, Inilabs of Switzerland. DAVIS has been developed to represent data streams in the same way as BrainChip’s neural processor, SNAP.
Highlights
- Capable of processing 100 million visual events per second
- Learns and identifies patterns in the image stream within seconds — (Unsupervised Feature Learning)
- Potential applications include security cameras, collision avoidance systems in road vehicles and Unmanned Aerial Vehicles (UAVs), anomaly detection, and medical imaging
- AVFE is now commercially available
- Discussions with potential licensees for AVFE are progressing
AVFE is the process of extracting informative characteristics from an image. The system initially has no knowledge of the contents of an input stream. The system learns autonomously by repetition and intensity, and starts to find patterns in the image stream. BrainChip’s SNAP learns to recognize features within a few seconds, just like a human would when looking at a scene. This image stream can originate from any source, such as an image sensor like the DAVIS artificial retina, but also from other sources that are outside of human perception such as radar or ultrasound images.
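For readers unfamiliar with event-based sensors: a DAVIS-style artificial retina emits a sparse stream of (timestamp, x, y, polarity) events rather than whole frames. The sketch below is purely illustrative (SNAP's internals are proprietary, and the event values here are made up); it only shows the event-by-event processing style and a toy "repetition and intensity" statistic:

```python
from collections import Counter

# Hypothetical event stream: (timestamp_us, x, y, polarity) tuples,
# the general shape of output from a DVS/DAVIS-style sensor.
events = [
    (100, 4, 7, 1),
    (105, 4, 7, 1),
    (220, 4, 8, -1),
    (230, 4, 7, 1),
]

# Toy statistic: count how often each pixel fires, so locations that
# repeat with intensity stand out. A real spiking network learns
# temporally correlated features, but it consumes the stream the same
# way -- one sparse event at a time, no frames.
activity = Counter((x, y) for _, x, y, _ in events)

most_active, hits = activity.most_common(1)[0]
print(most_active, hits)  # pixel (4, 7) fired 3 times
```

The point of the contrast is that work scales with the number of events, not with frame size times frame rate.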
In traditional systems, a computer program loads a single frame from a video camera and searches that frame for identifying features, predefined by a programmer. Each section of the image is compared to a template until a match is found and a percentage of the match is returned, along with its location. This is a cumbersome operation.
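The frame-based approach the release contrasts against can be sketched in a few lines. This is a generic sum-of-squared-differences template match, not any specific product's code; frame and template contents are invented for illustration:

```python
import numpy as np

def best_match(frame: np.ndarray, template: np.ndarray):
    """Slide `template` over `frame`; return (score, (row, col)) of the
    best match by sum of squared differences (lower is better)."""
    th, tw = template.shape
    fh, fw = frame.shape
    best = (float("inf"), (0, 0))
    # Exhaustive scan of every position -- the "cumbersome operation"
    # the release describes, repeated for every frame of video.
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            patch = frame[r:r + th, c:c + tw]
            score = float(np.sum((patch - template) ** 2))
            if score < best[0]:
                best = (score, (r, c))
    return best

frame = np.zeros((8, 8))
frame[3:5, 2:4] = 1.0          # a 2x2 "feature" hidden in the frame
template = np.ones((2, 2))     # the programmer-predefined template
score, loc = best_match(frame, template)
print(score, loc)  # 0.0 (3, 2): exact match at row 3, col 2
```

Note that the template must be defined in advance by a programmer, whereas the AVFE claim is that features are discovered from the stream itself.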
An AVFE test sequence was conducted on a highway in Pasadena, California for 78.5 seconds. An average event rate of 66,100 events per second was recorded. The SNAP spiking neural network learned the features of vehicles passing by the sensor within seconds (see Figure 1). It detected and started counting cars in real time. The results of this hardware demonstration show that SNAP can process events emitted by the DAVIS camera in real time and perform unsupervised learning of temporally correlated features.
AVFE can be configured for a large number of uses including surveillance and security cameras, collision avoidance systems in road vehicles and Unmanned Aerial Vehicles (UAVs), anomaly detection, medical imaging, audio processing and many other applications.
Peter van der Made, CEO and Inventor of the SNAP neural processor said, “We are very excited about this significant advancement. It shows that BrainChip’s neural processor SNAP acquires information and learns without human supervision from visual input. The AVFE is remarkable, capable of high-speed visual perception and learning that has widespread commercial applicability.”
It hasn't been all plain sailing. Just a couple of years ago, we were flying by the seat of our pants:
https://s3.amazonaws.com/content.st...f/Product+Development+and+Business+Update.pdf
Sydney, Australia – 28 February, 2019: BrainChip Holdings Ltd (ASX: BRN), the leading neuromorphic computing company, today provided a product development and business update.
Akida product development
Since inception BrainChip has been committed to providing an artificial intelligence solution as an integrated circuit. The Company’s acquisition of Spikenet Technologies in September of 2016 has provided software validation of a spiking neural network (SNN) specific to image processing. The Spikenet research and engineering team have proved invaluable in the area of image processing and have provided significant insight for the development of Akida.
The Company announced the Akida Development Environment (ADE) and Architecture in the fourth quarter of 2018. The ADE allows users to fully simulate the implementation of the Akida IC and determine benchmark performance in terms of accuracy and power consumption in AI Edge applications. Importantly, this allows OEM equipment design prior to the IC introduction.
The development of Akida is proceeding well, with refinements from inputs of early access potential customers. The Company has implemented the Akida NPC in a Field Programmable Gate Array (FPGA) for internal use in evaluation. This is an important step in the process of developing a complex IC as it provides verification of the logic design, thereby improving prospects for a successful implementation prior to incurring manufacturing expenses.
The Company has determined that an Application Specific Integrated Circuit (ASIC) vendor that provides full services, from the layout of the design through all subsequent manufacturing processes, will be most cost effective, [### Socionext ###] reduce risk and accelerate time-to-market for Akida in AI Edge applications. The Company expects to select an ASIC vendor and commence logic circuit design in the first quarter of 2019.
BrainChip has made great strides in the Akida design, creating a device that is very compact, flexible, provides low-latency and is low-power for AI Edge applications. The device can be user-configured for both convolutional and fully-connected networks applicable to a broad range of visual, data, and sensor applications. Akida will deliver up to 1.2 million neurons and over 10 billion synapses in a low-power chip, expandable to a far greater capacity by utilizing off-chip memory.
The details of the design are proprietary and are described in the Company’s currently pending provisional patent. The Company is working on a series of patents covering all aspects of the unique Akida design in detail.
Restructuring and Expense Control
The Company is implementing a restructuring and series of expense controls to focus resources primarily on the Akida product development.
With regard to BrainChip Studio, end-user engagement has provided the Company deep knowledge of customer expectations and insight regarding the human capital and sales process required to be successful. However, the Company underestimated the time and effort to support end-users and the time to achieve revenue from the ongoing trials. The insights gained from end-user engagement have reinforced the Company’s view that focusing on OEM customers provides many benefits including a significant reduction in direct cost and opportunity cost savings.
Taking these valuable learnings into account, the Company has shifted its focus from end-user sales to OEM relationships. This allows the primary focus of the organisation to be on the completion of the Akida IC. Because the Company’s success with OEM partners is highly dependent on their own success in marketing their platform, the Company intends to partner with those OEMs best able to bring the innovation of a low-power, high-accuracy spiking neural network to its customer base. The business model for BrainChip Studio includes license and revenue sharing while Akida includes product, license and royalty revenue.
With regard to restructuring and reduction in expenses, BrainChip Studio end-user sales and marketing roles will be eliminated, and discretionary spending will be reduced. In total, the changes implemented in this restructuring are expected to result in a decrease of 10% to 15% of overall planned spending. The restructuring will not affect research or engineering development resources. In addition, certain key management personnel have agreed to accept a temporary reduction in their salaries until such time as the board considers the Company to be in a position to revert to their current market-based remuneration.
Ending Studio sales and marketing would not, however, have eliminated the Company's contractual obligations to support existing customers.
"
Certain key management personnel have agreed to accept a temporary reduction in their salaries", and to burn the candle at both ends for over a year, but I remember the furore over at the other place about the allocation of bonus shares.
When @TECH talks about the quality of the BrainChip team, this shows their mettle.