After receiving my board, I purchased a modest, refurbished PC for about $580 (USD) to run Ubuntu 20.04, since I typically work on Windows systems. I was able to get a customized Dell OptiPlex 7050 SFF (Small Form Factor) Desktop, which looks similar to BrainChip's shuttle PC.
The purchase provided the following specs:
- Intel Core i7-7700K, 4-Core, 4.2GHz
- 16GB (2x8) DDR4 RAM
- 256GB SATA SSD
- Nvidia Quadro K600 Low Profile, 1GB DDR3 VRAM
- Slim DVD+RW Drive
So for the PC and the Akida PCIe board, I spent roughly $1,000 (USD), about one-fifth of the cost of the shuttle PC. Yes, unfortunately, I contributed to their lower revenues this past quarter.
I spent a weekend installing and configuring Linux, then setting up development tools such as Python 3 (the version required for MetaTF), Visual Studio Code with associated add-ons, and several other open-source apps I also use under Windows.
I installed the BrainChip PCIe card. It is worth noting that the included mounting bracket and screw were of no use to me, as the bracket was too tall for the small form-factor case. Instead, I simply seated the card in the slot; it is low-profile enough that it sits firmly without any play.
I provided my serial number on BrainChip's site to get access to the PDF with the driver installation instructions. I was able to install the drivers as documented and run the tests to ensure the card was working. To my surprise, the card has a tiny, bright-blue LED that lights up when powered on.
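For anyone doing the same, a quick sanity check from Python is also possible once the driver and the MetaTF `akida` package are installed. This is only a sketch from my memory of the MetaTF docs, so treat the `devices()` call as an assumption and check it against your version:

```python
# Quick check that the Akida PCIe card is visible from Python.
# Assumes the MetaTF "akida" package is installed; devices() is the helper
# I recall from the docs -- verify against your MetaTF version.
import akida

devices = akida.devices()
if devices:
    for device in devices:
        print("Found Akida device:", device)
else:
    print("No Akida hardware detected -- check the PCIe driver installation.")
```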
What I have yet to do before I consider the system complete:
- Acquire and set up a Web camera.
- Finish a course or two on TensorFlow on Udemy.
- Play around with the existing MetaTF examples.
The first project I have planned is a hobby project incorporating the Emotiv Insight Brain-Computer Interface (BCI), which uses electroencephalography (EEG) to train and recognize commands that can be personalized for the individual using it. Several years ago, a Community SDK was provided (I will need to sift through all the forked branches) for developers to write custom interfaces to the device.
The initial goal is to figure out how to take that raw EEG data and determine how to feed it to Akida's neural fabric to make use of its one-shot (or multi-shot) learning. The data-collection application would show the user a visual representation of a command while they concentrate on how they would perform it, training the chip to recognize that command. Simple primitives like Stop, Go, Right, and Left would be sampled; then the user could "think" those commands and the results detected against the training data would be displayed.
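To make the idea concrete, here is a rough Python sketch of the data-collection and recognition loop I have in mind. Everything in it is hypothetical: `read_eeg_window()` stands in for whatever the Emotiv Community SDK actually exposes, the sample rate and window length are guesses, and the nearest-centroid "classifier" is only a numpy placeholder for Akida's one-shot/multi-shot edge learning, which I still need to learn from the MetaTF examples.

```python
# Hypothetical sketch of the data-collection / training loop.
# read_eeg_window() is a placeholder for the Emotiv Community SDK, and the
# nearest-centroid classifier is a stand-in for Akida's edge learning.
import numpy as np

COMMANDS = ["Stop", "Go", "Right", "Left"]
WINDOW_SAMPLES = 128 * 2      # e.g. 2 s of EEG at an assumed 128 Hz
CHANNELS = 5                  # the Insight headset has 5 EEG channels

def read_eeg_window():
    """Placeholder: return one window of raw EEG, shape (CHANNELS, WINDOW_SAMPLES)."""
    return np.random.randn(CHANNELS, WINDOW_SAMPLES)  # fake data for now

def to_feature_vector(window):
    """Very crude features: per-channel mean and standard deviation."""
    return np.concatenate([window.mean(axis=1), window.std(axis=1)])

# Training: show each command on screen and record a few windows per command
# while the user concentrates on it ("multi-shot").
prototypes = {}
for command in COMMANDS:
    print(f"Concentrate on: {command}")
    shots = [to_feature_vector(read_eeg_window()) for _ in range(3)]
    prototypes[command] = np.mean(shots, axis=0)

# Recognition: classify a new window by its nearest command prototype.
def recognize(window):
    features = to_feature_vector(window)
    return min(prototypes, key=lambda c: np.linalg.norm(features - prototypes[c]))

print("Detected:", recognize(read_eeg_window()))
```

The real version would replace the feature extraction and the prototype matching with whatever representation Akida's neural fabric expects, but the overall flow (prompt, sample, train, then recognize) should stay the same.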
I should note that the Emotiv software itself provides a means of gathering EEG information and playing it back. Think of it like "Dragon NaturallySpeaking", but recognizing brain activity instead of speech. What I hope to accomplish is a proof of concept for Smart Edge devices that can be individually trained and controlled by a user on demand.
A secondary, future project would be to analyze the PCIe driver (BrainChip provides the Linux source code) and create a Windows PCIe driver. While it is fun to play with Linux once in a while, I'm mainly a Windows developer; it's my comfort zone, where I'm more productive.