Akida™ PCIe Board - What have you done & how did you do it?

JK200SX

Regular
Nah, I've just got giant hands :)

I’ll have a go at installing Ubuntu later today. I have a spare i3-6100 HP Prodesk lying around that I’ll use with the OS on an SSD.
Ubuntu installed, now the fun part...
 
  • Like
Reactions: 3 users

MADX

Regular
  • Haha
Reactions: 2 users

MADX

Regular
After receiving my board, I purchased a modest, refurbished PC for about $580 (USD) to run Ubuntu 20.04 as I typically work on Windows systems. I was able to get a customized Dell OptiPlex 7050 SFF (Small Form Factor) Desktop, which looks similar to BrainChip's shuttle PC.

The purchase provided the following specs:
  • Intel Core i7-7700K, 4-Core, 4.2GHz
  • 16GB (2x8) DDR4 RAM
  • 256GB SATA SSD
  • Nvidia Quadro K600 Low Profile, 1GB DDR3 VRAM
  • Slim DVD+RW Drive
So for the PC and the Akida PCIe board, I spent roughly $1,000 (USD), which is about 1/5th of the cost of the shuttle PC. Yes, unfortunately, I contributed to their lower revenues this past quarter. :p

I spent a weekend installing and configuring Linux, along with development tools such as Python 3 (the version required for MetaTF), Visual Studio Code with its associated add-ons, and several other open-source apps I use under Windows.

I installed the BrainChip PCIe card. It is worth noting that the mounting bracket and screw were useless to me, as the bracket was too tall for the small form-factor case. Instead, I just seated the card directly in the slot; it is low-profile enough that it sits firmly without any play.

I provided my serial number on BrainChip's site to get access to the PDF with the driver installation instructions. I was able to install the drivers as documented and run the tests to ensure the card was working. To my surprise, the card has a tiny, bright-blue LED that lights up when powered on.
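For anyone verifying their own install, a minimal sketch of a device check is below. It assumes the MetaTF akida Python package and its akida.devices() helper (described in BrainChip's documentation); treat the exact API as an assumption and check the docs for your version.

```python
import akida

# Enumerate the Akida devices visible to the driver.
# `akida.devices()` is taken from the MetaTF documentation.
devices = akida.devices()
if not devices:
    print("No Akida device found -- check the PCIe driver installation.")
for device in devices:
    print("Found Akida device:", device)
```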

What I have yet to do before I consider the system complete:
  • Acquire and set up a Web camera.
  • Finish a course or two on TensorFlow on Udemy.
  • Play around with the existing MetaTF examples.
The first project I have planned is a hobby project using the Emotiv Insight Brain-Computer Interface (BCI), which uses electroencephalography (EEG) to train and recognize commands that can be personalized for the individual using it. Several years ago, a Community SDK was provided (I will need to sift through all the forked branches) for developers to write custom interfaces to the device.

The initial goal is to figure out how to take that raw EEG data and determine how to provide it to Akida's neural fabric to make use of its one-shot (or multi-shot) learning. The data collection application would show the user a visual representation of a command while they concentrate on how they would perform it, training the chip to recognize that command. Simple primitives like Stop, Go, Right, and Left would be sampled; the user could then "think" these commands, and the results detected against the training data would be displayed. A rough sketch of this loop follows below.
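As an illustration, here is a minimal sketch of that collection-and-training loop. Everything in it is hypothetical: read_eeg_window() stands in for whatever the Emotiv Community SDK exposes, and learn_on_akida() stands in for MetaTF's edge-learning call; neither name is a real API.

```python
import time

# Hypothetical sketch of the EEG command-training loop described above.
COMMANDS = ["Stop", "Go", "Right", "Left"]
WINDOW_SECONDS = 2        # length of each EEG sample window (arbitrary choice)
SAMPLES_PER_COMMAND = 5   # few-shot: only a handful of examples per command

def read_eeg_window(seconds):
    """Placeholder for the Emotiv SDK: return one window of raw EEG samples."""
    raise NotImplementedError("wire this up to the Emotiv Community SDK")

def learn_on_akida(window, class_id):
    """Placeholder for Akida edge learning: teach the device one labeled example."""
    raise NotImplementedError("wire this up to the MetaTF learning call")

def collect_and_train():
    for class_id, command in enumerate(COMMANDS):
        print(f"Concentrate on: {command}")
        time.sleep(1)  # give the user a moment to focus on the command
        for _ in range(SAMPLES_PER_COMMAND):
            window = read_eeg_window(WINDOW_SECONDS)
            # Real code would extract features (e.g., band power) before training.
            learn_on_akida(window, class_id)

if __name__ == "__main__":
    collect_and_train()
```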

I should note that the Emotiv software itself provides a means of gathering EEG information and can play it back. Think of it like "Dragon Naturally Speaking", but instead of speech recognition, it would be recognizing brain activity. What I hope to accomplish is a proof of concept for Smart Edge devices that can be individually trained and controlled by a user on-demand.

A secondary, but future project would be to analyze the PCIe driver for which BrainChip provides the Linux source code and create a Windows PCIe driver. While it is fun to play with Linux once in a while, I'm mainly a Windows developer. It's my comfort zone for being more productive. :)
In case you didn't know, Elon Musk has Neuralink (https://neuralink.com/, https://www.tesla.com/elon-musk), which sounds like it is right up your alley.
 
  • Like
Reactions: 2 users

JDelekto

Regular
  • Like
Reactions: 3 users
Has the honeymoon period finished, or are you ladies and guys still doing stuff?
 
  • Like
Reactions: 1 users
Yeah, doing heaps mate, I look at it every day.
 
  • Haha
Reactions: 2 users
D

Deleted member 118

Guest
Only board I got is a

 
  • Haha
  • Like
Reactions: 5 users

JDelekto

Regular
I still have plans to do stuff before the end of this year. I finally have a complete kit together (SFF PC, PCIe card, Camera, and Emotiv device). The only thing I'm waiting for is the 2nd half of December to take my PTO time.

Work has gotten in the way of my independent learning and hobby time after the company I work for was acquired. It's almost like I'm working two jobs now. :(
 
  • Like
  • Love
Reactions: 8 users

Jasonk

Regular
Has anyone managed to get the Akida PCIe card running?
I am a beginner with Python and am getting the below error while attempting to run the beginner script.

I'll have a further play later.

[Attached: photo of the Python error]
 
  • Like
  • Fire
Reactions: 3 users

JDelekto

Regular
Has anyone managed to get the Akida PCIe card running?
I am a beginner with Python and am getting the below error while attempting to run the beginner script.

I'll have a further play later.

[Attached: photo of the Python error]
The assertion is failing because it expects the value of labels[0] to be equal to the value of test_labels[sample_image].

The test_labels[] array is populated from the MNIST data set, and sample_image is initialized to zero, so the first element of the labels array returned from the model.predict_classes() function is not matching the first element of test_labels.

Before the assert, you can add a print(labels[0]) on one line and a print(test_labels[sample_image]) below it to see their values; it may give some clue as to which one is problematic.
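In code form, the suggestion is just two prints before the failing line (variable names copied from the post above; adjust them to match the actual script):

```python
# Debug: compare the predicted and expected labels before the assert fires.
print(labels[0])                   # class predicted by model.predict_classes()
print(test_labels[sample_image])   # expected label from the MNIST test set
assert labels[0] == test_labels[sample_image]
```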
 
  • Like
Reactions: 3 users

Jasonk

Regular
Thanks JD.
I'll look further into that; I noticed the installs and downloads ran and just assumed it would be okay. Appreciated.

I come from a web background (PHP, JS) with limited C and Arduino development experience, so there is a fair bit to learn with Python. Hopefully it's not too much of a slow burn.
 
  • Like
Reactions: 1 users

MADX

Regular
I am a layperson with programming but have recently become aware of ChatGPT. You can use it for all sorts of marvellous things, including having it debug Python.
 
  • Like
Reactions: 1 users

Jasonk

Regular
I am a layperson with programming but have recently become aware of ChatGPT. You can use it for all sorts of marvellous things, including having it debug Python.
ChatGPT-3 is good value; very useful for work. The next iteration will be interesting.

Time to sit down and see if I can get this code to run.
 
  • Like
  • Thinking
Reactions: 3 users

perceptron

Regular
Thanks JD.
I'll look further into that; I noticed the installs and downloads ran and just assumed it would be okay. Appreciated.

I come from a web background (PHP, JS) with limited C and Arduino development experience, so there is a fair bit to learn with Python. Hopefully it's not too much of a slow burn.
Hi.
I have been in contact with sales about purchasing an AKD1000 board. I am trying to find out if the board can be delivered to a PO Box in Australia. Sales have told me it can be delivered overseas and nothing more. Did anyone have theirs arrive in this way?
Cheers.
 
  • Like
Reactions: 3 users

Jasonk

Regular

I don't see why a PO Box wouldn't work. The package size is small.

Mine was ordered via a friend who purchased it through a PC hardware shop in Australia as a special buy, so unfortunately I cannot provide further details on shipping.
 

Last edited:
  • Like
Reactions: 1 users

perceptron

Regular
I don't see why a PO Box wouldn't work. The package size is small.

Mine was ordered via a friend who purchased it through a PC hardware shop in Australia as a special buy, so unfortunately I cannot provide further details on shipping.
Thank you for the response.
Appreciate your time.
 

RHanneman

Emerged
Hi.
I have been in contact with sales about purchasing an AKD1000 board. I am trying to find out if the board can be delivered to a PO Box in Australia. Sales have told me it can be delivered overseas and nothing more. Did anyone have theirs arrive in this way?
Cheers.
That's unfortunate... I bought mine when they were initially made available and was able to have it shipped to a residential address in Australia. I wonder why they've stopped allowing that.
 

Jasonk

Regular
For those interested, I will be chipping away at a project involving the Prophesee GenX320 event-based camera and an Akida AKD1000 PCIe card.
The goal is to improve my understanding of neuromorphic processing and event-based cameras, and that of anyone else who decides to follow along. I will attempt to use as little technical jargon as possible. Hopefully, posting here will add some accountability for me to get to the other end of the project. I will be covering as many aspects of the process as possible in detail, including:

  • Why we use event-based cameras compared to frame-based ones
  • Technical dive into the Prophesee GenX320 event-based camera
  • Setup and interfacing of a GenX320 camera with a microprocessor for data capture
  • Using Prophesee Metavision software and processing data
  • Review of neuromorphic processing
  • Installation and setup of an AKD1000 PCIe card
  • Simulation of SNN data processing using MetaTF
  • Processing of data on an AKD1000 PCIe card
  • Comparison of conventional and SNN neuromorphic processing time and power consumption
  • Real-world applications for event-based cameras
It could be two or three weeks between topics, depending on life; enjoy.



Event-Based Cameras, Neuromorphic Processing and Applications

What is the fundamental difference between a frame-based and an event-based camera?

For a foundational understanding of conventional frame-based cameras and the more recent event-based cameras, this section assumes that the term photoreceptor is interchangeable with pixel and maintains a 1-to-1 ratio with the image's resolution.

A conventional frame-based camera captures light at periodic intervals via light-sensitive photoreceptors to produce a frame, more commonly known as an image. The pixels sampled within the current frame are memoryless, meaning the frame has no relation to past or future frames. For this reason, each frame must be re-processed in full, even when the scene is static and contains redundant information. As the resolution of an image increases, so does the pixel count, as shown in Figure 1. More pixels within a frame in turn generate more data to be processed; common resolutions used today include:
  • 1280x720 pixels / HD (High Definition),
  • 1920x1080 pixels / FHD (Full HD),
  • 2560x1440 pixels / QHD (Quad HD), and
  • 3840x2160 pixels / 4K UHD (Ultra HD).

Another consideration for a frame-based camera is Frames Per Second (FPS), shown in Figure 2, which relates to the speed at which a processor requests new frames. Simplistically, the FPS can be increased in an attempt to replicate a continuous stream of frames, but coupled with the camera's resolution, the volume of data grows rapidly, as the sketch below illustrates. This increasing volume of data must all be processed, which requires greater processing speed, power consumption, and heat dissipation to avoid delays, potentially leaving a system no longer fit for purpose [2].
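To make that growth concrete, here is a minimal back-of-the-envelope sketch of raw (uncompressed) frame data rates. The 24 bits per pixel and 60 fps figures are assumptions for illustration only; real cameras compress heavily.

```python
# Raw video data rate: width x height x bits-per-pixel x frames-per-second.
RESOLUTIONS = {
    "HD (1280x720)": (1280, 720),
    "FHD (1920x1080)": (1920, 1080),
    "QHD (2560x1440)": (2560, 1440),
    "4K UHD (3840x2160)": (3840, 2160),
}
BITS_PER_PIXEL = 24  # assumption: 8-bit RGB, no compression
FPS = 60             # assumption: a common frame rate

for name, (width, height) in RESOLUTIONS.items():
    megabytes_per_second = width * height * BITS_PER_PIXEL / 8 * FPS / 1e6
    print(f"{name}: ~{megabytes_per_second:.0f} MB/s raw at {FPS} fps")
```

At 4K UHD this works out to roughly 1.5 GB/s of raw pixel data, which is why resolution and FPS together dominate the processing budget.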

Figure 1 - Demonstrates the increase in pixel count as camera resolution increases.

Figure 2 - Demonstrates the potential for missed information based on FPS [1].



Like a frame-based camera's photoreceptors, event-based photoreceptors generate a voltage signal when exposed to light [3], similar to how a solar cell produces a voltage. While the operation of frame-based and event-based cameras may sound the same, the fundamental difference between the two technologies is that a frame-based camera is polled periodically by a processor, with all pixels reporting their current state regardless of change, and the information is processed synchronously. The event-based camera, on the other hand, interrupts the processor only when a change in the light intensity at an individual pixel triggers an event. The ability to interrupt and transmit only dynamic scene changes reduces the data to be processed by a factor of 10 to 1,000 compared to conventional frame-based cameras, reducing the processor's power consumption accordingly [4][6]. This interrupt-driven process enables event-based cameras to handle events asynchronously and precisely, with sub-microsecond timing, capturing and representing dynamic changes in the scene over time. The result is a continuous stream of information processed in real time, in stark contrast to frame-based cameras, as illustrated in Figure 3 [5]. A generic sketch of how such an event stream is represented follows Figure 3.


Figure 3 - The fundamental difference between frame-based and event-based camera outputs is the ability of event-based cameras to produce a continuous data stream of dynamic scene changes, compared to a frame-based camera, which captures a complete static frame [5].
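Event-camera output is commonly represented as a stream of (x, y, polarity, timestamp) tuples, one per triggered pixel. The sketch below is a generic illustration of that representation; it is not the GenX320's actual data format or the Metavision API.

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # +1 = brightness increase, -1 = brightness decrease
    t_us: int      # timestamp in microseconds

def count_recent_events(events, window_us):
    """Toy analysis: count events inside the most recent time window."""
    if not events:
        return 0
    latest = events[-1].t_us
    return sum(1 for e in events if latest - e.t_us <= window_us)

# Toy stream: three pixels firing over 150 microseconds.
stream = [Event(10, 20, +1, 0), Event(11, 20, -1, 90), Event(10, 21, +1, 150)]
print(count_recent_events(stream, window_us=100))  # -> 2
```

Note there is no frame anywhere in this representation: processing is driven entirely by the events themselves, which is what allows the sub-microsecond temporal precision described above.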



Table 1 compares frame- and event-based cameras using information from 2020. It provides a broad overview of the camera types and further highlights the benefits of event-based cameras where processing limitations and power efficiency are crucial requirements, such as on edge devices or with limited internet connectivity [7].

Table 1 - General comparison of two conventional frame-based cameras against an event-based camera.
                              High-speed camera   Standard camera   Event camera
Max fps or measurement rate   Up to 1 MHz         100-1,000 fps     1 MHz
Resolution at max fps         64x16 pixels        >1 Mpxl           >1 Mpxl
Bits per pixel (event)        12 bits             8-10 per pixel    ~40 bits/event
Weight                        6.2 kg              30 g              30 g
Data rate                     1.5 GB/s            32 MB/s           ~1 MB/s on average
Mean power consumption        150 W               1 W               1 mW



In conclusion, frame-based cameras capture a complete scene within a single frame, including both static and dynamic information. The trade-offs are high data processing, higher power consumption, and the potential for critical data to be missed between frames. Event-based cameras, on the other hand, employ an interrupt process driven by individual pixels, reducing data processing and allowing a continuous stream of dynamic changes within a scene at high temporal resolution. As shown in Figure 4, the trade-off for lower power and a continuous data stream is that event-based cameras detect only the dynamic changes in a scene, triggered by changes in light intensity, albeit with great accuracy and speed; frame-based cameras capture both static and dynamic information in full detail.

Figure 4 - Difference between a frame-based and an event-based camera image.


The characteristics of event-based cameras position the technology as a promising tool for applications demanding real-time precision in dynamic environments, including robotics, autonomous vehicles, vibration monitoring, and surveillance. One would envisage that frame- and event-based cameras will each have their own use cases, with more advanced systems relying on both working in conjunction to reduce data processing and power consumption while retaining both the static and dynamic information within a scene.

The next section will cover a technical dive into the operation of the Prophesee GenX320 event-based camera, to understand the inner workings and configuration of the hardware.

References

[1] image-14.png (562×427) (photometrics.com)

[2] Canon Camera sensors explained.

[3] PROPHESEE Metavision (1:05)

[4] PROPHESEE Metavision (7:04)

[5] Sony | Event-based Vision Sensor (EVS) (1:11)

[6] Event-Based Metavision® Sensor GENX320 | PROPHESEE

[7] A Toolbox for Easily Calibrating Omnidirectional Cameras (uzh.ch)
 

Last edited:
  • Like
  • Fire
  • Love
Reactions: 34 users

wasMADX

Regular
Hi Jasonk. Have you considered telling Silicon Chip (the magazine) about your project?
If they put the word out, their readers might join this forum and provide feedback, not to mention the publicity for BRN.
 
Last edited:
  • Like
Reactions: 12 users

Jasonk

Regular
Hi Jasonk. Have you considered telling Silicon Chip (the magazine) about your project?
If they put the word out, their readers might join this forum and provide feedback, not to mention the publicity for BRN.
Thanks @wasMADX, I was a little worried about posting it to begin with; this is my first time doing a write-up like this.

I'll get moving first and see if everyone likes it. Think of it as a soft launch.
 
  • Like
  • Love
  • Fire
Reactions: 10 users