BRN Discussion Ongoing

Vojnovic

Regular
I agree, ilL, almost entirely with your outlook, but I'm not so sure about your claim about AKD2000. Is there any evidence for this suggestion? If so, please point me in the direction where I might see it, ASAP please.

The 1000 eyes should be able to help in this regard. Hope you're correct. No announcement in regard to this as far as I'm aware.

Please, (eyes) PVDM is possibly all ears as well. A good question to be asked at the AGM, IMO. Shame I can't make it to this event.

Do it soon, ilL, I'm all ears.

Akida Ballista >>>>>>> If AKD2000 has been proven functional, well, just WOW <<<<<<<

hotty...
Hmmm, I could swear I saw more links in the past stating Akida 2000 was scheduled to be released in 2022, along with details about it working.
I could now only find this:
[PVdM] "At least Akida 2000 is already…the simulation is working already. And we are getting ready to hand it over to engineering."
 
  • Like
  • Fire
  • Love
Reactions: 16 users
Hi all

The following comes from the Plumerai website, and I think this is where we will find AKIDA IP. Note the name they use, Ikva. No matter how you say it, it has a certain ring to it:

“Until now, we have been deploying our BNNs on Arm Cortex-M and Arm Cortex-A processors with great results. However, we felt there was more room for improvement, since these CPUs are built to run typical 8-bit and 32-bit workloads and don’t provide native support for the single-bit operations that our BNNs rely on.

Some of our customers asked if our AI solutions also support FPGAs, since these provide incredible flexibility, cost efficiency, and tighter integration. FPGAs turn out to be an ideal platform to implement our models and inference engine on, as they enable us to unlock the full potential of our BNNs. In FPGAs we can natively implement the binary arithmetic that we need to run our models. We therefore decided to develop our own AI accelerator IP core named Ikva, which we introduce for the first time in this blog post. The Ikva accelerator runs our own BNNs and also efficiently supports 8-bit models. Of course, Ikva is fully supported by our extensive tool flow and ultra-fast and memory-efficient inference engine that’s integrated with TensorFlow Lite. A 32-bit RISC-V processor controls Ikva, captures the data from the camera and provides a programmer-friendly runtime environment. During the development of Ikva, we aimed to design a new hardware architecture for our optimized AI models while keeping it highly flexible and suitable for unknown future models. In contrast to other AI companies that seem to either develop models, or training software, or AI processors, we focus on the full AI stack and the Ikva core completes our offering. With Ikva, we now support the full AI stack starting from data collection, to training and model development, to very efficient inference engines, and now all the way down to providing the most optimized hardware implementations.

As you know, we like AI that is tiny, and Ikva fits in small and low-power FPGAs like the Lattice CrossLink-NX. The architecture is scalable, both in memory and in compute power. This means we can target a wide variety of FPGAs, ensure we fit next to other IP blocks, and extract maximum performance out of the resources that are available in the target FPGA device.

The video above showcases one of our proprietary person presence detection models together with our inference software running on the Ikva IP core in a Lattice CrossLink-NX LIFCL-40 FPGA. This is a low-power and low-cost 6x6mm FPGA that is available off-the-shelf and includes a native MIPI camera interface, further reducing the number of components in the system.

Ikva runs our robust and highly accurate person presence detection model 10x faster on the CrossLink-NX FPGA than on a typical Arm Cortex-M microcontroller. Alternatively, the frame rate can be scaled down to 1 or 2 FPS for those applications where low energy consumption is key.



The Lattice CrossLink-NX Voice & Vision Machine Learning Board with the CrossLink-NX LIFCL-40 FPGA


There are many target applications for person presence detection. For instance in your home, to automatically turn off your TV, your lights, or heating when there’s no one in the room. Outside your home, your doorbell can send you a signal when there’s someone walking up to your front door or a small camera can detect when there’s an unexpected visitor in your backyard. In the office, your PC can automatically lock the screen when you leave. Elderly care can be improved when you know how much time they spend in bed, their living room, or outside. The possibilities are endless, whether it’s in the home, on the road, in the city, at the office or on the factory floor. Accurate, inexpensive and battery-powered person detection will enhance our lives.

Of course, besides running Plumerai’s optimized BNN models, you can also run your own model on the Ikva core, or integrate Ikva into your FPGA-based device. We’re excited to enable extremely powerful AI to go to places it couldn’t go before.

The Ikva IP core, the supporting tool flow, and optimized person detection models are available today. Contact us to receive more information or schedule a video call to see our live demonstration. We’re eager to discuss how we can enable your products with Ikva”

In addition, Plumerai has its origins in the UK, and Ikva has links to the following Oxford University research and Greece:


So I ask you: what are the odds?

My opinion only DYOR
FF

AKIDA BALLISTA
I think that, while this is drawing a longer bow than would allow a company to be promoted above the waterline on the Iceberg, the fact that Plumerai has engaged with Lattice to build their own board adds weight to the previous Brainchip-Lattice links, and Lattice can now be promoted to below the waterline as a Nominee.

A thought to ponder: did the fact that Plumerai, and perhaps others we do not know about, were leaving ARM for Brainchip push ARM over the Edge, so to speak?

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 22 users
Hmmm, I could swear I saw more links in the past stating Akida 2000 was scheduled to be released in 2022, along with details about it working.
I could now only find this:
[PVdM] "At least Akida 2000 is already…the simulation is working already. And we are getting ready to hand it over to engineering."
Hi @Vojnovic

If you go to Ken Scarince's presentation to the German investors, he speaks about the AKD2000 going to TSMC this year, etc.

Regards
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 27 users
The Plumerai Binarized Neural Network is their software or algorithm, and it needs hardware to run on. If you read my post above you will understand that you do not really need to understand what a Binarized Neural Network is; the point is that it runs best when the hardware can run 1-bit activations. AKIDA IP can run 4-bit, 2-bit and 1-bit activations, which the ARM hardware cannot. The AKIDA IP architecture is software/algorithm/system agnostic as well as sensor agnostic.
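
To make the bit-width trade-off concrete, here is a minimal Python sketch (my own toy illustration, not BrainChip or Plumerai code) of uniformly quantising an activation to 4, 2 or 1 bits: fewer bits means fewer representable levels, which is why accuracy drops as the compute and memory savings grow.

def quantize(a: float, bits: int) -> int:
    # Map a float activation in [0, 1) to an unsigned integer with `bits` bits.
    levels = 1 << bits                      # 16 levels for 4-bit, 4 for 2-bit, 2 for 1-bit
    return min(int(a * levels), levels - 1)

a = 0.73
print(quantize(a, 4), quantize(a, 2), quantize(a, 1))   # -> 11 2 1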

My opinion only DYOR
FF

AKIDA BALLISTA
Thanks FF, didn't need a tech expert then, just someone smarter than me 😂😂😂
 
  • Like
  • Love
  • Haha
Reactions: 9 users

DJM263

LTH - 2015
IMO the share price will rise to $12-plus a share (a market cap of $20 billion) before we get dividends; I think dividends will be 3+ years away.
But I want to retire now - doh!
 
  • Like
  • Haha
  • Fire
Reactions: 21 users
D

Deleted member 118

Guest
  • Haha
  • Like
  • Love
Reactions: 12 users

Diogenese

Top 20
Binarized neural networks
“One roadblock in using neural networks are the power, memory, and time needed to run the network. This is problematic for mobile and internet-of-things devices that don’t have powerful CPUs or large memories. Binarized neural networks are a solution to this problem. By using binary values instead of floating point values, the network can be computed faster, and with less memory and power.”

Not sure what all this means; it needs one of our technical experts' opinions, @Diogenese et al.

Hi MA,

When you first click the "Technology" tab, you see 64 "bit" cubes progressively reduced to a single bit cube. That's the "binarized" bit, ie, binary means 0 or 1, whereas an 8-bit byte has the format, eg, 10101010 or some other combination of 1s and 0s. Similarly for 32-bit and 64-bit words.

https://plumerai.com/#home

If you want to see it again, you have to reopen the Plumerai site.

This is also the fastest, most energy-efficient embodiment of Akida. Akida's 2-bit and 4-bit modes are more accurate, but are slower and use more power.

In the case of 8-bit values, a multiplication requires 64 1-bit multiplications (8 × 8), and the eight shifted partial products must then be added together.

Here is an example of the calculations involved in multiplying two 4-bit numbers X (multiplicand) and Y (multiplier). (Note that Y1 is the least significant bit and Y4 is the most significant bit. The decimal weights of the product bits and the corresponding calculation field columns are shown on the second-last line.)

1653094298108.png
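
A minimal Python sketch of the shift-and-add long multiplication described above (my own illustration, not Akida code): every partial-product bit is a 1-bit AND of one multiplicand bit with one multiplier bit, and the shifted partial products are then summed.

def mul4(x: int, y: int) -> int:
    # Multiply two unsigned 4-bit integers by long multiplication.
    assert 0 <= x < 16 and 0 <= y < 16
    product = 0
    for i in range(4):                      # Y1 (LSB) .. Y4 (MSB)
        y_bit = (y >> i) & 1                # one multiplier bit
        partial = 0
        for j in range(4):                  # 4 x 4 = 16 one-bit AND products
            x_bit = (x >> j) & 1
            partial |= (x_bit & y_bit) << j
        product += partial << i             # shifted addition of the partial product
    return product

assert mul4(13, 11) == 143                  # 1101 x 1011 = 10001111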



In contrast, multiplying two 1-bit numbers is simply a matter of using an AND gate which produces a 1-bit output only if both inputs are 1 - otherwise the output is zero.

1653095312903.png
1653095643770.png
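
Following the AND-gate description above, a binarized dot product over {0, 1} values needs no multiplier at all: each 1-bit "multiplication" is a bitwise AND, and the accumulation is just a count of the 1s. This is a toy Python sketch of the idea only, not Plumerai's actual kernel (their BNNs may use a different encoding).

def binary_dot(acts: int, weights: int) -> int:
    # Dot product of two 1-bit vectors packed into integers: AND, then count the 1s.
    return bin(acts & weights).count("1")

acts    = 0b10110110                        # eight 1-bit activations
weights = 0b11010101                        # eight 1-bit weights
print(binary_dot(acts, weights))            # -> 3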
 
  • Like
  • Love
  • Fire
Reactions: 31 users

hotty4040

Regular
Hmmm, I could swear I saw more links in the past stating Akida 2000 was scheduled to be released in 2022, along with details about it working.
I could now only find this:
[PVdM] "At least Akida 2000 is already…the simulation is working already. And we are getting ready to hand it over to engineering."

Vojnovic, thanks for posting this; I had not realized that so much progress had been made. This, IMO, is outstanding evidence of real disruptive progress for AKD2000 (and much beyond), and AKD3000, bringing up the rear, will possibly be even more disruptive, and just around the corner.

Thanks, firstly to ilL and to you, Voj, for this clarifying news post on the AKD hardware update, and let's hope these developments never cease.

Oh, so much to read and so little time to appreciate it all. We'll never properly keep up with progress and innovation, it's never ending, thank goodness.

Akida Ballista

N.B.

Happy voting comrades, I think you know who I'm favoring; there really is only one way for this day to go, if trustworthiness means anything at all, so don't give it a second thought, it's just obvious in my mind. Just like Brainchip and Akida, progress and not stagnancy is the order of the day. Three more years of what we've witnessed would be turning back the clock, to despondency and absolute neglect of our nation.

Just some food for serious thought, chippers.

hotty...
 
  • Like
  • Sad
  • Love
Reactions: 14 users

Diogenese

Top 20

The reason that 1-bit binarization does not lose too much accuracy when compared with 8, 16, or 32 bits is explained by Simon Thorpe's discussion of the JAST Rules.

Basically, the speed at which the optical nerve responds to light stimulus is proportional to the strength of the light stimulus. This means that, for a camera light sensor, the first pixel to respond is the most important, hence the winner-take-all rule.
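
A toy Python sketch of that first-spike, winner-take-all idea (my own illustration, not JAST or Akida code): map each pixel intensity to a notional firing time, with brighter pixels firing earlier, and keep the first arrival.

pixels = {"p0": 0.10, "p1": 0.85, "p2": 0.40, "p3": 0.95}   # light intensities in [0, 1]

# Stronger stimulus -> earlier spike, so invert intensity to get a firing time.
spike_times = {name: 1.0 - intensity for name, intensity in pixels.items()}

winner = min(spike_times, key=spike_times.get)              # first spike wins
print(winner)                                               # -> p3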
 
  • Like
  • Fire
  • Love
Reactions: 25 users

Does that mean I don’t have to eat my hat?
 
  • Like
  • Haha
  • Love
Reactions: 14 users

Bravo

If ARM were an arm, BRN would be its biceps 💪!

I'll be a monkey's uncle if Elon Musk isn't referring to AKIDA when he says "AUTONOMY MUST & WILL WORK EVEN WITH NO CONNECTIVITY".


🐒
😘

Tesla FSD, Autopilot to Work Even Without Internet Connection? Elon Musk Says It's a Must



Issiah Richard Tech Times 17 October 2021, 09:10 pm




Tesla's FSD and Autopilot work without an internet connection, and it goes to show what Elon Musk wants for the future of autonomy: that it should need no sort of connection whatsoever. That being said, versions and updates are still distributed over the air, and the connection would be intended for that purpose only.

Tesla's focus on safety is massive, and it has looked at possible changes that would improve its features, as well as bringing its updates to online releases. The latest version of the Tesla FSD beta, 10.2, has been distributed to owners with high-ranking safety scores.

Elon Musk: Autonomous Vehicles Need No Internet to Work​


Tesla CEO Elon Musk has said that autonomous vehicles "must" work without an internet connection, and this says a lot about what he sees in the future of the vehicle industry. However, it also seems like a challenge or statement to those in the industry, as a lot of them have come to depend on an internet connection, even outside the self-driving business.

Autonomous driving features like FSD, Autopilot, Alphabet's Waymo, General Motors' Cruise, and others should, in general, need no internet connection to work. This was stated by Musk, and sure, navigation needs the internet, as does hailing a ride from a robotaxi, but the driving part should be independent of the connection.

This only shows that the CEO is concerned about the different trends in motor vehicles and their features as they get integrated on the streets.

It may be that he is saying the main system of autonomy, i.e. driving the car through streets and traffic, should need no internet to work, especially as it relies on the vehicle's sensors and controls.

Screen Shot 2022-05-21 at 11.17.39 am.png

Tesla FSD, Autopilot Working Without Internet?​


For a car to be truly autonomous, it should not rely on any form of internet connection to move and go places without human intervention. GPS data can be downloaded and updated through a car's infotainment system, but that does not mean the car needs to rely on an internet connection to work.


In Tesla's Q3 2020 earnings report, as presented by Elon Musk, he shared his initial opinion that autonomous driving features should have no need for the internet. Here, the CEO said that the system does not need it as it navigates the streets and roads for the driver.


Tesla FSD and Autopilot can work without any network connection, be it WiFi or cellular data (3G, 4G LTE, 5G).


Tesla's Autonomy for EVs​


Autonomy and EVs have been closely correlated, as EVs have electric motors that can be controlled far more easily by the driving software. Unlike traditional cars, which are more mechanical with their engines, these EVs have motors and the latest technology for their functions.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 46 users

Diogenese

Top 20
  • Haha
  • Like
  • Sad
Reactions: 11 users
D

Deleted member 118

Guest
  • Like
Reactions: 2 users

Deena

Regular
Great work TLS! The 1000 eyes strike again. Made my morning. Have a great weekend, all.

Black Boy Reaction GIF




Plumerai looks to be partnered with ARM

Brainchip :love: Plumerai :love: ARM
Can we have an updated TIP OF THE ICEBERG diagram please?
 
  • Like
  • Love
Reactions: 6 users
Does that mean I don’t have to eat my hat?
“This is also the fastest, most energy-efficient embodiment of Akida. Akida's 2-bit and 4-bit modes are more accurate, but are slower and use more power.”

While I am not technical at all, my read of @Diogenese's excellent post is that Akida and Plumerai are a great match for each other.
 
  • Like
  • Love
Reactions: 11 users


Just reopened the dialogue 😃

D33728A2-B25D-4A10-8105-2D139D97AD4E.jpeg
 
  • Like
  • Haha
  • Love
Reactions: 56 users
The reason that 1-bit binarization does not lose too much accuracy when compared with 8, 16, or 32 bits is explained by Simon Thorpe's discussion of the JAST Rules.

Basically, the speed at which the optical nerve responds to light stimulus is proportional to the strength of the light stimulus. This means that, for a camera light sensor, the first pixel to respond is the most important, hence the winner-take-all rule.
Thanks so much for explaining it, @Diogenese. Though I don't pretend to understand it all, I very much appreciate your response. It looks like Akida and Plumerai are a good fit for each other 😊👍🙏
 
  • Like
  • Love
  • Fire
Reactions: 9 users
🤫 …they’re onto us

202B43A6-DF78-4A05-9713-34ECB1ACABE3.jpeg
 
  • Like
  • Haha
  • Fire
Reactions: 50 users
I don't think anyone actually reads my posts because I thought by now the following would have been mentioned:

"A 32-bit RISC-V processor controls Ikva, captures the data from the camera and provides a programmer-friendly runtime environment."

What is it that SiFive does again? I have forgotten.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 45 users
Does that mean I don’t have to eat my hat?
Did you notice that no mythical beast appeared in any of the responses?

FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Haha
Reactions: 12 users