BRN Discussion Ongoing

CHIPS

Regular
Intel and AI

 
Reactions: 13 users

CHIPS

Regular
Rahul Ghosh of T. Rowe Price explains why he thinks the AI hype is "real and tangible".

 
Reactions: 9 users

Rach2512

Regular
From Jan 2019

BrainChip Partners with SoftCryptum to Deliver AI-powered Video Analytics to Government Agencies in European Countries

Sorry, I don't remember ever seeing this. Has anyone else seen it before, or does anyone have any other info regarding SoftCryptum? Sorry if already posted. Should this be on the list FactFinder provided the other day, or is it there and I missed it because it's a bloody long list? 😁


 
Reactions: 23 users
I decided to revisit this project to check on the dates again, as originally posted earlier this year :)

The project supposedly started in March and runs for 6 months, so you would think possible outcomes later this year or early 2024.

Some of this has been previously covered / discussed, but some may not have been yet, so here it is for those who may have missed it as well.


Evaluation of neuromorphic AI with embedded Spiking Neural Networks

Context
AI is proliferating everywhere, even into embedded systems, to bring intelligence closer to the sensors (IoT, drones, vehicles, satellites ...). But the energy consumption of current deep learning solutions makes classical AI hardly compatible with energy- and resource-constrained devices. Edge AI is a recent research subject that needs to take into account the cost of the neural models both during training and during prediction. An original and promising way to meet these constraints is to merge compression techniques for deep neural networks with event-based encoding of information, thanks to spiking neural networks (SNN). SNN are considered the third generation of artificial neural networks and are inspired by the way information is encoded in the brain; previous works tend to conclude that SNN are more efficient than classical deep networks [3]. This internship project aims at confirming this assumption by converting classical CNN to SNN from standard machine learning frameworks (Keras) and deploying the resulting neural models onto the Akida neuromorphic processor from the BrainChip company [4]. The results obtained in terms of accuracy, latency and energy will be compared to other existing embedded solutions for Edge AI [2].
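The conversion flow described above maps onto BrainChip's publicly documented MetaTF tooling. Here is a minimal sketch of what such a pipeline might look like, assuming the cnn2snn package; exact function names and the supported layer set vary between tool versions, so treat the calls as illustrative rather than definitive:

```python
# Hypothetical sketch of the Keras CNN -> quantized CNN -> Akida SNN flow,
# assuming BrainChip's cnn2snn toolkit. API details differ across versions.
from tensorflow import keras
from cnn2snn import quantize, convert

# 1. Build and train an ordinary Keras CNN for the embedded task.
model = keras.Sequential([
    keras.layers.Conv2D(16, 3, strides=2, activation="relu",
                        input_shape=(32, 32, 3)),
    keras.layers.Conv2D(32, 3, strides=2, activation="relu"),
    keras.layers.Flatten(),
    keras.layers.Dense(10),
])
model.compile(optimizer="adam",
              loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True))
# model.fit(x_train, y_train, ...)  # standard Keras training step

# 2. Quantize weights and activations to the low bit-widths Akida expects.
quantized = quantize(model, weight_quantization=4, activ_quantization=4)

# 3. Convert the quantized network to an event-based Akida model.
akida_model = convert(quantized)
# akida_model.predict(...) then runs as an SNN, on hardware or in the
# simulator, which is where the accuracy / latency / energy comparisons
# against other embedded solutions would be made.
```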

Project mission
The project mission will be organized in several periods:
  • Bibliographic study on spiking neural network training
  • Introduction to the existing SW framework from BrainChip
  • Training of convolutional neural networks for embedded applications [1] and conversion from CNN to SNN in Keras
  • Deployment of the SNN onto Akida processing platform
  • Experiments and measurements
  • Publication in an international conference.
Practical information

Location: LEAT Lab / SophiaTech Campus, Sophia Antipolis
Duration: 6 months from March 2023
Grant: from ANR project DeepSee
Profile: machine learning, artificial intelligence, artificial neural networks, Python, Keras, PyTorch
Research keywords: spiking neural networks, Edge AI, neuromorphic computing


So... digging a little deeper into the DeepSee project, I found what appears to be another parallel project, in collaboration with Prophesee and Renault (Renault was picked up in the original post as being involved to some level).



Development of a prototype HW platform for embedded object detection with bio-inspired retinas

Context
The LEAT lab leads the national ANR project DeepSee in collaboration with Renault, Prophesee and 2 other labs in neuroscience (CERCO) and computer science (I3S). This project aims at exploring a bio-inspired approach to develop energy-efficient solutions for image processing in automotive applications (ADAS), as explored by [3]. The main mechanisms used to follow this approach are event-based cameras (EBC, considered artificial retinas) and spiking neural networks (SNN).

The first is a type of sensor that detects changes in luminosity at very high temporal resolution and low power consumption; the second is a type of artificial neural network mimicking the way information is encoded in the brain. LEAT has developed the first SNN model able to perform object detection on event-based data [1] and the related hardware accelerator on FPGA [2]. The goal of this internship project is to deploy this spike-based AI solution onto an embedded smart camera provided by the Prophesee company [4]. The camera is composed of an event-based sensor and an FPGA. The work will mainly consist in deploying the existing software code (in C) on the embedded CPU, integrating the HW accelerator (VHDL) onto the FPGA, and setting up communication between them through an AXI-Stream bus. The last part of the project will consist in running experiments with the resulting smart camera to evaluate real-time performance and energy consumption before validation on a driving vehicle.
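For anyone who hasn't worked with event-based data before, here is a small illustration of what such a camera actually produces and how it is commonly preprocessed for an SNN detector. This is a generic sketch in Python for readability only; the actual project uses C and VHDL, and Prophesee's SDK has its own event structures:

```python
import numpy as np

# An event camera emits a sparse stream of (x, y, timestamp, polarity)
# tuples: one event per pixel whose luminosity changed, instead of full
# frames. A common preprocessing step for SNN object detection is to
# accumulate a short time window of events into a 2-channel histogram.
def events_to_frame(events, width, height, t_start, t_end):
    """Accumulate events with t in [t_start, t_end) into a (2, H, W)
    count image, one channel per polarity (OFF=0, ON=1)."""
    frame = np.zeros((2, height, width), dtype=np.uint16)
    for x, y, t, polarity in events:
        if t_start <= t < t_end:
            frame[1 if polarity > 0 else 0, y, x] += 1
    return frame

# Three synthetic events (timestamps in microseconds) in a 10 ms window:
events = [(5, 3, 1_000, 1), (5, 3, 4_000, 1), (8, 2, 7_000, -1)]
frame = events_to_frame(events, width=16, height=8, t_start=0, t_end=10_000)
print(frame[1, 3, 5], frame[0, 2, 8])  # -> 2 1
```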

Project mission
The project mission will be organized in several periods:
  • Bibliographic study on event-based processing
  • Introduction to the existing SW and HW solutions at LEAT, and to the dev kit from Prophesee
  • Deployment of the SW part on the CPU and the HW part on the FPGA
  • Experiments and validation
  • Publication in an international conference.
Practical information

Location: LEAT Lab / SophiaTech Campus, Sophia Antipolis
Duration: 6 months from March 2023
Grant: from ANR project DeepSee
Profile: VHDL programming, FPGA design, C programming, signal/image processing
Research keywords: embedded systems, event-based cameras, artificial neural networks, Edge AI



Autonomous and intelligent embedded solutions are mainly designed as cognitive systems composed of a three-step process: perception, decision and action, periodically invoked in a closed loop in order to detect changes in the environment and appropriately choose the actions to be performed according to the mission to be achieved. In an autonomous agent such as a robot, a drone or a vehicle, these three stages are quite naturally instantiated as i) the fusion of information from different sensors, ii) scene analysis, typically performed by artificial neural networks, and iii) the selection of an action to be operated on actuators such as engines, mechanical arms or any means of interacting with the environment.

In that context, the growing maturity of the complementary technologies of Event-Based Sensors (EBS) and Spiking Neural Networks (SNN) is proven by recent results. The nature of these sensors questions the very way in which autonomous systems interact with their environment. Indeed, an Event-Based Sensor reverses the perception paradigm currently adopted by Frame-Based Sensors (FBS), from systematic and periodic sampling (whether an event has happened or not) to an approach reflecting the true causal relationship, where the event triggers the sampling of the information. We propose to study this disruptive change of the perception stage and how event-based processing can cooperate with the current frame-based approach to make the system more reactive and robust.

Hence, SNN models have been studied for several years as an interesting alternative to Formal Neural Networks (FNN), both for their reduction of computational complexity in deep network topologies and for their natural ability to support unsupervised and bio-inspired learning rules. The most recent results show that these methods are becoming more and more mature and are almost catching up with the performance of formal networks, even though most of the learning is done without data labels. But should we compare the two approaches when the very nature of their input data is different? In the context of image processing, one (FNN) deals with whole frames and categorizes objects; the other (SNN) is particularly suitable for event-based sensors and is therefore better suited to capture spatio-temporal regularities in a constant flow of events. The approach we propose to follow in the DeepSee project is to associate spiking networks with formal networks rather than putting them in competition.
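The frame-based vs event-based distinction the abstract draws can be made concrete with a toy comparison. The signal and contrast threshold below are made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
luminance = np.cumsum(rng.normal(0, 0.1, 1000))  # a slowly drifting signal

# Frame-based sensing: sample periodically, whether anything changed or not.
frames = luminance[::10]  # 100 samples at a fixed rate

# Event-based sensing: emit a sample only when the value has moved by more
# than a contrast threshold since the last event, i.e. the change itself
# triggers the sampling (the causal relationship described above).
threshold, last, events = 0.5, luminance[0], []
for t, value in enumerate(luminance):
    if abs(value - last) >= threshold:
        events.append((t, np.sign(value - last)))  # timestamp + polarity
        last = value

print(len(frames), "periodic samples vs", len(events), "events")
```

When the scene is mostly static, the event stream is far sparser than the frame stream, which is where the energy argument for pairing EBS with SNN comes from.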

Partners
RENAULT SAS - GUYANCOURT, Laboratoire d'Electronique, Antennes et Télécommunications, Laboratoire informatique, signaux systèmes de Sophia Antipolis, Centre de recherche cerveau et cognition UMR5549 CNRS/UPS


The other interesting person I found involved in DeepSee is none other than Timothée Masquelier, as part of CERCO, who were listed in the retina project with Renault & Prophesee. He's also involved in the BrainNet project. (CV attached)

2021 – now Senior Research Scientist (Directeur de Recherche), CNRS (CERCO), Toulouse, France. Spike-based computing and learning in brains and machines.

Awards / Grants (as PI)
2021 – 2024 ANR PRCE, "BrainNet" project. 144 k€ / 643 k€ in total.
2021 – 2024 ANR PRCE, "DeepSee" project. 158 k€ / 711 k€ in total.

He is also one of the authors of the following paper, posted previously (attached), in which they consider Akida a good fit.

StereoSpike: Depth Learning With a Spiking Neural Network
Ulysse Rançon, Javier Cuadrado-Anibarro, Benoit R. Cottereau and Timothée Masquelier

StereoSpike DVS Study Vid Snip.png


I also found a Mar 2023 presso on Edge AI by LEAT & Université Côte d'Azur (attached). No mention of us unfortunately, but there's good info in there about their thinking and other partners including Valeo, Prophesee etc., and we know they are using Akida in the DeepSee project as one of the tools.
 

Attachments

  • Edge_AI_Miramond2023_CERN.pdf
    4.7 MB · Views: 152
  • masquelier_CV.pdf
    307.1 KB · Views: 108
  • StereoSpike_Depth_Learning_With_a_Spiking_Neural_Network.pdf
    1.7 MB · Views: 164
Reactions: 60 users

Beebo

Regular
Doesn't matter how many times you (and others) say it. When someone doesn’t want to hear it, you will never change their mind.
Sean Hehir himself said there will be a handful of leaders in the edge AI market, and BRN aims to be one of those leaders.

As well, he said our successes will start in the power-constrained edge environments.

The power-constrained edge market alone is enormous!

IT’S COMING!
 
Reactions: 38 users

Diogenese

Top 20
From Jan 2019

BrainChip Partners with SoftCryptum to Deliver AI-powered Video Analytics to Government Agencies in European Countries

Sorry, I don't remember ever seeing this. Has anyone else seen it before, or does anyone have any other info regarding SoftCryptum? Sorry if already posted. Should this be on the list FactFinder provided the other day, or is it there and I missed it because it's a bloody long list? 😁


Hi Rach,

This was the pre-Akida BrainChip Studio software NN, which enabled searching 16 channels of video at the same time.

It was paired with a hardware FPGA accelerator, BrainChip Accelerator from memory.

We haven't heard whether either is still in use.
 
Reactions: 20 users

Cardpro

Regular
Hi Rach,

This was the pre-Akida BrainChip Studio software NN, which enabled searching 16 channels of video at the same time.

It was paired with a hardware FPGA accelerator, BrainChip Accelerator from memory.

We haven't heard whether either is still in use.

From Jan 2019

BrainChip Partners with SoftCryptum to Deliver AI-powered Video Analytics to Government Agencies in European Countries

Sorry, I don't remember ever seeing this. Has anyone else seen it before, or does anyone have any other info regarding SoftCryptum? Sorry if already posted. Should this be on the list FactFinder provided the other day, or is it there and I missed it because it's a bloody long list? 😁


Email from Tony a while back.

"I've been in touch with my colleagues in the US to ascertain the status of our relationship with SoftCryptum.



Based on their advice, it would appear that when we discontinued selling the BrainChip Studio product, we our relationship with SoftCryptum ceased.



We do not have any current relationship with SoftCryptum."
 
Reactions: 23 users

Labsy

Regular
Morning all Chippers... It's a glorious day.... Hope everyone has a good one and an even better next couple months 😉 😉😉😁👌🫵🫵🚀🚀🚀
 
Reactions: 32 users

Rach2512

Regular
Hi Rach,

This was the pre-Akida BrainChip Studio software NN, which enabled searching 16 channels of video at the same time.

It was paired with a hardware FPGA accelerator, BrainChip Accelerator from memory.

We haven't heard whether either is still in use.


Thanks Diogenese, could it be a case of: if they liked us then, they're going to absolutely LOVE us now? Sorry, I'm not at all technical and so am very thankful to all of you that are 😍🤩 💖.
 
Reactions: 16 users

Tony Coles

Regular
Shorts are increasing and they're turning into trousers soon. 🤬 What the hell is happening?
 
Reactions: 5 users

Cardpro

Regular
Shorts are increasing and they're turning into trousers soon. 🤬 What the hell is happening?
Unfortunately, nothing is happening, so they are taking their chances :(

Hopefully there will be an announcement of the next-gen Akida, followed by announcements of new IP contracts...
 
Reactions: 12 users

Vladsblood

Regular
Shorts are increasing and they're turning into trousers soon. 🤬 What the hell is happening?
Shorters love the sound of silence... that's their tune when they go to work on a company, Tony. Vlad.
 
Reactions: 10 users

HopalongPetrovski

I'm Spartacus!
Judging by what the company has told us in the latest 4C, the steadily increasing volume of commentary regarding AI and its movement towards the Edge, sensor fusion and the IoT, and the known and supposed partnerships and alliances quietly revealed over the past 6-12 months, a lot is happening that is below the event horizon of an ASX price-sensitive announcement and not currently registering on the share price needle.

We also know that the revelation of same is as deeply desired by Antonio, Sean and the rest of our management as it is by us.
Not only is a significant portion of their fortunes depending on it, but also their professional reputations.

Lots and lots of pots on the boil, but in virtually all cases the exact timing of the release of this information is in the hands and at the discretion of third parties.
For in most cases we are an enabler rather than a product in and of ourselves.
We help to make the sensor smart, reducing the latency, load and bandwidth required to get the job done.
We sit between the cloud-based server and the onboard intelligence, improving its functionality and providing added privacy and energy efficiency to boot.
Product cycles take time to implement, and time is also required and desired by initiators to extract value from previously invested-in tech.
Our time is coming, but like adults driving the car we know the journey takes as long as it takes, and no amount of "are we there, yetting" from inexperienced and impatient minds will speed the process.
We all want our cookies now, but some know that a touch more delayed gratification may result in that magical experience of both having one's cake and eating it too.
Coming up on 8 years in for me, gradually building my position, watching others come and go, so believe me, I too am subject to the angst, anxiety and impatience many here are feeling, but I also think we are on the right track and I don't wanna be that guy that sold Apple. 🤣
No advice of course and DYOR, but even after all this time it feels closer to me, every day.
GLTAH
 
Reactions: 89 users

Slade

Top 20
Reactions: 109 users

Quiltman

Regular
Reactions: 64 users
Me politely encouraging MB....c'mon....just make it official pls :LOL:

signing.gif
 
Reactions: 57 users

Mea culpa

prəmɪskjuəs
Judging by what the company has told us in the latest 4C, the steadily increasing volume of commentary regarding AI and its movement towards the Edge, sensor fusion and the IoT, and the known and supposed partnerships and alliances quietly revealed over the past 6-12 months, a lot is happening that is below the event horizon of an ASX price-sensitive announcement and not currently registering on the share price needle.

We also know that the revelation of same is as deeply desired by Antonio, Sean and the rest of our management as it is by us.
Not only is a significant portion of their fortunes depending on it, but also their professional reputations.

Lots and lots of pots on the boil, but in virtually all cases the exact timing of the release of this information is in the hands and at the discretion of third parties.
For in most cases we are an enabler rather than a product in and of ourselves.
We help to make the sensor smart, reducing the latency, load and bandwidth required to get the job done.
We sit between the cloud-based server and the onboard intelligence, improving its functionality and providing added privacy and energy efficiency to boot.
Product cycles take time to implement, and time is also required and desired by initiators to extract value from previously invested-in tech.
Our time is coming, but like adults driving the car we know the journey takes as long as it takes, and no amount of "are we there, yetting" from inexperienced and impatient minds will speed the process.
We all want our cookies now, but some know that a touch more delayed gratification may result in that magical experience of both having one's cake and eating it too.
Coming up on 8 years in for me, gradually building my position, watching others come and go, so believe me, I too am subject to the angst, anxiety and impatience many here are feeling, but I also think we are on the right track and I don't wanna be that guy that sold Apple. 🤣
No advice of course and DYOR, but even after all this time it feels closer to me, every day.
GLTAH
As always I look forward to reading your comprehensive and articulate posts. Holding for eight years shows foresight, confidence and patience. Some years ago, on some other forum I read that if you had been a shareholder for more than twelve months you were considered a long-term holder. I’d been holding for 18 months, so I was chuffed to learn I was now an LTH. I’m now in my sixth year.

A few days ago, I completed another orbit of the sun and am well into my geriatricacy. Often, I remind myself of the comment from the sage of BRN discussion, the Fact Finder, who described his investment as inter-generational. This has been my purpose also, for my children and grandchildren.

Cheers Hoppy.
 
Reactions: 52 users

Terroni2105

Founding Member
Reactions: 31 users

HopalongPetrovski

I'm Spartacus!
“Mercedes has experimented with a new type of processor that performs tasks in “neuromorphic spikes”

Yes Slade, very promising.
And hopefully it gives us a more sustained boost, along with the revenue and brand association.
Great find. A September release is just around the corner and would be a lovely entrée to a follow-up from another big fish for Christmas this year. 🤣
 
Reactions: 23 users

TECH

Regular
“Mercedes has experimented with a new type of processor that performs tasks in “neuromorphic spikes”


Love your work Slade ❤️... we all know that Mr Mercedes is a client, please stop questioning that fact (not you, Slade'o). I'd say they are one of our most powerful (right word?) NDAs... we are in such a brilliant space, timing wise... the data centres, as we all know, are under ever-increasing pressure from so many sides. What do you mean, Tech?

Just use your imagination: we are talking Mr John von Neumann dying a slow death from heat exhaustion, power exhaustion, bandwidth exhaustion; he's just becoming, overall, slower, hotter and running out of energy. Greenies worldwide should be pumping companies like ours; not only are we a BENEFICIAL AI company, we are an ESSENTIAL AI company... Brainchip is well and truly doing its part in trying to help (save) our planet, OUR HOME!

We are in such a powerful position at present. Peter's technology is cutting edge, still well beyond common understanding. Is the uptake fast? No, exactly... I wonder why, young Einsteins??

Do you truly believe that we (Brainchip) are moving further ahead, or "just" maintaining our lead? Whatever your answer is to yourself, the fact remains: it's ahead, and still trying to be understood as real by the majority... if Brainchip can't succeed, well, in my opinion, none of the rest will either.

Good evening from the Far North of Kiwiland..... Tech 🖐
 
Reactions: 53 users