BRN Discussion Ongoing

Eeehhhh, the price? 🄓
PCIe boards, I believe ... e.g. if purchasing more than 10, contact BRN.
 
  • Like
Reactions: 3 users

Diogenese

Top 20
M.2 card now available to purchase from the Brainchip Shop:


View attachment 75860
This is great news.

It comes with a number of different board configurations ... where are the chips coming from?
 
  • Like
  • Thinking
  • Love
Reactions: 33 users
Was looking at BH (Bascom Hunter) and RTX searches again.

This PDF came up from the Navy (NAVAIR), and it appears BH had been working on the SNAP card prior to 2023, as this update was dated July 2023.

It indicates how long dev and testing cycles can be.

The Navy definitely thought highly of it at that point.

Looking good.


View attachment 75859
Maybe I'm stating the obvious, but those are our AKIDA chips on the CGI render of the SNAP board 😉

20250113_033532.jpg

Screenshot_20250113-033617_Firefox.jpg
 
  • Like
  • Fire
  • Love
Reactions: 30 users

RobjHunt

Regular
This is great news.

It comes with a number of different board configurations ... where are the chips coming from?
Great minds think alike.
 
  • Like
Reactions: 2 users
This is great news.

It comes with a number of different board configurations ... where are the chips coming from?
Does the B_ _ number relate to the batch number?

Because 2 of the photos for the M.2 show a silver B02 and 1 shows B00 (both ES, which means Engineering Samples?).

The PCIe card has a gold AKD1000 with no ES after it (B00), but we know that @Humble Genius's PCIe card had an AKD1000-ES B01 (Bascom Hunter is also using the ES B01).

So if it's "batches", that would mean 3 ES runs and 1 "production run" that we know of?

How many runs do we actually know occurred?
I can only recall 3 (the first, which worked well, the 2nd, which improved on it, and then the one production run).

Or do we only know of the first 2?

(Just concerning TSMC runs).
 
Last edited:
  • Like
  • Thinking
  • Fire
Reactions: 14 users

TECH

Regular
As at 30/9/24 we had 7.7 quarters of cash in the bank - almost 2 years. Cash is not an issue at all.
As we have no pressing need at all for immediate cash, it appears that we are expecting a major deal before 30 June 2025, which is the last day for drawing the LDA funds.
Pico and the M.2 format were developed in response to clients' requests/needs. We also know, for example, that Tata intends to drive AKIDA into its products.
We know NDAs have been in place for some time. Plenty of engagements.
With 2 new products (Pico and M.2) recently released, plus our new Edge Box partnerships, I doubt the raise is to produce another product so soon.
IMO it's more likely connected to a deal. Why else would we raise cash when there is no pressing need?

Hi Manny,

The point you have raised is one I have pondered over as well. Let's all be honest, the LDA deal came out of the blue; I quite frankly thought we had finished with LDA. To close out the last deal we even bought back the outstanding shares for around $0.03 each (from memory). Whilst having access to 140 million AUD, and to date having drawn down only 68 million AUD, it has obviously been a good avenue to funds, despite the selling off of shares and the dilution of shareholders' holdings.

So, in my opinion, we are either going to do a wafer run, which goes against all the business changes that have recently taken place, or we are partnering up with a major player and need to pump in our agreed share?

A deal involves more than one party, so I don't believe we are going solo, and our Board would have weighed up this recent LDA agreement and come to the conclusion that whatever is going to be going down over the next few months is worth taking the risk. Add to that thought that Sean has stated some news around the Akida development would be shared over the coming months, so interesting times indeed.

Regards......Tech.
 
  • Like
  • Fire
  • Thinking
Reactions: 50 users

Shadow59

Regular

Akida™ M.2 Card


$249.00
Purchasing greater than 10 PCIe boards?
Contact sales@brainchip.com
Systems specifications may vary based on component availability. Images are proofs and may have a slight variation to the finished product. Shipment is estimated 7 days from the time of purchase. Delivery times may fluctuate based on supply chain availability.

They must already have some fairly readily available but need funding for manufacture of more. Hence the fund raise.
 
  • Like
  • Fire
  • Love
Reactions: 38 users

Bravo

If ARM was an arm, BRN would be its biceps 💪!
This is pretty cool when you consider it in terms of our existing relationship with Penn State. For example, Abhronil Sengupta (not mentioned in this article) is an Associate Professor at Penn State. He has also been involved in the cyber-neuro RT project, along with other authors from Quantum Ventura.



$1.23M NASA grant to support improving satellite weather forecasting with AI​

Satellite image of hurricane Patricia showing the boundary layer.

Penn State researchers will use a grant from NASA to improve atmosphere and ocean forecasts by incorporating AI and satellite data into current forecasting models. Credit: NASA. All Rights Reserved.
January 8, 2025
By Mary Fetzer

UNIVERSITY PARK, Pa. - A team led by a Penn State College of Information Sciences and Technology (IST) researcher received a two-year, $1.23 million grant from NASA to improve atmosphere and ocean forecasts by incorporating artificial intelligence (AI) and satellite data into current forecasting models.
"Typically, forecasts of the atmosphere and oceans require data assimilation - combining different sources of information about the weather to obtain a more accurate result," said Romit Maulik, assistant professor in the College of IST. "However, that data assimilation can slow down the forecast time significantly. We plan to use computer vision to dramatically accelerate this process."
Computer vision is a form of artificial intelligence that uses machine learning and neural networks to teach computers to understand and interpret visual information data and to learn from that data to improve their performance.
The research team - which includes Steven Greybush, associate professor of meteorology in the Penn State College of Earth and Mineral Sciences, and scientists from Argonne National Laboratory, NASA Goddard Space Flight Center, the National Oceanic and Atmospheric Administration and the University of Chicago - plans to introduce various sources of data, such as satellite images, to build on past weather forecasting that used transformer-based AI algorithms and machine learning models.
"The work will involve retraining some portions of our model to take these new datasets as inputs and improve predictions," Maulik said. "Then, we will integrate these improved algorithms into the NASA Goddard Earth Observing System so it can rapidly incorporate satellite system observations into its operational data assimilation workflows."
Maulik and Greybush are also co-hires of the Penn State Institute for Computational and Data Sciences.
Last Updated January 8, 2025

 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 48 users

AARONASX

Holding onto what I've got
This is pretty cool when you consider it in terms of our existing relationship with Penn State. For example, Abhronil Sengupta is an Associate Professor at Penn State. He has also been involved in the cyber-neuro RT project, along with other authors from Quantum Ventura.



$1.23M NASA grant to support improving satellite weather forecasting with AI​


...and Penn State is a current university participant under BrainChip:

1736726598382.png
 
  • Like
  • Fire
Reactions: 35 users

7fĆ¼r7

Top 20
I would like to introduce the all-new BrainChip sneakers, in collaboration with Nike ... for the upcoming run! Don't miss the opportunity!

IMG_9473.jpeg
 
  • Haha
  • Like
  • Fire
Reactions: 20 users

RobjHunt

Regular
I would like to introduce the all-new BrainChip sneakers, in collaboration with Nike ... for the upcoming run! Don't miss the opportunity!

View attachment 75874
Just missing the BrainChip logo somewhere, otherwise I'd be happy running alongside Bravo in them 😉👌
 
  • Like
  • Haha
  • Fire
Reactions: 6 users

RobjHunt

Regular
Just missing the BrainChip logo somewhere, otherwise I'd be happy running alongside Bravo in them 😉👌
Maybe just below BrainChip on the top part of the tongue.

Pantene Peeps 😉
 
  • Like
Reactions: 1 users

Diogenese

Top 20
...and Penn State is a current university participants under Brainchip:

View attachment 75873
With 8 unis involved, there could be well above 100 engineers a year emerging with Akida expertise and taking this unique skill to their new employers. The early graduates should be well sought after as the AI snowball gathers momentum.
 
  • Like
  • Fire
  • Love
Reactions: 45 users

Diogenese

Top 20
Just missing the BrainChip logo somewhere, otherwise I'd be happy running alongside Bravo in them 😉👌
They are keeping the spikes for the track shoes.
 
  • Haha
  • Like
  • Fire
Reactions: 20 users
Here is a presentation from the University of Virginia on a research project using Akida called AGESCOPE, an age-based recommendation system. From what I can gather, they used Akida to predict people's ages from images and found it to be extremely accurate. It is about 10 minutes long, and the audio is not the best...


@TheDrooben

Further to your find on the project by the Uni of Virginia, below is some more info from GitHub, which appears to be one and the same project.


Project Title​

Age-Based Recommendation System Using BrainChip Akida (Neuromorphic Architecture)

Project Description​

This project aims to develop an AI-powered age-based recommendation system leveraging BrainChip Akida, a neuromorphic processor designed for energy-efficient and hardware-optimized inference. The system uses age estimation from facial images as a key input to recommend personalized content, products, or services tailored to specific age groups.

Benefits​

Energy Efficiency: Neuromorphic inference with low power consumption.
Scalability: Adaptable across datasets and industries.
User Experience: Enhanced personalization through robust recommendations.

Conclusion​

This project successfully demonstrated the development of an age-based recommendation system by leveraging the BrainChip Akida neuromorphic processor. Through a structured approach combining deep learning, quantization, neuromorphic computing, and efficient deployment techniques, we achieved several key objectives:

Accurate Age Prediction:
By training and optimizing a VGG-based model on the UTKFace dataset, we developed a reliable age estimation system with high accuracy and low Mean Absolute Error (MAE). The use of data augmentation, hyperparameter optimization, and quantization further enhanced robustness and generalization across diverse demographics.

Energy-Efficient Inference:
The conversion of the quantized Keras model to the Akida format demonstrated the power and efficiency of neuromorphic computing. Akida's event-driven architecture achieved real-time, low-power inference while maintaining comparable accuracy to traditional AI models.

Seamless Model Conversion:
The integration of the CNN-to-SNN framework allowed for smooth conversion of traditional models to Akida-compatible spiking models, showcasing the practicality of deploying advanced neuromorphic solutions on edge devices.

Real-Time Edge Deployment:
By implementing the system on an edge device, we demonstrated its ability to perform real-time age predictions and generate personalized recommendations dynamically. This highlights the feasibility of deploying AI solutions in resource-constrained environments like IoT devices and embedded systems.

Scalability and Versatility:
The modular design and scalability of the system make it adaptable for various industries, including healthcare, retail, and entertainment. Future extensions could integrate additional features like emotion recognition or gender-based personalization for enhanced user experiences.
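For anyone curious what that pipeline looks like in code, here is a minimal, hypothetical sketch: a small VGG-style Keras regressor trained on placeholder arrays standing in for UTKFace crops, with the MetaTF quantization/conversion step left as hedged comments. The model layout, dummy data and the commented conversion calls are my assumptions, not the project's actual code.

```python
import numpy as np
from tensorflow.keras import layers, models

# Placeholder data standing in for preprocessed UTKFace face crops.
# (In UTKFace the age label is encoded in the filename; random arrays
# keep this sketch self-contained and runnable.)
x_train = np.random.rand(256, 64, 64, 3).astype("float32")
y_train = np.random.uniform(1, 90, size=(256, 1)).astype("float32")  # ages

def build_vgg_style_age_model(input_shape=(64, 64, 3)):
    """Small VGG-style CNN that regresses a single age value."""
    inputs = layers.Input(shape=input_shape)
    x = inputs
    for filters in (32, 64, 128):  # stacked conv blocks, VGG style
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
        x = layers.MaxPooling2D()(x)
    x = layers.Flatten()(x)
    x = layers.Dense(128, activation="relu")(x)
    outputs = layers.Dense(1)(x)  # predicted age (regression head)
    return models.Model(inputs, outputs, name="vgg_age_regressor")

model = build_vgg_style_age_model()
model.compile(optimizer="adam", loss="mae", metrics=["mae"])  # MAE, as in the write-up
model.fit(x_train, y_train, epochs=2, batch_size=32, verbose=1)

# --- Akida conversion (sketch only, not verified against a device) -------
# The project quantizes the trained Keras model and converts it with the
# CNN-to-SNN framework; the quantization API differs between MetaTF
# releases, so only the documented conversion entry point is indicated:
#
#   from cnn2snn import convert
#   akida_model = convert(quantized_model)   # Akida-compatible spiking model
#   akida_model.map(device)                  # map onto an AKD1000 (per docs)
#   ages = akida_model.predict(face_batch)   # low-power, event-driven inference
```

The real project presumably layers data augmentation and hyperparameter tuning on top of this, per the conclusion above.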
 
  • Like
  • Fire
  • Love
Reactions: 24 users

manny100

Regular
Hi Manny,

The point you have raised is one I have pondered over as well. Let's all be honest, the LDA deal came out of the blue; I quite frankly thought we had finished with LDA. To close out the last deal we even bought back the outstanding shares for around $0.03 each (from memory). Whilst having access to 140 million AUD, and to date having drawn down only 68 million AUD, it has obviously been a good avenue to funds, despite the selling off of shares and the dilution of shareholders' holdings.

So, in my opinion, we are either going to do a wafer run, which goes against all the business changes that have recently taken place, or we are partnering up with a major player and need to pump in our agreed share?

A deal involves more than one party, so I don't believe we are going solo, and our Board would have weighed up this recent LDA agreement and come to the conclusion that whatever is going to be going down over the next few months is worth taking the risk. Add to that thought that Sean has stated some news around the Akida development would be shared over the coming months, so interesting times indeed.

Regards......Tech.
As we were cashed up prior to the LDA raise, the very worst-case scenario is that they are putting in place a safety net against a recession or an AI crash that may be coming.
Worst case, IMO, is a safety net.
We have circa 15.5% to 16.5% of SOI held by insiders.
They will ensure that BRN remains 'safe'.
I think the LDA funding is more likely to be for something like a deal of some sort, given that Sean and Tony V have been madly accumulating within the rules.
 
  • Like
  • Thinking
Reactions: 11 users

7fĆ¼r7

Top 20
Just missing the BrainChip logo somewhere, otherwise Iā€™d be happy running alongside Bravo in them šŸ˜‰šŸ‘Œ
Yes, I wanted to, but it somehow doesn't seem to work.
 
  • Sad
  • Thinking
Reactions: 2 users

Plebby

Member
As we were cashed up prior to the LDA raise, the very worst-case scenario is that they are putting in place a safety net against a recession or an AI crash that may be coming.
Worst case, IMO, is a safety net.
We have circa 15.5% to 16.5% of SOI held by insiders.
They will ensure that BRN remains 'safe'.
I think the LDA funding is more likely to be for something like a deal of some sort, given that Sean and Tony V have been madly accumulating within the rules.
Hi Manny,

Just hoping you could shed some light on Sean + Tony's accumulation? Where is that disclosed, and what are the rules surrounding internal share purchases?

Cheers
 
  • Like
Reactions: 2 users

7fĆ¼r7

Top 20
Just missing the BrainChip logo somewhere, otherwise I'd be happy running alongside Bravo in them 😉👌
This one is also nice ... unfortunately, again he was not willing to add the logo ...

IMG_9477.jpeg
 
  • Like
Reactions: 2 users

Guzzi62

Regular
From the other site:


BrainChip unveils "brain-inspired" AI gadget for real-time AI processing


by FutureCIO Editors

January 10, 2025


Edge AI on-chip processing and learning company BrainChip Holdings Ltd has launched a compact, portable computation device to accelerate AI processing for various sectors such as manufacturing, warehouse, retail, hospitals, energy, automotive, and aviation.


Named Akida Edge AI Box, it claims to mimic the brain's learning ability through event-based neural processing. This approach reduces the need for consistent cloud connectivity, enhancing data security and efficiency while lowering processing costs.


The AI gadget is powered by dual Akida processors, which enable on-chip learning independent of the cloud. The design targets enhancing application security, reducing training overhead and speeding up learning.


Through a partnership with electronics innovation and manufacturing company VVDN Technologies, users and original equipment manufacturers (OEMs) can develop custom versions of the product for large-scale commercial applications.


Real world applications


The Akida Edge AI Box is already being applied in various real-world scenarios such as edge AI model training and development, gesture recognition, climate forecasting, model evaluation, cybersecurity enhancement, and computer vision analysis.



"The Akida Edge Box is a great platform for running AI in standalone edge environments where footprint, cost and efficiency are critical while not compromising performance," said Sean Hehir, BrainChip CEO.


Hehir expressed his excitement about rolling out even more use cases with other partners who seek to develop edge AI for their customers' specific use cases. He adds that the company looks forward to "the ideas these companies will bring to life with the Akida Edge AI Box."

 
  • Like
  • Love
  • Fire
Reactions: 24 users