Thanks for posting, @7für7!
If there were any doubts about the strength of the partnership between BrainChip and Edge Impulse, I think they can now be put to rest. This is a recurring theme throughout the podcast, with Spencer Huang also repeatedly emphasizing how BrainChip enables innovations that were previously impossible—"making science fiction a reality," as he states toward the end.
I like how Spencer talked about smartwatches predicting health conditions before they happen, without needing to recharge the watch for months! Running highly efficient, ultra-low-power edge AI models directly on devices was described as the shared goal.
I also loved the analogy that the collaboration between BrainChip and Edge Impulse is like being in a kitchen, where Edge Impulse provides the cutlery, or the tools, and BrainChip provides the ingredients, or the truffles!
Can't wait to find out more about the use cases Spencer describes when discussing how Akida is pushing the boundaries of what is possible in edge AI: smartwatches (as mentioned above), traffic lights that are self-aware and able to regulate traffic autonomously, and anomaly prediction in industrial settings. Spencer highlights a key challenge, stating,
“There are limitations in today’s non-neuromorphic architectures.” I took this to mean that traditional architectures lack the capability to efficiently support these kinds of advanced functionality.
Towards the end of the podcast, Spencer mentions that in 2025 Edge Impulse is going to be leveraging a lot more foundation models and focusing on vision, language, and LLMs. He talked about BrainChip's demo running LLMs in the suite and said he wants to make sure Edge Impulse has the proper tool sets ready for developers, because he knows these use cases are coming!