BRN Discussion Ongoing

Frangipani

Regular
Was just about to go to bed when I saw this video on YouTube, recorded on July 4th. Quickly scrolled through the slides and screenshotted some, but haven’t listened to the whole webinar, which was jointly hosted by the Centers for Cybersecurity and AI Research and the School of Electrical Engineering and Computer Science at the University of North Dakota College of Engineering and Mines…

Dr. Venkata Sriram Nadendla from Missouri S&T was presenting on
EEG-based SNNs for Braking Intent Detection on Neuromorphic Hardware




[Webinar slide screenshots attached]

Nice :)

Paper HERE


[Submitted on 21 Jul 2024]

Few-Shot Transfer Learning for Individualized Braking Intent Detection on Neuromorphic Hardware

Nathan Lutes, Venkata Sriram Siddhardh Nadendla, K. Krishnamurthy

Objective: This work explores the use of a few-shot transfer learning method to train and implement a convolutional spiking neural network (CSNN) on a BrainChip Akida AKD1000 neuromorphic system-on-chip for developing individual-level, instead of traditionally used group-level, models using electroencephalographic data. The efficacy of the method is studied on an advanced driver assist system (ADAS) related task of predicting braking intention. Main Results: The methodology is shown to develop individual-specific braking intention predictive models by rapidly adapting the group-level model in as few as three training epochs while achieving at least 90% accuracy, true positive rate and true negative rate. Further, results show an energy reduction of over 97% with only a 1.3x increase in latency when using the Akida AKD1000 processor for network inference compared to an Intel Xeon CPU. Similar results were obtained in a subsequent ablation study using a subset of five out of 19 EEG channels.

Significance: Especially relevant to real-time applications, this work presents an energy-efficient, few-shot transfer learning method that is implemented on a neuromorphic processor capable of training a CSNN as new data becomes available, operating conditions change, or to customize group-level models to yield personalized models unique to each individual.

5. Conclusion
The results show that the methodology presented was effective in developing individual-level models, deployed on a state-of-the-art neuromorphic processor, with predictive abilities for ADAS-relevant tasks, specifically braking intent detection.

This study explored a novel application of deep SNNs to the field of ADAS using a neuromorphic processor by creating and validating individual-level braking intent classification models with data from three experiments involving pseudo-realistic conditions. These conditions included cognitive atrophy through physical fatigue and real-time distraction, and provided braking imperatives via the commonly encountered visual stimulus of traffic lights. The method presented demonstrates that individual-level models could be quickly created with a small amount of data, achieving greater than 90% scores across all three classification performance metrics in a few shots (three epochs) on average for both the ACS and FCAS. This demonstrated the efficacy of the method for different participants operating under non-ideal conditions and using realistic driving cues, and further suggests that a reduced data acquisition scheme might be feasible in the field.

Furthermore, the applicability to energy-constrained systems was demonstrated through comparison of the inference energy consumed with a very powerful CPU, in which the Akida processor offered energy savings of 97% or greater. The Akida processor was also shown to be competitive in inference latency compared to the CPU. Future work could include implementation of the method presented on a larger number of participants, other neuromorphic hardware, different driving scenarios, and in real-world settings where individual-level models are created by refining previously developed group-level models in real time.
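
For anyone wondering what the few-shot adaptation actually looks like in code: the post doesn't include the authors' implementation, but the recipe the abstract describes (take a pretrained group-level model, freeze most of it, then fine-tune on a handful of an individual's EEG windows for about three epochs) goes roughly like the PyTorch-style sketch below. All names here (the `classifier` head attribute, `adapt_to_individual`, etc.) are invented for illustration; the paper's on-chip version would additionally be quantized and converted for the AKD1000 with BrainChip's MetaTF tooling.

```python
# Hypothetical sketch of the few-shot adaptation recipe described in the
# abstract -- NOT the authors' code. Assumes a pretrained group-level
# CSNN-style classifier with a `classifier` head attribute, plus a small
# set of one individual's labelled EEG windows.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def adapt_to_individual(group_model: nn.Module,
                        eeg_windows: torch.Tensor,  # (N, channels, time)
                        labels: torch.Tensor,       # (N,) 0 = no brake, 1 = brake
                        epochs: int = 3,            # "as few as three training epochs"
                        lr: float = 1e-3) -> nn.Module:
    # Freeze the feature-extraction trunk learned from the group data...
    for p in group_model.parameters():
        p.requires_grad = False
    # ...and fine-tune only the final classification head on the individual.
    for p in group_model.classifier.parameters():  # assumed attribute name
        p.requires_grad = True

    loader = DataLoader(TensorDataset(eeg_windows, labels),
                        batch_size=16, shuffle=True)
    opt = torch.optim.Adam(group_model.classifier.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()

    group_model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(group_model(x), y)
            loss.backward()
            opt.step()
    return group_model  # now an individual-level model
```

The reported energy and latency numbers then come from running inference with the converted model on the AKD1000 versus the Xeon CPU.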

The above paper by Missouri S&T (Missouri University of Science and Technology) researchers, titled “Few-shot transfer learning for individualized braking intent detection on neuromorphic hardware” (using the AKD1000) and previously submitted as a preprint on arxiv.org, is now available as a Journal of Neural Engineering Accepted Manuscript:


[Screenshots of the accepted manuscript attached]

First author Nathan Lutes has since completed his PhD at MST and - according to his LinkedIn profile - appears to still be working for Boeing as a Guidance Navigation and Control Engineer (since 2021).

[LinkedIn profile screenshots attached]
 
  • Like
  • Fire
  • Love
Reactions: 29 users

charles2

Regular
What do you think about One Stop Systems / OSS compared to BrainChip?

Well clearly OSS has edge envy and aims to transform its business in that direction. And has all the defense/military contacts that BRN could dream of. And now a footprint in Europe.

Up to now they have had difficulty impressing Wall Street as their market cap and revenue are about equal. Really! New management recognizes this and is emphasizing the opportunity in ML, edge and all our usual buzzwords. They see their addressable market as HUGE. (Sound familiar?)

So much of what they specialize in could be enhanced by Akida......even catapulted.

Anyway in my naivety after discovering OSS in the past month, I purchased 5000 shares...so I am talking my book.

And looking for a secret omen....their headquarters are located where my grandsons learned to play ice hockey... in Escondido. FYI...Escondido means hidden or secret in Spanish. See it all fits.

So if OSS and BRN don't know of each other....well it beats me.

And yet, maybe I've got it all wrong.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 16 users

Flenton

Regular
Hey Flenton!

“Hey Mercedes!” has actually been around since 2018.
Implementing the Vision EQXX’s voice control on our neuromorphic chip three years later made it five to ten times more energy-efficient than conventional voice control, but the wake-up words as such that activate the voice assistant are unrelated to Akida. Also think of Apple’s “Hey, Siri!”


[Screenshot attached]
I did not know that, and I've never used an Apple device or paid attention to anyone using Siri, so I didn't realise they say “hey” as well.
Thanks for that info. I won't draw my very, very long bow next time I see “Hey, something”.
 
  • Like
Reactions: 4 users

Frangipani

Regular
I did not know that, and I've never used an Apple device or paid attention to anyone using Siri, so I didn't realise they say “hey” as well.
Thanks for that info. I won't draw my very, very long bow next time I see “Hey, something”.

Hey, you never know - you might just miss a good dot join!
I didn’t mean to discourage you from digging deeper when you come across any type of “Hey, xyz!” voice assistant, especially if it is a soon-to-be-launched product promoted as being suspiciously low-energy - I just wanted to clarify that the words as such are not linked to Akida.
(On a side note, Apple users with newer operating systems have been able to drop the ‘hey’ from ‘Hey Siri’ since mid-2023).

However, in the case of the Ray-Ban Meta smart glasses, the article you linked says even though they only hit Australian shores in late 2024, they had already been launched overseas in 2023. So if BrainChip were involved, we would have found out by now.
 
  • Like
  • Love
Reactions: 16 users

HopalongPetrovski

I'm Spartacus!
Well clearly OSS has edge envy and aims to transform its business in that direction. And has all the defense/military contacts that BRN could dream of. And now a footprint in Europe.

Up to now they have had difficulty impressing Wall Street as their market cap and revenue are about equal. Really! New management recognizes this and is emphasizing the opportunity in ML, edge and all our usual buzzwords. They see their addressable market as HUGE. (Sound familiar?)

So much of what they specialize in could be enhanced by Akida......even catapulted.

Anyway in my naivety after discovering OSS in the past month, I purchased 5000 shares...so I am talking my book.

And looking for a secret omen....their headquarters are located where my grandsons learned to play ice hockey... in Escondido. FYI...Escondido means hidden or secret in Spanish. See it all fits.

So if OSS and BRN don't know of each other....well it beats me.

And yet, maybe I've got it all wrong.
Looks very impressive.
Kinda like what I hope for with BrainChip.
Couldn't find much in the way of specifications though, and from the pictures it looks like the latest iteration of current tech.
Nothing wrong with having shares in up-and-coming current solutions along with what may eventually replace a lot of it.
Still need to live whilst waiting for the commercial uptake of the next generation.
Hopefully we have backed the right horse here with neuromorphic design.
It's taking longer to get traction than I assumed it would.
I hope we haven't missed the boat and have got enough of a toe in at the doors that matter.
 
  • Like
  • Love
  • Fire
Reactions: 14 users

manny100

Regular
Well clearly OSS has edge envy and aims to transform its business in that direction. And has all the defense/military contacts that BRN could dream of. And now a footprint in Europe.

Up to now they have had difficulty impressing Wall Street as their market cap and revenue are about equal. Really! New management recognizes this and is emphasizing the opportunity in ML, edge and all our usual buzzwords. They see their addressable market as HUGE. (Sound familiar?)

So much of what they specialize in could be enhanced by Akida......even catapulted.

Anyway in my naivety after discovering OSS in the past month, I purchased 5000 shares...so I am talking my book.

And looking for a secret omen....their headquarters are located where my grandsons learned to play ice hockey... in Escondido. FYI...Escondido means hidden or secret in Spanish. See it all fits.

So if OSS and BRN don't know of each other....well it beats me.

And yet, maybe I've got it all wrong.
OSS looks good, but they say low or extremely low latency.
AKIDA is real time, and I assume they are not event-based as we are.
That may be what sets us apart?
 
  • Like
  • Love
Reactions: 6 users

Frangipani

Regular
More details about the AIS 2024 Event-Based Eye Tracking Challenge in April, in which the BrainChip “bigBrains” team consisting of Yan Ru (Rudy) Pei, Sasskia Brüers Freyssinet, Sébastien Crouzet (who has since left our company), Douglas McLelland and Olivier Coenen came in third, narrowly trailing two teams from the University of Science and Technology of China:


[Challenge result screenshots attached]

Remember the AIS 2024 Event-Based Eye Tracking Challenge in April, in which the BrainChip “bigBrains” team consisting of Yan Ru (Rudy) Pei, Sasskia Brüers Freyssinet, Sébastien Crouzet (who has since left our company), Douglas McLelland and Olivier Coenen came in third overall?

I just discovered this on GitHub: TENNs-Eye, “a lightweight spatio-temporal network for online eye tracking with event camera, belonging to the class of TENNs (Temporal Neural Networks) models by BrainChip.”


[GitHub screenshots attached]
 
  • Like
  • Fire
  • Love
Reactions: 38 users

Frangipani

Regular
Sounds like a bit of a game changer.
They should have called it BIGGUS DICKUS instead of CENTAURUS :rolleyes:

"Centaurus is a deep SSM that allows for flexible channel connectivity (much like CNNs), which is enabled by optimal tensor contractions. It is an extension of our aTENNuate network, now adapted for several audio tasks such as keyword spotting, denoising, and ASR. Unlike Mamba however, it does not use data gated SSM matrices, yet" 😉


This is from Rudy Pei's LinkedIn post:
"As demonstrated, the Centaurus network is SOTA across many real-time audio tasks, including keyword spotting, denoising, and speech recognition. More importantly, it is super lightweight and edge compatible with many devices, including BrainChip's Akida hardware. We see a future where this type of network will become ubiquitous on the edge."

Yet another feather ditty to the AKIDA repertoire 🎼 and not a fat lady in sight.

Are Centaurus and Mamba fighting or dancing together? 😆

[Image attached]
 
  • Like
  • Haha
  • Love
Reactions: 15 users

Frangipani

Regular



Beyond Traditional Security: Neuromorphic Chips and the Future of Cybersecurity​

Bradley Susser · 10 min read · 1 hour ago



A New Era of Cyber Warfare​

The rapid proliferation of cyber threats across the digital world exposes the vulnerabilities of traditional computing architectures, which often rely on outdated signature-based detection methods against increasingly sophisticated attacks. Polymorphic malware, which constantly mutates its code, easily evades conventional signature-based detection, sometimes encrypting files for ransom. Furthermore, distributed denial-of-service (DDoS) attacks overwhelm networks, crippling performance and causing widespread outages. Insider threats, often difficult to detect with traditional security, require analysis of user behavior. In this article, we will explore the intersection of neuromorphic computing and cybersecurity, examining how these two fields can enhance each other and reshape our approach to digital defense.
Moreover, zero-day exploits (vulnerabilities unknown to vendors, giving them zero days to patch them) pose an especially challenging threat to traditional security systems. Neuromorphic systems, by learning from evolving patterns, enable faster, real-time identification of these vulnerabilities. This offers a significant advantage especially in defending against zero-day attacks.
Neuromorphic computing, inspired by the structure and functionality of the human brain, offers a new approach to cybersecurity, providing enhanced threat detection, adaptive defense mechanisms, and energy-efficient real-time processing. These systems learn malware behavior, enabling them to identify threats even with altered signatures, and analyze network traffic patterns in real time to identify and mitigate DDoS attacks more effectively than traditional methods. Learning baseline user behavior allows these systems to identify anomalies indicative of malicious activity. Neuromorphic computing can also enhance cryptographic algorithms for improved data protection and operational efficiency.

Understanding Neuromorphic Computing​

In cybersecurity’s rapidly evolving landscape, speed is key. Neuromorphic computing excels in this area, mimicking biological neural networks to process information in parallel and learn from experience. Think of how your brain instantly reacts when you see a car suddenly brake in front of you, hear a loud noise behind you, or smell smoke. Your eyes, ears, and nose are constantly feeding information to your brain, which processes it all simultaneously and allows you to react in real-time with very little effort. Neuromorphic computing works similarly, using artificial neurons and synapses on specialized chips to make very fast decisions. These chips perform complex computations far more efficiently than conventional computers, which process information one step at a time. Consider, for example, how traditional systems using CPUs and GPUs often rely on sending data back and forth between edge devices (like smartphones, security cameras, or smart home appliances) and the cloud for processing. This constant communication requires significant energy. Even within a single device, CPUs and GPUs must repeatedly access separate memory locations, a process that becomes increasingly energy-intensive with the enormous data models used by AI and large language models. These massive models, growing at such a rapid pace, demand tremendous amounts of energy for processing. In contrast, neuromorphic computing, with its parallel processing and on-chip memory, offers a much more energy-efficient approach. This speed and efficiency make neuromorphic computing particularly valuable for enhancing cryptographic systems and ensuring data protection (e.g., quickly identifying and blocking malicious network traffic).
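
The "react only when something happens" idea is easy to see in code. Below is a toy leaky integrate-and-fire (LIF) neuron, the textbook neuromorphic building block; this is a generic sketch rather than any particular chip's neuron model. Work is done only when an input event arrives, and silent periods cost nothing:

```python
# Toy leaky integrate-and-fire (LIF) neuron, driven by sparse input events.
# Generic illustration of event-driven processing, not a specific chip's model.
import math

def lif_events(events, leak_tau=20.0, threshold=1.0):
    """events: list of (time, weight) input spikes, sorted by time.
    Returns output spike times. Between events, the membrane potential
    just decays analytically -- no computation is spent on silent periods."""
    v, last_t, out = 0.0, 0.0, []
    for t, w in events:
        v *= math.exp(-(t - last_t) / leak_tau)  # decay since last event
        v += w                                   # integrate the event
        last_t = t
        if v >= threshold:                       # fire and reset
            out.append(t)
            v = 0.0
    return out

# Two quick events push the neuron over threshold; a lone late event does
# not, because the potential has already leaked away.
print(lif_events([(1, 0.6), (2, 0.6), (50, 0.6)]))  # -> [2]
```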

Real-World Applications and Pilot Projects​

While neuromorphic computing is still in its early stages of adoption in the cybersecurity field, several promising pilot projects and initiatives are already demonstrating the power of chips like Loihi, NorthPole, and Akida in real-world scenarios. Take for example BrainChip’s Akida. One of the most notable applications of Akida is in the field of edge security. Early partnerships with IoT device manufacturers are testing Akida’s ability to handle real-time threat detection in resource-constrained environments. These pilot projects focus on Akida’s ability to process large volumes of data at the edge, detect anomalies, and mitigate threats without requiring a constant connection to a centralized server. These projects are paving the way for more efficient, scalable, and secure IoT networks in industries like smart homes, industrial IoT, robotics and autonomous vehicles.

In the case of IBM, the company has integrated its NorthPole chip into several smart city initiatives to combat the growing challenge of securing urban infrastructures against cyberattacks. NorthPole’s energy efficiency and low latency make it ideal for monitoring critical infrastructure such as traffic systems, power grids, and public surveillance networks in real time. Pilot projects in cities like New York and Berlin are exploring the chip’s ability to detect cyber threats as they happen, enabling more proactive and automated responses to attacks.

Finally, Intel’s Loihi 2, which is being trialed in the financial services sector, is helping banks and financial institutions strengthen their cybersecurity defenses. Loihi 2 is currently being tested for fraud detection and real-time transaction monitoring by utilizing event-based, asynchronous processing. Asynchronous processing means that the chip doesn’t wait for all the information to arrive before starting to work; it processes data as soon as it becomes available, like a team working on different parts of a project at the same time instead of waiting for each person to finish their task before moving on. In a pilot program with a major financial institution, Loihi 2 has shown promise in quickly identifying fraudulent transactions by learning transaction patterns and flagging irregularities, even as financial data changes dynamically.
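
To make "processes data as soon as it becomes available" concrete, here is a toy streaming fraud flagger that scores every transaction the moment it arrives, using running statistics instead of waiting for a batch. This is purely illustrative Python and has nothing to do with Loihi 2's actual programming model (which uses Intel's Lava framework):

```python
# Toy streaming anomaly flagger: each transaction is scored on arrival
# against the history so far, then folded into running statistics
# (Welford's algorithm). Illustrates "process events as they come" only.
class StreamingFlagger:
    def __init__(self, z_cutoff=4.0):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.z_cutoff = z_cutoff

    def observe(self, amount: float) -> bool:
        """Score one transaction immediately; return True if anomalous."""
        anomalous = False
        if self.n >= 10:  # need some history before judging
            std = (self.m2 / (self.n - 1)) ** 0.5
            anomalous = std > 0 and abs(amount - self.mean) / std > self.z_cutoff
        # Fold the new observation into the running mean/variance.
        self.n += 1
        delta = amount - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (amount - self.mean)
        return anomalous

flagger = StreamingFlagger()
for amt in [20, 25, 19, 22, 24, 21, 23, 20, 22, 21, 23, 950]:
    if flagger.observe(amt):
        print("flag:", amt)  # -> flag: 950
```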

These early applications demonstrate how neuromorphic computing is not only a theoretical concept but a practical, scalable solution for improving cybersecurity systems. As these technologies mature and prove their value, we can expect to see further integration into industries that rely heavily on data security, such as finance, healthcare, and critical infrastructure.

Neuromorphic Computing and Cryptography​

Neuromorphic computing can significantly enhance both the performance and security of cryptographic systems, playing a critical role in protecting data at rest and in transit. Its ability to process information simultaneously (parallel processing) and to react instantly to specific triggers (event-driven computation) makes it ideal for speeding up cryptographic tasks and creating advanced cryptographic methods.

Neuromorphic computing, with its ability to process information in parallel and react to events as they happen, has huge potential for improving the security of encryption. When data is stored (“data at rest”), these systems can automatically update and strengthen encryption, making it much tougher for hackers to guess the codes or use brute-force attacks (trying every possible combination). Think of a brute-force attack like trying every possible sequence on a lock. Hackers use powerful computers to try and guess passwords or encryption keys. Neuromorphic systems make this much more difficult by constantly changing the encryption key, so it’s like a moving target for these adversaries. Because they can adapt, these systems can also respond instantly to new threats, even those from quantum computers.
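
Setting the neuromorphic speculation aside, the "moving target" idea for data at rest is, at bottom, key rotation, and that part can be shown with ordinary tooling today. A small sketch using the Python `cryptography` package (Fernet and MultiFernet are that package's real APIs; how often and how adaptively a neuromorphic system would trigger the rotation is the speculative part):

```python
# Key rotation for data at rest with the `cryptography` package.
from cryptography.fernet import Fernet, MultiFernet

old_key = Fernet.generate_key()
ciphertext = Fernet(old_key).encrypt(b"customer record")

# Rotate: generate a fresh key, put it FIRST so it becomes the encryption
# key, and keep the old key around so existing ciphertexts still decrypt.
new_key = Fernet.generate_key()
ring = MultiFernet([Fernet(new_key), Fernet(old_key)])

ciphertext = ring.rotate(ciphertext)  # re-encrypted under new_key
assert ring.decrypt(ciphertext) == b"customer record"
# Once every stored ciphertext has been rotated, old_key can be retired.
```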

Regarding “data in transit,” a key challenge is ensuring secure communication without introducing latency, which refers to the delay between sending and receiving information. Neuromorphic systems can accelerate encryption and decryption processes, minimizing this delay and ensuring secure, real-time data transfer. This speed is crucial for applications where timely communication is essential, such as video conferencing or online transactions. Furthermore, by handling cryptographic tasks independently and concurrently, these systems can efficiently manage high-volume data streams without straining network resources, further reducing potential bottlenecks (congestion) that could introduce latency.

Looking ahead, the rise of quantum computers poses a serious threat to current online security. These powerful machines could crack today’s codes in seconds, where classical computers would take years. This is where post-quantum cryptography (PQC) comes in. PQC involves new, super-complex math problems that even quantum computers should struggle to solve. However, these new methods are also very resource intensive, meaning they take a long time to run. Neuromorphic chips, with their speed and efficiency, can accelerate these PQC calculations, making them practical for everyday use and ensuring our data stays safe in the quantum era.

Another exciting area is homomorphic encryption. Imagine being able to perform calculations on encrypted data without ever having to decrypt it. That’s homomorphic encryption. It’s like having a secure box where you can manipulate the contents without ever opening it. This is great for privacy, but it’s also incredibly complex and slow. Neuromorphic computing, because it’s good at handling complex calculations, could make homomorphic encryption much faster and more practical, opening up new possibilities for secure and private computing.
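
"Calculating on data without opening the box" can be demonstrated at toy scale. The sketch below implements a miniature Paillier cryptosystem, the classic additively homomorphic scheme: multiplying two ciphertexts yields an encryption of the sum of the plaintexts. Demo-sized keys, no padding, absolutely not secure; it exists only to show the homomorphic property (Paillier is a standard scheme, not anything specific to neuromorphic hardware):

```python
# Toy Paillier cryptosystem (additively homomorphic). Insecure demo-sized
# keys -- for illustrating "compute on encrypted data" only. Python 3.9+.
import math, secrets

p, q = 999983, 1000003              # toy primes; real keys are ~2048-bit
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
# With generator g = n + 1, decryption needs mu = lam^-1 mod n.
mu = pow((pow(n + 1, lam, n2) - 1) // n, -1, n)

def encrypt(m: int) -> int:
    r = secrets.randbelow(n - 1) + 1
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n * mu) % n

c1, c2 = encrypt(41), encrypt(1)
# Multiplying ciphertexts adds the underlying plaintexts -- neither
# operand is ever decrypted:
print(decrypt((c1 * c2) % n2))      # -> 42
```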

Challenges and Hurdles in Neuromorphic Computing for Cybersecurity​

While the potential of neuromorphic computing in cybersecurity is clear, several key hurdles must be addressed before widespread adoption. One primary challenge is the lack of mature software frameworks. Unlike traditional computing, neuromorphic systems require specialized programming models not widely available, necessitating the development of new tools, compilers (which translate high-level programming languages into simpler machine code that computers can understand), and debugging environments (which help programmers find and fix errors in their code). This requires significant time and expertise, slowing integration. Furthermore, specialized expertise is crucial for designing, programming, and maintaining these systems. Neuromorphic computing blends hardware, neuroscience, and AI, demanding multidisciplinary understanding. This specialized knowledge limits the pool of qualified professionals, thus impeding growth and limiting widespread adoption.
Neuromorphic systems are also vulnerable to adversarial attacks. Their adaptive nature could be exploited, with nefarious individuals developing attacks to manipulate the learning process. Several specific attack types pose concerns. Backdoor attacks incorporate subtle triggers into the system’s learning, causing misclassification or missed threats. Evasion attacks use artificially generated data to confuse the system’s decision-making, causing it to miss malicious activity. Model poisoning manipulates training data to impair learning, potentially allowing attacks to go undetected. Catastrophic forgetting attacks cause the system to forget critical patterns, reducing effectiveness. Physical layer attacks exploit neuromorphic hardware vulnerabilities using side-channel attacks (attacks that exploit unintended information leaks from a system). Defending against these sophisticated attacks requires specialized countermeasures like adversarial training and improved system reliability and resilience.
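
Of the attack types above, evasion is the easiest to illustrate. The canonical example is the fast gradient sign method (FGSM), which nudges an input in the direction that most increases the model's loss. The sketch below applies the generic technique to a stand-in classifier; the model and data are placeholders, and nothing here targets a real neuromorphic system:

```python
# Minimal FGSM evasion-attack sketch (fast gradient sign method),
# applied to a stand-in classifier.
import torch
import torch.nn as nn

def fgsm(model: nn.Module, x: torch.Tensor, y: torch.Tensor,
         eps: float = 0.03) -> torch.Tensor:
    """Return an adversarial copy of x that tries to make `model` misclassify."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(x_adv), y)
    loss.backward()
    # Step in the direction that INCREASES the loss, then clamp to valid range.
    return (x_adv + eps * x_adv.grad.sign()).clamp(0, 1).detach()

# Stand-in model and data, just to show the call:
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
x = torch.rand(1, 1, 28, 28)     # a fake "image"
y = torch.tensor([3])            # its true label
x_adv = fgsm(model, x, y)
print((x_adv - x).abs().max())   # perturbation bounded by eps
```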

Cost is another significant factor. While offering impressive energy efficiency and speed, neuromorphic chips like NorthPole, Loihi 2, and Akida are currently expensive to develop and deploy. High research and development (R&D) costs and specialized manufacturing processes hinder large-scale deployments. Embedding these chips into existing infrastructure could also require substantial investment and CAPEX (capital expenditures). Although exact pricing is often undisclosed, neuromorphic chips are generally more expensive than traditional processors due to specialized design, limited production, and highly advanced manufacturing techniques. For example, widely used GPUs like the Nvidia H100 have more predictable pricing compared to neuromorphic chips, which are still being refined.

Despite these cost challenges, neuromorphic computing is poised for greater affordability. As manufacturing improves and demand increases, costs are expected to decline. Economies of scale, coupled with growing industry adoption, will drive down prices in the coming years, particularly with the development of standardized software frameworks.

Future Research Directions in Neuromorphic Computing for Cybersecurity​

As neuromorphic computing continues to evolve, several exciting areas of research could vastly improve its application in cybersecurity. One of the most pressing challenges is designing powerful defenses against attacks. Researchers are exploring new methods for adversarial training, a process where systems are exposed to adversarial examples during their learning phase, to build more resilient models. Additionally, new ways to defend neuromorphic systems are being researched, like automatically updating their defenses and using special programs to spot unusual activity. These could help the systems recognize and respond to new threats more effectively. These defenses, much like established cybersecurity frameworks such as those from ISO, NIST, ISACA (and others), aim to provide a structured and comprehensive approach by mitigating risks. Implementation of such defenses, similar to the adoption of robust frameworks, is crucial in ensuring system security.

Another critical area of research is the development of standardized software frameworks for neuromorphic computing. Currently, there is a lack of mature, user-friendly tools for programming and debugging these systems. Establishing standardized platforms, much like the development of standardized cybersecurity frameworks (e.g., the NIST Cybersecurity Framework, ISO 27001, or frameworks addressing cybersecurity within broader IT governance frameworks such as ISACA’s COBIT), would make it easier for developers to design and implement neuromorphic solutions across various cybersecurity applications. These frameworks could streamline the development process, enabling faster adoption and integration into real-world systems, similar to how established frameworks facilitate consistent and effective security practices.
Beyond current use cases in threat detection and cryptography, future research could unlock new applications for neuromorphic computing in cybersecurity. For instance, neuromorphic systems could be applied to biometric authentication, improving the speed and accuracy of fingerprint, facial recognition, and voice identification systems. Researchers are also investigating how neuromorphic systems could be leveraged in autonomous defense mechanisms for IoT devices, providing real-time, localized decision-making and threat mitigation at the edge. Just as organizations rely on frameworks like those from ISO, NIST, and ISACA (including their COBIT framework for IT governance) to guide their security practices, the development of standardized tools for neuromorphic computing will be essential for its effective deployment.

As the landscape of cybersecurity changes with the rise of quantum computing, future research could explore hybrid systems that combine neuromorphic computing and quantum computing to create robust, scalable, and quantum-resistant cryptographic protocols. This fusion of technologies would not only advance the field of cybersecurity but also ensure that systems can stand the test of future technological advancements. An integrated approach to security, much like the layered security promoted by various frameworks, will be critical for protecting against evolving threats. Exploring these research directions will allow the cybersecurity arena to benefit from more powerful, adaptive, and energy-efficient security systems capable of countering future cybersecurity threats, much like how adherence to established cybersecurity frameworks strengthens an organization’s overall security position.

Conclusion​

Neuromorphic computing holds immense promise for cybersecurity, providing real-time, adaptive, and energy-efficient defenses against increasingly sophisticated attacks. Systems like NorthPole, Loihi 2, and Akida illustrate this potential. Through massive parallelism, low-latency processing, and the ability to incorporate memory and compute on-chip (placing processing capabilities directly on the chip with memory), these systems enable faster, more effective detection of anomalies, DDoS attacks, and malware, ensuring that cybersecurity frameworks are ready to combat tomorrow’s challenges.
Neuromorphic computing, with its impressive speed and energy efficiency, is poised to secure digital infrastructure and advance future technologies. As these technologies mature, they will undoubtedly play a pivotal role in shaping the cybersecurity domain of tomorrow.



 
  • Like
  • Fire
  • Wow
Reactions: 34 users

manny100

Regular
I think different markets will start to develop within the 'Edge' industry. Some chips will be more suited to some tasks than others.
It appears that OSS and AKIDA have different strengths and will be suited to different use cases.
 
  • Like
Reactions: 8 users

Frangipani

Regular

[Image attached]
 
  • Like
  • Fire
Reactions: 21 users

TheDrooben

Pretty Pretty Pretty Pretty Good
  • Like
  • Fire
  • Love
Reactions: 72 users

RobjHunt

Regular
Hi RobjHunt,

you unknowingly addressed a problem here that keeps surfacing time and again when translating the German word sogenannt into English, as the literal translation so-called has a negative connotation in English, indicating that the speaker/writer thinks a word used to describe something or someone is actually wrong.

However, this is not necessarily the case in German. And certainly not in this context. While the adjective sogenannt can be used sarcastically, or as a way to distance yourself from something others say that you don’t agree with, it does not convey any pejorative connotation in general, especially not in a scientific-technical context such as this MB LinkedIn post. It is commonly used in texts aimed at a broad audience to flag specialist terms that have a low frequency in everyday language, signalling to readers in a perfectly neutral way that the term they may not have come across before is the accepted scientific or technical name for the thing.

So in the context of the recent MB post, the word sogenannt in das sogenannte “Event-Based-Processing” simply means “this is how the (or we) experts refer to it”, and so-called - the literal translation into English - does not imply that the author thinks something is fishy about the term “event-based processing”.
Ok then ;)
 
  • Haha
Reactions: 3 users

7für7

Top 20
I hope for a nice price-sensitive announcement this week because it’s just the right time to take advantage of all the current attention on AI, low cost, and low power. If not, it might be too late again—maybe it already is, I don’t know. It’s kind of a pity because BrainChip has so much potential. Wishing everyone a great start to the week! Fingers crossed that something gets released!

[Image attached]
 
  • Like
Reactions: 16 users
I hope for a nice price-sensitive announcement this week because it’s just the right time to take advantage of all the current attention on AI, low cost, and low power. If not, it might be too late again—maybe it already is, I don’t know. It’s kind of a pity because BrainChip has so much potential. Wishing everyone a great start to the week! Fingers crossed that something gets released!

[GIF reaction attached]
 
  • Haha
  • Like
Reactions: 6 users

TECH

Regular
Intel is on a really slippery slide... more bad news. Maybe Mike Davies and the neuromorphic team are financed by a different executive team, because from where I stand, Intel has all the signs of being a takeover target for another entity who can keep up with the times. Pat got the arse, and the new CEO is under pressure already!

Here's just one recent article of numerous negative ones... being a partner of Intel Foundries sounds impressive, but check out their 13 billion dollar loss.

 
  • Wow
  • Like
  • Sad
Reactions: 12 users
Looks very impressive.
Kinda like what I hope for with BrainChip.
Couldn't find much in the way of specifications though, and from the pictures it looks like the latest iteration of current tech.
Nothing wrong with having shares in up-and-coming current solutions along with what may eventually replace a lot of it.
Still need to live whilst waiting for the commercial uptake of the next generation.
Hopefully we have backed the right horse here with neuromorphic design.
It's taking longer to get traction than I assumed it would.
I hope we haven't missed the boat and have got enough of a toe in at the doors that matter.
In reference to Sean in this latest podcast, he was fairly confident in the eyes, ears and wrist applications. I think once we get a deal signed, which I am betting will 💯% happen before the end of April, we will all become less stressed over BRN.
 
  • Like
  • Fire
  • Love
Reactions: 30 users

IloveLamp

Top 20
  • Like
  • Fire
Reactions: 25 users