BRN Discussion Ongoing

mcm

Regular
Nandan likes this........why.........

View attachment 46881
Indeed ... why?! I bought more because of his like ... and because I can't help but feel something big is about to be announced. 😎
 
  • Like
  • Fire
  • Wow
Reactions: 24 users

buena suerte :-)

BOB Bank of Brainchip
Seems that we were ⬆️ today, so I got you a green one :) 🤪

Larger version coming Soooooooooonn 😍

 
  • Like
  • Haha
Reactions: 11 users

Diogenese

Top 20
When it comes to SoCs, I think I read somewhere that Google is happy to get take-away rather than home-cooked.
Apropos of nothing in particular, some interesting IP statistics:

Google's patents which utilize NNs - about 200;
Google's patents which cover solid state NN architecture - about zilch.
 
  • Like
  • Fire
  • Love
Reactions: 23 users

Boab

I wish I could paint like Vincent
  • Haha
  • Like
Reactions: 3 users

buena suerte :-)

BOB Bank of Brainchip
Last edited:
  • Haha
  • Like
Reactions: 7 users

Neuromorphia

fact collector
  • Like
  • Love
  • Fire
Reactions: 81 users
  • Haha
Reactions: 1 user

ODAAT

one day at a time
Hi Gies,

That's not correct.

Peter is named as the inventor on most of the patents, but the company is the assignee.
Hi Dio,

There was some discussion earlier this month (posts #66,770 and #66,790) where it was said that PVDM owns the foundation patent (circa 2008), and that this ownership runs until 2028. The posts said the foundation patent is owned by Peter personally, not the company. I don't know whether this is true myself, but I remember thinking it was sensational, as it's another reason our IP and our lead over competitors are maintained.
 
  • Like
  • Fire
Reactions: 6 users

Frangipani

Regular
Meet global futurist Dr Bruce McCabe, who is extremely passionate about neuromorphic computing! 🚀



I listened to his podcast with Dr Alexandre Marcireau from WSU’s ICNS (International Centre for Neuromorphic Systems) a couple of weeks ago, but can’t seem to find it posted here, yet, via the search function. No mention of Brainchip, but worthwhile listening to nevertheless.

A link to the podcast transcript is also provided below.

One thing that confuses me, though, in the article and podcast below, as well as in other publications, is how the term “analog” is being used here (e.g. “The Future of AI is analog”).

Is it correct to say that in this context it doesn’t refer specifically to the analog vs digital logic circuitry design (such as Akida being fully digital vs, e.g., Mythic’s analog compute architecture), but rather to the general concept of the extremely power-efficient way our brain processes information (neurons working asynchronously and in parallel, etc.), which differs fundamentally from the way a digital computer operates on data expressed in binary code?

So is analog here essentially just being used as a synonym for neuromorphic?
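
If it helps anyone picture what I mean, here is a toy sketch (purely illustrative, made-up numbers, not Akida's or anyone else's actual code) of the difference between clocked dense computation and event-driven, spike-based computation:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 1000, 100
weights = rng.normal(0.0, 0.1, size=(n_in, n_out))

# Conventional clocked style: multiply the whole input vector every time step.
dense_input = rng.random(n_in)
dense_out = dense_input @ weights            # n_in * n_out multiply-accumulates, every step

# Event-driven style: only the few inputs that actually spiked contribute.
spike_events = [3, 417, 902]                 # indices of inputs that fired this step
membrane = np.zeros(n_out)                   # integrate-and-fire state
leak, threshold = 0.9, 1.0

membrane *= leak                             # passive decay
for idx in spike_events:                     # work scales with the number of spikes,
    membrane += weights[idx]                 # not with the size of the layer
fired = membrane >= threshold                # which output neurons spike
membrane[fired] = 0.0                        # reset the neurons that fired

print(int(fired.sum()), "output neurons fired")
```

Whether the hardware underneath those events is digital (as in Akida) or truly analog then seems like a separate, lower-level design question.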




NEUROMORPHIC COMPUTING AND THE FUTURE OF A.I.​


BIO-INSPIRED COMPUTER CHIPS WITH DR ALEX MARCIREAU​


[Image: neuromorphic camera]



I’m calling neuromorphic computing the most important computer engineering research in the world, now and through the next 20 years. That’s right, more important than quantum computing (you heard it here first!). Why? Because everything we want to do in the future of AI, everywhere we want to go long-term, is predicated on transitioning to more fit-for-purpose computer architectures, and the most fit-for-purpose architectures are most certainly those inspired by nature.

MEETING DR ALEXANDRE MARCIREAU

I interviewed the irrepressible Dr Alexandre Marcireau at the International Centre for Neuromorphic Systems (ICNS) at University of Western Sydney. Alex is softly-spoken, laughs easily, and as you would expect is extremely passionate about his field. He generously took me on a tour of his lab to check out his prototypes, including the neuromorphic cameras (see right-hand image above, and the cover image for this article) that are currently circulating in the International Space Station. Afterwards he sat down to answer questions and share his views on the future. He’s very much tuned-in to both the immediate applications and the long-term planetary-scale benefits his technology has to offer. I know you’ll enjoy listening to him!




CHECK OUT THE PODCAST TRANSCRIPT

THE FUTURE OF AI IS ANALOG (YES, REALLY!)​

Alex’s long-term dream is analog AI computing from end to end. Biology is messy, every organism is different, but it WORKS SO WELL! If we want to truly emulate the efficiencies of nature in computers that sense and learn then we MUST go analog not digital. We are slicing off and solving one sensory processing function at a time — a bio-inspired camera here, a bio-inspired microphone there — but the long-term dream is end to end. Of course we’ll still be using fast number-crunching and general purpose computing chips for everything else, so the future of computing more broadly will always be a mix of digital and analog, classical and neuromorphic.

And when it comes to the eye-watering energy demands of AI, the comparative advantages with respect to classical chips for AI-related functions are immense. If things keep going as they are, the ruinous energy demands of crypto-mining will be as nothing compared to the future energy demands from exponential AI. Conclusion: long-term we will HAVE TO transition AI workloads off traditional architectures.

ENDLESS REAL-WORLD APPLICATIONS​

I loved Alex’s discussion of the different opportunities at the ‘edge’ and the ‘centre’ of AI, and the lab’s real-world applications in deploying high-efficiency low-energy event-based cameras to detect lightning strikes and satellites from the International Space Station, and to track koalas and insects in forests down here on earth. We also had fun talking about using AI to decipher animal languages, such as they are doing at projectceti.org (decoding the communications of sperm whales), and neuromorphic applications to make drones vastly more capable and to enable the bi-directional brain interfaces and neuro-prosthetics I’ve been looking at lately in the future of medicine. So many good things to be transformed.

BIOLOGY + COMPUTING = MAGIC​

The field crosses many boundaries. Biologists, mathematicians, hardware engineers, physicists, programmers are all working together to create a new industry – what could be more exciting than that? And what about the big gaps in our understanding of various neural systems in nature, and how every improvement in our biological knowledge (such as those auditory, visual, olfactory and learning connectomes that we keep extending for fruit flies and mice) yields new opportunities? I loved Alex’s honesty when he said that the computer scientists were benefiting immensely from the biologists, but perhaps not yet giving back nearly as much. When I see all the ways AI is being used to unravel the biology of animals and plants, I don’t think it will be one-way traffic for long!

HARDWARE PLUS WETWARE?​

As a side note, I enjoyed hearing Alex’s comments about the use of living cells, aka ‘wetware.’ I can’t remember whether this made it onto the recorded interview or not, but when I asked about growing live neurons as a way to resolve some of the challenges of constructing massively parallel data connectors between all those pixels and all those ‘transistors,’ Alex confirmed that this is something scientists have indeed tried, although their experiments have been hampered because the computer quickly dies … literally! When I link these experiments to the work I’m seeing with brain organoids in regenerative medicine, it really opens up my thinking about the long-term (>30 years?) possibilities. Fascinating!

HUGE OPPORTUNITY SPACE​

A big takeaway from this interview is how much headroom there is. It’s like looking at the nascent computer chip industry circa 1950. We’ve only started knocking off the little opportunities. As fast as we explore, we identify new ones. The opportunity space is HUGE. If you are investing in computer engineering, or studying it, or building AI systems (who isn’t?) or you happen to manufacture computer chips, take note!

The software side is a particularly innovative space. How best to receive all that parallel data? How best to process it? How best to navigate the analog/digital interfaces? How can we take full advantage of super-fast and super-local AI decision-making ‘at the edge’? With so many different possibilities and no baked standards (because, hey, it’s still way too early for those) this field is a boon for creative and talented programmers.

FUNDAMENTAL TO THE FUTURE OF AI​

When I say more important than quantum, I mean it. Don’t get me wrong, quantum is big. It matters. It tackles big problem spaces, especially in molecular, atomic and sub-atomic simulations that will give us access to new drugs, enzymes and materials, and in complex system optimisation problems. But neuromorphic engineering has the potential to boost the capabilities of EVERY aspect of the biggest technological force of change in our time, artificial intelligence, and further, transitioning to bio-inspired neuromorphic hardware is fundamental to the long-term future of AI if we want it to stay on its current exponential adoption curve without smashing into an energy ceiling.
 
Last edited:
  • Like
  • Love
  • Thinking
Reactions: 14 users

Diogenese

Top 20
Hi Dio,

There was some discussion earlier this month (posts #66,770 and #66,790) where it was said that PVDM owns the foundation patent (circa 2008), and that this ownership runs until 2028. The posts said the foundation patent is owned by Peter personally, not the company. I don't know whether this is true myself, but I remember thinking it was sensational, as it's another reason our IP and our lead over competitors are maintained.
Hi Odatt,

That was true until 2015 when it was assigned to the company:

https://worldwide.espacenet.com/patent/search/family/042038652/publication/US8250011B2?q=us8250011




US8250011B2 Autonomous learning dynamic artificial neural computing device and brain inspired system
 
  • Like
  • Fire
  • Love
Reactions: 25 users

Perhaps

Regular
Hi Odatt,

That was true until 2015 when it was assigned to the company:

https://worldwide.espacenet.com/patent/search/family/042038652/publication/US8250011B2?q=us8250011

View attachment 46903


US8250011B2 Autonomous learning dynamic artificial neural computing device and brain inspired system
Not to forget, US patents only provide protection in the US. There is still a lot to do with all those pending patents before there is real IP protection elsewhere, especially in the UK, Korea, Taiwan and Europe. Chinese patents won't be of much use anyway.
The whole patent situation forces BrainChip into defensive tactics.
The existing patent portfolio still needs further updates, and those are only at an early stage so far. Not the fault of BrainChip, just bureaucracy.
 
  • Like
  • Fire
  • Thinking
Reactions: 7 users

equanimous

Norse clairvoyant shapeshifter goddess
We are severely underrating BRN's partnership with TCS.

[Three attached screenshots]
 
  • Like
  • Fire
  • Love
Reactions: 51 users

Perhaps

Regular
  • Like
Reactions: 7 users

equanimous

Norse clairvoyant shapeshifter goddess
Last edited:
  • Like
  • Love
  • Fire
Reactions: 16 users

Perhaps

Regular
  • Like
  • Thinking
Reactions: 3 users

equanimous

Norse clairvoyant shapeshifter goddess
Interesting going back through some patents filed by other companies working on SNNs.

Spiking neural network with reduced memory access and reduced in-network bandwidth consumption​

Mar 7, 2016 - Samsung Electronics
A spiking neural network having a plurality of layers partitioned into a plurality of frustums using a first partitioning may be implemented, where each frustum includes one tile of each partitioned layer of the spiking neural network. A first tile of a first layer of the spiking neural network may be read. Using a processor, a first tile of a second layer of the spiking neural network may be generated using the first tile of the first layer while storing intermediate data within an internal memory of the processor. The first tile of the first layer and the first tile of the second layer belong to a same frustum.
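
As I read that abstract (my interpretation only, and a big simplification: plain activations stand in for spikes), the "frustum" idea is to push each tile through several layers back-to-back so the intermediate tile never has to leave fast local memory. A hypothetical sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
H, W, C = 128, 128, 16
TILE = 32

feature_map = rng.random((H, W, C))
w1 = rng.normal(0.0, 0.1, size=(C, C))   # 1x1 (spatially local) layer-1 weights
w2 = rng.normal(0.0, 0.1, size=(C, C))   # 1x1 layer-2 weights

output = np.empty_like(feature_map)
for y in range(0, H, TILE):
    for x in range(0, W, TILE):
        tile_in = feature_map[y:y + TILE, x:x + TILE]     # read one input tile
        tile_mid = np.maximum(tile_in @ w1, 0.0)          # layer-1 tile stays "on chip"
        output[y:y + TILE, x:x + TILE] = np.maximum(tile_mid @ w2, 0.0)  # layer-2 tile written out
```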

SPIKING NEURAL NETWORK SYSTEM, LEARNING PROCESSING DEVICE, LEARNING METHOD, AND RECORDING MEDIUM​

May 18, 2020 - NEC CORPORATION
A spiking neural network system includes: a time-based spiking neural network; and a learning processing unit that causes learning of the spiking neural network to be performed by supervised learning using a cost function, the cost function using a regularization term relating to a firing time of a neuron in the spiking neural network.
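
If I understand that one correctly, it's a supervised loss on spike times plus a regularisation term on when neurons fire. My own guess at the shape of it (hypothetical, not NEC's actual formulation):

```python
import numpy as np

def snn_cost(predicted_times, target_times, lam=0.01):
    """Supervised cost on first-spike times plus a regulariser that
    discourages unnecessarily late firing."""
    supervised = np.mean((predicted_times - target_times) ** 2)   # fit the labels
    firing_time_reg = lam * np.mean(predicted_times)              # penalise late spikes
    return supervised + firing_time_reg

print(snn_cost(np.array([1.2, 3.4]), np.array([1.0, 3.0])))
```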

Odor discrimination using binary spiking neural network


Patent number: H2215
Abstract: An odor discrimination method and device for an electronic nose system including olfactory pattern classification based on a binary spiking neural network with the capability to handle many sensor inputs in a noise environment while recognizing a large number of potential odors. The spiking neural networks process a large number of inputs arriving from a chemical sensor array and implemented with efficient use of chip surface area.
Type: Grant
Filed: March 29, 2004
Date of Patent: April 1, 2008
Assignee: The United States of America as represented by the Secretary of the Air Force
Inventors:
Jacob Allen, Robert L. Ewing, Hoda S. Abdel-Aty-Zohdy

Continuous time spiking neural network event-based simulation that schedules co-pending events using an indexable list of nodes

Patent number: 9015096
Abstract: Certain aspects of the present disclosure provide methods and apparatus for a continuous-time neural network event-based simulation that includes a multi-dimensional multi-schedule architecture with ordered and unordered schedules and accelerators to provide for faster event sorting; and a formulation of modeling event operations as anticipating (the future) and advancing (update/jump ahead/catch up) rules or methods to provide a continuous-time neural network model. In this manner, the advantages include faster simulation of spiking neural networks (order(s) of magnitude); and a method for describing and modeling continuous time neurons, synapses, and general neural network behaviors.
Type: Grant
Filed: May 30, 2012
Date of Patent: April 21, 2015
Assignee: QUALCOMM Incorporated
Inventor: Jason Frank Hunzinger
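
The general technique named in the title, as I understand it, is event-driven simulation: keep pending spike events in an ordered schedule, jump the clock straight to the next event, and catch each neuron's state up lazily when an event reaches it. A toy illustration (my own, not QUALCOMM's code):

```python
import heapq
import math

schedule = [(0.0, 1.0, 0)]                  # (time, weight delivered, target neuron id)
heapq.heapify(schedule)                     # ordered schedule of pending events

potential = {0: 0.0, 1: 0.0}                # membrane potentials
last_time = {0: 0.0, 1: 0.0}
synapses = {0: [(1, 0.8)]}                  # neuron 0 -> neuron 1, weight 0.8
TAU, THRESHOLD, DELAY = 10.0, 0.5, 1.0

while schedule:
    t, w, nid = heapq.heappop(schedule)
    # "advance" rule: catch the neuron up to time t with an exponential leak
    potential[nid] *= math.exp(-(t - last_time[nid]) / TAU)
    last_time[nid] = t
    potential[nid] += w
    if potential[nid] >= THRESHOLD:         # fire and "anticipate" downstream events
        potential[nid] = 0.0
        for post, weight in synapses.get(nid, []):
            heapq.heappush(schedule, (t + DELAY, weight, post))
        print(f"neuron {nid} fired at t={t}")
```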

SPIKING NEURAL NETWORK DEVICE AND LEARNING METHOD OF SPIKING NEURAL NETWORK DEVICE

Publication number: 20210056383
Abstract: A spiking neural network device according to an embodiment includes a synaptic element, a neuron circuit, a synaptic potentiator, and a synaptic depressor. The synaptic element has a variable weight. The neuron circuit inputs a spike voltage having a magnitude adjusted in accordance with the weight of the synaptic element via the synaptic element, and fires when a predetermined condition is satisfied. The synaptic potentiator performs a potentiating operation for potentiating the weight of the synaptic element depending on input timing of the spike voltage and firing timing of the neuron circuit. The synaptic depressor performs a depression operation for depressing the weight of the synaptic element in accordance with a schedule independent from the input timing of the spike voltage and the firing timing of the neuron circuit.
Type: Application
Filed: February 27, 2020
Publication date: February 25, 2021
Applicant: KABUSHIKI KAISHA TOSHIBA
Inventors:
Yoshifumi NISHI, Kumiko NOMURA, Radu BERDAN, Takao MARUKAME
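
What strikes me here is the split: potentiation depends on spike timing, while depression runs on its own fixed schedule and ignores timing entirely. A rough software sketch of that split (hypothetical, the real thing is of course a circuit, not Python):

```python
def potentiate(weight, t_input_spike, t_fire, window=5.0, step=0.05, w_max=1.0):
    """Strengthen the synapse if the input spike and the firing are close in time."""
    if abs(t_fire - t_input_spike) <= window:
        weight = min(w_max, weight + step)
    return weight

def scheduled_depress(weight, step=0.01, w_min=0.0):
    """Weaken the synapse on a fixed schedule, independent of any spikes."""
    return max(w_min, weight - step)

w = 0.5
w = potentiate(w, t_input_spike=10.0, t_fire=12.0)   # close in time -> potentiate
for _ in range(3):                                    # periodic, timing-independent decay
    w = scheduled_depress(w)
print(round(w, 3))
```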

METHOD AND SYSTEM FOR OPTIMIZED SPIKE ENCODING FOR SPIKING NEURAL NETWORKS

Publication number: 20220222522
Abstract: This disclosure generally relates to optimized spike encoding for spiking neural networks (SNNs). The SNN processes data in spike train format, whereas the real world measurements/input signals are in analog (continuous or discrete) signal format; therefore, it is necessary to convert the input signal to a spike train format before feeding the input signal to the SNNs. One of the challenges during conversion of the input signal to the spike train format is to ensure retention of maximum information between the input signal to the spike train format. The disclosure reveals an optimized encoding method to convert the input signal to optimized spike train for spiking neural networks. The disclosed optimized encoding approach enables maximizing mutual information between the input signal and optimized spike train by introducing an optimal Gaussian noise that augments the entire input signal data.
Type: Application
Filed: March 1, 2021
Publication date: July 14, 2022
Applicant: Tata Consultancy Services Limited
Inventors:
DIGHANCHAL BANERJEE, Sounak DEY, Arijit MUKHERJEE, Arun GEORGE
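
Relevant to us given the TCS partnership. My crude reading (illustration only, not TCS's actual method): the analog input gets a dose of Gaussian noise before being thresholded into a spike train, and the noise level is the knob that would be optimised to preserve information:

```python
import numpy as np

def encode_to_spikes(signal, threshold=0.6, noise_sigma=0.1, seed=0):
    """Convert an analog signal into a binary spike train via noisy thresholding."""
    rng = np.random.default_rng(seed)
    noisy = signal + rng.normal(0.0, noise_sigma, size=signal.shape)
    return (noisy >= threshold).astype(np.uint8)

t = np.linspace(0, 1, 200)
analog_signal = 0.5 + 0.3 * np.sin(2 * np.pi * 3 * t)    # toy sensor reading
spike_train = encode_to_spikes(analog_signal)
print(spike_train.sum(), "spikes out of", spike_train.size, "time bins")
```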

SPIKING NEURAL NETWORK

Publication number: 20180260696
Abstract: Broadly speaking, embodiments of the present technique provide a neuron for a spiking neural network, where the neuron is formed of at least one Correlated Electron Random Access Memory (CeRAM) element or Correlated Electron Switch (CES) element.
Type: Application
Filed: March 8, 2017
Publication date: September 13, 2018
Applicant: ARM LTD
Inventors: Naveen SUDA, Vikas CHANDRA, Brian Tracy CLINE, Saurabh Pijuskumar SINHA, Shidhartha DAS
 
  • Like
  • Fire
  • Wow
Reactions: 11 users