BRN Discussion Ongoing

Diogenese

Top 20



AI Agents: The Future of AI & Sam Altman’s Vision 🚀
Recently, OpenAI's CEO, Sam Altman, sat down with MIT Technology Review to share his exciting vision of how AI tools will seamlessly integrate into our daily lives, becoming even more integral than smartphones. 🕹️
Altman envisions an AI that acts as a "super-competent colleague" that knows everything about us but doesn't feel intrusive. This AI could handle tasks instantly or work on complex ones and seek our input when needed. 🖥️✨

The Current State of AI 🌐
Altman highlighted OpenAI's existing offerings like ChatGPT, DALL-E, and Sora, which have impressed us with their capabilities but primarily assist with isolated tasks. He sees a future where AI extends beyond chat interfaces, handling real-world tasks for us. 📝🔍

The Hardware Debate 💡
When asked about new hardware, Altman mentioned that future AI might not need specialized devices, although he believes new hardware could enhance the experience. He's intrigued by AI hardware but admits it's not his area of expertise. 🕶️

Challenges in Training Data 📊
Altman acknowledged the industry's training data scarcity but remains optimistic. He believes there’s a way to train AI without needing more and more data, pointing out that humans are proof that there’s another way to develop intelligence. 🧠🔬

The Race for AGI 🏆
OpenAI's mission is to develop artificial general intelligence (AGI) that benefits humanity. Altman expects multiple versions of AGI, each excelling in different areas. He believes a certain compute threshold is essential but is open to various outcomes. 🤖✨

What's Next for GPT? 🔮
When asked about GPT-5’s release, Altman simply smiled and confirmed he knew the timeline, keeping us in suspense. 😅

Personal thought
AGI is self-learning, maybe conscious AI, and thus amounts to effectively unlimited knowledge. Let me know in the comments: how will business work in the future, when AGI is at play?
In any case I would strongly advise them to look at The Akida Neuromorphic Chip, as it offers energy efficiency, real-time processing, and adaptive learning, making it a key player in the future of AI agents envisioned by Sam Altman and OpenAI. 🚀



The Hardware Debate 💡
When asked about new hardware, Altman mentioned that future AI might not need specialized devices, although he believes new hardware could enhance the experience. He's intrigued by AI hardware but admits it's not his area of expertise. 🕶️

I understood Altman was not into hardware when he dumped a truckload of OpenAI's cash into Rain AI's chaos theory NN, a failed system based on the confusion between chaos and complexity, or on some bizarre self-assembly hypothesis.

The fact that he has not turned his attention to hardware AI suggests a bit of the Kodak NIH* syndrome to me, or maybe "once-bitten ...".






* NIH = not invented here.
 
  • Like
  • Fire
  • Thinking
Reactions: 20 users

Diogenese

Top 20
The board needs to come out at this coming AGM and give us an honest take on how Sean has been performing against his targets and forecasts, rather than say everything is confidential and under NDAs.

They could easily also mention how many companies they are working with by industry, and differentiate the involvement of the engineering teams, e.g., proof-of-concept, prototyping, commercial products. For example: "Amongst the xx car manufacturers we are working with, yy% are POC, zz% are prototyping and aa% are commercial products. Within the xx consumer electronics / appliance manufacturers we are working with, yy% are POC, zz% are prototyping and aa% are commercial products."

The markets for these industries are so large that there is not much a competitor could glean from that; for example, there are several hundred motor vehicle manufacturers worldwide.
And what happened the last time BRN mentioned a North American car maker?
 
  • Like
  • Haha
  • Fire
Reactions: 15 users

SakisFFM

Emerged
TL;DR: I think TENNs is based heavily on ABR's LMUs. ABR are a competitor, but not as big, as they don't do IP or currently use SNNs.

For those more technically minded:
Regarding TENNs, the presentation slides by Tony Lewis from the other week were pretty interesting.

The slides indicate TENNs is based on Chebyshev polynomials, which are compared to Legendre polynomials on the same page. For reference, both belong to the family of Jacobi polynomials. Chebyshev polynomials are generally thought to converge faster (which is better), but there are a few applications where Legendre or other Jacobi polynomials are better. I find this really interesting because I think TENNs was highly inspired by the work from Applied Brain Research (ABR) on their Legendre Memory Unit (LMU).
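
If you want to get a feel for the two families, here's a minimal numpy sketch (my own illustration, nothing to do with BrainChip's actual code; the Runge function is just a standard test case for polynomial approximation):

```python
import numpy as np
from numpy.polynomial import chebyshev as C, legendre as L

# Classic Runge test function on [-1, 1] (an arbitrary example, not from the slides)
f = lambda x: 1.0 / (1.0 + 25.0 * x**2)
xs = np.linspace(-1.0, 1.0, 2001)  # dense grid for measuring error

for n in (8, 16, 32):
    # Chebyshev interpolant at Chebyshev points (near-minimax in max error)
    cheb = C.Chebyshev(C.chebinterpolate(f, n))
    # Legendre series fitted at the Gauss-Legendre nodes
    nodes, _ = L.leggauss(n + 1)
    leg = L.Legendre.fit(nodes, f(nodes), n)
    err_c = np.max(np.abs(cheb(xs) - f(xs)))
    err_l = np.max(np.abs(leg(xs) - f(xs)))
    print(f"degree {n:2d}: Chebyshev max err {err_c:.1e}, Legendre max err {err_l:.1e}")
```

Both bases converge quickly for smooth functions; the practical differences show up in which recurrences and kernels you build on top of them, which is presumably what matters for TENNs.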

LMUs were first proposed in a 2019 paper by Chris Eliasmith from ABR. Many on here will recall that PVDM won an award in 2021, beating a presentation on LMUs by Eliasmith (who got second place). PVDM won because a lot of shareholders from here voted for him, which I find ironic, as the last big breakthrough from Brainchip was TENNs (which I think is heavily based on the LMU algorithm).

That said, I think this was one of the better directions Brainchip could have taken. The LMU is an RNN that overcomes many of the usual RNN limitations, and the ABR paper indicated it would work well on SNNs.
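
To make the LMU idea concrete, here's a toy numpy sketch of its fixed memory update, using the A and B matrices from the 2019 paper (my own Euler discretisation for illustration, not ABR's or Brainchip's code):

```python
import numpy as np

def lmu_matrices(q: int):
    """A and B from the 2019 LMU paper (Voelker & Eliasmith):
    theta * m'(t) = A m(t) + B u(t), where m holds q Legendre coefficients."""
    A = np.zeros((q, q))
    B = np.zeros((q, 1))
    for i in range(q):
        B[i, 0] = (2 * i + 1) * (-1) ** i
        for j in range(q):
            A[i, j] = (2 * i + 1) * (-1 if i < j else (-1) ** (i - j + 1))
    return A, B

def run_lmu(u, q=8, theta=100.0, dt=1.0):
    """Euler-discretised memory update over a 1-D input signal u (toy version)."""
    A, B = lmu_matrices(q)
    m = np.zeros((q, 1))
    states = []
    for u_t in u:
        m = m + (dt / theta) * (A @ m + B * u_t)
        states.append(m.ravel().copy())
    return np.array(states)  # shape (len(u), q)

# Example: compress 500 steps of a noisy sine into 8 coefficients
t = np.arange(500)
states = run_lmu(np.sin(0.05 * t) + 0.1 * np.random.randn(500))
print(states.shape)  # (500, 8)
```

The point is that A and B are derived analytically from Legendre polynomials and never trained, so the recurrent part stays fixed while only the surrounding weights learn; that is a big part of why the LMU sidesteps the usual RNN training problems.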

It's worth noting that ABR have produced a chip based on the LMU, but they don't do IP, so they have a smaller market. I also don't think their chip is an SNN. They have also licensed their LMU, so anyone who wants to use it must pay them. Note that by using a different algorithm, and possibly making other changes, Brainchip have bypassed this and potentially made their work patentable (you can't patent something that's already been published in a paper). Integrating TENNs with the Akida platform lets the two complement each other.

On a side note, one of the more promising architectures to replace transformers for certain tasks right now is Mamba, as it needs less training and works well with long sequences.

I initially thought Mamba might have been the next direction Brainchip would take for LLMs (after TENNs), but now I'm not so sure, as one of the slides also does a comparison with Mamba, against which TENNs holds up relatively well. Hopefully we find out more at the AGM.

Comparison research paper:

Original LMU paper:

Tony's slides (from Berlinlover):

ABR chip:
 
  • Like
  • Fire
  • Love
Reactions: 43 users

Frangipani

Regular
Here is further evidence - postdating that of cosors (02/2022) - that AKD1000 has been used for research at FZI (Forschungszentrum Informatik / Research Centre for Information Technology) in Karlsruhe, Germany.

Julius von Egloffstein is presently both an M.Sc. student at Karlsruhe Institute of Technology (KIT) and a research assistant at FZI. The technical university and the non-profit research institute FZI collaborate closely, and hence quite a few KIT students will do research for their Bachelor’s or Master’s theses at FZI, like this young gentleman did two years ago, when he worked on his Bachelor’s thesis titled Application Specific Neural Branch Prediction with Sparse Encoding (for which he got a perfect score, by the way).

And the neuromorphic hardware he used for evaluation? The AKD1000.


It is worth noting that FZI researchers are not working in an ivory tower - instead, their research is all about applied computer science and technology transfer to companies and public institutions.



FZI is currently looking for a research assistant in the field of “Neuromorphic Computing in Edge Systems”, with a focus on automated and autonomous driving. It is a two-year fixed-term contract, which may or may not be renewed.

Even those of you who don’t speak German can spot intriguing terms such as spiking neural networks, event-based computing, near-memory computing, biosignals and predictive maintenance. There is also a reference to intelligent traffic junction systems for automated and autonomous driving.

Doesn’t that sound very Akida-esque?!


https://karriere.fzi.de/Vacancies/401/Description/1 (German only)





FZI currently has a number of other job openings that involve research on SNNs - mostly aimed at students looking for a topic for either their B.Sc. or M.Sc. theses (topics range from crowd monitoring, development of innovative neuromorphic hardware, event-based radar processing, vital sign analysis during sleep and an EEG-based BCI to analyse colour vision, to event-based sensors in service robots and mobile manipulators) - but to me the job description above is the one most likely to involve Akida, given the specific terms used. Some ongoing projects at FZI involving neuromorphic technology have Intel as their partner (e.g. GreenEdge).






I also discovered an interesting workshop on “Application of Intelligent Infrastructure for Automated Driving”, scheduled for June 2 and co-organised by FZI, Bosch and Heilbronn University of Applied Sciences. It won’t take place in Germany, though, but on Jeju Island in Korea, at the fully integrated resort Jeju Shinhwa World, as part of the IEEE Intelligent Vehicles Symposium 2024 (https://ieee-iv.org/2024/ - weirdly, the location is only given in the German version of the workshop announcement).

https://www.fzi.de/en/veranstaltung...lligent-infrastructure-for-automated-driving/ (English version)



Just a gentle reminder of my previous post on FZI:

“It is worth noting that FZI researchers are not working in an ivory tower - instead, their research is all about applied computer science and technology transfer to companies and public institutions.”
 
  • Like
  • Fire
  • Love
Reactions: 20 users
And what happened the last time BRN mentioned a North American car maker?

Ford would have been POC phase. They could provide an update by means of a general update like I mentioned earlier, no need for names. I'm actually expecting a board spill given the last AGM results and the lack of visible progress in between.
 
  • Like
  • Fire
Reactions: 2 users

Diogenese

Top 20
[Quoting SakisFFM's TENNs/LMU post above]
Hi IDD,

Just ignore her - so rude!

The patent specifications give a comprehensive background of the development of AI time series implementations.


WO2023250092A1 METHOD AND SYSTEM FOR PROCESSING EVENT-BASED DATA IN EVENT-BASED SPATIOTEMPORAL NEURAL NETWORKS 20220622

Paras [0004] to [0011] set out the SotA as known to the inventors as of mid-2022.

Paras [0114] to [0127] discuss various ways of implementing the invention, including, as you say, Chebyshev polynomials [0126] as a subset of orthogonal polynomials.

Claim 1 sets out the primary inventive concept, which is much broader than Chebyshev. In fact, orthogonal polynomials only appear in subordinate claim 10.



A system to process event-based data using a neural network, the neural network comprising
a plurality of neurons associated with a corresponding portion of the event-based data received at the plurality of neurons, and
one or more connections associated with each of the plurality of neurons,
the system comprising:
a memory; and
a processor communicatively coupled to the memory,
the processor being configured to:
receive, at a neuron of the plurality of neurons, a plurality of events associated with the event-based data over the one or more connections associated with the neuron,
wherein
each of the one or more connections is associated with a first kernel and a second kernel,
and wherein
each of the plurality of events belongs to one of a first category or a second category,
determine, at the neuron, a potential by processing the plurality of events received over the one or more connections,
wherein
to process the plurality of events, the processor is configured to:
when the received plurality of events belong to the first category, select the first kernel for determining the potential,
when the received plurality of events belong to the second category, select the second kernel for determining the potential,
and
generate, at the neuron, output based on the determined potential.

The core of this TeNN invention is that the system determines whether the events are "spatio" or "temporal" and directs them to the appropriate processing kernel.
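
Reading claim 1 literally, the mechanism could be sketched like this (purely my toy illustration of the claim language; the kernel functions, category labels and threshold/reset behaviour are all invented for the example, not Brainchip's implementation):

```python
import numpy as np

def process_events(events, spatial_kernel, temporal_kernel, threshold=1.0):
    """Toy illustration of WO2023250092A1 claim 1: each connection carries two
    kernels; an event's category decides which kernel updates the potential."""
    potential = 0.0
    outputs = []
    for value, category in events:  # category labels are hypothetical
        kernel = spatial_kernel if category == "spatial" else temporal_kernel
        potential += kernel(value)
        if potential >= threshold:   # generate output based on the potential
            outputs.append(potential)
            potential = 0.0          # reset after firing (one possible choice)
    return outputs

# Hypothetical kernels: one weighting spatial events, one squashing temporal ones
spatial = lambda v: 0.5 * v
temporal = lambda v: 0.2 * np.tanh(v)
events = [(0.8, "spatial"), (1.5, "temporal"), (0.9, "spatial")]
print(process_events(events, spatial, temporal))
```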

To demonstrate that TeNN is derived from ABR's system, it would be probative to show that their inventive concept had been appropriated, bearing in mind that Legendre and Chebyshev polynomials have been around for a long time.

To me, TeNN seems to be an entirely different invention from ABR's system.

This is one of ABR's Legendre Memory Unit patents from 2019:
US11238345B2 Legendre memory units in recurrent neural networks 20190306+

Applicants APPLIED BRAIN RES INC [CA]

Inventors VOELKER AARON RUSSELL [CA]; ELIASMITH CHRISTOPHER DAVID [CA]

1. A method comprising:
defining, by a computer processor, a node response function for each node in a network, the node response function representing a state over time, wherein the state is encoded into one of binary events or real values, each node having a node input and a node output;
defining, by the computer processor, a set of connection weights with each node input;
defining, by the computer processor, a set of connection weights with each node output;
defining, by the computer processor, one or more Legendre Memory Unit (LMU) cells having a set of recurrent connections defined as a matrix that determines node connection weights based on the formula:

$$a_{ij} = (2i+1)\begin{cases} -1 & i < j \\ (-1)^{i-j+1} & i \ge j \end{cases}$$


where q is an integer determined by the user, and i and j are greater than or equal to zero; and
generating, by the computer processor, a recurrent neural network comprising the node response function for each node, the set of connection weights with each node input, the set of connection weights with each node output, and the LMU cells by training the network as a recurrent neural network by updating a plurality of its parameters or by fixing one or more parameters while updating the remaining parameters.
 
  • Like
  • Fire
  • Wow
Reactions: 21 users

MegaportX

Member
A strong day on Wall Street Friday night our time; it will be interesting to see if the slide in the BRN stock, down 50% from the recent highs, halts and reverses a little this week. There's a bullish feeling amongst the peers as AI everywhere progresses.



MegaportX
 
  • Like
  • Fire
Reactions: 9 users
When’s the latest you can vote?
 
  • Like
Reactions: 1 users
[Quoting Diogenese's patent analysis above]
Hi Dio,

Thanks, that was great. Patents can be very revealing (but only if you know how to look).

The part about the two kernels was interesting. From the slide contents, I suspect you're right about what each data type refers to. That means TENNs can just be treated as an add-on to the Akida platform and everything works with no intervention required by the user. It's pretty clever the way it's been configured, as it automatically handles switching between images and other streams like audio seamlessly, ensuring optimal efficiency.


Slide on TENNs:
Replacement for many Transformer tasks
• Language Models
• Time-series Data
• Spatiotemporal Data
 
  • Like
  • Fire
Reactions: 22 users

IloveLamp

Top 20
Well said Pradeep from TensorWave




 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 31 users


TECH

Regular
And some on this forum have the audacity to suggest that the Board and Executive Management Team are sitting on their hands, enjoying some sort of lifestyle in the Bahamas. Yesterday's news announcement just shows the respect that our team is gaining in the space field alone!

If Akida can perform in the harshness of space, which I'm sure it will excel at, doesn't that open up even more doors here on Earth? But once again, producing radiation-hardened microprocessors isn't a six-month sign-off just to appease shareholders; it's a step-by-step process, isn't it?

For those who wish to remove the Board, just remember that we currently have 3 NEDs based here in Australia, which I personally feel is extremely important. You may ask why? Well, having no directors based here in Australia, and say only in the US, just doesn't sit well with me, simple as that.

Attempting to remove our original founder, the genius in Peter, would be another mistake. Despite retiring, I'm 100% sure Peter is still very much hands-on with his continuing research and as Chair of the SAB. Try thinking of the bigger picture; too many, I believe, are just too focused on the now. Sean, as I have mentioned, is really only halfway through his business plan; if the Board thought he was leading us all down the garden path, he would have been removed already, and of that I'm also 100% sure. Did Lou just quit of his own accord? You decide the answer to that.

Have a great week ahead........Tech



AKD II just waking up!
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 70 users
[Quoting TECH's post above]
I believe that ASX-listed companies must have a minimum of 3 directors, of whom at least 2 must ordinarily reside in Australia.
 
  • Like
Reactions: 12 users

Xray1

Regular
[Quoting TECH's post above]
It was/is my view that Lou left of his own accord, for medical reasons, after sustaining significant burns to both his hands .......
 
  • Like
  • Thinking
Reactions: 9 users

HopalongPetrovski

I'm Spartacus!
It was/is my view that Lou left of his own accord, for medical reasons, after sustaining significant burns to both his hands .......
Or......maybe this is what happened to poor Lou, after the casino deals got the old kibosh......🤣🤣🤣

 
  • Haha
  • Like
  • Love
Reactions: 8 users

buena suerte :-)

BOB Bank of Brainchip
Very nice :)

 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 82 users

toasty

Regular
Am I reading this right, that this patent means new classes can be added without affecting the existing ones? And as we know, our CNN/SNN converter has the ability to "re-use" existing models. Soooooo, does that mean potential objections to "change" are reduced or eliminated? Asking for a friend...........😆
 
  • Like
  • Thinking
Reactions: 8 users

mcm

Regular
Am I reading this right, that this patent means new classes can be added without affecting the existing ones? And as we know, our CNN/SNN converter has the ability to "re-use" existing models. Soooooo, does that mean potential objections to "change" are reduced or eliminated? Asking for a friend...........😆
Hey Diogenese,

Would love to hear your explanation, in layman's terms, of what this patent means. Also, I assume it has been applied for in other countries too, and Australia is just the first cab off the rank to approve it.
 
  • Like
  • Fire
Reactions: 9 users

TECH

Regular
It was/is my view that Lou left of his own accord, for medical reasons, after sustaining significant burns to both his hands .......

Lou confided in me, prior to the market, about his burns accident, but that was only part of it; his hands were healing well. I would have to dig up the email, which I still have among the thousands and thousands :ROFLMAO:.... I speak on behalf of myself, not the company.

This morning I relistened to Sean's podcast from the other day, and with regard to Rob leaving, his answer wasn't an answer; he probably should have just abstained, in my opinion. We all know now that Rob has moved on to another challenge... good on him.
 
  • Like
Reactions: 5 users