BRN Discussion Ongoing

manny100

Regular
NVISO is a partner of Brainchip, so I guess this is relevant to this thread.
It seems NVISO is not taking any more investors' money, so we could be seeing an IPO later this year.
The CEO, Tim, has 20 million shares, so I guess it would be in his best interests if they can list while AI is the flavour of the year/upcoming years.
They did an online survey this year to gauge interest.
 
  • Like
Reactions: 4 users

Deleted member 118

Guest
From previous posts

 
  • Haha
Reactions: 1 users

Colorado23

Regular
Afternoon all,

I haven’t received a letter or any correspondence re the AGM, could anyone point me in the direction needed to gain a digital invite please? Thanks
I understand this hasn't been sent out yet, Robsmark, as I also haven't received an invitation. The ANN says shareholders will be sent a letter/email with an invite to the AGM.
 

ndefries

Regular
I understnad this hasnt been sent out yet Robsmark as I also havent received an invitation. The ANN says shareholders will be sent letter/email with invite to the AGM.
I believe it is just the voting material that was sent out a while ago. If you haven't got that, something is up. I'm not aware of any other documents or invites.
 
  • Like
Reactions: 2 users

cosors

👀
Can anyone do anything with this?



"Error in list_deployment_targets()
Question/Issue:
The function “ei.model.list_deployment_targets()” and others throw the following error:

ValidationError: 2 validation errors for DeploymentTarget
supportedEngines -> 0
value is not a valid enumeration member; permitted: 'tflite', 'tflite-eon', 'tensorrt', 'tensaiflow', 'drp-ai', 'tidl', 'akida', 'syntiant' (type=type_error.enum; enum_values=[<DeploymentTargetEngine.TFLITE: 'tflite'>, <DeploymentTargetEngine.TFLITE_EON: 'tflite-eon'>, <DeploymentTargetEngine.TENSORRT: 'tensorrt'>, <DeploymentTargetEngine.TENSAIFLOW: 'tensaiflow'>, <DeploymentTargetEngine.DRP_AI: 'drp-ai'>, <DeploymentTargetEngine.TIDL: 'tidl'>, <DeploymentTargetEngine.AKIDA: 'akida'>, <DeploymentTargetEngine.SYNTIANT: 'syntiant'>])
preferredEngine
value is not a valid enumeration member; permitted: 'tflite', 'tflite-eon', 'tensorrt', 'tensaiflow', 'drp-ai', 'tidl', 'akida', 'syntiant' (type=type_error.enum; enum_values=[<DeploymentTargetEngine.TFLITE: 'tflite'>, <DeploymentTargetEngine.TFLITE_EON: 'tflite-eon'>, <DeploymentTargetEngine.TENSORRT: 'tensorrt'>, <DeploymentTargetEngine.TENSAIFLOW: 'tensaiflow'>, <DeploymentTargetEngine.DRP_AI: 'drp-ai'>, <DeploymentTargetEngine.TIDL: 'tidl'>, <DeploymentTargetEngine.AKIDA: 'akida'>, <DeploymentTargetEngine.SYNTIANT: 'syntiant'>])"

https://forum.edgeimpulse.com/t/error-in-list-deployment-targets/7482

I only saw the two names.
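For what it's worth, that message is a pydantic-style enum validation failure: the SDK validates each deployment target's engine names against a fixed list, and any name outside that list (for example, an engine added server-side after the installed SDK release) fails validation. Here is a minimal stdlib-only sketch of the mechanism; the enum values are copied from the error message, while `parse_engine` and the unknown engine name are hypothetical illustrations, not real Edge Impulse API:

```python
from enum import Enum

# Engine names the SDK knows about, copied from the error message above.
class DeploymentTargetEngine(Enum):
    TFLITE = "tflite"
    TFLITE_EON = "tflite-eon"
    TENSORRT = "tensorrt"
    TENSAIFLOW = "tensaiflow"
    DRP_AI = "drp-ai"
    TIDL = "tidl"
    AKIDA = "akida"
    SYNTIANT = "syntiant"

def parse_engine(name: str) -> DeploymentTargetEngine:
    # Hypothetical stand-in for what the SDK's model validation does:
    # any engine name not in the enum raises an error.
    return DeploymentTargetEngine(name)

print(parse_engine("akida"))            # known engine: accepted
try:
    parse_engine("some-new-engine")     # hypothetical newer engine name
except ValueError as e:
    print("validation failed:", e)
```

If that reading is right, a newer SDK version (with an updated enum) would likely make the error go away.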
 
  • Like
  • Thinking
Reactions: 5 users

TECH

Regular
I happen to know of a certain company that may be able to provide you with the solutions to real-world problems that, up until now, humans
haven't been able to master. The only thing is, you'll need a key to the kingdom, and we have the only one.

An IP license would certainly help in the first instance.....

 
  • Like
  • Thinking
  • Haha
Reactions: 18 users

Deleted member 3351

Guest
I believe it is just the voting material that was sent out a while ago. If you haven't got that something is up. not aware of any other documents or invites.
I think you're right, @ndefries; only the voting material has been sent out.
On that voting form it did say to bring one of the forms with you for easier identification at the AGM.
So I'm of the understanding that you just arrive if attending.
 
  • Like
Reactions: 1 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
No, I am not removing it.
Just because you don't like it, doesn't mean it shouldn't be there.
That's my stock disclosure and it will be there until AGM. You can do whatever you want.
You don't dictate what's right or wrong for every shareholder, certainly not for me.

But it isn't really a stock disclosure, is it? It's more of a "how you intend to vote" disclosure, isn't it, unless I'm missing something?
 
  • Like
  • Love
  • Haha
Reactions: 23 users
Afternoon all,

I haven’t received a letter or any correspondence re the AGM, could anyone point me in the direction needed to gain a digital invite please? Thanks
1683797831848.jpeg
 
  • Love
  • Like
Reactions: 2 users

BaconLover

Founding Member
But it isn't really a stock disclosure, is it? It's more of a "how you intend to vote" disclosure, isn't it, unless I'm missing something?
I could say the same thing about everyone's stock disclosure.

For example, the resident touchmenot has a stock disclosure of "holding BRN not letting go". Is that an intent for others to do the same? If so, then is it financial advice?

A stock disclosure is just that: what I intend to do with my stock. Nothing to do with how others see it.

I'm just saying it's okay to vote NO if someone wishes to, and they don't need to be intimidated by the pack here (not directed at you) who attack anyone who has another viewpoint.
 
  • Like
  • Fire
  • Thinking
Reactions: 12 users

Deleted member 118

Guest
Any Adelaide members here going?

C4C3C9CA-B300-4563-B41D-D5A034646731.png
 
  • Like
  • Love
  • Fire
Reactions: 13 users

Slade

Top 20
I could say the same thing about everyone's stock disclosure.

For example, the resident touchmenot has a stock disclosure of "holding BRN not letting go". Is that an intent for others to do the same? If so, then is it financial advice?

A stock disclosure is just that: what I intend to do with my stock. Nothing to do with how others see it.

I'm just saying it's okay to vote NO if someone wishes to, and they don't need to be intimidated by the pack here (not directed at you) who attack anyone who has another viewpoint.
Where have all your previous posts gone? Did you delete them so that you can now play the victim? You used to be quite reasonable, but ever since FF gave you a serve you seem very bitter.
 
  • Like
  • Haha
  • Love
Reactions: 21 users

Deleted member 118

Guest
Where have all your previous posts gone? Did you delete them so that you can now play the victim? You used to be quite reasonable, but ever since FF gave you a serve you seem very bitter.
Think we might have an intruder, as I just had a few removed as well.



 
  • Haha
  • Like
Reactions: 5 users

Deleted member 118

Guest
Another warning; that's 3 in 15 minutes @zeeb0t

 
  • Like
Reactions: 1 users

zeeb0t

Administrator
Staff member
Another warning; that's 3 in 15 minutes @zeeb0t


It was two in the end, and you even acknowledged the message I sent you about one of the incorrect moderations, which I reversed. That also indicates to me that you definitely know where the private message function is, so please message me about this there.
 
  • Like
  • Haha
  • Fire
Reactions: 13 users

Frangipani

Regular
The Human Brain Project (HBP), a ten-year European Union-funded research initiative launched in 2013, describes itself on Twitter as “A global collaborative effort for neuroscience, medicine and computing to understand the brain, its diseases, and its computational capabilities”. It is one of three EU FET (Future and Emerging Technologies) Flagship Projects, partnered with more than 150 universities, research institutions and hospitals, and will conclude this September.
The Neuromorphic Computing Platform developed in the HBP provides remote access to two complementary, large-scale neuromorphic computing systems (NCS) built in custom hardware at locations in Heidelberg (the BrainScaleS system) and Manchester (the SpiNNaker system).

A couple of days ago, the Human Brain Project’s website reported on a new study on SNNs by two researchers from a Dutch HBP partner institution, published in Nature Machine Intelligence. No mention of Akida here, but it substantiates my claim about the VR/AR sector being a lucrative field for Brainchip.




MAY 8, 2023

Human Brain Project: Study presents large brain-like neural networks for AI​


In a new study in Nature Machine Intelligence, researchers Bojian Yin and Sander Bohté from the HBP partner Dutch National Research Institute for Mathematics and Computer Science (CWI) demonstrate a significant step towards artificial intelligence that can be used in local devices like smartphones and in VR-like applications, while protecting privacy. They show how brain-like neurons combined with novel learning methods enable training fast and energy-efficient spiking neural networks on a large scale. Potential applications range from wearable AI to speech recognition and Augmented Reality.
human_brain_project-_study_presents_large_brain-like_neural_networks_for_ai.png

While modern artificial neural networks are the backbone of the current AI revolution, they are only loosely inspired by networks of real, biological neurons such as our brain. The brain however is a much larger network, much more energy-efficient, and can respond ultra-fast when triggered by external events. Spiking neural networks are special types of neural networks that more closely mimic the working of biological neurons: the neurons of our nervous system communicate by exchanging electrical pulses, and they do so only sparingly.
Implemented in chips, called neuromorphic hardware, such spiking neural networks hold the promise of bringing AI programmes closer to users – on their own devices. These local solutions are good for privacy, robustness and responsiveness. Applications range from speech recognition in toys and appliances, health care monitoring and drone navigation to local surveillance.
Just like standard artificial neural networks, spiking neural networks need to be trained to perform such tasks well. However, the way in which such networks communicate poses serious challenges. "The algorithms needed for this require a lot of computer memory, allowing us to only train small network models mostly for smaller tasks. This holds back many practical AI applications so far," says Sander Bohté of CWI's Machine Learning group. In the Human Brain Project, he works on architectures and learning methods for hierarchical cognitive processing.

Mimicking the learning brain
The learning aspect of these algorithms is a big challenge, and they cannot match the learning ability of our brain. The brain can easily learn immediately from new experiences, by changing connections, or even by making new ones. The brain also needs far fewer examples to learn something and it works more energy-efficiently. "We wanted to develop something closer to the way our brain learns," says Bojian Yin.
Yin explains how this works: if you make a mistake during a driving lesson, you learn from it immediately. You correct your behaviour right away and not an hour later. "You learn, as it were, while taking in the new information. We wanted to mimic that by giving each neuron of the neural network a bit of information that is constantly updated. That way, the network learns how the information changes and doesn't have to remember all the previous information. This is the big difference from current networks, which have to work with all the previous changes. The current way of learning requires enormous computing power and thus a lot of memory and energy."
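The per-neuron "bit of information that is constantly updated" can be sketched in a few lines. This is purely an illustrative toy under my own assumptions (a single leaky neuron, a made-up input stream and target), not the FPTT algorithm from the paper:

```python
# Illustrative toy only: a single leaky neuron trained online. Instead of
# storing the whole input history and backpropagating through time, the
# neuron keeps one running "trace" summarising how past inputs influenced
# its state, and the weight is updated immediately at every timestep.

alpha = 0.9        # membrane leak factor
lr = 0.01          # learning rate
w = 0.0            # single input weight being learned
v = 0.0            # membrane potential: v_t = alpha*v_{t-1} + w*x_t
trace = 0.0        # dv/dw, maintained online: trace_t = alpha*trace_{t-1} + x_t

inputs = [1.0, 0.0, 1.0, 1.0, 0.0] * 40   # toy periodic input stream
target = 0.5                              # toy target potential

errs = []
for x in inputs:
    v = alpha * v + w * x        # forward step
    trace = alpha * trace + x    # exact sensitivity of v to w; no history kept
    err = v - target
    errs.append(err)
    w -= lr * err * trace        # learn immediately from local state

print("final weight:", round(w, 3))
```

The memory cost here is constant per neuron (one trace value) regardless of how long the input sequence is, which is the property that lets such networks scale to millions of neurons.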

Six million neurons
The new online learning algorithm makes it possible to learn directly from the data, enabling much larger spiking neural networks. Together with researchers from TU Eindhoven and research partner Holst Centre, Bohté and Yin demonstrated this in a system designed for recognising and locating objects. Yin shows a video of a busy street in Amsterdam: the underlying spiking neural network, SPYv4, has been trained in such a way that it can distinguish cyclists, pedestrians and cars and indicate exactly where they are.
"Previously, we could train neural networks with up to 10,000 neurons; now, we can do the same quite easily for networks with more than six million neurons," says Bohté. "With this, we can train highly capable spiking neural networks like our SPYv4."

Future
And where does it all lead? With such powerful AI solutions based on spiking neural networks now available, chips are being developed that can run these AI programmes at very low power. They will ultimately show up in many smart devices, like hearing aids and augmented or virtual reality glasses.

Original Publication:
Bojian Yin, Federico Corradi, and Sander M. Bohté: Accurate online training of dynamical spiking neural networks through forward propagation through time. Nature Machine Intelligence, 8 May 2023. DOI: 10.1038/s42256-023-00650-4

human_brain_project-_study_presents_large_brain-like_neural_networks_for_ai_-_2.png

The researchers
sander_bohte_.png
Credit: Dirk Gillissen
Sander Bohté works in the Human Brain Project’s research area “Adaptive networks for cognitive architectures: from advanced learning to neurorobotics and neuromorphic applications.”




bojian_yin.png

Researcher Bojian Yin
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 18 users

Deleted member 118

Guest
  • Like
Reactions: 1 users

Deadpool

hyper-efficient Ai
[Quoted @Frangipani’s Human Brain Project post in full – see above.]
Good read @Frangipani

Sander looks a bit like old Georgey boy:LOL:

1683802319717.png
george costanza comedy GIF
 
  • Haha
  • Like
Reactions: 34 users

GStocks123

Regular
  • Like
Reactions: 6 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I could say the same thing about everyone's stock disclosure.

For example, the resident touchmenot has a stock disclosure of "holding BRN not letting go". Is that an intent for others to do the same? If so, then is it financial advice?

A stock disclosure is just that: what I intend to do with my stock. Nothing to do with how others see it.

I'm just saying it's okay to vote NO if someone wishes to, and they don't need to be intimidated by the pack here (not directed at you) who attack anyone who has another viewpoint.

But it's a bit confusing too, because it could appear that you are using your Stock Disclosure banner to advocate a position on the Voice to Parliament; otherwise, why allude to voting "no" in the ref?

If I have misread this somehow, then my apologies. Just trying to understand what this is all about?

See "ref" crossed out below from your Stock Disclosure.

Screen Shot 2023-05-11 at 10.09.49 pm.png
 
Last edited:
  • Like
  • Haha
  • Love
Reactions: 8 users