BRN Discussion Ongoing

Xray1

Regular
Given the above... maybe the Co should now reconsider selling both IP and Akida 1000, 1500 and Gen 2 chips, to fill the market for all levels of customers and product requirements. I would think there could be a greater commercial benefit in catering for all potential users with differing needs and financial outlays. It would also, imo, help our BRN sales force engage a broader range of customers, take up our technology and get the word out there, especially given that we seem to be making no progress offering an IP product alone.
 
  • Thinking
  • Fire
Reactions: 3 users

wilzy123

Founding Member
Have we looked at Dryad Networks...?
Early forest fire detection. Didn't register for the whitepaper, but some familiar buzzy terms are used in the capability statements and marketing material.

View attachment 40970 View attachment 40971

We'll probably dramatically dilute discussion about BRN, and the internets will run out of hard drive space, if we're gonna start posting links where the only connection to BRN is the broad terms you've highlighted....
 
  • Like
Reactions: 3 users

robsmark

Regular
@Realinfo, to answer your question last week: yes, I did mention January 2025 again, and you are basically correct in assuming that I personally believe revenue would have started to show a repeating pattern on review of the next 6 quarters, that is, growing and heading northwards. I also believe the share price will reflect that growth coming through; that said, not north of around $1.75, taking out the stag spike or overshoot that will probably occur in all the excitement, with fence sitters and the herd all jumping aboard, giving us a market cap of around 3 billion AUD, which would be at my top end... not advice for new investors, just a believer's private views on display.

Some may see that valuation (guess) as either too light on or way over the top, but that's what makes the markets so interesting: it's full of bullshitters, crystal balls and, in real terms, winners and losers etc.

Cheers and good evening....Tech 🍷
Hey Tech… This post? What question of mine were you answering?
 
  • Haha
  • Like
Reactions: 7 users

MDhere

Regular
I got busy looking for a house in Melb today, did I miss something? A little birdie said I missed something... 🤣 But I managed to log on and saw nearly 40, so maybe I will be busy again tomorrow. Curious to know what I missed, if anyone can PM me 🤣🤣
 

keyeat

Regular
I got busy looking for a house in Melb today, did I miss something? A little birdie said I missed something... 🤣 But I managed to log on and saw nearly 40, so maybe I will be busy again tomorrow. Curious to know what I missed, if anyone can PM me 🤣🤣
SP went up and then back down .....


Season 4 Episode 13 GIF by The Simpsons
 
  • Haha
Reactions: 8 users

Straw

Guest
  • Like
Reactions: 5 users

IloveLamp

Top 20
  • Haha
  • Sad
Reactions: 14 users

IloveLamp

Top 20
This might help to satisfy the doubters of our likely association with BMW

Is it just me, or are Rob's likes beginning to make a lot more sense...?

Pay attention folks, greatness unfolding at Brainchip.
Screenshot_20230802_221522_LinkedIn.jpg
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 44 users

Learning

Learning to the Top 🕵‍♂️

Attachments

  • Screenshot_20230802_222936_LinkedIn.jpg
Last edited:
  • Like
  • Love
  • Fire
Reactions: 42 users

Pmel

Regular
Until we sign a good IP contract or show significant revenue, we aren't going anywhere; either we stay at the same level or drop even more. It is what it is. I know a few here will give me a hard time for this; some have done that in the past and sent me private messages telling me how silly I am. But truth is truth. The SP will reflect what the company achieves.
 
  • Like
  • Fire
  • Love
Reactions: 19 users

Pmel

Regular
Until we sign a good IP contract or show significant revenue, we aren't going anywhere; either we stay at the same level or drop even more. It is what it is. I know a few here will give me a hard time for this; some have done that in the past and sent me private messages telling me how silly I am. But truth is truth. The SP will reflect what the company achieves.
Happy to post the nasty comments a few people send from time to time, just for expressing your concerns. Some here are just out to put you down for that.
 
  • Like
  • Thinking
Reactions: 5 users

Krustor

Regular
Happy to post the nasty comments a few people send from time to time, just for expressing your concerns. Some here are just out to put you down for that.
Just forgot to change accounts? :unsure:
 
  • Haha
  • Like
Reactions: 24 users

robsmark

Regular
Happy to post the nasty comments a few people send from time to time, just for expressing your concerns. Some here are just out to put you down for that.
Just forgot to change accounts? :unsure:
I’m laughing but I shouldn’t be. I’m guessing he just replied to the wrong person.

I personally know Pmel and can assure you he isn’t a manipulator or a downramper. Just a regular guy watching his significant investment getting smashed.
 
  • Like
  • Sad
Reactions: 13 users

wilzy123

Founding Member
  • Haha
  • Like
Reactions: 10 users

Frangipani

Regular
395545C7-BCB8-4197-BF51-EF222DCF8E0E.jpeg


I am aware that the above article on neuromorphic computing, which Gabriel Rubio, CEO of SecuRED (a small business “specialising in innovative security and privacy technology solutions including AI”), shared on LinkedIn, has been posted here a couple of times before, but check out the comments section.

Not surprisingly, there is a post by Nick Brown promoting Brainchip 😊 - both he and @chapman89 should really think about changing their profile pictures to something along the lines of

15E7F492-F6FB-4EDE-947B-2721203E8245.jpeg

or

12A2FA95-A748-4557-B97C-A3B644EA2B48.jpeg


🤣🤣🤣

But now have a look at Robert Moore’s comment. While not referring to Brainchip specifically, his enthusiastic assessment of the disruptive nature of neuromorphic technology is yet another validation by someone with an intriguing professional background.

E6CE9F74-50F6-474C-9C51-E84DFB38AF88.jpeg

1F4AC819-434B-4BA8-BCF7-BD3D19F22402.jpeg


D6288500-A479-434C-BDCF-91A54904136D.png




According to Wikipedia, Booz Allen Hamilton is “an American government and military contractor, specializing in intelligence... The company's stated core business is to provide consulting, analysis and engineering services to public and private sector organizations and nonprofits.”

However, it should also be noted that “Booz Allen has particularly come under scrutiny for its ties to the government of Saudi Arabia and the support it provides to the Saudi armed forces. Alongside competitors McKinsey & Company and Boston Consulting Group, Booz Allen are seen as important factors in Crown Prince Mohammed bin Salman’s drive to consolidate power in the Kingdom.[89] On the military side, Booz Allen is employing dozens of retired American military personnel to train and advise the Royal Saudi Navy and provide logistics for the Saudi Army, but denies its expertise is used by Saudi Arabia in its war against Yemen. Additionally, it also entered an agreement with the Saudi government that involves the protection and cyber-security of government ministries,[90] with experts arguing that these defensive maneuvers could easily be used to target dissidents.”

This connection to Saudi Arabia reminded me of the following slide in the moonbeam Emerging Technology Assessment presentation @Rise from the ashes shared with us yesterday:

FAE66273-8B34-4995-BAF0-C995688D996F.jpeg


Intel has been open about collaborating with the Kingdom of Saudi Arabia on the US$500 billion NEOM desert megacity project, which some view as the world’s first futuristic smart city and ecological prestige project, and others as a repressive ruler’s megalomaniac fantasy and ecological disaster “being built on forcible evictions, state violence and death sentences” (https://www.dw.com/en/saudi-arabias-neom-a-prestigious-project-with-a-dark-side/a-65664704).


C8BDAE91-71AC-42B7-9A17-0AB8BDE049A9.jpeg

(And of course “Jesse was here”… 😂)

It is obviously an ethical question whether or not to do business with a government such as that of Saudi Arabia. I wonder whether Brainchip will clearly position itself.

Some (admittedly hard to digest) food for thought:
 
Last edited:
  • Like
  • Love
  • Wow
Reactions: 22 users

Frangipani

Regular

Can AI Continue To Scale?

Forbes Technology Council
Peter van der Made, Forbes Councils Member
COUNCIL POST | Membership (Fee-Based)

Aug 2, 2023, 09:30am EDT
Peter van der Made is the founder and CTO of BrainChip Ltd. BrainChip produces advanced AI processors in digital neuromorphic technologies.




Artificial intelligence is rapidly being deployed within all aspects of business and finance. Some exciting successes are putting pressure on the industry to embrace this new technology. No one wants to be left behind.

The core technologies behind AI are neural network models, deep learning algorithms and massive data sets for training. The model is constructed for a specific purpose such as object recognition, speech recognition and object tracking. A “model” describes how the neural network is constructed, how many parameters the network has and how many layers.
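
(An aside, not part of van der Made's article: the relationship between layers and parameter count is straightforward arithmetic. A minimal Python sketch follows, using an assumed small image-classifier layout purely for illustration.)

```python
# Illustrative only: a "model" here is just its layer sizes; the parameter
# count (one weight matrix plus one bias vector per layer) follows from them.
layer_sizes = [784, 256, 128, 10]   # assumed small image-classifier layout

params = sum(n_in * n_out + n_out   # weights + biases for each layer
             for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))
print(f"{len(layer_sizes) - 1} layers, {params:,} parameters")
# -> 3 layers, 235,146 parameters; GPT-4-class models run to hundreds of billions
```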

The overall accuracy of the neural network is a function of the quality and size of the training data set, the number of parameters and the training procedure. This is not an exact science. Too much training, and the model will respond well to the training set but not to real-world situations. This is “overfitting” the model. Too little training, and the model will not be able to respond to all known situations.
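
(Another aside, not from the article: the overfitting/underfitting trade-off is easy to see numerically. In the sketch below, a degree-1 polynomial underfits noisy data, while a degree-12 polynomial fits the training points better but generally does worse on held-out points; the degrees, sample sizes and noise level are my own choices for illustration.)

```python
# Toy sketch of underfitting vs. overfitting on noisy 1-D data.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    x = rng.uniform(-1, 1, n)
    y = np.sin(3 * x) + rng.normal(0, 0.1, n)   # underlying signal plus noise
    return x, y

x_train, y_train = make_data(30)
x_test, y_test = make_data(200)

for degree in (1, 3, 12):
    coeffs = np.polyfit(x_train, y_train, degree)             # fit the "model"
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse:.4f}  held-out MSE {test_mse:.4f}")
```

The low-degree fit misses the signal entirely (high error everywhere), while the high-degree fit chases the noise in the training set and typically generalises worse than the moderate-degree fit.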

No model is perfect. There is always a margin of error and the occurrence of outlier conditions for which the model has no parameters. Over the last 10 years, models have become more complex as capabilities and accuracy have increased.

The models used for large language models such as Bard and GPT-4 use hundreds of billions of parameters and need massive data sets to train on. Even the most powerful personal computers cannot handle large models that require considerable computational power and memory resources. The computing is done via the internet (the cloud) on large data center computers—a server farm.

Server farms are used in applications such as natural language processing, generating text and images, classifying video streams, and IoT process control and monitoring. Wired estimates that training a large model like GPT-4 costs $100 million, using as many as 10,000 systems with powerful A100 GPU processor arrays over 11 months. The largest known model is Google GLaM, with more than 1 trillion parameters. Models are getting larger and larger, but can these systems continue to scale?

According to SemiAnalysis chief analyst Dylan Patel (via Insider), the cost of running ChatGPT is estimated to be as high as $700,000 daily. This cost is broken down into maintenance, depreciation on the computer resources, and electricity consumption of the servers and cooling systems. In a study published jointly by Google and UC Berkeley (via Scientific American), the amount of power used by GPT-3 is 1,287 megawatt hours.

This is of great concern when multiplied by the number of server farms worldwide and the increase in AI processing. The power consumption of server farms will likely increase as more people start to access online AI. Server farms could consume more than 20% of the world’s electricity by 2025.

Server farms use large racks with powerful computers and GPUs. They contain thousands of processing cores that can be used as parallel processing units to compute the function of a neural network. The power used by a single GPU can be as high as 400 watts, and a server may use up to 32 of those GPUs. A company’s cluster of large data centers may deploy as many as 2.5 million servers. Even if only half of the servers contain GPUs, a worst-case calculation will reach 16,000 megawatt hours. That is a lot of greenhouse gases.
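
(The 16,000-megawatt-hour worst case follows directly from the numbers quoted above; the short sketch below simply redoes the arithmetic, reading the figure as instantaneous GPU draw sustained for one hour.)

```python
# Back-of-the-envelope check of the article's worst-case server-farm figure.
# Only numbers quoted in the article are used; the rest is arithmetic.
WATTS_PER_GPU = 400          # "as high as 400 watts"
GPUS_PER_SERVER = 32         # "a server may use up to 32 of those GPUs"
TOTAL_SERVERS = 2_500_000    # "as many as 2.5 million servers"
GPU_SERVER_FRACTION = 0.5    # "even if only half of the servers contain GPUs"

gpu_servers = TOTAL_SERVERS * GPU_SERVER_FRACTION
total_watts = gpu_servers * GPUS_PER_SERVER * WATTS_PER_GPU   # instantaneous draw
megawatts = total_watts / 1e6

print(f"GPU draw: {megawatts:,.0f} MW, i.e. {megawatts:,.0f} MWh per hour of operation")
# -> 16,000 MW, matching the article's 16,000 megawatt-hour worst case
```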

There are several ways to reduce the environmental impact of server farms. One part of the solution is more efficient hardware, together with the use of renewable energy. Another is to use hybrid solutions that perform much of the processing distributed at the edge in specialized, low-power but high-performance neuromorphic hardware. Neuromorphic processing takes inspiration from the energy-efficient methods of the brain.

The human brain contains approximately 86 billion neuron cells (about 80 times that of GLaM, the largest of the large language models) with an estimated 100 trillion connections (roughly 100 times that of GLaM). Each cell has a variable amount of electrochemical memory. The information stored in this biological memory can be considered equivalent to the parameters in a neural network model.

The brain model is dynamic in contrast to artificial neural networks. It creates new connections and more memory as we learn, and it prunes redundant connections when we sleep. The human brain neural network, even though larger than the largest AI model, consumes only the energy equivalent of 20 watts—less than a light bulb. The brain’s structure is vastly different from the neural network models used in today’s AI systems, notwithstanding the successes we have seen over the last few years.

Neuromorphic processing borrows from the efficient processing techniques of the brain by copying its behavior into digital circuits. While digital circuits may not be as power-efficient as analog circuits, stability, interchangeability and speed outweigh the slight power advantage. Using neuromorphic computing engines is transparent to the developer and the user because of an event-driven convolution shell.
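
(To make "event-driven" concrete, here is a toy sketch of my own, not BrainChip's implementation: a dense layer spends a multiply-accumulate on every weight regardless of input, whereas an event-driven layer only does work for inputs that actually carry an event, which is where the power saving comes from.)

```python
# Toy illustration (not BrainChip's implementation) of why event-driven
# processing saves work: operations are only spent on non-zero input events.
import numpy as np

rng = np.random.default_rng(1)

n_in, n_out = 1024, 256
weights = rng.normal(size=(n_in, n_out))

# Sparse, event-like input: only ~5% of inputs carry an event this time step.
x = np.zeros(n_in)
active = rng.choice(n_in, size=n_in // 20, replace=False)
x[active] = 1.0

# Dense approach: every weight participates, regardless of activity.
dense_out = x @ weights
dense_macs = n_in * n_out

# Event-driven approach: accumulate only the rows for active inputs.
event_out = weights[active].sum(axis=0)
event_macs = len(active) * n_out

assert np.allclose(dense_out, event_out)   # same result, far fewer operations
print(f"dense MACs: {dense_macs:,}  event-driven MACs: {event_macs:,} "
      f"({event_macs / dense_macs:.1%} of the dense work)")
```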

Neuromorphic processing can run convolutional neural networks (CNN) and can run image classification on ImageNet1000, real-time video classification, odor and taste recognition, vibration analysis, voice and speech recognition, and disease and anomaly detection. Using these functions in portable and battery-powered tools is possible because of its low power consumption.

It is possible to reduce the excessive power consumption of data centers by using distributed AI processing in fast neuromorphic computing devices, which reduces operating costs and increases the functionality and responsiveness of edge products. Neuromorphic processing can help compensate for AI’s expected negative environmental impact.

Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives. Do I qualify?




Peter van der Made is the founder and CTO of BrainChip Ltd. BrainChip produces advanced AI processors in digital neuromorphic technologies. Read Peter van der Made's full executive profile here.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 82 users

cosors

👀

[Quoting Frangipani's post of Peter van der Made's Forbes article "Can AI Continue To Scale?" in full — see above.]
I cannot decide, therefore:
❤️‍🔥
 
  • Like
  • Haha
Reactions: 9 users

Krustor

Regular
I’m laughing but I shouldn’t be. I’m guessing he just replied to the wrong person.

I personally know Pmel and can assure you he isn’t a manipulator or a downramper. Just a regular guy watching his significant investment getting smashed.
I know why I already called him "Pumuckl" a few months ago...

Nevertheless, let's just go on.
 
  • Like
Reactions: 1 user