BRN Discussion Ongoing

MDhere

Regular
I wonder if Nvidia, Apple and Amazon have tiny little forums like this, bickering like little ants in a molehill. The sooner we get to $4, the sooner this silly bickering about he said this, she said that will all be over. I dare say the Americans will take over (lucky I won't hear the loud accent) (no offence to the rowdy Americans).

We have uncovered a lot in this anonymous group; now it's time for BrainChip to uncover it all, so we can say "see, he was right, she was right, I said it first, no, he said it 45 messages ago".

Lol
HAPPY SUNDAY, fellow BRNers.
It's going to be a great week, well, for me I know it will be, but I'm just a tiny ant in a molehill. (Yeah, I know I got the saying wrong, but you all know what I mean.) :)
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 34 users

Bravo

If ARM were an arm, BRN would be its biceps 💪!

AI’s craving for data is matched only by a runaway thirst for water and energy

John Naughton


The computing power for AI models requires immense – and increasing – amounts of natural resources. Legislation is required to prevent environmental crisis

Sun 3 Mar 2024 02.55 AEDT


One of the most pernicious myths about digital technology is that it is somehow weightless or immaterial. Remember all that early talk about the “paperless” office and “frictionless” transactions? And of course, while our personal electronic devices do use some electricity, compared with the washing machine or the dishwasher, it’s trivial.
Belief in this comforting story, however, might not survive an encounter with Kate Crawford’s seminal book, Atlas of AI, or the striking Anatomy of an AI System graphic she composed with Vladan Joler. And it certainly wouldn’t survive a visit to a datacentre – one of those enormous metallic sheds housing tens or even hundreds of thousands of servers humming away, consuming massive amounts of electricity and needing lots of water for their cooling systems.


On the energy front, consider Ireland, a small country with an awful lot of datacentres. Its Central Statistics Office reports that in 2022 those sheds consumed more electricity (18%) than all the rural dwellings in the country, and as much as all Ireland’s urban dwellings. And as far as water consumption is concerned, a study by Imperial College London in 2021 estimated that one medium-sized datacentre used as much water as three average-sized hospitals. Which is a useful reminder that while these industrial sheds are the material embodiment of the metaphor of “cloud computing”, there is nothing misty or fleecy about them. And if you were ever tempted to see for yourself, forget it: it’d be easier to get into Fort Knox.

OpenAI’s boss warned the next wave of AI will consume vastly more power than expected. Energy systems will struggle to cope
There are now between 9,000 and 11,000 of these datacentres in the world. Many of them are beginning to look a bit dated, because they’re old style server-farms with thousands or millions of cheap PCs storing all the data – photographs, documents, videos, audio recordings, etc – that a smartphone-enabled world generates in such casual abundance.
But that’s about to change, because the industrial feeding frenzy around AI (AKA machine learning) means that the materiality of the computing “cloud” is going to become harder to ignore. How come? Well, machine learning requires a different kind of computer processor – graphics processing units (GPUs) – which are considerably more complex (and expensive) than conventional processors. More importantly, they also run hotter, and need significantly more energy.
On the cooling front, Kate Crawford notes in an article published in Nature last week that a giant datacentre cluster serving OpenAI’s most advanced model, GPT-4, is based in the state of Iowa. “A lawsuit by local residents,” writes Crawford, “revealed that in July 2022, the month before OpenAI finished training the model, the cluster used about 6% of the district’s water. As Google and Microsoft prepared their Bard and Bing large language models, both had major spikes in water use – increases of 20% and 34%, respectively, in one year, according to the companies’ environmental reports.”
Within the tech industry, it has been widely known that AI faces an energy crisis, but it was only at the World Economic Forum in Davos in January that one of its leaders finally came clean about it. OpenAI’s boss Sam Altman warned that the next wave of generative AI systems will consume vastly more power than expected, and that energy systems will struggle to cope. “There’s no way to get there without a breakthrough,” he said.
What kind of “breakthrough”? Why, nuclear fusion, of course. In which, coincidentally, Mr Altman has a stake, having invested in Helion Energy way back in 2021. Smart lad, that Altman; never misses a trick.
As far as cooling is concerned, it looks as though runaway AI also faces a challenge. At any rate, a paper recently published on the arXiv preprint server by scientists at the University of California, Riverside, estimates that “operational water withdrawal” – water taken from surface or groundwater sources – of global AI “may reach [between] 4.2 [and] 6.6bn cubic meters in 2027, which is more than the total annual water withdrawal of … half of the United Kingdom”.
Given all that, you can see why the AI industry is, er, reluctant about coming clean on its probable energy and cooling requirements. After all, there’s a bubble on, and awkward facts can cause punctures. So it’s nice to be able to report that soon they may be obliged to open up. Over in the US, a group of senators and representatives have introduced a bill to require the federal government to assess AI’s current environmental footprint and develop a standardised system for reporting future impacts. And over in Europe, the EU’s AI Act is about to become law. Among other things, it requires “high-risk AI systems” (which include the powerful “foundation models” that power ChatGPT and similar AIs) to report their energy consumption, use of resources and other impacts throughout their lifespan.


It’d be nice if this induces some investors to think about doing proper due diligence before jumping on the AI bandwagon.

 
  • Like
  • Love
  • Fire
Reactions: 31 users

MDhere

Regular
Let us remember this picture - "Any Sensor"
Screenshot_20240303-095147_Edge.jpg
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 43 users

TopCat

Regular

[Quoted in full: John Naughton, "AI's craving for data is matched only by a runaway thirst for water and energy", The Guardian, quoted above]

I was just reading this article. Not sure how we can keep supplying the energy for this!
Screenshot_20240303_105157_Chrome.jpg
 
  • Like
  • Love
  • Fire
Reactions: 19 users

Sirod69

bavarian girl ;-)
Nvidia versus Intel

NVIDIA | AI-powered | Market Cap
AI Chip Dominance

NVIDIA's Rise to 3rd Largest Company Globally in Market Cap:

News: Nvidia pips Saudi Aramco to become 3rd largest company globally in market cap - DECODING rise of chipmaker
Link: https://lnkd.in/ggnJVF_q

Factors Behind Nvidia's Rise:
- Dominance in artificial intelligence (AI), gaming, cloud computing, and autonomous vehicles
- Strong financial performance

Implications for the Semiconductor Industry:
- Highlights the growing importance of semiconductors in the global economy

Opportunities for Nvidia:
- Well-positioned to capitalize on the continued growth of AI, cloud computing, and other technologies

Challenges for Nvidia:
- Competition from other chip makers and emerging AI companies
- Supply chain issues
- Regulatory environment

Outlook for Nvidia:
- Analysts are generally optimistic about Nvidia's future prospects
- Expected to remain a dominant player in the semiconductor industry.
 
  • Fire
  • Like
Reactions: 7 users

Boab

I wish I could paint like Vincent
  • Like
  • Love
  • Fire
Reactions: 10 users

MrNick

Regular

Attachments

  • IMG_0322.jpeg
    IMG_0322.jpeg
    38.7 KB · Views: 129
  • Haha
Reactions: 4 users

MrNick

Regular
[Quoting TopCat's post above]
We can't… and as global demand for power keeps growing, the need to urgently shift to alternative sources (or sources with negligible draw) grows exponentially. And the cost of restructuring energy consumption for each and every business could fall massively if the 'A' game becomes ubiquitous.
 
  • Like
  • Fire
  • Love
Reactions: 14 users

Damo4

Regular
Had to look up "somato". It refers to the ability to sense heat, cold, pressure etc.
Good for robotics??
And for healthcare/wearables
 
  • Like
  • Fire
Reactions: 8 users

Esq.111

Fascinatingly Intuitive.
Afternoon Chippers ,

This caught my eye, naturally nothing 🧠 🍟 related ..... though we could help, no doubt.

Might add, RED BULL would be a great company to join our partner list ..... these fiends are into every extreme sport ....... i.e. they capture the hearts and minds of the young whippersnappers, who in turn adapt to future tech faster than past generations did.

World's Fastest Camera Drone Vs F1 Car (ft. Max V…:

Regards,
Esq.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 30 users

manny100

Regular

[Quoted in full: John Naughton, "AI's craving for data is matched only by a runaway thirst for water and energy", The Guardian, quoted above]

...and that is another reason why BrainChip will succeed.
Just like us, AI needs food (energy) and water to survive.
BRN will lessen the need for power and water.
 
  • Like
  • Fire
  • Love
Reactions: 19 users
Oh wow!


"Dell, one of the world's largest server makers, has spilled the beans on Nvidia's upcoming AI GPUs, codenamed Blackwell. Apparently, these processors will consume up to 1000 Watts, a 40% increase in power over the prior-gen, requiring Dell to use its engineering ingenuity to cool these GPUs down."
That's just funny... they are increasing power requirements by 40%?
They've even codenamed it "Blackwell"..

Brings to mind an image of a heat-blackened "water" well...

Looking like they will struggle to cool the damn things, and they subconsciously know it..

Let the good times roll.
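As a sanity check, the "40% increase" and the "up to 1000 Watts" quoted above are consistent with each other if the prior-generation part is taken to be a 700 W H100 — the 700 W figure is my assumption, not something stated in the Dell report:

```python
# Back-of-envelope check of the quoted 40% generational power increase.
# Assumption (not from the thread): the prior-generation GPU is Nvidia's
# H100, whose published maximum TDP is 700 W.
prior_gen_tdp_w = 700
increase = 0.40

blackwell_estimate_w = prior_gen_tdp_w * (1 + increase)
print(blackwell_estimate_w)  # 980.0 -- consistent with "up to 1000 W"
```

So the two figures in the report hang together, give or take rounding.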
 
  • Like
  • Fire
  • Love
Reactions: 17 users

Esq.111

Fascinatingly Intuitive.
Maybe BrainChip should do an over-the-top TOPS, FULLY THROTTLED version with not one but two tungsten-filament GLOW coils (just so we can burn an extra 5 to 10 kWh).

And charge customers $100k-plus per chip so they think they are getting value for money.

Would appear this is how they swing presently.

Truly beggars belief.

No, f*€£k it, let's charge $200k per chip, then it must be really, really amazing, right?

Esq.
 
  • Haha
  • Like
  • Fire
Reactions: 29 users

IloveLamp

Top 20
  • Like
  • Love
  • Fire
Reactions: 12 users

[Quoted in full: John Naughton, "AI's craving for data is matched only by a runaway thirst for water and energy", The Guardian, quoted above]

There's a solution, to the power at least..

qF1NU8(1).gif
 
  • Haha
  • Love
Reactions: 3 users

Slade

Top 20
We are lucky to have Rob Telson working at BrainChip.

 
  • Like
  • Love
  • Fire
Reactions: 56 users
[Quoting Esq.111's Red Bull camera-drone post above]

1709442209759.gif
 
  • Like
  • Haha
Reactions: 2 users

jtardif999

Regular
[Quoting Esq.111's post above]
Common sense will eventually prevail. The economics and power savings of Akida will make it essential for AI everywhere.
 
  • Like
  • Fire
  • Love
Reactions: 16 users

TECH

Regular
[Quoting manny100's post above]

I shall never forget the Engineers Australia event I attended in Perth in 2019; I believe I was the only shareholder who
attended that night, including paying for the privilege of meeting Peter and Adam for the first time in person.

That was the night I learnt that Akida didn't require an internet connection to function, and could run at full capacity on 2 AAA batteries for
approximately 6 months (from memory).

I remember Peter delivering his presentation; that statement just blew me away. As we all know, the AI explosion is already upon us,
and worldwide demand will never be able to be contained once it's in full swing. BrainChip's technology just can't be ignored, we
all know it, and I'm hoping that within 5 years Jensen Huang will have realised he should have pounced earlier, because we really
are primed for explosive sales growth worldwide, in my biased opinion of course.
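For what it's worth, here is a rough calculation of what that recollection implies for average power draw. The AAA capacity figure is a typical alkaline-cell value I'm assuming for illustration, not anything BrainChip has published:

```python
# Implied average power draw from "full capacity on 2 AAA batteries for
# ~6 months". Assumed typical figure: one alkaline AAA holds roughly
# 1.5 V x 1.0 Ah = 1.5 Wh of energy.
cells = 2
wh_per_cell = 1.5      # assumed typical AAA capacity in watt-hours
hours = 6 * 30 * 24    # ~6 months of continuous operation

avg_power_mw = cells * wh_per_cell / hours * 1000
print(round(avg_power_mw, 2))  # 0.69 -- i.e. well under a milliwatt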

Love Akida..
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 68 users

ndefries

Regular
  • Like
  • Fire
  • Love
Reactions: 27 users