BRN Discussion Ongoing

Amazing... and just like that, a little whale meanders through and hoovers up the small bait ball created.

Who would have thought.
What happened to the parrots, Esq? 🤔
 
  • Like
Reactions: 3 users

Esq.111

Fascinatingly Intuitive.
Afternoon DingoBorat,

They are still about, though on strike. Apparently one of BlackRock's head trading parrots went rogue, drained their currency trading account of Swiss francs, then went all in on the Chicago futures... bought some 1.2 million bushels of AAA-grade birdseed, which was delivered via the company's private jet to southern Panama somewhere, and has since vanished... Needless to say, Larry's not happy.

:whistle:.

Last known photo of Escobar (ex-head trader, BlackRock), before he went proper AWOL.

1710216831349.png

Regards ,
Esq.
 
  • Haha
  • Like
  • Fire
Reactions: 27 users

hotty4040

Regular
Oh Buddy, have you got it ALL WRONG!!!,
"that isn't doing well today?"
I don't need anybody to ask for my decision to buy or sell.
Did I sell when it went down 35% the other day. NO I DIDN'T, I brought more, DID YOU?
What a PATHETIC reply to my question,
Thought I would get a more reasonable reply from you ,Mr Rob .unt


Now, now, let's not get toooo overly exuberant. It was only an opinion, when all's said and done. Now that wasn't a very nice or conciliatory reply at all, now, was it? There's absolutely no reason for you to respond that way. Do you not think that the label "pathetic" might also apply to your posting as well? ... Very childish IMHO.

And also, you didn't "brought more", you in actual fact (BOUGHT MORE), ok........

I mean, where has the respect gone from some of us these days, I wonder at times.

Your attitude needs some examination, don't you think?


Akida Ballista comrades, >>>>> Not long now <<<<<

hotty...
 
  • Like
  • Love
  • Thinking
Reactions: 11 users

Diogenese

Top 20
All of a sudden there are more buyers than sellers
21 million sold down - 2 million bought up - that's an awful lot of sprats for a very modestly sized mackerel.
 
  • Like
  • Haha
  • Fire
Reactions: 17 users
An interesting article about how difficult it is for AI startups to recruit AI talent:


"I tried to hire a very senior researcher from Meta, and you know what they said? 'Come back to me when you have 10,000 H100 GPUs'," Srinivas said on a recent episode of the advice podcast "Invest Like The Best."

"That would cost billions and take 5 to 10 years to get from Nvidia," Srinivas said."

"The CEO added that even if smaller firms like Perplexity are finally able to get Nvidia's chips, they'll continue to fall behind because of AI's rapid speed of development.

That could make it even harder to secure AI talent in the future.

"By the time you waited and got the money and booked the cluster and got it, the guys working here will have already made the next-generation model," Srinivas said, referring to AI talent at major tech companies."
 
  • Like
  • Wow
Reactions: 10 users

skutza

Regular
Hey Skutza, great advice, except I'm not the one who needs an audience.

I did take it private, he didn't respond to me there, instead he screenshotted my private messages, and posted them on the forum.

In fact, why didn't you post this piece of advice to me privately 🤔..
LOL, I kinda thought that myself when I read it back. Kind of hypocritical, right?
 
  • Like
Reactions: 3 users

mrgds

Regular
(Quoting the Perplexity/Srinivas article on recruiting AI talent posted above.)
Sounds just like the Australian Defence system. :rolleyes:

Anyway, nice volume, beatdown, then recovery to finish GREEN TODAY (y)

Time for a run @Bravo
Screenshot (81).png




AKIDA BALLISTA
 
  • Like
  • Haha
  • Fire
Reactions: 9 users

MDhere

Regular
Your first prediction was right, @Esq.111. Sunny day, no clouds as we are -
Screenshot_20240312-154337_Google.jpg
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 12 users

IloveLamp

Top 20
🤔

1000014084.jpg
 
  • Like
  • Thinking
  • Fire
Reactions: 5 users

Adam

Regular
Very special people used that toilet. Look at the width of the toilet paper.
Duh! And it's not even ironed! Methinks the SQL DB crapped itself... Select * from Downrampers > 2
 
  • Haha
Reactions: 2 users

1710231723875.png



I received many great questions from the community in response to my recent post on neuromorphic computing, so I’ll jump right in and answer a few.

How does a more powerful processor increase energy efficiency?

#AI is already used in advanced driving assistance systems (ADAS) and infotainment, and the complex calculations are currently performed on traditional CPUs, GPUs and NPUs, which are not energy efficient. #Neuromorphiccomputing requires less energy for the same tasks. As the number of AI functions continues to increase, the greater computing efficiency of neuromorphic hardware means less energy will be required in comparison to legacy hardware. Reduced energy usage will also increase vehicle range and improve sustainability.

When can I experience neuromorphic computing?

Widespread use of neuromorphic computing will depend on many factors. The technology requires new programming and algorithms, so it will not immediately replace traditional processors. One key factor for us is that automotive-grade chips must meet extremely strict reliability requirements. However, we are already actively working to drive development and we are committed to being the first to use this technology in the automotive industry.

If you haven’t read the article yet, check it out here https://lnkd.in/epnUc5Sy. Be sure to ask more questions so we can keep the conversation going.



Neuromorphic computing? We’ve got that. 😎

Because it’s still nascent technology, I am frequently asked to describe #neuromorphic computing. It is a paradigm shift for how we perform computations in machine learning (#ML) and artificial intelligence (AI), which process massive amounts of data requiring tons of fast memory.

Currently available processor architecture separates data calculations from system memory, which is inefficient. The biological inspiration for neural networks is the human brain, where computing and memory are combined, and data processing uses neurons to communicate through electrical signals and chemical processes known as neurotransmitters.

In neuromorphic computing, those human neurons and synapses are modelled in circuits and communication is event-driven, with information coded in spikes, mimicking the processing fundamentals of the brain. Those spikes propagate through a Spiking Neural Network of artificial neurons and synapses to predict results. Information processing is measured by spike rate or spike time instead of the number of calculations. Thus, neuromorphic chips are more energy efficient and have lower latency than conventional CPUs and GPUs. That means much faster computation using considerably less power.
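For readers who would like to see the spike-coding idea in code, here is a minimal, illustrative sketch of a leaky integrate-and-fire neuron in plain Python/NumPy (it is an addition for illustration only, not part of the quoted post and not any vendor's implementation): computation happens only when an input event arrives, and the output is a set of spike times rather than a dense array of activations.

```python
# Minimal leaky integrate-and-fire (LIF) neuron -- an illustrative sketch only,
# not Mercedes-Benz's, BrainChip's or any other vendor's implementation.
import numpy as np

def lif_neuron(spike_times, weight=0.6, tau=20.0, threshold=1.0):
    """Integrate incoming spikes into a leaky membrane potential and emit an
    output spike whenever the potential crosses the threshold."""
    potential = 0.0
    last_t = 0.0
    output_spikes = []
    for t in sorted(spike_times):
        # Nothing is computed between events; the leak is applied lazily when
        # the next spike arrives -- this sparsity is where the energy saving
        # of event-driven processing comes from.
        potential *= np.exp(-(t - last_t) / tau)
        potential += weight            # each input spike nudges the potential up
        last_t = t
        if potential >= threshold:     # fire and reset
            output_spikes.append(t)
            potential = 0.0
    return output_spikes

# Information is carried by *when* spikes occur, not by how many
# multiply-accumulates were performed.
print(lif_neuron([5, 7, 9, 40, 42, 44, 46]))   # -> [7, 42, 46]
```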

However, this change in data processing also requires new software algorithms specifically designed to work with neuromorphic hardware. Existing algorithms can only partially leverage the many benefits of neural technology. Thanks to Valerij, Alexander, Christina in the Innovations & Future Technology area and the rest of our team for tackling this huge project!

𝗔𝗽𝗽𝗹𝗶𝗰𝗮𝘁𝗶𝗼𝗻𝘀

Neuromorphic computing reduces the power required for advanced AI computation, which is useful in applications where energy is limited, like electric vehicles. However, we still need automotive-grade chips with neuromorphic technology before this technology becomes common in cars.

We at Mercedes-Benz AG are currently working on novel algorithms that take advantage of neuromorphic computing to improve the energy efficiency and performance of our cars. Our primary goals are to extend vehicle range, make safety systems react faster, and increase the number of #AI functions possible. We joined the #Intel Neuromorphic Research Community back in 2020, and since then we have been continuously expanding our collaborations with other research partners and universities to ensure our software and hardware solutions continue to lead the industry.

It's an exciting time to be in the world of automotive technology. Please share any questions and comments below.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 90 users

(Quoting the Mercedes-Benz neuromorphic computing post above.)
Brilliant
 
  • Like
Reactions: 12 users

(Quoting the Mercedes-Benz neuromorphic computing post above.)
Nice SG.

Toooooo scared to mention Akida in case the SP blows up again :ROFLMAO::cry:

Wasn't sure whether to laugh or cry.

Being part of the INRC, here's a thought.

Given we are not a brand name in the wider scheme of things and Intel is consumer-facing (today's world and consumer are sadly aligned to brands), I wonder if our association with IFS is a designed pathway that could allow MB to use us under an Intel name eventually, and to complete the requisite testing and certification of the chips in due course.
 
Last edited:
  • Like
  • Fire
  • Thinking
Reactions: 44 users

Iseki

Regular
I know we are all waiting with bated breath for Mercedes-Benz to announce their third chip partner. And good for M-B when they do.
What beats me is: shouldn't we also be beating a path to Boeing? Haven't they just experienced a couple of years of catastrophic events that have destroyed much of their credibility? Events that would have been avoided by the correct interpretation of sensor data.

Who here would willingly fly on a Boeing 737-8 Max? Not many, I would think.

Surely Boeing would be open to talking about a Boeing.OS built around sensor fusion, where sensor data can be uniformly combined so that each sensor's data is contextualized within the network of other sensor data to give the all-clear or a red flag - e.g. yes, there can be vibration at take-off; no, the window should not be open at 33,000 ft, etc. This fusion is meant to be something SNN systems are good at - you can add more and more sensors to the jet's fusion network in a stable fashion.
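As a toy illustration of what that kind of context-dependent check might look like (every sensor name, threshold and rule below is invented for the example and has nothing to do with any real avionics system):

```python
# Toy sketch of context-dependent sensor fusion -- all sensor names, thresholds
# and rules here are hypothetical, invented purely to illustrate the idea above.
from dataclasses import dataclass

@dataclass
class SensorFrame:
    phase: str              # flight phase, e.g. "takeoff" or "cruise"
    altitude_ft: float
    vibration_g: float
    door_plug_seated: bool

def assess(frame: SensorFrame) -> list:
    """Return red flags; the same raw reading can be normal or alarming
    depending on the context supplied by the other sensors."""
    flags = []
    # Vibration is expected during take-off, suspicious in cruise.
    if frame.vibration_g > 0.5 and frame.phase != "takeoff":
        flags.append("unexpected vibration outside take-off")
    # A fuselage opening is never acceptable at altitude.
    if not frame.door_plug_seated and frame.altitude_ft > 10_000:
        flags.append("fuselage opening detected at altitude")
    return flags

print(assess(SensorFrame(phase="cruise", altitude_ft=33_000,
                         vibration_g=0.2, door_plug_seated=False)))
# -> ['fuselage opening detected at altitude']
```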

Surely, with the way things are going at Boeing, this would attract some interest. Surely, given the work we are doing with M-B, if it is proceeding, it would be relevant, and Boeing needs to spend up now to save their reputation and future sales.
 
  • Like
  • Fire
Reactions: 8 users

suss

Regular
(Quoting the post above about IFS, the INRC and MB.)
Sean mentioned the Mercedes NDA in that interview last week.
Exciting development ahead, it's all good!
 
  • Like
  • Love
  • Fire
Reactions: 29 users

Pepsin

Regular
  • Like
  • Thinking
  • Wow
Reactions: 19 users

Frangipani

Regular

University of Western Australia Latest to Join the BrainChip University AI Accelerator Program


Laguna Hills, Calif. – March 12, 2024 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world's first commercial producer of ultra-low power, fully digital, event-based, neuromorphic AI IP, today announced that The University of Western Australia has joined the BrainChip University AI Accelerator Program, ensuring that UWA students have the tools and resources needed to help develop leading-edge technologies that will continue to usher in an era of intelligent AI solutions.

As one of the world’s elite, research-intensive universities, UWA is recognized for resolving real-world challenges that are critical to the planet and people. The university has forged and embraced connections with community, partners, and industry to ensure its impact is far reaching, both now and into the future. By joining the BrainChip University AI Accelerator Program, UWA students studying data and computer science gain access to the neuromorphic technology needed to tackle technological challenges and devise innovative solutions to transform the way we live.


BrainChip's University AI Accelerator Program provides platforms and guidance to students at higher education institutions with AI engineering programs. Students participating in the program will have access to real-world, event-based technologies offering unparalleled performance and efficiency to advance their learning through graduation and beyond.

“Advancements in data and computer science are influencing the way that people live and interact with one another,” said Rachel Cardell-Oliver, Associate Professor and Head of the Department of Computer Science & Software Engineering at UWA. “We are constantly seeking ways in which we can give our students real-world experience to expand their knowledge and expertise to prepare them for a future in the technology industry. Partnering with a company like BrainChip to bring in the means for advancing our students' education is something we know will pay dividends to our students in the years ahead.”

BrainChip’s neural processor, Akida™ IP is an event-based technology that is inherently lower power when compared to conventional neural network accelerators. Lower power affords greater scalability and lower operational costs. BrainChip’s Akida supports incremental learning and high-speed inference in a wide variety of use cases. Among the markets that BrainChip’s Essential AI technology will impact are the next generation of smart cars, smart homes of today and tomorrow, and industrial IoT.

“The University of Western Australia has formed partnerships with industry leaders like BrainChip as part of its focus on giving students the critical edge they need upon graduation,” said Rob Telson, VP Ecosystems and Partnerships at BrainChip. “As the latest addition to our University AI Accelerator Program, UWA students have access to neuromorphic IP that they can utilize to understand how best to advance AI at the device level. This education will inherently help prepare them for jobs in the industry, advancing capabilities in developing solutions supported by incremental learning and high-speed inference across a wide range of use cases and industries.”

UWA joins current participants Arizona State University, Carnegie Mellon University, Rochester Institute of Technology, University of Oklahoma, and the University of Virginia in the accelerator program. Other institutions of higher education interested in how they can become members of BrainChip’s University AI Accelerator Program can find more details at https://brainchip.com/brainchip-university-ai-accelerator/.

About BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY)

BrainChip is the worldwide leader in Edge AI on-chip processing and learning. The company's first-to-market, fully digital, event-based AI processor, Akida™, uses neuromorphic principles to mimic the human brain, analyzing only essential sensor inputs at the point of acquisition, processing data with unparalleled efficiency, precision, and economy of energy. Akida uniquely enables Edge learning local to the chip, independent of the cloud, dramatically reducing latency while improving privacy and data security. Akida Neural processor IP, which can be integrated into SoCs on any process technology, has shown substantial benefits on today's workloads and networks, and offers a platform for developers to create, tune and run their models using standard AI workflows like TensorFlow/Keras. In enabling effective Edge compute to be universally deployable across real-world applications such as connected cars, consumer electronics, and industrial IoT, BrainChip is proving that on-chip AI, close to the sensor, is the future, for its customers' products, as well as the planet. Explore the benefits of Essential AI at www.brainchip.com.
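As an illustration of the "standard AI workflows" the release refers to, here is a small TensorFlow/Keras model of the kind a developer might start from; the architecture is invented for the example, and the subsequent mapping onto Akida IP would be done with BrainChip's MetaTF tooling, which is not shown here.

```python
# A standard TensorFlow/Keras starting point of the kind the release mentions.
# The tiny model below is only an illustration; converting it to run on Akida
# IP would be done afterwards with BrainChip's MetaTF tooling (not shown, so
# no Akida-specific API calls are assumed here).
import tensorflow as tf

def build_edge_classifier(num_classes: int = 10) -> tf.keras.Model:
    """Small convolutional classifier, e.g. for a low-power edge vision task."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(32, 32, 1)),
        tf.keras.layers.Conv2D(16, 3, strides=2, activation="relu"),
        tf.keras.layers.Conv2D(32, 3, strides=2, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

model = build_edge_classifier()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```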

Follow BrainChip on Twitter: https://www.twitter.com/BrainChip_inc
Follow BrainChip on LinkedIn: https://www.linkedin.com/company/7792006

About University of Western Australia

Established in 1911 as the State’s first university, The University of Western Australia (UWA) was also the first free university in the British Empire actively promoting equal access to tertiary education for all social classes. Today UWA is ranked in the world’s top 100 universities and number one in Western Australia. As one of Australia’s leading research-intensive universities UWA operates more than 40 intensive research centers and 22 schools, in addition to having a broad range of successful industry partnerships. The University is a member of the internationally recognized Australian Group of Eight universities and a foundation member of the Matariki Network of high-quality, research-intensive universities with a particular focus on student experience.



Media Contact:
Mark Smith
JPR Communications
818-398-1424

Investor Relations:
Tony Dawe
Director, Global Investor Relations
tdawe@brainchip.com
 
  • Like
  • Love
  • Fire
Reactions: 77 users

CHIPS

Regular
  • Like
  • Fire
  • Love
Reactions: 41 users

Sirod69

bavarian girl ;-)
I/ONX Neuromorphic
https://www.linkedin.com/company/i-onx/?miniCompanyUrn=urn:li:fs_miniCompany:96327740
Computational processes using traditional GPUs and CPUs in large data centers running complex data processing tasks significantly lack energy efficiency.

We are essentially at the fundamental limit of artificial intelligence and machine learning using traditional data processing technologies.

Enter neuromorphic computing. It has the potential to achieve high-performance computing yet consumes 1/1000th of the energy! Sound interesting?

Get in contact with one of our team members today to find out how we can transform your data center into an energy saving force by mimicking the brain's parallel processing capabilities!
1710265323253.png
 
  • Like
  • Wow
  • Thinking
Reactions: 29 users
Top Bottom