Smoothsailing
Regular
Brilliant
As I said before, when the price went DOWN the other day 35%, I BOUGHT.
I don't think you can have more faith than that, @Smoothsailing.
Oh Buddy, have you got it ALL WRONG!!!
"that isn't doing well today?"
I don't need anybody to ask for my decision to buy or sell.
Did I sell when it went down 35% the other day? NO I DIDN'T, I bought more, DID YOU?
What a PATHETIC reply to my question.
Thought I would get a more reasonable reply from you, Mr Rob .unt
21 million sold down - 2 million bought up - that's an awful lot of sprats for a very modestly sized mackerel.
All of a sudden there are more buyers than sellers.
LOL, I kinda thought that myself when I read it back. Kind of hypocrisy, right?
Hey Skutza, great advice, except I'm not the one who needs an audience.
I did take it private, he didn't respond to me there, instead he screenshotted my private messages, and posted them on the forum.
In fact, why didn't you post this piece of advice to me privately ..
Sounds just like the Australian Defense System.
An interesting article about how difficult it is for AI startups to recruit AI talent:
CEO says he tried to hire an AI researcher from Meta and was told to 'come back to me when you have 10,000 H100 GPUs'
The CEO of an AI startup said he wasn't able to hire a researcher from Meta because his company didn't have enough GPUs.
www.businessinsider.com
"I tried to hire a very senior researcher from Meta, and you know what they said? 'Come back to me when you have 10,000 H100 GPUs'," Srinivas said on a recent episode of the advice podcast "Invest Like The Best."
"That would cost billions and take 5 to 10 years to get from Nvidia," Srinivas said."
"The CEO added that even if smaller firms like Perplexity are finally able to get Nvidia's chips, they'll continue to fall behind because of AI's rapid speed of development.
That could make it even harder to secure AI talent in the future.
"By the time you waited and got the money and booked the cluster and got it, the guys working here will have already made the next-generation model," Srinivas said, referring to AI talent at major tech companies."
Duh! And it's not even ironed!
Methinks the SQL DB crapped itself.. Select * from Downrampers > 2
Very special people used that toilet. Look at the width of the toilet paper.
Brilliant
Magnus Östberg on LinkedIn: #ai #neuromorphiccomputing #leadincarsoftware | 19 comments
I received many great questions from the community in response to my recent post on neuromorphic computing, so I’ll jump right in and answer a few.
How does a more powerful processor increase energy efficiency?
#AI is already used in advanced driver assistance systems (ADAS) and infotainment, and the complex calculations are currently performed on traditional CPUs, GPUs and NPUs, which are not energy efficient. #Neuromorphiccomputing requires less energy for the same tasks. As the number of AI functions continues to increase, the greater computing efficiency of neuromorphic hardware will require less energy than legacy hardware. Reduced energy usage will also increase vehicle range and improve sustainability.
When can I experience neuromorphic computing?
Widespread use of neuromorphic computing will depend on many factors. The technology requires new programming and algorithms, so it will not immediately replace traditional processors. One key factor for us is that automotive-grade chips must meet extremely strict reliability requirements. However, we are already actively working to drive development and we are committed to being the first to use this technology in the automotive industry.
If you haven’t read the article yet, check it out here https://lnkd.in/epnUc5Sy. Be sure to ask more questions so we can keep the conversation going.
Neuromorphic computing? We’ve got that.
Because it’s still nascent technology, I am frequently asked to describe #neuromorphic computing. It is a paradigm shift for how we perform computations in machine learning (#ML) and artificial intelligence (AI), which process massive amounts of data requiring tons of fast memory.
Currently available processor architecture separates data calculations from system memory, which is inefficient. The biological inspiration for neural networks is the human brain, where computing and memory are combined, and data processing uses neurons to communicate through electrical signals and chemical processes known as neurotransmitters.
In neuromorphic computing, those human neurons and synapses are modelled in circuits and communication is event-driven, with information coded in spikes, mimicking the processing fundamentals of the brain. Those spikes propagate through a Spiking Neural Network of artificial neurons and synapses to predict results. Information processing is measured by spike rate or spike time instead of the number of calculations. Thus, neuromorphic chips are more energy efficient and have lower latency than conventional CPUs and GPUs. That means much faster computation using considerably less power.
However, this change in data processing also requires new software algorithms specifically designed to work with neuromorphic hardware. Existing algorithms can only partially leverage the many benefits of neural technology. Thanks to Valerij, Alexander, Christina in the Innovations & Future Technology area and the rest of our team for tackling this huge project!
𝗔𝗽𝗽𝗹𝗶𝗰𝗮𝘁𝗶𝗼𝗻𝘀
Neuromorphic computing reduces the power required for advanced AI computation, which is useful in applications where energy is limited, like electric vehicles. However, we still need automotive-grade chips with neuromorphic technology before this technology becomes common in cars.
We at Mercedes-Benz AG are currently working on novel algorithms that take advantage of neuromorphic computing to improve the energy efficiency and performance of our cars. Our primary goals are to extend vehicle range, make safety systems react faster, and increase the number of #AI functions possible. In 2020, we joined the #Intel Neuromorphic Research Community, and since then we have been continuously expanding our collaborations with other research partners and universities to ensure our software and hardware solutions continue to lead the industry.
It's an exciting time to be in the world of automotive technology. Please share any questions and comments below.
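For anyone wanting a feel for what "event-driven, with information coded in spikes" means in practice, here is a minimal sketch of a leaky integrate-and-fire neuron, the basic building block of the Spiking Neural Networks described in the post above. This is purely my own toy illustration in plain Python with arbitrary parameter values, not anything from Mercedes-Benz or the LinkedIn post:

```python
# Toy leaky integrate-and-fire (LIF) neuron, for illustration only.
# Input arrives as discrete spike events (0/1 per timestep); the membrane
# potential decays ("leaks") each step and only rises when a spike arrives.
# The neuron emits an output spike when the potential crosses a threshold,
# then resets. Parameter values here are arbitrary.

def lif_neuron(spike_train, leak=0.9, weight=0.5, threshold=1.0):
    """Return the timesteps at which the neuron emits an output spike."""
    potential = 0.0
    out_spikes = []
    for t, spike_in in enumerate(spike_train):
        potential = potential * leak + weight * spike_in  # leak, then integrate
        if potential >= threshold:                        # fire and reset
            out_spikes.append(t)
            potential = 0.0
    return out_spikes

# A sparse input train produces a sparse output train -- only two spikes here.
print(lif_neuron([1, 0, 1, 1, 0, 0, 1, 1, 1, 0]))  # -> [3, 8]
```

On neuromorphic hardware, an update like this is only triggered when a spike actually arrives, rather than running every clock cycle for every value as on a conventional CPU or GPU, which is where the latency and energy advantages described above come from.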
Nice SG.
Sean mentioned the Mercedes NDA in that interview last week.
Toooooo scared to mention Akida in case the SP blows up again
Wasn't sure whether to laugh or cry.
Being part of the INRC, here's a thought.
Given we are not a brand name in the wider scheme of things and Intel is, to consumers (today's world and consumers are sadly aligned to brands), I wonder if our association with IFS is a designed pathway to allow MB to use us via an Intel name eventually, and to complete the requisite testing and certification of the chips in due course.