BRN Discussion Ongoing

skutza

Regular


Fantastic! They are all wonderful endorsements for BRN.

Now let's see what they are doing about it :)

[screenshot attachment]



Oh, at least they think we are awesome!

But don't get me wrong, I think they're awesome too, so I'm first in line at .395 :)

[screenshot attachment]
 
Last edited:
  • Like
  • Sad
  • Fire
Reactions: 5 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
The battle for the edge is really heating up 🔥 and this can only be VERY GOOD NEWS for us IMO!


'There is a battle looming in 2024': Qualcomm wants to become 'the Nvidia of AI' on your smartphone​

MarketWatch

Provided by Dow Jones
Mar 7, 2024 9:13am
By Ryan Shrout
All the major players are trying to surf this next wave of AI, including Intel, Nvidia, AMD and Arm Holdings
In the world of artificial intelligence and the computing infrastructure that powers AI, there is only one king: Nvidia (NVDA).
The company has skyrocketed to one of the largest in the world on the back of its success leading the transition to an AI computing ecosystem. It has a dominant hold on the chips that power the data centers and servers that enable companies such as Microsoft (MSFT), Amazon.com (AMZN), and Alphabet (GOOGL) to deliver AI services.
But Nvidia's position of strength is limited to servers and data centers. There is an emerging battle for the AI market at the edges, defined as a combination of laptops, smartphones, and individual consumer and commercial devices. It is on these devices where this AI content will be consumed.
This segment is still in play, with combatants ranging from Intel (INTC) to AMD (AMD) to Qualcomm (QCOM), and yes even Nvidia with its gaming and mobile graphics chips. Everyone is fighting to control what most believe is the next growth frontier for AI and for chips to power it. Who will become the Nvidia of edge AI?
Qualcomm is in a unique position of opportunity with AI at the edge, thanks to a combination of its communication background, strong investment in new computing architectures, and an initiative to create a software springboard to rival what Nvidia did with CUDA all those years ago.
Though Qualcomm and CEO Cristiano Amon consistently refer to the company's transition from a communications to a computing company, its heritage and leadership in communications is often undervalued. AI computing on local devices will drive the need for more data and bandwidth consumption. Qualcomm's lead in modems and wireless technology is an advantage that is often overlooked.
Having the best wireless technology is critical. Qualcomm recently unveiled its newest cellular modem called the Snapdragon X80 5G, as well as a new FastConnect 7900 chip that combines the latest Wi-Fi, Bluetooth, and wideband technologies. These will get integrated into the highest-performance smartphones next year, as well as high-end laptops.

If Qualcomm wants to do in the consumer and device AI space what Nvidia has done for the data center, it will take a significant effort in the form of software. Nvidia's CUDA is a combination of software development tools, drivers, and pre-configured, regularly updated models that make writing software for its GPUs as simple as possible. Software that is developed using CUDA is more likely to be deployed at scale on those same architectures in the cloud.
Another move that Qualcomm recently made is the introduction of its AI Hub, a toolset of its own for deploying and optimizing AI for its processors. This takes Qualcomm beyond just enabling on-stage demos and promotional content for social media. It's a developer engagement platform that provides a simple way to integrate tools and instructions to make sure the most important AI models run on Snapdragon hardware and run in an optimized, maximum-performance state.
The Qualcomm AI Hub currently offers more than 75 unique AI models that are ready for software teams to integrate and it supports the Snapdragon chips for smartphones, laptops and even automobiles. The company hopes that by making integration and support of its chips as easy as possible, it can entice developers to target its hardware with emerging applications that will be a part of the AI revolution at the edge and create the same inherent advantage that Nvidia has with CUDA.
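(As a concrete illustration of what "targeting Snapdragon hardware" means from a developer's desk, here is a minimal sketch. It uses ONNX Runtime's QNN execution provider rather than the AI Hub tooling itself, and the model path, input name and shape are placeholder assumptions, not anything from the article.)

import numpy as np
import onnxruntime as ort

# "model.onnx" stands in for an already-exported network; the input name and
# shape below are likewise illustrative.
session = ort.InferenceSession(
    "model.onnx",
    providers=["QNNExecutionProvider", "CPUExecutionProvider"],  # Qualcomm NPU first, CPU fallback
)

dummy = np.zeros((1, 3, 224, 224), dtype=np.float32)
outputs = session.run(None, {"input": dummy})
print(outputs[0].shape)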
The third piece to this is about the hardware itself. In the smartphone market, Snapdragon is the clear leader in market share and performance, powering most of the highest-end devices like the Samsung Galaxy S24 Ultra. This leadership position means that Android-application developers will need to target Qualcomm chips for any AI computing. The company has also teased that its latest CPU core architecture, called Oryon and stemming from its Nuvia acquisition a few years back, will be coming soon to its smartphone chips as well, keeping Qualcomm in the driver's seat in this market.
For the upcoming AI PC boom that most are projecting to happen in the second half of this year and into 2025, even though Qualcomm has minimal market share today, the Snapdragon X Elite platform announced in October 2023 offers more than four times the AI performance of the currently shipping chips from Intel or AMD. The first laptops featuring this chip won't be available until June, but all indications are that Qualcomm has some incredibly influential design wins and system partners that will raise eyebrows and drive some impressive volume in the market. Intel, AMD and Nvidia won't go down without a fight of course, and Qualcomm has a steep hill to climb in the PC space.
It's clear to me that there is a battle looming in 2024 for the mind share and wallets of consumers looking to bring AI to their phones, laptops, and almost everything else at the edge. Every consumer technology company is trying to surf this next wave of AI, including Intel, Nvidia, AMD, Arm Holdings (ARM), Qualcomm and smaller startups like MemryX or Rabbit. With its strength in the smartphone market, a growing portfolio of products for PCs, and a new software initiative that will drive developer adoption, Qualcomm has a solid foothold here.
Ryan Shrout is president of Signal65 and founder at Shrout Research. Follow him on X @ryanshrout. Shrout has provided consulting services for AMD, Qualcomm, Intel, Arm Holdings, Micron Technology, Nvidia and others. Shrout holds shares of Intel.
 
  • Like
  • Love
  • Fire
Reactions: 31 users

JoMo68

Regular
I haven't seen this article before. It is hands down one of the best descriptions of the problem BrainChip solves, how it solves it, and why we are at the inflection point where the problem must be solved. Might have to send it to my reluctant family members, to whom I have so far been unsuccessfully trying to explain BrainChip's potential.
Thank you for sharing this @Fullmoonfever it is a fantastic find 🤩
I’m pretty sure one of the posters here wrote this article and asked us for our comments. I can’t remember who but I feel like it was one of our German posters 🤔
 
  • Like
  • Wow
  • Fire
Reactions: 10 users

Diogenese

Top 20
The battle for the edge is really heating up 🔥 and this can only be VERY GOOD NEWS for us IMO!


For the upcoming AI PC boom that most are projecting to happen in the second half of this year and into 2025, even though Qualcomm has minimal market share today, the Snapdragon X Elite platform announced in October 2023 offers more than four times the AI performance of the currently shipping chips from Intel or AMD. The first laptops featuring this chip won't be available until June, but all indications are that Qualcomm has some incredibly influential design wins and system partners that will raise eyebrows and drive some impressive volume in the market. Intel, AMD and Nvidia won't go down without a fight of course, and Qualcomm has a steep hill to climb in the PC space.

Fingers crossed ...
 
  • Like
  • Fire
  • Love
Reactions: 31 users

JoMo68

Regular

Here, but maybe I misinterpreted and they just found a good article that explained in simple terms suitable for their parents 🤔
 
  • Like
  • Love
  • Sad
Reactions: 9 users

Diogenese

Top 20

While I love this as an amazing statement for our chipper, I'm wanting something more on this post, in brackets or an asterisk below, with a notable credit from a peer of the industry, or official recognition of some independent, certified assessment etc. ....... Give me kudos with backed-up credentials as verification ...... I know this group's research is second to none on topic, but in my industry there's a common problem with some companies, where you start believing your own BS ...... where are the testimonials please ........🙏

Qualified rant over….. !

View attachment 58686
If BRN were in the brass section, they'd need to borrow someone else's trumpet ...





'cause they don't blow their own very often.
 
  • Haha
  • Like
  • Love
Reactions: 21 users

IloveLamp

Top 20
Last edited:
  • Like
  • Love
  • Fire
Reactions: 13 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Love
  • Wow
Reactions: 14 users

Draed

Regular
  • Like
  • Fire
Reactions: 11 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
View attachment 58662 View attachment 58663



If Tony likes it, I suppose I do too.😝



Chipping Away at Edge AI Inefficiencies​

Engineers are developing AI-centric chips that integrate processing and memory into the same unit, enabling efficient AI on mobile devices.​


Nick Bild
6 hours ago • Machine Learning & AI

An early prototype of the chip (📷: Hongyang Jia / Princeton University)







The latest and most powerful AI algorithms have reached a level of complexity and sophistication that demands significant computational resources to execute efficiently. These algorithms, often based on deep learning architectures such as convolutional neural networks or transformer models, typically run on powerful computers located in cloud computing environments. These environments offer the scalability and resources needed to handle the intensive computational requirements of cutting edge AI tasks.
In order to limit latency and protect sensitive information, mobile devices, such as smartphones and tablets, need to be capable of running these advanced algorithms locally to power the next generation of AI applications. But they have limited computational capabilities and energy budgets compared to the servers found in cloud environments. Factors such as these have limited the rollout of this critical technology where it is needed most.
Furthermore, traditional computing architectures, both in mobile devices and in servers, have a separation between processing and memory units. This architecture introduces a bottleneck that greatly limits processing speeds in data-intensive applications like AI. In AI tasks, where large amounts of data need to be processed rapidly, this bottleneck becomes particularly problematic. Processing data stored in separate memory units incurs latency and reduces overall efficiency, hindering the performance of AI algorithms even further.
To overcome these challenges and enable the widespread adoption of AI on mobile devices, many innovative solutions are actively being explored. Princeton University researchers are working in conjunction with a startup called EnCharge AI towards one such solution — a new type of AI-centric processing chip that is powerful, yet requires very little power for operation. By reducing both the size of the hardware and the power consumption required by the algorithms, these chips have the potential to free AI from the cloud in the future.
Professor Naveen Verma is leading the effort to build the new chip (📷: Sameer A. Khan / Fotobuddy)

Achieving this goal required an entirely different way of looking at the problem. Rather than sticking with the tried and true von Neumann architecture that has powered our computer systems for decades, the researchers designed their chip such that processing and memory co-exist in the same unit, eliminating the need to shuttle data between units via relatively low bandwidth channels.
This is not the first in-memory computing architecture to be introduced by a long shot, but to date, existing solutions have been very limited in their capabilities. The computing needs to be highly efficient, because the hardware must fit within tiny memory cells. So rather than using the traditional binary language to store data, the team instead encoded data in analog. This allows many more than two states to be stored at each address, which allows for data to be packed much more densely.
Using traditional semiconductor devices like transistors, working with analog signals proved to be challenging. In order to guarantee accurate computations that are not impacted by changing conditions like temperature, the researchers instead used a special type of capacitor that is designed to switch on and off with precision to store and process the analog data.
Early prototypes of the chip have been developed and demonstrate the potential of the technology. Further work will still need to be done before the technology is ready for use in the real world, however. After recently receiving funding from DARPA, the chances of that work being completed successfully have risen.
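The two key ideas here, multi-level analog storage (so each cell holds log2(levels) bits rather than one) and performing the multiply-accumulate where the weights are stored, can be sketched in a few lines. This is a conceptual illustration only, not EnCharge AI's actual circuit; the 16-level choice and every name in it are assumptions for the sketch.

import numpy as np

LEVELS = 16  # 16 analog levels per cell -> log2(16) = 4 bits per cell instead of 1

def encode_analog(weights, levels=LEVELS):
    """Map real-valued weights onto a discrete set of stored 'charge' levels."""
    w_min, w_max = weights.min(), weights.max()
    scale = (levels - 1) / (w_max - w_min)
    codes = np.round((weights - w_min) * scale)  # the levels kept in the memory cells
    return codes, w_min, scale

def in_memory_dot(codes, w_min, scale, activations):
    """Dot product computed against the stored levels, the way a crossbar array accumulates in place."""
    decoded = codes / scale + w_min              # charge level -> approximate weight value
    return decoded @ activations                 # accumulate without moving weights to a separate processor

rng = np.random.default_rng(0)
weights = rng.normal(size=256).astype(np.float32)
activations = rng.normal(size=256).astype(np.float32)

codes, w_min, scale = encode_analog(weights)
approx = in_memory_dot(codes, w_min, scale, activations)
exact = weights @ activations
print(f"exact={exact:.3f}  analog-approx={approx:.3f}  bits/cell={int(np.log2(LEVELS))}")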

 
  • Like
  • Fire
  • Love
Reactions: 14 users

IloveLamp

Top 20
  • Thinking
  • Wow
  • Like
Reactions: 9 users

JoMo68

Regular
  • Like
  • Love
Reactions: 16 users

ndefries

Regular
View attachment 58718
With Sean's HP background this is one to watch.
View attachment 58722
It wasn't a good look, them closing down that significant investment. If this were true, I would have thought they would buy it while their car division is still up, and merge them, taking all that is better from the acquisition.

If they buy it now, it sends a clear message that they failed and someone else does it better.
 
  • Like
Reactions: 6 users

Frangipani

Regular
I’m pretty sure one of the posters here wrote this article and asked us for our comments. I can’t remember who but I feel like it was one of our German posters 🤔


Here, but maybe I misinterpreted and they just found a good article that explained in simple terms suitable for their parents 🤔

Hi JoMo68,

No, you didn't misinterpret it - it was indeed @Berlinforever himself/herself who wrote this excellent summary six months ago, as is also evident from this recent post:

[screenshot attachment]
 
Last edited:
  • Like
  • Love
Reactions: 11 users
So... would we expect, if they get traction and get their smart bins into locations, and if they're still going to use Akida, that we should see a formal agreement sooner rather than later? :)



Mark Grogan
Founder Enzide Technologies - Founder Circle 8 Clean Technologies
1mo

The 10c deposit scheme in Australia for containers is gathering momentum, with all states and territories implementing the scheme. This interesting article highlights where the opportunity lies in collecting more. Circle 8 Clean Technologies has identified these same opportunities and will target our smart bins at these locations from 2024!
The force driving our circular economy, 10c at a time

afr.com

 
  • Like
  • Fire
  • Love
Reactions: 21 users
  • Like
Reactions: 3 users

JoMo68

Regular

Finally figured out how to attach the article! We’re referenced in Box 4. A great journal to appear in.
 
  • Like
  • Love
  • Fire
Reactions: 15 users
Samsung is certainly a big spicy cabbage, but some prefer fruit..

I think you're being a bit dramatic, saying we're screwed if we're not in with them.

Yes, they are a huge player, but they don't control or dominate the world's product markets.

For example..

"Apple has overtaken Samsung as the world's top smartphone seller, ending the Korean tech firm's 12-year run as industry leader. The iPhone took the top spot in 2023 with 234.6m units sold, according to figures from the International Data Corporation (IDC), overtaking Samsung's 226.6m units"
17 Jan 2024


Are you saying that if we got in with Apple, or some other big players, but not Samsung, we may as well pack up and go home?..

Of course, no guarantees anywhere, but your arguments don't make sense, in my opinion.

I'm not saying you can't have one..
It's not necessarily about smartphones. They're interesting too, don't get me wrong, but I'm worried about the products on the 2024 roadmap.
For the first time these will be ultra low power, along with all the other good stuff.
Since they create their own chips, this is where the danger lies if they were to catch up with us.
This is what I'm talking about: https://www.koreatechtoday.com/samsung-unveils-2024-generative-ai-roadmap-for-home-appliances/
 
  • Like
  • Haha
  • Thinking
Reactions: 7 users
  • Haha
  • Like
Reactions: 7 users
It's not necessarily about smartphones. They're interesting too, don't get me wrong, but I'm worried about the products on the 2024 roadmap.
For the first time these will be ultra low power, along with all the other good stuff.
Since they create their own chips, this is where the danger lies if they were to catch up with us.
This is what I'm talking about: https://www.koreatechtoday.com/samsung-unveils-2024-generative-ai-roadmap-for-home-appliances/
We do have a history with Samsung.

Anil Mankar cheekily thanked Samsung for giving them the DVS? camera used in the first demonstration of the now famous (to us anyway) Tiger identification demo.
Just before the infamous 100 million shares at 3 cents, to that shark mob (Royal something?) who sold most of them at 4.5 cents 🤣..

So there is a good chance, that our tech will be somewhere in their offerings.

The other thing is that the Company (Sean) has stated that, when the dust settles, they want to be one of the 3 major players in the Edge A.I. space.

It's possible that Samsung may be one of the other 3, in the end..
If that's the case, I think they will have got there, by "playing" with our tech first.

I believe we have the technology, the people, enough irons in the fire and a strong enough ecosystem that we will prevail through the inevitable heated competition in this space.

Would be nice to see some meaty IP deals drop this year, for sure.
 
  • Like
  • Fire
  • Love
Reactions: 35 users
Top Bottom