
Choosing the Right AI Strategic Partner for Healthcare - Tata Elxsi
Discover how AI's evolution from curiosity to necessity impacts real-world applications. This guide stresses strategic AI partner selection in healthcare.

Who knows? Will ARM buy BRN?
This article seems to indicate further progression of Masayoshi Son's project codenamed Izanagi, which we discussed in Feb 2024. If realised, this would be a MASSIVE move by Arm and SoftBank! As we know, Arm makes most of its money through the royalties it collects from its customers each time they make a chip using its designs. If it launches its own product next year, then Arm will be competing with the very same companies it is currently servicing, while at the same time it plans to reduce its reliance on NVIDIA and also compete with the likes of Amazon Web Services and Microsoft by using the new AI chips to build its own data centres.
This leads me to believe that Arm must have something very special up its sleeve, no pun intended!
Correct me if I'm wrong, but I don't think Arm has ever commented on any other partner technology within its ecosystem being capable of making its own products perform BETTER!
PS: I guess if they intend to use our IP in chips that will be produced as early as next year, they're going to have to get a wriggle on and sign on the dotted line fairly soon, one would imagine.
UPDATED 17:43 EDT / MAY 12 2024
AI
Arm reportedly set to enter AI chip market with first product next year
BY DUNCAN RILEY
Computer chip designer Arm Holdings Plc is reportedly set to enter the artificial intelligence chip market and is seeking to launch its first product next year.
The decision by Arm to develop its own AI chips is said to be part of a move by parent company SoftBank Group Corp. to transform the group into a sprawling AI powerhouse. Although Arm has been publicly listed since its initial public offering in September, SoftBank still owns approximately 90% of the company.
Nikkei Asia reported today that Arm will set up an AI chip division and is aiming to build a prototype by the northern spring of 2025. Mass production, which will be contracted out to chip manufacturers, including Taiwan Semiconductor Manufacturing Company Ltd., is expected to start later the same year.
Arm will reportedly shoulder the initial development costs, expected to be in the hundreds of billions of yen – 100 billion yen at the current exchange rate is $642 million. SoftBank is also expected to contribute funds to assist. According to Nikkei, once a mass-production system is established, the AI chip business could be spun off and placed under SoftBank.
Arm already supplies circuit architecture for processors used in smartphones and graphics processing units, but the move to design and then subcontract the manufacture of AI chips would be a first for the company. Currently, Arm makes most of its money through the royalties it collects every time a company makes a chip using its designs. Now, if its vision is realized, Arm will compete with those same companies.
Visions are not rare when it comes to SoftBank Chief Executive Officer Masayoshi Son, who is said to be spearheading the push. Son has a vision of an “AI revolution,” with SoftBank aiming to expand into data centers and robots. His vision includes bringing together AI, semiconductor and robotics technologies to spur innovation in various industries.
SoftBank plans to build data centers equipped with homegrown Arm chips in the U.S., Europe, Asia and the Middle East as early as 2026. Because Son has never been shy about thinking big, SoftBank also plans to branch out into power generation through wind and solar power farms, with an eye on next-generation fusion technology to power its data centers.
While ambitious, Son’s plans are certainly achievable, but what remains unknown is whether Arm will make its designs available to its existing customers or how those customers will respond to Arm entering the AI chip market. SoftBank’s plans to build data centers using its own AI chips will also see it compete against the likes of Amazon Web Services Inc. and Microsoft Corp., both of which currently license Arm circuit architecture for processors.
This is what features in Arm's Partner Ecosystem catalogue.
View attachment 62738
BrainChip
Ubiquitous, distributed AI. Close to the sensor. Inspired by the human brain.
www.arm.com
Well, natural SNNs are analog... ARM think that NNs use MACs:
Note: This patent addresses the issue of "attention" which enables the processor to refer back to earlier inputs for context - used in natural language processing.
US2024028877A1 NEURAL PROCESSING UNIT FOR ATTENTION-BASED INFERENCE 20220721
[0063] The neural processing unit (106) includes a central control element (110), a direct memory access element (112), an activation output element (114), a multiplication accumulation engine (116), a shared buffer (118), and a weight decoder (120).
View attachment 62759
[0066] At stage S11, the direct memory access element (112) of the NPU (106) fetches compressed projection matrices WQ, WK, and WV from the flash memory (102). The weight decoder (120) decodes the compressed matrices. The MAC engine (116) calculates query matrix Q, key matrix K, and value matrix V by multiplying the query projection matrix WQ, the key projection matrix WK, and the value projection matrix WV by input matrix X.
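For anyone who wants to see what that stage boils down to, here is a minimal NumPy sketch of the projection-and-attention step the excerpt describes. The names X, WQ, WK and WV follow the patent text; the matrix shapes, the X @ W ordering and the softmax scaling are just the standard scaled dot-product attention recipe, not anything taken from the patent itself.

```python
import numpy as np

def scaled_dot_product_attention(X, WQ, WK, WV):
    """Compute Q/K/V projections and an attention output for input matrix X.

    X:  (seq_len, d_model) input matrix
    WQ, WK, WV: (d_model, d_k) projection matrices (the patent's compressed
    weights, assumed here to be already decoded into dense form).
    """
    Q = X @ WQ          # query matrix, as the MAC engine would compute it
    K = X @ WK          # key matrix
    V = X @ WV          # value matrix

    # Attention scores are what let each position "refer back" to earlier inputs.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V

# Toy example: 4 tokens, model width 8, head width 4 (illustrative sizes only).
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
WQ, WK, WV = (rng.normal(size=(8, 4)) for _ in range(3))
out = scaled_dot_product_attention(X, WQ, WK, WV)
print(out.shape)  # (4, 4)
```

In other words, the MAC engine's job in this patent is essentially the three matrix multiplications at the top of that function, with the attention arithmetic built on top.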
They also think SNNs are analog:
WO2024018231A2 IMPROVED SPIKING NEURAL NETWORK APPARATUS 20220721
View attachment 62760
View attachment 62761
View attachment 62762
Figure 2 is a schematic diagram showing the structure of a synaptic delay path.
Figure 3 is a schematic diagram showing the structure of a neuron within a spiking neural network apparatus.
A spiking neural network is described that comprises a plurality of neurons in a first layer connected to at least one neuron in a second layer, each neuron in the first layer being connected to the at least one neuron in the second layer via a respective variable delay path. The at least one neuron in the second layer comprises one or more logic components configured to generate an output signal in dependence upon signals received along the variable delay paths from the plurality of neurons in the first layer. A timing component is configured to determine a timing value in response to receiving the output signal from the one or more logic components, and an accumulate component is configured to accumulate a value based on timing values from the timing component. A neuron fires in a case that a value accumulated at the accumulate component reaches a threshold value.
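To make the "delay path, timing value, accumulate, fire at threshold" idea a bit more concrete, here is a rough toy sketch in Python of how a second-layer neuron like the one in the abstract could behave. The per-connection delays, the rule for turning arrival times into a timing value, and the threshold are all made-up illustrative choices on my part; the patent describes hardware logic components, not software.

```python
import numpy as np

class DelayCodedNeuron:
    """Toy model of the second-layer neuron described in the abstract.

    Input spikes travel along per-connection variable delay paths; the
    neuron turns the (delayed) arrival pattern into a timing value,
    accumulates those values, and fires once the accumulated value
    reaches a threshold. All numbers here are illustrative only.
    """

    def __init__(self, delays, threshold=0.9):
        self.delays = np.asarray(delays, dtype=float)  # one delay per input path
        self.threshold = threshold
        self.accumulated = 0.0

    def step(self, spike_times):
        """spike_times[i] is when input neuron i fired (np.inf if it was silent)."""
        arrivals = np.asarray(spike_times, dtype=float) + self.delays
        active = arrivals[np.isfinite(arrivals)]
        if active.size == 0:
            return False
        # Stand-in for the logic + timing components: tightly clustered
        # arrivals produce a larger timing value (an invented rule).
        timing_value = 1.0 / (1.0 + active.max() - active.min())
        self.accumulated += timing_value          # accumulate component
        if self.accumulated >= self.threshold:    # fire on reaching threshold
            self.accumulated = 0.0
            return True
        return False

# Three first-layer neurons feeding one second-layer neuron.
neuron = DelayCodedNeuron(delays=[0.0, 1.0, 2.0], threshold=0.9)
for t in range(5):
    print(t, neuron.step([t, t, t]))  # coincident input spikes each time step
```

The point of the sketch is simply that timing, not analog current, carries the information: the neuron integrates timing values over steps and emits a spike only when the running total crosses the threshold.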
Just a few thoughts based on your comments that it appears ARM are still totally cloud focused.

Good post. At the moment ARM are totally cloud focussed. They are well aware that cloudless AI at the Edge is going to see exponential growth.
While there is room for both cloud and cloudless (AKIDA), ARM will no doubt hedge its bets and utilise AKIDA to take advantage of the expected growth.
It's the old infallibility thing again - I heard key "customer" not key "competitor" - but I'm damned if I can find it.

Just a few thoughts based on your comments that it appears ARM are still totally cloud focused.
Who is the one company that ARM relies on the most for revenue (though not the only one), and isn't that company still promoting nearly everything as linked to the cloud rather than at the edge?
Also linked to the other recent news that ARM is looking to start building its own AI chips in 2025?
Now, with the Akida 2.0 IP, BRN is holding off developing any reference chip, unlike with Akida 1000. That is due to BRN management announcing (my recall) that they are not progressing any further with producing chips as it may upset a key competitor (they don't want to step on any toes). Who may that be? Lol
Unfortunately BRN is playing with the big boys, and at least a couple of them don't want AI at the edge as quickly as BRN management's timelines, because of their existing investments (i.e. hardware and other infrastructure). So they are milking as much revenue as they can out of their current investments before possibly moving to BRN's AI at the edge and buying an IP licence and/or generating royalties through products etc. for BRN shareholders.
I suggest at least a couple of the big boys are dictating all the terms of engagement with BRN, which has upset the sales tactics / progression by BRN management / BOD, and thus a couple of BRN staff recently left. Did anyone from the BRN sales team or management upset one of the big boys in the negotiations towards an IP licence, so that BRN has had to pivot for now?
Interesting times but Akida will get there. Just like Thomas the tank engine getting up the hill.
Wishing you all the very best with your investment in BRN.
Cheers
The Pope
Lmfao. If I change it to customer, does that mean you are ok with the rest of my comments, Dio?

It's the old infallibility thing again - I heard key "customer" not key "competitor" - but I'm damned if I can find it.
It was Sean. See attached slide from an investment presentation last year.

Lmfao. If I change it to customer, does that mean you are ok with the rest of my comments, Dio?
Others have said Nvidia may be considered more like a friend than a competitor, or along those lines. Did that comment come from Sean?
Don't really recall if I'm wrong or not, but a few on here can get upset quickly … that is a given. I do laugh that some put people on ignore regardless of their point of view. Still haven't put anyone on ignore, but have gone close. I notice DK6161 hasn't been on for a while. lol
https://www.datacenterdynamics.com/en/news/ai-chip-startup-deepx-raises-80m-receives-529m-valuation/
AI chip startup Deepx raises $80m, receives $529m valuation
Funding round was led by SkyLake Equity Partners
South Korean AI chip startup Deepx has raised $80 million in a Series C funding round, leading to the company receiving a valuation of $529 million.
The investment round was led by SkyLake Equity Partners, a South Korean private equity firm, and included participation from AJU IB and previous backer Timefolio Asset Management.
Founded in 2018 by former Apple and Cisco engineer Lokwon Kim, Deepx produces an 'All-in-4 AI solution' that includes the company's DX-V1 and DX-V3 processors for use in consumer electronics, and its DX-M1 and DX-H1 chips, which have been designed for AI computing boxes and AI servers.
In comments to TechCrunch, Kim said that Deepx would use the funding to mass produce the company’s four existing AI chips, in addition to supporting the development and launch of its next generation of large language model (LLM) on-device solutions.
“Nvidia’s GPU-based solutions are the most cost-effective for large language model services like ChatGPT; the total power consumed by operating GPUs has reached levels exceeding the electrical energy of an entire country,” Kim told TechCrunch. “This collaborative operation technology between server-scale AI and on-device large AI models is expected to reduce energy consumption and costs a lot compared to relying solely on data centers.”
Deepx currently employs around 65 people but does not yet have any customers, although Kim said the company was working with more than 100 potential clients and strategic partners. The company also has more than 259 patents pending in the US, China, and South Korea.
Unbelievable Wilziy123, you really are a piece of work, aren't you? I refrain from commenting often on this forum, and when I do it's generally to praise a member's comments or thoughts. However, your attitude is very aloof and condescending, your response to Guzzi62 was totally unnecessary, and you are very unpleasant! I base this on your years of history, not just this particular comment, and your memes reflect this; I am amazed at the 14 supportive replies you received. I apologize to the rest of the forum for this being totally unrelated, but I felt I had to say something. Sorry, and thank you to those who chose to read. (Ok, back to normal transmission)
Yes, who knows … maybe Rob left the company because he didn't want to work again for ARM.

Who knows?
But one thing that is certain is that one of ARM's big claims/aims is to save power, and I reckon we fit into this category very nicely:
https://www.linkedin.com/advice/0/what-current-future-trends-challenges-power-management
Reference
View attachment 62753