BRN Discussion Ongoing

jtardif999

Regular
Patents are worthless if they're defending a concept that's not commercially viable. If BRN doesn't start making serious revenue soon, it won't have the resources to defend against patent infringement anyway. This is simply a red herring designed to distract would-be investors from what is actually going on.
'The cornerstone of BRN value' - what a bloody joke. The actual cornerstone of BRN value is IP licences, royalties and big tier 1 companies shouting from the rooftops about how amazing Akida is. None of which appears to be happening.
You are completely wrong imo. BrainChip's patents are everything - so important to our future worth, not only for defence but as an attraction in a takeover. It seems to me that you don't really get how unique BrainChip's technology is and how important it could likely become. Let's think for a moment of the time when the limits and sustainability of current AI solutions are inevitably exhausted. It will happen; things can't go on along the present path that Nvidia is squeezing out atm. The time will come when BRN's tech is suddenly the tech of choice (they are setting themselves up for it) and then we will get very big very quickly, or someone will want control of our patents so badly they will be willing to pay an incredible premium for them. AIMO.
 
  • Like
  • Fire
  • Love
Reactions: 40 users

IloveLamp

Top 20
Screenshot_20230915_011957_LinkedIn.jpg
Screenshot_20230915_012206_Chrome.jpg
 
  • Like
  • Fire
Reactions: 18 users

IloveLamp

Top 20
Screenshot_20230915_013045_LinkedIn.jpg
 
  • Like
  • Fire
  • Love
Reactions: 28 users
Maximum
This brand does doorbells too, and the 1 TOPS NPU is very interesting.

My spidey sense is tingling.

Bring it on shorters!
I have no doubt in my mind this has Brainchip Akida S.
I’d like someone to prove me wrong.

Not financial advice.
I have the current version. It requires internet connectivity, is power inefficient, doesn't have specific facial recognition and needs fairly regular recharging, 1-2 times monthly.

If the internet cuts out it has to be reset, and it takes ages fluffing around re-setting it.

I’ve been banging on to my wife about how good it would be if it had Akida inside, so in my view the newest versions would be screaming out for the efficiency benefits of Akida.

Whilst on that topic, the Deebot vacuums are in the same boat - they rely on an internet connection and need constant resetting. Another strong use case for Akida.
 
  • Like
  • Fire
Reactions: 8 users

Frangipani

Regular
Hadn't personally seen this project over at Edge Impulse before.

The Google search link said late May '23, but who knows.

I've taken all the code sections etc. out, but the full write-up is at the link. Pretty cool, running with FOMO.


Here is another article about Naveen Kumar’s traffic monitoring project, published on Wevolver today - they really seem to love Akida! 😍

Real-Time Traffic Monitoring with Neuromorphic Computing​

David Tischler
14 Sep, 2023


Article #5 of Spotlight on Innovations in Edge Computing and Machine Learning: A computer vision project that monitors vehicle traffic in real-time using video inferencing performed on the Brainchip Akida Development Kit.​

Artificial Intelligence
- Edge Processors
- Embedded Machine Learning
- Neural Network
- Transportation

This article is part of Spotlight on Innovations in Edge Computing and Machine Learning. The series features some unique projects from around the world that leverage edge computing and machine learning, showcasing the ways these technological advancements are driving growth, efficiency, and innovation across various domains.
This series is made possible through the sponsorship of Edge Impulse, a leader in providing the platform for building smarter, connected solutions with edge computing and machine learning.


In the ever-evolving landscape of urban planning and development, the significance of efficient real-time traffic monitoring cannot be overstated. Traditional systems, while functional, often fall short when high-performance data processing is required in a low-power budget. Enter neuromorphic computing—a technology inspired by the neural structure of the brain, aiming to combine efficiency with computational power. This article delves into an interesting computer vision project that monitors vehicle traffic using this paradigm.

Utilizing aerial camera feeds, the project can detect moving vehicles with exceptional precision, making it a game-changer for city planners and governments aiming to optimize urban mobility. The key lies in the advanced neuromorphic processor that serves as the project's backbone. This processor is not just about low power consumption—it also boasts high-speed inference capabilities, making it ideal for real-time video inferencing tasks.

But the journey doesn't end at hardware selection. This article covers the full spectrum of the project, from setting up the optimal development environment and data collection methods to model training and deployment strategies. It offers a deep dive into how neuromorphic computing can be applied in real-world scenarios, shedding light on the processes of data acquisition, labeling, model training, and final deployment. As we navigate through the complexities of urban challenges, such insights pave the way for smarter, more efficient solutions in traffic monitoring and beyond.



Traffic Monitoring using the Brainchip Akida Neuromorphic Processor​

Created By: Naveen Kumar
Public Project Link:
https://studio.edgeimpulse.com/public/222419/latest

Overview​

A highly efficient computer-vision system that can detect moving vehicles with great accuracy and relative motion, all while consuming minimal power.

By capturing moving vehicle images, aerial cameras can provide information about traffic conditions, which is beneficial for governments and planners to manage traffic and enhance urban mobility. Detecting moving vehicles with low-powered devices is still a challenging task. We are going to tackle this problem using a Brainchip Akida neural network accelerator.

Hardware Selection​

In this project, we'll utilize BrainChip’s Akida Development Kit. BrainChip's neuromorphic processor IP uses event-based technology for increased energy efficiency. It allows incremental learning and high-speed inference for various applications, including convolutional neural networks, with exceptional performance and low power consumption.

The kit consists of an Akida PCIe board, a Raspberry Pi Compute Module 4 with Wi-Fi and 8 GB RAM, and a Raspberry Pi Compute Module 4 I/O Board. The disassembled kit is shown below.
Hardware Unassembled

The Akida PCIe board can be connected to the Raspberry Pi Compute Module 4 I/O Board through the PCIe Gen 2 x1 socket available onboard.
Hardware Closeup
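For a sense of how such a kit is typically driven in practice, here is a minimal Python sketch. It is not taken from the project; it assumes BrainChip's MetaTF "akida" package (akida.devices, Model, map, predict), and the model file name, input shape and dummy frame are placeholder assumptions. The flow is: discover the PCIe device, map a converted model onto it, and run inference on a frame.

```python
# Minimal illustrative sketch (not the project's code): driving the Akida Development
# Kit from Python with BrainChip's MetaTF "akida" package. The file name, input shape
# and dummy frame below are placeholder assumptions.
import numpy as np
import akida

# Discover the Akida PCIe device exposed through the Raspberry Pi CM4 I/O board.
devices = akida.devices()
assert devices, "No Akida device found - check the PCIe driver installation"
device = devices[0]

# Load a quantized model previously converted to Akida's .fbz format and map it
# onto the hardware mesh.
model = akida.Model("traffic_fomo.fbz")   # hypothetical file name
model.map(device)

# Run inference on one aerial frame (uint8, NHWC layout) and inspect the output.
frame = np.zeros((1, 224, 224, 3), dtype=np.uint8)  # stand-in for a real camera frame
outputs = model.predict(frame)
print("output shape:", outputs.shape)
```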

Setting up the Development Environment​


(…)

Conclusion​

This project highlights the impressive abilities of the Akida PCIe board. Boasting low power consumption, it could be used as a highly effective device for real-time object detection in various industries for numerous use cases.


This article is based on: Traffic Monitoring using the Brainchip Akida Neuromorphic Processor - Expert Projects, a blog by Edge Impulse. It has been edited by the Wevolver team and Electrical Engineer Ravi Y Rao. It's the fifth article in the Spotlight on Innovations in Edge Computing and Machine Learning series.
The first article introduced the series and explored the implementation of a predictive maintenance system using a Nordic Thingy:91.
The second article described the implementation of voice control for making appliances smarter using a Nordic Thingy:53.
The third article dives deep into the application of EdgeAI for surface crack detection, showcasing its transformative role in modern industrial predictive maintenance systems.
The fourth article explains the integration of neuromorphic computing for real-time traffic monitoring, offering a technical blueprint for revolutionizing urban management.

About the sponsor: Edge Impulse

Edge Impulse is the leading development platform for embedded machine learning, used by over 1,000 enterprises across 200,000 ML projects worldwide. We are on a mission to enable the ultimate development experience for machine learning on embedded devices for sensors, audio, and computer vision, at scale.
From getting started in under five minutes to MLOps in production, we enable highly optimized ML deployable to a wide range of hardware from MCUs to CPUs, to custom AI accelerators. With Edge Impulse, developers, engineers, and domain experts solve real problems using machine learning in embedded solutions, speeding up development time from years to weeks. We specialize in industrial and professional applications including predictive maintenance, anomaly detection, human health, wearables, and more.

More by David Tischler

David is a Senior Developer Program Manager helping to take care of the Edge Impulse community of developers, and is a fan of computing on small, low power devices. He's also an extreme recycler, so use caution if trying to throw away recyclable objects if he's around.
 
  • Like
  • Fire
  • Love
Reactions: 82 users

Esq.111

Fascinatingly Intuitive.
  • Like
  • Fire
  • Love
Reactions: 50 users

Foxdog

Regular
.
 
  • Haha
Reactions: 2 users

Labsy

Regular
iPhone 15 USES akida and so does the new Apple Watch, and AirPods Pro.

Apple revenue is what Sean has been referring to cryptically.

Mark my words.

Will go out on a limb to say iPhone 16 will have akida 2.0 P in it.

Monday 18/09/23 is the date we’ve been waiting for!!! I feel it.

Time to shift gears and SEND ITTTTTT !!!
Go Brainchip!!!!


Not financial advice.



" A new 16-core Neural Engine is capable of nearly 17 trillion operations per second, enabling even faster machine learning computations for features like Live Voicemail transcriptions in iOS 17 and third-party app experiences — all while protecting critical privacy and security features using the Secure Enclave."
Wouldn't it be nice...... 🤔🙏
 
  • Like
  • Fire
  • Love
Reactions: 25 users

Frangipani

Regular

A retrainable neuromorphic biosensor for on-chip learning and classification? Mmmmh, friend or foe? 🧐 🤔

Does anyone here have institutional access to delve a little deeper?


The paper’s authors are two researchers each from Eindhoven University of Technology (https://www.tue.nl/en/research/rese...stems/neuromorphic-edge-computing-systems-lab ) and Northwestern University (https://rivnay.northwestern.edu/ and http://nu-vlsi.eecs.northwestern.edu/). The latter research group states on its website that they have been collaborating with IBM, Texas Instruments (!) and Intel.

  • 4/2023 Three papers are accepted into the prestigious VLSI Symposium'23. Congratulations to Yuhao, Xi, Yijie and all the authors. Special thanks to our collaborators Xin Zhang from IBM, Raveesh Magod from TI, and Nachiket Desai from Intel.
F7C0D833-EB6A-499E-B47D-79F4E54807AE.jpeg



01D5D4A6-8D7A-4904-8369-D352ECDE7255.jpeg



Published: 14 September 2023

A retrainable neuromorphic biosensor for on-chip learning and classification​

Nature Electronics (2023)

Abstract​

Neuromorphic computing could be used to directly perform complex classification tasks in hardware and is of potential value in the development of wearable, implantable and point-of-care devices. Successful implementation requires low-power operation, simple sensor integration and straightforward training. Organic materials are possible building blocks for neuromorphic systems, offering low-voltage operation and excellent tunability. However, systems developed so far still rely on external training in software. Here we report a neuromorphic biosensing platform that is capable of on-chip learning and classification. The modular biosensor consists of a sensor input layer, an integrated array of organic neuromorphic devices that form the synaptic weights of a hardware neural network and an output classification layer. We use the system to classify the genetic disease cystic fibrosis from modified donor sweat using ion-selective sensors; on-chip training is done using error signal feedback to modulate the conductance of the organic neuromorphic devices. We also show that the neuromorphic biosensor can be retrained on the chip, by switching the sensor input signals and alternatively through the formation of logic gates.

This is a preview of subscription content, access via your institution



Just discovered an article on the novel biosensor on the TU Eindhoven website as well:



Breakthrough way to train neuromorphic chips​

SEPTEMBER 14, 2023
Using a biosensor to detect cystic fibrosis as the test case, TU/e researchers have devised an innovative way to train neuromorphic chips as presented in a new paper in Nature Electronics.


Photo: iStockPhoto

Neuromorphic computers – which are based on the structure of the human brain – could revolutionize our future healthcare devices. However, their widespread use is hindered by the need to train neuromorphic computers using external training software, which can be time-consuming and energy inefficient. Researchers from Eindhoven University of Technology and Northwestern University in the US have developed a new neuromorphic biosensor capable of on-chip learning that doesn’t need external training. As a proof-of-concept, the researchers used the biosensor to diagnose cystic fibrosis based on sweat samples.

“We have demonstrated that we can create a ‘smart biosensor’ that could learn to detect a disease, such as cystic fibrosis, without using a computer or software.” That’s how Eveline van Doremaele summarized their new paper with Yoeri van de Burgt from TU/e, as well as Xudong Ji and Jonathan Rivnay from Northwestern University in the US, which has just been published in Nature Electronics.

The ‘smart biosensor’ in their research is a neuromorphic biosensing computer – a device whose operation takes inspiration from the way that neurons communicate with other neurons in the human brain.

“Neuromorphic computing could have a significant impact on healthcare, for example, particularly when it comes to point-of-care devices to check for an illness or condition,” says van Doremaele. “And in our research, we have solved a major problem with regards to the use of neuromorphic computers in healthcare.”

We have demonstrated that we can create a ‘smart biosensor’ that could learn to detect a disease, such as cystic fibrosis, without using a computer or software.
csm_van%20Doremaele%20Profile%20image_ada2a47445.jpeg

Eveline van Doremaele

Goodbye to external software

So, what is the problem that van Doremaele and her collaborators solved? “For practical use in healthcare devices, neuromorphic technologies need to have low power requirements, interface with a sensor, and be easily trained for use. The first two of these can be solved with organic-based electronics. But it’s the training part that’s the central issue.”

Until now, a neuromorphic chip’s neural network would be trained using external software, which is a process that can be time-consuming and energy inefficient. “Now, our new chip can learn on-the-fly by processing patient data in real-time, which certainly speeds up the training process and helps promote the use of the chip in real interactive bioapplications,” says the researcher.


Searching for chloride anions

To test the effectiveness of their brand new chip, the researchers used it to test for the genetic disease cystic fibrosis. Cystic fibrosis is a hereditary disease that can damage organs, such as the lungs and digestive system.

One existing way to test for the disease is via a sweat test where a high level of chloride anions is an indicator of cystic fibrosis. Reliable sensors are already available to test for cystic fibrosis, so this test provided the researchers with an easy-to-check case study for their on-chip learning sensor.

“For ease of implementation, we didn’t work with real patient data. Instead, we used sweat samples from healthy donors,” says van Doremaele. “One sample was a negative sample or healthy sample of donor sweat, while a second sample was prepared to have a very high concentration of chloride anions.”

The researchers’ neuromorphic biosensor consists of three main parts – the sensor module, the hardware neural network, and the output classification part. A drop of sweat is added to the sensor module after which chloride and other ion concentrations in the sweat are detected with ion-selective electrodes. These signals are then processed by the neuromorphic chip itself. Finally, the result of the analysis is displayed as a green or red light indicating a negative or positive result, respectively.
csm_van%20Doremaele%20Chip%20layout%20image_52b00e364f.png

The neuromorphic biosensing chip. Image: Eveline van Doremaele

Training at the ‘data gym’

Before the chip was used to evaluate the main sweat samples, the neural network had to go to the ‘data gym’ and undergo some supervised training.
“We created a number of sweat samples with varying and known ion concentrations and then tested the samples on the chip. If the result from the chip for a test was wrong, we corrected the chip, which resulted in corrections to the weights between the nodes of the neural network,” says van Doremaele. “Importantly, we train the chip on the hardware itself.”

This is the major advancement in this research – the ability to train the neural network on the chip and all without the need for any external software. “When the chip is trained to the problem of interest (here detection of cystic fibrosis from sweat samples), there is no further external control or intervention needed,” adds van Doremaele.
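As a purely conceptual illustration of the error-feedback training described above, the update rule can be sketched as a simple perceptron-style loop. This is a software analogy with invented sample values, not the authors' method; on the actual chip the weights are conductances of organic neuromorphic devices modulated electrically by the error signal.

```python
# Conceptual analogy only - not the authors' implementation. On the real chip the
# "weights" are conductances of organic neuromorphic devices nudged by an electrical
# error signal; here they are plain numbers, and the sample values are invented.
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.1, size=3)   # one weight per ion-selective sensor input

def classify(sensor_inputs):
    """Return 1 (positive, red light) if the weighted sum crosses the threshold, else 0."""
    return 1 if np.dot(weights, sensor_inputs) > 0.0 else 0

# The 'data gym': samples with known ion concentrations and known labels.
training_samples = [
    (np.array([0.2, 0.5, 0.3]), 0),   # healthy-like donor sweat
    (np.array([0.9, 0.4, 0.6]), 1),   # sample with a high chloride concentration
]

learning_rate = 0.1
for _ in range(20):                            # a few supervised passes
    for x, label in training_samples:
        error = label - classify(x)            # error signal: -1, 0 or +1
        weights += learning_rate * error * x   # adjust weights only when the chip is wrong
```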

The ease of retraining

The real novelty is that the chips can learn and adapt to their application and environment.

In addition, even when trained, the chip can be used for another problem. “Say you want to use the same neural network hardware in a smart prosthetic hand or arm. All you have to do is retrain the neural network at the ‘data gym’ with information on hand or arm movements in this case,” says van Doremaele.

This new on-chip learning approach opens up the possibility of personalized implantable neural networks that are trained by the end user through the use of data directly from the user. “Such an approach to training neural networks for healthcare could have significant implications for people, and may someday provide a way to train chips in real-time to control prosthetics or other similar devices. The real novelty is that the chips can learn and adapt to their application and environment. They do not have to be programmed beforehand, as is the case today.”

Further information

A retrainable neuromorphic biosensor for on-chip learning and classification, van Doremaele et al., Nature Electronics, (2023).

Media contact​

Barry Fitzgerald

(Science Information Officer)
+31 40 247 8067 B.Fitzgerald@tue.nl
 
  • Like
  • Fire
  • Sad
Reactions: 35 users

Boab

I wish I could paint like Vincent
" A new 16-core Neural Engine is capable of nearly 17 trillion operations per second, enabling even faster machine learning computations for features like Live Voicemail transcriptions in iOS 17 and third-party app experiences — all while protecting critical privacy and security features using the Secure Enclave."
Wouldn't it be nice...... 🤔🙏
17 trillion operations per second.....Pfffft
We do 131💪💪💪
131 TOPs.jpg
 
  • Like
  • Love
  • Fire
Reactions: 42 users

Rach2512

Regular
(Quoting Frangipani's post above: "Real-Time Traffic Monitoring with Neuromorphic Computing", Wevolver, 14 Sep 2023.)




Article #2
Sorry, not sure if this has already been posted before. Could this be a use case for Akida, as it uses an Arm Cortex-M33?



The Nordic Thingy:53™ is an IoT prototyping platform that enables users to create prototypes and proofs of concept without the need for custom hardware. The Thingy:53 is built around the nRF5340 SoC, Nordic Semiconductor’s flagship dual-core wireless SoC. Its dual Arm Cortex-M33 processors provide ample processing power and memory size to run embedded machine learning (ML) models directly on the device with no constraints.
 
  • Like
  • Thinking
  • Love
Reactions: 9 users

skutza

Regular
Let's hope that Patterns knows what he's talking about.....

1694732910965.png
 
  • Like
  • Fire
  • Thinking
Reactions: 12 users

Xray1

Regular
Last trading day today before BRN officially leaves the ASX200 index.

IMO..... it will be most interesting to see how the s/price performs today and how the shorters and institutions handle the situation, especially if they haven't already departed / cleared their holding books.

I personally would like to hope for, and to see, a 30 cent close today.
 
  • Like
  • Fire
Reactions: 15 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
" A new 16-core Neural Engine is capable of nearly 17 trillion operations per second, enabling even faster machine learning computations for features like Live Voicemail transcriptions in iOS 17 and third-party app experiences — all while protecting critical privacy and security features using the Secure Enclave."
Wouldn't it be nice...... 🤔🙏

Sorry to be a party-pooper but pretty sure Apple uses Qualcomm's Snapdragon.🥺

Screen Shot 2023-09-15 at 9.22.03 am.png
 
  • Like
  • Sad
Reactions: 11 users

jtardif999

Regular
ARM up 25% 😎

Arm climbs 25% in Nasdaq debut after pricing IPO at $51 a share

Arm Holdings has started trading on the Nasdaq under the ticker "ARM."

The chip design company is valued at a steep premium relative to the rest of the semiconductor market.

SoftBank still holds about 90% of Arm's stock.

Arm Holdings, the chip design company controlled by SoftBank, jumped nearly 25% during its first day of trading Thursday after selling shares at $51 apiece in its initial public offering.

At the open, Arm was valued at almost $60 billion. The company, trading under ticker symbol "ARM," sold about 95.5 million shares. SoftBank, which took the company private in 2016, controls about 90% of shares outstanding.

On Wednesday, Arm priced shares at the upper end of its expected range. On Thursday, the stock first traded at $56.10 and ended the day at $63.59.

It's a hefty premium for the British chip company. At a $60 billion valuation, Arm's price-to-earnings multiple would be over 110 based on the most recent fiscal year profit. That's comparable to Nvidia's valuation, which trades at 108 times earnings, but without Nvidia's 170% growth forecast for the current quarter.

Arm Chief Financial Officer Jason Child told CNBC in an interview that the company is focusing on royalty growth and providing products to its customers that cost and do more.

Many of Arm's royalties come from products released decades ago. About half the company's royalty revenue, which totaled $1.68 billion in 2022, comes from products released between 1990 and 2012.

"As a CFO, it's one of the better business models I've seen. I joke sometimes that those older products are like the Beatles catalog, they just keep delivering royalties. Some of those products are three decades old," Child said.

In a presentation to investors, Arm said it expects the total market for its chip designs to be worth about $250 billion by 2025, including growth in chip designs for data centers and cars. Arm's revenue in its fiscal year that ended in March slipped less than 1% from the prior year to $2.68 billion.

Arm's architecture is used in nearly every smartphone chip and outlines how a central processor works at its most basic level, such as doing arithmetic or accessing computer memory.

Child said the company sold $735 million in shares to a group of strategic investors comprising Apple, Google, Nvidia, Samsung, AMD, Intel, Cadence, Synopsys and Taiwan Semiconductor Manufacturing Company. It's a testament to Arm's influence among chip companies, which rely on Arm's technology to design and build their own chips.

"There was interest to buy more than what was indicated, but we wanted to make sure we had a diverse set of shareholders," Child said.

In an interview with CNBC on Thursday, SoftBank CEO Masayoshi Son emphasized how Arm's technology is used in artificial intelligence chips, as he seeks to tie the firm to the recent boom in AI and machine learning. He also said he wanted to keep the company's remaining Arm stake as long as possible.

The debut could kick open the market for technology IPOs, which have been paused for nearly two years. It's the biggest technology offering of 2023.
 
  • Like
  • Wow
  • Fire
Reactions: 28 users

HopalongPetrovski

I'm Spartacus!
Did anyone manage to get any shares in the ARM IPO?
Congrats if you did.
Might possibly be good timing from anywhere around about nowish for a follow up from one of the great enablers.....ahem BRAINCHIP! 🤣
Bring It, Buddy. 🤣
 
  • Like
  • Fire
  • Love
Reactions: 19 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Love
  • Fire
Reactions: 38 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Fire
  • Love
Reactions: 12 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
The future is looking very bright, fellow Brainers, IMO!

will-ferrell-stressed-out.gif

CNBC Exclusive: CNBC Transcript: SoftBank Founder & CEO Masayoshi Son and Arm CEO Rene Haas Speak with CNBC’s David Faber on “Squawk on the Street” Today, Thursday, September 14​

PUBLISHED THU, SEP 14 2023, 11:20 AM EDT

WHEN: Today, Thursday, September 14, 2023
WHERE: CNBC’s “Squawk on the Street”

Following is the unofficial transcript of a CNBC exclusive interview with SoftBank Founder & CEO Masayoshi Son and Arm CEO Rene Haas on “Squawk on the Street” today, Thursday, September 14 to discuss expectations for Arm’s IPO – the biggest IPO of 2023 & more.

PART I
RENE HAAS: Yeah so, Arm of 2023 returning to the public markets, we’re a very different company than we were in 2016 when we were largely associated, as you said, with mobile. And the important thing to think about Arm or remember is that we were born from building a device that was going to run off of battery. So that sensibility about power efficiency is in the DNA of our engineers. So fast forward to 2023, when you look at the diversification of our markets, cloud data center, automotive, everything with EV and automotive. These require extremely power-efficient processors or CPUs, which is what we do. So whether it’s data center, obviously around sustainability, you want to have as much low power as possible. Again, back on these cars running off batteries, it’s a great place for Arm to really grow our business. And that’s what we did in the years being private between 2016 and now. We diversified our business. We’ve got significant growth in the cloud data center and in automotive. And then with AI, AI runs on Arm. It’s hard to find an AI device today that isn’t Arm-based. Google Alexa – excuse me, Amazon Alexa for example, that device which does voice recognition, etc., that’s AI. And what we see happening going forward is products that didn’t have a CPU to run AI, they’ll need AI. You might need more CPUs to run more complex AI, so we see just huge growth opportunities there.
DAVID FABER: What is running generative AI though, is more GPU and is obviously all produced by Nvidia. There is a relationship there obviously, you once worked for the company as well. But investors I’ve spoken to would like to see a lot deeper relationship. Is it your expectation that you’re going to sell more to Nvidia? Because they’re not that large a customer.
HAAS: So Nvidia today is doing obviously a lot around AI – generative AI and training. Their latest product that they announced the Grace Hopper Superchip, which is their accelerated GPU for AI training, now uses 72 Arm CPUs as the core CPU on that. Remember, no type of accelerated training for AI can run without a CPU. So now the combination of the Arm CPUs with Nvidia GPUs we think will be even more growth opportunity for us going forward.

FABER: But you know, when investors looked at your S-1 for example and or your recent numbers, they didn’t see a great deal of growth year-over-year. And yet you’re pointing to a much more significant growth next year over this year and even after that. Why?
HAAS: The year-on-year growth was not that great, as you said. If we look backwards for the last three years as we’ve pivoted our strategy, we’re about 15% year-on-year growth going forward – excuse me, for the last three years. But going forward again, when you look at these mega trends of efficiency, software, complex AI all having to run off batteries or low power devices, this is huge growth for us. And again, David, Grace Hopper is a great example of the kinds of devices that can only be built on Arm.
FABER: Masa, it wasn’t that many years ago that you were selling this company to Nvidia or at least had plans to sell this company to Nvidia. That obviously did not happen. The price tag then was 40 billion. A lot of it was made up of Nvidia stock. What has changed between then and now?
MASAYOSHI SON: We did not want to really sell. It was the Covid that made me really, you know, go into the protective mode. So I had to go protective mode and I had to choose the more conservative, careful operation of SoftBank. So that we were selling to Nvidia, but the deal was actually one-third cash, two-thirds was exchange of Nvidia’s share as a combined company of Nvidia and Arm. I believed in the future of Nvidia back then, and it was right. And I believed in the combinations of the power of the two companies would be enormous. I believed in the future of AI. And it’s really now getting proved. And this is the beginning of big AI time and Arm is going to have a big role in that.
PART II
HAAS: Our China business reflects the growth we see in the rest of the world. We’re seeing huge growth in the data center around cloud computing, also with AI, and then EVs. Huge growth in China in terms of EVs. And China wants a lot of what the rest of the world needs: power efficiency, software ecosystem. A lot of the same software that’s used across the world is used in China. So what we’re seeing David, in terms of our China market growing, is largely around those two areas: data center and automotive. Now in terms of the broader issues, I think I share the same headaches that just about every tech CEO does these days. We comply, of course, with all the regulations that come down relative to export control, if there’s something that we need to adhere to, of course. But it’s really it’s a tricky market to figure out just in general because of all the things that are going on geopolitically. But broadly speaking, our China business has been doing very well.
FABER: Yeah, I want to get back to the business itself. But Masa, you know, your – Softbank owns a significant stake in the joint venture that I’m describing in China. You’ve obviously done business there. You were a very large holder of Alibaba for many years. What is your sense in terms of the risk that China poses given the percentage of revenues that it comprises for your company for Arm is quite high?
SON: U.S., China is having a very complicated situation now. The, China has significant impact to the economy of the rest of the world. So I think – I hope the situation get better, but who knows. I just am one of citizen who is wondering and, you know, concerned about the future of China, U.S. and the rest of the world.
FABER: Yeah. What do you hear? I mean, you have had relationships, deep relationships with many people in the business community there. How deep are your concerns? Not just in terms of the back and forth between the U.S. and China, but also in terms of the regime itself and its crackdown, so to speak, on entrepreneurship, if I could call it that.
SON: Well, it’s difficult to comment. Whatever I comment, it goes into all kinds of headlines here and there, so I have to be careful what I say. But our exposure in China, Softbank as a group has reduced significantly. Because now, you know, most of the shares in Alibaba from Softbank is already sold.
PART III
SON: Yeah. We – SoftBank owned 75% of Arm until just a few weeks ago. And just a few weeks ago, we bought back from 75% to 100% from Vision Fund. We pay even higher price. So I’m more confident about the future. I think the value is gonna have a good upside, really long term. And that’s why I bought back with a higher price than $51. And I just want to have all the investors have a good time going forward.
PART IV
SON: Well, Rene has, you know, a great chemistry with me. We share the vision completely together. We’re talking almost every day, we’re chatting with WhatsApp and, you know, calling all the time almost really every day multiple times, like three times, five times, sometimes 10 times a day.
PART V
SON: Yeah, well, I’m a big believer of AI. Since I started SoftBank, I was a big believer of, you know, the microprocessors enabling all kinds of technology, evolution starting with PC, and then mobile, internet, and then now going into AI. It is a front end of all the innovation of our industry. It’s the core is the microprocessor. So Arm is now going to become the core of the AI revolution. My belief of the front end of the IT industry’s evolution is really shifting to AI so my my focus, my belief is all centered in AI and Arm is going to become the core of that. Many of the Vision Fund portfolio companies, we have about 500 companies, they’re going to be, you know, having a lot of great applications of AI and they’re going to be beneficiary of this Arm and many other AI technology players.
FABER: Yeah, well, I mean you recently said or not that long ago, Arm is positioned to benefit from AI becoming the, quote, “The basis for a new society.” What does that mean?
SON: Yes. Well, I think this is the first time that mankind experience something smarter than mankind itself. Right? Mankind was the smartest animal on the earth. The AI is going to surpass and surpass big time. I think the AGI stage is coming very soon. And once it comes, it goes so far away. And it’s a new stage of society, that we all have to ask ourselves, what is mankind? What is job? What is life? What is the intelligence? It’s a new experience completely new society that we are going to face.
FABER: Masa, if you’d been in that meeting with a lot of the leaders in AI that just took place with Senator Schumer and other members of Congress, what would you have shared? I mean, you just said it’s going to exceed our own intelligence. There are those who are quite frightened by that prospect and worried about the dangers of AI. Do you share that?
SON: Well, if we mishandle it, it has a danger like automotive society. It has a danger of the car accident if you, if you don’t regulate. The automotive society is regulated with a traffic light and speed limit and don’t drive with alcohol drinking. So AI society should regulate to protect humankind. However, it has more merit than the demerits. So I think I’m a believer. I’m optimistic that AI is going to solve the issues that mankind couldn’t solve in the past, like difficult disease, the natural disasters, the car accidents, all kinds of other issues that humankind had in the past will be helped by the advancement of AI and technology.
FABER: Rene, you’re going to be in the middle of this. I mean, you know, you’re going to be providing designs that are going to help Nvidia use, create chips that are going to be powering generative AI. Do you feel similarly or are you perhaps more concerned?
HAAS: I agree with Masa from the standpoint that we are going to see some very, very profound advancements around AI going forward. The ChatGPT moment, if you will, I think was a tipping point relative to what the capability of these large language models could do. So on one level as a CEO of a company that builds a lot of devices based on AI and an engineer at heart, phenomenal opportunity. And I think for us going forward back to the CPU being the center of everything and you can’t really run AI without a CPU, it’s going to be a huge growth opportunity for Arm. That said, there are a lot of social and ethical type of things to consider, which I think as a society, we have to figure out and we’re in early days on that.
PART VI
HAAS: I think what’s important to remember again, our core product is the CPU and what is the superpower of the CPU? It’s the broad software ecosystem around it. We have a software ecosystem like no architecture probably ever invented. Every major operating system whether it’s iOS, Android, Windows, Android Linux, Automotive Linux, all the Linux distributions for the cloud, not only do they run on Arm, but they’ve been optimized for years, in some cases decades. RISC-V, as you mentioned, is a new emerging entity. Right now, what we see is that they just don’t have much traction in the area around software ecosystems. It’s not to say that we don’t watch our competition very seriously. We obviously do and there’s a lot of war room discussions about how to address that. But broadly speaking, I think the the hurdle they have to cross is around the software ecosystem. And that is not an easy moat to cross.
FABER: Why not?
HAAS: Just the sheer amount of development time that goes into it. I’ll give you an example. Windows running on Arm PCs. I worked on them personally in my previous life back 2009. Here it is 2023, 14 years later, and we’re just now starting to see growth of Arm on Windows PCs. It just takes time. There’s a lot of software work that needs to be done, development time, and optimizations. And, and that’s just for PCs, when you think about phones, when you think about the cloud and you think about automotive, the software gets more and more complex. So, I think over time, that is something that we’ll continue to invest in and I think it’s a it’s a big path RISC-V to cross.
FABER: Speaking of over time, Masa, you know again, you at SoftBank own 90% of Arm, the now public company. Are you going to sell any of that? It is your expectation that you will be a holder from here on of that 90% stake or is it something that you do see as potentially monetizable over time?
SON: Well, I want to keep it as much as possible as long as possible. I’m a long-term believer and and so on. Of course, you know, if the next Covid comes, who knows? But our intent is to hold as much as possible as long as possible. I wanted to keep 100% of Arm on the reason we are having IPO and selling is because Arm is such an important company for the industry, I wanted to have investors opportunity to participate on the upside opportunity of arm and wanted it to become a public, used to be a public company. So back into public company’s position so that you know people will have more transparency of the operation of the companies and so on and also give the engineers and employees the stock options to realize.
FABER: Right. Rene, you’re now a public company CEO. I mean, you do have one very large shareholder, but does it change how you do your job?
HAAS: Well, you know, as Masa said, he’s a long-term believer in Arm. So making sure I keep my number one shareholder happy is very important. But at the same time, he and I are very aligned on the on the long-term vision and view for the company. I believe we are one of the most foundational companies inside our industry. The world runs on Arm, it’s hard to find an end device that doesn’t run on Arm. And at the same time, I think when we looked at the percentage of the population that uses Arm, it’s probably north of 70%. So while we’re thrilled to be public today and I’m thrilled for all the employees as I said and our partners and developers, I really think that the best is ahead and I’m super excited about the next five to 10 years for the company.
FABER: Is there any scenario in which you would actually start to not just design chips but design your own chips?
HAAS: I don’t want to foreshadow what our product plans might be. But the company is foundational to the electronics industry and we have a lot of opportunity for growth.
FABER: That’s where you’re going to leave it, just that? A lot of opportunity for growth? We’re going to be able to do interviews in the future together. So I guess I’ll be—

HAAS: You can ask me next time.
FABER: I’ll be able to ask you that again. Let me come back to something as well Rene that a number of investors did at least mention to me as a potential concern, which is this lawsuit against Qualcomm. You know, never great to be suing somebody that you do business with. Can you give us an update in terms of your expectations and, and, and why perhaps people should not be concerned?
HAAS: Qualcomm is a great partner. You know, back to the foundational aspect of what we do, just about every product that Qualcomm puts into the market runs on Arm, their Snapdragon that goes into phones is based on Arm, their forays into PC will be based on Arm, their automotive platforms are all based on Arm so they’re a huge partner for us. We hope that dispute can be settled. I can’t say much about the litigation, it’s headed for courts next year. But I’ll just kind of leave it at that.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 38 users