BRN Discussion Ongoing

Tothemoon24

Top 20

News Release​


onsemi and DENSO Collaborate for a Strengthened Relationship​

December 16, 2024 at 9:00 PM EST
SCOTTSDALE, Ariz. & KARIYA, Japan--(BUSINESS WIRE)--Dec. 16, 2024-- onsemi (Nasdaq: ON) and tier-one automotive supplier DENSO CORPORATION announced today that they are strengthening their long-term relationship to support the procurement of autonomous driving (AD) and advanced driver assistance systems (ADAS) technologies.
For over 10 years, onsemi has been supplying DENSO with the latest intelligent automotive sensors to enhance ADAS and AD performance. These semiconductors have become increasingly vital in improving vehicle intelligence, including connectivity, to help reduce traffic accident fatalities.
“DENSO’s desire to work more closely with us demonstrates its confidence in our innovative capabilities, decades of expertise and supply resilience in automotive technologies,” said Hassane El-Khoury, president and CEO of onsemi.
“As the second largest global supplier of automotive systems and parts, DENSO depends on a robust supply chain for critical materials like semiconductors to continue to reliably serve our customers with cutting-edge products,” said Shinnosuke Hayashi, president of DENSO CORPORATION. “Therefore, it is essential to collaborate closely with industry leaders like onsemi, which has been improving the safety and autonomy of vehicles with its intelligent sensing technologies for years and provides the supply assurance we expect.”
As a mark of their collaboration, DENSO intends to acquire onsemi shares on the open market, aiming to further enhance their long-term relationship.
About onsemi
onsemi
(Nasdaq: ON) is driving disruptive innovations to help build a better future. With a focus on automotive and industrial end-markets, the company is accelerating change in megatrends such as vehicle electrification and safety, sustainable energy grids, industrial automation, and 5G and cloud infrastructure. onsemi offers a highly differentiated and innovative product portfolio, delivering intelligent power and sensing technologies that solve the world’s most complex challenges and leads the way to creating a safer, cleaner and smarter world. onsemi is recognized as a Fortune 500® company and included in the Nasdaq-100 Index® and S&P 500® index. Learn more about onsemi at www.onsemi.com.
 
  • Like
  • Love
  • Wow
Reactions: 31 users

ChrisBRN

Emerged
Snap card purpose
IMG_0124.jpeg
 

Attachments

  • IMG_0125.jpeg
Last edited:

Bravo

If ARM was an arm, BRN would be its biceps💪!

onsemi and DENSO Collaborate for a Strengthened Relationship (quoted from the post above)
Nice one @Tothemoon24!

We also know Onsemi and Emotion3D were working on an occupant age/weight classification system for airbag deployment.

And we know from an earlier podcast that Onsemi and BrainChip have been working on the same, or was I just imagining that? At the very least, we know we are partners with Onsemi. I think it might have been Todd Vierra who confirmed this.

Incidentally, here’s an extract from a Vehicle Safety Airbag report dated August 2023 which describes Denso as a main player in the industry. The report states the following under the header ‘Industry Developments.’

AI-powered Airbag Systems
Artificial Intelligence (AI) has infiltrated every facet of modern life, and the automotive sector is no exception. AI-powered airbag systems leverage machine learning algorithms to process real-time data and make split-second decisions about airbag deployment. This adaptive approach ensures that airbags are deployed optimally, enhancing passenger protection while minimizing the risk of unnecessary deployment.
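To make that idea concrete, here is a minimal, purely illustrative sketch of how a machine-learning classifier could map occupant measurements to a deployment decision. The features, thresholds and training data below are invented for illustration and are not taken from the report or any vendor's actual system.

```python
# Hypothetical sketch only: a tiny classifier that maps occupant features
# (estimated weight and seated height from cabin sensors) to a deployment
# decision. All numbers are invented for illustration.
from sklearn.tree import DecisionTreeClassifier

# Toy training data: [estimated_weight_kg, seated_height_cm] -> deploy flag
X = [[8, 60], [15, 75], [30, 100], [55, 130], [70, 150], [85, 160]]
y = [0, 0, 0, 1, 1, 1]  # 0 = suppress deployment (small child), 1 = deploy

clf = DecisionTreeClassifier(max_depth=2).fit(X, y)

occupant = [[12, 70]]         # new reading from seat and vision sensors
print(clf.predict(occupant))  # [0] -> suppress deployment for a small occupant
```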

 
Last edited:
  • Like
  • Love
  • Wow
Reactions: 29 users

cosors

👀
Ooooh, I like this bit: "Neuromorphic computing will become increasingly important in 2025 and beyond, particularly as AI demands more from computing than ever before".

Bring on 2025 I'd say 👍 (Still enjoying 2024 though, haha)

"Top 3 Cutting-Edge Micro-Caps Set for 2025 Growth​

Story by Leo Miller, MarketBeat

As 2025 approaches, several promising micro-cap stocks are gaining attention for their potential to disrupt key industries. These companies, though small in market size, are making significant strides with innovative technologies and strategic advancements. From revolutionary AI-powered chips to life-changing healthcare devices and cutting-edge 3D metrology solutions, these firms are well-positioned for future growth. Here's a closer look at three micro-cap stocks that could be on the verge of major breakthroughs in the coming year.

BrainChip: Aiming to Power Edge AI With Brain-Like Efficiency

BrainChip (OTCMKTS: BRCHF) is a micro-cap software and semiconductor company focused on neuromorphic computing. The goal of neuromorphic computing is to mimic the way the human brain processes information. This technology's value is in running AI and machine learning tasks with much lower power consumption than other AI accelerator chips. At this point, BrainChip is largely considered pre-revenue, although it has had quarters where it generated a couple of million in revenue. The company has mostly been obtaining patents for its technology and improving its designs.

The lower power consumption aspect of the firm's chips is vital to its value proposition. It makes these chips ideal for AI-driven edge computing. This is where chips process information close to the source where they receive it, rather than sending it back to a data center for processing. It is especially beneficial to reduce power use in devices that may go a long time without charging, like electric vehicles.

Recently, optimism around this company has emerged after the U.S. Air Force Research Laboratory awarded it a $1.8 million contract. The revenue aspect of it is somewhat insignificant, but it represents an opportunity for the company to expand. The military wants to use BrainChip's tech in mobile platforms, like drones, robots, and aircraft. The press release indicates that the company's technology has now entered Phase II of the Small Business Innovation Research program. If the company can progress into Phase III of this program, it could transition into operational use by the military. This shows progress toward much more significant revenue generation, making this a stock to watch in 2025.
..."

_________________
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 54 users

Guzzi62

Regular

"Top 3 Cutting-Edge Micro-Caps Set for 2025 Growth​

Story by Leo Miller, MarketBeat
• 37m

As 2025 approaches, several promising micro-cap stocks are gaining attention for their potential to disrupt key industries. These companies, though small in market size, are making significant strides with innovative technologies and strategic advancements. From revolutionary AI-powered chips to life-changing healthcare devices and cutting-edge 3D metrology solutions, these firms are well-positioned for future growth. Here's a closer look at three micro-cap stocks that could be on the verge of major breakthroughs in the coming year.

BrainChip: Aiming to Power Edge AI With Brain-Like Efficiency

BrainChip (OTCMKTS: BRCHF) is a micro-cap software and semiconductor company focused on neuromorphic computing. The goal of neuromorphic computing is to mimic the way the human brain processes information. This technology's value is in running AI and machine learning tasks with much lower power consumption than other AI accelerator chips. At this point, BrainChip is largely considered pre-revenue. Although, it has had quarters where it generated a couple of million in revenue. The company has mostly been obtaining patents for its technology and improving its designs.

The lower power consumption aspect of the firm's chips is vital to its value proposition. It makes these chips ideal for AI-driven edge computing. This is where chips process information close to the source where they receive it, rather than sending it back to a data center for processing. It is especially beneficial to reduce power use in devices that may go a long time without charging, like electric vehicles.

Recently, optimism around this company has emerged after the U.S. Air Force Research Laboratory awarded it a $1.8 million contract. The revenue aspect of it is somewhat insignificant, but it represents an opportunity for the company to expand. The military wants to use BrainChip's tech in mobile platforms, like drones, robots, and aircraft. The press release indicates that the company's technology has now entered Phase II of the Small Business Innovation Research program. If the company can progress into Phase III of this program, it could transition into operational use by the military. This shows progress toward much more significant revenue generation, making this a stock to watch in 2025.
..."
Here is the link:

 
  • Like
  • Haha
  • Fire
Reactions: 13 users

cosors

👀
Maybe of general interest:


"KBA approval received​

Mercedes-Benz is now allowed to drive autonomously at higher speeds


In the future, the Drive Pilot will be allowed to drive independently on the motorway at 95 km/h. But one important restriction remains.

December 17, 2024


The car manufacturer Mercedes-Benz is allowed to significantly increase the speed limit of its highly automated driving assistant. Instead of the previous 60 km/h, the so-called Drive Pilot will be able to travel at 95 km/h on German motorways in the future. The Federal Motor Transport Authority has issued approval for the higher speed, the company announced on December 17, 2024.

The new version will be available in Germany in spring 2025 in the S-Class and the all-electric EQS. Vehicles already using Drive Pilot will receive a free software update, either over the air or in the workshop. The hardware does not have to be changed.

Can only be used in the right lane​


The Drive Pilot is a highly automated Level 3 assistance system that allows drivers to perform a secondary activity when activated. Until now, the system could practically only be used in traffic jams or in slow-moving traffic. In the future, it will also be usable outside traffic jams, for example on routes such as the Berlin city motorway, where a speed limit of 80 km/h applies.

However, the system only works if the Drive Pilot detects a vehicle ahead. If the road in front is clear, drivers have to take over the steering again. According to Mercedes, the vehicle must also be in the right-hand lane; this is not required at lower speeds of up to 60 km/h.
The price for the optional equipment remains at least 5,950 euros. Depending on the basic equipment, additional packages may be required for the EQS electric sedan, bringing the system to almost 14,000 euros, according to the configurator.


130 km/h by the end of the decade​


Highly automated systems with speeds of up to 130 km/h are already legally permitted in Germany. Mercedes-Benz aims to reach this mark by the end of the decade. In this context, the manufacturer is committed to ensuring that other road users can tell from the outside whether a Mercedes is driving in self-driving mode. In some US states, turquoise marker lights may already be used for this purpose.

The planned increase in the maximum speed to 95 km/h does not remove the Drive Pilot's numerous other restrictions. The list of system limits is long and includes 20 points in the "Additional Drive Pilot Instructions". For example, the system does not work with defective or dirty sensors, a malfunctioning navigation system, a detected hazard, darkness, outside temperatures below 3 degrees Celsius, or precipitation.

No hands-free system yet​


Other exclusion criteria are a roof rack, switched-on high beams, or a trailer or bicycle rack. In Golem.de's practical test, the Mercedes Drive Pilot was not convincing because of the numerous restrictions and unexpected deactivations.
Mercedes-Benz is also refraining, for now, from offering a hands-free system at automation Level 2, in which the hands no longer have to regularly touch the steering wheel. The manufacturers BMW and Ford have already received KBA approval for such systems."
 
  • Like
  • Fire
Reactions: 14 users

yogi

Regular
  • Like
Reactions: 5 users

IloveLamp

Top 20
1000020663.jpg
 
  • Like
  • Love
  • Fire
Reactions: 15 users

IloveLamp

Top 20
1000020666.jpg
 
  • Like
  • Fire
  • Wow
Reactions: 22 users

BrainShit

Regular
1000060920.jpg
 
  • Like
  • Fire
  • Love
Reactions: 33 users

Dallas

Regular
 
  • Like
  • Fire
Reactions: 11 users

Dallas

Regular
 
  • Like
  • Love
Reactions: 15 users

IloveLamp

Top 20
Pedro has posted about Brainchip in the past, dyor.


1000020670.jpg
 
  • Like
  • Love
Reactions: 17 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Fire
  • Like
  • Wow
Reactions: 12 users

Esq.111

Fascinatingly Intuitive.
feeling something.......

 
  • Like
  • Fire
Reactions: 9 users
I imagine this to be the last week for BRN staff for 2024. Looking forward to the upcoming video from Sean to close out the year.

Merry Christmas everyone and wishing a fantastic 2025 for Akida.

We are going to win 🥇
 
  • Like
  • Love
  • Fire
Reactions: 32 users
Nothing majorly new, just Tata Elxsi's Chief Data Scientist talking about Edge AI, federated learning, model challenges and, oh, the emerging neuromorphic tech.


Publication Name: Hindustantimes.com
Date: December 12, 2024

Role of Edge AI in enhancing real-time data processing​

Edge Artificial Intelligence (Edge AI) has revolutionised the way data is processed in modern IoT applications, bringing real-time capabilities closer to the source of data generation—at the "edge" of networks. This shift in computing has profound implications for industries reliant on Internet of Things (IoT) devices, allowing for faster decision-making, reduced latency, and more efficient use of network resources. As we explore the role of Edge AI, it becomes clear that its ability to enhance real-time data processing is driving new possibilities for various sectors.

Traditional IoT systems have depended heavily on cloud-based models, where data is collected from edge devices, transmitted to centralised cloud servers, and then processed. While effective in some scenarios, this approach comes with inherent challenges. Latency issues arise from the distance between the edge device and the cloud server, often leading to delays that are unacceptable for time-sensitive applications. Additionally, the sheer volume of data generated by IoT devices can strain network bandwidth, causing inefficiencies and higher costs.

This is where Edge AI offers a transformative solution. By moving computation closer to where the data is generated—on the edge devices themselves—it becomes possible to perform data analysis and make decisions locally and in real-time. Instead of transmitting every piece of raw data to the cloud, edge devices can filter and process the data before sending only the most relevant information. This not only reduces network congestion but also significantly speeds up response times, which is crucial for applications like autonomous vehicles, industrial automation, and smart health care systems.
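As a rough illustration of that filtering step, the sketch below keeps processing on the device and only sends readings upstream when they look anomalous. The threshold, sample data and uplink function are assumptions for illustration, not taken from the article.

```python
# Illustrative edge-side filtering: analyse a window of sensor readings
# locally and only transmit the unusual ones, instead of streaming every
# raw sample to the cloud. Threshold and uplink are placeholder assumptions.
from statistics import mean, stdev

def detect_anomalies(window, z_threshold=2.0):
    """Return readings that deviate strongly from the local window mean."""
    mu, sigma = mean(window), stdev(window)
    if sigma == 0:
        return []
    return [x for x in window if abs(x - mu) / sigma > z_threshold]

def transmit(payload):
    # Placeholder for an uplink call (MQTT, HTTP, etc.).
    print(f"sending {len(payload)} readings upstream")

readings = [20.1, 20.3, 19.9, 20.0, 35.7, 20.2, 20.1, 20.0]
anomalies = detect_anomalies(readings)
if anomalies:
    transmit(anomalies)  # only the relevant data leaves the device
```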

One of the major breakthroughs in Edge AI involves the development of lightweight AI models. Traditional AI models, such as deep learning networks and neural networks, are often too large and computationally heavy to run on edge devices with limited resources. These models typically require powerful GPUs or cloud-based servers to operate efficiently. However, advancements in model optimisation techniques, such as quantization and hyperparameter tuning, have enabled AI models to be compressed and optimised for edge environments. As a result, these smaller models can now run on devices with minimal processing power and memory, like microcontrollers, without compromising performance.
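For a sense of what that compression can look like in practice, here is a minimal sketch using PyTorch's post-training dynamic quantization. The tiny network is a stand-in for a real trained model, not anything referenced in the article.

```python
# Minimal sketch: shrinking a model for an edge target with post-training
# dynamic quantization in PyTorch. The small network here is a stand-in
# for a real trained model.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(64, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)
model.eval()

# Convert Linear weights to int8; activations are quantized on the fly at
# inference time, cutting memory footprint and compute cost.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

sample = torch.randn(1, 64)
print(quantized(sample).shape)  # same interface, smaller model
```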

A key advantage of Edge AI lies in its ability to provide immediate insights through local processing. For example, in industrial settings, machines equipped with Edge AI can monitor production lines and detect anomalies in real-time. By processing the data directly on-site, these systems can identify potential faults or inefficiencies without waiting for cloud-based analysis. This allows for rapid intervention, reducing downtime and improving overall operational efficiency.

The health care industry is another sector benefiting from Edge AI. Medical devices, such as wearables and diagnostic tools, generate vast amounts of data. In scenarios where immediate action is critical—such as monitoring a patient’s vital signs—Edge AI enables real-time analysis, allowing for quicker responses to changes in a patient's condition. This capability is especially valuable in remote health care settings, where connectivity to cloud servers might be unreliable or slow.

Another significant development is the growing use of federated learning at the edge. In federated learning, multiple edge devices collaborate to train a shared AI model while keeping the data local. This decentralised approach enhances privacy and security by ensuring that sensitive data never leaves the device. Instead of sending raw data to the cloud for training, only the model updates are transmitted. This approach not only protects user privacy but also reduces the risks associated with data breaches and regulatory non-compliance.
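Below is a bare-bones sketch of the federated averaging idea described above, assuming simulated devices and a toy model; in a real deployment the local training would run on the edge hardware and only the weight updates would travel.

```python
# Bare-bones federated averaging (FedAvg) sketch: each simulated device
# trains a copy of the global model on its private data, and only the
# resulting weights are shared and averaged. Data and model are toy examples.
import copy
import torch
import torch.nn as nn

def local_update(global_model, data, targets, lr=0.01, epochs=1):
    """Train a copy of the global model on one device's private data."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(data), targets).backward()
        opt.step()
    return model.state_dict()

def federated_average(states):
    """Average parameters across devices; raw data never leaves a device."""
    avg = copy.deepcopy(states[0])
    for key in avg:
        avg[key] = torch.stack([s[key] for s in states]).mean(dim=0)
    return avg

global_model = nn.Linear(8, 1)
devices = [(torch.randn(32, 8), torch.randn(32, 1)) for _ in range(3)]  # simulated private data
states = [local_update(global_model, x, y) for x, y in devices]
global_model.load_state_dict(federated_average(states))
```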

As the edge computing ecosystem continues to mature, more sophisticated tasks can be handled locally. Emerging technologies, such as neuromorphic computing, offer even more potential by mimicking the brain’s neural architecture. These systems, designed for ultra-low-power environments, can process complex data streams with incredible speed and efficiency, making them ideal for applications that require real-time decision-making, such as robotics and autonomous systems.
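For readers curious what "mimicking the brain's neural architecture" means at the smallest scale, here is a toy leaky integrate-and-fire neuron in plain Python. Real neuromorphic hardware implements such dynamics in silicon with spikes as events, so this is only a conceptual sketch with made-up parameters.

```python
# Toy leaky integrate-and-fire (LIF) neuron, the basic building block of
# most spiking / neuromorphic approaches. Parameters are illustrative only.
def lif_neuron(inputs, threshold=1.0, decay=0.9):
    """Leak the membrane potential, integrate input, spike on threshold."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * decay + current  # leak, then integrate
        if potential >= threshold:
            spikes.append(1)   # fire
            potential = 0.0    # reset after a spike
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.2, 0.4, 0.6, 0.1, 0.9, 0.3]))  # -> [0, 0, 1, 0, 0, 1]
```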

However, challenges remain in fully realising the potential of Edge AI. One of the primary hurdles is the complexity of deploying AI models on resource-constrained devices. While there have been significant advancements in reducing model size and improving efficiency, many AI models still require more memory and processing power than what edge devices can provide. Additionally, maintaining model accuracy when dealing with reduced datasets at the edge can be difficult, especially in environments where the data is noisy or incomplete.

Another challenge is the variability in hardware platforms for edge computing. The diversity of edge devices—from sensors and cameras to industrial machinery—means that AI models need to be highly adaptable to different architectures. Open-source tools like Apache TVM have made strides in addressing this issue by providing frameworks that allow models to run on a wide range of hardware. This interoperability is crucial for ensuring that AI models can be deployed across various industries without extensive customisation.

Despite these challenges, the progress in Edge AI is undeniable. The ongoing evolution of processors and AI accelerators designed specifically for edge computing is making it possible to perform more complex tasks on smaller, energy-efficient devices. As Edge AI continues to develop, it is poised to unlock new applications that were previously thought to be beyond the capabilities of IoT devices.

In conclusion, the role of Edge AI in enhancing real-time data processing for modern IoT applications is pivotal. By bringing computation closer to the source of data, Edge AI reduces latency, optimises bandwidth usage, and enables faster decision-making. The convergence of AI and edge computing is driving innovation across a wide range of industries, from manufacturing and health care to agriculture and retail. While challenges remain, the advancements in lightweight AI models, federated learning, and hardware optimisation are paving the way for a future where real-time, on-device intelligence becomes the norm rather than the exception.

Author: Biswajit Biswas, Chief Data Scientist, Tata Elxsi.
 
  • Like
  • Fire
  • Love
Reactions: 33 users

7für7

Top 20
I bet on one more announcement! Call me a pusher …
 
  • Like
  • Fire
  • Haha
Reactions: 9 users
Extract/quote from an interview with Arm CEO Rene Haas on 17th December 2024.

”What we are observing — and I think this is only going to accelerate — is that whether you’re talking about an AI data center or an AirPod or a wearable in your ear, there’s an AI workload that’s now running and that’s very clear. This doesn’t necessarily need to be ChatGPT-5 running six months of training to figure out the next level of sophistication, but this could be just running a small level of inference that is helping the AI model run wherever it’s at. We are seeing AI workloads, as I said, running absolutely everywhere. So, what does that mean for Arm?
Our core business is around CPUs, but we also do GPUs, NPUs, and neural processing engines. What we are seeing is the need to add more and more compute capability to accelerate these AI workloads. We’re seeing that as table stakes. Either put a neural engine inside the GPU that can run acceleration or make the CPU more capable to run extensions that can accelerate your AI. We are seeing that everywhere. I wouldn’t even say that’s going to accelerate; that’s going to be the default.
What you’re going to have is an AI workload running on top of everything else you have to do, from the tiniest of devices at the edge to the most sophisticated data centers. So if you look at a mobile phone or a PC, it has to run graphics, a game, the operating system, and apps — by the way, it now needs to run some level of Copilot or an agent. What that means is I need more and more compute capability inside a system that’s already kind of constrained on cost, size, and area. It’s great for us because it gives us a bunch of hard problems to go off and solve, but it’s clear what we’re seeing. So, I’d say AI is everywhere.”


Did someone say Pico? Very juicy comment from Arm!
 
  • Like
  • Fire
  • Thinking
Reactions: 17 users