BRN Discussion Ongoing


[image attachment]
 
Reactions: 14 users
I remember Sean talking about C-suite executives

As C-suite executives, the time to act is now. Those who successfully navigate this technological shift will not only drive innovation within their organizations but also play a crucial role in shaping the future of computing.
 
Reactions: 14 users
I don’t like putting people on ignore but some make the list with a single post.
Actually, I think Tony Lewis liked it for the simple reason that he can use Microsoft Eureka to help evaluate the very small and tiny SOTA (state-of-the-art) language models that he and the BrainChip team have developed and are developing.

So not directly an involvement with BrainChip, as Genyl so intelligently stated, but a tool to help with our Company's progress.
 
Reactions: 15 users

The Pope

Regular
Because he thinks it is exciting? The guy simply likes other posts that interest him. It has nothing to do with brainfart. God damn, you need some sense of reality. And if you don't believe me, time will tell you that this has nothing to do with BrainChip.
Looks like a special someone got out of bed on the wrong side this morning. Chip up, buttercup. Hope you have a good day.
 
Reactions: 14 users
Thanks,

I checked a different page.
The MotorTrend article still has it as well; it's just been moved to its own section near the end of the article.
 
Reactions: 5 users

Xray1

Regular
Thanks manny,

Like most of us, I keep wondering "Are we nearly there yet?"
IMO..... I was just wondering if Akida 2 (ESP) and TENNs could be or are still in R & D mode till sometime in the 1st Quarter of 2025 :) :)
 
Reactions: 1 user
[GIF]

But

[GIF]

To hold this stock
 
Reactions: 10 users

manny100

Regular
IMO..... I was just wondering if Akida 2 (ESP) and TENNs could be or are still in R & D mode till sometime in the 1st Quarter of 2025 :) :)
Gen 2 should turn into Gen 3 sometime later next year. Looking forward to that. See the AGM replay for details: Tony Lewis speaks from around the 20-minute mark and has a few slides. He addresses Gen 2, TENNs and the future.
Tony describes TENNs as unique, the Swiss Army knife of neural networks. He says TENNs can be integrated into daily life and the uses are almost limitless.
R & D will see continuous improvements to TENNs. We have already seen the first improvement/expansion in TENNs, being the pending patent aTENNuate.
See the Future slide in Tony's presentation for a guide to the plans.
Link to AGM BELOW.
 
Reactions: 24 users
IMO..... I was just wondering if Akida 2 (ESP) and TENNs could be or are still in R & D mode till sometime in the 1st Quarter of 2025 :) :)
TENNs is a "part" of AKIDA 2.0 IP and is also available independently of it.

Both are available commercially.
And both are still being developed.

Technology doesn't stand still, unless it's left to..
 
Reactions: 7 users
Gen 2 should turn into Gen 3 sometime later next year. Looking forward to that. See the AGM replay for details: Tony Lewis speaks from around the 20-minute mark and has a few slides. He addresses Gen 2, TENNs and the future.
Tony describes TENNs as unique, the Swiss Army knife of neural networks. He says TENNs can be integrated into daily life and the uses are almost limitless.
R & D will see continuous improvements to TENNs. We have already seen the first improvement/expansion in TENNs, being the pending patent aTENNuate.
See the Future slide in Tony's presentation for a guide to the plans.
Link to AGM BELOW.

Except I think Gen 3, or AKIDA 3.0 IP, will be distinct from 2.0, not just an evolution of it, supplanting it..
 
Reactions: 2 users

Diogenese

Top 20
Except I think Gen 3, or AKIDA 3.0 IP, will be distinct from 2.0, not just an evolution of it, supplanting it..
Trouble is, we keep trumping our own aces.
 
Reactions: 16 users
Trouble is, we keep trumping our own aces.
Well, whoever stands still is going to lose at the moment..

And that's only going to get worse.
 
Reactions: 7 users

Tothemoon24

Top 20

BrainChip’s aTENNuate: A Revolution in AI Edge Processing​

By Victoria Reed / September 19, 2024


New innovations arrive daily, each promising to reshape the landscape. But some technologies truly stand out. Enter aTENNuate, the latest breakthrough introduced by BrainChip. This cutting-edge technology represents a monumental leap forward in AI efficiency, specifically designed for edge processing. What makes aTENNuate so special? It tackles one of the most pressing challenges in the AI industry: how to make AI smarter, faster, and more efficient while operating on smaller devices.

The Power of Edge Processing​

Before diving deeper into the benefits of aTENNuate, let’s break down edge processing. In simple terms, edge processing refers to the ability of devices—whether it’s a smartphone, a wearable, or even a car—to process data locally, without needing to rely on cloud computing. It means AI can make real-time decisions on the device itself. No lag, no delays—just fast, seamless AI performance. This is exactly where BrainChip’s aTENNuate comes in.

Reducing Power Consumption Without Sacrificing Performance​

One of the major pain points in AI at the edge has always been the balance between power consumption and performance. Processing large amounts of data requires energy, and that energy can drain battery life or stress small hardware. aTENNuate solves this by cleverly managing and optimizing neural network workloads. In layman’s terms, it’s like giving your AI a smart workout—more efficient, more effective, and with less strain on your device.

Smarter AI Models for Smarter Decisions​

aTENNuate doesn’t just reduce energy consumption; it makes AI models smarter. By enhancing the way neural networks learn and process information, this innovation ensures AI is more adaptive and quicker in decision-making. Imagine your smartphone knowing exactly when to reduce background processes or your wearable device anticipating your next move before you even act. This level of predictive intelligence is a hallmark of aTENNuate.

Real-Time Performance in Critical Applications​

Think of applications where milliseconds matter. Autonomous vehicles, healthcare devices, or even security systems—all rely on split-second decisions. With aTENNuate, AI at the edge is faster, leading to improved safety and efficiency. The ability to make decisions instantly, without a cloud connection, is a game-changer for industries where time is literally of the essence.

A Boost to Sustainability​

Beyond performance, there’s another vital aspect that aTENNuate addresses: sustainability. As the world grows more conscious of energy consumption, technologies that can reduce carbon footprints are essential. BrainChip’s aTENNuate does just that by allowing devices to do more with less power. This opens the door to greener technologies in everything from consumer electronics to industrial applications.

Expanding the AI Horizon with aTENNuate​


While AI has made remarkable strides over the past decade, it’s often limited by the very devices we use daily. Edge devices, such as smartphones, drones, and IoT gadgets, traditionally lack the raw computing power of a full-scale cloud server. But with aTENNuate, BrainChip is extending the capabilities of these devices, allowing them to run complex AI models that were once thought impossible outside of a data center. This expansion means more intelligent interactions across more devices in more places.

AI That Learns On the Fly​

Another standout feature of aTENNuate is its ability to support on-device learning. While most AI systems need to send data back to a cloud server for analysis and updates, aTENNuate enables devices to learn and evolve in real-time. For instance, your smart home devices can adapt to your routines without needing constant updates from the cloud. This not only improves performance but also enhances privacy, as sensitive data doesn’t need to be transmitted.
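The article doesn't describe how aTENNuate's on-device learning actually works. As a purely generic illustration of the online-learning concept it points at (the model updates from each local example instead of shipping data to a server), here is a toy perceptron in Python. Everything in it, including the AND-style rule and the learning rate, is an illustrative assumption, not BrainChip's algorithm:

```python
# Hypothetical sketch of on-device (online) learning, not BrainChip's
# actual method: a perceptron that nudges its weights one example at a
# time, so the model adapts locally and no raw data leaves the device.

def perceptron_step(w, b, x, y, lr=0.1):
    """One online update: predict, then move the weights toward the label."""
    pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
    err = y - pred
    w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    b = b + lr * err
    return w, b

# Learn a simple AND-style rule from a local stream of examples
w, b = [0.0, 0.0], 0.0
stream = [([1, 1], 1), ([0, 1], 0), ([1, 0], 0), ([0, 0], 0)] * 20
for x, y in stream:
    w, b = perceptron_step(w, b, x, y)
```

The point of the sketch is only that each update is cheap and incremental, which is why this style of learning fits battery-powered edge hardware.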

Enhancing Security Through Local Processing​

Security is a top concern for any AI system, especially when it comes to edge devices. With aTENNuate’s focus on local processing, data remains on the device, reducing the risk of breaches during transmission. This makes the technology ideal for sensitive applications, such as healthcare devices or financial systems. Plus, with the constant evolution of cyber threats, having AI that can quickly adapt and bolster security protocols in real-time is invaluable.

Versatility Across Industries​

The applications of aTENNuate extend across various industries, from automotive to healthcare, and even retail. In the automotive sector, edge AI systems powered by aTENNuate can enhance autonomous driving by allowing vehicles to process sensor data instantly, making real-time decisions critical for safety. In healthcare, wearable devices equipped with aTENNuate can monitor vital signs and alert users or physicians about potential health risks immediately, without needing cloud access.

Paving the Way for the Future of AI​

As AI becomes more integrated into our everyday lives, the importance of efficient, real-time processing will only grow. aTENNuate is poised to lead this shift, enabling AI to become not only smarter but also more sustainable. With its ability to reduce power consumption, enhance security, and expand AI’s potential across industries, BrainChip’s aTENNuate is much more than just another AI innovation—it’s the future of edge computing.

The Bottom Line​

The introduction of aTENNuate marks a significant step forward in the world of AI. By addressing the critical challenges of power consumption, performance, and adaptability, BrainChip is setting a new standard for edge AI solutions. Whether you’re looking at smarter homes, safer cars, or more personalized devices, aTENNuate ensures that the next generation of AI will be faster, greener, and smarter than ever before.
As AI continues to evolve, it’s exciting to see where groundbreaking technologies like aTENNuate will take us next. With real-time processing and local learning capabilities, we’re looking at a future where AI is more responsive, secure, and efficient—all without needing a cloud connection.






BrainChip has developed the 2nd generation of its Akida platform, which is based on neuromorphic computing.
Neuromorphic computing allows complex neural networks to operate more efficiently and with lower energy consumption.

A key feature of the new Akida platform is the Temporal Event-Based Neural Networks (TENNs), which enable a significant reduction in model size and computational effort without compromising accuracy.
These networks are particularly useful for applications that process temporal data, such as audio and speech processing, video object recognition, and predictive maintenance.

The Akida platform also supports Vision Transformers (ViT) and Long Range Skip Connections, which accelerate the processing of visual and multisensory data.
These features are crucial for applications in healthcare, the automotive industry, and smart cities.

Another advantage of the Akida technology is its on-chip learning capability, which allows sensitive data to be processed locally, thereby enhancing security and privacy.
The platform is flexible and scalable, making it suitable for a wide range of Edge AI applications.
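BrainChip hasn't published TENNs' internals, but the streaming idea the summary describes (processing temporal data sample by sample with a small fixed state, rather than re-running a whole sequence) can be sketched generically. The class name and kernel values below are illustrative assumptions, not BrainChip's implementation:

```python
import numpy as np

class CausalTemporalFilter:
    """Toy stand-in for a streaming temporal layer: it keeps only a
    small rolling buffer of past inputs, so every new sample costs a
    fixed amount of memory and compute."""

    def __init__(self, kernel):
        self.kernel = np.asarray(kernel, dtype=float)
        self.buffer = np.zeros(len(self.kernel))  # most recent sample first

    def step(self, x):
        """Consume one sample from the stream, emit one filtered output."""
        self.buffer = np.roll(self.buffer, 1)
        self.buffer[0] = x
        return float(self.buffer @ self.kernel)

# Streaming use: constant memory per step, no full-sequence reprocessing
f = CausalTemporalFilter([0.5, 0.3, 0.2])
outputs = [f.step(x) for x in [1.0, 2.0, 3.0, 4.0]]
```

A real TENN is far richer than a fixed linear kernel; the sketch only shows why this streaming style of temporal model suits low-memory edge devices for audio, video, and predictive-maintenance data.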
 
Reactions: 71 users

Tothemoon24

Top 20

Occupant Safety Redefined: Socionext’s Innovations​

[Image: ASEUR24 – Matthias Neumann]

Socionext is at the forefront of automotive interior sensing technology, known for their cutting-edge radar sensors and advanced signal processing solutions. This year, they introduce the revolutionary 60 GHz Radar Sensor, SC1260, which sets new benchmarks in vehicle safety. The SC1260 is designed to enhance in-cabin experiences, offering critical features such as child presence detection and seat occupancy monitoring, all while being highly efficient with minimal power consumption.​

At InCabin, Socionext presents a dynamic cockpit demonstrator, showcasing the SC1260’s ability to detect vital signs like breathing, ensuring a safer and smarter automotive environment.​



Q&A with Socionext

What exciting innovations is Socionext bringing to the table this year?
We’re thrilled to showcase our latest innovation—the 60 GHz Radar Sensor, SC1260. At InCabin, we’ll present a cockpit demonstrator that highlights the groundbreaking capabilities of this chip, setting new standards for automotive interior safety.
What makes the SC1260 radar sensor stand out from existing solutions?
The SC1260 is more than just a radar sensor – it’s a highly advanced solution with integrated antennas in a compact 6×9 mm package. It includes two transmitter and four receiver antennas, simplifying vehicle integration. The chip also features advanced signal processing that outputs point cloud data directly, reducing BOM costs by enabling the use of smaller CPUs and memory. With an average power consumption of just 0.7 milliwatts, it’s incredibly efficient.
Signal processing is a critical component. Can you explain the advantages of the SC1260’s signal processing capabilities?
Absolutely. The SC1260 features integrated blocks for advanced signal processing, including range FFT and a CFAR engine for filtering. It measures distance and angle, clusters data, and outputs point cloud data in XYZ coordinates. All of this is built into the hardware, eliminating the need for an external CPU or firmware, which makes it a highly efficient solution.
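Range FFT and CFAR are standard radar signal-processing steps; the SC1260 implements them in dedicated hardware, but the idea can be sketched in software. The synthetic waveform, window choice, and CFAR parameters below are illustrative assumptions, not Socionext's design:

```python
import numpy as np

def range_fft(chirp_samples):
    """Turn one chirp's time-domain samples into a range profile.
    A window reduces spectral leakage; each FFT bin's magnitude
    corresponds to reflected energy at one range."""
    windowed = chirp_samples * np.hanning(len(chirp_samples))
    return np.abs(np.fft.rfft(windowed))

def ca_cfar(profile, guard=2, train=8, scale=4.0):
    """Cell-averaging CFAR: flag bins whose level exceeds a threshold
    derived from nearby 'training' cells, skipping 'guard' cells
    around the cell under test."""
    n = len(profile)
    detections = []
    for i in range(n):
        lo = max(0, i - guard - train)
        hi = min(n, i + guard + train + 1)
        # training cells exclude the guard band and the cell itself
        left = profile[lo:i - guard] if i - guard > lo else np.empty(0)
        cells = np.concatenate([left, profile[i + guard + 1:hi]])
        if cells.size and profile[i] > scale * cells.mean():
            detections.append(i)
    return detections

# Synthetic chirp: noise plus a strong reflector in one range bin
rng = np.random.default_rng(0)
samples = rng.normal(0, 0.1, 256)
samples += np.sin(2 * np.pi * 40 * np.arange(256) / 256)  # target near bin 40
profile = range_fft(samples)
hits = ca_cfar(profile)  # the reflector's bin should be among the detections
```

In the SC1260 this pipeline, plus angle estimation and clustering into XYZ point clouds, runs entirely on-chip, which is what removes the need for an external CPU and firmware.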
What are the primary use cases for this radar sensor in automotive applications?
The SC1260 is designed for a variety of automotive interior sensing applications. One key use case is child presence detection, where the radar sensor can detect vital signs like breathing or heartbeat, even through materials like blankets. It’s also highly effective for seat occupancy detection, accurately identifying multiple passengers. Additionally, it’s perfect for security monitoring, such as intrusion detection or proximity alerts.
Are there any other potential applications for the SC1260 that you’d like to highlight?
Definitely. The SC1260 can be used in kick-sensors for tailgate operations, distinguishing between gestures like kicks or swipes. It’s also ideal for gesture control in HMI applications, allowing drivers to control functions with simple hand movements, reducing the need for physical buttons and enhancing the user experience.
You mentioned presenting a cockpit demonstrator at the show. What does the demonstrator consist of?
Our cockpit demonstrator immerses attendees in a real-world automotive experience. It features a replica of a classic car interior with both front and rear seats. Inside, we’ve installed our state-of-the-art radar sensor, connected to a front panel display that visualizes live radar data. It’s a dynamic way to showcase the future of in-cabin sensing.
Will the cockpit show how passengers can be seen in the car as well as other critical information?
Exactly! The radar sensor does more than just detect seat occupancy—it can identify passengers in real time and even detect vital signs like breathing. It’s designed to elevate vehicle safety and comfort, demonstrating how advanced radar technology can monitor passengers with precision and reliability.
Will attendees be able to experience the cockpit demonstrator and test it firsthand?
Absolutely! We invite everyone to sit in the cockpit and experience the technology firsthand. It’s an interactive demonstration where you can see the radar sensor in action. We’re excited to guide attendees through the possibilities and discuss how this technology can transform the future of automotive interiors.
How can our readers learn more?
You can visit our stand and explore our exciting products firsthand. Our experts will be on-site to answer any questions.
For those interested in learning more, Socionext encourages visits to their exhibition stand and engagement on their social media platforms:

 
Reactions: 22 users

Tothemoon24

Top 20
[image attachment]
 
Reactions: 38 users

Frangipani

Regular
Encouraging words by our CTO:

[screenshot attachment]
 
Reactions: 65 users

Frangipani

Regular








By the way, there is at least one confirmed connection as well as two potential dots between our CTO Tony Lewis / BrainChip and members of the NeuroMed 2024 Organizing Committee:



Ralph Etienne-Cummings, Professor of Electrical and Computer Engineering at Johns Hopkins University in Baltimore: “Yeah, Tony and I have published together quite a bit as well, so we have a long history together.”
(From the Brains & Machines podcast episode with Elisa Donati)




(…) The USC Brain-Body Dynamics Lab is led by Francisco Valero-Cuevas, Professor of Biomedical Engineering, Aerospace and Mechanical Engineering, Electrical and Computer Engineering, Computer Science, and Biokinesiology and Physical Therapy (try to fit all this onto one business card!), who has been fascinated with the biomechanics of the human hand for years (see the 2008 article below) and was already talking about utilising neuromorphic chips in robots three years ago, in a ‘research update’ video recorded on June 17, 2021:



“But at the same time we are building physical robots that have what are called ‘neuromorphic circuits’. These are computer chips that are emulating populations of spiking neurons that then feed motors and amplifiers and the like that actually produce manipulation and locomotion.” (from 2:56 min)


View attachment 65930


Given that a number of USC Viterbi School of Engineering faculty members are evidently favourably aware of BrainChip (see below) - plus our CTO happens to be a USC School of Engineering alumnus and possibly still has some ties to his alma mater - I wouldn’t be surprised if Valero Lab researchers were also experimenting with Akida.

View attachment 65856

View attachment 65858
 
Reactions: 49 users

Slade

Top 20
Great posts by @Frangipani today that I genuinely appreciate.
 
Reactions: 27 users

IloveLamp

Top 20
Reactions: 9 users