BRN Discussion Ongoing

Tothemoon24

Top 20


Information in the public domain suggests Valeo has an early #innovation lead. Its Smart Safety 360 combines interior and exterior #vision with #radar and #ultrasonic sensing. Magna International may be working on a similar solution: it has swallowed Veoneer's entire active safety business and has known partnerships with Mobileye and Seeing Machines.
 
  • Like
  • Fire
Reactions: 10 users

Quiltman

Regular
This post was made by Arijit from TCS Research 5 months ago, seeking PhD and Masters candidates.
I'm unsure if it was posted on this forum at the time.

Two months after this post by Arijit, TCS announced a formal commercial partnership with BrainChip via Tata Elxsi, with a focus on healthcare and industrial (robotics).

Just think about what is being said here ... with the knowledge that it is being done utilising BrainChip IP.

At TCS Research, we specialise in embedding intelligence at the edge through Neuromorphic Computing and Spiking Neural Networks.
Our systems targeted for evolving neuromorphic hardware offer extreme low-power consumption, online learning, and real-time inferencing, ideal for IoT, edge analytics, healthcare, robotics, space-tech & more.


explore new topics, advance ongoing projects
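
For anyone new to the jargon in the TCS post above: a spiking neural network computes with discrete events (spikes) rather than continuous activations, which is where the extreme-low-power and real-time claims come from. Below is a minimal, generic sketch of a leaky integrate-and-fire (LIF) neuron, the textbook building block of an SNN. It is purely illustrative; the model and parameters are standard textbook choices, not TCS or BrainChip code.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the basic unit of a
# spiking neural network. Work is done only when spikes arrive, which
# is the root of the event-driven, low-power argument.

def lif_neuron(input_spikes, w=0.6, tau=20.0, threshold=1.0, dt=1.0):
    """Simulate one LIF neuron over a binary input spike train."""
    v = 0.0                     # membrane potential
    output = []
    for s in input_spikes:
        v += (-v / tau) * dt    # passive leak toward rest
        v += w * s              # integrate incoming spike with weight w
        if v >= threshold:      # fire and reset
            output.append(1)
            v = 0.0
        else:
            output.append(0)
    return output

# A sparse input train yields an even sparser output train: the neuron
# fires only when enough spikes arrive close together in time.
print(lif_neuron([0, 1, 0, 0, 1, 1, 0, 0, 0, 1]))
```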

If we can't be bullish about this ... well ... then I am lost for words!

 
  • Like
  • Fire
  • Love
Reactions: 67 users

Tothemoon24

Top 20

💥 [BREAKING] Today, we proudly unveil the GenX320 Metavision® sensor - the smallest 🔍 and most power-efficient event-based vision sensor in the world!
👉 https://bit.ly/3QgQoRY

Built for the next generation of smart consumer devices, GenX320 delivers new levels of intelligence, autonomy, and privacy to a vast range of fast-growing Edge market segments, including AR/VR headsets, wearables, smart camera and monitoring systems, IoT devices, and many more.

The sensor has been developed with a specific focus on the unique energy, compute, and size requirements of Edge AI vision systems. It enables robust, high-speed vision at ultra-low power, even in challenging operating and lighting conditions.

GenX320 key benefits include:
✅ Ultra-fast event timestamping (1 µsec) with flexible data formatting
✅ Smart power management for ultra-low power consumption and wake-on-events (as low as 36µW)
✅ Seamless integration with standard SoCs, reducing external processing
✅ Low-latency connectivity through MIPI or CPI data interfaces
✅ AI-ready with on-chip histogram output for AI accelerators
✅ Sensor-level privacy due to inherently sparse event data and static scene removal
✅ Compatibility with Prophesee Metavision Intelligence software suite

🚀 Learn more about how the GenX320 successfully overcomes current vision sensing limitations in edge applications 👉 https://bit.ly/3QgQoRY
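
For readers wondering what "event-based" output actually looks like: instead of frames, the sensor emits a sparse stream of per-pixel events. The sketch below assumes a simplified (timestamp, x, y, polarity) tuple stream; Prophesee's real sensors emit compressed EVT encodings that are decoded by the Metavision SDK, so treat this only as an illustration of how a wake-on-event mode could sit on top of such a stream.

```python
# Format-agnostic sketch of consuming an event-based sensor stream.
# Each event is modelled as (timestamp_us, x, y, polarity); real
# Prophesee hardware uses compressed EVT formats instead.

from collections import namedtuple

Event = namedtuple("Event", ["t_us", "x", "y", "p"])  # p: +1 ON, -1 OFF

def wake_on_activity(events, window_us=10_000, min_events=50):
    """Toy wake-on-event logic: return True once enough events land in a
    sliding time window -- the idea behind an always-on microwatt mode
    that wakes the rest of the system."""
    window = []
    for ev in events:
        window.append(ev.t_us)
        while window and ev.t_us - window[0] > window_us:
            window.pop(0)                 # drop events older than the window
        if len(window) >= min_events:
            return True, ev.t_us          # wake the host at this timestamp
    return False, None

# Synthetic stream: one event every 100 microseconds on a 320x320 grid.
stream = [Event(t, t % 320, (7 * t) % 320, 1) for t in range(0, 20_000, 100)]
print(wake_on_activity(stream))           # -> (True, 4900)
```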
 
  • Like
  • Fire
  • Love
Reactions: 89 users

Frangipani

Top 20

Event-based sensor for ‘always-on’ video, low-power apps​

New Products | October 16, 2023
By Peter Clarke
IMAGE SENSOR



Event-based image sensor pioneer Prophesee SA (Paris, France) has launched a low-power 320 pixel by 320 pixel event-based sensor for multiple applications including ‘always-on’ applications.​

The GenX320 is the first of Prophesee's fifth generation of event-based image sensors and is made at a European foundry that makes image sensors and supports back-side illumination, said Luca Verre, CEO and co-founder of Prophesee. Generations 2 and 3 were fabbed for Prophesee by Tower Semiconductor and Gen4 by Sony, said Verre, but he declined to identify the manufacturer of the GenX320.


The emphasis for the GenX320 is on low power consumption and it is the world’s smallest and most power-efficient event-based vision sensor, said Verre. This makes it suitable for integration in IoT camera and detection systems, AR/VR headsets, gesture recognition devices and eye-tracking applications.


The fifth-generation Metavision sensor has a 3mm by 4mm die with a 6.3-micron pixel pitch, BSI-stacked, in a 1/5-inch optical format.

Specifications​

The small size and low power consumption open up numerous edge applications. For people-counting and fall-detection, the modest resolution is a virtue, Prophesee said, as it helps preserve privacy.


Latency is of the order of microseconds, enabling high-precision time-stamping of events, and the nature of event-based detection makes the sensor suitable for high-dynamic-range and low-light applications such as outdoor environments.

On-chip power management modes reduce power consumption to as low as 36 microwatts, allowing the image sensor to serve as an always-on resource that can wake up a system. Deep sleep and standby modes are also featured.

MIPI or CPI data output interfaces offer low-latency connectivity to embedded processing platforms, including low-power microcontrollers and modern neuromorphic processor architectures. The sensor also supports histogram output compatible with multiple AI accelerators.
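
To make the histogram output mentioned above concrete, here is a software analogy: sparse events accumulated over a time slice into dense per-pixel count maps that a conventional AI accelerator can consume like a frame. The on-chip block presumably performs something of this kind in hardware; the code is a generic sketch, not Prophesee's design.

```python
# Software analogy of an on-chip histogram output: bin sparse events from
# one time slice into dense 320x320 ON/OFF count maps, i.e. a frame-like
# tensor that a CNN or AI accelerator can process directly.

def events_to_histogram(events, width=320, height=320):
    """events: iterable of (t_us, x, y, polarity) tuples.
    Returns (on_counts, off_counts) as nested lists."""
    on = [[0] * width for _ in range(height)]
    off = [[0] * width for _ in range(height)]
    for t_us, x, y, p in events:
        if p > 0:
            on[y][x] += 1
        else:
            off[y][x] += 1
    return on, off

events = [(0, 10, 12, 1), (5, 10, 12, 1), (9, 11, 12, -1)]
on, off = events_to_histogram(events)
print(on[12][10], off[12][11])   # -> 2 1
```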

There is native compatibility with Prophesee’s Metavision Intelligence event-based vision software suite.

Early access​

Prophesee has sampled the GenX320 with a number of customers who are developing specific use cases.

Zinn Labs is developing gaze-tracking systems with a power budget below 20mW. The package size of the GenX320 allows it to be applied to space-constrained head-mounted applications in AR/VR products.

Ultraleap Ltd. is using GenX320 event-based sensors for hand tracking and gesture recognition in its TouchFree interface application.

The GenX320 is available for purchase from Prophesee and its sales partners. It is supported by a complete range of development tools for easy exploration and optimization, including a comprehensive Evaluation Kit housing a chip-on-board GenX320 module.

Related links and articles:​

www.prophesee.ai
 
  • Like
  • Love
  • Fire
Reactions: 61 users

Diogenese

Top 20
💥 [BREAKING] Today, we proudly unveil the GenX320 Metavision® sensor - the smallest 🔍 and most power-efficient event-based vision sensor in the world!
👉 https://bit.ly/3QgQoRY

Built for the next generation of smart consumer devices, GenX320 delivers new levels of intelligence, autonomy, and privacy to a vast range of fast-growing Edge market segments, including AR/VR headsets, wearables, smart camera and monitoring systems, IoT devices, and many more.

The sensor has been developed with a specific focus on the unique energy, compute, and size requirements of Edge AI vision systems. It enables robust, high-speed vision at ultra-low power, even in challenging operating and lighting conditions.

GenX320 key benefits include:
✅ Ultra-fast event timestamping (1 µsec) with flexible data formatting
✅ Smart power management for ultra-low power consumption and wake-on-events (as low as 36µW)
✅ Seamless integration with standard SoCs, reducing external processing
✅ Low-latency connectivity through MIPI or CPI data interfaces
✅ AI-ready with on-chip histogram output for AI accelerators
✅ Sensor-level privacy due to inherently sparse event data and static scene removal
✅ Compatibility with Prophesee Metavision Intelligence software suite

🚀 Learn more about how the GenX320 successfully overcomes current vision sensing limitations in edge applications 👉 https://bit.ly/3QgQoRY


Event-Based Metavision® Sensor GENX320 | PROPHESEE


The link leads to Prophesee's early adopters:

Zinn Labs,
Ultraleap,
Xperi.

Zinn patent application for eye tracking glasses:


WO2023081297A1 EYE TRACKING SYSTEM FOR DETERMINING USER ACTIVITY





Embodiments relate to an eye tracking system. A headset of the system includes an eye tracking sensor that captures eye tracking data indicating positions and movements of a user's eye. A controller (e.g., in the headset) of the tracking system analyzes eye tracking data from the sensors to determine eye tracking feature values of the eye during a time period. The controller determines an activity of the user during the time period based on the eye tracking feature values. The controller updates an activity history of the user with the determined activity.

A method comprising: analyzing eye tracking data to determine eye tracking feature values of an eye of a user of a headset during a time period, wherein the eye tracking data is determined from an eye tracking system on the headset; determining an activity of the user during the time period based on the determined eye tracking feature values; and updating an activity history of the user with the determined activity, wherein the feature values include movements of the eye, and determining the activity comprises identifying movements of the eye that correspond to the activity.

In some embodiments, a machine learned model of the activity module 310 is a recurrent neural network (e.g., using a long short-term memory neural network or gated recurrent units) that considers the time-based component of the eye tracking feature values.
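
The patent's recurrent-network idea is straightforward to picture in code. Below is a hedged PyTorch sketch of a GRU classifying user activity from a window of eye-tracking feature values; the feature count, window length, and number of activity classes are invented for illustration and do not come from the patent.

```python
# Sketch of the patent's described approach: a recurrent network (GRU)
# mapping a time series of eye-tracking feature values to an activity
# label. All dimensions here are illustrative placeholders.

import torch
import torch.nn as nn

class ActivityClassifier(nn.Module):
    def __init__(self, n_features=6, hidden=32, n_activities=4):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_activities)

    def forward(self, x):           # x: (batch, time, n_features)
        _, h = self.gru(x)          # h: (num_layers, batch, hidden)
        return self.head(h[-1])     # logits over activity classes

# One hypothetical 2-second window of 6 features (e.g. gaze velocity,
# fixation duration) sampled at 50 Hz -> 100 timesteps.
model = ActivityClassifier()
window = torch.randn(1, 100, 6)
print(model(window).shape)          # torch.Size([1, 4])
```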
 
  • Like
  • Fire
  • Love
Reactions: 50 users

MrRomper

Regular
  • Like
  • Fire
  • Thinking
Reactions: 38 users

charles2

Regular
  • Like
  • Thinking
Reactions: 7 users

IloveLamp

Top 20
  • Like
  • Fire
Reactions: 28 users

Tuliptrader

Regular
  • Like
  • Haha
  • Love
Reactions: 31 users

Nobody

Regular
 
  • Fire
  • Like
  • Love
Reactions: 3 users
The big seller walls are back up to hold share price down and in place for as long as possible…resume manipulation programming.
 
  • Like
  • Sad
Reactions: 10 users

Vladsblood

Regular
The big seller walls are back up to hold share price down and in place for as long as possible…resume manipulation programming.
Gotcha Fastback. Outright morally criminal systemic activity, sanctified by the ever-complying ASX throughout BrainChip's advances.
Especially since around the MB announcement. Vlad.
 
  • Like
Reactions: 8 users
Gotcha Fastback. Outright morally criminal systemic activity, sanctified by the ever-complying ASX throughout BrainChip's advances.
Especially since around the MB announcement. Vlad.
Yes, it sure is criminal by the big insto players @Vladsblood.

Instos know they are going to make a mint out of BrainChip in the medium to long term at these buy prices ... they will hold here as long as they can.
 
  • Like
Reactions: 6 users

7für7

Top 20
What about the partnership between Qualcomm and Prophesee? It's not only BrainChip working with Prophesee. I'm still waiting for a statement from BrainChip's side! DYOR
 
  • Like
  • Haha
Reactions: 6 users
Wow! what a flurry of activity!
Could it be a result of the second-generation release? No wait, that would take time to sync with the development cycle of these companies; it must be built on those useless AKIDA 1000 and 1500 chip designs :ROFLMAO:.

UNLESS

the ecosystem partners have managed to really shorten the implementation cycle, like they have been suggesting. 🚀
Either way, great news for the 'latched-on barnacle' (LOB) BRN holders.
Exciting next 2 Quarters in this LOB's opinion.
 
  • Like
  • Haha
Reactions: 13 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Modern neuromorphic processor architectures...PLURAL...Hmmm?????



This Tiny Sensor Could Be in Your Next Headset​

Prophesee
[Image: Prophesee Event-Based Metavision GenX320 bare die]

Neuromorphic computing company develops event-based vision sensor for edge AI apps.
Spencer Chin | Oct 16, 2023


As edge-based artificial intelligence (AI) applications become more common, there will be a greater need for sensors that can meet the power and environmental needs of edge hardware. Prophesee SA, which supplies advanced neuromorphic vision systems, has introduced an event-based vision sensor for integration into ultra-low-power edge AI vision devices. The GenX320 Metavision sensor, which uses a tiny 3x4mm die, leverages the company’s technology platform into growing intelligent edge market segments, including AR/VR headsets, security and monitoring/detection systems, touchless displays, eye tracking features, and always-on intelligent IoT devices.

According to Luca Verre, CEO and co-founder of Prophesee, the concept of event-based vision has been researched for years, but developing a viable commercial implementation in a sensor-like device has only happened relatively recently. "Prophesee has used a combination of expertise and innovative developments around neuromorphic computing, VLSI design, AI algorithm development, and CMOS image sensing," said Verre in an e-mail interview with Design News. "Together, those skills and advancements, along with critical partnerships with companies like Sony, Intel, Bosch, Xiaomi, Qualcomm, 🤔and others 🤔have enabled us to optimize a design for the performance, power, size, and cost requirements of various markets."

Prophesee's vision sensor is a 320x320, 6.3μm pixel BSI stacked event-based vision sensor that offers a tiny 1/5-in. optical format. Verre said, "The explicit goal was to improve integrability and usability in embedded at-the-edge vision systems, which in addition to size and power improvements, means the design must address the challenge of event-based vision's unconventional data format, nonconstant data rates, and non-standard interfaces to make it more usable for a wider range of applications. We have done that with multiple integrated event data pre-processing, filtering, and formatting functions to minimize external processing overhead."
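
One common event-camera pre-processing technique of the kind Verre describes is a per-pixel refractory filter, which drops events that arrive too soon after the previous event at the same pixel, thinning noise bursts before the data ever leaves the sensor. The sketch below is a generic version of that idea, not Prophesee's actual on-chip logic.

```python
# Generic per-pixel refractory filter for an event stream: suppress any
# event that follows the previous event at the same pixel within a dead
# time, reducing downstream processing load. Illustrative only.

def refractory_filter(events, dead_time_us=500):
    """events: time-ordered iterable of (t_us, x, y, polarity).
    Yields only events spaced more than dead_time_us apart per pixel."""
    last_seen = {}                              # (x, y) -> last kept time
    for t_us, x, y, p in events:
        if t_us - last_seen.get((x, y), -dead_time_us - 1) > dead_time_us:
            last_seen[(x, y)] = t_us
            yield (t_us, x, y, p)

noisy = [(0, 5, 5, 1), (100, 5, 5, 1), (700, 5, 5, 1), (800, 6, 5, -1)]
print(list(refractory_filter(noisy)))
# -> [(0, 5, 5, 1), (700, 5, 5, 1), (800, 6, 5, -1)]
```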

Verre added, "In addition, MIPI or CPI data output interfaces offer low-latency connectivity to embedded processing platforms, including low-power microcontrollers and modern neuromorphic processor architectures."

Low-Power Operation

According to Verre, the GenX320 sensor has been optimized for low-power operation, featuring a hierarchy of power modes and application-specific modes of operation. On-chip power management further improves sensor flexibility and integrability. To meet aggressive size and cost requirements, the chip is fabricated using a CMOS stacked process with pixel-level Cu-Cu bonding interconnects achieving a 6.3μm pixel-pitch.
The sensor performs low-latency, microsecond-resolution timestamping of events with flexible data formatting. On-chip intelligent power management modes reduce power consumption to as low as 36 µW and enable smart wake-on-events. Deep sleep and standby modes are also featured.

According to Prophesee, the sensor is designed to be easily integrated with standard SoCs with multiple combined event data pre-processing, filtering, and formatting functions to minimize external processing overhead. MIPI or CPI data output interfaces offer low-latency connectivity to embedded processing platforms, including low-power microcontrollers and modern neuromorphic processor architectures.

Prophesee's Verre expects the sensor to find applications in AR/VR headsets. "We are solving an important issue in our ability to efficiently (i.e. low power/low heat) support foveated rendering in eye tracking for a more realistic, immersive experience. Meta has discussed publicly the use of event-based vision technology, and we are actively involved with our partner Zinn Labs in this area. XPERI has already developed a driver monitoring system (DMS) proof of concept based on our previous generation sensor for gaze monitoring and we are working with them on a next-gen solution using GenX320 for both automotive and other potential uses, including micro expression monitoring. The market for gesture and motion detection is very large, and our partner Ultraleap has demonstrated a working prototype of a touch-free display using our solution."

The sensor incorporates an on-chip histogram output compatible with multiple AI accelerators. The sensor is also natively compatible with Prophesee Metavision Intelligence, an open-source event-based vision software suite that is used by a community of over 10,000 users.

Prophesee will support the GenX320 with a complete range of development tools for easy exploration and optimization, including a comprehensive Evaluation Kit housing a chip-on-board (COB) GenX320 module, or a compact optical flex module. In addition, Prophesee will offer a range of adapter kits that enable seamless connectivity to a large range of embedded platforms, such as an STM32 MCU, speeding time-to-market.

Spencer Chin is a Senior Editor for Design News covering the electronics beat. He has many years of experience covering developments in components, semiconductors, subsystems, power, and other facets of electronics from both a business/supply-chain and technology perspective. He can be reached at Spencer.Chin@informa.com.

 
  • Like
  • Love
  • Fire
Reactions: 36 users

Murphy

Life is not a dress rehearsal!
Berlin, the Disruptive's Substack article is one of the best overviews of the dilemma faced by cloud servers globally. It explains what BRN solves and why; provides a glossary of the terms a layperson needs to follow the story; covers where server farms and data centres are headed and what the edge is; compares Arm and BRN and why BRN could possibly be as big as Arm; and describes what differentiates BRN, what Akida represents technically, why now is the time for BRN to begin making inroads into the AI scenario, and more.

It is compelling, and definitely a great article for your parents to read. I will say that it is probably the best overview of BRN that I have seen anywhere outside of this forum, so if you haven't had a look at it, do yourself a favour. And thanks @Berlinforever
What a great read for the layman or computer genius.
Every holder of BRN should read this.


If you don't have dreams, you can't have dreams come true!
 
  • Like
  • Fire
  • Love
Reactions: 18 users
Has anyone seen this:


BrainChip's leadership shares details on the company's development of the Akida chip, future prospects, and more in this informative podcast.​

Artificial Intelligence
- Edge AI
- Edge Processors
In a recent episode of the BrainChip podcast, a select group of the company's top executives gathered to discuss the future trajectory of artificial intelligence (AI) and the pivotal role BrainChip is set to play in this dynamic landscape. We highlight some of the key discussion points below but encourage you to listen to the whole episode here.

Leadership with Vision​

Rob Telson, serving as the Vice President of Sales and Marketing, is the driving force behind BrainChip's global outreach. With a keen understanding of market dynamics and a vision for BrainChip's expansive global presence, Telson has been instrumental in positioning the company as a leader in the AI industry.
Peter van der Made, the CEO and Co-founder of BrainChip, brings to the table a wealth of experience and a visionary approach. His emphasis on the importance of the company's Advanced Research Center in Perth showcases his commitment to pioneering the next wave of AI innovations. Van der Made's insights into the potential of Akida, BrainChip's flagship technology, highlight his forward-thinking approach to AI's future.
Anil Mankar, the Vice President of Product Development and also a Co-founder, offers a deep dive into the technical intricacies of BrainChip's operations. His insights into the production process, from chip manufacturing to rigorous testing, provide a glimpse into the meticulous steps BrainChip takes to ensure top-tier functionality and performance.
Lastly, Ken Scarince, the Chief Financial Officer of BrainChip, sheds light on the company's financial endeavours. His discussions of BrainChip's strategic financial initiatives, including capital raising agreements and efforts to bolster its presence in the U.S. capital markets, underscore the company's ambitions for growth and market dominance.

BrainChip's Evolution: A Glimpse into the Company's Global Strategy

To kick off the episode, the team outlines the company’s history, vision and future plans. BrainChip's commitment to being a global leader in the AI industry is evident in its expansive operational presence. With hubs in California, Perth, France, and India, the company has strategically positioned itself in key tech-centric locations. This global footprint not only facilitates diverse collaborations but also ensures that BrainChip remains at the pulse of AI advancements worldwide.

From Research to Production: A Strategic Pivot​

Historically, BrainChip has been synonymous with cutting-edge research in neuromorphic computing. Their dedication to pushing the boundaries of AI has positioned them as pioneers in the field. However, recognizing the vast commercial potential of their innovations and the industry's shifting dynamics, BrainChip is undergoing a transformation.
Telson emphasised this transition in the podcast, noting the company's pivot from being primarily research-driven to adopting a production-centric approach. For engineers, this shift signifies BrainChip's intent to translate its groundbreaking research into tangible, market-ready solutions.

Engaging the Tech Community​

Understanding the importance of effective communication in the tech world, especially among engineers and developers, Telson highlighted BrainChip's efforts to foster engagement. The company's podcast series, for instance, is more than just a marketing tool. It's a platform for knowledge sharing, offering insights into BrainChip's offerings, their vision for AI's future, and the technical intricacies that make their solutions stand out. Stay up to date with BrainChip activities by following their Wevolver profile.

The Advanced Research Center: A Beacon of Innovation​

Located in Perth, Australia, the Advanced Research Center is BrainChip's crown jewel. It's not just a research facility; it's a testament to the company's dedication to pushing the boundaries of what's possible in AI. While many in the industry focus on refining existing deep learning models, BrainChip's centre is already looking beyond, exploring the next frontier of AI innovations.

Akida: The Future of AI Technology​

Van der Made's enthusiasm was palpable when discussing Akida, BrainChip's flagship technology. Akida is not just another chip in the market; it embodies BrainChip's vision for the future of AI. Two of its standout features are its low energy consumption and its on-chip learning capabilities.
For engineers and tech enthusiasts, these features are significant. The low energy consumption means that Akida is not only efficient but also environmentally conscious, addressing a growing concern in today's tech-driven world. On the other hand, on-chip learning capabilities represent a leap in AI technology, allowing for faster, more efficient processing without the need for constant back-and-forth with centralised data centres.
In van der Made's view, Akida is set to redefine the AI industry. Its unique features position it not just as an innovative product but as a transformative solution that could shape the way we think about and implement AI in various applications.

Financial Initiatives and Market Presence​

To wrap up the podcast, Ken Scarince, BrainChip's Chief Financial Officer, touched upon the company's recent financial endeavours. He highlighted the company's agreement with LDA, aimed at raising capital, and BrainChip's strategic move to enhance its presence in the U.S. capital markets. Additionally, the company is bolstering its investor relations strategy, aiming to foster better communication with stakeholders and educate the market about BrainChip's groundbreaking offerings.

Conclusion​

As BrainChip continues to make strides in the AI domain, the company remains committed to keeping its audience informed and engaged. With a series of events lined up to showcase Akida's capabilities and an unwavering focus on innovation, BrainChip is undoubtedly poised to redefine the boundaries of AI. Stay up to date with BrainChip's new content here.
 
  • Like
  • Fire
  • Love
Reactions: 27 users