BRN Discussion Ongoing

Diogenese

Top 20
Fair chance you'd know these lines from your post a few weeks ago referring to the number 'Sydney Town' from G. Shearston.
The monopolies can all arrange
To rig their shares on the Stock Exchange
Through lottery tickets with my spouse
I've got shares in the Opera House.
I've got 6 bottles in the fridge,
 
  • Haha
  • Like
Reactions: 5 users

buena suerte :-)

BOB Bank of Brainchip
  • Like
  • Haha
  • Fire
Reactions: 12 users

McHale

Regular
I thought that newcomers might struggle with a lot of the very well researched content here, so I asked myself how I would explain BrainChip to my parents. I am certain that aspects of this multi-page summary will not stand up to the technical scrutiny applied here. Just trust me that I am here for the long run.

Berlin, the Disruptive's Substack article is one of the best overviews of the dilemma faced by cloud servers globally. It explains what BRN solves and why, provides a glossary of the terms a lay person needs to follow the story, describes where server farms/data centres are headed and what the edge is, compares Arm and BRN and why BRN could possibly be as big as Arm, sets out what differentiates BRN, gives a technical description of what Akida represents, explains why now is the time for BRN to begin making inroads into the AI scene, and more.

It is compelling and definitely a great article for your parents to read. I will say it is probably the best overview of BRN that I have seen anywhere outside of this forum, so if you haven't had a look at it, do yourself a favour. And thanks @Berlinforever
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 61 users

M_C

Founding Member
Lockheed Martin on neuromorphic computing and the like...



1000006807.png
1000006808.png
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 26 users

Rach2512

Regular
  • Like
  • Fire
  • Love
Reactions: 17 users
Low volume day............Shorts heading in the right direction...........bring it

View attachment 47139

Does anyone have any of the short data that dates back to the start of 2022, i.e. when the MB announcement was made?
 
  • Like
  • Thinking
Reactions: 2 users

M_C

Founding Member
  • Like
  • Fire
  • Love
Reactions: 19 users
I can't help it, but it's like things are eerily quiet today, like the calm before the storm. Even in the forum.
1697440308505.gif
 
  • Haha
Reactions: 13 users

BigDonger101

Founding Member
It's honestly a surprise that shorts are still around 6% when you consider BRN's current share price against its progress over time.

Quite funny. Looks like quite a few sheep on the short train.
 
  • Like
  • Fire
Reactions: 11 users

M_C

Founding Member
It's honestly a surprise that shorts are still around 6% when you consider BRN's current share price against its progress over time.

Quite funny. Looks like quite a few sheep on the short train.
The data is as at the 10th of October, so as long as the trend continues, we should hopefully see some form of reversal.
 
  • Like
  • Fire
Reactions: 7 users

BigDonger101

Founding Member
The data is as at the 10th of October, so as long as the trend continues, we should hopefully see some form of reversal.
I think so.

Last time we were at 18 cents, shorts were non-existent...
 
  • Like
Reactions: 5 users

Esq.111

Fascinatingly Intuitive.
  • Like
  • Love
  • Fire
Reactions: 18 users
I think so.

Last time we were at 18 cents, shorts were non-existent...
Evening Chippers,

I have put a list of short positions over in the short thread.

Start date was 19.8.2022.

Please excuse the handwriting, coffee rings & general ruffled-edge state of the papers.

Regards,
Esq.
Good luck to anyone reading anything apart from here

1697446960281.gif
 
  • Like
Reactions: 2 users

Tothemoon24

Top 20


Information in the public domain suggests Valeo has an early #innovation lead. Its Smart Safety 360 combines interior and exterior #vision with #radar and #ultrasonic sensing. Magna International may be working on a similar solution, as it swallows Veoneer's entire active safety business, and has known partnerships with Mobileye and Seeing Machines.
 
  • Like
  • Fire
Reactions: 10 users

Quiltman

Regular
This post was made by Arijit from TCS Research five months ago, seeking PhD and Masters candidates.
I'm unsure if it was posted on this forum at the time.

Two months after Arijit's post, TCS announced a formal commercial partnership with BrainChip via Tata Elxsi, with a focus on healthcare and industrial (robotics) applications.

Just think about what is being said here ... with the knowledge that it is being done utilising BrainChip IP.

At TCS Research, we specialise in embedding intelligence at the edge through Neuromorphic Computing and Spiking Neural Networks.
Our systems targeted for evolving neuromorphic hardware offer extreme low-power consumption, online learning, and real-time inferencing, ideal for IoT, edge analytics, healthcare, robotics, space-tech & more.


explore new topics, advance ongoing projects

If we can't be bullish about this ... well ... then I am lost for words!
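For anyone new to the jargon in that TCS quote: a spiking neural network is built from neurons that stay silent until accumulated input pushes them past a threshold, which is why spiking hardware can be so power-efficient on sparse data. Here is a toy leaky integrate-and-fire neuron in Python (my own illustrative sketch, nothing to do with TCS or BrainChip code; all parameter values are made up):

Python:
import numpy as np

def lif_neuron(input_current, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Simulate a single leaky integrate-and-fire (LIF) neuron.

    input_current: 1-D array of input drive per timestep.
    Returns an array of 0/1 output spikes. The neuron only produces
    output when its input accumulates past the threshold.
    """
    v = v_reset
    out = np.zeros_like(input_current)
    for t, i_in in enumerate(input_current):
        v += dt * (-(v - v_reset) / tau + i_in)  # leak toward rest, integrate input
        if v >= v_thresh:                        # threshold crossed: emit a spike
            out[t] = 1
            v = v_reset                          # reset membrane potential
    return out

# Example: a short burst of input drives the neuron above threshold.
drive = np.zeros(100)
drive[20:40] = 0.2
print(int(lif_neuron(drive).sum()), "output spikes")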

1697454951055.png
 
  • Like
  • Fire
  • Love
Reactions: 67 users

Tothemoon24

Top 20
IMG_7690.jpeg

💥 [BREAKING] Today, we proudly unveil the GenX320 Metavision® sensor - the smallest 🔍 and most power-efficient event-based vision sensor in the world!
👉 https://bit.ly/3QgQoRY

Built for the next generation of smart consumer devices, GenX320 delivers new levels of intelligence, autonomy, and privacy to a vast range of fast-growing Edge market segments, including AR/VR headsets, wearables, smart camera and monitoring systems, IoT devices, and many more.

The sensor has been developed with a specific focus on the unique energy, compute, and size requirements of Edge AI vision systems. It enables robust, high-speed vision at ultra-low power, even in challenging operating and lighting conditions.

GenX320 key benefits include:
✅ Ultra-fast event timestamping (1 µsec) with flexible data formatting
✅ Smart power management for ultra-low power consumption and wake-on-events (as low as 36µW)
✅ Seamless integration with standard SoCs, reducing external processing
✅ Low-latency connectivity through MIPI or CPI data interfaces
✅ AI-ready with on-chip histogram output for AI accelerators
✅ Sensor-level privacy due to inherently sparse event data and static scene removal
✅ Compatibility with Prophesee Metavision Intelligence software suite

🚀 Learn more about how the GenX320 successfully overcomes current vision sensing limitations in edge applications 👉 https://bit.ly/3QgQoRY
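A quick note on the "on-chip histogram output for AI accelerators" point above: an event camera emits a sparse stream of per-pixel events rather than frames, and a histogram simply bins those events into a frame-like grid that a conventional CNN accelerator can consume. A toy sketch below (my own illustration; the (timestamp, x, y, polarity) tuple layout is an assumption for readability, not the GenX320's actual wire format):

Python:
import numpy as np

# Illustrative only: the real GenX320 wire format differs. For this
# sketch we assume each event is a (timestamp_us, x, y, polarity) tuple.
W = H = 320

def events_to_histogram(events, t_start, t_window_us):
    """Bin events falling in [t_start, t_start + t_window_us) into a
    2-channel (off/on polarity) histogram 'frame' that a conventional
    CNN accelerator could consume."""
    hist = np.zeros((2, H, W), dtype=np.uint16)
    for t, x, y, p in events:
        if t_start <= t < t_start + t_window_us:
            hist[p, y, x] += 1
    return hist

# A moving edge produces a sparse burst of events; a static scene
# produces none, which is where the sensor-level privacy claim comes from.
events = [(100, 10, 10, 1), (105, 11, 10, 1), (110, 12, 10, 0)]
frame = events_to_histogram(events, t_start=0, t_window_us=1000)
print(frame.sum(), "events binned")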
 
  • Like
  • Fire
  • Love
Reactions: 89 users

Frangipani

Regular

Event-based sensor for ‘always-on’ video, low-power apps

New Products | October 16, 2023
By Peter Clarke

Event-based image sensor pioneer Prophesee SA (Paris, France) has launched a low-power 320 pixel by 320 pixel event-based sensor for multiple applications, including ‘always-on’ applications.

The GenX320 is the first of Prophesee’s fifth generation of event-based image sensors and is made at a European foundry that makes image sensors and supports back-side illumination, said Luca Verre, CEO and co-founder of Prophesee. Generations 2 and 3 were fabbed for Prophesee by Tower Semiconductor and Gen4 by Sony, said Verre, but he declined to identify the manufacturer of the GenX320.


The emphasis for the GenX320 is on low power consumption, and it is the world’s smallest and most power-efficient event-based vision sensor, said Verre. This makes it suitable for integration in IoT camera and detection systems, AR/VR headsets, gesture recognition devices and eye-tracking applications.


The fifth-generation Metavision sensor has a die size of 3mm by 4mm, with 6.3-micron pixels in a stacked BSI design and a 1/5-inch optical format.
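A quick back-of-envelope check on those figures (my own arithmetic, assuming a simple square array with no peripheral circuitry):

Python:
pixels = 320
pitch_um = 6.3
side_mm = pixels * pitch_um / 1000   # 320 px * 6.3 um = 2.016 mm per side
diag_mm = side_mm * 2 ** 0.5         # ~2.85 mm array diagonal
print(f"active array ~{side_mm:.2f} mm square, ~{diag_mm:.2f} mm diagonal")
# The ~2 mm array fits comfortably within the quoted 3 mm x 4 mm die,
# and a ~2.9 mm diagonal is in the right ballpark for a 1/5-inch format.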

Specifications

The small size and low power consumption open up numerous edge applications. For people-counting and fall-detection, the low resolution is a virtue, allowing privacy to be maintained, Prophesee said.


Latency is of the order of microseconds for high-precision time-stamping of events, and the nature of event-based detection makes the sensor suitable for high-dynamic-range and low-light applications such as outdoor environments.

Power management modes on-chip reduce power consumption down to 36 microwatts, allowing the image sensor to be an always-on resource that can wake up a system. Deep sleep and standby modes are also featured.
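To make the wake-on-event idea concrete, here is a toy host-side power policy (entirely my own invention; the states, threshold and hysteresis are illustrative, not Prophesee parameters):

Python:
from enum import Enum, auto

class Power(Enum):
    DEEP_SLEEP = auto()  # host SoC fully off
    STANDBY = auto()     # sensor monitoring at ultra-low power
    ACTIVE = auto()      # host woken, full-rate vision pipeline running

def step(state, events_in_window, wake_threshold=50):
    """Toy wake-on-event policy: enough event activity wakes the host;
    a fully static scene drops it back to standby."""
    if state is Power.STANDBY and events_in_window >= wake_threshold:
        return Power.ACTIVE
    if state is Power.ACTIVE and events_in_window == 0:
        return Power.STANDBY
    return state

state = Power.STANDBY
for n in [0, 3, 120, 80, 0, 0]:
    state = step(state, n)
    print(n, state.name)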

MIPI or CPI data output interfaces offer low-latency connectivity to embedded processing platforms, including low-power microcontrollers and modern neuromorphic processor architectures. The sensor also supports histogram output compatible with multiple AI accelerators.

There is native compatibility with Prophesee’s Metavision Intelligence event-based vision software suite.

Early access

Prophesee has sampled the GenX320 to a number of customers who are developing specific use cases.

Zinn Labs is developing gaze tracking systems with a power budget below 20mW. The package size of the GenX320 allows it to be applied to space-constrained head-mounted applications in AR/VR products.

UltraLeap Ltd. is using GenX320 event-based sensors for hand tracking and gesture recognition in its TouchFree interface application.

The GenX320 is available for purchase from Prophesee and its sales partners. It is supported by a complete range of development tools for easy exploration and optimization, including a comprehensive evaluation kit housing a chip-on-board GenX320 module.

Related links and articles:

www.prophesee.ai
 
  • Like
  • Love
  • Fire
Reactions: 61 users

Diogenese

Top 20
View attachment 47222
💥 [BREAKING] Today, we proudly unveil the GenX320 Metavision® sensor - the smallest 🔍 and most power-efficient event-based vision sensor in the world!
👉 https://bit.ly/3QgQoRY

Built for the next generation of smart consumer devices, GenX320 delivers new levels of intelligence, autonomy, and privacy to a vast range of fast-growing Edge market segments, including AR/VR headsets, wearables, smart camera and monitoring systems, IoT devices, and many more.

The sensor has been developed with a specific focus on the unique energy, compute, and size requirements of Edge AI vision systems. It enables robust, high-speed vision at ultra-low power, even in challenging operating and lighting conditions.

GenX320 key benefits include:
✅ Ultra-fast event timestamping (1 µsec) with flexible data formatting
✅ Smart power management for ultra-low power consumption and wake-on-events (as low as 36µW)
✅ Seamless integration with standard SoCs, reducing external processing
✅ Low-latency connectivity through MIPI or CPI data interfaces
✅ AI-ready with on-chip histogram output for AI accelerators
✅ Sensor-level privacy due to inherently sparse event data and static scene removal
✅ Compatibility with Prophesee Metavision Intelligence software suite

🚀 Learn more about how the GenX320 successfully overcomes current vision sensing limitations in edge applications 👉 https://bit.ly/3QgQoRY


Event-Based Metavision® Sensor GENX320 | PROPHESEE


The link leads to Prophesee's early adopters:

Zinn Labs,
ultraleap,
Xperi.

Zinn patent application for eye tracking glasses:


WO2023081297A1 EYE TRACKING SYSTEM FOR DETERMINING USER ACTIVITY


1697460711642.png



1697460889320.png



Embodiments relate to an eye tracking system. A headset of the system includes an eye tracking sensor that captures eye tracking data indicating positions and movements of a user's eye. A controller (e.g., in the headset) of the tracking system analyzes eye tracking data from the sensors to determine eye tracking feature values of the eye during a time period. The controller determines an activity of the user during the time period based on the eye tracking feature values. The controller updates an activity history of the user with the determined activity.

A method comprising: analyzing eye tracking data to determine eye tracking feature values of an eye of a user of a headset during a time period, wherein the eye tracking data is determined from an eye tracking system on the headset; determining an activity of the user during the time period based on the determined eye tracking feature values; and updating an activity history of the user with the determined activity, wherein the feature values include movements of the eye, and determining the activity comprises identifying movements of the eye that correspond to the activity.

In some embodiments, a machine learned model of the activity module 310 is a recurrent neural network (e.g., using a long short-term memory neural network or gated recurrent units) that considers the time-based component of the eye tracking feature values.
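For the curious, the kind of model that last paragraph describes is straightforward to sketch. Below is a minimal GRU-based classifier over a time series of eye tracking feature values (my own illustrative sketch in PyTorch; the feature count and activity labels are invented, and this is not Zinn's actual model):

Python:
import torch
import torch.nn as nn

class EyeActivityClassifier(nn.Module):
    """Sketch of the patent's idea: a recurrent network (here a GRU)
    over a time series of eye tracking feature values (e.g. gaze angle,
    saccade velocity), followed by a classifier over activity labels.
    The feature and label counts are invented for illustration."""
    def __init__(self, n_features=6, hidden=32, n_activities=4):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_activities)

    def forward(self, x):               # x: (batch, time, n_features)
        _, h = self.gru(x)              # h: (1, batch, hidden) final state
        return self.head(h.squeeze(0))  # activity logits per sequence

model = EyeActivityClassifier()
feats = torch.randn(2, 100, 6)          # 2 sequences, 100 timesteps each
print(model(feats).shape)               # torch.Size([2, 4])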
 
  • Like
  • Fire
  • Love
Reactions: 50 users