BRN Discussion Ongoing

Labsy

Regular
They mentioned Sony, Bosch and Intel but not BrainChip… why not BrainChip, if it would be the groundbreaking technology for them? I would not bet on this horse! Still waiting and holding without hyping every article and announcement about AI. DYOR.
For the same reason ARM wasn't mentioned. We sell IP, buddy. The other companies use our IP...
Back to basics, mate.
 
  • Like
  • Fire
Reactions: 20 users

Diogenese

Top 20
I don't think they would infringe on our technology, as they are partners and big fans of AKIDA.

But knowing the above and our need for more exposure, you'd think we would rank a mention in there somewhere?

Very low power envelope is really the only thing we've got to go on?

Their tech is already neuromorphic.
Hi DB,

Prophesee use the term "neuromorphic" to describe the spiking action of their pixels.

I recently read an article which referred to the function of DVS/event cameras as "retinomorphic". I think this is a better term to describe Prophesee's function as it distinguishes imitation of the action of the retina from the imitation of neurons involved in "neuromorphic" functions.

I posted about Zinn's patents yesterday:


Another one of Prophesee's early adopters is Xperi, who use an NN dating from 2015, based on principles from the last millennium.

WO2017129325A1 A CONVOLUTIONAL NEURAL NETWORK 20160129

As indicated above, especially during feature extraction, the convolution engine 32 can process windows of NxM pixels provided by the image cache 31 each clock cycle. In order to produce one output pixel in a given output map, the convolution engine 32 needs: one clock cycle for 2D convolution; or a number of clock cycles equal to the number of input maps for 3D convolutions. The same convolution architecture can be used for feature extraction and classification. For feature classification, each neuron/connection of the fully connected layers will have a different weight, so the number of weights will typically be greater than for convolutional feature extraction, where the same convolutional kernel is applied when calculating each pixel of a new map. Once the image data and weights are available within the image cache 31 and weights cache 37, the convolution engine 32 performs a number of scalar products to produce an output pixel value. It is appreciated that using a more parallelized cache 31 could accelerate the process by, for example, calculating pixels for output maps in parallel, but this would excessively increase the size of the convolution engine 32.
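A rough reading of that paragraph in code (my toy sketch, not Xperi's implementation): one output pixel of a 2D convolution is a single NxM scalar product, while a 3D convolution repeats that scalar product once per input map, which is where the per-input-map clock cycles come from.

```python
import numpy as np

# Toy sketch (not Xperi's implementation): one output pixel of a 2D convolution
# is a single NxM scalar product, while a 3D convolution needs one such scalar
# product per input map, matching the "one clock cycle per input map" above.

def output_pixel_2d(window, kernel):
    # window, kernel: NxM arrays -> one scalar product ("one clock cycle")
    return float(np.sum(window * kernel))

def output_pixel_3d(windows, kernels):
    # one NxM scalar product per input map, accumulated into one output pixel
    return sum(output_pixel_2d(w, k) for w, k in zip(windows, kernels))

rng = np.random.default_rng(0)
n, m, in_maps = 3, 3, 8
wins = [rng.standard_normal((n, m)) for _ in range(in_maps)]
kers = [rng.standard_normal((n, m)) for _ in range(in_maps)]
print(output_pixel_2d(wins[0], kers[0]))  # 2D: 1 "cycle"
print(output_pixel_3d(wins, kers))        # 3D: in_maps "cycles"
```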

They even make a virtue of using more MACs ...

Note that in a 3D convolution, each input channel contributes to each output channel. This means that using this technique, the number of MAC (multiplier-accumulator) operations that can be performed increases quadratically with the number of channels. Thus, by reading pixels from more than one source map (channel) per clock cycle, the throughput of a 3D convolution operation can be increased without unduly complicating write/read operations to the image cache 31; 31-A, 31-B.
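Back-of-the-envelope arithmetic (mine, not the patent's numbers): with a KxK kernel, every output channel accumulates a contribution from every input channel, so the MACs needed per pixel position scale as in_channels × out_channels × K × K, i.e. quadratically when the channel counts grow together.

```python
def macs_per_pixel_position(in_channels: int, out_channels: int, k: int = 3) -> int:
    # MACs needed to fill one (x, y) position across all output maps:
    # every output channel accumulates a k x k dot product from every
    # input channel, so the count scales with in_channels * out_channels.
    return in_channels * out_channels * k * k

# With in_channels == out_channels == C, the count grows as C**2:
for c in (8, 16, 32):
    print(c, macs_per_pixel_position(c, c))  # 576, 2304, 9216
```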

I don't think we should lose much sleep over Xperi, other than to hope they see the light ...
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 23 users

skutza

Regular
Not trying to be an asshole, but just a bit of friendly (non-financial) advice. Just because the share price drops doesn't mean it's a buy. We can wish for all the dot connecting and long-winded posts from knowledgeable people all we like. But one true FACT is that we are here because of a lack of revenue. How much $$ would you (and I) have saved if you'd just not bought at lows and waited? $1? 80c? 50c? I'm happy to wait and wait until I see the company make big $$$ or a real, proper DOT like a signed contract from a big name. If I miss the first 20% waiting, I'm fine with that, but that first 20% will be a lot less than the 50% already lost in the last 2 months.

Food for thought.
 
  • Like
  • Thinking
  • Love
Reactions: 13 users

7für7

Top 20
Rob Telson liked it via LinkedIn, so I suggest he is not clapping for its competitors
Or maybe he is just an AI enthusiast who is passionate about the topic. Or maybe he is just a fair player 🤷🏻‍♂️
 
  • Like
Reactions: 1 user

7für7

Top 20
For the same reason ARM wasn't mentioned. We sell IP, buddy. The other companies use our IP...
Back to basics, mate.
You have insider knowledge, ha? 🤗 OK, I will buy more because you said this now! Thanks
 

Damo4

Regular
Not trying to be an asshole, but just a bit of friendly (non-financial) advice. Just because the share price drops doesn't mean it's a buy. We can wish for all the dot connecting and long-winded posts from knowledgeable people all we like. But one true FACT is that we are here because of a lack of revenue. How much $$ would you (and I) have saved if you'd just not bought at lows and waited? $1? 80c? 50c? I'm happy to wait and wait until I see the company make big $$$ or a real, proper DOT like a signed contract from a big name. If I miss the first 20% waiting, I'm fine with that, but that first 20% will be a lot less than the 50% already lost in the last 2 months.

Food for thought.

That only holds water if it's the company's fault, and the company's alone.
We've seen mention of headwinds, but we also know for a fact that the Edge market is only just being created.
Also, it's a product that needs customers to want to buy it; it's not just about how well we can market it.
We know that everyone who has touched it is still engaged and seemingly pleased; ALL feedback from the engagements we have heard back from has been positive.
BrainChip is creating a new market for Edge and low-power AI, but it's also forcing some companies to create new visions and pivot in directions previously unknown to them, such as Mercedes-Benz creating silicon.

The price is more of an indication of the sophistication and unrealistic expectations of the market IMO
 
  • Like
  • Love
Reactions: 4 users

Labsy

Regular
You have insider knowledge, ha? 🤗 OK, I will buy more because you said this now! Thanks
Well... maybe don't do that... you know what I mean. No, I'm not. But I am hopeful. 😉🫣😵‍💫
 
  • Like
Reactions: 1 user

7für7

Top 20
Well... maybe don't do that... you know what I mean. No, I'm not. But I am hopeful. 😉🫣😵‍💫
I am also hopeful and happy about every step BrainChip is taking, including expanding its ecosystem. But like I said in the past, I'm also an investor and I question everything. It's nice to be informed, but interpreting every article and announcement regarding AI as “it's Akida inside” is kind of childish. Only my opinion ✌️
 
  • Like
Reactions: 9 users

TECH

Regular
Good afternoon,

A couple of nights ago I had a dream about Brainchip (true story).

I was told that there was a shareholder meeting on 9 January, and I remember saying (in the dream) that that doesn't sound right, what about...? Then thinking it must be something to do with a takeover or whatever.

100% the truth. Tech ♦️
 
  • Haha
  • Wow
  • Fire
Reactions: 16 users

skutza

Regular
Good afternoon,

A couple of nights ago I had a dream about Brainchip (true story).

I was told that there was a shareholder meeting on 9 January, and I remember saying (in the dream) that that doesn't sound right, what about...? Then thinking it must be something to do with a takeover or whatever.

100% the truth. Tech ♦️
Well, all I know, Tech, is that I had a dream also, but a nightmare as well. And the funny thing is that I was awake the whole time, and it was like watching a car crash.

I have a favorite number, and I thought it was a lucky number: 4.

So imagine holding 444,444 shares @ 28c and watching it turn into $2.34. You see your super and think, with these added funds I could likely retire very soon. Then, as the daytime nightmare happens, you watch and think: shall I sell? Shall I sell? Shall I... sell...?????

Now, by no means is this anything to do with management or the company in any way. But at the same time it does make you think, WHERE THE FUCK IS MY TIME MACHINE!!!!!

o_Oo_Oo_Oo_Oo_Oo_Oo_Oo_Oo_Oo_O🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣


Sorry if my language is poor. If it is, please replace one word with "hell": where the hell. Thanks in advance.
 
  • Haha
  • Like
  • Fire
Reactions: 15 users

charles2

Regular
Naked short selling is about to take a hit in the US. It causes trillions of dollars of damage to retail shareholders... it can bankrupt companies and/or lead to death-spiral financing.

Will the ASX wake up?

 
  • Like
  • Fire
  • Love
Reactions: 23 users
What a great read for the layman or computer genius.
Every holder of BRN should read this.


If you don't have dreams, you can't have dreams come true!
Thanks for these words of encouragement. As our board is more advanced, perhaps we can get less technical readers interested? Feel free to share the link on other media.
 
  • Like
  • Fire
Reactions: 8 users

Diogenese

Top 20
Good afternoon,

A couple of nights ago I had a dream about Brainchip (true story).

I was told that there was a shareholder meeting on 9 January, and I remember saying (in the dream) that that doesn't sound right, what about...? Then thinking it must be something to do with a takeover or whatever.

100% the truth. Tech ♦️


Plots have I laid, inductions dangerous,

By drunken prophecies, libels and dreams,

To set my brother Clarence and the King

In deadly hate, the one against the other:

And if King Edward be as true and just

As I am subtle, false and treacherous,

This day should Clarence closely be mew’d up,

About a prophecy, which says that ‘G’

Of Edward’s heirs the murderer shall be –

Dive, thoughts, down to my soul: here Clarence comes.
 
  • Love
  • Like
  • Haha
Reactions: 12 users

Gemmax

Regular
Naked short selling is about to take a hit in the US. It causes trillions of dollars of damage to retail shareholders... it can bankrupt companies and/or lead to death-spiral financing.

Will the ASX wake up?

Live in hope, Charles.
 
  • Like
Reactions: 4 users

Mea culpa

prəmɪskjuəs
Plots have I laid, inductions dangerous,

By drunken prophecies, libels and dreams,

To set my brother Clarence and the King

In deadly hate, the one against the other:

And if King Edward be as true and just

As I am subtle, false and treacherous,

This day should Clarence closely be mew’d up,

About a prophecy, which says that ‘G’

Of Edward’s heirs the murderer shall be –

Dive, thoughts, down to my soul: here Clarence comes.
I ‘fess up. I’ve googled twice in the last hour or so for explanations of Dodgy’s posts.

To add to the now-lost count of the number of times I've done so. 🫤
 
  • Haha
  • Like
Reactions: 9 users

robsmark

Regular
That only holds water if it's the company's fault, and the company's alone.
We've seen mention of headwinds, but we also know for a fact that the Edge market is only just being created.
Also, it's a product that needs customers to want to buy it; it's not just about how well we can market it.
We know that everyone who has touched it is still engaged and seemingly pleased; ALL feedback from the engagements we have heard back from has been positive.
BrainChip is creating a new market for Edge and low-power AI, but it's also forcing some companies to create new visions and pivot in directions previously unknown to them, such as Mercedes-Benz creating silicon.

The price is more of an indication of the sophistication and unrealistic expectations of the market IMO

“The price is more of an indication of the sophistication and unrealistic expectations of the market IMO”

I remember Ken Scarince (CFO) saying publicly at the 2020 AGM that the SP should be multiple times higher than it was at the time.

So the company holds some accountability here, like it or not.
 
  • Like
  • Fire
  • Wow
Reactions: 21 users

Diogenese

Top 20
Modern neuromorphic processor architectures...PLURAL...Hmmm?????



This Tiny Sensor Could Be in Your Next Headset

Prophesee
[Image: Prophesee Event-Based Metavision GenX320 bare die]

Neuromorphic computing company develops event-based vision sensor for edge AI apps.
Spencer Chin | Oct 16, 2023


As edge-based artificial intelligence (AI) applications become more common, there will be a greater need for sensors that can meet the power and environmental needs of edge hardware. Prophesee SA, which supplies advanced neuromorphic vision systems, has introduced an event-based vision sensor for integration into ultra-low-power edge AI vision devices. The GenX320 Metavision sensor, which uses a tiny 3x4mm die, leverages the company’s technology platform into growing intelligent edge market segments, including AR/VR headsets, security and monitoring/detection systems, touchless displays, eye tracking features, and always-on intelligent IoT devices.

According to Luca Verre, CEO and co-founder of Prophesee, the concept of event-based vision has been researched for years, but developing a viable commercial implementation in a sensor-like device has only happened relatively recently. “Prophesee has used a combination of expertise and innovative developments around neuromorphic computing, VLSI design, AI algorithm development, and CMOS image sensing,” said Verre in an e-mail interview with Design News. “Together, those skills and advancements, along with critical partnerships with companies like Sony, Intel, Bosch, Xiaomi, Qualcomm, 🤔 and others 🤔 have enabled us to optimize a design for the performance, power, size, and cost requirements of various markets.”

Prophesee’s vision sensor is a 320x320, 6.3μm pixel BSI stacked event-based vision sensor that offers a tiny 1/5-in. optical format. Verre said, “The explicit goal was to improve integrability and usability in embedded at-the-edge vision systems, which in addition to size and power improvements, means the design must address the challenge of event-based vision’s unconventional data format, nonconstant data rates, and non-standard interfaces to make it more usable for a wider range of applications. We have done that with multiple integrated event data pre-processing, filtering, and formatting functions to minimize external processing overhead.” [A sketch of what such an event record typically looks like follows the article.]

Verre added, “In addition, MIPI or CPI data output interfaces offer low-latency connectivity to embedded processing platforms, including low-power microcontrollers and modern neuromorphic processor architectures.”

Low-Power Operation

According to Verre, the GenX320 sensor has been optimized for low-power operation, featuring a hierarchy of power modes and application-specific modes of operation. On-chip power management further improves sensor flexibility and integrability. To meet aggressive size and cost requirements, the chip is fabricated using a CMOS stacked process with pixel-level Cu-Cu bonding interconnects achieving a 6.3μm pixel-pitch.
The sensor performs low-latency, µsec-resolution timestamping of events with flexible data formatting. On-chip intelligent power management modes reduce power consumption to a low 36 µW and enable smart wake-on-events. Deep sleep and standby modes are also featured.

According to Prophesee, the sensor is designed to be easily integrated with standard SoCs with multiple combined event data pre-processing, filtering, and formatting functions to minimize external processing overhead. MIPI or CPI data output interfaces offer low-latency connectivity to embedded processing platforms, including low-power microcontrollers and modern neuromorphic processor architectures.

Prophesee’s Verre expects the sensor to find applications in AR/VR headsets. “We are solving an important issue in our ability to efficiently (i.e. low power/low heat) support foveated rendering in eye tracking for a more realistic, immersive experience. Meta has discussed publicly the use of event-based vision technology, and we are actively involved with our partner Zinn Labs in this area. XPERI has already developed a driver monitor system (DMS) proof of concept based on our previous generation sensor for gaze monitoring and we are working with them on a next-gen solution using GenX320 for both automotive and other potential uses, including micro expression monitoring. The market for gesture and motion detection is very large, and our partner Ultraleap has demonstrated a working prototype of a touch-free display using our solution.”

The sensor incorporates an on-chip histogram output compatible with multiple AI accelerators. The sensor is also natively compatible with Prophesee Metavision Intelligence, an open-source event-based vision software suite that is used by a community of over 10,000 users.

Prophesee will support the GenX320 with a complete range of development tools for easy exploration and optimization, including a comprehensive Evaluation Kit housing a chip-on-board (COB) GenX320 module, or a compact optical flex module. In addition, Prophesee will offer a range of adapter kits that enable seamless connectivity to a large range of embedded platforms, such as an STM32 MCU, speeding time-to-market.

Spencer Chin is a Senior Editor for Design News covering the electronics beat. He has many years of experience covering developments in components, semiconductors, subsystems, power, and other facets of electronics from both a business/supply-chain and technology perspective. He can be reached at Spencer.Chin@informa.com.
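For anyone wondering what the article means by event-based vision's "unconventional data format": event cameras generally emit a sparse stream of per-pixel change events rather than frames. A minimal sketch, assuming a generic AER-style (x, y, polarity, timestamp) record rather than Prophesee's actual GenX320 output format:

```python
from dataclasses import dataclass
from typing import List

# Illustrative only: a generic AER-style event record as produced by DVS/event
# cameras in general. Field names and layout are assumptions for illustration,
# not Prophesee's actual GenX320 output format.

@dataclass
class Event:
    x: int         # pixel column (0..319 on a 320x320 sensor)
    y: int         # pixel row
    polarity: int  # +1 brightness increased, -1 brightness decreased
    t_us: int      # timestamp, microseconds

def events_in_window(events: List[Event], t0_us: int, t1_us: int) -> List[Event]:
    # Data arrives as a sparse, variable-rate stream rather than fixed frames,
    # so downstream processing typically slices it by time window.
    return [e for e in events if t0_us <= e.t_us < t1_us]

stream = [Event(12, 40, +1, 1000), Event(13, 40, -1, 1450), Event(200, 310, +1, 2100)]
print(events_in_window(stream, 0, 2000))  # first two events fall in this window
```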


Hi Bravo,

This reference to foveated eye tracking is interesting, particularly as Luminar, which is reported to be taking a significant part of Mercedes' lidar business in a couple of years, uses foveated lidar.

Foveated refers to the difference between central eye vision and peripheral vision. In lidar, this means that the laser spot density is increased for points of interest. I think Luminar do this by increasing the frequency of transmitting laser pulses.
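A toy sketch of that idea (my guess at the mechanism, not Luminar's implementation): keep a base pulse rate across the scan and boost it inside a region of interest, so point density rises where it matters.

```python
# Toy sketch of foveated lidar scheduling (my reading of the idea, not
# Luminar's implementation): pulses aimed inside the region of interest get a
# higher repetition rate, i.e. denser points, than the periphery.

def pulse_rate_hz(azimuth_deg: float, roi_centre_deg: float,
                  roi_half_width_deg: float = 5.0,
                  base_rate_hz: float = 100_000.0,
                  foveal_boost: float = 4.0) -> float:
    inside_roi = abs(azimuth_deg - roi_centre_deg) <= roi_half_width_deg
    return base_rate_hz * foveal_boost if inside_roi else base_rate_hz

# Example: the model says the driver is looking 10 degrees left of centre.
for az in (-20, -12, -10, -8, 0, 15):
    print(az, pulse_rate_hz(az, roi_centre_deg=-10.0))
```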

Prophesee’s Verre expects the sensor to find applications in AR/VR headsets. “We are solving an important issue in our ability to efficiently (i.e. low power/low heat) support foveated rendering in eye tracking for a more realistic, immersive experience. Meta has discussed publicly the use of event-based vision technology, and we are actively involved with our partner Zinn Labs in this area.

I don't know who, if anyone, has the controlling patents for foveated lidar, but Luminar does have some patents:

US2018284234A1 Foveated Imaging in a Lidar System

To identify the most important areas in front of a vehicle for avoiding collisions, a lidar system obtains a foveated imaging model. The foveated imaging model is generated by detecting the direction in which drivers are facing at various points in time for several scenarios based on road conditions or upcoming maneuvers. The lidar system identifies an upcoming maneuver for the vehicle or a road condition and applies the identified maneuver or road condition to the foveated imaging model to identify a region of a field of regard at which to increase the resolution. The lidar system then increases the resolution at the identified region by increasing the pulse rate for transmitting light pulses within the identified region, filtering pixels outside of the identified region, or in any other suitable manner.

The Zinn patent

WO2023081297A1 EYE TRACKING SYSTEM FOR DETERMINING USER ACTIVITY 20211105

refers to a "differential camera" which I guess is a DVS, which is where Prophesee would come in.

Zinn uses an NN with a machine-learned model trained to identify various optical activities: reading, mobile phone use, social media use, ...


Another Zinn patent application

US2023195220A1 EYE TRACKING SYSTEM WITH OFF-AXIS LIGHT SOURCES 20201217 uses an NN to detect the pupil position and uses this to judge the focus distance and adjust the focal length of a vari-focus lens.
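A rough sketch of one way pupil positions could map to a focus distance, via binocular vergence (an assumption on my part; the patent's actual method may differ):

```python
import math

# Rough sketch of the principle (my assumption, not necessarily the patent's
# method): if an NN reports where each pupil is pointing, the binocular
# vergence angle gives a focus distance the vari-focus lens can be driven to.

def focus_distance_m(vergence_deg: float, ipd_m: float = 0.063) -> float:
    # Distance at which the two gaze rays converge, from the vergence angle
    # and an assumed interpupillary distance (ipd).
    half_angle = math.radians(vergence_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half_angle)

for v_deg in (1.0, 2.0, 6.0):
    print(f"vergence {v_deg:>3} deg -> focus ~{focus_distance_m(v_deg):.2f} m")
```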

I couldn't find anything to show Zinn roll their own NNs.
 
  • Like
  • Fire
  • Love
Reactions: 15 users

Diogenese

Top 20
I ‘fess up. I’ve googled twice in the last hour or so for explanations of Dodgy’s posts.

To add to the now-lost count of the number of times I've done so. 🫤
Richard the Turd.

(Pardon my Irish accent).
 
  • Haha
  • Like
Reactions: 9 users

wilzy123

Founding Member
“The price is more of an indication of the sophistication and unrealistic expectations of the market IMO”

I remember Ken Scarince (CFO) saying publicly at the 2020 AGM that the SP should be multiple times higher than it was at the time.

So the company holds some accountability here, like it or not.
OK
[shush GIF]
 
  • Haha
Reactions: 5 users