BRN Discussion Ongoing

Damo4

Regular
Not trying to be an asshole, but just a bit of friendly (non-financial) advice. Just because the share price drops doesn't mean it's a buy. We can wish for all the dot-connecting and long-winded posts from knowledgeable people all we like. But one true FACT is that we are here because of a lack of revenue. How much $$ would you (and I) have saved if you'd just not bought at the lows and waited? $1? 80c? 50c? I'm happy to wait and wait until I see the company make big $$$ or a real, proper DOT like a signed contract from a big name. If I miss the first 20% waiting, I'm fine with that, but that first 20% will be a lot less than the 50% already lost in the last two months.

Food for thought.

That only holds water if it's the company's fault, and the company's alone.
We've seen mention of headwinds, but we also know for a fact that the Edge market is only just being created.
It's also a product that customers need to want to buy, not just one we market well.
We know that everyone who has touched it is still engaged and seemingly pleased; ALL the feedback we've heard back from engagements has been positive.
Brainchip is creating a new market for Edge and low-power AI, but it's also forcing some companies to create new visions and pivot in directions previously unknown to them, such as Mercedes-Benz creating silicon.

The price is more of an indication of the sophistication and unrealistic expectations of the market IMO
 
  • Like
  • Love
Reactions: 4 users

Labsy

Regular
You have insider knowledge, ha? 🤗 OK, I will buy more because you said this now! Thanks
Well... maybe don't do that... you know what I mean. No, I'm not. But I am hopeful. 😉🫣😵‍💫
 
  • Like
Reactions: 1 users

7für7

Top 20
Well... maybe don't do that... you know what I mean. No, I'm not. But I am hopeful. 😉🫣😵‍💫
I am also hopeful and happy about every step Brainchip is taking, including expanding its ecosystem. But like I said in the past, I'm also an investor and question everything. It's nice to be informed, but interpreting every article and announcement regarding AI as "it's Akida inside" is kind of childish. Only my opinion ✌️
 
  • Like
Reactions: 9 users

TECH

Regular
Good afternoon,

A couple of nights ago I had a dream about Brainchip (true story).

I was told that there was a shareholder meeting on 9 January, and I remember saying (in the dream) that that doesn't sound right, what about...? Then thinking it must be something to do with a takeover or whatever.

100% the truth. Tech ♦️
 
  • Haha
  • Wow
  • Fire
Reactions: 16 users

skutza

Regular
Good afternoon,

A couple of nights ago I had a dream about Brainchip (true story).

I was told that there was a shareholder meeting on 9 January, and I remember saying (in the dream) that that doesn't sound right, what about...? Then thinking it must be something to do with a takeover or whatever.

100% the truth. Tech ♦️
Well all I know Tech is that I had a dream also. But a nightmare as well. And the funny thing is that I was awake the whole time and it was like watching a car crash.

I have a favorite number, and I thought it was a lucky number: 4.

So imagine holding 444,444 shares @28c and it turns into $2.34. You see your super and think, these added funds, I could likely retire very soon. Then as the daytime nightmare happens you watch and think, shall I sell? shall I sell, shall I .....sell....?????

Now by no means is this anything to do with management or the company in any way. But at the same time it does make you think, WHERE THE FUCK IS MY TIME MACHINE!!!!!

o_Oo_Oo_Oo_Oo_Oo_Oo_Oo_Oo_Oo_O🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣


Sorry if my language is poor. If it is, please replace one word with "Hell": where the hell. Thanks in advance.
 
  • Haha
  • Like
  • Fire
Reactions: 15 users

charles2

Regular
Naked short selling about to take a hit in the US. Trillions of dollars of damage to retail shareholders....can bankrupt companies and/or lead to death spiral financing.

Will the ASX wake up?

 
  • Like
  • Fire
  • Love
Reactions: 23 users
What a great read for the layman or computer genius.
Every holder of BRN should read this.


If you don't have dreams, you can't have dreams come true!
Thanks for these words of encouragement. As our board is more advanced, perhaps we can get less technical readers interested? Feel free to share the link on other media.
 
  • Like
  • Fire
Reactions: 8 users

Diogenese

Top 20
Good afternoon,

A couple of nights ago I had a dream about Brainchip (true story).

I was told that there was a shareholder meeting on 9 January, and I remember saying (in the dream) that that doesn't sound right, what about...? Then thinking it must be something to do with a takeover or whatever.

100% the truth. Tech ♦️


Plots have I laid, inductions dangerous,

By drunken prophecies, libels and dreams,

To set my brother Clarence and the King

In deadly hate, the one against the other:

And if King Edward be as true and just

As I am subtle, false and treacherous,

This day should Clarence closely be mew’d up,

About a prophecy, which says that ‘G’

Of Edward’s heirs the murderer shall be –

Dive, thoughts, down to my soul: here Clarence comes.
 
  • Love
  • Like
  • Haha
Reactions: 12 users

Gemmax

Regular
Naked short selling about to take a hit in the US. Trillions of dollars of damage to retail shareholders....can bankrupt companies and/or lead to death spiral financing.

Will the ASX wake up?

Live in hope Charles.
 
  • Like
Reactions: 4 users

Mea culpa

prəmɪskjuəs
Plots have I laid, inductions dangerous,

By drunken prophecies, libels and dreams,

To set my brother Clarence and the King

In deadly hate, the one against the other:

And if King Edward be as true and just

As I am subtle, false and treacherous,

This day should Clarence closely be mew’d up,

About a prophecy, which says that ‘G’

Of Edward’s heirs the murderer shall be –

Dive, thoughts, down to my soul: here Clarence comes.
I ‘fess up. I’ve googled twice in the last hour or so for explanations of Dodgy’s posts.

To add to the now lost count number of times I’ve done so. 🫤
 
  • Haha
  • Like
Reactions: 9 users

robsmark

Regular
That only holds water if it's the company's fault, and the company's alone.
We've seen mention of headwinds, but we also know for a fact that the Edge market is only just being created.
It's also a product that customers need to want to buy, not just one we market well.
We know that everyone who has touched it is still engaged and seemingly pleased; ALL the feedback we've heard back from engagements has been positive.
Brainchip is creating a new market for Edge and low-power AI, but it's also forcing some companies to create new visions and pivot in directions previously unknown to them, such as Mercedes-Benz creating silicon.

The price is more of an indication of the sophistication and unrealistic expectations of the market IMO

“The price is more of an indication of the sophistication and unrealistic expectations of the market IMO”

I remember Ken Scarince - CFO, saying publicly at the 2020 AGM that the SP should be multiple times higher than what it currently was.

So the company holds some accountability here, like it or not.
 
  • Like
  • Fire
  • Wow
Reactions: 21 users

Diogenese

Top 20
Modern neuromorphic processor architectures...PLURAL...Hmmm?????



This Tiny Sensor Could Be in Your Next Headset​

Prophesee
[Image: Prophesee Event-Based Metavision GenX320 bare die]

Neuromorphic computing company develops event-based vision sensor for edge AI apps.
Spencer Chin | Oct 16, 2023


As edge-based artificial intelligence (AI) applications become more common, there will be a greater need for sensors that can meet the power and environmental needs of edge hardware. Prophesee SA, which supplies advanced neuromorphic vision systems, has introduced an event-based vision sensor for integration into ultra-low-power edge AI vision devices. The GenX320 Metavision sensor, which uses a tiny 3x4mm die, leverages the company’s technology platform into growing intelligent edge market segments, including AR/VR headsets, security and monitoring/detection systems, touchless displays, eye tracking features, and always-on intelligent IoT devices.

According to Luca Verre, CEO and co-founder of Prophesee, the concept of event-based vision has been researched for years, but developing a viable commercial implementation in a sensor-like device has only happened relatively recently. “Prophesee has used a combination of expertise and innovative developments around neuromorphic computing, VLSI design, AI algorithm development, and CMOS image sensing,” said Verre in an e-mail interview with Design News. “Together, those skills and advancements, along with critical partnerships with companies like Sony, Intel, Bosch, Xiaomi, Qualcomm, 🤔and others 🤔have enabled us to optimize a design for the performance, power, size, and cost requirements of various markets.”

Prophesee’s vision sensor is a 320x320, 6.3μm pixel BSI stacked event-based vision sensor that offers a tiny 1/5-in. optical format. Verre said, “The explicit goal was to improve integrability and usability in embedded at-the-edge vision systems, which in addition to size and power improvements, means the design must address the challenge of event-based vision’s unconventional data format, nonconstant data rates, and non-standard interfaces to make it more usable for a wider range of applications. We have done that with multiple integrated event data pre-processing, filtering, and formatting functions to minimize external processing overhead.”
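For anyone wondering what that "unconventional data format" actually looks like: instead of frames, an event sensor emits a sparse stream of (timestamp, x, y, polarity) tuples, and one common pre-processing step is accumulating them into a 2D histogram a conventional accelerator can digest. A rough Python sketch — the tuple layout and histogram step are illustrative only, not Prophesee's real format:

```python
from collections import namedtuple

# Illustrative event tuple: real vendor formats differ.
Event = namedtuple("Event", ["t_us", "x", "y", "polarity"])

W = H = 320  # GenX320 resolution per the article

def events_to_histogram(events, w=W, h=H):
    """Accumulate sparse events into a 2D count histogram, one common
    pre-processing step that makes event data frame-like for a
    conventional accelerator."""
    hist = [[0] * w for _ in range(h)]
    for e in events:
        if 0 <= e.x < w and 0 <= e.y < h:
            hist[e.y][e.x] += 1
    return hist

events = [Event(10, 5, 7, 1), Event(12, 5, 7, -1), Event(15, 100, 200, 1)]
hist = events_to_histogram(events)
print(hist[7][5])  # prints 2: two events landed at pixel (5, 7)
```

Note there is no fixed frame rate anywhere: a quiet scene produces almost no events, which is where the low-power numbers come from.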

Verre added, “In addition, MIPI or CPI data output interfaces offer low-latency connectivity to embedded processing platforms, including low-power microcontrollers and modern neuromorphic processor architectures.

Low-Power Operation

According to Verre, the GenX320 sensor has been optimized for low-power operation, featuring a hierarchy of power modes and application-specific modes of operation. On-chip power management further improves sensor flexibility and integrability. To meet aggressive size and cost requirements, the chip is fabricated using a CMOS stacked process with pixel-level Cu-Cu bonding interconnects achieving a 6.3μm pixel-pitch.
The sensor performs low-latency, µs-resolution timestamping of events with flexible data formatting. On-chip intelligent power management modes reduce power consumption to a low 36 µW and enable smart wake-on-events. Deep sleep and standby modes are also featured.

According to Prophesee, the sensor is designed to be easily integrated with standard SoCs with multiple combined event data pre-processing, filtering, and formatting functions to minimize external processing overhead. MIPI or CPI data output interfaces offer low-latency connectivity to embedded processing platforms, including low-power microcontrollers and modern neuromorphic processor architectures.

Prophesee’s Verre expects the sensor to find applications in AR/VR headsets. “We are solving an important issue in our ability to efficiently (i.e. low power/low heat) support foveated rendering in eye tracking for a more realistic, immersive experience. Meta has discussed publicly the use of event-based vision technology, and we are actively involved with our partner Zinn Labs in this area. XPERI has already developed a driver monitor system (DMS) proof of concept based on our previous generation sensor for gaze monitoring and we are working with them on a next-gen solution using GenX320 for both automotive and other potential uses, including micro expression monitoring. The market for gesture and motion detection is very large, and our partner Ultraleap has demonstrated a working prototype of a touch-free display using our solution.”

The sensor incorporates an on-chip histogram output compatible with multiple AI accelerators. The sensor is also natively compatible with Prophesee Metavision Intelligence, an open-source event-based vision software suite that is used by a community of over 10,000 users.

Prophesee will support the GenX320 with a complete range of development tools for easy exploration and optimization, including a comprehensive Evaluation Kit housing a chip-on-board (COB) GenX320 module, or a compact optical flex module. In addition, Prophesee will offer a range of adapter kits that enable seamless connectivity to a large range of embedded platforms, such as an STM32 MCU, speeding time-to-market.

Spencer Chin is a Senior Editor for Design News covering the electronics beat. He has many years of experience covering developments in components, semiconductors, subsystems, power, and other facets of electronics from both a business/supply-chain and technology perspective. He can be reached at Spencer.Chin@informa.com.


Hi Bravo,

This reference to foveated eye tracking is interesting, particularly as Luminar, who, it has been reported, will take a significant part of Mercedes' lidar business in a couple of years, uses foveated lidar.

Foveated refers to the difference between central eye vision and peripheral vision. In lidar, this means that the laser spot density is increased for points of interest. I think Luminar do this by increasing the frequency of transmitting laser pulses.
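To make the foveation idea concrete, here's a toy sketch of a scanner that uses a finer angular step (i.e. more laser pulses per degree) inside a region of interest. All numbers are invented for illustration; this is just the general idea, not Luminar's actual method:

```python
# Illustrative foveated scan: finer angular step (higher pulse density)
# inside a region of interest, coarser step elsewhere.
def foveated_scan_angles(fov_deg=120.0, roi=(50.0, 70.0),
                         coarse_step=2.0, fine_step=0.5):
    """Return scan angles across the field of view, stepping more
    finely inside the ROI."""
    angles = []
    a = 0.0
    while a < fov_deg:
        angles.append(a)
        a += fine_step if roi[0] <= a < roi[1] else coarse_step
    return angles

angles = foveated_scan_angles()
in_roi = [a for a in angles if 50.0 <= a < 70.0]
# 0.5-degree steps inside the ROI vs 2.0-degree steps outside:
# 4x the sample density where it matters, for a fraction of the
# pulse budget a uniformly fine scan would need.
```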

Prophesee’s Verre expects the sensor to find applications in AR/VR headsets. “We are solving an important issue in our ability to efficiently (i.e. low power/low heat) support foveated rendering in eye tracking for a more realistic, immersive experience. Meta has discussed publicly the use of event-based vision technology, and we are actively involved with our partner Zinn Labs in this area.

I don't know who, if anyone, has the controlling patents for foveated lidar, but Luminar does have some patents:

US2018284234A1 Foveated Imaging in a Lidar System



To identify the most important areas in front of a vehicle for avoiding collisions, a lidar system obtains a foveated imaging model. The foveated imaging model is generated by detecting the direction the drivers are facing at various points in time for several scenarios based on road conditions or upcoming maneuvers. The lidar system identifies an upcoming maneuver for the vehicle or a road condition and applies the identified maneuver or road condition to the foveated imaging model to identify a region of a field of regard at which to increase the resolution. The lidar system then increases the resolution at the identified region by increasing the pulse rate for transmitting light pulses within the identified region, filtering pixels outside of the identified region, or in any other suitable manner.

The Zinn patent

WO2023081297A1 EYE TRACKING SYSTEM FOR DETERMINING USER ACTIVITY 20211105

refers to a "differential camera" which I guess is a DVS, which is where Prophesee would come in.

ZINN uses a NN with a machine learned model trained to identify various optical activities, reading, mobile phone use, social media use, ...


Another Zinn patent application

US2023195220A1 EYE TRACKING SYSTEM WITH OFF-AXIS LIGHT SOURCES 20201217 uses a NN to detect the pupil position and uses this to judge the focus distance and adjust the focal length of a vari-focus lens.


I couldn't find anything to show Zinn roll their own NNs.
 
  • Like
  • Fire
  • Love
Reactions: 15 users

Diogenese

Top 20
I ‘fess up. I’ve googled twice in the last hour or so for explanations of Dodgy’s posts.

To add to the now lost count number of times I’ve done so. 🫤
Richard the Turd.

(Pardon my Irish accent).
 
  • Haha
  • Like
Reactions: 9 users

wilzy123

Founding Member
“The price is more of an indication of the sophistication and unrealistic expectations of the market IMO”

I remember Ken Scarince - CFO, saying publicly at the 2020 AGM that the SP should be multiple times higher than what it currently was.

So the company holds some accountability here, like it or not.
OK
(shush GIF)
 
  • Haha
Reactions: 5 users
It's honestly a surprise shorts are still around 6% when you consider BRN's share price against its progress over time.

Quite funny. Looks like quite a few sheep on the short train.
6% is the available short shares on offer in the system, not actually how many shares have to be covered on market.

There could be as little as 1-1.5% of those actively sold on market that need to be covered.
 
  • Like
Reactions: 2 users

M_C

Founding Member


(image attachment)
 
  • Like
  • Fire
  • Love
Reactions: 30 users
Well all I know Tech is that I had a dream also. But a nightmare as well. And the funny thing is that I was awake the whole time and it was like watching a car crash.

I have a favorite number, and I thought it was a lucky number: 4.

So imagine holding 444,444 shares @28c and it turns into $2.34. You see your super and think, these added funds, I could likely retire very soon. Then as the daytime nightmare happens you watch and think, shall I sell? shall I sell, shall I .....sell....?????

Now by no means is this anything to do with management or the company in any way. But at the same time it does make you think, WHERE THE FUCK IS MY TIME MACHINE!!!!!

o_Oo_Oo_Oo_Oo_Oo_Oo_Oo_Oo_Oo_O🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣


Sorry if my language is poor. If it is please replace one word with Hell, where the hell. Thanks in advance,
With that sentiment, the bottom could be in..
 

Diogenese

Top 20




EP4181081A1 ENERGY EFFICIENT HIERARCHICAL SNN ARCHITECTURE FOR CLASSIFICATION AND SEGMENTATION OF HIGH-RESOLUTION IMAGES 20211116




State-of-the-art techniques rely on FPGA-based approaches when power efficiency is a concern. However, compared to an SNN on neuromorphic hardware, an ANN on an FPGA requires higher power and longer design cycles to deploy a neural network on hardware accelerators. Embodiments of the present disclosure provide a method and system for an energy-efficient hierarchical multi-stage SNN architecture for classification and segmentation of high-resolution images. A patch-to-patch-class classification approach is used, where the image is divided into smaller patches and classified at the first stage into multiple labels based on the percentage coverage of a parameter of interest, for example cloud coverage in satellite images. The image portion corresponding to the partially covered patches is divided into further, smaller patches and classified by a binary classifier at the second level of classification. Labels across multiple SNN classifier levels are aggregated to identify the segmentation map of the input image in accordance with the coverage parameter of interest.
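The patch-to-patch-class idea is easy to sketch: label coarse patches as fully / not / partially covered, and recurse only into the partial ones with a finer second-stage classifier. A toy Python version, with a simple coverage threshold standing in for the patent's SNN classifiers:

```python
# Toy hierarchical patch classification. A coverage threshold stands
# in for the SNN classifiers described in the patent abstract.
def classify(patch):
    cov = sum(map(sum, patch)) / (len(patch) * len(patch[0]))
    if cov >= 0.9:
        return "full"
    if cov <= 0.1:
        return "empty"
    return "partial"

def split(img, n):
    """Split a square image (list of lists) into an n x n grid of patches."""
    s = len(img) // n
    return [[[row[c * s:(c + 1) * s] for row in img[r * s:(r + 1) * s]]
             for c in range(n)] for r in range(n)]

def hierarchical_labels(img):
    """Stage 1: label coarse patches full/empty/partial.
    Stage 2: re-split only the partial patches and label the
    finer sub-patches."""
    labels = {}
    for r, row in enumerate(split(img, 2)):
        for c, patch in enumerate(row):
            lab = classify(patch)
            if lab != "partial":
                labels[(r, c)] = lab
            else:
                for r2, row2 in enumerate(split(patch, 2)):
                    for c2, sub in enumerate(row2):
                        labels[(r, c, r2, c2)] = classify(sub)
    return labels

# 8x8 toy "image": leftmost quarter covered
img = [[1 if c < 2 else 0 for c in range(8)] for r in range(8)]
labels = hierarchical_labels(img)
```

The energy win is that the expensive fine-grained pass only runs on the ambiguous patches, not the whole image.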

This is similar to Akida ViT:

Vision Transformers (ViTs) - Their Popularity And Unique Architecture - BrainChip


In a Vision Transformer, an image is first divided into patches, which are then flattened and fed into a multi-layer transformer network. The self-attention mechanism allows the model to attend to different parts of the image at different scales, enabling it to simultaneously capture global and local features. The transformer’s output is passed through a final classification layer to obtain the predicted class label.
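The ViT front end described there is simple to sketch: cut the image into fixed-size patches and flatten each into a vector (in a real ViT these vectors are then linearly projected into token embeddings). Toy Python, illustrative only:

```python
# Toy ViT front end: cut an image into fixed-size patches and flatten
# each patch into a vector, in raster order. A real ViT then linearly
# projects these vectors into token embeddings for the transformer.
def image_to_patch_tokens(img, patch=4):
    h, w = len(img), len(img[0])
    tokens = []
    for r in range(0, h, patch):
        for c in range(0, w, patch):
            tokens.append([img[r + i][c + j]
                           for i in range(patch)
                           for j in range(patch)])
    return tokens

img = [[r * 8 + c for c in range(8)] for r in range(8)]
tokens = image_to_patch_tokens(img)
# 8x8 image, 4x4 patches -> 4 tokens, each a 16-element vector
```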

 
  • Like
  • Fire
  • Love
Reactions: 37 users