BRN Discussion Ongoing

JK200SX

Regular
I read elsewhere that BlackRock dumped their semiconductor shares today, so maybe that's the reason for the drop

I don't think they dumped any shares today?


 
  • Like
Reactions: 5 users

Sam

Nothing changes if nothing changes
I'm locked & loaded until at least 2027... so I view this short attack not with alarm, but with interest.

So why the short attack today... I assume the perpetrators decided that this was the week to drive the SP down as low as possible and cover their position. Is it not positive, I wonder, that they have decided "now or never"?

What's the catalyst for action this week?

Is it the annual report ... perhaps they're worried it will be better than expected?
Is it the fireside chat ... will this be the SH update we have been longing for?
Is it inside info that Valeo, or another "partner", is about to declare their hand?

Time to capitalise before good news hits?

Let's see ....
Yes….. let’s see 👍 Not sure in my 5 years I’ve seen anything like this….. unreal
 
  • Like
  • Sad
Reactions: 4 users

BaconLover

Founding Member
I don't think they dumped any shares today?


View attachment 29669
I think the seller was Warren Buffett. Looks like he sold out of 85% of his position; he must've seen a red flag.
No idea where he directed his money though, as most of the rest of the semiconductor sector held up well.
 
  • Like
  • Thinking
Reactions: 6 users

Tothemoon24

Top 20
Wow 🤩

This is uplifting


Why you will be seeing much more from event cameras
14 February 2023

February/March 2023
Advances in sensors that capture images like real eyes, plus in the software and hardware to process them, are bringing a paradigm shift in imaging, finds Andrei Mihai



The field of neuromorphic vision, where electronic cameras mimic the biological eye, has been around for some 30 years. Neuromorphic cameras (also called event cameras) mimic the function of the retina, the part of the eye that contains light-sensitive cells. This is a fundamental change from conventional cameras – and why applications for event cameras for industry and research are also different.


Conventional cameras are built for capturing images and visually reproducing them. They capture the field of vision by snapping frames at predefined intervals, regardless of how the image is changing. These frame-based cameras work excellently for their purpose, but they are not optimised for sensing or machine vision. They capture a great deal of information but, from a sensing perspective, much of it is useless because it is not changing.

Event cameras suppress this redundancy and have fundamental benefits in terms of efficiency, speed, and dynamic range. Event-based vision sensors can achieve better speed versus power consumption trade-off by up to three orders of magnitude. By relying on a different way of acquiring information compared with a conventional camera, they also address applications in the field of machine vision and AI.



Event camera systems can quickly and efficiently monitor particle size and movement


“Essentially, what we’re bringing to the table is a new approach to sensing information, very different to conventional cameras that have been around for many years,” says Luca Verre, CEO of Prophesee, the market leader in the field.

Whereas most commercial cameras are essentially optimised to produce attractive images, the needs of the automotive, industrial, Internet of Things (IoT) industries, and even some consumer products, often demand different performance. If you are monitoring change, for instance, as much as 90% of the scene is useless information because it does not change. Event cameras bypass that, as they only report when light goes up or down by a certain relative amount, which produces a so-called “change event”.

In modern neuromorphic cameras, each pixel of the sensor works independently (asynchronously) and records continuously, so there is no downtime, even when you go down to microseconds. Also, since they only monitor changing data, they do not monitor redundant data. This is one of the key aspects driving the field forward.
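A toy model makes the contrast with frame cameras concrete: only pixels whose (log) intensity changes beyond a threshold emit anything, so a static scene produces no data at all. The sketch below is illustrative only; the function name, threshold value and frame sizes are made up, and real event sensors operate per-pixel in continuous time rather than between discrete frames:

```python
import numpy as np

def frames_to_events(prev, curr, threshold=0.15):
    """Toy event sensor: emit an event wherever the log-intensity of a
    pixel changes by more than `threshold`. Returns (row, col, polarity)
    triples; unchanged pixels emit nothing at all."""
    # Work in log space, since real event pixels respond to relative change.
    d = np.log1p(curr.astype(float)) - np.log1p(prev.astype(float))
    rows, cols = np.nonzero(np.abs(d) > threshold)
    polarity = np.sign(d[rows, cols]).astype(int)  # +1 brighter, -1 darker
    return list(zip(rows.tolist(), cols.tolist(), polarity.tolist()))

# A static scene produces no events; only the changed pixel fires.
a = np.zeros((4, 4), dtype=np.uint8)
b = a.copy()
b[2, 3] = 200
print(frames_to_events(a, a))  # → []
print(frames_to_events(a, b))  # → [(2, 3, 1)]
```

This is the redundancy suppression described above: the output scales with how much the scene changes, not with how many pixels the sensor has.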

Innovation in neuromorphic vision
Vision sensors typically gather a lot of data, but increasingly there is a drive to use edge processing for these sensors. For many machine vision applications, edge computation has become a bottleneck. But for event cameras, it is the opposite.

“More and more, sensor cameras are used for some local processing, some edge processing, and this is where we believe we have a technology and an approach that can bring value to this application,” says Verre.

“We are enabling fully fledged edge computing by the fact that our sensors produce very low data volumes. So, you can afford to have a cost-reasonable, low-power system on a chip at the edge, because you can simply generate a few event data that this processor can easily interface with and process locally.


“Instead of feeding this processor with tons of frames that overload them and hinder their capability to process data in real-time, our event camera can enable them to do real-time across a scene. We believe that event cameras are finally unlocking this edge processing.”

Making sensors smaller and cheaper is also a key innovation because it opens up a range of potential applications, such as in IoT sensing or smartphones. For this, Prophesee partnered with Sony, mixing its expertise in event cameras with Sony’s infrastructure and experience in vision sensors to develop a smaller, more efficient, and cheaper event camera evaluation kit. Verre thinks the pricing of event cameras is at a point where they can be realistically introduced into smartphones.

Another area companies are eyeing is fusion kits – the basic idea is to mix the capability of a neuromorphic camera with another vision sensor, such as lidar or a conventional camera, into a single system.

“From both the spatial information of a frame-based camera and from the information of an event-based camera, you can actually open the door to many other applications,” says Verre. “Definitely, there is potential in sensor fusion… by combining event-based sensors with some lidar technologies, for instance, in navigation, localisation, and mapping.”

Neuromorphic computing progress
However, while neuromorphic cameras mimic the human eye, the processing chips they work with are far from mimicking the human brain. Most neuromorphic computing, including work on event camera computing, is carried out using deep learning algorithms that perform processing on CPUs or GPUs, which are not optimised for neuromorphic processing. This is where new chips such as Intel’s Loihi 2 (a neuromorphic research chip) and Lava (an open-source software framework) come in.

“Our second-generation chip greatly improves the speed, programmability, and capacity of neuromorphic processing, broadening its usages in power and latency-constrained intelligent computing applications,” says Mike Davies, Director of Intel’s Neuromorphic Computing Lab.

BrainChip, a neuromorphic computing IP vendor, also partnered with Prophesee to deliver event-based vision systems with integrated low-power technology coupled with high AI performance.

It is not only industry accelerating the field of neuromorphic chips for vision – there is also an emerging but already active academic field. Neuromorphic systems have enormous potential, yet they are rarely used outside academia and have seen few industrial deployments. Nevertheless, event-based solutions are already far superior to conventional algorithms in terms of latency and energy efficiency.

Working with the first iteration of the Loihi chip in 2019, Alpha Renner et al (‘Event-based attention and tracking on neuromorphic hardware’) developed the first set-up that interfaces an event-based camera with the spiking neuromorphic system Loihi, creating a purely event-driven sensing and processing system. The system selects a single object out of a number of moving objects and tracks it in the visual field, even in cases when movement stops, and the event stream is interrupted.

In 2021, Viale et al demonstrated the first spiking neuronal network (SNN) on a chip used for a neuromorphic vision-based controller solving a high-speed UAV control task. Ongoing research is looking at ways to use neuromorphic neural networks to integrate chips and event cameras for autonomous cars. Since many of these applications use the Loihi chip, newer generations, such as Loihi 2, should speed development. Other neuromorphic chips are also emerging, allowing quick learning and training of the algorithm even with a small dataset. Specialised SNN algorithms operating on neuromorphic chips can further help edge processing and general computing in event vision.

“The development of event-based cameras, inspired by the retina, enables the exploitation of an additional physical constraint – time. Due to their asynchronous course of operation, considering the precise occurrence of spikes, spiking neural networks take advantage of this constraint,” write Lea Steffen and colleagues (‘Neuromorphic Stereo Vision: A Survey of Bio-Inspired Sensors and Algorithms’).
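For readers unfamiliar with spiking networks, the timing sensitivity Steffen and colleagues describe can be sketched with a single leaky integrate-and-fire neuron. Every parameter value below is illustrative (not taken from the paper): the neuron's voltage leaks away between input events, so only inputs that arrive close together in time push it over threshold.

```python
def lif_spikes(event_times, t_end=1.0, dt=1e-3, tau=0.02, w=0.6, v_th=1.0):
    """Minimal leaky integrate-and-fire neuron driven by input spike times.
    Membrane voltage decays with time constant tau; each input event adds
    weight w; crossing v_th emits an output spike and resets the voltage."""
    v, out = 0.0, []
    events = {int(round(t / dt)) for t in event_times}
    for step in range(int(t_end / dt)):
        v *= (1.0 - dt / tau)          # leak toward zero
        if step in events:
            v += w                     # incoming event
        if v >= v_th:                  # threshold crossing
            out.append(step * dt)      # output spike time
            v = 0.0                    # reset
    return out

# Three closely spaced inputs drive the neuron over threshold;
# the same three inputs spread far apart leak away and never do.
print(len(lif_spikes([0.100, 0.101, 0.102])))  # clustered → 1 spike
print(len(lif_spikes([0.1, 0.4, 0.8])))        # sparse → 0 spikes
```

The same three events produce different outputs depending purely on *when* they arrive, which is the extra "physical constraint" conventional frame-based pipelines throw away.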

Lighting is another aspect the field of neuromorphic vision is increasingly looking at. An advantage of event cameras compared with frame-based cameras is their ability to deal with a range of extreme light conditions – whether high or low. But event cameras can now use light itself in a different way.

Prophesee and CIS have started work on the industry’s first evaluation kit for implementing 3D sensing based on structured light. This uses event-based vision and point cloud generation to produce an accurate 3D Point Cloud.

“You can then use this principle to project the light pattern in the scene and, because you know the geometry of the setting, you can compute the disparity map and then estimate the 3D and depth information,” says Verre. “We can reach this 3D Point Cloud at a refresh rate of 1kHz or above. So, any application of 3D vision, such as 3D measurement or 3D navigation, that requires high speed and time precision really benefits from this technology. There are no comparable 3D approaches available today that can reach this time resolution.”
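The disparity-to-depth step Verre describes is ordinary triangulation: depth is focal length times baseline divided by the observed pixel shift. As a rough sketch (the function and all numbers are hypothetical, not Prophesee's implementation):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulation used in structured-light and stereo setups:
    depth Z = f * B / d, where f is the focal length in pixels, B the
    projector-camera baseline in metres and d the disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# With f = 700 px and a 10 cm baseline, a 35 px shift puts the point at 2 m.
print(depth_from_disparity(35.0, 700.0, 0.10))  # → 2.0
```

The novelty in the event-based version is not this formula but the refresh rate: because only changed pixels report, the disparity map can be updated at 1kHz rather than at video frame rates.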

Industrial applications of event vision

Due to its inherent advantages, as well as progress in the field of peripherals (such as neuromorphic chips and lighting systems) and algorithms, we can expect the deployment of neuromorphic vision systems to continue – especially as systems become increasingly cost-effective.



Event vision can trace particles or monitor vibrations with low latency, low energy consumption, and relatively low amounts of data

We have mentioned some of the applications of event cameras here at IMVE before, from helping restore people’s vision to tracking and managing space debris. But in the near future perhaps the biggest impact will be at an industrial level.

From tracing particles or quality control to monitoring vibrations, all with low latency, low energy consumption, and relatively low amounts of data that favour edge computing, event vision is promising to become a mainstay in many industrial processes. Lowering costs through scaling production and better sensor design is opening even more doors.

Smartphones are one field where event cameras may make an unexpected entrance, but Verre says this is just the tip of the iceberg. He is looking forward to a paradigm shift and is most excited about all the applications that will soon pop up for event cameras – some of which we probably cannot yet envision.

“I see these technologies and new tech sensing modalities as a new paradigm that will create a new standard in the market. And in serving many, many applications, so we will see more event-based cameras all around us. This is so exciting.”

© 2023 Europa Science Ltd.
 
  • Like
  • Fire
  • Love
Reactions: 36 users

Sirod69

bavarian girl ;-)
You can't be relaxed. You have seen a 13% decline in the value of your holding today alone. It simply doesn't make sense to state that you are relaxed.

It's time to face reality in as much that they are poor communicators and that it needs to change.

The market is telling us that it is uncertain about the future of the company.

The company can and should react. We all know it won't though.

The arrogance exhibited by them is stupefying.
Phew, amazing, how I saw the share price this morning... I didn't expect it... in bed I thought 0.4 would be ok for me... and now we're at 0.341... somehow I think to myself, now I don't care either, I've been in the red for a long time anyway... I'm able to wait a long time and that's what I was planning to do anyway... well, it's the way it is now and I'm really relaxed... it's no use getting upset... I'm looking forward to the podcast

Relaxed Natalie Dormer GIF
 
  • Like
  • Love
  • Fire
Reactions: 35 users

robsmark

Regular
  • Haha
  • Like
Reactions: 4 users

cassip

Regular
Buffett sold some TSMC shares. But Qualcomm, Nvidia, AMD, Intel and the semiconductor ETFs were all green in the US overnight.
Seems as if Apple is more interesting to him.

A new strategy there?



 
Last edited:
  • Like
Reactions: 5 users

TECH

Regular
Maybe it was LDA Capital dumping shares to raise funds for BRN.

10 January 2023
Notice pursuant to Section 708A(5)(e) of the Corporations Act This notice is provided by Brainchip Holdings Ltd (BRN) for the purposes of Section 708A(5)(e) of the Corporations Act 2001 (Corporations Act). BRN today issued 30,000,000 fully paid ordinary shares (Shares) following the issue of a Capital Call Notice.


10 January 2023 – BrainChip Holdings Ltd (ASX: BRN), a leading provider of ultralow power, high-performance AI processor technology today announced that the company has submitted a capital call notice to LDA Capital Limited and LDA Capital LLC (LDA) to subscribe for up to 30,000,000 shares with an option for LDA to subscribe up to an additional 10,000,000 shares subject to company approval.


And from Top 20 holders list 27/1/2023
LDA CAPITAL LIMITED 26,045,582 shares

13 trading days since 27/1/2023 = 2M daily & should have been finished.
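The implied pace is easy to check. All figures below come from the notices quoted above; the 2M-per-day average is the poster's own assumption, not a disclosed number:

```python
# Rough check of the LDA sell-down arithmetic quoted above.
issued = 30_000_000           # shares issued under the 10 Jan capital call
lda_held_27_jan = 26_045_582  # LDA's holding per the Top 20 list of 27/1/2023
sold_by_27_jan = issued - lda_held_27_jan

trading_days = 13             # trading days between 27/1/2023 and the post
daily_pace = 2_000_000        # assumed average daily sell-down (poster's guess)

print(f"sold by 27/1: {sold_by_27_jan:,}")               # 3,954,418
print(f"capacity since then: {trading_days * daily_pace:,}")  # 26,000,000
# 13 days x 2M covers the ~26M still held on 27/1, hence the conclusion
# that the sell-down "should have been finished" - if the pace held.
```

The conclusion stands or falls on that assumed daily pace, which is exactly the point disputed in the reply below.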

Not from what I have been told...I personally think that the whole situation with LDA Capital is extremely messy, and at some stage I do
hope that the CEO will explain why the Board called it so early into 2023.

I have seen a (sort of) explanation as to how this LDA put option is supposed to play out, but the floor price set by Brainchip has me thinking.

Today is the first time I have noticed the ASX send a company a "please explain" notice because the share price has dropped. I personally find their behaviour totally absurd.

May I also say that I think having Peter put himself out there on LinkedIn, asking followers what they would like him to comment on, was great! Such a brave move. Going by the two or so comments, it suggests to me that many are too afraid to ask him anything, fearing they'll be laughed at, or that they'll expose themselves to someone with an IQ way above the average shareholder (me included) and be embarrassed.

I can assure you, 100%, our founder will never judge you.

The upcoming podcast is yet another positive move by the company; we all want to hear from Peter and Anil (in my opinion), and I'd be very happy to have both our "top guns" make themselves available for half-yearly "fireside chats" throughout the years.

All staff are important, they all contribute and continue to contribute daily, but having our two key guys talking about our company in an honest, relaxed manner is extremely uplifting for all shareholders. So every 6 months, let's make it happen, Brainchip; you're all appreciated.

Love Brainchip x

Tech ;)
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 43 users

Tothemoon24

Top 20
I’m inspired to load more tomorrow, the above ⬆️ just nailed further conviction
 
  • Like
  • Fire
Reactions: 6 users

Galaxycar

Regular
Wow, relaxed about diminishing wealth. Amazing.

Giggle away then
Didn't Sean say judge me by my next AGM? Well, the judge has spoken. Time to go; you're only good enough to be a number two, overlooked for a reason. When the going gets tough he hides. Come out of your luxury, expensive Sydney office and tell shareholders where we are really at with NDAs versus real potential paying customers. Sick of listening to Tony Dawe's bullshit. The minute you question all the free shares his demeanour changes. Won't answer what they have achieved to earn them. FOS, this lot.
 

JoMo68

Regular
Letter to Tony Dawe:

Dear Tony,
Today's bloodbath has been the culmination of months of senior management either:
failing to deliver on their promises of market advancement, penetration and innovation, or
being too scared in their (mis)understanding of what they can reveal in ASX announcements, or a combination of both.

Today has slaughtered many faithful shareholders including myself. I have marketed your company to many friends under the impression that Brainchip was genuinely a disruptive force within edge computing. I am embarrassed and ashamed of opening my mouth, let alone believing management would keep shareholders abreast of the company’s successful market penetration.

I believe management need to re-examine the way they support shareholders. I, and many I communicate with, are still 'of the faith', but it is considerably damaged.

Please take what I and I’m sure many others say, and rebuild shareholder confidence.

I haven’t sent this, but I needed to get it off my chest. Perhaps TD may read this here, but I doubt he will have the time, placating many other shareholders.
star trek eye roll GIF
 
  • Like
  • Haha
  • Love
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
We‘re gonna get there alive!

 
  • Like
  • Love
Reactions: 9 users

Violin1

Regular
Wow, relaxed about diminishing wealth. Amazing.

Giggle away then
She didn't sell. It's only a loss when you do. Never fun when the SP drops, but it's not crystallised until you capitulate and sell out. We all got excited on the rise to $2.34, and I never expected we'd go below $1 again, but hold the faith – we invested for a reason.
 
  • Like
  • Love
  • Fire
Reactions: 20 users

Deleted member 1270

Guest
Yeah, we’re a pretty chill community here. Sounds like it’s not your vibe, particularly looking back on all of your negative contributions previously. So you may as well just go away unless you can dig this.


Off you go and join the backslapping crew. All is well and there's nothing to see here.

Unbelievable. Pass the Kool-Aid.

I'll continue to comment for as long as I please. Put me on ignore if you can't handle reality.
 
  • Like
Reactions: 4 users
D

Deleted member 1270

Guest
Lol. Lou is a long term investor. You on the other hand need to read Buffett 101. @belkin please stop before you embarrass yourself any further.
Good luck if this is how you view your investments.

I have been invested in the company since 2015 so I suppose I'm not long term enough for you?

Are you saying I can't respond how I like to a long term holder like Lou?

Buffett 101? Please stop. You're the one embarrassing yourself with the self-help investment books. Amateur hour here.
 
  • Like
  • Thinking
Reactions: 8 users

Sirod69

bavarian girl ;-)
VVDN Technologies


Join us to witness the newest tech trends in the networking & wireless world at #MWCBarcelona2023, VVDN will be showcasing the product engineering, manufacturing & cloud capabilities in the networking domain at Booth No. 2C40 & Hall No. 2, and our team of experts will be on-hand from 27 Feb to 2 March 2023.

Click below & book your meeting now.

Looking forward to meeting you in Barcelona!

#vvdntech #wemakeithappen #VVDNatMWC #innovation #technology #MWCB #MWC23 #MWC2023 #MWCBarcelona #MobileWorldCongress
 
  • Like
  • Fire
  • Love
Reactions: 14 users

Rach2512

Regular
Hi Everyone

Just in regards to shorters, I'm trying to understand the way they work.

My understanding, which I could be completely wrong so please correct me.

They borrow a heap of shares, sit on them until they decide it's a good time to sell, then sell a massive amount so as to spook the market and trigger stop losses. This reduces the SP all the more, and when they think it has hit rock bottom they quickly buy back in so they can return the borrowed shares? If this is the case, then can we expect some frantic buying from them over the next couple of days/weeks, which should recover the SP?

Also thinking about the SP, I've been in since my baby girl was born, she's now 10, and I see there are a lot of people here who are not happy about the SP and it seems this is their only focus. I'm curious: if the SP had shot up to $1, would you sell or hold, knowing the potential for BRN? I know I wouldn't sell, and if you would still hold, how is that any different to holding now? I appreciate some may need the money now, and if that is the case I feel for you. I think people were expecting to see revenue towards the end of last year, but the plan changed from being a chip supplier to selling IP, which meant changes had to be made that took time to implement. I think we could all agree that this will be far more beneficial for BRN in the long run.


Happy to be corrected, I don't often contribute here so please don't shoot me!

Thank you to all the great posts and finds, they are very much appreciated.


Also sorry if a similar question has already been asked about shorters.
 
  • Like
  • Love
  • Fire
Reactions: 30 users
From the Feb/Mar 2023 issue of Imaging and Machine Vision Europe.

We get a mention on the Prophesee partnership in the article, which is a general discussion with Verre as well.

Yes, framed as a demonstrator, but with a desire for a commercial relationship possibly too. Makes sense when you read the first paragraph.




Neuromorphic computing progress

However, while neuromorphic cameras mimic the human eye, the processing chips they work with are far from mimicking the human brain. Most neuromorphic computing, including work on event camera computing, is carried out using deep learning algorithms that perform processing on CPUs or GPUs, which are not optimised for neuromorphic processing.


This is where new chips such as Intel’s Loihi 2 (a neuromorphic research chip) and Lava (an open-source software framework) come in. “Our second-generation chip greatly improves the speed, programmability, and capacity of neuromorphic processing, broadening its usages in power and latency-constrained intelligent computing applications,” says Mike Davies, Director of Intel’s Neuromorphic Computing Lab.

BrainChip, a neuromorphic computing IP vendor, also partnered with Prophesee to deliver event-based vision systems with integrated low-power technology coupled with high AI performance.

............

Smartphones are one field where event cameras may make an unexpected entrance, but Verre says this is just the tip of the iceberg. He is looking forward to a paradigm shift and is most excited about all the applications that will soon pop up for event cameras – some of which we probably cannot yet envision.

“I see these technologies and new tech sensing modalities as a new paradigm that will create a new standard in the market. And in serving many, many applications, so we will see more event-based cameras all around us. This is so exciting.”
 
  • Like
  • Fire
  • Love
Reactions: 34 users

Diogenese

Top 20
Hi Everyone

Just in regards to shorters, I'm trying to understand the way they work.

My understanding, which I could be completely wrong so please correct me.

They borrow a heap of shares, sit on them until they decide it's a good time to sell, then sell a massive amount so as to spook the market and trigger stop losses. This reduces the SP all the more, and when they think it has hit rock bottom they quickly buy back in so they can return the borrowed shares? If this is the case, then can we expect some frantic buying from them over the next couple of days/weeks, which should recover the SP?

Also thinking about the SP, I've been in since my baby girl was born, she's now 10, and I see there are a lot of people here who are not happy about the SP and it seems this is their only focus. I'm curious: if the SP had shot up to $1, would you sell or hold, knowing the potential for BRN? I know I wouldn't sell, and if you would still hold, how is that any different to holding now? I appreciate some may need the money now, and if that is the case I feel for you. I think people were expecting to see revenue towards the end of last year, but the plan changed from being a chip supplier to selling IP, which meant changes had to be made that took time to implement. I think we could all agree that this will be far more beneficial for BRN in the long run.


Happy to be corrected, I don't often contribute here so please don't shoot me!

Thank you to all the great posts and finds, they are very much appreciated.


Also sorry if a similar question has already been asked about shorters.
Legal shorting involves the shorter borrowing shares and selling them at $x per share in the hope that the share price will fall. If the share price falls to, say, $y, where y < x, they can buy shares at $y to return to the lender, making $(x-y) per share profit.

However, if the share price rises to $z per share, where z > x, the shorter faces a loss of $(z-x) per share.
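That arithmetic can be sketched in a few lines of Python (illustrative prices and share counts only, chosen for the example):

```python
def short_pnl(sell_price: float, cover_price: float, shares: int) -> float:
    """Profit (negative = loss) on a short position.

    The shorter sells borrowed shares at sell_price ($x) and later buys
    them back at cover_price ($y if the price fell, $z if it rose) to
    return them to the lender.
    """
    return (sell_price - cover_price) * shares

# Price falls: short 10,000 shares at $x = 0.60, cover at $y = 0.48
print(round(short_pnl(0.60, 0.48, 10_000), 2))   # 1200.0 profit
# Price rises: same position, forced to cover at $z = 0.75
print(round(short_pnl(0.60, 0.75, 10_000), 2))   # -1500.0 loss
```

Either way, the shorter's result is just $(x - cover price) multiplied by the number of shares, which is why an unlimited rise in the share price means an unlimited potential loss.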

The practice you mention of manipulating the share price down by selling shares below what would be the market price is illegal and widespread on the ASX.

One indicator is if the SP often falls on close or in the after-market auction.

The practice is facilitated by bot trading, where computers sell small numbers of shares to continuously edge the SP down. Bot trading can also be tantamount to insider trading, in that large brokers can monitor incoming bids and make a purchase or sale before the bidder's transaction is fulfilled.

Both short selling and bot trading should be banned, as they are tools for picking the pockets of unsophisticated shareholders.

It would be interesting to see how much the brokers donate to political parties.
 
  • Like
  • Fire
  • Thinking
Reactions: 54 users
Top Bottom