BRN Discussion Ongoing

hotty4040

Regular
Evening Chippers,

Breaking news...

World first: a Pioneer DJ mixing table utilising BrainChip's Akida neuromorphic chip on the International Space Station.
Personally can't imagine having to wash the external windows , whilst attached via umbilical, without some groovy tunes.

😄 .

* With any luck, may pull Fact Finder back, to give me a dressing down.
Seemed to work last time.

All in good humour.

ARi - Matasin, Live Series, Ep.003 ( Melodic Techno Progressive House Mix) 7th Jan 2023.

If a savvy individual could post link, Thankyou in advance.
This may be our only hope of retrieving Fact Finder.

Cheers for all the great finds and posts today.

Regards,
Esq.
Time to bury the "hatchet" FF. I've ( we've all ) missed you immensely ( immensurably even, I think that's a word ), so time to turn the other cheek, and get back into what you were born to do, i.e. ( Inspire with courage ) and project your viewpoints again and again, and with Gusto-oooo. Stuff is happening, ( right now IMO ) and we need your tick of approval and sharp appraisal qualities, and pronto, kind Sir.

The couch, can wait, and there's just toooo much crap on the goggle box IMHO, NB... I'm an expert in gogglebox wasted time, I can assure you, it is mind numbing, to say the least. So please "get back on board"...You just might miss all the fun.

And don't forget >>>>>>>



Akida Ballista >>>>>> Just a whole lotta good FACTS emerging - waiting for your perusal and comment <<<<<


hotty...
 
  • Like
  • Love
  • Fire
Reactions: 35 users

chapman89

Founding Member
This is an interesting paper setting out the issues around adverse weather as the title implies and you might like to read the whole paper.

The Lidar section does not deal with VALEO unfortunately but does point out the issues in using other Lidar. I have extracted a few parts that I found of particular interest. The link is at the end of these extracts:




Perception and sensing for autonomous vehicles under adverse weather conditions: A survey

Yuxiao Zhang a, Alexander Carballo b,c,d, Hanting Yang a, Kazuya Takeda a,c,d

a

Graduate School of Informatics, Nagoya University, Furo-cho, Chikusa-ward, Nagoya, 464-8601, Japan

b

Faculty of Engineering and Graduate School of Engineering, Gifu University, 1-1 Yanagido, Gifu City, 501-1193, Japan

c

Institute of Innovation for Future Society, Nagoya University, Furo-cho, Chikusa-ku, Nagoya, 464-8601, Japan

d

TierIV Inc., Nagoya University Open Innovation Center, 1-3, Mei-eki 1-chome, Nakamura-Ward, Nagoya, 450-6610, Japan

Received 29 April 2022, Revised 8 December 2022, Accepted 22 December 2022, Available online 9 January 2023, Version of Record 9 January 2023.

Abstract

Automated Driving Systems (ADS) open up a new domain for the automotive industry and offer new possibilities for future transportation with higher efficiency and comfortable experiences. However, perception and sensing for autonomous driving under adverse weather conditions have been the problem that keeps autonomous vehicles (AVs) from going to higher autonomy for a long time. This paper assesses the influences and challenges that weather brings to ADS sensors in a systematic way, and surveys the solutions against inclement weather conditions. State-of-the-art algorithms and deep learning methods on perception enhancement with regard to each kind of weather, weather status classification, and remote sensing are thoroughly reported. Sensor fusion solutions, weather conditions coverage in currently available datasets, simulators, and experimental facilities are categorized. Additionally, potential ADS sensor candidates and developing research directions such as V2X (Vehicle to Everything) technologies are discussed. By looking into all kinds of major weather problems, and reviewing both sensor and computer science solutions in recent years, this survey points out the main moving trends of adverse weather problems in perception and sensing, i.e., advanced sensor fusion and more sophisticated machine learning techniques; and also the limitations brought by emerging 1550 nm LiDARs. In general, this work contributes a holistic overview of the obstacles and directions of perception and sensing research development in terms of adverse weather conditions.





This first extract clearly presents the problem Mercedes-Benz encountered while trying to develop ADAS, and why it went looking for a different way of processing the amount of data being produced, which led them to trial first with Intel and Loihi before moving up to Brainchip, the Artificial Intelligence experts, and AKIDA technology for sensor fusion in real time at ultra-low power.



Bijelic et al. (2020) from Mercedes-Benz AG present a large deep multimodal sensor fusion in unseen adverse weather. Their test vehicle is equipped with the following: a pair of stereo RGB cameras facing front; a near-infrared (NIR) gated camera whose adjustable delay capture of the flash laser pulse reduces the backscatter from particles in adverse weather (Bijelic et al., 2018b); a 77 GHz radar with 1° resolution; two Velodyne LiDARs, namely HDL64 S3D and VLP32C; a far-infrared (FIR) thermal camera; a weather station with the ability to sense temperature, wind speed and direction, humidity, barometric pressure, and dew point; and a proprietary road-friction sensor. All the above are time-synchronized and ego-motion corrected with the help of the inertial measurement unit (IMU). Their fusion is entropy-steered, which means regions in the captures with low entropy can be attenuated, while entropy-rich regions can be amplified in the feature extraction. All the data collected by the exteroceptive sensors are concatenated for the entropy estimation process, and the training was done using clear weather only, which demonstrated a strong adaptation. The fused detection performance was shown to be evidently improved over LiDAR-only or image-only detection under fog conditions. The blemish in this modality is that the number of sensors exceeds the normal expectation of an ADS system. More sensors require more power supply and connection channels, which is a burden to the vehicle itself, and proprietary weather sensors are not exactly cost-friendly. Even though such an algorithm is still real-time processed, given the bulk amount of data from multiple sensors, the response and reaction time becomes something to worry about.
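The "entropy-steered" weighting idea in the extract above can be illustrated with a toy sketch (my own, not from Bijelic et al.; the histogram-based entropy and function names are illustrative only): a low-entropy patch, such as a fog-washed region, receives a small fusion weight, while information-rich patches are amplified.

```python
import numpy as np

def patch_entropy(patch: np.ndarray, bins: int = 32) -> float:
    """Shannon entropy (bits) of a patch's intensity histogram, values in [0, 1]."""
    hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins so log2 is defined
    return float(-(p * np.log2(p)).sum())

def fusion_weights(patches: list) -> np.ndarray:
    """Normalised per-sensor weights: entropy-rich patches get amplified."""
    ent = np.array([patch_entropy(p) for p in patches])
    return ent / ent.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    detailed = rng.random((16, 16))      # texture-rich patch
    washed_out = np.full((16, 16), 0.5)  # uniform, fog-like patch
    print(fusion_weights([detailed, washed_out]))
```

A uniform patch has zero histogram entropy, so its weight collapses to zero and the detailed sensor dominates the fused feature, which is the qualitative behaviour the paper describes.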



This next extract highlights the problem that AKIDA real time processing of Prophesee’s event based sensor overcomes and makes both essential in the automotive and robotic industries.

4.4.2. Reflections and shadows

Glare and strong light might not be removed easily, but reflections in similar conditions are relatively removable with the help of the absorption effect (Zheng et al., 2021b), reflection-free flash-only cues (Lei and Chen, 2021), and photo exposure correction (Afifi et al., 2021) techniques in the computer vision area. The principle follows reflection alignment and transmission recovery and it could relieve the ambiguity of the images well, especially in panoramic images which are commonly used in ADS (Hong et al., 2021). It is limited to recognizable reflections and fails in extremely strong lights where image content knowledge is not available. A special reflection is the mirage effect on hot roads. It has a weakness: the high-temperature area on the road is fixed and that fits the feature of a horizon which could be confusing (Young, 2015). Kumar et al. (2019) implemented horizon detection and depth estimation methods and managed to mark out a mirage in a video. The lack of mirage effects in datasets makes it hard to validate the real accuracy.

The same principle applies to shadow conditions as well, where the original image element is intact with a little low brightness in certain regions (Fu et al., 2021). Such image processing uses similar computer vision techniques as in previous paragraphs and can also take the route of first generating shadows and then removing them (Liu et al., 2021b). The Retinex algorithm can also be used for image enhancement in low-light conditions (Pham et al., 2020b).



This extract makes clear why it is absolutely critical that real time information is gathered by autonomous vehicles as to road surface conditions.

5.1.3. Road surface condition classification

Instant road surface condition changes are direct results of weather conditions, especially wet weather. The information on road conditions can sometimes be an alternative to weather classification. According to the research of Kordani et al. (2018) that at the speed of 80 km/h, the road friction coefficient of rainy, snowy, and icy road surface conditions are 0.4, 0.28 and 0.18 respectively, while average dry road friction coefficient is about 0.7. The dry or wet conditions can be determined in various ways besides road friction or environmental sensors (Shibata et al., 2020). Šabanovič et al. (2020) build a vision-based DNN to estimate the road friction coefficient because dry, slippery, slurry, and icy surfaces with decreasing friction can basically be identified as clear, rain, snow, and freezing weather correspondingly. Their algorithm detects not only the wet conditions but is able to classify the combination of wet conditions and pavement types as well. Panhuber et al. (2016) mounted a mono camera behind the windshield and observed the spray of water or dust caused by the leading car and the bird-view of the road features in the surroundings. They determine the road surface’s wet or dry condition by analyzing multiple regions of interest with different classifiers in order to merge into a robust result of 86% accuracy.

Road surface detection can also be performed in an uncommon way: audio. The sounds of vehicle speed, tire-surface interactions, and noise under different road conditions or different levels of wetness could be unique, so it is reasonable for Abdić et al. (2016) to train a deep learning network with over 780,000 bins of audio, including low speed when sounds are weak, even at 0 speed because it can detect the sound made by other driving-by vehicles. There are concerns about the vehicle type or tire type’s effects on the universality of such a method and the uncertain difficulty degree of the installation of sound collecting devices on vehicles.
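As a back-of-the-envelope illustration of the friction coefficients quoted from Kordani et al. (2018) in the extract above (my own sketch, not from the paper), idealised braking distance from 80 km/h roughly quadruples between dry and icy roads:

```python
# Idealised stopping distance: d = v^2 / (2 * mu * g), ignoring driver
# reaction time, load transfer, and ABS behaviour. Friction values are
# the ones quoted from Kordani et al. (2018).

G = 9.81  # gravitational acceleration, m/s^2

def braking_distance(speed_kmh: float, mu: float) -> float:
    """Idealised braking distance in metres for road friction coefficient mu."""
    v = speed_kmh / 3.6  # km/h -> m/s
    return v ** 2 / (2 * mu * G)

if __name__ == "__main__":
    for surface, mu in [("dry", 0.70), ("rain", 0.40), ("snow", 0.28), ("ice", 0.18)]:
        print(f"{surface:5s} (mu={mu:.2f}): {braking_distance(80, mu):6.1f} m")
```

At 80 km/h this simple model gives roughly 36 m on dry asphalt versus about 140 m on ice, which is why real-time surface classification matters so much to an AV's planning horizon.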



This extract points to the convenient truths that:

  • AKIDA technology boasts the capacity to process ultrasonic sensors in real time allowing sensor fusion,
  • VALEO produces ultrasonic sensors and has a purpose built factory for their production along with the next gen Scala 3 Lidar, and
  • Brainchip and VALEO have an ASX announced EAP relationship for ADAS and AV development and Brainchip is trusted by VALEO.


2.4. Ultrasonic sensors

Ultrasonic sensors are commonly installed on the bumpers and all over the car body serving as parking assisting sensors and blindspot monitors (Carullo and Parvis, 2001). The principle of ultrasonic sensors is pretty similar to radar, both measuring the distance by calculating the travel time of the emitted electromagnetic wave, only ultrasonic operates at ultrasound band, around 40 to 70 kHz. In consequence, the detecting range of ultrasonic sensors normally does not exceed 11 m (Frenzel, 2021), and that restricts the application of ultrasonic sensors to close-range purposes such as backup parking. Efforts have been done to extend the effective range of ultrasonic and make it fit for long-range detecting (Kamemura et al., 2008). For example, Tesla’s “summon” feature uses ultrasonic to navigate through park space and garage doors (Tesla, 2021a).

Ultrasonic is among the sensors that are hardly considered in the evaluation of weather influences, but it does show some special features. The speed of sound traveling in air is affected by air pressure, humidity, and temperature (Varghese et al., 2015). The fluctuation of accuracy caused by this is a concern to autonomous driving unless enlisting the help of algorithms that can adjust the readings according to the ambient environment which generates extra costs. Nonetheless, ultrasonic does have its strengths, given the fact that its basic function is less affected by harsh weather compared to LiDAR and camera. The return signal of an ultrasonic wave does not get decreased due to the target’s dark color or low reflectivity, so it is more reliable in low visibility environments than cameras, such as high-glare or shaded areas beneath an overpass.

Additionally, the close proximity specialty of ultrasonic can be used to classify the condition of the road surface. Asphalt, grass, gravel, or dirt road can be distinguished from their back-scattered ultrasonic signals (Bystrov et al., 2016), so it is not hard to imagine that the snow, ice, or slurry on the road can be identified and help AV weather classification as well.
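The echo-ranging principle and the temperature sensitivity described in the extract above can be sketched as follows (my own illustration, not from the paper; the 64 ms round trip simply corresponds to the ~11 m practical limit it mentions):

```python
# Ultrasonic echo ranging: distance = speed_of_sound * round_trip_time / 2.
# The linear temperature correction shows why ambient conditions shift the
# readings unless the sensor compensates for them.

def speed_of_sound(temp_c: float) -> float:
    """Approximate speed of sound in air (m/s) at a given temperature (C)."""
    return 331.3 + 0.606 * temp_c

def echo_distance(round_trip_s: float, temp_c: float = 20.0) -> float:
    """Target distance in metres inferred from an echo's round-trip time."""
    return speed_of_sound(temp_c) * round_trip_s / 2

if __name__ == "__main__":
    t = 0.064  # a ~64 ms echo, near the ~11 m practical limit
    print(f"at  20 C: {echo_distance(t, 20.0):.2f} m")
    print(f"at -10 C: {echo_distance(t, -10.0):.2f} m")
```

The same 64 ms echo reads about half a metre shorter at -10 °C than at 20 °C, the kind of fluctuation the paper says needs algorithmic compensation.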



The following extract makes clear that the solutions known to these researchers are still to be found if ADAS or AV systems are to manage predictable extreme light and weather conditions within an acceptable power envelope in real time.


8. Conclusion

In this work, we surveyed the influence of adverse weather conditions on 5 major ADS sensors. Sensor fusion solutions were listed. The core solution to adverse weather problems is perception enhancement and various machine learning and image processing methods such as de-noising were thoroughly analyzed. Additional sensing enhancement methods including classification and localization were also among the discussions. A research tendency towards robust sensor fusions, sophisticated networks and computer vision models is concluded. Candidates for future ADS sensors such as FMCW LiDAR, HDR camera and hyperspectral camera were introduced. The limitations brought by the lack of relevant datasets and the difficulty of 1550 nm LiDAR were thoroughly explained. Finally, we believe that V2X and IoT have a brighter prospect in future weather research. This survey covered almost all types of common weather that pose negative effects on sensors’ perception and sensing abilities including rain, snow, fog, haze, strong light, and contamination, and listed out datasets, simulators, and experimental facilities that have weather support.

With the development of advanced test instruments and new technologies in LiDAR architectures, signs of progress have been largely made in the performance of perception and sensing in common wet weather. Rain and fog conditions seem to be getting better with the advanced development in computer vision in recent years, but still have some space for improvement on LiDAR. Snow, on the other hand, is still at the stage of dataset expansion and perception enhancement against snow has some more to dig in. Hence, point cloud processing under extreme snowy conditions, preferably with interaction scenarios either under controlled environments or on open roads is part of our future work. Two major sources of influence, strong light and contamination are still not rich in research and solutions. Hopefully, efforts made towards the robustness and reliability of sensors can carry adverse weather conditions research to the next level.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

Funding

The author (Y.Z) would like to take this opportunity to thank the “Nagoya University Interdisciplinary Frontier Fellowship” supported by Nagoya University and JST, Japan, the establishment of university fellowships towards the creation of science technology innovation, Grant Number JPMJFS2120, and JSPS KAKENHI, Japan Grant Number JP21H04892 and JP21K12073.

The authors thank Prof. Ming Ding from Nagoya University for his help. We would also like to extend our gratitude to Sensible4, the University of Michigan, Tier IV Inc., Ouster Inc., Perception Engine Inc., and Mr. Kang Yang for their support. In addition, our deepest thanks to VTT Technical Research Center of Finland, the University of Waterloo, Pan Asia Technical Automotive Center Co., Ltd, and the Civil Engineering Research Institute for Cold Region of Japan.
 
  • Like
  • Fire
  • Love
Reactions: 35 users

Getupthere

Regular


CES "highlights": Lidar provider lets Tesla and other brands run over child dummies

10 January 2023 · 33 comments

Image: Luminar
At last week's CES electronics fair, visitors could try out The Boring Company's underground Las Vegas Loop, which transports them in Tesla electric cars through narrow tunnels to various stations on the site and in the city. Those cars are controlled by human drivers. Above ground, the lidar provider Luminar demonstrated what can happen if you rely on today's computer systems: it had cars of different brands drive towards child dummies, which ended in collisions for every car except the one fitted with its own technology.

CES visitor shows crash with Tesla​

Luminar did not use the name Tesla in its press releases or in the "highlights" video from this year's CES that it published on Monday; it speaks only of vehicles with similar technology but different sensors. That Tesla is meant can be seen in one segment, filmed from the cockpit of a lidar-equipped Lexus RX450H, where an overlay shows the Japanese hybrid stopping before it rams a child dummy crossing on a Bobby Car, while a Model Y two lanes to the right drives on.

Whether there was a collision is unclear, because the scene ends right after that. However, a CES visitor posted a similar one on Twitter, filmed from the passenger seat of the Tesla: according to the guest, the electric car, in FSD mode, approached a child dummy standing some distance away in the middle of its lane. A few metres in front of it, an Autopilot warning tone sounds, immediately followed by squealing brakes, then a dull thud and a loud "Damn!": the Tesla slowed significantly but could no longer stop in front of the dummy.



Luminar's own highlights video shows such dummy collisions only with brands other than Tesla. You can see an SUV from Subaru or Toyota, a large pickup, and a Mercedes electric car of the type EQS or EQE (see photo above), which appear to brake automatically before impact but still knock the small pedestrian figures out of their lane at fairly high speed. After CES 2022 there was a similar Luminar demo video, in which several such tests with a Tesla could be seen, filmed from inside and outside.



A Lexus with Luminar's built-in lidar system served as this year's counterexample. It braked in time not only for the artificial child in the middle of the lane but also for a tyre lying flat in the road. These tests were carried out during CES 2023 in bright sunshine and in pouring rain, a Luminar manager summarised, and regularly showed that the company's technology prevents accidents that could not be prevented with the others.

Volvo electric car with lidar in series​

According to a Marketwatch report, a car from SAIC Motors that integrates the lidar sensor into a roof module is already being sold in China. Luminar also showed a Volvo EX90 with its technology; the Swedish-Chinese brand's electric car is due to go into production this year with lidar as standard. For autonomous or safer driving in real traffic, Luminar also announced that it would use its sensors to create an exact 3D map of roads worldwide. By the middle of this decade its technology is expected to be installed in more than one million vehicles.

Copyright © 2023 · Teslamag.de
 
  • Like
  • Fire
Reactions: 13 users

Sirod69

bavarian girl ;-)
Tim Llewellynn
CEO/Co-Founder of NVISO Human Behaviour AI | President, Bonseyes Community Association | Coordinator, Bonseyes AI Marketplace | IBM Beacon Award Winner
13 minutes ago


#chatgpt is back with a Jan 9th update!

Where #chatgpt has "learnt" technical #documentation on, say, AI development, it provides amazing explanatory analysis if you know what you want to do and ask for it.

It is 100x faster than searching Google.

If I were #google I’d be very very worried at the moment.

This will be a game changer for small businesses' #productivity and #startups - productivity gains will go through the roof.
 
  • Like
  • Fire
  • Love
Reactions: 24 users

buena suerte :-)

BOB Bank of Brainchip
When is the next quarterly being released? Anyone got the date?
Probably 30th/31st Jan (y)
 
  • Like
  • Fire
  • Love
Reactions: 13 users

AusEire

Founding Member. It's ok to say No to Dot Joining
  • Like
  • Fire
  • Love
Reactions: 23 users

IloveLamp

Top 20
  • Like
  • Thinking
Reactions: 26 users
A bit of an insight into what Mercedes Benz are planning:


"The EQ name – which stands for 'electric intelligence' – was first adopted by Mercedes-Benz in 2016 with the unveiling of the Generation EQ concept car, which went into production as the EQC SUV in 2018.

Now, according to Handelsblatt, Mercedes-Benz has a new generation of electric models coming from 2024 which could trigger the decision to retire the EQ name."
 
  • Like
  • Fire
Reactions: 11 users

buena suerte :-)

BOB Bank of Brainchip
  • Like
  • Haha
  • Love
Reactions: 31 users

misslou

Founding Member
Anyone that is pushing for FF to return might want to have a go at trying to do what he was doing for a time and see how long they last.

He is exceptional at using facts to express his conviction gained from his research while always urging readers to fact check for themselves and draw their own conclusions. Never misleading but always encouraging, explaining and providing valuable leads.

When we get overrun with users with manipulative, deceitful intentions, he is the one who takes large amounts of time out of his own day at his own expense, to use his skill set to expose the flaws in their statements and we lazily benefit by simply reading his response to the original post we ignored.

I don’t know how bad it still is because I now have so many users on ignore and I’m not sure if they work on the weekends.

But he had enough and quite rightly so, we would have to if we’d been putting in half as much effort, and should be respected to take as much time as he wants before he returns, should he ever decide to do so.
 
  • Like
  • Love
  • Fire
Reactions: 71 users

Onboard21

Member
  • Like
Reactions: 2 users

Sam

Nothing changes if nothing changes
Well said,

I was wondering why you haven’t been liking my comments 😂
 
  • Haha
  • Like
Reactions: 4 users

LM77

Member
Hi Dio, are you saying, opposed to your initial thought, it is possible that Cerence is a friend instead of a foe? Because I was wondering @Bravo if the large consulting firm that Cerence used to vett the OEMs might have been our recent podcast mate Accenture 🫢
I was thinking along the same lines as you re Cerence using Accenture, and whilst most definitely not definitive proof, the chairman of Cerence, Arun Sarin, happens to also be the chairman of Accenture.

 
  • Like
  • Fire
  • Love
Reactions: 29 users

Mea culpa

prəmɪskjuəs
Anyone that is pushing for FF to return might want to have a go at trying to do what he was doing for a time and see how long they last.

He is exceptional at using facts to express his conviction gained from his research while always urging readers to fact check for themselves and draw their own conclusions. Never misleading but always encouraging, explaining and providing valuable leads.

When we get overrun with users with manipulative, deceitful intentions, he is the one who takes large amounts of time out of his own day at his own expense, to use his skill set to expose the flaws in their statements and we lazily benefit by simply reading his response to the original post we ignored.

I don’t know how bad it still is because I now have so many users on ignore and I’m not sure if they work on the weekends.

But he had enough and quite rightly so, we would have to if we’d been putting in half as much effort, and should be respected to take as much time as he wants before he returns, should he ever decide to do so.
Thank you so much for posting your thoughts on Fact Finder @misslou You have stated precisely what I was struggling to put together myself. I have had reservations about 2 or 3 posters who seem to have been lurking and/or asking inappropriate personal questions of others regarding their shareholdings. They seem to have become more active since FF took leave. I'm sure he had identified them for what they are. I don't have them on Ignore however, their narrative and pretence is remarkably similar to known creeps at the other place.

Cheers.
Mc.
 
  • Like
  • Love
  • Fire
Reactions: 25 users

BaconLover

Founding Member
I cannot see any manipulative, deceitful posts.

But please, if someone sees it, can we report it?
Thanks.
 
  • Like
  • Fire
  • Love
Reactions: 26 users

Build-it

Regular
I wonder if we will sign up to the rebranded IBM partner program or be requested to.

IBM has invested $1 billion into our partner ecosystem. We want to ensure that partners like you have the resources to build your business and develop software for your customers using IBM’s industry-defining hybrid cloud and AI platform. Together, we build and sell powerful solutions that elevate our clients’ businesses through digital transformation.

Screenshot_20230115-095058_Samsung Internet.jpg


Interesting no mention of Apple as their biggest partners.
Over five years ago, Apple and IBM joined forces to redefine how work gets done at enterprises across the world. Power met simplicity. Security met flexibility.

No doubt IBM want/need to be a force within the Ecosystem.

And we are aware of the PVDM connection and the Fact that Tim Llewellyn Is a IBM Beacon Award Winner.

Enjoying all the research that continues to be posted, it is exciting as RT would say.

Edge Compute.
 

Attachments

  • Screenshot_20230115-090019_Samsung Internet.jpg
    Screenshot_20230115-090019_Samsung Internet.jpg
    740.8 KB · Views: 56
  • Screenshot_20230115-093006_Samsung Internet.jpg
    Screenshot_20230115-093006_Samsung Internet.jpg
    570.3 KB · Views: 60
  • Like
  • Fire
  • Love
Reactions: 31 users

BaconLover

Founding Member
Ok, here's something I have wanted to talk about for a while.

Many of us love swimming in the ocean.
We know there are sharks in there.
We still do it. Because it is fun.

We love investing in disruptive companies like Brainchip.
We know there are sharks out there with all sorts.
We still do it. Because it can be life changing.

I think it is about time we accept this fact, and stop blaming others for what's happening on the market.

Yes, there could be manipulation, in every share on ASX which includes Brainchip.
But what can we do about this?

ASX is their ocean, and we are just swimming in it.
We need to be careful, look for warning signs, keep finger on the market pulse and act accordingly.

Once I realised this, I stopped the blame game on manipulation, because quite frankly, they helped me accumulate shares in around 6 accounts over the last few years. Thanks to the SP appreciation, around 90% of my main account is BRN shares.
Make use of the market conditions, accumulate if you can, sit out or stay put... but whatever you do, we need to realise manipulators alone do not control the market.

We all enjoyed the share price going to $2.34, if not for the shorts and manipulators we would not have reached that levels.
Not many people think about this.

We are adults and we need to take responsibility for our actions, and stop blaming others. If you make decisions to buy or sell, just because an up ramper or down ramper made comments on a public forum, you need to stay off the market for a bit, read up, prepare yourself psychologically and come back again later. We need to be able to hear both sides of arguments when investing, that is how we learn and grow. We cannot control how people invest, trade, or make money on the markets. We can only control our own actions and our behaviour towards the information we receive - how we receive this information and what we do with it.

There are more than 100 million shorts on BRN now, and all we need is one good Announcement to put a smile on our face.

TSE is still a great platform. I went to HC two days ago after a long time to compare, and it is still a cesspool. If you still visit it, you know what I am talking about. We have community moderation here; if you feel someone is making disparaging comments or down ramping, report it. Zeebot has created the platform but it is up to us to keep it sane. And from what I see, we do a great job.

I know many won't like this view, but it is another view and is mine.

Akida Ballista.
 
  • Like
  • Love
  • Fire
Reactions: 109 users

Dhm

Regular
I've mentioned Chris Ciovacco a few times before because he pulls together multiple stock market issues, together with a comprehensive array of charts. This weeks video is an excellent summary of where we have come from - macro wise - and paints a 'balance of probabilities' picture for a new bull market globally.

If you choose to watch, and I believe it is worth doing especially this week, put the Playback Speed at 1.5. If global markets are entering into a new bull phase, it will certainly help our prospects within the tech market.

 
  • Like
  • Love
  • Fire
Reactions: 27 users

skutza

Regular
I cannot see any manipulative, deceitful posts.

But please, if someone sees it, can we report it?
Thanks.
Nor can I. But again, anyone thinking someone on this forum can manipulate the SP needs to maybe rethink investing. If anyone was to sell their shares because of a post here, there would need to be hard evidence that the IP was fraudulent. That means the poster would be closer to the tech than Arm, SiFive, Edge Impulse, Nviso, Mercedes, MegaChips, Socionext, Prophesee, VVDN etc etc etc.....

Their knowledge would be greater than the minds of the best software developers at the above companies, plus so many more. So again, the thought that someone comes here without any real evidence (facts) and people believe it will destroy or kill the forum or the SP, well, to me it would only show weakness in the ability of the investor.

1673741852172.png
 
  • Like
  • Love
  • Fire
Reactions: 27 users
Top Bottom