BRN Discussion Ongoing

Boab

I wish I could paint like Vincent
Slim pickings I think, but the bit about the lessening of any blurring caught my eye.

 
  • Like
Reactions: 12 users

TECH

Regular
Brainchip Inc.....A global company

Brainchip Inc.....An International IP Supplier

Brainchip Inc.....World-Leader of Edge AI

Expect our company to receive some acknowledgements at the fast-approaching CES, not only from Mercedes Benz.

It's also important to acknowledge that we, BrainChip, have had a very busy year; many positive connections have been sealed with a handshake. In the coming year I personally expect to see much progress, with more companies joining us and products starting to appear in the marketplace. Over the next 24 months I also expect growth to really ramp up.

My opinion only.

Love Brainchip :love::geek:
 
  • Like
  • Love
  • Fire
Reactions: 75 users
A Christmas thought bubble.

Assume you are head of the foundry arm of a major technology company.

Assume you are considering the inclusion of revolutionary, one-of-a-kind neuromorphic technology IP from a small Australian-based technology company into your portfolio.

Would you simply accept the small Aussie company’s assurances as to what it does or would you require it to be internally tested and validated?

If as seems logical you would require it to be tested and validated internally would you:

a) Ask your Von Neumann compute experts to undertake the test and validation; or

b) Ask your Neuromorphic research arm under the control of a world renowned neuromorphic technology thought leader to undertake the test and validation.

Having most likely chosen option b), would you not also ask the Australian company to provide its claimed performance figures so that the testing and validation can be undertaken in some sort of context?

Assuming the end result is that the decision is favourable to the Australian company and its IP is to be included in your foundry IP portfolio, would you not also require, as part of the technical details you make available to customers interested in the IP, the results of benchmarking, so customers can weigh up the relative advantages and disadvantages of adopting any particular IP?

Once again, would you simply accept the benchmarking results from the Australian company, or would you ask your internal experts to verify that these performance comparisons are accurate?

Again I think it likely you would be reluctant to simply accept the word of the Australian company and would ask for confirmation of the benchmarks by your own internal experts.

So taking the above train of thought and applying it to a known fact, namely that the Australian company BrainChip has told shareholders it is working on benchmarking AKIDA, would it not seem likely that the results have been provided to Intel Foundry, and that Intel has also verified them prior to announcing the inclusion of the AKIDA IP?

This being so then could it not be that one part of the Rob Telson ‘more to come’ at CES 2023 might be the public release of the AKIDA technology benchmarking as verified by Intel???

My speculation only so DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 98 users

SERA2g

Founding Member
Slightly off topic, Merry Christmas all. Have kept the phone use to a minimum these past few days so have missed a bit on the forum but did check in once daily to read the top comment.

Appreciate the effort everyone puts in here, especially while we’re all on holidays!

Hoping everyone stays safe over the break and has a happy and healthy 2023.

May it be the year of akida.

Cheers!
 
  • Like
  • Love
  • Fire
Reactions: 45 users

SERA2g

Founding Member
Speck

“Speck™ is a fully event-driven neuromorphic vision SoC. Speck™ is able to support large-scale spiking convolutional neural network (sCNN) with a fully asynchronous chip architecture”

Edit to add: I'm not suggesting Speck contains Akida. It and its sister SoC, Xylo, have already been discussed here and are not Akida. Just found the paper interesting so shared the context with it.


4D1C8151-65E5-40B5-964B-BAFE882D6E91.png


Here’s a link to the paper supported by SynSense and IBM - https://www.frontiersin.org/articles/10.3389/fnins.2022.1068193/full

I find it interesting that a paper can be written on spiking neural networks but not reference Peter van der Made’s many peer reviewed works.
 
Last edited:
  • Like
  • Fire
  • Thinking
Reactions: 16 users
It might only be a small thing, but in the courtrooms of common law countries such as Australia it is considered very bad form to quote selectively from source documents, and doing so is generally treated as an attempt to mislead the court.

BrainChip, bless their little cotton socks, unlike a certain company who shall remain named above, have on their website a link to an EE Times article titled 'Cars that think like you'. Unlike that company, they do not edit the article to remove the parts that cover a competitor, nor rewrite the title to 'SynSense. Cars that think like you' so that it appears to say their own product creates a car that 'thinks like you', which both misleads the reader and plagiarises the catchphrase of Mercedes-Benz.

To address this shortfall in ethics I have extracted the balance of the article which the above named company saw fit to edit out and rename:

"BRAINCHIP AKIDA

The Mercedes EQXX concept car, debuted at CES 2022, features BrainChip’s Akida neuromorphic processor performing in–cabin keyword spotting. Promoted as “the most efficient Mercedes–Benz ever built,” the car takes advantage of neuromorphic technology to use less power than deep learning powered keyword spotting systems. This is crucial for a car that is supposed to deliver a 620–mile range (about 1,000 km) on a single battery charge, 167 miles further than Mercedes’ flagship electric vehicle, the EQS.

Mercedes said at the time that BrainChip’s solution was 5 to 10× more efficient than conventional voice control when spotting the wake word “Hey Mercedes”.

Neuromorphic Car Mercedes EQXX: Mercedes’ EQXX concept EV has a power efficiency of more than 6.2 miles per kWh, almost double that of the EQS. (Source: Mercedes)


“Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years,” according to Mercedes. “When applied at scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.”

“[Mercedes is] looking at big issues like battery management and transmission, but every milliwatt counts, and the context of [BrainChip’s] inclusion was that even the most basic inference, like spotting a keyword, is important when you consider the power envelope,” Jerome Nadel, chief marketing officer at BrainChip, told EE Times.

Nadel said that a typical car in 2022 may have as many as 70 different sensors. For in–cabin applications, these sensors may be enabling facial detection, gaze estimation, emotion classification, and more.

“From a systems architecture point of view, we can do it in a 1:1 way, there’s a sensor that will do a level of pre–processing, and then the data will be forwarded,” he said. “There would be AI inference close to the sensor and… it would pass the inference meta data forward and not the full array of data from the sensor.”

The idea is to minimize the size and complexity of data packets sent to AI accelerators in automotive head units, while lowering latency and minimizing energy requirements. With a potential for 70 Akida chips or Akida–enabled sensors in each vehicle, Nadel said each one will be a “low–cost part that will play a humble role,” noting that the company needs to be mindful of the bill of materials for all these sensors.

BrainChip Akida neuromorphic processor in car system
BrainChip sees its neuromorphic processor next to every sensor in a car. (Source: BrainChip)

Looking further into the future, Nadel said neuromorphic processing will find its way into ADAS and autonomous vehicle systems, too. There is potential to reduce the need for other types of power–hungry AI accelerators.

“If every sensor had a limited, say, one or two node implementation of Akida, it would do the sufficient inference and the data that would be passed around would be cut by an order of magnitude, because it would be the inference metadata… that would have an impact on the horsepower that you need in the server in the trunk,” he said.

BrainChip’s Akida chip accelerates spiking neural networks (SNNs) and convolutional neural networks (via conversion to SNNs). It is not tailored for any particular use case or sensor, so it can work with vision sensing for face recognition or person detection, or other audio applications such as speaker ID. BrainChip has also demonstrated Akida with smell and taste sensors, though it’s more difficult to imagine how these sensors might be used in automotive (smelling and tasting for air pollution or fuel quality, perhaps).

Akida is set up to process SNNs or deep learning CNNs that have been converted to the spiking domain. Unlike native spiking networks, converted CNNs retain some information in spike magnitude, so 2– or 4–bit computation may be required. This approach, however, allows exploitation of CNNs’ properties, including their ability to extract features from large datasets. Both types of networks can be updated at the edge using STDP — in the Mercedes example, that might mean retraining the network to spot more or different keywords after deployment.

Neuromorphic Car Mercedes EQXX interior

Mercedes used BrainChip’s Akida processor to listen for the keyword “Hey Mercedes” in the cabin of its EQXX concept EV. (Source: Mercedes)
Mercedes has confirmed that “many innovations”, including “specific components and technologies” from the EQXX concept car, will make it into production vehicles, reports Autocar. There is no word yet on whether new models of Mercedes will feature artificial brains."
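The 2- and 4-bit computation the article mentions for converted CNNs comes down to low-precision activation quantization. Here is a minimal sketch of a generic uniform quantizer to show what that bit-width trade-off looks like; this is my own illustration, not BrainChip's actual scheme:

```python
import numpy as np

def quantize_uniform(x, bits):
    """Uniformly quantize non-negative activations to the given bit width.

    Returns the dequantized values, i.e. what a low-precision
    accumulator would effectively be working with.
    """
    levels = 2 ** bits - 1                  # e.g. 3 non-zero levels for 2-bit
    scale = x.max() / levels if x.max() > 0 else 1.0
    q = np.round(x / scale)                 # integer codes in [0, levels]
    return q * scale

rng = np.random.default_rng(0)
acts = rng.random(1000)                     # stand-in for ReLU activations

for bits in (2, 4, 8):
    err = np.abs(acts - quantize_uniform(acts, bits)).mean()
    print(f"{bits}-bit mean abs error: {err:.4f}")
```

Each extra bit roughly halves the quantization error, which is why 4-bit conversion preserves noticeably more of the CNN's accuracy than 2-bit while still being far cheaper than full precision.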


You only get one chance in this world to be considered as ethical to the core of your existence and unfortunately this chance has been thrown away.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 50 users
Leaving ethics to one side you might also ask who is more fearful of their competition SynSense or Brainchip?

I think the evidence supports the conclusion that Brainchip stands in the marketplace ready to take on all competitors face to face, toe to toe.

Others skulk around in the shadows frightened of being compared to AKIDA technology OR of even admitting of its very existence.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Haha
Reactions: 44 users

equanimous

Norse clairvoyant shapeshifter goddess
Slim pickings I think, but the bit about the lessening of any blurring caught my eye.

I was thinking of LG earlier and to keep an eye out for CES 2023.

 
  • Like
  • Fire
  • Love
Reactions: 15 users

wilzy123

Founding Member
  • Haha
  • Like
  • Fire
Reactions: 20 users

Boab

I wish I could paint like Vincent
  • Like
  • Fire
  • Haha
Reactions: 16 users
And why would you not be.

The following is more than enough to get Big Kev going but as we know Socionext is not the only one:

"Advanced AI Solutions for Automotive

Partnering with BrainChip, Socionext utilizes Brainchip's Akida processor IP for real-time smart sensor data platforms leveraging customizable AI. The Akida provides neuromorphic, result-based computation ending with performance, a minimal silicon footprint, and a lower consumption of energy.

At CES, Socionext is displaying its 7nm and 5nm process technologies delivering automotive compliant SoCs enabling efficient safety as software development and system verification are ensured.

Have a gander at Socionext’s products and services at the CES Vehicle Tech and Advanced Mobility Zone, located in the Las Vegas Convention Center, North Hall, Booth 10654. CES runs from January 5-8, 2023."


I find it worthwhile to reflect on how excited we would have all been 2 years ago if the above announcement had been made as a one off achievement by Brainchip.

Today, though, it is but one part of a continuing narrative of engagement after engagement with major tech industry players including RENESAS, MEGACHIPS, ARM and INTEL, not to mention the major player in the automotive industry for over 100 years, MERCEDES BENZ. 😎😍😎😍😎😍😎

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 50 users
I am not sure if this article has been posted but with so many of Brainchip's recently announced partners pushing AKIDA technology in the automotive space such as RENESAS, ARM, SOCIONEXT, MEGACHIPS, PROPHESEE, MERCEDES BENZ AND NVISO for example in my opinion it has ramifications worth repeating:

"Auto chips seen as biggest revenue producer in ‘23: KPMG survey

By Matt Hamblen, Dec 15, 2022 12:41pm

A survey of 151 global chip executives put chips for autos at the top of the list of revenue producers in 2023. (Getty Images)
With a shortage of chips during the pandemic hurting the automotive sector especially hard alongside an explosion of investment in electric vehicles and self-driving cars, it might come as no surprise that chip executives are laser focused on chip revenues from the auto sector.


In fact, a new survey of 151 chip executives conducted by KPMG and the Global Semiconductor Alliance found the automotive sector will be the most important revenue driver in 2023, pushing wireless communications into second place from its previous top spot. Wireless chips have long been seen as a top revenue driver in many of the 18 years of the survey.


Other chip categories were further down the list, with metaverse ranked last out of 10 in importance for driving chip revenue in 2023. Internet of things, cloud computing and AI ranked third, fourth and fifth in terms of importance. More than half of GSA’s survey respondents work at companies with more than $1 billion in annual revenue, and about a third were US based.
KPMG predicts auto chip revenues will reach $200 billion by the mid-2030s and surpass $250 billion by 2040. Revenues are now about $50 billion in the auto category, but will see 8% annual growth, according to KPMG.
“The automotive industry has been at the short end of the stick from a supply chain perspective,” said Lincoln Clark, a partner and global semiconductor practice leader at KPMG US in an interview with Fierce Electronics. “How auto learns from that experience going forward [will be interesting] to watch as the need for chips is only going to expand with EV and cars, hybrid and electric.”

The auto industry needs a variety of chips and plenty of them. Most modern cars need more than 1,000 chips, a number expected to double as more self-driving features are introduced. Modern chips serve needs for electrical connections and power management, connectivity to wireless (including logic and memory chips) and ADAS (sensors, MCUs and GPUs).
“Auto OEMs and tier one suppliers have changed the way they think about semiconductors,” said Irene Signorino, KPMG managing director. “Before, chips were not a priority, but with the shortage they are thinking strategically.”
Clark also noted an emerging trend in the industry with carmakers bringing chip design and development in-house.
Other findings

Labor:
Finding engineers and other talent is the number one issue of chip executives, according to the survey.
“Demand for talent is very difficult and [chip companies] are competing with big tech,” Clark noted.
Despite the ongoing spate of high tech layoffs in late December, the survey—conducted in the fourth quarter—found 71% of respondents anticipate increasing their global workforce in 2023.
Chip shortage: 65% said the chip supply shortage will ease in 2023, while 20% think it will last into 2024 or later. 15% see supply and demand in balance for most products.
Russia-Ukraine war: 29% see the war materially impacting the supply chain in 2023, down from 39% in a shorter survey done in May 2022."
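The KPMG projection quoted above (roughly $50B today, 8% annual growth, $200B by the mid-2030s) can be sanity-checked with simple compound growth. The function below is my own arithmetic, not part of the survey:

```python
def project(revenue_bn, rate, years):
    """Project revenue forward under constant compound annual growth."""
    return revenue_bn * (1 + rate) ** years

base = 50.0   # $B, approximate current auto chip revenue per the article
rate = 0.08   # 8% annual growth per KPMG

by_2035 = project(base, rate, 13)   # 2022 -> 2035
by_2040 = project(base, rate, 18)   # 2022 -> 2040
print(f"2035: ~${by_2035:.0f}B, 2040: ~${by_2040:.0f}B")
# At a flat 8%, $200B only arrives around 2040 (~$136B by 2035),
# so the article's mid-2030s headline implies growth somewhat above 8%.
```

A useful reminder that long-range forecasts like these bundle together growth-rate and endpoint claims that do not always reconcile exactly.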

Can this constant positioning of Brainchip and AKIDA technology in the right place at the right time simply be occurring as a result of dumb luck or do those in charge at a Board and Key Management level actually deserve credit.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 44 users
Yet when I show my family links regarding the above information all I get is the sound of crickets... I then proceed to take their pulse just to check if they are still alive.😐
 
  • Haha
  • Like
  • Love
Reactions: 32 users

scanspeak

Regular
Apologies if already posted.
 
  • Like
  • Fire
  • Love
Reactions: 5 users
"hope of a decent turn over the coming months.."

Keen to know the signs you're seeing because this can be said by anyone without looking a chart too.

And there are no guarantees on the market. Market could side trend for years or go down before turning so everything is a possibility. The second paragraph sounds a bit confident, but market has a funny way of humbling the best of traders and investors.
The Macro I have cited the resources to follow up on if interested..

The charts are done from a trading point of view.. And I know how much people on public forums love traders 😏.

However, if interested these were done from the point of view of long entries and some supply and demand technical analysis points of view..

Some people may benefit from seeing where there are good buying inflection points and when moves are likely to work better..

As is often the case, earnings increasing quarter on quarter and year on year from a company's early stages of revenue growth is where the large gains can be made.

In any event, I hope some may find some value in these below charts tracing back from listing as BRN after the Aziana Takeover..

View attachment 25373

View attachment 25374

View attachment 25375
⬆️⬆️ No text here. However highlighting that moves of over 50% retracements from a peak often mean lower prices are coming.
Three big waves down over 2 years and notice how it really flattens into Q1 2020
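The 50%-retracement rule of thumb above can be made concrete with a small helper. The prices here are hypothetical, for illustration only, not a reading of the actual BRN chart:

```python
def retracement_pct(swing_low, peak, pullback_low):
    """Percentage of the prior advance given back by the pullback."""
    advance = peak - swing_low
    if advance <= 0:
        raise ValueError("peak must be above the swing low")
    return 100.0 * (peak - pullback_low) / advance

# Hypothetical: rally from $0.40 to $2.30, then pullback to $1.20
print(f"{retracement_pct(0.40, 2.30, 1.20):.1f}%")  # ~57.9%, deeper than 50%
```

Anything over 100% means the pullback has taken out the original swing low entirely, i.e. the whole advance has been unwound.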

1672119019624.png


Correction**silicon on chip ⬆️⬆️

1672119099028.png

**Fractal meaning to me is that short term time frame under 4-5 weeks where the price forms a descending wedge type of shape
1672119132222.png

**RHS “Right hand side” basically meaning consolidation.

1672119248046.png

**V moves or spike up moves which often fail unless they come after a constructive base.. ⬆️⬆️

1672119479087.png


Hope you can read the print here. There’s a lot of valuable information in this chart. Namely trendline with the use of the 20EMA moving average as a reliable inflection point.
Then the chart breaks out of its 1yr base, or two if you count from the Sep 2020 move.. I kind of feel that this eventual move to $2.30+ would have held if BRN was at royalty revenue stage which is what makes the next big move all the more important ⬆️⬆️

1672119905584.png


⬆️⬆️ This takes us to the present. We have signs of price flattening out and volume retraction again. The move from late November to present is the first sign for a turn back up, which is a low base fractal. Now we want to see it push above its 20EMA.. We know the buying and selling is reacting to quarterly and half year reports.. The chart is telling me we may get some sort of a move shortly in anticipation of more licensing agreements or news relating to tape outs and or progress toward revenue increases.. I would expect a price rise before it is in the news and reports.. As per those trend lines if the price is under those 20EMAs (blue line), then I draw another trend-line and start again..
**************

We have nearly 12 months of consolidation here.. It took 14-15 months to get some decent moves previously pre earnings, so any moves like we had into Sep 2020 and Jan 2022 are a chance to push a lot higher with what we hope is increasing earnings assisting…

Something to absorb for those who are interested in buying inflections and or timing factors..
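For anyone wanting to reproduce the 20EMA line used throughout these charts, the standard exponential moving average recurrence is straightforward. This is the generic textbook formula run over made-up closing prices, not BRN data:

```python
def ema(prices, period=20):
    """Exponential moving average with smoothing factor k = 2 / (period + 1)."""
    k = 2 / (period + 1)
    out = [prices[0]]                         # seed with the first close
    for p in prices[1:]:
        out.append(p * k + out[-1] * (1 - k))
    return out

closes = [1.00, 1.02, 0.98, 1.05, 1.10]       # made-up closing prices
print([round(v, 4) for v in ema(closes, period=20)])
```

Because each new value only blends the latest close with the previous EMA, the line reacts faster than a simple moving average of the same period, which is why it gets used as a trend inflection marker.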
 
  • Like
  • Fire
  • Love
Reactions: 46 users
Start with these. I think the 2015 to 2020 charts were cut, so the three below are the intended first three charts. They are important for context.

1672120881694.png


1672120899701.png


1672120914940.png
 
  • Like
  • Fire
  • Love
Reactions: 21 users

scanspeak

Regular
Learn how to get some hands on real-world experience with A.I.
 
  • Like
  • Love
Reactions: 7 users
  • Like
Reactions: 5 users

chapman89

Founding Member
Long term mindset always wins!
DDA9BFE4-63FD-47C6-950E-EB03B52B4239.jpeg
 
  • Like
  • Love
Reactions: 34 users

Deadpool

hyper-efficient Ai
Yet when I show my family links regarding the above information all I get is the sound of crickets... I then proceed to take their pulse just to check if they are still alive.😐
Don't worry mate, they will probably resemble this in the not too distant future, and be calling you the messiah.😇


.
200.gif
 
  • Haha
  • Like
  • Wow
Reactions: 11 users