BRN Discussion Ongoing

The pessimist in me worries that Ford has decided not to use Akida for self-driving.

The optimist in me thinks that Argo is likely utilising AKIDA (they have stated in the past that their systems are "continually learning"), which would have us in bed with not just Ford, but also the VW Group.

Fingers crossed it's the latter!
Here is something to drive a stake through the heart of your pessimist so that it never again raises its very, very ugly head.

In the Rob Telson and Jerome Nadel podcast, it was stated that there are 70 sensors in an automobile.

If you process these 70 sensors with non-AKIDA edge AI, the best-case outcome is that each will cost you 1 to 1.5 watts, so you will have a total draw of 70 to 105 watts.

Mercedes-Benz stated that AKIDA technology was 5 to 10 times more efficient than comparable products, so the 70 to 105 watts becomes 14 to 21 watts, or 7 to 10.5 watts, or some number in between.

You then achieve an additional saving in the GPU running the whole system, because it receives not the raw data but only the metadata, which needs no preprocessing, as that has already been done on-sensor by AKIDA.
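
To sanity-check that arithmetic, here is a minimal Python sketch. Only the numbers come from the posts above (the podcast's 70 sensors, the 1 to 1.5 watt figure, and Mercedes-Benz's 5 to 10 times claim); the names and structure are mine, purely illustrative.

```python
# Back-of-the-envelope check of the figures above; names and structure are
# illustrative only -- the numbers come from the podcast and Mercedes-Benz.
SENSORS = 70                  # sensors per automobile (Telson/Nadel podcast)
PER_SENSOR_W = (1.0, 1.5)     # watts per sensor with non-AKIDA edge AI
EFFICIENCY = (5, 10)          # Mercedes-Benz: AKIDA is 5-10x more efficient

conv_lo = SENSORS * PER_SENSOR_W[0]               # 70 W
conv_hi = SENSORS * PER_SENSOR_W[1]               # 105 W
print(f"Non-AKIDA total draw: {conv_lo:.0f}-{conv_hi:.0f} W")

for factor in EFFICIENCY:
    lo, hi = conv_lo / factor, conv_hi / factor   # divide by efficiency gain
    print(f"AKIDA at {factor}x: {lo:.1f}-{hi:.1f} W")
# Output:
# Non-AKIDA total draw: 70-105 W
# AKIDA at 5x: 14.0-21.0 W
# AKIDA at 10x: 7.0-10.5 W
```

However you slice it, the implied saving is somewhere between roughly 50 and 100 watts of continuous draw.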

It is incontestable that driving range in an EV can only be extended by ensuring the maximum percentage of the battery power is reserved for the driving wheels.

Blind Freddie knows that the sort of energy savings that AKIDA brings to the EV platform cannot be ignored.

I personally, along with Blind Freddie, have complete confidence that AKIDA technology will dominate the EV space on power savings alone, and this is before you add in all of its other unique patent-protected features.

The word ‘UBIQUITOUS’ is not being thrown around for nothing.

My opinion only DYOR
FF

AKIDA BALLISTA
 
Reactions: 66 users

Gemmax

Regular
When Sean our CEO highlighted the importance of honoring NDAs and said watch the financials instead, I don't believe he meant start watching them in a year's time.
Also, Sean stated at the AGM, words to the effect of: judge me on the financials at the next AGM. IMO
 
Reactions: 26 users

alwaysgreen

Top 20
Here is something to drive a stake through the heart of your pessimist so that it never again raises its very, very ugly head. […]
Thank you for absolutely murdering and dismembering the pessimist in me :ROFLMAO:
 
Reactions: 28 users
Shorts are sitting on 90 million+ and will hit 100 million next week if the average keeps up. They might be there already, as the 90 million figure is 4 days old. I can't remember BRN ever having this many shorts; it should make for one hell of a short rally if news drops. Are they starting to get desperate? There is not a lot of volume, and we know how quickly it can rise on volume. BRN just needs to light the fire.
It's pretty obvious it's being shorted as we speak (large red volume bars), as there is absolutely no reason to push down like that at the present time.

[attached chart]


It will indeed be a big bonfire if significant news drops.

I doubt very much that they are getting the "selling volume" they desperately need.

🔥🔥🔥
 
Reactions: 24 users
Just catching up on the goss as I've been busy.

Saw the convo around Merc and was surfing the new models.

Whilst I'm not sure Akida will be in any (like most commenters), I did find a couple of interesting things, as below, FWIW.



E Class

[attached screenshot]

Me Connect EQ

[attached screenshot]

New A Class MBUX 2

[attached screenshot]

GLE AMG

[attached screenshot]
 
Reactions: 41 users

Proga

Regular
From my experience, dealers generally have no idea. Certainly, unless they are on this forum, they would have no idea about the chips/processors used in the car.

Also, Mercedes has been pretty clear that AKIDA is only being implemented in the EQS. Any model prior will not have Akida. After, though...
AG,

Are you implying Akida will be used in the MBUX for all EQS models released recently, like the AMG EQS53? Try to find out (don't worry, just read the other 10 posts about it). I'm pretty sure it will be integrated in all models from 2024, just not the MBUX, which happens to match up with Nadel's timelines from FF's post:

“If we look at a five-year strategic plan, our outer three years probably look different than our inner two,” Nadel says. “In the inner two we’re still going to focus on chip vendors and designers and tier-one OEMs. But the outer three, if you look at categories, it’s really going to come from basic devices, be they in-car or in-cabin, be they in consumer electronics that are looking for this AI enablement. We need to be in the ecosystem. Our IP is de facto and the business model wraps around that.”
 
Reactions: 22 users

jk6199

Regular
Are shorters called that because if we get one good announcement, they’ll be hanging by their short and curlies?😳
 
Reactions: 11 users

Diogenese

Top 20
"Mercedes engineers worked with California-based artificial-intelligence developer BrainChip to create systems based on the company’s Akida hardware and software. Among other things, the technology makes the “Hey, Mercedes” voice control system in the EQXX five to ten times more efficient than conventional voice control"
Hi DB,

You've sent me on a trip down memory lane:

BrainChip Announces: Unsupervised Visual Learning Achieved (2016)​


https://brainchip.com/brainchip-announces-unsupervised-visual-learning-achieved/

ALISO VIEJO, CA — (Marketwired) — 02/23/16 —
BrainChip Holdings Limited (ASX: BRN), developer of a revolutionary new Spiking Neuron Adaptive Processor (SNAP) technology that has the ability to learn autonomously, evolve and associate information just like the human brain, is pleased to report that it has achieved a further significant advancement of its artificial intelligence technology.

The R&D team in Southern California has completed the development of an Autonomous Visual Feature Extraction system (AVFE), an advancement of the recently achieved and announced Autonomous Feature Extraction (AFE) system. The AVFE system was developed and interfaced with the DAVIS artificial retina purchased from its developer, Inilabs of Switzerland. DAVIS has been developed to represent data streams in the same way as BrainChip’s neural processor, SNAP.

Highlights


  • Capable of processing 100 million visual events per second
  • Learns and identifies patterns in the image stream within seconds — (Unsupervised Feature Learning)
  • Potential applications include security cameras, collision avoidance systems in road vehicles and Unmanned Aerial Vehicles (UAVs), anomaly detection, and medical imaging
  • AVFE is now commercially available
  • Discussions with potential licensees for AVFE are progressing
AVFE is the process of extracting informative characteristics from an image. The system initially has no knowledge of the contents of an input stream. The system learns autonomously by repetition and intensity, and starts to find patterns in the image stream. BrainChip’s SNAP learns to recognize features within a few seconds, just like a human would when looking at a scene. This image stream can originate from any source, such as an image sensor like the DAVIS artificial retina, but also from other sources that are outside of human perception such as radar or ultrasound images.

In traditional systems, a computer program loads a single frame from a video camera and searches that frame for identifying features, predefined by a programmer. Each section of the image is compared to a template until a match is found and a percentage of the match is returned, along with its location. This is a cumbersome operation.
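
As an illustration (mine, not from the announcement), the brute-force search described above looks something like this in Python/NumPy; the nested scan over every window position is what makes it cumbersome:

```python
import numpy as np

def template_match(frame: np.ndarray, template: np.ndarray):
    """Slide `template` over every position in `frame` and return the best
    match location plus its normalised cross-correlation score."""
    fh, fw = frame.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = float(np.linalg.norm(t)) or 1.0
    best_score, best_loc = -1.0, (0, 0)
    for y in range(fh - th + 1):            # every vertical offset...
        for x in range(fw - tw + 1):        # ...times every horizontal offset
            w = frame[y:y + th, x:x + tw]
            w = w - w.mean()
            score = float((w * t).sum()) / ((float(np.linalg.norm(w)) or 1.0) * t_norm)
            if score > best_score:
                best_score, best_loc = score, (y, x)
    return best_loc, best_score             # location and "percentage of the match"

# A 1920x1080 frame against one 64x64 template is ~1.9 million window
# comparisons -- per template, per frame, whether or not anything changed.
```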

An AVFE test sequence was conducted on a highway in Pasadena, California for 78.5 seconds. An average event rate of 66,100 events per second was recorded. The SNAP spiking neural network learned the features of vehicles passing by the sensor within seconds (see Figure 1). It detected and started counting cars in real time. The results of this hardware demonstration shows that SNAP can process events emitted by the DAVIS camera in real time and perform unsupervised learning of temporally correlated features.
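
The quoted figures are easy to sanity-check; a quick sketch (the numbers are from the announcement, the variable names are mine):

```python
DURATION_S = 78.5         # length of the Pasadena test sequence, seconds
AVG_RATE = 66_100         # recorded average event rate, events/second
CAPACITY = 100_000_000    # claimed SNAP capability, events/second

print(f"Events in the test run: {DURATION_S * AVG_RATE:,.0f}")   # 5,188,850
print(f"Headroom vs capability: {CAPACITY / AVG_RATE:,.0f}x")    # ~1,513x
```

So the highway test used only a tiny fraction of the claimed event-processing capability.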

AVFE can be configured for a large number of uses including surveillance and security cameras, collision avoidance systems in road vehicles and Unmanned Aerial Vehicles (UAVs), anomaly detection, medical imaging, audio processing and many other applications.

Peter van der Made, CEO and inventor of the SNAP neural processor, said, “We are very excited about this significant advancement. It shows that BrainChip's neural processor SNAP acquires information and learns without human supervision from visual input. The AVFE is remarkable and capable of high-speed visual perception and learning that has widespread commercial applicability.”

It hasn't been all plain sailing. Just a couple of years ago, we were flying by the seat of our pants:

https://s3.amazonaws.com/content.st...f/Product+Development+and+Business+Update.pdf

Sydney, Australia – 28 February, 2019: BrainChip Holdings Ltd (ASX: BRN), the leading neuromorphic computing company, today provided a product development and business update.

Akida product development
Since inception BrainChip has been committed to providing an artificial intelligence solution as an integrated circuit. The Company’s acquisition of Spikenet Technologies in September of 2016 has provided software validation of a spiking neural network (SNN) specific to image processing. The Spikenet research and engineering team have proved invaluable in the area of image processing and have provided significant insight for the development of Akida.

The Company announced the Akida Development Environment (ADE) and Architecture in the fourth quarter of 2018. The ADE allows users to fully simulate the implementation of the Akida IC and determine benchmark performance in terms of accuracy and power consumption in AI Edge applications. Importantly, this allows OEM equipment design prior to the IC introduction.

The development of Akida is proceeding well, with refinements from inputs of early access potential customers. The Company has implemented the Akida NPC in a Field Programmable Gate Array (FPGA) for internal use in evaluation. This is an important step in the process of developing a complex IC as it provides verification of the logic design, thereby improving prospects for a successful implementation prior to incurring manufacturing expenses.

The Company has determined that an Application Specific Integrated Circuit (ASIC) vendor that provides full services, from the layout of the design through all subsequent manufacturing processes, will be most cost effective, [### Socionext ###] reduce risk and accelerate time-to-market for Akida in AI Edge applications. The Company expects to select an ASIC vendor in the first quarter of 2019 and commence logic circuit design in the first quarter of 2019.

BrainChip has made great strides in the Akida design, creating a device that is very compact, flexible, provides low-latency and is low-power for AI Edge applications. The device can be user-configured for both convolutional and fully-connected networks applicable to a broad range of visual, data, and sensor applications. Akida will deliver up to 1.2 million neurons and over 10 billion synapses in a low-power chip, expandable to a far greater capacity by utilizing off-chip memory.

The details of the design are proprietary and are described in the Company’s currently pending provisional patent. The Company is working on a series of patents covering all aspects of the unique Akida design in detail.

Restructuring and Expense Control
The Company is implementing a restructuring and series of expense controls to focus resources primarily on the Akida product development.

With regard to BrainChip Studio, end-user engagement has provided the Company deep knowledge of customer expectations and insight regarding the human capital and sales process required to be successful. However, the Company underestimated the time and effort to support end-users and the time to achieve revenue from the ongoing trials. The insights gained from end-user engagement have reinforced the Company’s view that focusing on OEM customers provides many benefits including a significant reduction in direct cost and opportunity cost savings.

Taking these valuable learnings into account, the Company has shifted its focus from end-user sales to OEM relationships. This allows the primary focus of the organisation to be on the completion of the Akida IC. Because the Company’s success with OEM partners is highly dependent on their own success in marketing their platform, the Company intends to partner with those OEMs best able to bring the innovation of a low-power, high-accuracy spiking neural network to its customer base. The business model for BrainChip Studio includes license and revenue sharing while Akida includes product, license and royalty revenue.

With regard to restructuring and reduction in expenses, BrainChip Studio end-user sales and marketing roles will be eliminated, and discretionary spending will be reduced. In total, the changes implemented in this restructuring are expected to result in a decrease of 10% to 15% of overall planned spending. The restructuring will not affect research or engineering development resources. In addition, certain key management personnel have agreed to accept a temporary reduction in their salaries until such time as the board considers the Company to be in a position to revert to their current market-based remuneration.


The ending of Studio sales and marketing would not have eliminated their contractual obligations to support existing customers.

"Certain key management personnel have agreed to accept a temporary reduction in their salaries", and to burn the candle at both ends for over a year, but I remember the furore over at the other place about the allocation of bonus shares - not to mention the sale of shares by PvdM and Anil.

When @TECH talks about the quality of the Brainchip team, this shows their mettle.
 
Reactions: 35 users
When Sean our CEO highlighted the importance of honoring NDAs and said watch the financials instead, I don't believe he meant start watching them in a year's time.
When do you think, then?
 
D

Deleted member 118

Guest
Shorts are sitting on 90 million+ and will hit 100 million next week if the average keeps up. […]

 
Reactions: 9 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Intriguing... Elbit's X-Sight helmet uses AI and Augmented Reality.

"It also continuously tracks the landscape to warn pilots of potential obstacles such as power lines, antennas and other interferences, without prior mapping". It also includes an "integrated array of sensor systems" and includes the "AI-powered mission computer for real-time data fusion and machine learning of obstacles and threats, and for running operational applications".

It also says here that it "enables air forces from around the world to integrate sensors made by different companies, as well as to use varying configurations and develop and install additional applications on the mission computer".

This is one for Tony Dawe, if he isn't onto it already! 🥳


Published: JULY 8, 2022 06:00
 
Reactions: 19 users

alwaysgreen

Top 20
AG,

Are you implying Akida will be used in the MBUX for all EQS models released recently, like the AMG EQS53? […]

I'm suggesting we won't see Akida in any Mercs this year, and probably not next year. 2024 will see it rolled out. All speculation on my part.

The car industry is notoriously slow at rolling out tech. Thankfully Mr Musk has forced them to up their game a little.
 
Reactions: 6 users
Hi DB,

You've sent me on a trip down memory lane: […]
That's one of the things I love about this Company: they aren't afraid, or too big, to change tack if needed, and are more than willing to put in the hard work to achieve success.

[attached image]


What happens, though, if the "genius" part of the equation is a fair whack more than 1%?
 
Reactions: 15 users
D

Deleted member 118

Guest
And I thought our 5% total shorted was a lot

 
Reactions: 5 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
This article is stacked full of interesting info.

It says"Sony, which has dominated the image sensor market for many years, has by its own calculation lost 10 percentage points of market share, dropping from 53% to 43% over the past two years.To reverse this decline, it has launched an aggressive campaign to expand capacity, upgrade its technology and diversify from smartphones into automotive, virtual reality and other applications"...HELLO AKIDA!

It also talks about Japanese camera maker Canon, who "announced that it had developed a new single-photon avalanche diode-type image sensor that can “see in the dark” – can take high-resolution color photos in the darkest night or in other low-light environments in which CMOS image sensors do not work as well". So we best hop into bedsies with Canon as well, if we aren't already!

Scott Foster July 7, 2022
 
Reactions: 24 users

I have a portfolio of once highly speculative stocks, and I was discussing with another poster recently offline how each of these stocks has derisked to the point where it is no longer likely they could fail. Like Brainchip, they are whens, not ifs, and the only question is how large the returns will be at the point of when.

They have all validated their platforms, proven their technology or found what they were looking for in barrels, but all remain pre-income.

The other similarity they all have is the persistent manipulation accompanied by relentless mindless trolling on HC.

As a straightforward retail investor, once a stock has reached the derisked stage there is absolutely no point in even looking at the share price, unless you decide you want to buy more or there has been a price-sensitive announcement, because a short-bot-manipulated price is not a true market indicator of the value of your investment.

The true value remains north of $1.50, probably around $1.70 at this point. The next IP licence will determine how far north of $1.70 we then head as we gap up, with north of $2.34 seeming reasonable.

My opinion only so DYOR
and ignore the shorts & bots
FF

AKIDA BALLISTA
 
Reactions: 60 users

robsmark

Regular
I have a portfolio of once highly speculative stocks, and I was discussing with another poster recently offline how each of these stocks has derisked to the point where it is no longer likely they could fail. […]
I don’t think anyone here will disagree with this post, FF; the question, though, is where is the next commercial agreement? We are more than 18 months into our commercialisation phase now and have had two signed contracts, the most recent almost eight months ago.
We can all see the potential here, that's why we're invested, but the SP does remain important. The reason it's getting manipulated by bots is this lack of formal progression. If new customers were jumping on the bus every month, would the bots be selling or buying? My guess is the latter.
You can argue that Renesas and MegaChips are selling on our behalf, and that is possible, but without revenue how are we as shareholders or potential investors to make informed investment decisions?
I think this company will do great things, and will one day see my family in a financial position the previous generation would have thought impossible, but nobody can deny that the current financial status from a SH perspective is grim. Shorts are higher than they've ever been, the SP is a third of what it was six months ago, with no new customers announced or revenue in sight... Can you blame the market opportunists for playing the game?
 
Reactions: 46 users
“WHEN DEPLOYED AT SCALE”

The most undervalued and least referred to words used by one of the great Automotive makers of the last 100 years when speaking about AKIDA technology solutions.

Argo, the creator of a yet-to-be-fully-approved autonomous vehicle platform, is valued at about US$12.4 billion, and Ford and VW have invested US$3.6 billion in Argo.

Does Argo have a place in medicine, white goods, security, space, military, IoT, 5G, 6G, toys, companion robots, retail, remote monitoring, heavy rail maintenance, etc., etc.?

My opinion only DYOR
FF

AKIDA BALLISTA
 
Reactions: 32 users
Just catching up on the goss as I've been busy. […]

Recent podcast by the CEO of Merc.

Haven't had a chance to listen yet, but skimmed the transcript.

That, and the podcast below.

Snipped a couple of bits, and whilst he mentions MBUX and "Hey Mercedes", he also speaks about some tech from the EQXX not being integrated for a couple of years... apparently.






[attached transcript excerpts]
 
Reactions: 24 users

Xray1

Regular
Also, Sean stated at the AGM, words to the effect of: judge me on the financials at the next AGM. IMO
Absolutely correct... let's hope it all starts off with this upcoming 4C.
 
Reactions: 12 users