BRN Discussion Ongoing

BrainChip founder and CTO Peter van der Made presented at Stocks Down Under’s Semiconductor Conference on 30 November 2021.

So if you’re looking at our products, the Akida 1000 is the major product, which is the chip. The neural multiprocessor mimics…and I’ll look it up in the dictionary, it mimics neurobiological architectures present in the nervous system. So it doesn’t work like a processor that is applied in your laptop, it works like a mini-brain. It is very good at recognizing things: recognizing smells, recognizing images, recognizing tastes. We have done all these things as examples and made them available to the market. We also have an artificial retina image-to-spike converter. What the eye does is take images and convert them into spikes, which the brain then processes. We do something very similar with our artificial retina image-to-spike converter.

The chip is up to 10 times more energy-efficient than any other edge AI processor, except if you’re looking at very tiny things that can’t really be used for anything. But anything that is in the same class as the BrainChip Akida processor is using at least 10 times more energy. And it’s up to a thousand times more energy-efficient than GPUs.
Thank you @ BienSuerte

This mini panic by some is but one more example of why the CEO, Sean Hehir, went to lengths at the AGM to educate shareholders about the sad fact that out there many, many companies are making false claims about their efficiency which are in no way, shape or form close to the power and performance advantages offered by AKIDA.

Given the statement today by Luca Verre, CEO of Prophesee, TWO more pretenders have been exposed.

My opinion only DYOR
FF

AKIDA BALLISTA
 
Reactions: 42 users

MDhere

Regular
But I don't think we are limited to the Hey Mercedes. I read that they have sensors that only act on spikes, which saves car energy, so basically I believe that the smart sensors in the vehicles will be equipped with Akida. That's my thoughts and I'm sticking to it.
 
Reactions: 23 users

TechGirl

Founding Member
We have all seen the Prophesee partners page ....
We are integral to their solution moving forward ... hence, we will be integrated with their partners in delivering commercial solutions.
These guys & MegaChips ... why is the SP still under $1.00 .... won't be for long !


Plus lets not forget Prophesee's new partners


Prophesee Showcases the Solutions and Partners Driving Event Based Vision at VISION 2022​


The 2021 VISION Award winner is back with a powerful suite of hardware and software sensing solutions and new announcements with its fast-growing partner ecosystem, including Century Arks, CIS, Datalogic, Framos, Lucid and MVTec.

iCatch-Prophesee

PARIS – October 3, 2022 – The full extent of the power and accessibility of Prophesee’s advanced neuromorphic vision systems will be on display at the 2022 VISION show, with an array of technology showcases, partner demonstrations and live interactions that showcase the company’s breakthrough event-based vision approach to machine vision.

Prophesee will have in-depth product demonstrations at its Booth 8B29 at the premier industry event happening October 4-6 in Stuttgart, Germany.

Prophesee will also deliver an overview of its event-based platform on Wednesday, 6 October at 12:20 (Hall 8 Booth C70), entitled “The World Between Frames; Event Camera Come of Age.”

Secure your private meeting today
for in-depth insights from our experts: http://prophesee.ai/meet

Prophesee’s Metavision® platform has gained traction with leading developers of machine vision systems in industrial automation, robotics, automotive and IoT. Its event-based vision approach significantly reduces the amount of data needed to capture information. Among the benefits of the sensor and AI technology are ultra-low latency, robustness to challenging lighting conditions, energy efficiency, and low data rate.

The company’s breakthrough sensors, accompanying development tools, open-source algorithms and models are at the foundation of several announcements and demonstrations at the show, including:

  • CENTURY ARKS: Century Arks announces it will release mid-October the SilkyEvCam HD, first commercial Event-based Vision camera featuring the IMX636MP sensor realized in collaboration between Sony and Prophesee.
  • CIS: Prophesee announces collaboration with CIS, a leading machine vision provider, to build the first Event-Based 3D sensing evaluation platform, leveraging advanced VCSEL technology: https://www.prophesee.ai/2022/10/03/cis-prophesee-structured-light-evaluation-kit/
  • Datalogic: A global technology leader in the automatic data capture and factory automation markets begins landmark partnership with Prophesee to bring the performance and efficiency of neuromorphic vision to its industrial products: https://www.prophesee.ai/2022/10/03/datalogic-prophesee-event-camera-partnership/
  • Framos: A leading global supplier of imaging products, custom vision solutions and OEM services is releasing their brand new Event-Based Vision development kit based on NVIDIA Jetson Xavier, featuring IMX636 sensor realized in collaboration between Sony and Prophesee.
  • Lucid Vision Labs: Following early prototype announcement at VISION 2021, Machine Vision leader Lucid announces commercial availability of its new Triton EVS, featuring the Prophesee Metavision sensor inside.
  • MVTec: Prophesee and MVTec Partner to Support Integration of Prophesee Event-Based Metavision® Cameras with MVTec HALCON Machine Vision Software. https://www.prophesee.ai/2022/10/03/mvtec-prophesee-halcon-integration-partnership/
“We are very pleased to see the fast-growing ecosystem around event-based vision and the variety of applications this powerful approach to machine vision can be used for. We have made great strides in the Machine Vision world even from last year when we were named best in show by VISION, and this year we shine the spotlight on not just our innovations but the broad range of use cases being enabled by our partners around the world,” said Luca Verre, CEO and co-founder of Prophesee.

ABOUT PROPHESEE


Prophesee is the inventor of the world’s most advanced neuromorphic vision systems.
The company developed a breakthrough Event-Based Vision approach to computer vision. This new vision category allows for significant reductions of power, latency and data processing requirements to reveal what was invisible to traditional frame-based sensors until now. Prophesee’s patented Metavision® sensors and algorithms mimic how the human eye and brain work to dramatically improve efficiency in areas such as autonomous vehicles, industrial automation, IoT, security and surveillance, and AR/VR.
Prophesee is based in Paris, with local offices in Grenoble, Shanghai, Tokyo and Silicon Valley. The company is driven by a team of more than 100 visionary engineers, holds more than 50 international patents and is backed by leading international equity and corporate investors including 360 Capital Partners, European Investment Bank, iBionext, Intel Capital, Prosperity7 Ventures, Robert Bosch Venture Capital, Sinovation, Supernova Invest, Will Semiconductor, Xiaomi.
Learn more: www.prophesee.ai
 
Reactions: 38 users

buena suerte :-)

BOB Bank of Brainchip
Thank you @ BienSuerte

This mini panic by some is but one more example of why the CEO, Sean Hehir, went to lengths at the AGM to educate shareholders about the sad fact that out there many, many companies are making false claims about their efficiency which are in no way, shape or form close to the power and performance advantages offered by AKIDA.

Given the statement today by Luca Verre, CEO of Prophesee, TWO more pretenders have been exposed.

My opinion only DYOR
FF

AKIDA BALLISTA
Thought I would post the complete transcript as it makes for a great read .... cheers FF

December 1, 2021
Brainchip, BRN

Summary​

BrainChip founder and CTO Peter van der Made presented at Stocks Down Under’s Semiconductor Conference on 30 November 2021.
See a full transcript of the presentation and of the Q&A session below.


Transcription​


Marc: All right. So the last presentation is from Peter van der Made. If I could invite Peter up to the stage, please. Peter is a fellow Dutchman I’m glad to say. So there’s a lot of semiconductor expertise actually in the Netherlands. Historically, he’s come from Philips Electronics and from their Philips Semiconductors and ASML so there’s a lot of expertise there. And Peter, good morning to you.
Peter: Good morning, Marc. How are you doing?
Marc: Pretty good. Pretty good. Thank you for joining us today. I was just talking about our background in the Netherlands in the semiconductor space. I suspect your background is potentially from Philips Semiconductors as well. But like I said, there’s a very big ecosystem and infrastructure in the Netherlands around this. A lot has happened for BrainChip recently including a very nice announcement last week. So without further ado, I’ll hand the floor over to you.
Peter: Yes. Thank you, Marc. It’s really good to have the opportunity to tell you about all the exciting things that are going on at BrainChip. I’m not sure if you can see our slides at the moment. Is that on, Marc? There they are. Yes. So let’s go to the next slide. Where are we? There was a button somewhere to flip. There they are.
So this is our statement, our disclaimer, which says that we did our best to bring you the best possible statements about our company, but that some of the information comes from external sources that we cannot verify. You can read it at your leisure when you get our slides. So if we could start with the BrainChip mission statement. BrainChip was started back in 2004 with the objective to build safer, more sophisticated artificial intelligence processors. That’s exactly what we have done with the introduction of our Akida 1000. So with the Akida 1000, we have accomplished our first step on this path, a commercial neural multiprocessor that is unique in the market. This component has been recently introduced. We are now building boards with this component and shipping those boards to our early access customers.
The component is very unique because of its very low power consumption, extremely low power consumption, which is good for the planet. Because if you look at current artificial intelligence done the computing way, where you’re computing the outcome, it generates enormous amounts of greenhouse gases. So if you look at the collected data centers of the world, it’s something around 600 megatonnes of greenhouse gases. And Akida 1000, if you look at a single instance, Akida 1000 is estimated to be about 97% to 99% more energy-efficient than cloud processing. Or even if you look at GPUs, they are very power-hungry, using up to a thousand times more power than Akida actually does. So we have also looked at healthcare. We believe that we can make a big difference in the healthcare field, where our sensors pick up, for instance, [inaudible 00:05:04] blood, VOCs in breath and other medical samples, [inaudible 00:05:10] classified faster and more accurately than any other method that is currently around. And even learn on the chip.
We also looked at safer workplaces, where Akida can eliminate risk in the workplace by monitoring and classifying air quality, automated quality control, visual classification, and tactile sensing. MetaTF is our development platform. MetaTF plugs into TensorFlow. It’s easy to use. Anybody who knows how to use TensorFlow will be able to use MetaTF. It’s self-contained, it’s fast, [inaudible 00:05:51] effort response and an API that’s easy to use. So it provides for on-chip learning and training, and a seamless environment for any developer that understands TensorFlow.
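
For anyone who hasn't used it, here is a minimal sketch of what a TensorFlow-to-Akida workflow of that kind typically looks like. The MetaTF function names in the comments (cnn2snn quantize/convert) are my assumption based on BrainChip's public documentation and may differ between releases, so treat it as an illustration rather than a definitive recipe.

```python
# Minimal sketch of a TensorFlow-to-Akida (MetaTF-style) workflow.
# The cnn2snn calls below are commented out: names and arguments are assumed
# from BrainChip's public MetaTF docs and may differ by release.
from tensorflow import keras

# 1. Build and train an ordinary Keras model, exactly as a TensorFlow user would.
model = keras.Sequential([
    keras.layers.Rescaling(1.0 / 255, input_shape=(28, 28, 1)),
    keras.layers.Conv2D(16, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Flatten(),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# model.fit(train_images, train_labels, epochs=5)   # train as usual

# 2. Quantize the trained model and convert it to an event-domain Akida model:
# from cnn2snn import quantize, convert
# quantized_model = quantize(model, weight_quantization=4, activ_quantization=4)
# akida_model = convert(quantized_model)

# 3. Run inference on the Akida simulator or an AKD1000 device:
# predictions = akida_model.predict(test_images)
```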
So a little bit about the background of BrainChip and what we are doing at the moment. At the moment we have development centers in two places in the world. We are in Perth in Australia with the Research Institute. We have a software development and testing center in Toulouse, in France. We have chip design and sales and engineering in Southern California. And our corporate headquarters in Sydney. We do the assembly and layout of the chip in Japan. So the current situation of the BrainChip share price on the 25th of November was 63 cents with a market cap of just over a billion dollars. We have 1.8 billion shares outstanding, with 1.3 billion in float. And as management hold just under 20% of the company. We are growing rapidly. We are employing people at the moment, especially in sales and marketing, and looking at about 70 people with our, as I mentioned, headquarters in Sydney.
This is our management and board. Between the five people in the management team, we have something like 180 years of experience. Rob, he’s our sales and marketing guy. He is ex-ARM. Ken, CFO. And he is ex-Virgin Galactic. So really reaching out into space. Anil is our chief design and chief development officer. Anil must have made at least a hundred chips in his life. You know a little bit about my background. This is the third company I’ve started. The first company, PolyGraphics, did high-resolution graphics back in the early 1980s. I then went on to teaching at university for a while. And then after that, I created a new company called vCIS, which was later purchased by ISS and later by IBM. All the patents of that company are now in the hands of IBM. And Sean, who just joined us, spent many years with HP and other large organizations. So he is a seasoned executive in sales and marketing and in building large companies out of early beginnings. And that’s why we focused on Sean when we looked at candidates for CEO.
Our board of directors, I’m also part of the board of directors. Many, many years in the semiconductor industry, in the finance area. Geoff, ex-Commonwealth Bank. And Antonio with many years in the industry as well. So we have a very seasoned team. Also, the people who are working for us are extremely well experienced. Oops [inaudible 00:09:32]. If you’re looking at our financial position at present for a burn rate, BrainChip has sufficient to actually run for two years. Of course, our burn rate is going to change over these years. We’re going to employ more people. But at the same time, we are counting on an explosion in sales. We’re looking forward to an explosion in sales on that Akida 1000 chip and its modules, and also the IP. We just signed up, as you may have seen, MegaChips, which is a large organization that makes chips for many different applications.
We also finalized the American depositary receipts, which makes BrainChip available to U.S. institutional investors. And we have upgraded our listing in the United States to the 2X level, which means that institutional investors can now invest in BrainChip in the United States. The company ended the previous quarter, as we have published in our listing, with almost 24 million in cash, that’s U.S. dollars, or 32 million Australian dollars.
Patent protection is very important. As I said, our technology is very unique. It’s been a 16-year development path. So we want to make sure that this technology is protected. And our first patent stems from 2008. We filed that same patent in 2007 in Australia. And the patent has been followed up by new patents. We actually have a number of patent engineers within the company now who are pursuing these patents, because in the last couple of months a lot of the patents that had been sitting around for a while have been granted. We have two patents that are allowed, patents [inaudible 00:11:47] seven and nine. We expect they’ll be granted within the next month.
We are building out our patent portfolio to go worldwide as we are marketing the chip and the technology IP worldwide. So therefore we have filed in all these different countries. And in the next year or so, we will see at least hundreds of these patents being filed in different countries, and subsequently sort of granted.
So if you’re looking at our products, the Akida 1000 is the major product, which is the chip. The neural multiprocessor mimics…and I’ll look it up in the dictionary, it mimics neurobiological architectures present in the nervous system. So it doesn’t work like a processor that is applied in your laptop, it works like a mini-brain. It is very good at recognizing things: recognizing smells, recognizing images, recognizing tastes. We have done all these things as examples and made them available to the market. We also have an artificial retina image-to-spike converter. What the eye does is take images and convert them into spikes, which the brain then processes. We do something very similar with our artificial retina image-to-spike converter.
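
To make the artificial retina idea a bit more concrete, below is a toy frame-differencing event encoder I put together for illustration. It is not BrainChip's actual converter; it only shows the general principle of emitting sparse "spikes" where pixel brightness changes, instead of streaming every pixel of every frame.

```python
import numpy as np

def frames_to_events(frames, threshold=15):
    """Toy image-to-spike encoder: emit an event only where a pixel's
    brightness changes by at least `threshold` between frames.
    Purely illustrative; not BrainChip's actual retina converter."""
    events = []                           # each event: (frame_index, row, col, polarity)
    prev = frames[0].astype(np.int16)
    for t in range(1, len(frames)):
        cur = frames[t].astype(np.int16)
        diff = cur - prev
        rows, cols = np.nonzero(np.abs(diff) >= threshold)
        events += [(t, r, c, 1 if diff[r, c] > 0 else -1) for r, c in zip(rows, cols)]
        prev = cur
    return events

# A bright square sliding across an otherwise static 32x32 scene:
frames = np.zeros((10, 32, 32), dtype=np.uint8)
for t in range(10):
    frames[t, 5:10, t:t + 5] = 200
events = frames_to_events(frames)
print(f"{len(events)} events versus {frames.size} raw pixel values")
```

The point of the example is the data reduction: only the pixels on the edges of the moving square generate events, which is why an event-based front end produces far less data than a conventional frame-based camera.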
The chip is up to 10 times more energy-efficient than any other edge AI processor, except if you’re looking at very tiny things that can’t really be used for anything. But anything that is in the same class as the BrainChip Akida processor is using at least 10 times more energy. And it’s up to a thousand times more energy-efficient than GPUs.
We also priced the chip for use in hand-held solar-powered equipment. So that at the [inaudible 00:14:01] at the dollar store for a chip, we priced this chip at a very competitive level, somewhere between $15 and $25 in quantities. The other point is real-time learning. We do real-time learning, which is unique in the market. We have a video, for instance, where we show a little elephant to the camera, Akida classifies it, and you can tell it it’s an elephant. We can then show it a picture of an elephant in the wild and it will recognize the elephant from any angle.
This is really unique in the market. Nobody else has been able to do this in a single shot. It learns very much like a child learns. Take a child and tell them it’s an elephant, and for the rest of its life, the child will know what an elephant looks like. You don’t have to show it a thousand examples like you do with [inaudible 00:14:54] deep learning.
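
The "learn it once" idea can be illustrated in a few lines of code. The sketch below is only a conceptual analogy, nearest-prototype classification over feature vectors, and is not a description of how Akida actually implements one-shot learning on-chip.

```python
import numpy as np

class OneShotPrototypeClassifier:
    """Conceptual illustration of one-shot learning: store a single labelled
    feature vector ('prototype') per class and classify new inputs by the
    nearest prototype. An analogy only, not Akida's on-chip mechanism."""

    def __init__(self):
        self.prototypes = {}              # label -> feature vector

    def learn(self, label, features):
        # One example is enough: remember it as the class prototype.
        self.prototypes[label] = np.asarray(features, dtype=float)

    def classify(self, features):
        features = np.asarray(features, dtype=float)
        return min(self.prototypes,
                   key=lambda lbl: np.linalg.norm(self.prototypes[lbl] - features))

clf = OneShotPrototypeClassifier()
clf.learn("elephant", [0.9, 0.1, 0.8])   # features from a single 'elephant' example
clf.learn("giraffe",  [0.2, 0.9, 0.3])
print(clf.classify([0.85, 0.15, 0.75]))  # -> "elephant"
```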
The products we’re building around Akida: of course, we have MetaTF as a development environment. We also have a USB stick. We have the M.2-class insert board and the PCIe board as a development kit. This includes free sample code with a development suite for visual, sound and vibration classification, all with very high accuracy.
So in comparison to the rest of the industry, if we’re looking at traditional AI versus neuromorphic AI: the power consumption of traditional AI is very high, while neuromorphic AI is very, very low. Learning in traditional AI is expensive and time-consuming. It takes weeks to train a network. We can do one-shot real-time learning, which takes milliseconds. Traditional AI needs cloud connectivity, Siri, for instance, and Alexa. They send all the information that they receive up to the cloud. It’s particularly a problem when you’re sending images, because they can be hacked. In Akida’s case, neuromorphic AI is completely independent of the cloud. All the information is processed on the chip itself, and the classification is produced by the chip itself. So it’s very secure. The data stays within the chip and cannot be hacked.
Also, if you have Siri or Alexa or any other cloud-based AI nest, for instance, it requires an internet connection. Once it hasn’t got an internet connection, it doesn’t work anymore. And because of the need to facilitate information up to the cloud and then get it back as an answer, the latency can be quite high.
So here we’re looking at the 10 problems that Akida 1000 is solving in the industry. IoT bandwidth is a problem that keeps building. With a forecast 40 billion new IoT devices connecting to the internet, IoT bandwidth is becoming a serious internet [inaudible 00:17:25] problem with increased latency.
Power consumption, if you’re looking at CO2 emissions, Akida 1000 is extremely energy-efficient. Availability. If you’re in the middle of Nullarbor, you won’t have any internet connection, for instance, with Akida, you don’t need an internet connection to keep the unit working. So you don’t get the message that it needs an internet connection.
Training. Akida can be trained very rapidly with one-shot on-chip learning. Personalization, which is a very important thing, for instance, in the car industry: somebody gets into the car and Akida can classify who that person is, then learn instantly who that person is if it’s a new owner of the car, for instance.
Portability. It’s very small, it’s very light. And it allows AI in portable equipment because it doesn’t get hot. It uses almost no power, very little power. So it doesn’t get hot, it doesn’t need any cooling. It can be used as a standalone module. It has its own on-chip processor, whose only function is to preprocess data and receive data. The neuron fabric [inaudible 00:18:46] all the neural network work. So it’s very low cost. It is engineered for low-cost applications. It’s modular. You can modify the IP very easily to fit a manufacturer’s requirements. And IP licenses are available. And it’s an easy development environment that is familiar to many different data engineers.
So here we’re talking about the bandwidth problem [inaudible 00:19:24] devices forecast to be competing for bandwidth. Of course, Akida doesn’t need any bandwidth. We solve that problem. Data centers are growing at 20% per annum. And if we continue going the way we’re going today, data centers will consume 30% of global electricity by 2030. So Akida can solve that problem because we’re not processing [inaudible 00:19:53]. We get processing distributed over many different locations. And those locations can be powered by battery or solar power.
So what we’re talking about here is the runaway power of AI processing. At the moment, we only have the figure for 2016, which was 416 terawatt-hours. That is about 30% more than the whole United Kingdom. Since 2016, it has been increasing at about 15% to 20%. So today’s estimation of CO2 emissions is something like 600 megatonnes. With distributed processing, where you have Akida in every device, you would be far more energy-efficient and avoid this problem.
So the Akida processor has a large number of interfaces on it. It can talk to normal computer equipment. It has an M-class CPU on board for data processing. It has the video frame interface, the artificial retina I talked about: information from the video camera can be processed into events or spikes, which are then processed on the neuron fabric, and you get an output that classifies what is in-frame. We also have an external memory interface. We have multi-chip extensions, so you can put 64 Akidas on a single board. And I already mentioned the on-chip one-shot learning.
So it’s a very sophisticated device. It’s very exciting that we have this device ready at the right time for the right market. This is the forecast by Tractica, the Tractica forecast between 2018 and 2025. We’re looking at the 2022 bar here: a forecast of 450 million ASICs that could all use Akida IP, with 700 million accelerator chips that could also be the Akida chip, the Akida 1000. And you see that they’re increasing very rapidly in the years to come. And this is a very exciting slide for us because this bottom part of the graph is where we operate.
The comparison of Akida to all the other components that are out there. So we have a very well documented competitive analysis, and this graph comes out of that. In the case of Akida 1000, it ticks all the boxes. It’s microwatt-to-milliwatt power use. It’s real-time on-chip learning. It’s TensorFlow compatible. You can use it in standalone mode. There’s on-chip convolution, which means that you can look at images [inaudible 00:23:11] networks. It’s available as IP and it’s a very green technology.
If you’re looking at IBM TrueNorth, which is an older chip, it’s also a test chip. It’s not a commercial device like Akida 1000. It ticks the first box but none of the other ones. It’s a green technology because it has also low power, not as low as Akida.
Intel Loihi. Intel Loihi as well, it ticks the first box, it’s minimal power use. If you want to do learning, you have to program it. It’s got its own environment called Lava for learning. They have on-chip x86 processors, which cannot be very power efficient. They do say that they do convolution, but I think it’s being done by the x86 hardware. And it’s not available as IP.
Then there are what I call deep learning accelerators, like the Coral TPU and the NVIDIA Jetson. Those things are not very low power: [inaudible 00:24:25] Watts for the TPU, [inaudible 00:24:27] the CPU next to that. These accelerators are in the range of 5 to 10 Watts. They are math chips, they’re not doing any neural networks, they just perform multiplication. They are TensorFlow compatible but they do not tick any of the other boxes. They do not do on-chip convolution; they have to do convolution on your CPU. They are not available as IP, and they’re green because they use about the same amount of power, for instance, as a data center.
So these are our websites, and basically, you can find BrainChip on LinkedIn. Our videos [inaudible 00:25:14] at our videos. And I think that we are, potentially, Marc…
Marc: Excellent. Thank you, Peter. So we’ve got a number of questions coming in, some of them center around your current partnerships that you’ve announced previously, as well as the early access program. So let’s break that up in two. Can you talk a little bit about the feedback you’re getting from your early access program collaborators? And I know the question that we got out of this might be hard to answer, but do you have an idea of the percentage of partners or people that are trying this out that could convert to commercial deals?
Peter: The customers to convert to commercial deals? Of course, it takes a little while for them to get their projects worked out to the point where they are ready for a commercial deal. We’re particularly excited about the NASA people, who have given us feedback, very positive feedback, about what they’re doing. Of course, they’re getting the latest generation of the chip, at the moment the production chip, which has been delivered to a number of people in the EAP group. We cannot mention names, so, EAP customers. Unfortunately, because they’re under a non-disclosure agreement, people do not want to splash what they’re working on in their labs out there in the market [inaudible 00:26:47] you know, what’s going to come next, not until they’re ready to do so.
Marc: Okay. Yeah. I understand that, especially in the semiconductor industry, it’s highly competitive and it’s always very sort of hush-hush. And it’s something that, unfortunately, the ASX doesn’t always understand. But is there anything in general that you can talk about with these partnerships that you have with NASA, Valeo, Magic Eye, Ford? Any sort of a common denominator that you see in these conversations with them?
Peter: Yeah. The common denominator would be that all of these companies are very excited about technology.
Marc: All right. That’s clear. A question from Dan about when we can start to see physical devices contain Akida, and I’m assuming this would be commercially. If you can comment a little bit on that.
Peter: Yes. As people are developing with Akida, they need to develop their boards, they need to develop their software, they need to prepare the whole market for these products. I’d say that in the next year we will see… I expect to see commercial products that are out there with Akida in it.
Marc: Okay. And in terms of revenue, some of the units from the recent batch of chips that have been completed have been distributed to partners. Is there anything you can say about what sort of revenue? Maybe not the level of revenue, but whether there is something, in the current quarter, that you will be reporting on soon. Because I seem to remember there was definitely a commercial element, or at least, you know, revenues coming from these chips, these initial chips.
Peter: Yes. And the revenue will be showing up in our 4C. I can’t really comment on that at this stage until we’re ready for these at our 4C at the end of this quarter.
Marc: Okay. Regarding the Akida 2000 and the timeline specifically for that one, can you talk a little bit about the development timeline?
Peter: Yes. Akida 2000 is very exciting too. We are developing that here in Perth. We have an excellent team here in Perth of highly qualified people who are working on Akida 2000 and Akida 3000. So we’re really thinking ahead here. Akida 2000 will be optimized for more complex network architectures, such as [inaudible 00:29:37] transformers. And for a very large part, those simulations of those chips are… At least for Akida 2000, the simulation is working already, and we are getting ready to hand it over to engineering. Akida 3000 is going to be a cortical network. Cortical networks are based on the way the human cortex works. There are still a lot of open questions, especially in neuroscience, about how that exactly works. We’ll be building models of the cortex and see how we can apply that to a real commercial environment. Of course, everything we do, even though the science is fascinating, we have to always keep in mind that we are building commercial products.
Marc: All right. One interesting question I saw earlier coming from Nathan about the advantage you have in terms of time versus competitors. Previously you stated two to three years advantage versus competitors. With the Akida 1000 now almost sort of commercially available, do you still think your timeline or the time advantage is still two to three years or has that changed?
Peter: Yes, we are working very hard to make sure that we maintain that advantage. So Akida 1000, we estimate to be two or three years ahead of the market. For instance, we have convolution in hardware, not done by a processor. We have real-time learning that nobody else has been able to manage. So we definitely have a two- or three-year advantage with Akida 1000, plus we are commercializing that product. But to maintain that advantage, we are working on Akida 2000 and 3000, so we stay ahead of the market. With Akida 3000, we’re probably about five or six years ahead at the moment.
Marc: All right. Well, last question, Peter, because the audio is still, apparently, not great. What we might do actually is just send you these questions so you can come back to us by email and then we can put them up on the website. But just the last question. With the new CEO, Sean, it’s very early days, but some questions come in for him as in how has he gone in the first sort of week, basically?
Peter: In the first week, yes, he’s extremely eager. He worked over at the… I got the questions about the business plan and technology and what we’re doing over the Thanksgiving weekend. So he’s extremely eager to get started. And yeah, we have a great working relationship. I really enjoy working with Sean.
Marc: Excellent. All right. Well, Peter, thank you very much for your time. We’ll send through those questions to you by email so we can get answers to those. And again, for everyone that’s here today, we’ll put them up on our website.

 
Reactions: 24 users

buena suerte :-)

BOB Bank of Brainchip

Q&A session transcript​



Q: Can you give us some idea of the % of EAPs that the company thinks will sign commercial deals with BrainChip?

A: Our expectation is that most of our EAP customers will continue to have an expanding commercial relationship with BrainChip. The responses from all our EAP customers have been very good, and we are continuing our collaboration with them. We anticipate that we will see results in the coming year.



Q: When will we start to see physical devices that contain Akida?

A: We could potentially see Akida chips and IP in commercial devices within the next 12-18 months, but that will depend on how quickly our customers can design, test and develop new products for sale. One thing we can say is that over the coming years, more and more devices will be carrying an Akida chip or its IP inside, and that’s very exciting.



Q: How is the new CEO going with the company? Can you share a bit more insights?

A: Sean Hehir is a great acquisition for BrainChip. His skill set, values and experience are perfectly suited to where BrainChip is right now and where we aspire to be. He has the drive, the energy, and the expertise to get us there. Rather than provide you with my thoughts on where Sean is going to drive the Company’s future, I think it’s better that Sean articulates that himself. I’m sure Sean will do that soon and we can share his vision for BrainChip.



Q: The BrainChip shuttle PC and Raspberry Pi have been on sale now for a few months. How many of these units have been sold and will we see revenue in the next set of reports?

A: We will report our sales revenues in the Appendix 4C cashflow report with the December Quarter report at the end of January, and in future 4C statements. We do not intend to report the number of product units sold but will aggregate combined sales revenues for the Appendix 4C report, as required by the ASX.



Q: Can you give an update on progress made with partners Renesas, Valeo and NASA?

A: We will provide updates on our partners and partnerships when we have something material to report. We will do this via the ASX in our Quarterly reports. We continue to work with these partners to support their development activities.



Q: Have any of the EAPs decided not to continue with Akida?

A: All our EAP customers continue to work with us to integrate either the Akida IP or the AKD1000 chip.



Q: What is the timeline for development of Akida 2000?

A: The BrainChip Research Institute are in the process of handing over the prototype of the AKD2000 design to our engineering team, who will then design the silicon. Our engineering resources are currently occupied supporting clients to integrate AKD1000 or the Akida IP. The commercial success of AKD1000 is our focus at present.



Q: Do you have any feedback from customers who received the first batch of volume produced chips?

A: EAP customers are very positive and excited to have the Akida1000 production chips in hand and the feedback has been entirely positive.



Q: Is there any schedule for the new CEO to address the market and shareholders?

A: Our new CEO, Sean Hehir, will engage with our shareholders and the broader equity capital markets soon. He’s only been in the job a week, so let’s allow him some time. I know he’s very keen to come to Australia and engage with our large and passionate retail shareholder base.



Q: What can you tell us about the significance of the MegaChips deal and can you give a bit more info about MegaChips plans to use the Akida IP?

A: MegaChips is a global Tier-1 semiconductor company with a well-established reputation for quality, innovation, and technical expertise, so we are very excited to be in partnership with them. In addition to buying an IP license to use the Akida IP in a range of commercial products, MegaChips lends enormous industry credibility and validation to BrainChip and to our neuromorphic AI technology. They will also become a distribution partner, helping to spread the word about neuromorphic AI to their huge customer base and ensure they remain at the cutting edge of the next wave of innovation in the AI sector. Longer term, MegaChips is a transformational deal for BrainChip, and I’m not sure if the market fully appreciates that yet, but they will in time.



Q: Can you talk a little bit about the product roadmap and addressing different markets, e.g. with high-margin use cases?

A: In terms of the Akida family of products, we have already released our first generation Akida1000 product commercially, and we are well advanced in the development of the second generation, Akida2000.

Beyond that, we have our third generation Akida3000 currently in development at the BrainChip Research Institute in Perth, and we are looking at several options, including an Akida500, to address the needs of a range of different customer requirements and commercial applications. Some of these products are intended to be commercialised for specific markets (such as radiation hardened chips for use by NASA in space exploration), while others have a multitude of commercial applications across a global market.

The upshot is we believe that this pipeline of new products will keep us at the forefront of the neuromorphic AI sector for years to come, and that’s an exciting opportunity for investors to consider. Given the huge growth already happening in the Edge-AI sector, and the pressure that’s building on all fronts for industry, government, and households to find low-power solutions to help reduce carbon emissions, we believe we are the right company with the right product in the market at the right time.
 
Reactions: 39 users
Hi @DingoBorat

There is a statement by Peter van der Made that AKIDA is 1,000 times more efficient than GPUs. Not sure where it is now but it's out there somewhere to be found.

Prophesee has been looking for the event based processor to match the performance of their event based sensor and have now stated via this podcast and their CEO that AKIDA has provided the missing piece of their puzzle. Clearly not SynSense who runs around claiming all sorts of things about itself uninhibited by the ASX or SEC rules.

This fellow, who not one of us ever mentioned before yesterday, posts one random claim about an unidentified IP being 100 times more efficient than a GPU and, when asked if it is AKIDA, says not that one.

Well the first thing to say is according to Peter van der Made AKIDA is 1,000 times more efficient so 10 times better than what is being described.
Second thing is that someone else, I think @Cardpro, posted other posts of this chap where he is saying Moore's Law is not dead and conventional technology has a long way to go before it is dead.

Then @Bravo found and posted about a new clock technology that improves the performance of a GPU 100 times and references RISC-V technology as being made more efficient.

Having reflected on @Bravo 's posted information, I am of the mind that this is what this poster was referencing. However, as SiFive adds AKIDA to its RISC-V, all this new clock technology does is make the whole system, which includes AKIDA, more efficient; it does not replace AKIDA.

So whether it is @Bravo 's discovery or whether it is AKIDA2.0 it in no way changes the importance of the Prophesee statements concerning why they are going forward with AKIDA in Mobile Phone vision systems, Industrial vision systems and Automotive vision systems.

Indeed, all of the concerns that some have raised about AKIDA being dropped by Mercedes Benz for better technology are real, except they are real for SynSense over at BMW, where Prophesee has its foot in the door, and on today's presentation the only solution they would be moving forward with is one where they are utilising AKIDA in vision systems for automotive.

If anything the Prophesee podcast should have everyone asking will this increase the use of AKIDA in Mercedes Benz as they will no doubt be introducing Mercedes Benz to what the combined AKIDA Prophesee vision sensor is capable of achieving for them.

Perhaps Brainchip is also over at Valeo introducing the Prophesee AKIDA Vision System for Valeo to consider as yet another automotive sensor technology to combine with its present offerings.

I have an enormous sense of frustration with the continuing discussion of one random engineer who makes claims about testing he has not even completed. For all we know this other technology, if it is not AKIDA, could be like Elon Musk's chip and need 500 watts of power to achieve this 100 times improved performance.

My opinion only DYOR
FF

AKIDA BALLISTA
Hey FactFinder, it says power efficiency not performance, so it's not anything like Musk's Dodo chip..

The guy seems genuine, when he says "no not that one" and I don't think it's in reference to "another" BrainChip product.
He has no reason to be cryptic and no one has a gun against his head, to comment further at all.

I don't think it would be very hard for "any" tech to be more power efficient, than GPU type technology..
As the Ford scientist stated a year or two ago, GPUs were "Zombie technology" dead, they just don't know it yet..

I also think @Bravo is most likely on the money, with the Movellus/Esperanto combination.

This doesn't mean it can do, what BrainChip technology can do.
It likely has no learning capabilities, is probably not available as IP and is not necessarily as suited to far edge devices, like AKIDA etc..
So it doesn't mean it will encroach on our particular markets.

I don't think anyone, myself included, is trying to take the shine off the Prophesee Podcast (which I'm admittedly yet to listen to and am sure is exciting) but it's definitely a topic worthy of discussion and the implications of what Esperanto has developed (if that's what it is), shouldn't be swept under the carpet, in my opinion..

You have no idea, how much I actually need, more than want BrainChip to do well, so I'm not trying to rain on the parade here..
 
Reactions: 31 users

TechGirl

Founding Member
Hmmm, I have made an educated assumption from what Luca was just referring to in our podcast that the announcement below is the barcode example, which of course we will be in because we complete Prophesee's products.



Datalogic begins landmark partnership with Prophesee​


Datalogic, on the leading-edge of the most innovative technologies, partners with Prophesee to bring the performance and efficiency of neuromorphic vision to its industrial products

PARIS, October 3, 2022 – Datalogic, a global technology leader in the automatic data capture and factory automation markets, and Prophesee SA, inventor of the world’s most advanced neuromorphic vision systems, announce their recent collaboration regarding the next generation of industrial products.
Datalogic is working with Prophesee to evaluate how this highly innovative platform can bring greater value to its upcoming breakthrough products.
“We are conducting a very fruitful partnership with Prophesee. Neuromorphic vision is a fascinating technology inspired by the behaviour of the human biological system, exactly like Neural Networks. We believe that the combination of these technologies will provide innovative solutions to our customers’ needs”, said Michele Benedetti, Chief Technology Officer at Datalogic.
“Our partnership with Datalogic is a testament to the commercial readiness of our Metavision platform and its ability to meet a growing range of vision challenges. We look forward to continuing to work together, to evolve Datalogic’s advanced vision systems”, said Luca Verre, co-founder and CEO of Prophesee.

ABOUT PROPHESEE


Prophesee is the inventor of the world’s most advanced neuromorphic vision systems.
The company developed a breakthrough Event-Based Vision approach to computer vision. This new vision category allows for significant reductions of power, latency and data processing requirements to reveal what was invisible to traditional frame-based sensors until now. Prophesee’s patented Metavision® sensors and algorithms mimic how the human eye and brain work to dramatically improve efficiency in areas such as autonomous vehicles, industrial automation, IoT, security and surveillance, and AR/VR.
Prophesee is based in Paris, with local offices in Grenoble, Shanghai, Tokyo and Silicon Valley. The company is driven by a team of more than 100 visionary engineers, holds more than 50 international patents and is backed by leading international equity and corporate investors including 360 Capital Partners, European Investment Bank, iBionext, Intel Capital, Prosperity7 Ventures, Robert Bosch Venture Capital, Sinovation, Supernova Invest, Will Semiconductor, Xiaomi.
Learn more: www.prophesee.ai

ABOUT DATALOGIC GROUP


Global technology leader in the automatic data capture and factory automation markets since 1972, specialized in the designing and production of barcode readers, mobile computers, sensors for detection, measurement and safety, machine vision and laser marking systems.
Datalogic S.p.A. is listed in the Euronext STAR Milan segment of the Italian Stock Exchange since 2001 as DAL.MI.
Visit www.datalogic.com
 
Reactions: 42 users

Diogenese

Top 20
Hi @DingoBorat

There is a statement by Peter van der Made that AKIDA is 1,000 times more efficient than GPUs. Not sure where it is now but it's out there somewhere to be found.

Prophesee has been looking for the event based processor to match the performance of their event based sensor and have now stated via this podcast and their CEO that AKIDA has provided the missing piece of their puzzle. Clearly not SynSense who runs around claiming all sorts of things about itself uninhibited by the ASX or SEC rules.

This fellow, who not one of us ever mentioned before yesterday, posts one random claim about an unidentified IP being 100 times more efficient than a GPU and, when asked if it is AKIDA, says not that one.

Well the first thing to say is according to Peter van der Made AKIDA is 1,000 times more efficient so 10 times better than what is being described.
Second thing is that someone else, I think @Cardpro, posted other posts of this chap where he is saying Moore's Law is not dead and conventional technology has a long way to go before it is dead.

Then @Bravo found and posted about a new clock technology that improves the performance of a GPU 100 times and references RISC-V technology as being made more efficient.

Having reflected on @Bravo 's posted information, I am of the mind that this is what this poster was referencing. However, as SiFive adds AKIDA to its RISC-V, all this new clock technology does is make the whole system, which includes AKIDA, more efficient; it does not replace AKIDA.

So whether it is @Bravo 's discovery or whether it is AKIDA2.0 it in no way changes the importance of the Prophesee statements concerning why they are going forward with AKIDA in Mobile Phone vision systems, Industrial vision systems and Automotive vision systems.

Indeed, all of the concerns that some have raised about AKIDA being dropped by Mercedes Benz for better technology are real, except they are real for SynSense over at BMW, where Prophesee has its foot in the door, and on today's presentation the only solution they would be moving forward with is one where they are utilising AKIDA in vision systems for automotive.

If anything the Prophesee podcast should have everyone asking will this increase the use of AKIDA in Mercedes Benz as they will no doubt be introducing Mercedes Benz to what the combined AKIDA Prophesee vision sensor is capable of achieving for them.

Perhaps Brainchip is also over at Valeo introducing the Prophesee AKIDA Vision System for Valeo to consider as yet another automotive sensor technology to combine with its present offerings.

I have an enormous sense of frustration with the continuing discussion of one random engineer who makes claims about testing he has not even completed. For all we know this other technology, if it is not AKIDA, could be like Elon Musk's chip and need 500 watts of power to achieve this 100 times improved performance.

My opinion only DYOR
FF

AKIDA BALLISTA
According to the originators of DVS, we are a perfect match for their technology. Particularly liked the bit about mobile phones.

We also have a sweet spot for LiDaR (LdN a couple of years ago). I can't wait for Valeo to come out of stealth mode.
Let's say for argument's sake that Francois was talking about the new Esperanto chip. Do you think they could be a competitor, or is there a possibility that we could be complementary to each other? I say this because of Esperanto's links with SiFive.

In another article by Sally Ward-Foxton it stated:

“BrainChip can run [AI] algorithms on their own, but when they move into a larger system, they will need a host processor,” Chris Jones, vice president, product at SiFive, told EE Times. “You could pick a host processor that does nothing but scheduling, or you could pick a host processor that actually contributes to the AI processing, and that’s where the SiFive Intelligence product comes in.”

In an SoC design for edge AI, the AI workload would typically be split between host processor, vector processor, and AI accelerator — some parts of edge workloads are better suited to general purpose compute rather than a dedicated AI accelerator, Jones said.



This 1,000-core RISC-V processor is generating buzz in the AI space​


By Anthony Spadafora
published April 26, 2022
New AI chip holds promise for machine learning recommendation


Esperanto RISC-V Chip

(Image credit: Esperanto Technologies)


A new 1,000-core RISC-V processor from Esperanto Technologies is currently being evaluated by Samsung SDS and other 'lead customers'.

According to a press release (opens in new tab) from the computer software company, its new ET-SoC-1 AI Inference Accelerator is undergoing initial evaluations by a number of firms ahead of its release.

Esperanto Technologies itself was founded back in 2014 by semiconductor industry veteran Dave Ditzel who has previously worked at both Sun Microsystems and Intel. The company is now led by President and CEO Art Swift who in addition to being the former CEO of Wave Computing also spent some time working at the RISC-V Foundation.

As part of its evaluation program, Samsung SDS and other potential customers will get a chance to obtain performance data after running a variety of off-the-shelf AI models using the ET-SoC-1 AI Inference Accelerator and so far the results have been quite impressive.

ET-SoC-1 AI Inference Accelerator​

Esperanto’s ET-SoC-1 features 1,088 energy-efficient, 64-bit processor cores that utilize the RISC-V instruction set architecture, which is quickly becoming a viable alternative to those of both x86 and ARM. The company’s new chip also includes four high-performance RISC-V cores along with 160MB of on-chip SRAM, as well as interfaces for flash memory and external DRAM.

What sets the ET-SoC-1 apart from similar chips is its speed together with its low-power requirements. While the chip can run any type of machine learning workload, Esperanto says that it excels at machine learning recommendation, which is used by Meta, Amazon and other hyperscalers.

VP of AI at Samsung SDS, Dr. Patrick Bangert provided further insight on the experience the company’s data science team had when evaluating the ET-SoC-1, saying:

“Our data science team was very impressed with the initial evaluation of Esperanto’s AI acceleration solution. It was fast, performant and overall easy to use. In addition, the SoC demonstrated near-linear performance scaling across different configurations of AI compute clusters. This is a capability that is quite unique, and one we have yet to see consistently delivered by established companies offering alternative solutions to Esperanto.”

While Esperanto has given Samsung SDS and other potential customers a chance to test out its new AI chip, we’ll have to wait and see if the evaluation was impressive enough for orders to start coming in.

Hi Bravo,

Just trying to fend off the piratical boarders, Francois said he had looked at the IP, whereas Esperanto has the SoC available for customer evaluation.

https://www.esperanto.ai/News/esper...erencing-solution-now-in-initial-evaluations/
"Esperanto has made very impressive progress and is now providing customers evaluation access to their RISC-V hardware and software running off-the-shelf AI models with strong performance and efficiency. This really shows the company’s confidence in their first multi-core solution,” said Karl Freund, founder and principal analyst at Cambrian-AI Research.

While this does not entirely disprove the Esperanto theory in that they could also be sending the IP out as well, it still leaves Akida1.5/2 in the race.

However, IP is often a closely guarded secret (now I'm putting my thumb on Akida's side of the scales).
 
Reactions: 26 users

HopalongPetrovski

I'm Spartacus!

Yes, so much to become enthused about from this latest podcast. 🤣

Just any one of those three application areas mentioned, Industrial, Auto, Mobile phones could/would be Company making, from a revenue and reputation building standpoint.
The opinion expressed that Prophesee's biologically inspired offering is "completed" by our neuromorphic tech, in that their sensor's data is efficiently processed by our IP in a way that existing, outdated methodologies simply can't, is fantastic validation for us.

We have the "brain like" component to handle and make useful sense of the output from their "eye" in a way that is a more true mimicry of the actual biological systems that have inspired their creators.

Existing computational methodologies and the still-image cinematography based systems, invented 150 years ago, are hampered too much by latency, overwhelmed by unnecessary and repetitive processing and restricted by dynamic range constraints which in many circumstances render the data presented by their "eye" degraded beyond usability.

Their event based offering is much better suited to artificial intelligence systems under development and our evolving IP seems to be heading in the same direction. Future iterations as expressed by Peter VDM are extending our neuromorphic attributes and the blending of perceived data streams or sensor fusion, will only lead towards intelligences beyond human perception and capability.

Meanwhile, I was very heartened by the expressed desire to achieve human vision restoration which is at the heart and perhaps indeed the raison d'être for Prophesee. Another fine ethical company in alignment with our founders vision.
 
Reactions: 34 users

cosors

👀
Hi @cosors: that seems strange. Did you check other data (other online brokers, for example)?
Tradegate itself shows about 130k as volume, SP up 5,05%.
Regards
cassip
Yes, I see it. I have no idea why. So far this has always been a reliable page with a nice overview of all the exchanges here, including OTC. But Onvista has just been redesigned to show more information, and Tradegate says 'unknown', not 0. I overlooked that. Maybe it's wrong because of updates and adjustments to the page, still under construction?
Thanks for bringing it to my attention!
___
I deleted my misleading post.
 
Reactions: 4 users
Works for Mercedes. I wonder who he is referring to?! If it was Brainchip you would imagine that they'd have "reviewed" the IP a while ago, unless they have the new and improved Akida containing LSTM??
Hi @DingoBorat

Here I go sweeping this under the mat and hopefully into a crack and then into the cavity under the floor boards where it can remain for 100 years to be discovered by someone interested in technology that just was too little too late when discovered.

It clearly cannot be AKIDA, because it uses ten times more power. Peter van der Made said of AKIDA 1.0:

“And it’s up to a thousand times more energy-efficient than GPUs”

So now I have swept it under the mat, we can all get stuck into understanding the full exciting significance of today's podcast.

My opinion only DYOR
FF

AKIDA BALLISTA
 
Reactions: 21 users

Gazzafish

Regular
So glad I'm in QLD at the moment so I get to see BRN's open and close price an hour earlier than everyone else in other states of Australia.. 🤪🎉
 
Reactions: 18 users

HopalongPetrovski

I'm Spartacus!
So glad I'm in QLD at the moment so I get to see BRN's open and close price an hour earlier than everyone else in other states of Australia.. 🤪🎉
Yes.......but what about the curtains? 🤣🤣🤣
And those poor cows having to get up a whole hour earlier every day!
 
Reactions: 19 users

Diogenese

Top 20
Hmmm, I have made an educated assumption from what Luca was just referring to in our podcast that the announcement below is the barcode example, which of course we will be in because we complete Prophesee's products.


Datalogic begins landmark partnership with Prophesee​


Datalogic, on the leading-edge of the most innovative technologies, partners with Prophesee to bring the performance and efficiency of neuromorphic vision to its industrial products

PARIS, October 3, 2022 – Datalogic, a global technology leader in the automatic data capture and factory automation markets, and Prophesee SA, inventor of the world’s most advanced neuromorphic vision systems, announce their recent collaboration regarding the next generation of industrial products.
Datalogic is working with Prophesee to evaluate how this highly innovative platform can bring greater value to its upcoming breakthrough products.


ABOUT PROPHESEE


Prophesee is the inventor of the world’s most advanced neuromorphic vision systems.
The company developed a breakthrough Event-Based Vision approach to computer vision. This new vision category allows for significant reductions of power, latency and data processing requirements to reveal what was invisible to traditional frame-based sensors until now. Prophesee’s patented Metavision® sensors and algorithms mimic how the human eye and brain work to dramatically improve efficiency in areas such as autonomous vehicles, industrial automation, IoT, security and surveillance, and AR/VR.
Prophesee is based in Paris, with local offices in Grenoble, Shanghai, Tokyo and Silicon Valley. The company is driven by a team of more than 100 visionary engineers, holds more than 50 international patents and is backed by leading international equity and corporate investors including 360 Capital Partners, European Investment Bank, iBionext, Intel Capital, Prosperity7 Ventures, Robert Bosch Venture Capital, Sinovation, Supernova Invest, Will Semiconductor, Xiaomi.
Learn more: www.prophesee.ai

ABOUT DATALOGIC GROUP


Global technology leader in the automatic data capture and factory automation markets since 1972, specialized in the designing and production of barcode readers, mobile computers, sensors for detection, measurement and safety, machine vision and laser marking systems.
Datalogic S.p.A. is listed in the Euronext STAR Milan segment of the Italian Stock Exchange since 2001 as DAL.MI.
Visit www.datalogic.com
Sorry,

The company I was thinking about was Digimarc who pops up in BrainChip searches, not Datalogic.
 
Reactions: 6 users

Diogenese

Top 20
So glad I'm in QLD at the moment so I get to see BRN's open and close price an hour earlier than everyone else in other states of Australia.. 🤪🎉
But at what cost to the curtains?
 
Reactions: 5 users

Diogenese

Top 20
Reactions: 5 users

Cardpro

Regular
Wow market wants BrainChip!
Bye bye shorters, do you guys really want to gamble? One piece of news will shoot this company to the moon (literally - come on NASA!!)
 
Reactions: 11 users

buena suerte :-)

BOB Bank of Brainchip
So glad I'm in QLD at the moment so I get to see BRN's open and close price an hour earlier than everyone else in other states of Australia.. 🤪🎉
Well, at the moment in Perth time, markets open @ 7am and close @ 1pm, so it's great to watch the open with brekky and the close with lunch :)

GO BRN .... another Green day ;)
 
Reactions: 18 users
According to the originators of DVS, we are a perfect match for their technology. Particularly liked the bit about mobile phones.

We also have a sweet spot for LiDaR (LdN a couple of years ago). I can't wait for Valeo to come out of stealth mode.

Hi Bravo,

Just trying to fend off the piratical boarders, Francois said he had looked at the IP, whereas Esperanto has the SoC available for customer evaluation.

https://www.esperanto.ai/News/esper...erencing-solution-now-in-initial-evaluations/
"Esperanto has made very impressive progress and is now providing customers evaluation access to their RISC-V hardware and software running off-the-shelf AI models with strong performance and efficiency. This really shows the company’s confidence in their first multi-core solution,” said Karl Freund, founder and principal analyst at Cambrian-AI Research.

While this does not entirely disprove the Esperanto theory in that they could also be sending the IP out as well, it still leaves Akida1.5/2 in the race.

However, IP is often a closely guarded secret (now I'm putting my thumb on Akida's side of the scales).
Have we looked at Innatera before?

Backed by German tech investors last year; IP possibly available, and whilst it's analog mixed-signal, it does use SNNs?

The conversation re MB is a valid conversation imo; it is not to say we aren't still in the frame, and it should not take away from the podcast, as they are two diff discussions.

Given the positivity of the podcast, and it is positive, I'd really like to see something more formal in due course between us & Prophesee.

I also thought that, this being a public forum and not, say, a fiefdom, every SH or non-SH for that matter should be afforded their thoughts, questions and views, so long as they are respectfully presented.

Just as positivity posts are welcomed, other questions we may not sometimes want to consider should also be.

Personally, I prefer to not shy away from "what ifs" or "could be".

Just as each new positive BRN connection, partnership, possibility etc that is discovered helps keep me informed, the opposite of those is also true.


Blue is available.





 
Reactions: 12 users

alwaysgreen

Top 20
Almost getting up to where it was before 10M shorts were taken out - that have not as yet been bought back
We need our share price to be up at least 10% higher if we are going to remain in the 200 after the next re-balance.

They gave us a quarter out of the 200 because they don't like constantly re-shuffling but I don't think we will be so lucky at the next re-balance.
 
Reactions: 6 users