Exactly @cosors, with cloud-based systems that simply upload the raw data, privacy is a real concern. Once the transmitted images are in “The Cloud”, who the hell knows what can be done with them.

That is exactly my thought and approach. Specifically, I'm interested in facial recognition or driver recognition, as it will probably become mandatory. The laws are not yet in place, nor are the laws regarding AI-based data processing; they will probably come next year.
I'm quite sure that Nviso + Akida has, or will get, the attention of some lawyers in this particular area. Through MB we at least get more attention in Germany than in the US, where Tesla attracts all the attention, be it lawsuits over facial recognition inside the vehicle or recognition outside, quite apart from the lawsuits against Autopilot (now also one in Germany). As far as I know, none of these lawsuits have been settled yet.
Nviso should do some/more lobbying in Brussels so that the key politicians in the law-making process realise that there is also a non-cloud based solution to the problem.
I agree. It’s not up to Brainchip to announce it. And maybe they can’t due to an NDA, as Renesas will want to push their own AI where they can, even if it’s inferior, so they get bigger profit margins.
Brainchip may have their hands tied. Renesas bought the licence for the two nodes to put into an Arm M33 product, so they have paid the $$$ to do what they like with it.
So maybe there won’t be an announcement. It would be nice to have some acknowledgement of a product available now after such a long wait for all involved!
Maybe it will just be in the financials. Either way I am just happy for the companies success and for the shareholders who have held firm!
Or… I could be completely wrong!
Fingers crossed everyone!
With all due respect to ARM, I think that Akida technology is a little more specialised and sophisticated than ARM's low-power reduced instruction set chip technology. IMO this should enable BRN to command a much higher price for their IP in each chip sold. Time will tell I guess.
If ARM's total revenue amounts to 9.7c per chip shipped, why would we be any different? I struggle to understand why it is so low.
ARM Example Royalties

IP | Royalty (% of chip cost)
ARM7/9/11 | 1.0% - 1.5%
ARM Cortex A-series | 1.5% - 2.0%
ARMv8-based Cortex A-series | 2.0% and above
Mali GPU | 0.75% - 1.25% adder
Physical IP Package (POP) | 0.5% adder
I recall 50 cents per chip being thrown around a while back. Pretty sure it came from management, but take this with a grain of salt as I can't be certain. I think it was at the same time they mentioned that royalties will also be dependent on the final cost or quantity of the product it is being utilised in. I'll try and dig up the comments.

The ARM revenue averages out at 9.7c per chip because there would be many older, low-cost IP chips compared to their newer, high-cost IP chips.
You can see above that an old ARM7/9/11 IP chip carries a 1.0-1.5% royalty, whereas a newer ARMv8-based Cortex IP chip carries 2% and above. Newer ARM IP chips will also have adders, such as 0.75-1.25% for the Mali GPU plus a 0.5% adder for POP. The 0.5% POP adder is usually paid by the foundry, such as TSMC.
If you look at ARM's customer list, TSMC is one of them due to the POP royalties.
Brainchip, being revolutionary new AI IP, should have royalties similar to new ARM IP chips at 2% and above. If the chip cost is $15, that is $15 x 2% = 30c per chip.
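For anyone who wants to play with these numbers, here is a minimal back-of-the-envelope sketch in Python. The 2% royalty rate and $15 chip cost are only the assumptions from the post above, and the 9.7c figure is ARM's quoted average revenue per chip shipped, so treat the output as illustrative rather than guidance.

```python
# Back-of-the-envelope royalty arithmetic using the figures quoted in this thread.
# The 2% rate and $15 chip cost are assumptions from the post above, not company guidance.

ARM_AVG_REVENUE_PER_CHIP = 0.097  # ARM's average revenue per chip shipped (~9.7c)


def royalty_per_chip(chip_cost_usd: float, royalty_rate: float) -> float:
    """Royalty earned per chip at a given percentage of the chip cost."""
    return chip_cost_usd * royalty_rate


# Hypothetical Akida royalty: 2% of a $15 chip, as in the post above.
akida_royalty = royalty_per_chip(15.00, 0.02)

print(f"Assumed royalty per chip: ${akida_royalty:.2f}")  # $0.30
print(f"Multiple of ARM's 9.7c average: {akida_royalty / ARM_AVG_REVENUE_PER_CHIP:.1f}x")  # ~3.1x
```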
Any idea if Innoviz are using AKIDA for their lidar? Just signed a 4B deal with VW and have more sales through BMW. They say sales will be around 8 million units.
Well Well, what do we have here? Nail on the head I would say.
Okay, so I did some digging and came up short, so ignore my 50 cent royalty comment unless someone else recalls it. Some things I found are interesting though (old news).
That's awesome ... great work @thelittleshort

Recent email correspondence between myself and motoring writer Tony Davis from the Australian Financial Review:
On Sat, 6 Aug 2022 at 12:23, thelittleshort wrote:
Hi Tony
I just read your great article on Baraja (link)
Have you considered writing a similar piece on BrainChip? Another Australian company with an extremely bright future
BrainChip Akida neuromorphic IP is included in the EQXX concept EV from Mercedes Benz and is also being adopted by some amazing partners including Ford and NASA
Not sure if you are aware of BrainChip already? If not they are definitely worth a bit of research and will be a classic Australian success story within a couple of years
Appreciate your time
Cheers
thelittleshort
On 6 Aug 2022, at 7:51 pm, Tony Davis <tony.crossmedia@gmail.com> wrote:
Hey thelittleshort, thanks for the note. send me some more info and i will read with interest, cheers, tony
On Mon, 8 Aug 2022 at 13:34, thelittleshort wrote:
Hi Tony
Thanks for your interest. Specific BrainChip automotive info and links below. Hopefully they pique your interest.
BrainChip have many irons in the fire, with automotive being only one market that their technology is going to disrupt
Here is an AFR article by Tony Boyd from January 2022 about BrainChip generally as a tech stock to watch
Description of BrainChip's tech from their website. This gives an overview of the technology itself
BrainChip is a global technology company that is producing a ground breaking neuromorphic processor that brings artificial intelligence to the edge in a way that is beyond the capabilities of other products. The chip is high performance, small, ultra-low power and enables a wide array of edge capabilities that include on-chip training, learning and inference. The event-based neural network processor is inspired by the spiking nature of the human brain and is implemented in an industry standard digital process. By mimicking brain processing BrainChip has pioneered a processing architecture, called Akida™, which is both scalable and flexible to address the requirements in edge devices. At the edge, sensor inputs are analyzed at the point of acquisition rather than through transmission via the cloud to a data centre. Akida is designed to provide a complete ultra-low power and fast AI Edge Network for vision, audio, olfactory and smart transducer applications. The reduction in system latency provides faster response and a more power efficient system that can reduce the large carbon footprint of data centres.
BrainChip White Paper: Designing Smarter and Safer Cars With Essential AI https://brainchip.com/wp-content/uploads/2022/07/BrainChip_Designing-Smarter-Safer-Cars.pdf
Conventional AI silicon and cloud-centric inference models do not perform efficiently at the automotive edge. This makes it challenging to manufacture vehicles with highly personalized in-cabin systems and advanced assisted driving capabilities. To speed up the development of smarter and safer vehicles, innovative automotive companies are untethering edge AI functions from the cloud - and performing distributed inference computation on local neuromorphic AKIDA silicon. With AKIDA-powered smart sensors and AI accelerators, automotive companies are designing lighter, faster, and more energy efficient in-cabin systems that enable advanced driver verification and customization, sophisticated voice control technology, and next-level gaze estimation and emotion classification capabilities. In addition to redefining the in-cabin experience, AKIDA supports new computer vision and LiDAR systems that detect vehicles, pedestrians, bicyclists, street signs, and objects with incredibly high levels of precision. These fast and energy efficient ADAS systems are already helping automotive companies accelerate the rollout of increasingly advanced assisted driving capabilities.
From Mercedes Benz regarding the inclusion of BrainChip's Akida in their concept EV VISION EQXX https://group-media.mercedes-benz.com/marsMediaSite/instance/ko.xhtml?oid=52282663&filename=VISION-EQXX--taking-electric-range-and-efficiency-to-an-entirely-new-level
Neuromorphic computing – a car that thinks like you: Another key efficiency feature of the VISION EQXX that takes its cue from nature is the way it thinks. It uses an innovative form of information processing called neuromorphic computing. The hardware runs so-called spiking neural networks. Information is coded in discrete spikes and energy is only consumed when a spike occurs, which reduces energy consumption by orders of magnitude. Working with California-based artificial intelligence experts BrainChip, Mercedes-Benz engineers developed systems based on BrainChip’s Akida hardware and software. The example in the VISION EQXX is the “Hey Mercedes” hot-word detection. Structured along neuromorphic principles, it is five to ten times more efficient than conventional voice control. Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years. When applied on scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.
From EE Times - Mercedes Applies Neuromorphic Computing in EV Concept Car by Sally Ward-Foxton https://www.eetimes.com/mercedes-applies-neuromorphic-computing-in-ev-concept-car/#
The Mercedes Vision EQXX concept car, promoted as “the most efficient Mercedes-Benz ever built,” incorporates neuromorphic computing to help reduce power consumption and extend vehicle range. To that end, BrainChip’s Akida neuromorphic chip enables in-cabin keyword spotting as a more power-efficient way than existing AI-based keyword detection systems. As automakers shift their focus to electric vehicles, many are struggling to squeeze every last volt from a single battery charge. The need to reduce power consumption in vehicle electronic systems has therefore become critical to extending EV range. Touting Vision EQXX as “a car that thinks like you,” Mercedes promises range of more than 1,000 km (about 620 miles) on a single charge. “Working with California-based artificial intelligence experts BrainChip, Mercedes-Benz engineers developed systems based on BrainChip’s Akida hardware and software,” Mercedes noted in a statement describing the Vision EQXX. “The example in the Vision EQXX is the “Hey Mercedes” hot-word detection. Structured along neuromorphic principles, it is five to ten times more efficient than conventional voice control,” the carmaker claimed.
Valeo
BrainChip signs joint development agreement for Akida neuromorphic chip with Valeo https://smallcaps.com.au/brainchip-joint-development-agreement-akida-neuromorphic-chip-valeo/
NVISO
https://www.nviso.ai/en/news/nviso-...-computing-using-the-brainchip-akida-platform
In this video, BrainChip's ecosystem partner Nviso demonstrates Emotion Detection running on the BrainChip AKD1000 reference board: BrainChip + Nviso Emotion Detection Demo
https://brainchip.com/brainchip-and-nviso-partner-automotive-edge-devices/
Cheers
thelittleshort
From: Tony Davis <tony.crossmedia@gmail.com>
Subject: Re: BrainChip
Date: 9 Aug 2022 at 10:08 am
To: thelittleshort
thanks thelittleshort, leave it with me ... might work for a supplement i am doing a little later in the year, cheers, tony
Hope you guys might be able to help me here. I am a retired banker so the BrainChip thing is amazing for me to understand (and most of it I still don't get), but being a shareholder is a privilege.
I am on holiday presently in Seville, and I met a bloke who is the CTO of The Linux Foundation in the US.
https://www.linuxfoundation.org/.
He is telling me of Open Source computing and how it is vital in the current evolution of computing. I suggested Brainchip was too, but was out of my depth in keeping up with him. Can anyone guide me to anything that might help my understanding of where Open Source fits in, and is it relevant to us?
Oh, I just heard Olivia Newton John just passed away. I am crying, especially with Judith Durham a few days ago
If you get to see this person again (I assume it is Nirav Patel you are talking about), ask him if he knows Andrew Morton, lead developer and maintainer of the Linux kernel. Andrew is also employed by Google. I used to work with Andrew back in the 1990s, we were jogging buddies too. I expect the CTO of the Linux Foundation probably has a working relationship with him. But I digress.
Open source means anyone can write software and others can use it freely. Even better, the source code itself is published so that others can read it, modify it and re-publish it. That happens quite often and is how the community assists itself. This could help Akida uptake, as people write software that utilises Akida’s IP, and then users need to buy a licence, buy hardware, or pay for IP on which to run their solution.
You probably have seen that software @uiux has been sharing. That’s on a platform that is used for software developers to share their code openly. And that is what @uiux is doing. And uiux’s software can help others work with Akida! At least show them some things that are possible and how simple it is.
For the coding I used to do, we were never allowed to use open source or publish our code as open source. Sometimes we had to work on air-gapped computers in secure internal rooms. It was secret squirrel stuff. I only add that because I expect a lot of companies developing systems utilising Akida may be in the same boat: their code is the opposite of open source, as their livelihood depends on it. But any open source stuff out there will certainly help with prototyping and proof-of-concept work to show to investors and the like.
My view is that open source is for playing with and testing the waters only. If that is what was meant by “vital to the current evolution of computing” then I agree.
Any real-world, life-critical programs most likely will not contain open source code. The operating system may very well be open source itself, but the applications won’t.
We're going to see some action with the SP today.