BRN Discussion Ongoing


8.6 Sensor Module Algorithm and Firmware
To operate the ZMOD4410, the software and libraries provided by Renesas should be used. The algorithm for the user’s microprocessor always uses the raw output signals from the ZMOD4410, which are resistances, to determine the level of gases present. Depending on the microprocessor and compiler used, the firmware and its algorithms require 10 to 30 kB flash size. All algorithms feature an automated baseline correction function, ensuring that the module can learn from its environment and distinguish elevated levels of gases under all conditions. The proprietary metal oxide (MOx) used in the sensor ensures that the sensor module can respond effectively to changing TVOC during long-term operation. Therefore, the user focus should always be on the relevant output parameters: the IAQ level according to UBA, the TVOC concentration (available in mg/m3 and as Ethanol equivalent in ppm), and/or the estimation of carbon dioxide (eCO2).
For implementing the sensor module in a customer-specific application, detailed information on the programming is available. More information and guidance on the firmware integration, architecture, and supported platforms are available in the ZMOD4410 Programming Manual – Read Me. Code examples in C and additional firmware descriptions for API, HAL, libraries, etc., are included at no cost in the downloadable firmware package from the ZMOD4410 product page.
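
To make that data flow concrete, here is a minimal sketch in C: raw MOx resistance in, IAQ / TVOC / eCO2 out. To be clear, every function and type name below is a hypothetical placeholder of mine, not the actual Renesas API; the real API, HAL and algorithm libraries come in the downloadable firmware package mentioned above.

/*
 * Illustrative sketch only: names here are hypothetical placeholders,
 * NOT the Renesas ZMOD4410 API.
 */
#include <stdio.h>

/* Hypothetical results structure mirroring the output parameters above. */
typedef struct {
    float iaq;        /* IAQ level according to UBA */
    float tvoc_mg_m3; /* TVOC concentration in mg/m3 */
    float etoh_ppm;   /* TVOC as Ethanol equivalent in ppm */
    float eco2_ppm;   /* estimated carbon dioxide (eCO2) in ppm */
} air_quality_t;

/* Stub standing in for an I2C read of the sensor's raw resistance (ohms). */
static float read_mox_resistance(void)
{
    return 150000.0f; /* placeholder reading */
}

/*
 * Hypothetical algorithm step: in the real firmware package this is a
 * Renesas library call that applies the automated baseline correction and
 * converts the resistance into the air-quality outputs. The body here is
 * a dummy so the sketch compiles and runs.
 */
static void air_quality_update(float r_mox_ohm, air_quality_t *out)
{
    (void)r_mox_ohm;
    out->iaq        = 1.5f;
    out->tvoc_mg_m3 = 0.3f;
    out->etoh_ppm   = 0.2f;
    out->eco2_ppm   = 600.0f;
}

int main(void)
{
    air_quality_t aq;

    /* One iteration of the periodic measurement loop. */
    float r_mox = read_mox_resistance();
    air_quality_update(r_mox, &aq);

    printf("IAQ (UBA): %.1f  TVOC: %.2f mg/m3 (%.2f ppm EtOH)  eCO2: %.0f ppm\n",
           aq.iaq, aq.tvoc_mg_m3, aq.etoh_ppm, aq.eco2_ppm);
    return 0;
}

In a real design this loop would run periodically at whatever sample interval the chosen Renesas algorithm specifies, with the library doing the baseline correction internally.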


@butcherano @Diogenese @Stable Genius

Good morning @uiux

Is that a technical way of saying no, it’s not Akida?

If so, appreciate you keeping it real. It can be quite draining having to temper others’ expectations. So sorry about that!

If so, forgive my exuberance. I was really just very happy for everyone involved: the founders, staff and us shareholders!

I’ll put my excitement back in its box, close the lid on it and wait patiently.

If this product doesn’t include Akida, it’s not the end of the world.

Renesas have still licensed 2 nodes to release on an Arm M33 product with us at some stage, as their CTO told us!

Every day’s a fresh start, enjoy!

1659995953502.jpeg
 
  • Like
  • Love
  • Fire
Reactions: 27 users

uiux

Regular
Good morning @uiux

Is that a technical way of saying no, it’s not Akida?

If so, appreciate you keeping it real. It can be quite draining having to temper others’ expectations. So sorry about that!

If so, forgive my exuberance. I was really just very happy for everyone involved: the founders, staff and us shareholders!

I’ll put my excitement back in its box, close the lid on it and wait patiently.

If this product doesn’t include Akida, it’s not the end of the world.

Renesas have still licensed 2 nodes to release on an Arm M33 product with us at some stage, as their CTO told us!

Every day’s a fresh start, enjoy!

View attachment 13652


Was just highlighting this feature:


All algorithms feature an automated baseline correction function, ensuring that the module can learn from its environment and distinguish elevated levels of gases under all conditions.

--

Which is interesting if the system relies on a neural network for its classifications. Learning on the edge, perhaps?
 
  • Like
  • Fire
  • Love
Reactions: 24 users

Dozzaman1977

Regular
1659996384475.png
 
  • Like
  • Fire
Reactions: 10 users

Slymeat

Move on, nothing to see.
That is exactly my thought and approach. Specifically, I'm interested in facial recognition or driver recognition, as it will probably become mandatory. The laws are not yet in place, nor are the laws regarding AI-based data processing; probably next year.
I'm quite sure that Nviso+Akida has or will get the attention of some lawyers in this particular area. Through MB we have at least more attention in Germany than in the US, where Tesla attracts all the attention, be it lawsuits over facial recognition inside the vehicle or recognition outside, quite apart from the lawsuits against Autopilot (now also one in Germany). As far as I know, none of the lawsuits have been settled yet.

Nviso should do some more lobbying in Brussels so that the key politicians in the law-making process realise that there is also a non-cloud-based solution to the problem.
Exactly @cosors, with cloud-based systems that simply upload the raw data, privacy is a real concern. Once the transmitted images are in “The Cloud”, who the hell knows what can be done with them.

Akida-based sensors, or just sensors that interact with Akida, only deal with metadata. Internal one-shot training is used to locally store the metadata required to do internal facial recognition. But even then, only metadata is stored, and the individuals are anonymously known as person-1, person-2, etc. It is only for demonstrable effect that these are labelled as “Rob”, “Anil”, etc.

For obstacles outside the vehicle, no facial recognition is required. Once again, Akida-based sensors need only transmit metadata describing the type of obstacle, its location, and its motion (helped by LSTM). And once again, none of this info needs to be sent external to the vehicle; it only needs to be stored locally, and then only for a few seconds.

Even when considering the possibility of “black boxes” for EVs, they need only store data pertinent to vehicle logistics and response based on the type of obstacles in the immediate area, and maybe an anonymised LiDAR map when an accident happens. But then we have dash cams for that too; these are deemed less invasive for their lack of connectedness. And a person is liable if they choose to manually upload images.

An EV that uses only metadata and transmits NOTHING externally should not be subject to this legislation. Akida should rule this space. Akida is THE ANSWER to this space!

Bring on the legislation I say. I’d love to see it enforced everywhere. One could even say I want this privacy legislation to be ubiquitous.🤣
 
  • Like
  • Fire
  • Love
Reactions: 20 users

Earlyrelease

Regular
I agree. It’s not up to Brainchip to announce it. And maybe they can’t due to NDA, as Renesas will want to push their own AI where they can, even if it’s inferior, so they get bigger profit margins.

Brainchip may have their hands tied. Once Renesas bought the license for the 2 nodes to put into an Arm M33 product, they paid the $$$ to do what they like with it.

So maybe there won’t be an announcement. It would be nice to have some acknowledgement of a product available now, after such a long wait for all involved!

Maybe it will just be in the financials. Either way, I am just happy for the company’s success and for the shareholders who have held firm!

Or… I could be completely wrong!

Fingers crossed everyone!
Stable
While I agree that it would be very nice for our souls to see the SP gain a bit (or a lot), what I take out of this, if it is us, is the fact that the competitors for these products will know they are using our products. So apart from the trickle of revenue slowly growing into a river, the snowball effect of new potential customers swamping our company is what gives me a little grin on my dial as I sit on the train heading to my 9-to-5 for, hopefully, one of a limited few more times.

Stay strong and stay long chippers
 
  • Like
  • Love
  • Fire
Reactions: 22 users

Dozzaman1977

Regular
The applications this Renesas ZMOD4410 sensor can be used for are endless.


Air Conditioner (High-End)
Air Duct System
Air Purifier Sensor Module
Air Quality Control for IoT Building Automation
Arduino Shield Sensor Board
Bathroom Odor Detector
Bathroom Odor Detector with Bluetooth Low Energy
Biosensing with Wireless Charging and Bluetooth
Bluetooth Low Energy (LE) Sensor Network Solution
Building Automation Lighting with Air Quality Sensors Solution
Cellular Cloud Connected System
Cloud and Sensor Solution for IoT Endpoints
Cloud Connected Wi-Fi & Bluetooth® Low Energy Sensor Hub
Connected Oxygen Concentrator Controller
Diaper Odor Detector
Flammable Gas Leakage Detector
Furnace Control
Gas Sensor with Cloud Connection for Industrial Applications
HVAC Air Quality Sensor
IEEE 802.15.4g-Based Battery-Powered Sub-GHz Wireless Communication
In-Home Air Quality Monitor System
Indoor Air Quality Sensor (IAQ)
Industrial Automation Solution with Industrial Ethernet Module
Industrial CAN Sensor Network
Industrial Ethernet Connectable IoT Sensor Hub
Infusion Level Monitor Using Capacitive Touch Sensing
Instrument Panel for Light Electric Vehicles
IoT Cold Chain Monitoring
IoT Sensor Board with Machine Learning & Bluetooth® Low Energy
Large Power BLDC Ceiling Fan with PFC
Mbed™ Based Image Processing Solution
Modbus ASCII/RTU Slave Board
Multi-Purpose Air Quality Sensor Solution
Multi-Sensor Module for Industrial Ethernet
Multi-Sensor Platform for ASi-5
Personal Safety Tracker
Precision TIG Welding Controller
Secure Cloud & Sensor Solution
Smart BLDC Air Cooler
Smart BLDC Fan with Humidity and Gas Sensors
Smart Industrial Gas Alarm
Smart IoT Air Purifier
Smart Range Hood
Smart Room Controller with DAB Audio System
Thermopile CO₂ Detector
Waterproof Shower Controller
Wearable Activity Tracker
Wireless Sensor Hub
Wireless Sensor Network Solutions

 
Last edited:
  • Like
  • Fire
  • Wow
Reactions: 88 users

equanimous

Norse clairvoyant shapeshifter goddess
 
  • Like
  • Fire
  • Love
Reactions: 42 users

jtardif999

Regular
If ARM's total revenue amounts to 9.7c per chip shipped, why would we be any different? :unsure: I struggle to understand why it is so low.
With all due respect to ARM, I think that Akida technology is a little more specialised and sophisticated than ARM’s low-power reduced-instruction-set chip technology. IMO this should enable BRN to command a much higher price for their IP in each chip sold. Time will tell, I guess.
 
  • Like
  • Fire
  • Love
Reactions: 46 users

Steve10

Regular
If ARM's total revenue amounts to 9.7c per chip shipped, why would we be any different? :unsure: I struggle to understand why it is so low.

The ARM revenue averages out at 9.7c per chip because there would be many older, low-cost IP chips compared with their newer, high-cost IP chips.

ARM Example Royalties (royalty as % of chip cost):
ARM7/9/11: 1.0% - 1.5%
ARM Cortex A-series: 1.5% - 2.0%
ARMv8-based Cortex A-series: 2.0% and above
Mali GPU: 0.75% - 1.25% adder
Physical IP Package (POP): 0.5% adder

You can see above that an old ARM7/9/11 IP chip has a 1-1.5% royalty, whereas a newer ARMv8 Cortex IP chip has 2% and above. Newer ARM IP chips will also have adders, such as 0.75-1.25% for the Mali GPU plus a 0.5% adder for POP. The 0.5% adder for POP is usually paid by the foundry, such as TSMC.

If you look at ARM's customer list, TSMC is one of them due to the POP royalties.

Brainchip, being revolutionary new AI IP, should command royalties similar to new ARM IP chips at 2% and above. If the chip cost is $15, then $15 x 2% = 30c per chip.
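
For anyone who wants to play with those numbers, here is a tiny throwaway sketch of the per-chip royalty arithmetic; the $15 chip cost is just the assumption above and the rates are the ARM example royalties from the table, nothing confirmed for BrainChip:

#include <stdio.h>

/* Per-chip royalty = chip cost x royalty rate (rate given in percent). */
static double royalty_per_chip(double chip_cost_usd, double royalty_pct)
{
    return chip_cost_usd * royalty_pct / 100.0;
}

int main(void)
{
    const double chip_cost = 15.0;            /* assumed $15 chip, as above */
    const double rates[] = { 1.0, 1.5, 2.0 }; /* example royalty rates in % */

    for (size_t i = 0; i < sizeof rates / sizeof rates[0]; i++) {
        printf("%.1f%% of $%.2f = $%.2f per chip\n",
               rates[i], chip_cost, royalty_per_chip(chip_cost, rates[i]));
    }
    /* The 2.0% line prints $0.30 per chip, matching the 30c figure above. */
    return 0;
}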
 
  • Like
  • Fire
Reactions: 32 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
There is a suspicion that Brembo may be using Akida in their Sensify braking system. They've been testing and demonstrating the system on Teslas in the US.





Screen Shot 2022-08-09 at 9.44.03 am.png



 
  • Like
  • Thinking
  • Love
Reactions: 41 users

alwaysgreen

Top 20
The ARM revenue averages out at 9.7c per chip because there would be many older, low-cost IP chips compared with their newer, high-cost IP chips.

ARM Example Royalties (royalty as % of chip cost):
ARM7/9/11: 1.0% - 1.5%
ARM Cortex A-series: 1.5% - 2.0%
ARMv8-based Cortex A-series: 2.0% and above
Mali GPU: 0.75% - 1.25% adder
Physical IP Package (POP): 0.5% adder


You can see above that an old ARM7/9/11 IP chip has a 1-1.5% royalty, whereas a newer ARMv8 Cortex IP chip has 2% and above. Newer ARM IP chips will also have adders, such as 0.75-1.25% for the Mali GPU plus a 0.5% adder for POP. The 0.5% adder for POP is usually paid by the foundry, such as TSMC.

If you look at ARM's customer list, TSMC is one of them due to the POP royalties.

Brainchip, being revolutionary new AI IP, should command royalties similar to new ARM IP chips at 2% and above. If the chip cost is $15, then $15 x 2% = 30c per chip.
I recall 50 cents per chip being thrown around a while back. Pretty sure it came from management, but take this with a grain of salt as I can't be certain. I think it was at the same time they mentioned that royalties will also be dependent on the final cost or quantity of the product it is being utilised in. I'll try and dig up the comments.
 
  • Like
Reactions: 6 users
Any idea if Innoviz are using AKIDA for their Lidar? They just signed a 4B deal with VW and have more sales through BMW. They say sales will be around 8 million units.
Well, well, what do we have here? Nail on the head, I would say.
 

Attachments

  • Screenshot_20220809-092110_LinkedIn.jpg
    Screenshot_20220809-092110_LinkedIn.jpg
    1.1 MB · Views: 257
  • Like
  • Fire
  • Love
Reactions: 33 users

TechGirl

Founding Member
  • Like
  • Haha
  • Love
Reactions: 25 users

alwaysgreen

Top 20
I recall 50 cents per chip being thrown around a while back. Pretty sure it came from management, but take this with a grain of salt as I can't be certain. I think it was at the same time they mentioned that royalties will also be dependent on the final cost or quantity of the product it is being utilised in. I'll try and dig up the comments.
Okay, so I did some digging and came up short, so ignore my 50-cent royalty comment unless someone else recalls it. Some of the things I found are interesting, though (old news).

- We are all expecting revenue growth to outstrip cost growth later this year (so revenue will be coming in), but the Pitt St research paper states: "First license agreement with a major ASIC manufacturer announced in December 2020. This license was paid at the beginning of 2021 and royalties are expected in 2023".

- The paper also states the following on royalties (and remember, the paper was written with Brainchip's involvement): "We believe the most lucrative future revenue stream for the company will be royalties paid by customers for each product they sell that includes Akida IP. These royalties are usually a percentage of the customer's revenue from sales and typically range from 2% to 15%, again depending on the intended application areas, the amount of IP used and expected production volumes. Notably, royalty percentages also depend on the uniqueness of the IP that is being licensed. As the specifications and features of Akida are quite unique vs. other technologies, including Intel's Loihi and IBM's TrueNorth, this may help the company charge higher-than-average royalty percentages for Akida. Other royalty revenue models simply use a fixed dollar amount per chip sold. This is a preferred model for many high-volume production companies, including cell phone manufacturers."

This should put to bed any guessing of royalty percentages.
 
  • Like
  • Fire
  • Love
Reactions: 28 users

equanimous

Norse clairvoyant shapeshifter goddess
Pitt Street Research re-rate from $1.50 to $2.30

 
  • Like
  • Love
  • Fire
Reactions: 52 users
I like how this is worded. Link to full doc below.

Towards dedicated, specialized AI-workload processors, BrainChip’s Akida neuromorphic processor is a revolutionary advanced neural networking processor that brings artificial intelligence to the edge [133]. The Akida NSoC is designed for use as a stand-alone embedded accelerator or as a co-processor, while also including interfaces for ADAS sensors, audio sensors, and other IoT sensors.

https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9785622

SC
 
  • Like
  • Love
  • Fire
Reactions: 22 users
Recent email correspondence between myself and motoring writer Tony Davis from the Australian Financial Review:

1660003325445.png



On Sat, 6 Aug 2022 at 12:23, thelittleshort wrote:
Hi Tony

I just read your great article on Baraja (link)

Have you considered writing a similar piece on BrainChip? Another Australian company with an extremely bright future

BrainChip Akida neuromorphic IP is included in the EQXX concept EV from Mercedes Benz and is also being adopted by some amazing partners including Ford and NASA

Not sure if you are aware of BrainChip already? If not they are definitely worth a bit of research and will be a classic Australian success story within a couple of years

Appreciate your time
Cheers
thelittleshort




On 6 Aug 2022, at 7:51 pm, Tony Davis <tony.crossmedia@gmail.com> wrote:
Hey thelittleshort, thanks for the note. send me some more info and i will read with interest, cheers, tony




On Mon, 8 Aug 2022 at 13:34, thelittleshort wrote:
Hi Tony

Thanks for your interest. Specific BrainChip automotive info and links below. Hopefully they pique your interest.
BrainChip have many irons in the fire, with automotive being only one market that their technology is going to disrupt

Here is an AFR article by Tony Boyd from January 2022 about BrainChip generally as a tech stock to watch

Description of BrainChip's tech itself from their website. This gives an overview of the technology itself
BrainChip is a global technology company that is producing a ground breaking neuromorphic processor that brings artificial intelligence to the edge in a way that is beyond the capabilities of other products. The chip is high performance, small, ultra-low power and enables a wide array of edge capabilities that include on-chip training, learning and inference. The event-based neural network processor is inspired by the spiking nature of the human brain and is implemented in an industry standard digital process. By mimicking brain processing BrainChip has pioneered a processing architecture, called Akida™, which is both scalable and flexible to address the requirements in edge devices. At the edge, sensor inputs are analyzed at the point of acquisition rather than through transmission via the cloud to a data centre. Akida is designed to provide a complete ultra-low power and fast AI Edge Network for vision, audio, olfactory and smart transducer applications. The reduction in system latency provides faster response and a more power efficient system that can reduce the large carbon footprint of data centres.

BrainChip White Paper: Designing Smarter and Safer Cars With Essential AI https://brainchip.com/wp-content/uploads/2022/07/BrainChip_Designing-Smarter-Safer-Cars.pdf

Conventional AI silicon and cloud-centric inference models do not perform efficiently at the automotive edge. This makes it challenging to manufacture vehicles with highly personalized in-cabin systems and advanced assisted driving capabilities. To speed up the development of smarter and safer vehicles, innovative automotive companies are untethering edge AI functions from the cloud - and performing distributed inference computation on local neuromorphic AKIDA silicon. With AKIDA-powered smart sensors and AI accelerators, automotive companies are designing lighter, faster, and more energy efficient in-cabin systems that enable advanced driver verification and customization, sophisticated voice control technology, and next-level gaze estimation and emotion classification capabilities. In addition to redefining the in-cabin experience, AKIDA supports new computer vision and LiDAR systems that detect vehicles, pedestrians, bicyclists, street signs, and objects with incredibly high levels of precision. These fast and energy efficient ADAS systems are already helping automotive companies accelerate the rollout of increasingly advanced assisted driving capabilities.

From Mercedes Benz regarding the inclusion of BrainChip's Akida in their concept EV VISION EQXX https://group-media.mercedes-benz.com/marsMediaSite/instance/ko.xhtml?oid=52282663&filename=VISION-EQXX--taking-electric-range-and-efficiency-to-an-entirely-new-level

Neuromorphic computing – a car that thinks like you: Another key efficiency feature of the VISION EQXX that takes its cue from nature is the way it thinks. It uses an innovative form of information processing called neuromorphic computing. The hardware runs so-called spiking neural networks. Information is coded in discrete spikes and energy is only consumed when a spike occurs, which reduces energy consumption by orders of magnitude. Working with California-based artificial intelligence experts BrainChip, Mercedes-Benz engineers developed systems based on BrainChip’s Akida hardware and software. The example in the VISION EQXX is the “Hey Mercedes” hot-word detection. Structured along neuromorphic principles, it is five to ten times more efficient than conventional voice control. Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years. When applied on scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.

From EE Times - Mercedes Applies Neuromorphic Computing in EV Concept Car by Sally Ward-Foxton https://www.eetimes.com/mercedes-applies-neuromorphic-computing-in-ev-concept-car/

The Mercedes Vision EQXX concept car, promoted as “the most efficient Mercedes-Benz ever built,” incorporates neuromorphic computing to help reduce power consumption and extend vehicle range. To that end, BrainChip’s Akida neuromorphic chip enables in-cabin keyword spotting as a more power-efficient way than existing AI-based keyword detection systems. As automakers shift their focus to electric vehicles, many are struggling to squeeze every last volt from a single battery charge. The need to reduce power consumption in vehicle electronic systems has therefore become critical to extending EV range. Touting Vision EQXX as “a car that thinks like you,” Mercedes promises range of more than 1,000 km (about 620 miles) on a single charge. “Working with California-based artificial intelligence experts BrainChip, Mercedes-Benz engineers developed systems based on BrainChip’s Akida hardware and software,” Mercedes noted in a statement describing the Vision EQXX. “The example in the Vision EQXX is the “Hey Mercedes” hot-word detection. Structured along neuromorphic principles, it is five to ten times more efficient than conventional voice control,” the carmaker claimed.

Valeo
BrainChip signs joint development agreement for Akida neuromorphic chip with Valeo https://smallcaps.com.au/brainchip-joint-development-agreement-akida-neuromorphic-chip-valeo/

NVISO
https://www.nviso.ai/en/news/nviso-...-computing-using-the-brainchip-akida-platform
In this video BrainChip’s eco-system partner, Nviso, demonstrates Emotion Detection running on the BrainChip AKD1000 reference board BrainChip + Nviso Emotion Detection Demo
https://brainchip.com/brainchip-and-nviso-partner-automotive-edge-devices/

Cheers
thelittleshort




From: Tony Davis <tony.crossmedia@gmail.com>
Subject: Re: BrainChip
Date:
9 Aug 2022 at 10:08 am
To: thelittleshort

thanks thelittleshort, leave it with me ... might work for a supplement i am doing a little later in the year, cheers, tony
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 104 users

buena suerte :-)

BOB Bank of Brainchip
Recent email correspondence between myself and motoring writer Tony Davis from the Australian Financial Review:

View attachment 13662


On Sat, 6 Aug 2022 at 12:23, thelittleshort wrote:
Hi Tony

I just read your great article on Baraja (link)

Have you considered writing a similar piece on BrainChip? Another Australian company with an extremely bright future

BrainChip Akida neuromorphic IP is included in the EQXX concept EV from Mercedes Benz and is also being adopted by some amazing partners including Ford and NASA

Not sure if you are aware of BrainChip already? If not they are definitely worth a bit of research and will be a classic Australian success story within a couple of years

Appreciate your time
Cheers
thelittleshort




On 6 Aug 2022, at 7:51 pm, Tony Davis <tony.crossmedia@gmail.com> wrote:
Hey thelittleshort, thanks for the note. send me some more info and i will read with interest, cheers, tony




On Mon, 8 Aug 2022 at 13:34, thelittleshort wrote:
Hi Tony

Thanks for your interest. Specific BrainChip automotive info and links below. Hopefully they pique your interest.
BrainChip have many irons in the fire, with automotive being only one market that their technology is going to disrupt
Here is an AFR article by Tony Boyd from January 2022 about BrainChip generally as a tech stock to watch

Description of BrainChip's tech itself from their website. This gives an overview of the technology itself
BrainChip is a global technology company that is producing a ground breaking neuromorphic processor that brings artificial intelligence to the edge in a way that is beyond the capabilities of other products. The chip is high performance, small, ultra-low power and enables a wide array of edge capabilities that include on-chip training, learning and inference. The event-based neural network processor is inspired by the spiking nature of the human brain and is implemented in an industry standard digital process. By mimicking brain processing BrainChip has pioneered a processing architecture, called Akida™, which is both scalable and flexible to address the requirements in edge devices. At the edge, sensor inputs are analyzed at the point of acquisition rather than through transmission via the cloud to a data centre. Akida is designed to provide a complete ultra-low power and fast AI Edge Network for vision, audio, olfactory and smart transducer applications. The reduction in system latency provides faster response and a more power efficient system that can reduce the large carbon footprint of data centres.

BrainChip White Paper: Designing Smarter and Safer Cars With Essential AI https://brainchip.com/wp-content/uploads/2022/07/BrainChip_Designing-Smarter-Safer-Cars.pdf

Conventional AI silicon and cloud-centric inference models do not perform efficiently at the automotive edge. This makes it challenging to manufacture vehicles with highly personalized in-cabin systems and advanced assisted driving capabilities. To speed up the development of smarter and safer vehicles, innovative automotive companies are untethering edge AI functions from the cloud - and performing distributed inference computation on local neuromorphic AKIDA silicon. With AKIDA-powered smart sensors and AI accelerators, automotive companies are designing lighter, faster, and more energy efficient in-cabin systems that enable advanced driver verification and customization, sophisticated voice control technology, and next-level gaze estimation and emotion classification capabilities. In addition to redefining the in-cabin experience, AKIDA supports new computer vision and LiDAR systems that detect vehicles, pedestrians, bicyclists, street signs, and objects with incredibly high levels of precision. These fast and energy efficient ADAS systems are already helping automotive companies accelerate the rollout of increasingly advanced assisted driving capabilities.

From Mercedes Benz regarding the inclusion of BrainChip's Akida in their concept EV VISION EQXX https://group-media.mercedes-benz.com/marsMediaSite/instance/ko.xhtml?oid=52282663&filename=VISION-EQXX--taking-electric-range-and-efficiency-to-an-entirely-new-level

Neuromorphic computing – a car that thinks like you: Another key efficiency feature of the VISION EQXX that takes its cue from nature is the way it thinks. It uses an innovative form of information processing called neuromorphic computing. The hardware runs so-called spiking neural networks. Information is coded in discrete spikes and energy is only consumed when a spike occurs, which reduces energy consumption by orders of magnitude. Working with California-based artificial intelligence experts BrainChip, Mercedes-Benz engineers developed systems based on BrainChip’s Akida hardware and software. The example in the VISION EQXX is the “Hey Mercedes” hot-word detection. Structured along neuromorphic principles, it is five to ten times more efficient than conventional voice control. Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years. When applied on scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.

From EE Times - Mercedes Applies Neuromorphic Computing in EV Concept Car by Sally Ward-Foxton https://www.eetimes.com/mercedes-applies-neuromorphic-computing-in-ev-concept-car/#

The Mercedes Vision EQXX concept car, promoted as “the most efficient Mercedes-Benz ever built,” incorporates neuromorphic computing to help reduce power consumption and extend vehicle range. To that end, BrainChip’s Akida neuromorphic chip enables in-cabin keyword spotting as a more power-efficient way than existing AI-based keyword detection systems. As automakers shift their focus to electric vehicles, many are struggling to squeeze every last volt from a single battery charge. The need to reduce power consumption in vehicle electronic systems has therefore become critical to extending EV range. Touting Vision EQXX as “a car that thinks like you,” Mercedes promises range of more than 1,000 km (about 620 miles) on a single charge. “Working with California-based artificial intelligence experts BrainChip, Mercedes-Benz engineers developed systems based on BrainChip’s Akida hardware and software,” Mercedes noted in a statement describing the Vision EQXX. “The example in the Vision EQXX is the “Hey Mercedes” hot-word detection. Structured along neuromorphic principles, it is five to ten times more efficient than conventional voice control,” the carmaker claimed.

Valeo
BrainChip signs joint development agreement for Akida neuromorphic chip with Valeo https://smallcaps.com.au/brainchip-joint-development-agreement-akida-neuromorphic-chip-valeo/

NVISO
https://www.nviso.ai/en/news/nviso-...-computing-using-the-brainchip-akida-platform
In this video BrainChip’s eco-system partner, Nviso, demonstrates Emotion Detection running on the BrainChip AKD1000 reference board BrainChip + Nviso Emotion Detection Demo
https://brainchip.com/brainchip-and-nviso-partner-automotive-edge-devices/

Cheers
thelittleshort




From: Tony Davis <tony.crossmedia@gmail.com>
Subject: Re: BrainChip
Date:
9 Aug 2022 at 10:08 am
To: thelittleshort

thanks thelittleshort, leave it with me ... might work for a supplement i am doing a little later in the year, cheers, tony
That's awesome ... great work @thelittleshort :)
 
  • Like
  • Love
  • Fire
Reactions: 35 users

Slymeat

Move on, nothing to see.
Hope you guys might be able to help me here. I am a retired banker, so the BrainChip thing is amazing for me to understand (and most of it I still don’t get), but being a shareholder is a privilege.

I am on holiday presently in Seville, and I met a bloke who is the CTO of The Linux Foundation in the US.

https://www.linuxfoundation.org/.

He is telling me of Open Source computing and how it is vital in the current evolution of computing. I suggested Brainchip was too, but was out of my depth in keeping up with him. Can anyone guide me to anything that might help my understanding of where Open Source fits in, and is it relevant to us?

Oh, I just heard Olivia Newton-John has passed away. I am crying, especially coming so soon after Judith Durham a few days ago 😢
If you get to see this person again (I assume it is Nirav Patel you are talking about), ask him if he knows Andrew Morton, lead developer and maintainer of the Linux kernel. Andrew is also employed by Google. I used to work with Andrew back in the 1990s; we were jogging buddies too. I expect the CTO of the Linux Foundation probably has a working relationship with him. But I digress.

Open source means anyone can write software and others can use it freely. Even better, the source code is published in such a way that others can see it, and can even modify it and re-publish it. That happens quite often and is how the community assists itself. This could help Akida uptake as people write software that can utilise Akida’s IP, and then users need to buy a licence, or buy hardware, or pay for IP on which to run their solution.

You probably have seen that software @uiux has been sharing. That’s on a platform used by software developers to share their code openly. And that is what @uiux is doing. And uiux’s software can help others work with Akida! At least show them some things that are possible and how simple it is.

For the coding I used to do, we were never allowed to use open source or publish our code as open source. Sometimes we had to work on air-gapped computers in secure internal rooms. It was secret-squirrel stuff. I only add that as I expect a lot of companies developing systems utilising Akida may be in the same boat. Their code is the opposite of open source, as their livelihood depends on it. But any open source stuff out there will certainly help with prototyping and proof-of-concept work to show to investors and the like.

My view is that open source is for playing with and testing the waters only. If that is what was meant by “vital to the current evolution of computing” then I agree.

Any real-world, life-critical programs most likely will not contain open source code. The operating system may very well be open source itself, but the applications won’t.
 
  • Like
  • Fire
  • Love
Reactions: 26 users

uiux

Regular
If you get to see this person again (I assume it is Nirav Patel you are talking about), ask him if he knows Andrew Morton, lead developer and maintainer of the Linux kernel. Andrew is also employed by Google. I used to work with Andrew back in the 1990s; we were jogging buddies too. I expect the CTO of the Linux Foundation probably has a working relationship with him. But I digress.

Open source means anyone can write software and others can use it freely. Even better, the source code is published in such a way that others can see it, and can even modify it and re-publish it. That happens quite often and is how the community assists itself. This could help Akida uptake as people write software that can utilise Akida’s IP, and then users need to buy a licence, or buy hardware, or pay for IP on which to run their solution.

You probably have seen that software @uiux has been sharing. That’s on a platform used by software developers to share their code openly. And that is what @uiux is doing. And uiux’s software can help others work with Akida! At least show them some things that are possible and how simple it is.

For the coding I used to do, we were never allowed to use open source or publish our code as open source. Sometimes we had to work on air-gapped computers in secure internal rooms. It was secret-squirrel stuff. I only add that as I expect a lot of companies developing systems utilising Akida may be in the same boat. Their code is the opposite of open source, as their livelihood depends on it. But any open source stuff out there will certainly help with prototyping and proof-of-concept work to show to investors and the like.

My view is that open source is for playing with and testing the waters only. If that is what was meant by “vital to the current evolution of computing” then I agree.

Any real-world, life-critical programs most likely will not contain open source code. The operating system may very well be open source itself, but the applications won’t.



Open source indeed

 
  • Like
  • Love
  • Fire
Reactions: 12 users