It might only be a small thing, but in the courtrooms of common law countries such as Australia it is considered very bad form to quote selectively from source documents, and it is always regarded as an attempt to mislead the court.
BrainChip, bless their little cotton socks, unlike a certain company who shall remain named above, link on their website to an EE Times article titled 'Cars that think like you'. They do not edit the article to remove the parts covering a competitor, nor do they rewrite the title as 'SynSense. Cars that think like you' so that it appears to say that their product creates a car that 'thinks like you', misleading the reader and plagiarising the Mercedes-Benz catchphrase in the process.
To address this shortfall in ethics, I have extracted below the balance of the article which the above-named company saw fit to edit out:
"BRAINCHIP AKIDA
The Mercedes EQXX concept car, debuted at CES 2022, features BrainChip’s Akida neuromorphic processor performing in-cabin keyword spotting. Promoted as “the most efficient Mercedes-Benz ever built,” the car takes advantage of neuromorphic technology to use less power than deep learning powered keyword spotting systems. This is crucial for a car that is supposed to deliver a 620-mile range (about 1,000 km) on a single battery charge, 167 miles further than Mercedes’ flagship electric vehicle, the EQS.
Mercedes said at the time that BrainChip’s solution was 5 to 10× more efficient than conventional voice control when spotting the wake word “Hey Mercedes”.
Mercedes’ EQXX concept EV has a power efficiency of more than 6.2 miles per kWh, almost double that of the EQS. (Source: Mercedes)
“Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years,” according to Mercedes. “When applied at scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.”
“[Mercedes is] looking at big issues like battery management and transmission, but every milliwatt counts, and the context of [BrainChip’s] inclusion was that even the most basic inference, like spotting a keyword, is important when you consider the power envelope,” Jerome Nadel, chief marketing officer at BrainChip, told EE Times.
Nadel said that a typical car in 2022 may have as many as 70 different sensors. For in–cabin applications, these sensors may be enabling facial detection, gaze estimation, emotion classification, and more.
“From a systems architecture point of view, we can do it in a 1:1 way, there’s a sensor that will do a level of pre–processing, and then the data will be forwarded,” he said. “There would be AI inference close to the sensor and… it would pass the inference meta data forward and not the full array of data from the sensor.”
The idea is to minimize the size and complexity of data packets sent to AI accelerators in automotive head units, while lowering latency and minimizing energy requirements. With a potential for 70 Akida chips or Akida–enabled sensors in each vehicle, Nadel said each one will be a “low–cost part that will play a humble role,” noting that the company needs to be mindful of the bill of materials for all these sensors.
BrainChip sees its neuromorphic processor next to every sensor in a car. (Source: BrainChip)
Looking further into the future, Nadel said neuromorphic processing will find its way into ADAS and autonomous vehicle systems, too. There is potential to reduce the need for other types of power–hungry AI accelerators.
“If every sensor had a limited, say, one or two node implementation of Akida, it would do the sufficient inference and the data that would be passed around would be cut by an order of magnitude, because it would be the inference metadata… that would have an impact on the horsepower that you need in the server in the trunk,” he said.
BrainChip’s Akida chip accelerates spiking neural networks (SNNs) and convolutional neural networks (via conversion to SNNs). It is not tailored for any particular use case or sensor, so it can work with vision sensing for face recognition or person detection, or other audio applications such as speaker ID. BrainChip has also demonstrated Akida with smell and taste sensors, though it’s more difficult to imagine how these sensors might be used in automotive (smelling and tasting for air pollution or fuel quality, perhaps).
Akida is set up to process SNNs or deep learning CNNs that have been converted to the spiking domain. Unlike native spiking networks, converted CNNs retain some information in spike magnitude, so 2- or 4-bit computation may be required. This approach, however, allows exploitation of CNNs’ properties, including their ability to extract features from large datasets. Both types of networks can be updated at the edge using STDP (spike-timing-dependent plasticity) — in the Mercedes example, that might mean retraining the network to spot more or different keywords after deployment.
Mercedes used BrainChip’s Akida processor to listen for the keyword “Hey Mercedes” in the cabin of its EQXX concept EV. (Source: Mercedes)
Mercedes has confirmed that “many innovations”, including “specific components and technologies” from the EQXX concept car, will make it into production vehicles, reports Autocar. There is no word yet on whether new models of Mercedes will feature artificial brains."
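For readers curious about the 2- or 4-bit spike magnitudes the article mentions for converted CNNs, here is a rough Python sketch of the general idea. This is my own illustration, not BrainChip's actual algorithm: it simply quantizes a CNN layer's ReLU activations down to a handful of levels, which is the kind of low-precision "magnitude" information a converted network carries that a native spiking network does not.

```python
import numpy as np

def quantize_activations(acts, bits=2):
    """Quantize non-negative CNN activations to 2**bits - 1 magnitude levels.

    Hypothetical sketch only: native SNNs carry information in spike
    presence/timing alone; CNNs converted to the spiking domain keep some
    information in spike magnitude, hence a few bits per event.
    """
    acts = np.clip(acts, 0.0, None)           # ReLU output: non-negative
    levels = 2 ** bits - 1                    # e.g. 3 levels for 2-bit spikes
    scale = acts.max() if acts.max() > 0 else 1.0
    return np.round(acts / scale * levels).astype(np.uint8)

# Example: five activations squeezed into 2-bit magnitudes {0..3}
acts = np.array([0.0, 0.1, 0.4, 0.9, 1.3])
print(quantize_activations(acts, bits=2))     # -> [0 0 1 2 3]
```

The trade-off the article hints at falls out of this: more bits preserve more of the original CNN's accuracy, fewer bits mean smaller, cheaper events to compute and pass around.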
You only get one chance in this world to be considered ethical to the core of your existence, and unfortunately that chance has been thrown away.
My opinion only DYOR
FF
AKIDA BALLISTA