BRN Discussion Ongoing

Sirod69

bavarian girl ;-)
Didn't we have this before, when someone at BRN uploaded something to the homepage too early, before it should have gone live? There was something like that, as far as I can remember
 
  • Like
Reactions: 9 users

Pmel

Regular
Collaboration with AI Labs Inc.
Still not showing on the BRN website?
 

Attachments

  • 31006F38-6BF4-4E64-9F8D-25D0C8838492.jpeg (734 KB · Views: 119)
Last edited:
  • Like
  • Fire
Reactions: 4 users

HopalongPetrovski

I'm Spartacus!
Didn't we have this before, when someone at BRN uploaded something to the homepage too early, before it should have gone live? There was something like that, as far as I can remember
It was the whole new website. 🤣
Again, discovered on a weekend and probably slated for a Monday morning surprise.
Someone among the 1000 discovered it and posted links to it here before it was officially released 🤣
You'd think the guys in the States would be wise to this by now, particularly as they are keeping everything HUSH, HUSH.
It's like catnip to our curious cats. 🤣
 
  • Like
  • Haha
  • Fire
Reactions: 23 users

Xhosa12345

Regular
Didn't we have this before, when someone at BRN uploaded something to the homepage too early, before it should have gone live? There was something like that, as far as I can remember
borat-cheat.gif
 
  • Haha
  • Like
Reactions: 12 users

TECH

Regular
As I mentioned recently, Intel has made a very smart play in offering/accepting BrainChip Inc into their ecosystem.

Loihi may be fantastic research in the making, but nothing will ever beat owning revolutionary technology that is not only 100% proven in silicon, but also available immediately as a commercial offering, whether as IP block(s) or SoC.

I personally believe they have (finally) conceded that our technology is superior, much more advanced, ready now, cheaper, and moving further ahead. It will be very interesting to see how Intel plays this over the next 5 years. I'd watch this space rather closely, but that's just my own view.

Please see the link below..........Tech ;)

 
  • Like
  • Fire
  • Love
Reactions: 59 users

Sirod69

bavarian girl ;-)

BrainChip Latest News​

Well, we already know one more piece of news

 
  • Like
  • Fire
Reactions: 22 users

Terroni2105

Founding Member


Some of these projects are interesting.



View attachment 28663
Bacon, I don't think the Microsoft one means AI Labs Inc; I think they're just referring to their lab in general 🤔
 
  • Like
  • Thinking
Reactions: 7 users

Terroni2105

Founding Member
The 802.11ah Wi-Fi HaLow technology is apparently better at passing through materials (walls?)

Ultra-long range, ultra-low power, and massive capacity Wi-Fi​

The industry’s smallest and most integrated single-chip solution. Operating in the sub-1 GHz license exempt RF band means superior penetration through materials and longer range than any other Wi-Fi.


CUTTING-EDGE Wi-Fi DEVICE PERFORMANCE​

Morse Micro's 802.11ah Wi-Fi HaLow technology features one of the smallest and lowest-power single-chip solutions, incorporating Radio, PHY, and MAC as well as an optional Host Applications Processor, designed in compliance with the sub-1 GHz IEEE 802.11ah standard to power the IoT. It solves the challenges of Wi-Fi in IoT devices, overcoming the fundamental weaknesses of existing wireless technologies by offering ultra-low power, longer range, and secure connections at higher capacity.

BrainChip already has an established connection with Carnegie Mellon, and it has been said that it would be mutually beneficial, and extremely profitable, if our technology were successfully partnered with Morse Micro.

Probably all just coincidence.

TT
Carnegie Mellon does have a relationship with Megachips

edit: my mistake, Megachips is with Santa Clara University, not sure about CMU
 
Last edited:
  • Like
  • Fire
Reactions: 11 users

BaconLover

Founding Member
Bacon, I don't think the Microsoft one means AI Labs Inc; I think they're just referring to their lab in general 🤔
Yes Terroni, I didn't think they were the same either, but I found those on my search. Lots of projects in the field for us to keep an eye on, so I thought I'd share.
There are a few AI Labs going around with the same name, just like we have a few "BrainChips" going around.
 
  • Like
Reactions: 11 users

wilzy123

Founding Member
I think this will give you an idea of the price... (see ANN 10 January 2023)

"The Capital Call Notice pricing period will begin upon exercise of the put option on or around 11 January, with an anticipated ending date in late March or early April, subject to adjustments based on the share price performance throughout the pricing period.

The issue price for the capital call shares will be 91.5% of the higher of the average daily VWAP of shares over the pricing period (subject to any applicable adjustments) and the minimum price notified to LDA Capital by the Company."
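
For anyone who wants to sanity-check what that pricing formula implies, here's a minimal sketch with made-up numbers (the VWAP and floor values below are hypothetical, not BRN's actual figures):

```python
# Hypothetical worked example of the issue-price formula from the announcement.
# Both reference prices below are invented for illustration only.
avg_daily_vwap = 0.60   # assumed average daily VWAP (A$) over the pricing period
floor_price = 0.55      # assumed minimum price notified to LDA Capital

# Issue price = 91.5% of the higher of the two reference prices.
issue_price = 0.915 * max(avg_daily_vwap, floor_price)
print(f"Issue price: A${issue_price:.4f}")  # A$0.5490
```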

A call for an additional 10M is completely unrelated to this current call and is subject to company approval.

@FJ-215 Well?
 

Boab

I wish I could paint like Vincent

Attachments

  • Page 6.jpg (418.7 KB · Views: 81)
  • Page 7.jpg (297.1 KB · Views: 79)
  • Page 9.jpg (240.6 KB · Views: 84)
  • Page 8.jpg (231 KB · Views: 77)
  • Like
  • Fire
Reactions: 17 users

BaconLover

Founding Member
  • Like
  • Fire
Reactions: 23 users

Terroni2105

Founding Member
Last edited:
  • Like
  • Love
  • Haha
Reactions: 5 users

alwaysgreen

Top 20
  • Like
  • Haha
  • Love
Reactions: 23 users

JK200SX

Regular
1675560185218.png



Went live on 31st Jan 2023.
 

Attachments

  • Screenshot 2023-02-05 at 12-21-22 BrainChip Partners with AI Labs on Next-Generation Applicati...png (1,012.1 KB · Views: 103)
  • Like
  • Love
  • Fire
Reactions: 25 users

JK200SX

Regular
  • Like
Reactions: 9 users

Tothemoon24

Top 20
iMedia

Application of SNN in Vehicle Field

2023-02-03 22:41 HKT

Automakers are using neuromorphic technologies to implement AI functions such as keyword recognition, driver attention monitoring, and passenger behavior monitoring.

Mimicking biological brain processes is tantalizing because it promises advanced functionality without a significant increase in power consumption, which is EV-friendly. Neuromorphic computing and perception are also expected to bring other advantages, such as extremely low latency, enabling real-time decision-making in some cases. This combination of low latency and high energy efficiency is very attractive.

Spiking Networks


The truth is, there's still a lot we don't know about how the human brain works. However, cutting-edge research has shown that neurons communicate by sending each other electrical signals called spikes, and that the sequence and timing of the spikes (rather than their size) are the key factors. Mathematical models of how neurons respond to these spikes are still being studied, but many scientists agree that if multiple spikes arrive at a neuron at the same time (or in very rapid succession), the information those spikes represent is correlated, causing the neuron to fire a spike of its own.
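
(Illustrative aside: a minimal leaky integrate-and-fire neuron captures the "coincident spikes push the neuron over threshold" idea described above. All parameters here are invented for the sketch, not any vendor's model.)

```python
import numpy as np

def lif_neuron(input_spikes, weights, threshold=1.0, leak=0.9):
    """Minimal leaky integrate-and-fire neuron.
    input_spikes: (timesteps, n_inputs) binary array; weights: (n_inputs,).
    Returns a binary output spike train."""
    potential, out = 0.0, []
    for t in range(input_spikes.shape[0]):
        potential = leak * potential + input_spikes[t] @ weights
        if potential >= threshold:   # near-coincident inputs cross the threshold
            out.append(1)
            potential = 0.0          # reset after firing
        else:
            out.append(0)
    return np.array(out)

# Two spikes in rapid succession trigger an output spike; a lone spike does not.
spikes = np.array([[1, 0], [0, 1], [0, 0]])
print(lif_neuron(spikes, weights=np.array([0.6, 0.6])))  # [0 1 0]
```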

This is in contrast to artificial neural networks based on deep learning (the mainstream AI today), where information travels through the network in a regular rhythm; that is, the information entering each neuron is represented as a numerical value rather than encoded in time.

Making a spike-based artificial system is not easy. Besides the fact that we don't fully understand how neurons work, there is no consensus on the best way to train spiking neural networks. Backpropagation requires computing derivatives, which is not possible with spikes. Some companies approximate the derivative of the spike in order to use backpropagation (SynSense does this), while others use a technique called STDP (spike-timing-dependent plasticity), which is closer to how biological brains function but is not yet mature as a technique (BrainChip uses this method for one-shot learning at the edge). It is also possible to take a deep learning CNN, trained by backpropagation in the normal way, and convert it to run in the spike domain (another technique BrainChip uses).
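
(Illustrative aside: a toy version of the pair-based STDP rule mentioned above; real implementations, including BrainChip's, differ in the details.)

```python
import math

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP: strengthen the synapse if the presynaptic spike
    precedes the postsynaptic one, weaken it otherwise. Times in ms."""
    dt = t_post - t_pre
    if dt > 0:   # pre before post: causal pairing, potentiate
        w += a_plus * math.exp(-dt / tau)
    else:        # post before pre: anti-causal pairing, depress
        w -= a_minus * math.exp(dt / tau)
    return min(max(w, 0.0), 1.0)     # keep the weight in [0, 1]

w = stdp_update(0.5, t_pre=10.0, t_post=15.0)
print(round(w, 4))  # ~0.5078: the weight grows after a causal pairing
```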


SynSense Speck

SynSense is working with BMW to advance the integration of neuromorphic chips in smart cockpits and to explore related areas together. BMW will evaluate SynSense's Speck SoC, which combines SynSense's neuromorphic vision processor with iniVation's 128x128-pixel event camera. The module can be used to capture visual information in real time, identify and detect objects, and perform other vision-based detection and interaction functions.

Dylan Muir, vice president of global research operations at SynSense, said: "When BMW replaces RGB cameras with Speck modules for visual perception, they can replace not only sensors, but also a lot of GPU or CPU computing required to process standard RGB visual streams."


Using event-based cameras provides a higher dynamic range than standard cameras, which is beneficial for use in extreme lighting conditions inside and outside the vehicle.

BMW will explore the use of neuromorphic technology in cars, including monitoring driver attention and passenger behavior through the Speck module.

"In the coming months, we will explore more applications inside and outside the vehicle," Muir said.


SynSense's neuromorphic vision processors have a fully asynchronous digital architecture. Each neuron uses integer logic with 8-bit synaptic weights, 16-bit neuron states, 16-bit thresholds, and unit input-output spikes. Neurons use a simple integrate-and-fire model: input spikes are weighted by the neuron's synaptic weights and accumulated until a threshold is reached, at which point the neuron fires a simple 1-bit spike. Overall, the design is a balance between complexity and computational efficiency, Muir said.
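
(Illustrative aside: a toy integer integrate-and-fire step using the bit widths in that description; the threshold value and everything else below are assumptions.)

```python
import numpy as np

THRESHOLD = np.int16(1000)  # 16-bit threshold; the value itself is made up

def if_step(state, in_spikes, weights):
    """One update of an integer integrate-and-fire neuron: 8-bit synaptic
    weights accumulate into a 16-bit state; crossing the threshold emits
    a 1-bit (unit) spike and resets the state."""
    state = np.int16(state + int(in_spikes @ weights.astype(np.int16)))
    if state >= THRESHOLD:
        return np.int16(0), 1   # reset state, emit unit spike
    return state, 0

weights = np.array([120, 80, 50], dtype=np.int8)   # 8-bit synaptic weights
state, spike = if_step(np.int16(900), np.array([1, 1, 0]), weights)
print(state, spike)  # 0 1 -- 900 + 200 crosses the 1000 threshold
```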


SynSense's electronic neurons are based on the integrate-and-fire model


SynSense's digital chips are designed to process event-based CNNs, with each layer processed by a different core. The cores run asynchronously and independently; the entire processing pipeline is event-driven.

"Our Speck modules run in real-time with low latency," Muir said. "We can manage effective inference rates above 20Hz at less than 5mW power consumption. This is much faster than using traditional low-power computing on standard RGB video streams. ."

While SynSense and BMW will initially explore neuromorphic use cases in smart cockpits, it has potential for other automotive applications as well.


"First, we'll explore non-safety-critical use cases, and we're planning future versions of Speck with higher resolution, as well as improvements to our DynapCNN vision processor, which will interface with high-resolution sensors," Muir said. We plan for these future technologies It will support advanced automotive applications such as autonomous driving, emergency braking, etc."


SynSense and iniVation's Speck module, an event-camera-based module containing sensors and processors


BrainChip Akida

Mercedes-Benz's EQXX concept car, which debuted at CES 2022, uses BrainChip's Akida neuromorphic processor for in-vehicle keyword recognition. Billed as "the most efficient car Mercedes has ever made," the car utilizes neuromorphic technology that consumes less power than a deep learning-based keyword-spotting system. That's crucial for a car with a range of 620 miles, 167 miles more than Mercedes' flagship electric car, the EQS.

Mercedes said at the time that BrainChip's solution was five to 10 times more efficient than traditional voice controls at recognizing the wake word "Hey Mercedes."



Mercedes said, "Although neuromorphic computing is still in its infancy, systems like these will be on the market within a few years. When applied at scale throughout a vehicle, they have the potential to radically reduce the power consumption needed to run the latest AI technologies."

BrainChip's CMO Jerome Nadel said: "Mercedes is focused on big issues like battery management and transmission, but every milliwatt counts. When you think about energy efficiency, even the most basic inference, like spotting a keyword, is important."


A typical car in 2022 could have as many as 70 different sensors, Nadel said. For cockpit applications, these sensors can enable face detection, gaze assessment, emotion classification, and more.

He said: "From a system architecture perspective, we can do a 1:1 approach where there is a sensor that does some preprocessing, and then the data is forwarded. The AI will do inference near the sensor... instead of the full array of data from the sensor, the inference metadata is passed forward."

The idea is to minimize the size and complexity of the packets sent to AI accelerators, while reducing latency and minimizing power consumption. Each vehicle would likely have 70 Akida chips or Akida-enabled sensors, each a "low-cost part you won't notice at all," Nadel said, noting that attention needs to be paid to the BOM of all these sensors.
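
(Illustrative aside: a back-of-the-envelope comparison of shipping raw frames versus inference metadata; all numbers are hypothetical.)

```python
# Hypothetical: one raw VGA RGB frame vs. a small metadata record
# (e.g. label, confidence, bounding box) passed forward instead.
frame_bytes = 640 * 480 * 3   # ~0.9 MB per raw frame
meta_bytes = 64               # assumed size of an inference-metadata packet
print(f"{frame_bytes / meta_bytes:,.0f}x less data per frame")  # 14,400x
```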



BrainChip expects to have its neuromorphic processor next to every sensor on the vehicle

Going forward, Nadel said, neuromorphic processing will also be used in ADAS and autonomous driving systems. This has the potential to reduce the need for other types of power-hungry AI accelerators.


"If every sensor could have Akida configured on one or two nodes, it would do adequate inference, and the data passed would be an order of magnitude less, because that would be inference metadata...that would affect the servers you need," he said. power."

BrainChip's Akida chip accelerates SNNs (spiking neural networks) and CNNs (by converting them to SNNs). It isn't tailored to any specific use case or sensor, so it can be paired with visual sensing for face recognition or people detection, or with audio applications like speaker ID. BrainChip has also demonstrated Akida with smell and taste sensors, although it's hard to imagine how these could be used in cars (perhaps to detect air pollution or fuel quality).

Akida is set up to handle SNNs or deep learning CNNs that have been converted to SNNs. Unlike native spiking networks, converted CNNs preserve some spike-level information, so they may require 2- or 4-bit computation. However, this approach makes it possible to exploit the properties of CNNs, including their ability to extract features from large datasets. Both types of networks can be updated at the edge using STDP; in the case of Mercedes-Benz, this might mean retraining the network after deployment to spot more or different keywords.
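
(Illustrative aside: a crude sketch of the low-bit activation idea behind converted CNNs. The 2- and 4-bit figures come from the article; the uniform quantization scheme below is an assumption.)

```python
import numpy as np

def quantize_activations(x, bits=4):
    """Uniformly quantize ReLU activations (assumed normalized to [0, 1])
    to 2**bits discrete levels, the kind of reduced precision a CNN
    converted to the spike domain might use."""
    levels = 2 ** bits - 1
    return np.round(np.clip(x, 0.0, 1.0) * levels) / levels

acts = np.array([0.03, 0.41, 0.87])
print(quantize_activations(acts, bits=2))  # [0.    0.333 1.   ] -> 4 levels
print(quantize_activations(acts, bits=4))  # 16 levels, closer to the originals
```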



According to Autocar, Mercedes-Benz confirmed that "many innovations" from the EQXX concept car, including "specific components and technologies," will be used in the production model. There's no word yet on whether new Mercedes-Benz models will feature artificial brains.
 
  • Like
  • Fire
  • Love
Reactions: 47 users

Murphy

Life is not a dress rehearsal!
  • Haha
  • Like
Reactions: 8 users

Terroni2105

Founding Member
I can't find anything about the founder of AI Labs Inc, Bhaskar Rao. There is nothing on the AI Labs Inc website about the management team either.

There are others with the same name, and I have found one who is an entrepreneur and is possibly the same man as the AI Labs Inc founder.

He is also Founder and CEO of another company named Technosphere (in some places referred to as Technosphere Labs, so similar to AI Labs) https://www.technosphere.io/about-us/

https://www.linkedin.com/in/bhaskar-rao-2490087/

1675561912577.jpeg



Anyway, I find it really unusual that it's so difficult to find anything on Bhaskar Rao and AI Labs Inc (linked together) when I search.

Does anyone else have more luck?
 
Last edited:
  • Like
  • Thinking
Reactions: 15 users

VictorG

Member
I can't find anything about the founder of AI Labs Inc, Bhaskar Rao. There is nothing on the AI Labs Inc website about the management team either.

There are others with the same name, and I have found one who is an entrepreneur and is possibly the same man as the AI Labs Inc founder.

He is also Founder and CEO of another company named Technosphere (in some places referred to as Technosphere Labs, so similar to AI Labs) https://www.technosphere.io/about-us/

https://www.linkedin.com/in/bhaskar-rao-2490087/

View attachment 28691


Anyway, I find it really unusual that it's so difficult to find anything on Bhaskar Rao and AI Labs Inc when I search.

Does anyone else have more luck?
Their LinkedIn bio page states that AI Labs is a public company.
I searched the NASDAQ and New York exchanges, nothing.

Screenshot_20230205_120049_LinkedIn.jpg
 
  • Like
Reactions: 5 users