BRN Discussion Ongoing

Esq.111

Fascinatingly Intuitive.
  • Like
Reactions: 1 users

Diogenese

Top 20
Yes. Valeo have been making a fair bit of noise about their intelligent lighting too.

https://www.valeo.com/en/lighting/
...
With its embedded intelligence, lighting can now provide new insights. It can guide, alert and assist. It can increase driver alertness, for example by over-illuminating cyclists on the roadside. It can draw the curves of the road on the pavement. It can warn the vehicle behind of potential danger using rear lights. Valeo already owns all these operational technologies. For example, our PictureBeam Monolithic technology is the first high-definition LED artificial intelligence (AI) lighting solution to significantly improve road safety at night.
Valeo use a NN to determine areas of interest:

EP4456014A1 METHOD TO OPERATE AN ADAPTIVE HEADLIGHT DEVICE OF A VEHICLE 20230428

[0057] The artificial neural network 24 is preferably a convolutional neural network (CNN), in particular, a multi-branch CNN. Further sub-steps of step S4 are possible which are here indicated as steps S5 to S9. The artificial neural network 24 may comprise three separate input branches 16, 17, 18 for feature extraction. The provide outside information 10 is provided to a first input branch 16 of the three input branches 16, 17, 18 in step S5. The provided weather information 14 is provided to a second input branch 17 in step S6, and the lighting information 15 is provided to a third input branch 18 of the three input branches 16, 17, 18 in step S7. The second input branch 17 and/or the third input branch 18 are preferably fed with data determined by the further artificial neural networks 23. To determine the weather information 14 and/or the lighting information 15, the respective further artificial neural network 23 may be applied on the provided outside information 10 and/or on the further sensor information. The further sensor information may be provided by the rain sensor 5, the temperature sensor 6 and/or by the external device 9. There may be only one single artificial neural network 24 that determines the weather information 14, the lighting information 15 and the control command 20. Otherwise, the artificial neural network 24 and the two further artificial neural networks 23 differ from each other.

[0060] Applying the artificial neural network 24 may comprise determining at least one area of interest in the environment 3 of the vehicle 1. The area of interest may comprise at least partially an opposing vehicle 1, a traffic sign, an obstacle and/or a road infrastructure element. It is hence possible to define which area in the environment 3 should be more or less illuminated than at a current point in time.

Given the filing date is 2023, it is possible that they use Akida/TENNs software to determine the areas of interest.
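For anyone curious what such a multi-branch CNN looks like in practice, here is a rough sketch (my own toy code, not Valeo's - the branch sizes, feature dimensions and the per-segment dimming head are all invented for illustration): a camera branch plus small weather and lighting branches, fused into a single head that outputs one dimming level per segment of a matrix headlight.

```python
import torch
import torch.nn as nn

class MultiBranchHeadlightNet(nn.Module):
    """Illustrative three-branch network, loosely following EP4456014A1 [0057]:
    one branch per input (camera image, weather features, lighting features),
    fused to predict a per-segment beam/dimming command."""

    def __init__(self, weather_dim=8, lighting_dim=8, n_beam_segments=32):
        super().__init__()
        # Branch 1: convolutional feature extractor for the outside (camera) information
        self.camera_branch = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Branches 2 and 3: small MLPs for the weather and lighting information
        self.weather_branch = nn.Sequential(nn.Linear(weather_dim, 16), nn.ReLU())
        self.lighting_branch = nn.Sequential(nn.Linear(lighting_dim, 16), nn.ReLU())
        # Fusion head: one dimming level per segment of the matrix headlight
        self.head = nn.Sequential(
            nn.Linear(32 + 16 + 16, 64), nn.ReLU(),
            nn.Linear(64, n_beam_segments), nn.Sigmoid(),  # 0 = dimmed, 1 = full beam
        )

    def forward(self, image, weather, lighting):
        feats = torch.cat([
            self.camera_branch(image),
            self.weather_branch(weather),
            self.lighting_branch(lighting),
        ], dim=1)
        return self.head(feats)

# Dummy forward pass: one camera frame, one weather vector, one lighting vector
net = MultiBranchHeadlightNet()
beam = net(torch.randn(1, 3, 128, 256), torch.randn(1, 8), torch.randn(1, 8))
print(beam.shape)  # torch.Size([1, 32]) -> per-segment dimming command
```

The point is only the structure: three independent feature extractors whose outputs are concatenated before a single control head, which is the shape described in [0057].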
 
  • Like
  • Fire
  • Thinking
Reactions: 13 users

Diogenese

Top 20
Valeo use a NN to determine areas of interest:

EP4456014A1 METHOD TO OPERATE AN ADAPTIVE HEADLIGHT DEVICE OF A VEHICLE 20230428

[0057] The artificial neural network 24 is preferably a convolutional neural network (CNN), in particular, a multi-branch CNN. Further sub-steps of step S4 are possible which are here indicated as steps S5 to S9. The artificial neural network 24 may comprise three separate input branches 16, 17, 18 for feature extraction. The provide outside information 10 is provided to a first input branch 16 of the three input branches 16, 17, 18 in step S5. The provided weather information 14 is provided to a second input branch 17 in step S6, and the lighting information 15 is provided to a third input branch 18 of the three input branches 16, 17, 18 in step S7. The second input branch 17 and/or the third input branch 18 are preferably fed with data determined by the further artificial neural networks 23. To determine the weather information 14 and/or the lighting information 15, the respective further artificial neural network 23 may be applied on the provided outside information 10 and/or on the further sensor information. The further sensor information may be provided by the rain sensor 5, the temperature sensor 6 and/or by the external device 9. There may be only one single artificial neural network 24 that determines the weather information 14, the lighting information 15 and the control command 20. Otherwise, the artificial neural network 24 and the two further artificial neural networks 23 differ from each other.

[0060] Applying the artificial neural network 24 may comprise determining at least one area of interest in the environment 3 of the vehicle 1. The area of interest may comprise at least partially an opposing vehicle 1, a traffic sign, an obstacle and/or a road infrastructure element. It is hence possible to define which area in the environment 3 should be more or less illuminated than at a current point in time.

Given the filing date is 2023, it is possible that they use Akida/TENNs software to determine the areas of interest.
They even do gaze-tracking light control:


EP4456013A1 METHOD TO OPERATE AN ADAPTIVE HEADLIGHT DEVICE OF A VEHICLE 20230428

Gaze tracking

The invention relates to a method to operate a headlight device object (2) of a vehicle (1) comprising: providing an outside information (10) describing an environment (3) of the vehicle (1) and providing a gaze information (14) describing a gaze of a driver (9) of the vehicle (1). The method comprises determining (S3) a control command (19) for the headlight device (2) by applying a control command determining algorithm (22) on the provided outside information (10) and the provided gaze information (14), so that the determined control command (19) describes at least one setting (20, 21) for the headlight device (2) by which light emitted by the headlight device (2) is adjusted to the environment (3) and the gaze of the driver (9). It also comprises operating (S8) the headlight device (2) according to the determined control command (19).
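At claim level the gaze variant is simply: control command = f(outside information, gaze information). Here is a minimal toy version of that fusion step, assuming a "boost hazards outside the driver's gaze cone" policy - that policy is my own assumption; the claim only says the light is adjusted to the environment and to the gaze.

```python
def headlight_command(objects, gaze_azimuth_deg, gaze_cone_deg=20.0):
    """Toy control-command step in the spirit of the EP4456013A1 claim:
    combine outside information (detected objects with a bearing and a
    risk score) with gaze information (the driver's viewing direction),
    and boost illumination on hazards the driver is NOT currently looking at.
    The 'boost outside the gaze cone' policy is my guess, not the patent's."""
    command = {}
    for name, bearing_deg, risk in objects:
        in_gaze = abs(bearing_deg - gaze_azimuth_deg) < gaze_cone_deg / 2
        # Hazards outside the driver's gaze cone get extra illumination
        command[name] = min(1.0, risk * (1.0 if in_gaze else 1.5))
    return command

objects = [("cyclist", -25.0, 0.5), ("traffic_sign", 8.0, 0.25)]
print(headlight_command(objects, gaze_azimuth_deg=5.0))
# {'cyclist': 0.75, 'traffic_sign': 0.25} -> the unseen cyclist is over-illuminated
```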
 
  • Like
  • Love
  • Fire
Reactions: 12 users

Quiltman

Regular
[Four screenshots attached - original images not preserved]
 
  • Like
  • Fire
  • Love
Reactions: 34 users

Diogenese

Top 20
They even do gaze-tracking light control:


EP4456013A1 METHOD TO OPERATE AN ADAPTIVE HEADLIGHT DEVICE OF A VEHICLE 20230428

Gaze tracking

The invention relates to a method to operate a headlight device object (2) of a vehicle (1) comprising: providing an outside information (10) describing an environment (3) of the vehicle (1) and providing a gaze information (14) describing a gaze of a driver (9) of the vehicle (1). The method comprises determining (S3) a control command (19) for the headlight device (2) by applying a control command determining algorithm (22) on the provided outside information (10) and the provided gaze information (14), so that the determined control command (19) describes at least one setting (20, 21) for the headlight device (2) by which light emitted by the headlight device (2) is adjusted to the environment (3) and the gaze of the driver (9). It also comprises operating (S8) the headlight device (2) according to the determined control command (19).
This is an interesting, if diverting, rabbit hole:

Troxler fading:
https://en.wikipedia.org/wiki/Troxler's_fading

Troxler's fading, also called Troxler fading or the Troxler effect, is an optical illusion affecting visual perception. When one fixates on a particular point for even a short period of time, an unchanging stimulus away from the fixation point will fade away and disappear. Research suggests that at least some portion of the perceptual phenomena associated with Troxler's fading occurs in the brain.

It's a bit like DVS - only differentish?
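For anyone who hasn't met a DVS: an event camera only reports pixels whose brightness changes, so a perfectly static scene simply vanishes from its output, a bit like the unchanging stimulus fading under Troxler. A toy sketch of that change-only behaviour (the threshold and the log encoding are my own simplifications):

```python
import numpy as np

def dvs_events(prev_frame, frame, threshold=0.15):
    """Toy DVS-style event generation: emit an event only where the
    log-intensity change exceeds a threshold; unchanging regions produce
    nothing, a bit like the stimulus fading under the Troxler effect."""
    diff = np.log1p(frame.astype(float)) - np.log1p(prev_frame.astype(float))
    on_events = np.argwhere(diff > threshold)    # brightness increased
    off_events = np.argwhere(diff < -threshold)  # brightness decreased
    return on_events, off_events

prev = np.full((4, 4), 100, dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 180                      # only one pixel changes
on, off = dvs_events(prev, curr)
print(on)   # [[1 2]] -> only the changed pixel produces an event
print(off)  # []      -> the static background stays silent
```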
 
  • Like
  • Wow
  • Fire
Reactions: 8 users

Tothemoon24

Top 20
  • Like
  • Fire
  • Love
Reactions: 26 users

Dallas

Regular
  • Like
  • Love
  • Fire
Reactions: 9 users

Quiltman

Regular
And the referred-to blog:

[Ten screenshots of the blog attached - original images not preserved]
 
  • Like
  • Fire
  • Love
Reactions: 24 users

Quiltman

Regular
[Two further screenshots attached - original images not preserved]
 
  • Like
  • Fire
  • Love
Reactions: 12 users

Diogenese

Top 20
TATA have developed what they call a reservoir-based spiking NN:

US2023334300A1 METHODS AND SYSTEMS FOR TIME-SERIES CLASSIFICATION USING RESERVOIR-BASED SPIKING NEURAL NETWORK 20220418

[Patent figure attached]

The present disclosure relates to methods and systems for time-series classification using a reservoir-based spiking neural network, that can be used at edge computing applications. Conventional reservoir based SNN techniques addressed either by using non-bio-plausible backpropagation-based mechanisms, or by optimizing the network weight parameters. The present disclosure solves the technical problems of TSC, using a reservoir-based spiking neural network. According to the present disclosure, the time-series data is encoded first using a spiking encoder. Then the spiking reservoir is used to extract the spatio-temporal features for the time-series data. Lastly, the extracted spatio-temporal features of the time-series data is used to train a classifier to obtain the time-series classification model that is used to classify the time-series data in real-time, received from edge devices present at the edge computing network.


To my untutored eye, it does look as if it works on a different principle from Akida, but it may just be LSTM in drag.
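For anyone wanting a feel for the pipeline in the abstract (spike encoding, then a fixed spiking reservoir, then a trained readout), here is a bare-bones sketch of the general reservoir / liquid-state-machine idea. It is my own toy illustration of the technique, not TCS's implementation, and every constant in it is invented:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def delta_spike_encode(x, threshold=0.1):
    """Encode a 1-D time series as ON/OFF spike channels whenever the
    signal changes by more than `threshold` (a simple delta encoder)."""
    d = np.diff(x, prepend=x[0])
    return np.stack([(d > threshold).astype(float), (d < -threshold).astype(float)])

class SpikingReservoir:
    """Fixed random recurrent layer of leaky integrate-and-fire neurons.
    The mean firing rate of each neuron over the sequence serves as a
    spatio-temporal feature vector; nothing inside the reservoir is trained."""
    def __init__(self, n_in=2, n_res=100, leak=0.9, v_th=1.0):
        self.w_in = rng.normal(0, 0.8, (n_res, n_in))
        self.w_rec = rng.normal(0, 0.1, (n_res, n_res))
        self.leak, self.v_th = leak, v_th

    def features(self, spikes):             # spikes: (n_in, T)
        n_res, T = self.w_in.shape[0], spikes.shape[1]
        v, s, rates = np.zeros(n_res), np.zeros(n_res), np.zeros(n_res)
        for t in range(T):
            v = self.leak * v + self.w_in @ spikes[:, t] + self.w_rec @ s
            s = (v >= self.v_th).astype(float)
            v[s == 1] = 0.0                  # reset membrane after a spike
            rates += s
        return rates / T

# Toy two-class problem: noisy sine wave vs. noisy square wave
def make_sample(label, T=200):
    t = np.linspace(0, 4 * np.pi, T)
    base = np.sin(t) if label == 0 else np.sign(np.sin(t))
    return base + 0.1 * rng.normal(size=T)

reservoir = SpikingReservoir()
X = np.array([reservoir.features(delta_spike_encode(make_sample(y))) for y in (0, 1) * 50])
y = np.array([0, 1] * 50)
clf = LogisticRegression(max_iter=1000).fit(X, y)   # only the readout is trained
print("train accuracy:", clf.score(X, y))
```

If the abstract is read literally, nothing inside the reservoir is optimised; only the classifier on top is trained, which is one reason it may indeed work on a different principle from Akida.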

However, they do use it in some applications in conjunction with a more conventional SNN:

US2024176987A1 METHOD AND SYSTEM OF SPIKING NEURAL NETWORK-BASED ECG 20221125

[Patent figure attached]


This disclosure relates generally to method and system for spiking neural network based ECG classifier for wearable edge devices. Employing deep neural networks to extract the features from ECG signal have high computational intensity and large power consumption. The spiking neural network of the present disclosure obtains a training dataset comprising a plurality of ECG time-series data. The spiking neural network comprise a reservoir-based spiking neural network and a feed forward based spiking neural network. Each of the spiking neural network having a logistic regression-based ECG classifier are trained to classify one or more class labels. The peak-based spike encoder of each spiking neural network obtains a plurality of encoded spike trains from the plurality of ECG time-series. The peak-based spike encoder provides high performance for classifying one or more labels. Efficacy of the peak-based spike encoder for classification is experimentally evaluated with different datasets.

[0037] The SNN layer 304 obtains neuronal trace values of a plurality of feed forward neurons from the plurality of encoded spike trains. Further, a second set of spatio-temporal features are extracted based on the neuronal trace values of the plurality of feed forward neurons for each ECG time-series data from each feed-forward neuron.

The patents refer to "spatio-temporal features", which is a TENNs speciality, although the TATA patents are a bit early for TENNs - but who knows what goes on behind closed doors?
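To make the "peak-based spike encoder" and the "neuronal trace values" of paragraph [0037] concrete, here is a toy reading of those two ideas. This is my interpretation only, not TCS's code - the peak detector, the trace decay constant and the feature choice are all made up:

```python
import numpy as np

def peak_spike_encode(ecg, height=0.5):
    """Toy 'peak-based' encoder: emit a spike wherever the ECG signal has a
    local maximum above `height` (roughly, the R-peaks). Only my reading of
    the term in US2024176987A1, not TCS's actual encoder."""
    spikes = np.zeros_like(ecg)
    for t in range(1, len(ecg) - 1):
        if ecg[t] > height and ecg[t] >= ecg[t - 1] and ecg[t] > ecg[t + 1]:
            spikes[t] = 1.0
    return spikes

def neuronal_trace(spikes, decay=0.95):
    """Low-pass 'trace' of a spike train: jumps up at each spike and decays
    exponentially in between - one common way to read temporal features out
    of spiking neurons (cf. paragraph [0037])."""
    trace = np.zeros_like(spikes)
    x = 0.0
    for t, s in enumerate(spikes):
        x = decay * x + s
        trace[t] = x
    return trace

# Synthetic 'ECG': a flat baseline with three sharp peaks
ecg = np.zeros(300)
ecg[[50, 150, 250]] = 1.0
spikes = peak_spike_encode(ecg)
trace = neuronal_trace(spikes)
features = [spikes.sum(), trace.mean(), trace.max()]   # crude spatio-temporal features
print(features)   # e.g. [3.0, ...] -> would feed a logistic-regression classifier
```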
 
Last edited:
  • Like
  • Thinking
  • Sad
Reactions: 22 users

Boab

I wish I could paint like Vincent
View attachment 73151



From the article. Apparently we may have to wait another couple of years??

Other processors that are likely to come out within the next couple of years, such as Zeroth, Akida etc., cater to edge applications.
 
  • Like
  • Wow
  • Thinking
Reactions: 7 users

7für7

Top 20
From the article. Apparently we may have to wait another couple of years??

Other processors that are likely to come out within the next couple of years, such as Zeroth, Akida etc., cater to edge applications.
You know son… We’ve been waiting for so many years already… a year or two more won’t exactly break the record! Right?? Greetings from Thailand

 
  • Like
Reactions: 1 users

Frangipani

Top 20
View attachment 72108

Last week, a 6G-RIC (https://6g-ric.de) delegation exhibited - among other things - their proof-of-concept implementation of neuromorphic wireless cognition funded by Germany’s Federal Ministry of Education and Research (aka the Spot robot dog demo developed by Fraunhofer HHI researchers in Berlin) at the Brooklyn 6G Summit, hosted by Nokia and New York University.

Slawomir Stanczak, who is Professor for Network Information Theory at TU Berlin, Head of Fraunhofer HHI’s Wireless Communications and Networks Department as well as Coordinator of the 6G Research and Innovation Cluster (6G-RIC), also participated in a panel discussion with representatives from Nokia, NVIDIA, Rohde & Schwarz, MediaTek as well as InterDigital Communications on the topic of “Energy efficiency in AI/ML networks”:

View attachment 72109

View attachment 72110

The Fraunhofer HHI Wireless Communications & Networks Team will be presenting a remote live demo of their neuromorphic wireless cognition PoC (the one using a COTS Spot robot dog, developed as a 6G-Research and Innovation Cluster project and funded by the BMBF / German Federal Ministry of Education and Research) at the upcoming IEEE Global Communications Conference in Cape Town, South Africa (8 - 12 December 2024):

[Screenshot attached]
 
  • Like
  • Fire
  • Love
Reactions: 12 users

7für7

Top 20
Good knight!!

 
  • Like
  • Fire
Reactions: 9 users

IloveLamp

Top 20
🤔


[Two screenshots attached - original images not preserved]
 

Last edited:
  • Like
  • Fire
  • Love
Reactions: 12 users

manny100

Regular
From the article. Apparently we may have to wait another couple of years??

Other processors that are likely to come out within the next couple of years, such as Zeroth, Akida etc., cater to edge applications.
The author must not be aware that BRN is currently in talks on many AKIDA engagements and that there is plenty of current interest in TENNs and PICO.
 
  • Like
  • Thinking
  • Fire
Reactions: 17 users

Diogenese

Top 20
From the article. Apparently we may have to wait another couple of years??

Other processors that are likely to come out within the next couple of years, such as Zeroth, Akida etc., cater to edge applications.
Tomorrow is within the next couple of years, or, for the pessimists, 1 January 2025.
 
  • Like
  • Fire
  • Haha
Reactions: 14 users

Frangipani

Top 20
Any link to our Brainchip? Puzzling.....

View attachment 73173


 
  • Haha
  • Like
Reactions: 3 users

Frangipani

Top 20
View attachment 73151



From the article. Apparently we may have to wait another couple of years??

Other processors that are likely to come out within the next couple of years, such as Zeroth, Akida etc., cater to edge applications.
The author must not be aware that BRN is currently in talks on many AKIDA engagements and that there is plenty of current interest in TENNs and PICO.

The thing is - while the TCS LinkedIn post is from today, the white paper it links to, titled “Advancing edge computing capabilities with neuromorphic platforms”, has actually been online since at least 29 November 2022.
That puts the authors’ outlook somewhat into perspective…

I wish they’d always add the date of publication when posting such white papers.



[Screenshots of the white paper attached]
 
Last edited:
  • Like
  • Fire
Reactions: 16 users