BRN Discussion Ongoing

Bravo

If ARM was an arm, BRN would be its biceps💪!
Following @Fullmoonfever’s discovery yesterday of Fernando Sevilla Martinez’s (Data Science Specialist at Volkswagen) recent GitHub project - demonstrating Akida-powered neuromorphic processing for V2X and federated learning prototypes - there may be broader implications worth noting.

The article below, published a month ago, discusses how Verizon is launching its first commercial V2X platform and has signed on Volkswagen, the Arizona Commerce Authority, the Delaware Department of Transportation and Rutgers University as its first customers.

The article suggests that Volkswagen could incorporate V2X in next year's models.

In August 2024, the Biden administration announced a plan to deploy V2X nationwide by 2036, which "means that the highways of the near future could be dotted with 5.9 GHz radio transceivers that communicate with cellular radios inside your car and update vehicles as they're driving to prevent accidents."

Verizon has been developing V2X capabilities for some time. In 2022, Verizon and Cisco collaborated on a successful demonstration in Las Vegas, showing that cellular and mobile edge compute (MEC) technology can enable autonomous driving solutions without physical roadside units to extend radio signals for C-V2X communication.

It’s also worth noting that BrainChip and Cisco partnered in 2016, specifically around neuromorphic video-analytics demos via Cisco’s IoE Innovation Center. While not directly connected to current V2X initiatives, that collaboration could become relevant again if neuromorphic approaches gain traction.

Screenshot 2025-07-13 at 12.03.41 pm.png





 
  • Like
  • Fire
  • Love
Reactions: 32 users

TECH

Regular
Turning this ship around could be seen by some as a daunting task, but one must remain optimistic. Having been in contact with our CTO a few days ago, I can reassure many here that Tony is working hard with our excellent scientists, engineers and leadership team to secure us a win or two later in 2025 - I'm guessing at the back end of 2025 or early 2026 - though remember, nothing is guaranteed.

Tony's appointment back in 2023 was a win for all of us. I'm very happy with his approach and optimism towards delivering technical success to us all, and I get the feeling he is a realist: he knows nothing's guaranteed, because no matter how good the technology is, it's the client who makes the ultimate decision to commit.

That has been the challenge ever since TSMC delivered our first batch of AKD1000 NSoCs. The first run was an outstanding success and the technology has (finally) been accepted as brilliant, but the proof will be in the signing of IP deals over the next 1-3 years, in my view.

Aren't we getting a 20 million AUD cash injection at the end of this month by way of LDA Capital, or have I got my dates mixed up?

Regards.........Tech. (y)
 
  • Like
  • Love
  • Fire
Reactions: 10 users

Xerof

Flushed the Toilet
Legitimate Crypto Recovery Companies Hire iFORCE HACKER RECOVERY
If you’ve lost Bitcoin, I highly recommend iForce Hacker Recovery. I invested 100,000 USDT in crypto, trusting someone who claimed to manage my account only to discover it was all a scam. When I couldn’t withdraw or reach him, panic set in. Thankfully, I found iForce Hacker Recovery through positive reviews and reached out. Their team was professional, relentless, and honest. To my amazement, they recovered my stolen funds. I never thought it was possible. If you're in a similar situation, don’t hesitate, contact iForce Hacker Recovery.
Website: https://iforcehackers.com
Email: iforcehk@consultant.com
WhatsApp: +1 240-803-3706 View attachment 88450
Gee, what a terrible thing to happen, you have my deepest sympathy.

But you’re advertising again Yaseen, sooo,

@zeeb0t, I know you hate free advertising, in your own time mate
 
  • Haha
  • Like
Reactions: 5 users

7für7

Top 20
Legitimate Crypto Recovery Companies Hire iFORCE HACKER RECOVERY
If you’ve lost Bitcoin, I highly recommend iForce Hacker Recovery. I invested 100,000 USDT in crypto, trusting someone who claimed to manage my account only to discover it was all a scam. When I couldn’t withdraw or reach him, panic set in. Thankfully, I found iForce Hacker Recovery through positive reviews and reached out. Their team was professional, relentless, and honest. To my amazement, they recovered my stolen funds. I never thought it was possible. If you're in a similar situation, don’t hesitate, contact iForce Hacker Recovery.
Website: https://iforcehackers.com
Email: iforcehk@consultant.com
WhatsApp: +1 240-803-3706 View attachment 88450


I thought it was too good to be true…
…but then I met Kevin Satoshi-Günther – the man who knew about Dogecoin before Elon and once touched the blockchain oracle with his bare hands.

I was at rock bottom. My portfolio was made up of Shiba Inu, LUNA, and broken dreams. But Kevin looked me in the eyes and said:

“Your mindset is the real bear market.”
-That hit hard.


So I signed up for his Starter Package – originally $5,500, now only €499 – limited-time exclusive offer for everyone, forever.

It includes:
  • Access to his secret Telegram channel “CryptoChakraWealth”
  • A personalized energy scan of your wallet address
  • And a hand-signed JPEG of Kevin’s spiritual hardware wallet (Special Edition: Limitless Abundance)

Since then, my life has changed: I’m still broke – but now I have vision.

Comment “🧿Moonify me, Kevin!” and he’ll send you his exclusive whitepaper titled:

“How I Became an Internally Fulfilled Human in Just 3 Days with $12.”

Send me your bank account and PIN today… so I can check how to optimise my portfolio.

#Gratitude #PassiveKarma #CryptoHealing #FromNowOnOnlyGains
 
  • Haha
  • Like
Reactions: 4 users
Turning this ship around could be seen by some as a daunting task, but one must remain optimistic. Having been in contact with our CTO a few days ago, I can reassure many here that Tony is working hard with our excellent scientists, engineers and leadership team to secure us a win or two later in 2025 - I'm guessing at the back end of 2025 or early 2026 - though remember, nothing is guaranteed.

Tony's appointment back in 2023 was a win for all of us. I'm very happy with his approach and optimism towards delivering technical success to us all, and I get the feeling he is a realist: he knows nothing's guaranteed, because no matter how good the technology is, it's the client who makes the ultimate decision to commit.

That has been the challenge ever since TSMC delivered our first batch of AKD1000 NSoCs. The first run was an outstanding success and the technology has (finally) been accepted as brilliant, but the proof will be in the signing of IP deals over the next 1-3 years, in my view.

Aren't we getting a 20 million AUD cash injection at the end of this month by way of LDA Capital, or have I got my dates mixed up?

Regards.........Tech. (y)
Well, trying just isn't good enough, as we all just want results.

1752385070846.gif
 
  • Haha
  • Love
  • Like
Reactions: 9 users

manny100

Top 20
See below Tony's LinkedIn response to a query. Tony says more to come.
Tony said: "Well, I hope this is just step one in series of news from the CTO and engineering teams coming out over the summer. We have a lot of stuff that is in the pipeline and this is a small part of it."
His comment above was in response to a query on his post (see below and the post on LinkedIn):
"Modern RF systems—from drones to CubeSats—face tough constraints: size, weight, power, and cooling. Traditional AI models just don't cut it at the edge.
In our latest blog, we explore how BrainChip's TENN (Temporal Event-Based Neural Network) offers a breakthrough in modulation classification, combining real-time performance with ultra-low power operation—without sacrificing accuracy."
Tony, was this what the poem was about? Is this the big news?
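For anyone unfamiliar with the task the blog refers to: modulation classification takes a short window of raw I/Q (in-phase/quadrature) radio samples and predicts which modulation scheme (e.g. BPSK, QPSK, QAM) produced it. The sketch below is not BrainChip's TENN - just a minimal, generic temporal-convolution baseline in Keras to illustrate the shape of the problem; the class list, window length and layer sizes are my own assumptions, and no claim is made that this toy model is Akida-convertible.

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 4   # assumed label set, e.g. BPSK, QPSK, 8PSK, 16QAM
WINDOW = 1024     # assumed number of I/Q samples per classification window

# Generic 1D temporal-convolution classifier over raw I/Q samples
# (illustrative only - not the TENN architecture described in the blog)
model = tf.keras.Sequential([
    tf.keras.Input(shape=(WINDOW, 2)),                  # 2 channels: I and Q
    layers.Conv1D(16, 7, strides=2, activation="relu"),
    layers.Conv1D(32, 5, strides=2, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random data standing in for labelled I/Q recordings, just to show the shapes
x = np.random.randn(256, WINDOW, 2).astype("float32")
y = np.random.randint(0, NUM_CLASSES, size=256)
model.fit(x, y, epochs=1, batch_size=32)

Keras is used here simply because it is the front end BrainChip's cnn2snn conversion flow works with (see the DeGirum exchange further down in this thread), not because this particular toy model is claimed to run on Akida.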


View M Anthony Lewis’  graphic link

M Anthony Lewis • Following​

CTO@BrainChip | AI, Robotics, Disruptive Computing

2w

Philip Dodge Well, I hope this is just step one in series of news from the CTO and engineering teams coming out over the summer. We have a lot of stuff that is in the pipeline and this is a small part of it.


 
  • Like
  • Fire
Reactions: 14 users

Frangipani

Top 20
(…) Anyway, as you had already noticed in your first post on this DeGirum enquiry, Raúl Parada Medina (assuming it is the same person, which I have no doubt about) and Fernando Sevilla Martínez are both co-authors of a paper on autonomous driving:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-450543

View attachment 88429

In fact, they have co-published two papers on autonomous driving, together with another researcher: Jordi Casas-Roma. He is the director of the Master's in Data Science programme at the Barcelona-based private online university Universitat Oberta de Catalunya - the same department where Fernando Sevilla Martínez got his Master's degree in 2022, before moving to Wolfsburg the following year, where he now works as a data scientist at the headquarters of the Volkswagen Group.


View attachment 88430


View attachment 88426 View attachment 88427

By the way, Akida does get a mention in the above paper "Spiking neural networks for autonomous driving: A review", which was first submitted to Elsevier in May 2024 (around the same time that I first noticed Raúl Parada Medina liking BrainChip posts) and then resubmitted in a revised version in August 2024. It was then published online on 21 October 2024.

Apparently the three co-authors of first author Fernando Sevilla Martínez didn’t contribute much themselves to the paper; instead their role was a supervisory one.

“CRediT authorship contribution statement
Fernando S. Martínez: Writing – review & editing, Writing – original draft, Investigation.
Jordi Casas-Roma: Supervision.
Laia Subirats: Supervision.
Raúl Parada: Supervision”





7CC9A995-66DB-42D5-9332-5F17A7A58276.jpeg

(…)

509F9FC2-D957-4099-A7B1-A23684C7014B.jpeg

(…)
1C8E5F9B-E7DD-478C-B6A5-098D2DFE5023.jpeg




Note that Fernando Sevilla Martínez not only works at the Volkswagen Group headquarters in Wolfsburg/Germany, but is also affiliated with the e-Health Center at his alma mater Universitat Oberta de Catalunya (UOC) in Barcelona.

So IMO there is a fair chance that he or his colleagues there will also experiment with Akida in the field of digital health, e.g. in a hospital setting.

From the GitHub @Fullmoonfever had discovered:

“Acknowledgements​

This implementation is part of a broader effort to demonstrate low-cost, energy-efficient neuromorphic AI for distributed and networked edge environments, particularly leveraging the BrainChip Akida PCIe board and Raspberry Pi 5 hardware.”




EBBAF8E8-6A17-4694-AB80-0ECC6A94F260.jpeg





8622CC09-134F-4BE6-8814-7679F69F1320.jpeg
 

Attachments

  • 9278A873-F7E5-4CEB-90C9-28AA3E891FFF.jpeg
Last edited:
  • Like
  • Love
  • Fire
Reactions: 23 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Following @Fullmoonfever’s discovery yesterday of Fernando Sevilla Martinez’s (Data Science Specialist at Volkswagen) recent GitHub project - demonstrating Akida-powered neuromorphic processing for V2X and federated learning prototypes - there may be broader implications worth noting.

The article below, published a month ago, discusses how Verizon is launching its first commercial V2X platform and has signed on Volkswagen, the Arizona Commerce Authority, the Delaware Department of Transportation and Rutgers University as its first customers.

The article suggests that Volkswagen could incorporate V2X in next year's models.

In August 2024, the Biden administration announced a plan to deploy V2X nationwide by 2036, which "means that the highways of the near future could be dotted with 5.9 GHz radio transceivers that communicate with cellular radios inside your car and update vehicles as they're driving to prevent accidents."

Verizon has been developing V2X capabilities for some time. In 2022, Verizon and Cisco collaborated on a successful demonstration in Las Vegas, showing that cellular and mobile edge compute (MEC) technology can enable autonomous driving solutions without physical roadside units to extend radio signals for C-V2X communication.

It’s also worth noting that BrainChip and Cisco partnered in 2016, specifically around neuromorphic video-analytics demos via Cisco’s IoE Innovation Center. While not directly connected to current V2X initiatives, that collaboration could become relevant again if neuromorphic approaches gain traction.

View attachment 88451






In addition to the above, we also have the following snippet...


Screenshot 2025-07-13 at 8.59.12 pm.png





Screenshot 2025-07-13 at 8.51.05 pm.png


 
  • Like
  • Love
  • Fire
Reactions: 27 users

Frangipani

Top 20
Keep in mind that Raúl Parada Medina has a telecommunications background and works as a Senior Researcher for CTTC in Castelldefels near Barcelona, the Centre Tecnològic de Telecomunicacions de Catalunya.

So when he describes himself as an “IoT research specialist within the connected car project at a Spanish automobile manufacturer”, the emphasis is on “connected” rather than on “automobile”. Therefore, any upcoming research projects may not involve cars at all.

Just to give you an idea what diverse topics Raúl Parada Medina has been working on over the past two years:



297C8D0E-8F65-4939-9C4A-C80322C7511C.jpeg
DF53A3E8-95BC-44BA-AE6A-7EF3179F566C.jpeg
05288B26-E4D4-4906-A054-40637138DAAC.jpeg
1D508558-9DAF-4ADA-B5E7-C8723AB4B0FA.jpeg
665C56BE-BA6D-470E-A329-737D88399E2D.jpeg
B952E996-2E03-4DE2-8B85-4A7234BFA543.jpeg
B770DC6E-0909-4905-AC95-0D5ACC1A95E3.jpeg
F9800809-2CAB-4420-8E27-E8D3B724129E.jpeg
BB22D2C0-49CE-4DA1-9E68-FA03CD53C819.jpeg
743B736D-14C3-448C-912B-856D825A01E5.jpeg
 
  • Like
  • Love
  • Fire
Reactions: 31 users

Frangipani

Top 20
I believe this confirmation of an “active alliance” with BrainChip is new on the Tata Elxsi website?

Why Tata Elxsi?

(…)
  • Active alliances with Brainchip for neuromorphic computing and IISc Bangalore for edge cybersecurity.”




View attachment 88392

View attachment 88393 View attachment 88394 View attachment 88395
View attachment 88398 View attachment 88399

Despite our years of collaboration with TCS researchers and the above encouraging affirmation of an “active alliance” with Tata Elxsi, we should not ignore that TCS are also exploring other neuromorphic options for what will ultimately be their own clients.

And while a number of TCS patents do refer to Akida as an example of a neuromorphic processor that could be utilised, they always also refer to Loihi, as far as I'm aware.

A recent case in point for Tata Consultancy Research’s polyamory is the June 2025 paper “The Promise of Spiking Neural Networks for Ubiquitous Computing: A Survey and New Perspectives”, co-authored by five Singapore Management University (SMU) researchers as well as Sounak Dey and Arpan Pal from TCS, both very familiar names to regular readers of this forum.

Although we know those two TCS researchers to be fans of Akida, they sadly did not express a preference for BrainChip's neuromorphic processor over those from our competitors in the paper below, published less than six weeks ago.
On the contrary, in their concluding "key takeaway" recommendations on neuromorphic hardware ("We make the following recommendations for readers with different needs considering neuromorphic hardware chipsets"), the seven co-authors do not even mention Akida at all.

Even more surprisingly, the section on Akida is factually incorrect:
- AKD1500 is a first generation reference chip and is not based on Akida 2.0, BrainChip’s second generation platform that supports TENNs and vision transformers.
- An AKD2000 reference chip does not (yet) exist - it may or may not materialise. At present, only Akida 2.0 IP is commercially available - not an actual silicon chip, as claimed by the paper’s authors.
- The paper completely ignores the ultra-low-power Akida Pico, which operates on less than 1 mW, was revealed by our company back in October 2024 and is based on Akida 2.0.
It is highly unlikely that this (possibly revised) version of the paper, published on 1 June 2025, would have been submitted to arXiv prior to BrainChip's announcement of Akida Pico, and we can safely assume Sounak Dey and Arpan Pal would have been aware of that October 2024 BrainChip announcement (unlike, perhaps, their SMU co-authors).

One could argue that the reason Akida Pico is not mentioned might be that an actual Akida Pico chip is not yet commercially available, given the authors state

“5.2 Neuromorphic Hardware
In this subsection, we summarize the latest commercially available neuromorphic hardware chipsets, highlighting their capabilities and development support for building and deploying spiking neural networks.”,

which, however, raises the question of why Loihi 2 is listed, then, as it was always conceived as a research chip and is not commercially available either. In the paragraph on Loihi 2, the authors correctly state that "this neuromorphic research chipset is available only through the Intel Neuromorphic Research Community (INRC)."

Given that Sounak Dey and Arpan Pal co-authored this paper, the above inaccuracies are bewildering, to say the least. Did the two TCS researchers, who both have firsthand experience with Akida, contribute to only part of this paper and not proofread the final version before it was submitted?

Either way, not a good look…




65AF922B-BF11-4B8B-88CB-B56AAB53AE02.jpeg


(…)

18F94192-34EE-433B-A749-C1780323F6D3.jpeg
5CAB37BF-8FC9-4CE9-B46B-74D720DE5373.jpeg
9DCDFA61-5266-4C51-A826-AAE07C6E6073.jpeg

(…)

6779A981-B39B-4282-B09E-018FCB2097BA.jpeg
 
Last edited:
  • Like
  • Fire
  • Sad
Reactions: 25 users

MegaportX

Regular
  • Like
  • Fire
  • Love
Reactions: 11 users

7für7

Top 20
It’s been so quiet from the company side lately…

Almost eerie.

I’m starting to think they’re mad at us and just don’t want to talk anymore…

Fine. I’ll go first:


Dear BrainChip, we’re sorry.
We’re sorry for asking so much of you.
Sorry for being stressed out that the share price is still stuck in traffic.

But try to understand us, too:

We paid out of our own pockets. No insider bonuses. No free millions of shares. Just hope, patience… and a sprinkle of desperation.

So let’s just move on, shall we?

After all – philosophically speaking – we’re just artificial intelligence too.
(Emotional beta version, though.)

Sorry Cry Me A River GIF by Offline Granny!
 
  • Like
Reactions: 4 users

7für7

Top 20
I keep having doubts…

“What if you picked the wrong horse? What if it botched the start and collapses right before the finish line?”

But then I remind myself…
This isn’t a horse race…. It’s a rollercoaster experience…. A pricy one….
 
  • Thinking
Reactions: 1 users

Tezza

Regular
I keep having doubts…

“What if you picked the wrong horse? What if it botched the start and collapses right before the finish line?”

But then I remind myself…
This isn’t a horse race…. It’s a rollercoaster experience…. A pricy one….
I keep thinking of Beta or BlackBerry. I just hope we are VHS or iPhone.
 
Last edited:
  • Like
  • Sad
  • Haha
Reactions: 4 users

Gazzafish

Regular
Probably silence because they are busy negotiating exclusive rights of Akida 2.0 with Apple for a fixed 3 year period.. 🤪 disclaimer: this is me dreaming and completely made up…..but I wonder
 
  • Like
  • Haha
  • Love
Reactions: 6 users

Rach2512

Regular


Sorry if already posted.

Screenshot_20250714_135922_Samsung Internet.jpg
Screenshot_20250714_135943_Samsung Internet.jpg
Screenshot_20250714_135953_Samsung Internet.jpg
Screenshot_20250714_140001_Samsung Internet.jpg
Screenshot_20250714_140015_Samsung Internet.jpg
Screenshot_20250714_140023_Samsung Internet.jpg
 
  • Like
  • Fire
  • Love
Reactions: 24 users
I keep having doubts…

“What if you picked the wrong horse? What if it botched the start and collapses right before the finish line?”

But then I remind myself…
This isn’t a horse race…. It’s a rollercoaster experience…. A pricy one….
Most expensive one in town
Let me on
 

Rach2512

Regular

Could we be on it?

Screenshot_20250714_155546_Samsung Internet.jpg

Screenshot_20250714_155558_Samsung Internet.jpg
 
  • Like
  • Thinking
  • Love
Reactions: 6 users
Interesting channel for BRN, hopefully they are speaking with them soon 🤔


 
  • Like
  • Love
  • Thinking
Reactions: 6 users

Frangipani

Top 20

4.2 Use Case: If the Akida accelerator is deployed in an autonomous driving system, V2X communication allows other vehicles or infrastructure to receive AI alerts based on neuromorphic-based vision​

This use case simulates a lightweight V2X (Vehicle-to-Everything) communication system using Python. It demonstrates how neuromorphic AI event results, such as pedestrian detection, can be broadcast over a network and received by nearby infrastructure or vehicles.

(…) Given that Raúl Parada Medina describes himself as an “IoT research specialist within the connected car project at a Spanish automobile manufacturer”, I had already suggested a connection to the Volkswagen Group via SEAT or CUPRA at the time.

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-424590

View attachment 88422
View attachment 88424





View attachment 88425

Extremely likely the same Raúl Parada Medina whom you recently spotted asking for help with Akida in the DeGirum Community - very disappointingly, no one from our company appears to have been willing to help solve this problem for more than 3 months!

Why promote DeGirum for developers wanting to work with Akida and then not give assistance when needed? Not a good look, if we are to believe shashi from the DeGirum team, who wrote on February 12 he would forward Parada’s request to the BrainChip team, but apparently never got a reply.

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-461608

View attachment 88428

The issue persisted until it was eventually solved on 27 May by another DeGirum team member, stephan-degirum (presumably Stephan Sokolov, who recently demonstrated running the DeGirum PySDK directly on BrainChip hardware at the 2025 Embedded Vision Summit - see the video here: https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-469037)





raul.parada.medina
May 27

Hi @alex and @shashi for your reply, it looks there is no update from Brianchip in this sense. Please, could you tell me how to upload this model in the platform? Age estimation (regression) example — Akida Examples documentation. Thanks!

1 Reply



shashiDeGirum Team
May 27

@stephan-degirum
Can you please help @raul.parada.medina ?




stephan-degirum
raul.parada.medina
May 27

Hello @raul.parada.medina , conversion of a model from BrainChip’s model zoo into our format is straightforward:
Once you have an Akida model object, like Step 4 in the example:
model_akida = convert(model_quantized_keras)

You’ll need to map the model to your device and then convert it to a compatible binary:


from akida import devices

# Map model onto your Akida device
dev = devices()[0]
try:
    model_akida.map(dev, hw_only=True)
except RuntimeError:
    model_akida.map(dev, hw_only=False)

# Extract the C++-compatible program blob
blob = model_akida.sequences[0].program
with open("model_cxx.fbz", "wb") as f:
    f.write(blob)

print("C++-compatible model written to model_cxx.fbz")

Note: You want to be sure that the model is supported on your Akida device. There are many models on the BrainChip model zoo that are not compatible with their “version 1 IP” devices.
If your device is a v1 device, you’ll need to add a set_akida_version guard:

from cnn2snn import convert, set_akida_version, AkidaVersion

# Convert the model
with set_akida_version(AkidaVersion.v1):
    model_akida = convert(model_quantized_keras)
model_akida.summary()

from akida import devices
# Map model onto your Akida device
# ... (see above)

For more information on v1/v2 model compatibility, please see their docs: Akida models zoo — Akida Examples documentation

Once you have a model binary blob created:

  • Create a model JSON file adjacent to the blob by following Model JSON Structure | DeGirum Docs or by looking at existing BrainChip models on our AI Hub for reference: https://hub.degirum.com/degirum/brainchip
    - ModelPath is your binary model file
    - RuntimeAgent is AKIDA
    - DeviceType is the middle output of akida devices, in all caps. For example, if akida devices shows PCIe/NSoC_v2/0, you put NSOC_V2
  • Your JSON + binary model blob are now compatible with PySDK. Try running the inference on your device locally by specifying the full path to the JSON as the zoo_url, see: PySDK Package | DeGirum Docs
    "For local AI hardware inferences you specify the zoo_url parameter as either a path to a local model zoo directory, or a path to the model's .json configuration file."
  • You can then zip them up and upload them to your model zoo in our AI Hub.
Let me know if this helped.
P.S. we currently have v1 hardware in our cloud farm, and this model is the face estimation model for NSoC_v2:
https://hub.degirum.com/degirum/brainchip/vgg_regress_age_utkface--32x32_quant_akida_NSoC_1
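To make the quoted steps a little more concrete, here is a minimal sketch of the packaging and local-inference stage. Only the three JSON fields named above are included (a complete DeGirum model JSON has further required fields), the file names, model name and test image are placeholders, and the PySDK calls simply follow the pattern described in the linked PySDK docs - so treat this as an illustrative assumption rather than a verified recipe.

import json
import degirum as dg  # DeGirum PySDK

# Minimal model JSON next to the blob - only the fields named in the post above;
# a complete DeGirum model JSON contains additional required fields.
model_json = {
    "ModelPath": "model_cxx.fbz",   # the binary blob written earlier
    "RuntimeAgent": "AKIDA",
    "DeviceType": "NSOC_V2",        # middle part of the `akida devices` output, in caps
}
with open("model_cxx.json", "w") as f:
    json.dump(model_json, f, indent=2)

# Local inference: point zoo_url at the model's .json file, as per the PySDK docs
zoo = dg.connect(dg.LOCAL, zoo_url="model_cxx.json")
model = zoo.load_model("model_cxx")     # assumed: model name = JSON file stem
result = model("test_image.jpg")        # placeholder input image
print(result.results)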


Anyway, as you had already noticed in your first post on this DeGirum enquiry, Raúl Parada Medina (assuming it is the same person, which I have no doubt about) and Fernando Sevilla Martínez are both co-authors of a paper on autonomous driving:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-450543

View attachment 88429

In fact, they have co-published two papers on autonomous driving, together with another researcher: Jordi Casas-Roma. He is the director of the Master's in Data Science programme at the Barcelona-based private online university Universitat Oberta de Catalunya - the same department where Fernando Sevilla Martínez got his Master's degree in 2022, before moving to Wolfsburg the following year, where he now works as a data scientist at the headquarters of the Volkswagen Group.


View attachment 88430


View attachment 88426 View attachment 88427

After digging a little further, it looks more and more likely to me that CUPRA is the Spanish automobile manufacturer whose connected car project Raúl Parada Medina is currently involved in (cf. his LinkedIn profile).

This in turn greatly heightens the probability that he and Fernando Sevilla Martínez (who works for the Volkswagen Group as a data scientist in the Volkswagen logistics data lake) have been collaborating once again, this time jointly experimenting with "networked neuromorphic AI for distributed intelligence" with the help of an Akida PCIe board paired with a Raspberry Pi 5. (https://github.com/SevillaFe/SNN_Akida_RPI5)

While the GitHub repository SNN_Akida_RPI5 is described very generally as “Eco-Efficient Deployment of Spiking Neural Networks on Low-Cost Edge Hardware”, and one of the use cases (involving MQTT - Message Queuing Telemetry Transport) “supports event-based edge AI and real-time feedback in smart environments, such as surveillance, mobility, and robotics” - and hence a very broad range of applications - one focus is evidently on V2X (= Vehicle-to-Everything) communication systems: Use case 4.2 in the GitHub repository demonstrates how “neuromorphic AI event results, such as pedestrian detection, can be broadcast over a network and received by nearby infrastructure or vehicles.”
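As a side note, for anyone wondering what "broadcasting neuromorphic AI event results over a network" amounts to in practice, below is a minimal, generic MQTT publishing sketch using the paho-mqtt client. The broker address, topic and message fields are purely my own illustrative assumptions - they are not taken from the SNN_Akida_RPI5 repository.

import json
import time
import paho.mqtt.client as mqtt  # pip install paho-mqtt

BROKER = "192.168.1.10"   # placeholder: MQTT broker reachable by nearby vehicles/infrastructure
TOPIC = "v2x/alerts"      # placeholder topic name

# Illustrative event payload for a pedestrian detection produced by an
# Akida-accelerated vision model on the Raspberry Pi 5; field names are assumed
event = {
    "event": "pedestrian_detected",
    "confidence": 0.93,
    "timestamp": time.time(),
    "source": "akida-rpi5-node-01",
}

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt >= 2.0; drop the argument on 1.x
client.connect(BROKER, 1883)
client.loop_start()                                     # run the network loop in the background
info = client.publish(TOPIC, json.dumps(event), qos=1)
info.wait_for_publish()                                 # block until the broker acknowledges
client.loop_stop()
client.disconnect()

Subscribers on other nodes (for example a roadside unit or another vehicle) would simply subscribe to the same topic and parse the JSON payload.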

F7B8CD08-3233-4F94-B3D2-E228670574D6.jpeg



CUPRA is nowadays a standalone car brand owned by SEAT, a Spanish (Catalonian to be precise) automobile manufacturer headquartered in Martorell near Barcelona. In fact, CUPRA originated from SEAT’s motorsport division Cupra Racing. Both car brands are part of the Volkswagen Group.

CUPRA’s first EV, the CUPRA Born, introduced in 2021 and named after a Barcelona neighbourhood, is already equipped with Car2X technology as standard (see video below). Two more CUPRA models with Car2X functions have since been released: CUPRA Tavascan and CUPRA Terramar.

Broadly speaking, Car2X/V2X (the terms are often, but not always, used interchangeably - see the Telekom article below) stands for technologies that enable vehicles to communicate with one another and their environment in real time. They help to prevent accidents by warning nearby vehicles of hazards ahead without visibility, as V2X can "see" around corners and through obstacles in a radius of several hundred meters, connecting all users who have activated V2X (provided, obviously, that their vehicles support it) to a safety network.

“B. V2X technology

I. Principles

Your vehicle is equipped with V2X technology. If you activate the V2X technology, your vehicle can exchange important road traffic information, for example about accidents or traffic jams, with other road users or traffic infrastructure if they also support V2X technology. This makes your participation in road traffic even safer. When you log into the vehicle for the first time, you must check whether the V2X setting is right for you and you can deactivate V2X manually as needed.

Communication takes place directly between your vehicle and other road users or the traffic infrastructure within a close range of approximately 200 m to 800 m. This range can vary depending on the environment, such as in tunnels or in the city.

(…)


III. V2X functionalities

V2X can assist you in the following situations:

1. Warning of local hazards

The V2X function scans the range described above around your vehicle in order to inform you of relevant local hazards. To do this, driving information from other V2X users is received and analysed. For example, if a vehicle travelling in front initiates emergency braking and sends this information via V2X, your vehicle can display a warning message. Please note that your vehicle does not perform automatic driving interventions due to such warnings. In other words, it does not automatically initiate emergency braking, for example.

2. Supplement to adaptive cruise control

The V2X technology can supplement your vehicle's predictive sensor system (e.g. radar and camera systems) and detect traffic situations even more quickly to give you more time to react to them. With more precise information about a traffic situation, adaptive cruise control, for example, can respond to a tail end of a traffic jam in conjunction with the cruise control system and automatically adjust the speed. Other functions, such as manual lane change assistance, are also improved.

3. Other functionalities

Further V2X functions may be developed in future. We will inform you separately about data processing in connection with new V2X functions.

IV. Data exchange

If you activate the V2X technology, it continuously sends general traffic information to other V2X users (e.g. other vehicles, infrastructure) and allows them to evaluate the current traffic situation. The following data is transmitted for this: information about the V2X transmitter (temporary ID, type), vehicle information (vehicle dimensions), driving information (acceleration, geographical position, direction of movement, speed), information from vehicle sensors (yaw rate, cornering, light status, pedal status and steering angle) and route (waypoints, i.e. positioning data, of the last 200 m to 500 m driven).

The activated V2X technology also transmits additional data to other V2X users when certain events occur. In particular, these events include a vehicle stopping, breakdowns, accidents, interventions by an active safety system and the tail end of traffic jams. The data is only transmitted when these events occur. The following data is additionally transmitted: event information (type of event, time of event and time of message, geographical position, event area, direction of movement) and route (waypoints, i.e. positioning data, of the last 600 m to 1,000 m driven).

The data sent to other V2X users is pseudonymised. This means that you are not displayed as the sender of the information to other V2X users.

Volkswagen AG does not have access to this data and does not store it.”




Here is a February 2024 overview of Car2X hazard warning dashboard symbols in Volkswagen Group automobiles, which also shows 8 models that were already equipped with this innovative technology early last year: 7 VW models as well as the CUPRA Born. Meanwhile the number has increased to at least 13 - see the screenshot of a Motoreport video uploaded earlier this month.


2D9278F8-0660-499F-9306-FDDCA70707C1.jpeg



And here is an informative article on Car2X/V2X by a well-known German telecommunications provider that - like numerous competitors in the field - has a vested interest in the expansion of this technology, especially relating to the "development towards nationwide 5G Car2X communications".


255ED097-709F-4C17-B90B-2D4EE50BBBA2.jpeg
2D633CA0-F4FD-455A-B7D9-878110B11994.jpeg
E03843A1-AF78-413B-BBFD-D618DBE46694.jpeg
A5D2E96C-C609-480C-A56D-4A2850A13D21.jpeg
AA6C28C7-BE7B-43BB-998F-1D5172ABD30F.jpeg
76798919-57CB-40B2-A8F3-1FB4685312C6.jpeg
B84FD131-41AB-4E2F-8663-CF28FAA9E522.jpeg
 

Attachments

  • 1BF7BB9F-41DF-4A65-8746-FB48998E9FA5.jpeg
  • Like
  • Fire
  • Love
Reactions: 31 users
Top Bottom