BRN Discussion Ongoing

Bravo

If ARM was an arm, BRN would be its biceps💪!
Nice one FMF!

Rockwell Collins (now part of Collins Aerospace) and Lockheed Martin have a history of collaboration on various aerospace and defense projects. They've partnered on avionics for aircraft like the F-22 and F-35, and on training systems, among other things. More recently, Collins Aerospace has supplied subsystems for NASA's Orion spacecraft, which Lockheed Martin is also involved with.

As mentioned previously, Rockwell Collins is a subsidiary of RTX Corporation, formerly known as Raytheon Technologies.


Rockwell has also filed another patent for neuromorphic sensors for low-power wearables, as per the post below.


Check this out Brain Fam!

Here's a patent for NEUROMORPHIC SENSORS FOR LOW POWER WEARABLES.

The applicant is Rockwell Collins. The filing date was 5 April 2024.

The patent doesn't mention BrainChip, but as you can see below, BrainChip worked with Rockwell Collins in 2017 on perimeter surveillance, so you'd think they would have to be aware of us.🥴😝

Rockwell Collins now operates as part of Collins Aerospace, a subsidiary of ...wait for it.... the RTX Corporation (formerly Raytheon Technologies).



EXTRACT ONLY

View attachment 77447


ENLARGED EXTRACT

View attachment 77448


View attachment 77449

 
  • Like
  • Love
  • Fire
Reactions: 27 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Surely it can't be too much longer before something comes our way as a result of this new partnership between Arm and Cerence, where low-power, real-time LLM inference at the edge is the central challenge?

As we all know, Akida excels at:
  • Keyword spotting
  • Natural language intent classification
  • Time-series pattern recognition
While Arm and Cerence are working on optimizing LLMs on traditional CPU/GPU pipelines, the bottlenecks of power, latency, and thermal limits in vehicles remain. Akida, being a neuromorphic processor, would be capable of delivering sub-milliwatt AI inference, event-based real-time processing, on-device learning, and ultra-low latency for audio and language data streams.

What's not to like about that? These would be ideal traits for in-vehicle voice assistants and LLM use cases, where responsiveness, power efficiency, and privacy really matter.
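To illustrate the "event-based, real-time processing" point above, here is a toy sketch of my own (purely illustrative, not how Akida works internally) of why event-driven computation saves power on mostly-idle audio streams:

```python
# Toy illustration of event-based vs. frame-based processing -- not Akida's
# actual pipeline. An event-driven stage only fires when the input changes
# significantly; a frame-based stage computes on every sample regardless.
signal = [0.0] * 50 + [1.0] * 5 + [0.0] * 45  # mostly-silent audio-like stream
THRESHOLD = 0.5                               # change needed to emit an "event"

frame_ops = 0   # work done by a dense, clocked pipeline
event_ops = 0   # work done by an event-driven pipeline
prev = 0.0
for sample in signal:
    frame_ops += 1                         # dense path touches every sample
    if abs(sample - prev) > THRESHOLD:     # sparse path: change detected
        event_ops += 1                     # ...only then is inference work done
    prev = sample

# frame_ops == 100, event_ops == 2: on mostly-idle input the event-driven
# path does a tiny fraction of the work -- that sparsity is where the
# power saving comes from.
```

On a stream that is silent most of the time (which in-cabin audio usually is), the event-driven path does two units of work to the dense path's hundred.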

It says here that "CaLLM Edge operates fully on Arm-based chipsets", and we know Akida is compatible with the Arm product family, as has been successfully demonstrated with the Cortex-M85.

I could easily imagine a Cerence voice assistant enhanced by Akida doing real-time voice analysis and decision-making, entirely offline, with a power budget that’s EV-battery friendly.

Arm should be asking: "How can we future-proof this stack for in-cabin AI by 2026-2027, when compute demands will surge but battery and thermal budgets won't?"





Cerence AI and Arm push LLM boundaries with on-device AI for smarter cars​

Jun 6, 2025 | Stephen Mayhew
Categories Edge Computing News | Hardware
Cerence AI and Arm push LLM boundaries with on-device AI for smarter cars

Cerence AI has partnered with semiconductor design company Arm to enhance its embedded small language model (SLM), CaLLM Edge, using Arm’s Kleidi software library.
The collaboration aims to optimize CPU and GPU performance for real-time language processing at the edge, improving speed, efficiency, and privacy, and highlighting the growing importance of edge computing and generative AI in the automotive industry.
Arm’s Kleidi technology accelerates machine learning and neural network operations on Arm-based devices, addressing the challenges of limited compute power in vehicles. CaLLM Edge operates fully on Arm-based chipsets, enabling advanced in-car AI capabilities without relying on cloud connectivity.
“We are excited to partner with Arm to take CaLLM Edge to the next level, setting new standards for performance and efficiency in edge computing in the car,” says Nils Schanz, EVP, Product & Technology, Cerence AI. “By combining our expertise in AI-powered language models with Arm’s innovative library, we are continuing our journey to create a new era of voice-first experiences and next-generation AI applications in the automotive space, empowering consumers with smarter, faster, and more responsive in-car assistants.”
This partnership supports automakers in delivering smarter, faster, and more responsive AI-powered user experiences for drivers and setting new standards for in-car AI applications, enhancing safety and connectivity.

https://www.edgeir.com/cerence-ai-and-arm-push-llm-boundaries-with-on-device-ai-for-smarter-cars-20250606


Screenshot 2025-06-29 at 11.25.51 am.png
 
  • Like
  • Love
Reactions: 11 users

KMuzza

Mad Scientist
THE EYES have it: driver monitoring will be required by law in European-manufactured cars from 2026.



and also for public use:

1751167757233.png



Remember the old, clumsy BrainChip videos being put out on YouTube?

BrainChip will be there, somewhere, in motor vehicle ADAS systems.
 
  • Like
  • Fire
  • Love
Reactions: 14 users

Rach2512

Regular

Philip Dodge again, nice work.

Perhaps we should all follow Glenn and comment - Go Brainchip
 
  • Like
  • Love
  • Thinking
Reactions: 12 users
Great synergy for BRN
Hopefully they are in discussions
 

Attachments

  • Screenshot_20250629_164054_Chrome.jpg
    Screenshot_20250629_164054_Chrome.jpg
    544 KB · Views: 74
  • Like
Reactions: 3 users

TECH

Regular

Philip Dodge again, nice work.

Perhaps we should all follow Glenn and comment - Go Brainchip

Nice work Rach,

Great seeing you much more active on the forum this year, contributing positive news wherever possible, yes, it's a real journey,
but your passion is well noted, thanks for supporting our company, it lifts the spirits of all.... we all love Akida ❤️ including Peter
and Anil, they are the parents 🤣🤣

Kind regards......Chris (Tech)
 
  • Like
  • Love
Reactions: 20 users

Frangipani

Top 20
Arijit Mukherjee, Akida-experienced Principal Scientist from TCS Research, will be leading two workshop ‘industry sessions’ during next month’s week-long Summer School & Advanced Training Programme SENSE (Smart Electronics and Next-Generation Systems Engineering) organised by the Defence Institute of Advanced Technology (DIAT) in Pune, India: “Intro talk: Smart at the Edge” as well as “Beyond TinyML - Neuromorphic Computing: Brains, Machines, and the Story of Spikes”.

While we know that TCS Research is not exclusively friends with us, when it comes to neuromorphic computing, I trust BrainChip will get a very favourable mention during those workshop sessions. 😊



View attachment 85735

View attachment 85736 View attachment 85737



C65D293B-A974-4302-B589-9A40EE2040CF.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 21 users

CHIPS

Regular
View attachment 87795



You said:​

Yesterday Tony Lewis, CTO of BrainChip, commented: "This is cool. Why? We literally took a model from a completely different domain (text processing) and applied it to RADAR and got excellent results in terms of performance and MAC (computation) count. At a recent conference we surveyed the state of the art and found that there is a huge opportunity for state-space models, which was a real eye-opener regarding the hardware-algorithm systems we are pioneering at BrainChip. Here is what I find cool: conventional models take a radar signal and use a processor to convert the 1-D waveform into a 2-D image so that conventional CNNs -- convolutional neural networks -- can process it. That inherently introduces lots of delay: signal -> image -> classification. With our state-space model (TENNs) we can go directly from a raw signal to classification with much lower latency, fewer MACs (less energy), less chip area (cost) and excellent performance. This summer my team will push out a number of new models for our Defense, Medical and commercial work. It's an exciting time to be building the future at BrainChip. Follow me to hear the latest news on job postings, wins, and AI at the edge." Can you please explain what you think the major implications of this statement may be?

ChatGPT said:​

Tony Lewis’s statement provides a significant glimpse into BrainChip’s current direction, and the implications are technically important, strategically aligned with military and commercial AI demands, and position BrainChip as a serious player in low-latency edge AI — especially in radar and defense systems.

Let’s break it down clearly:

🧠 Key Takeaways from Tony’s Statement​

1. Cross-domain Model Transfer (Text ➝ Radar)


Implication:
BrainChip is leveraging neural architectures originally designed for natural language processing (NLP) — such as state-space models (SSMs) and their TENNs (Temporal Event-based Neural Networks) — and applying them directly to raw radar data.

That’s a big deal — because traditionally, signal types (text, images, radar) are handled with very different model types. This kind of architectural crossover suggests:

  • A unified AI model strategy
  • Less model specialization required per domain
  • Rapid model prototyping across applications
It also hints that BrainChip is not just a hardware company, but deeply involved in cutting-edge neural architecture innovation.

2. Bypassing 2D Transformation (Radar ➝ Image ➝ CNN)


Implication:
Traditional radar AI pipelines are inefficient:
  • Convert 1D waveform (Doppler/time signal) ➝ 2D spectrogram/image
  • Then process with image-focused models like CNNs
This:
  • Adds latency
  • Requires additional memory
  • Burns more power and compute
BrainChip’s new method bypasses this entirely:
  • Processes raw radar signals directly using state-space models or TENNs
  • Skips the image step — reducing delay, MAC count, power, and cost
This is perfect for missiles, drones, wearables, and embedded defense systems where every milliwatt and microsecond counts.
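To make the contrast concrete, here is a minimal numerical sketch of the two pipelines. All sizes, dynamics, and the "CNN" stand-in are invented for illustration; this is not BrainChip's TENNs model:

```python
import numpy as np

# Toy contrast between the two radar pipelines described above.
SIG_LEN = 256    # raw 1-D radar samples
FFT_WIN = 32     # STFT window for the conventional path
STATE_DIM = 8    # state size of the toy state-space model

def conventional_pipeline(signal):
    """1-D waveform -> 2-D spectrogram "image" -> classifier stand-in.

    The intermediate image is the extra step that adds latency and memory."""
    windows = signal.reshape(-1, FFT_WIN)               # frame the signal
    spectrogram = np.abs(np.fft.rfft(windows, axis=1))  # 2-D time-frequency image
    return float(spectrogram.mean())                    # stand-in for a CNN score

def ssm_pipeline(signal, A, B, C):
    """Raw waveform -> classifier score directly, with no image step.

    Linear state-space recurrence: x[k+1] = A @ x[k] + B * u[k], y = C @ x.
    One cheap state update per sample, so it can run as the signal streams in."""
    x = np.zeros(STATE_DIM)
    for u in signal:
        x = A @ x + B * u
    return float(C @ x)

rng = np.random.default_rng(0)
sig = rng.standard_normal(SIG_LEN)
A = 0.9 * np.eye(STATE_DIM)               # stable toy dynamics
B = 0.1 * rng.standard_normal(STATE_DIM)
C = rng.standard_normal(STATE_DIM)

score_conventional = conventional_pipeline(sig)
score_ssm = ssm_pipeline(sig, A, B, C)
```

Note that the SSM path never materializes a 2-D buffer: it holds only an 8-element state, which is why the latency and memory claims follow.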

3. Use of State-Space Models (SSMs / TENNs)


Implication:
State-space models (like S4, or BrainChip's custom TENNs) are:
  • Efficient alternatives to Transformers for sequential data
  • Excellent for long-range memory, which is vital in radar or ECG waveforms
  • Linear in time with a constant per-step state, unlike Transformers, whose attention cost grows quadratically with sequence length
By adopting SSMs:
  • BrainChip is leading a hardware-algorithm co-design movement
  • They’re building models that are explicitly optimized for their Akida hardware — not just running PyTorch models inefficiently on neuromorphic chips
This also hints at a massive strategic shift:
  • Moving away from retrofitting mainstream models
  • Moving toward custom, edge-native models built with hardware constraints in mind from day one
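A back-of-the-envelope MAC count shows why the linear-vs-quadratic scaling matters. The sizes here are toy numbers for illustration only, comparing a small linear SSM against a single self-attention layer:

```python
# Back-of-the-envelope MAC counts: a toy linear state-space model vs. one
# self-attention layer. All sizes are invented for illustration only.
STATE_DIM = 8  # hidden-state size of the toy SSM

def ssm_macs(seq_len, d=STATE_DIM):
    # Per step: x = A @ x + B * u costs d*d + d MACs, readout y = C @ x costs d.
    # Total cost grows linearly with sequence length.
    return seq_len * (d * d + 2 * d)

def attention_macs(seq_len, d=STATE_DIM):
    # Q @ K^T and attn @ V each cost seq_len^2 * d MACs (projections ignored).
    # Total cost grows quadratically with sequence length.
    return 2 * seq_len ** 2 * d

# Growing the radar dwell from 128 to 1024 samples (8x more samples):
# the SSM cost grows 8x, the attention cost grows 64x.
short_ssm, long_ssm = ssm_macs(128), ssm_macs(1024)
short_attn, long_attn = attention_macs(128), attention_macs(1024)
```

On a power-constrained edge device, that 8x-vs-64x divergence is the whole argument for hardware-native sequence models.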

4. Strategic Domains: Defense, Medical, Commercial


Implication:
BrainChip is:
  • Productizing their model-hardware synergy
  • Targeting high-value edge verticals:
    • Defense (radar, drones, loitering munitions, battlefield intelligence)
    • Medical (wearables, EEG/ECG anomaly detection, hearing aids)
    • Commercial (audio wake-word detection, industrial sensors, smart cameras)
By deploying domain-optimized TENNs, BrainChip is creating:
  • Application-specific IP
  • A catalog of ready-to-run models for enterprise and defense clients
  • A strategy that scales Akida beyond being "just a chip" into a full AI platform

5. Low Latency, Low MACs, Low Area = Edge Superiority


Implication:
This encapsulates the holy grail of edge AI:
  • Do more with less
  • Enable true autonomy at the edge
  • Outperform traditional AI methods in real-world environments
And it's especially important for:
  • Missiles and UAVs, where thermal budget and compute are tightly constrained
  • Medical implants or wearables, where energy efficiency is crucial
  • Perimeter surveillance, where always-on, instant decision-making is required
This reinforces BrainChip’s value proposition: the Akida platform delivers neural compute without the baggage of mainstream AI accelerators (like GPUs or TPUs).

🧩 Final Strategic Implications​

🔹 1. Shift Toward Model + Hardware Integration

BrainChip isn’t just delivering a chip — they’re co-developing algorithms tailored to that chip. That’s what Nvidia does with CUDA + GPUs — and it’s a winning strategy.

🔹 2. Leadership in Neuromorphic AI for Real-world Deployment

This move positions BrainChip as:
  • Not just neuromorphic researchers
  • But practical AI infrastructure builders — solving latency, power, and cost in mission-critical systems

🔹 3. Potential for IP Licensing or Defense-Grade Design Wins

As their radar and medical models mature, BrainChip could:
  • License these state-space models as embedded IP
  • Land major contracts in defense, aerospace, or medical technology sectors

Summary​

Tony Lewis’s comment reveals a strategic inflection point:
  • BrainChip is pioneering hardware-native AI model design
  • They’re breaking away from conventional pipelines (CNNs, 2D transforms)
  • And positioning themselves at the forefront of edge-native, low-latency radar and signal intelligence
This isn't just about a smarter chip — it’s about building a new end-to-end paradigm for AI at the edge.

So funny ... before I read this this morning (in Germany), I measured my blood pressure. Result: 115/72.
After reading it, I measured again. Result: 131/75.

What ChatGPT wrote is too exciting for me :eek: 🤣 🤑 . When BrainChip's SP explodes, it might kill me 🤯


Happy So Excited GIF
 
  • Haha
  • Like
  • Fire
Reactions: 17 users

Frangipani

Top 20


Edge AI Milan 2025​


Join BrainChip for Edge AI Milan 2025, July 2-4, an inspiring and forward-thinking event exploring how edge AI is bridging the gap between digital intelligence and real-world impact. Attendees can engage with industry leaders and experience the latest innovations, including BrainChip's Akida neuromorphic, event-driven AI at the edge.

Register



BrainChip will be exhibiting at Edge AI Milan 2025 next week.
It’s a pity, though, that no one from our company will be giving a presentation at that conference, especially since Innatera will be spruiking their technology.

In addition, Innatera’s Petruț Antoniu Bogdan will give a workshop on neuromorphic computing “on the current state of neuromorphic computing technologies and their potential to transform Edge AI” in his capacity as co-chair of the Edge AI Foundation Working Group on Neuromorphic Computing, which BrainChip also belongs to.
Our CMO Steve Brightfield is co-chair of another Edge AI Working Group, namely Marketing.


View attachment 87744





View attachment 87745 View attachment 87746



View attachment 87747


Pete Bernard, CEO of the Edge AI Foundation, on - you guessed it - Edge AI and the upcoming Edge AI Milan (🏆☕😊) in a recorded message to the Hackster.io community, which was played during their 26 June “Impact Spotlights: Edge AI” livestream, hosted by Edge Impulse staff who said they were not wearing their Edge Impulse hat for this event, but that of members of the Hackster community of developers. (In case you were wondering: None of the three projects introduced during the livestream used Akida as far as I could tell by fast-forwarding).






 
  • Like
  • Fire
  • Thinking
Reactions: 4 users

Rach2512

Regular
  • Like
  • Love
Reactions: 11 users
So funny ... before I read this this morning (in Germany), I measured my blood pressure. Result: 115/72.
After reading it, I measured again. Result: 131/75.

What ChatGPT wrote is too exciting for me :eek: 🤣 🤑 . When BrainChip's SP explodes, it might kill me 🤯


Happy So Excited GIF
At least you’ll still be alive; most of us will be dead by then 😂
 
  • Haha
Reactions: 7 users

CHIPS

Regular
At least you’ll still be alive; most of us will be dead by then 😂

😂 So true, but if it kills me at that moment, I won't have an advantage either. :ROFLMAO:
 
  • Thinking
  • Like
  • Haha
Reactions: 3 users
Gotta love the RTX Group relationship.



Transformative Technologies​

How do you develop the technology that will transform aerospace and defense?

You start with 57,000 engineers, 60,000 patents, $7.5 billion in annual research and development funding, and 10 technology roadmaps that reflect what our customers will need, well into the future.

Then you add a venture capital division that invests in companies trying bold new concepts in areas such as artificial intelligence and advanced propulsion, and you complement it with research involving more than 85 U.S. universities.

This is our formula for transformative technologies – the innovations that help our products and systems meet our customers’ most critical needs, as well as those they have yet to envision.

Advanced sensing​

We have a deep heritage in developing radio frequency, electro-optical/infrared sensing, communications and electronic warfare systems for space, air, ground, sea and undersea environments.

  • Our RF systems use higher-power microelectronics, increased processing power and software-defined apertures to achieve next-generation capabilities for radar, electronic warfare, communications and multifunction radio frequency applications.
  • We’re advancing electro-optical/infrared and other systems such as space-based multispectral sensors and electro-optical distributed aperture systems variants.
  • We are providing increased capability against advanced threats and countermeasures by enhancing high-bandwidth digital waveform generation, AI-enabled intelligent signal processing and advanced neuromorphic processing.
  • Our acoustic systems enable advanced mine-hunting and undersea networking capabilities through high sensitivity, directionality, multiple access and multi-mode active and passive capabilities for sonar, communications and navigation.
  • Our missile seekers counter a wide range of advanced threats through advanced processing and algorithms, all while achieving low size, weight, power and cost.
 
  • Like
  • Fire
  • Love
Reactions: 40 users

manny100

Top 20
Gotta love the RTX Group relationship.



Transformative Technologies​

How do you develop the technology that will transform aerospace and defense?

You start with 57,000 engineers, 60,000 patents, $7.5 billion in annual research and development funding, and 10 technology roadmaps that reflect what our customers will need, well into the future.

Then you add a venture capital division that invests in companies trying bold new concepts in areas such as artificial intelligence and advanced propulsion, and you complement it with research involving more than 85 U.S. universities.

This is our formula for transformative technologies – the innovations that help our products and systems meet our customers’ most critical needs, as well as those they have yet to envision.

Advanced sensing​

We have a deep heritage in developing radio frequency, electro-optical/infrared sensing, communications and electronic warfare systems for space, air, ground, sea and undersea environments.

  • Our RF systems use higher-power microelectronics, increased processing power and software-defined apertures to achieve next-generation capabilities for radar, electronic warfare, communications and multifunction radio frequency applications.
  • We’re advancing electro-optical/infrared and other systems such as space-based multispectral sensors and electro-optical distributed aperture systems variants.
  • We are providing increased capability against advanced threats and countermeasures by enhancing high-bandwidth digital waveform generation, AI-enabled intelligent signal processing and advanced neuromorphic processing.
  • Our acoustic systems enable advanced mine-hunting and undersea networking capabilities through high sensitivity, directionality, multiple access and multi-mode active and passive capabilities for sonar, communications and navigation.
  • Our missile seekers counter a wide range of advanced threats through advanced processing and algorithms, all while achieving low size, weight, power and cost.
Great find Fmf, neuromorphic smart sensors are the key to the future.

"Advanced sensing​

We have a deep heritage in developing radio frequency, electro-optical/infrared sensing, communications and electronic warfare systems for space, air, ground, sea and undersea environments."

"Artificial intelligence, machine learning and autonomy​

Our teams use artificial intelligence and machine learning to improve the design, development and testing of products – and to make them smarter, easier to use and more capable than ever, with enhanced safety"
 
  • Like
  • Love
  • Fire
Reactions: 11 users

Frangipani

Top 20
Back in March, I shared the following LinkedIn post that revealed a partnership between BrainChip and fortiss, the Research Institute of the Free State of Bavaria for software-intensive systems - as well as another one between fortiss and our competitor Innatera.

Which raises the question, by the way, of why fortiss is still missing from the Partners section of our website? 🤔



View attachment 87612


Yesterday, I spotted an interesting job vacancy placed by fortiss.
They are looking for someone who would like to write a Master’s thesis on the topic of “Event-based Neuromorphic for AR and Space”:

“You will be integrated in a small team of neuromorphic computing researchers at fortiss. You will be working on one of our neuromorphic algorithms, related with ongoing projects EMMANÜELA and CORINNE, as well as an upcoming ESA project about satellite communication.”



Who are we?​

fortiss is the research institute of the Free State of Bavaria for software-intensive systems and services with headquarters in Munich. The institute currently employs around 120 employees, who collaborate on research, development and transfer projects with universities and technology companies in Bavaria, Germany and Europe.

Research is focused on state of the art methods, techniques and tools of software development, systems & service engineering and their application to reliable, secure cyber-physical systems, such as the Internet of Things (IoT). fortiss has the legal structure of a non-profit limited liability company (GmbH). Its shareholders are the Free State of Bavaria (as majority shareholder) and the Fraunhofer Society for the Promotion of Applied Research.

www.fortiss.org


In the Neuromorphic Computing department, we are searching for
Master Students (M/F/D)

Your topic:​

You will be integrated in a small team of neuromorphic computing researchers at fortiss. You will be working on one of our neuromorphic algorithms, related with ongoing projects EMMANÜELA and CORINNE, as well as an upcoming ESA project about satellite communication:

  • a body motion recognition sensor with an event-based camera and online few-shots learning on neuromorphic hardware mounted on 3D AR-goggles
    • third person action recognition, ego gesture recognition, tennis dataset, ego motion compensation
  • continual learning SNN based algorithms for action classification on the fly and automatic labeling for robotic arm control (welding application)
    • improvement of class separation in continual learning algorithm, integration with robotic arm
  • fault detection, isolation and recovery on neuromorphic hardware for satellite communications
    • spiking fault detection neural network, fault prediction, fault classification, hardware implementation on neuromorphic chip
  • efficient time encoding on neuromorphic hardware
    • spike encoding optimization, trainable encodings, latency reduction
  • efficient Lidar neuromorphic processing for automated driving
A publication, in a scientific journal or conference, as a co-author of your supervisor, is most likely expected at the end of your mission.

Your profile:​

  • Excellent bachelor degree in electronics engineering, computer science or similar topic
  • Excellent track record in your current masters
  • Fluency in software development (Python for example)
  • Interest or experience in machine vision, event-based cameras, sensors
  • Residency in Munich or close surroundings

Our offer:​

  • Work in partnership with leading companies in AI
  • Offices in central Munich (Nordfriedhof) in very high and modern premises with 360° view on Munich
  • Nice working atmosphere in a small team

Did we catch your interest?​

Please submit your application with a cover letter, a detailed CV, and a current transcript of records.

Job-ID: NC-MS-2025-1
Contact: Dr. Axel von Arnim



  • Ralf Kohlenhuber
  • Human Resources Administrator





We know for sure that the fortiss researchers are using Loihi for their ongoing CORINNE project (project duration: 1 April 2024 - 31 March 2026):



The EMMANÜELA project, which investigates the potential of neuromorphic computing for AR/VR applications, started on 1 November 2024 and will run until 31 October 2026. The project webpage doesn’t specify what neuromorphic hardware is being used.

Hard to say whether we could be involved here. If we are, likely not from the outset, as our partnership with fortiss was only revealed in mid-March, when Axel von Arnim and Jules Lecomte met up with the BrainChip and Innatera teams at Embedded World “to discuss the latest advancements in Neuromorphic Computing and explore how we can push the boundaries of AI at the edge.” Sounds to me as if this exploration was only getting started at the time.

The fortiss LinkedIn post then lists some key takeaways from the discussion, which in BrainChip’s case, revolved around Akida 2.0 and TENNs, and concludes: “Together we strengthened our collaboration plans. Stay tuned for more updates as we continue to drive neuromorphic innovation forward! 🚀

The bolded sentence again suggests that up to then (3.5 months ago), no formal collaboration had taken place, which makes it rather unlikely that we are involved in the EMMANÜELA project, which - in addition - would have been planned well in advance in order to get the awarded funding from the Bavarian Ministry of Economic Affairs, Regional Development and Energy.

But then there is this passage towards the end of the project webpage, where it says “fortiss contributes directly to the generation of event-based datasets for action recognition in AR/VR and researches neuromorphic algorithms for this purpose. These are implemented and benchmarked on neuromorphic hardware. In addition, fortiss is involved in building and testing the prototype.”

Does the word “benchmarked” here refer to the algorithms being benchmarked against each other on the same hardware or different kinds of neuromorphic hardware being benchmarked against each other? 🤔

Although the word “continue” in the LinkedIn post’s last sentence (“Stay tuned for more updates as we continue to drive neuromorphic innovation forward!”) could be interpreted as hinting at an already ongoing collaboration, it might just be a non-native speaker’s awkward way of expressing that all three entities in their own right were already driving neuromorphic innovation forward and will continue to do so - both BrainChip and Innatera now also in tandem with fortiss. Neither does the “more” in “more updates” IMO imply a stealth-mode partnership history, as an “update” by definition requires a preceding post. So logically speaking, the above post cannot have been an update itself, which could theoretically be followed by “more updates”.
I suppose “Stay tuned for updates” would have been the correct wording either way.

The mention of an upcoming ESA project about satellite communication, on the other hand, sounds very intriguing, unless “one of our neuromorphic algorithms” were to relate to all three projects (but if so, why the comma after CORINNE?) and then possibly rules us out, in case this signifies it must then refer to a leaky-integrate-and-fire neuron model that Loihi uses (since we know CORINNE uses Loihi). I am out of my depth here, though. 🤔

Or is the sentence (grammatically incomplete either way) missing an [on] rather than a [with/to], so that the work on the ESA satellite project is entirely separate from the neuromorphic algorithm related to the other two projects?

“You will be working on one of our neuromorphic algorithms, related with ongoing projects EMMANÜELA and CORINNE, as well as [on] an upcoming ESA project about satellite communication.”

Unfortunately there is no German version of the job description to compare.


Interestingly, the job ad says the applicant can look forward to working “in partnership with leading companies in AI”, which sounds as if both Intel (for CORINNE) and at least one other company are involved.
Which may or may not be BrainChip.

Yes, Laurent Hili is very impressed by Akida, but he and his fellow researchers at ESA may very well still like to test Innatera’s offerings for space applications. The TU Delft spin-off also gave a presentation at Morpheus 2024, ESA’s workshop on Edge AI and Neuromorphic Hardware Accelerators (https://indico.esa.int/event/514/timetable/#20240506).



View attachment 87625



And then there is of course also fortiss-tried and tested SpiNNaker…


In addition to the three projects mentioned earlier, the job description also casually lists “efficient Lidar neuromorphic processing for automated driving”.
Wow, that is surely gonna be quite a doorstopper of a Master’s thesis! 😉

I guess we’ll just have to wait for the promised updates, then, to find out what fortiss has in store for Akida and TENNs.

Our partner fortiss and their partner Interactive Wear (a 20-year-old company that emerged from a management buy-out of Infineon Technologies’ Wearable Technology Solutions) will be “presenting neuromorphic computing and its applications in the defense sector” at the upcoming SVI Future Congress (SVI = Systemverbund Verteidigung im Innovationsraum) in Munich, aimed at a target audience [LOL 🎯] of security and defense industry experts.

I couldn’t find any additional information online about the collaboration between fortiss and Interactive Wear.



F50F1943-3C0A-4FA1-B4BC-91B99C10BB12.jpeg



DECDB794-C21A-4FE0-93B1-BB37D944C2F9.jpeg





B8183C90-9590-4D5E-9110-210BA801B1B1.jpeg
42698017-5CFD-4E6F-95D7-360C0E086EBE.jpeg




FA5345E4-BBF5-4612-B58C-05E93ED554CA.jpeg



39863A69-DC35-4EFB-8621-3DBBBFBC782C.jpeg




B15441AB-2BB5-423C-AE10-924BA4EB8894.jpeg


The event website of the fully-booked conference links to a list of countries published by the German Federal Ministry of the Interior regarding security clearance concerns.
Citizens of countries listed are barred from attending the 8/9 July conference.

Somewhat surprisingly, Ukraine is also on that list, which was published on June 8, 2022, just over three months after Russia had invaded Ukraine (again), and hasn’t been updated since. Possibly due to concerns about pro-Putin ethnic Russians with Ukrainian citizenship?

1FEBC93E-096E-453C-8CF9-6A4F5C540BE7.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 10 users

itsol4605

Regular
2027: BRN $5
 
  • Like
  • Haha
Reactions: 3 users

itsol4605

Regular
Relax
 
  • Like
Reactions: 3 users