A couple of years before I became a BRN shareholder, I came across an Austrian company called g.tec medical engineering GmbH (https://www.gtec.at), founded in 1999 by Christoph Guger and Günter Edlinger as a spin-out of TU Graz, and one of their products, mindBEAGLE, which uses Brain-Computer-Interface (BCI) technology to assess patients suffering from disorders of consciousness or locked-in syndrome, helps with outcome prediction, and even provides very basic communication with some of them.
Over the past 25+ years, g.tec medical engineering have specialised “in developing high performance brain-computer interfaces and neurotechnologies for both invasive and non-invasive recordings in research and clinical settings” (
https://www.gtec.at/about/) and are one of the leading companies, if not THE leading company, in this field world-wide.
In a November 2024 interview (
https://www.gtec.at/2024/11/04/leading-the-bci-field/), Co-Founder and Co-CEO Christoph Guger, who has degrees from both Johns Hopkins University and TU Graz, shared the following about g.tec medical engineering’s impressive journey:
“We sell our platform to many Universities like Harvard, Stanford, Yale, and of course, Johns Hopkins and have expanded to 100 countries around the world.
Besides that, the systems are used for technology developments by major industrial players like BMW, Airbus, Meta, Apple, Amazon, and many more. About 10 years ago we started with the development of medical products that we sell to hospitals and rehabilitation clinics.
We established a franchise system that allows businessmen and therapists to use neurotechnology in their centers to treat patients. With our recoveriX system for the neurorehabilitation of stroke patients and patients with Multiple Sclerosis, we are already in many countries, and up to about 50,000 treatments were done.”
They have since also teamed up with Tobii to offer integrated EEG and eye-tracking technology.
Most of you will be familiar with the term Brain-Computer-Interface (BCI) - sometimes also called Brain-Machine-Interface (BMI) or Mind-Machine-Interface (MMI) - but may not be fully aware of what it actually means.
In a December 2015 publication, Christoph Guger (that’s, by the way, where the G in g.tec comes from - it stands for Guger Technologies) and two of his co-authors described a BCI as follows:
“A BCI is a device that reads voluntary changes in brain activity, then translates these signals into a message or command in real-time (…) Most BCIs rely on the electroencephalogram (EEG). These signals (also called “brainwaves”) can be detected with electrodes on the surface of the head. Thus, these “noninvasive” sensors can detect brain activity with very little preparation. Some BCIs are “invasive”, meaning that they require neurosurgery to implant sensors. These BCIs can provide a much more detailed picture of brain activity, which can facilitate prosthetic applications or surgery for epilepsy and tumor removal.”
The implants used in clinical trials by Neuralink (founded in 2016 by Elon Musk and a team of eight scientists and engineers) are the most well-known examples of invasive BCIs. And while we BRN shareholders tend to roll our eyes when our company’s silicon gets confused with Musk’s “brain chips”, there is no doubt that BrainChip’s technology is also being evaluated in this field of BCIs.
In 2020, g.tec medical engineering introduced the BCI & Neurotechnology Spring School, a free ten-day virtual event - now held annually - which has become the world’s largest neurotech event, orchestrated from a small town in Austria called Schiedlberg. Participants can access 140 hours of cutting-edge education and even earn 14 ECTS* credits and an official exam certificate at no cost.
*ECTS = European Credit Transfer and Accumulation System
I noticed that one of last year’s 82,000 (!) participants was Temi Mohandespour, who worked as a research scientist at BrainChip’s now-closed Perth office from March 2021 until January 2025. She has since moved to Berlin and now works for Data4life, a non-profit organisation whose mission is to digitalise health data for research (www.data4life.care/en/).
https://www.linkedin.com/posts/temi-mohandespour_here-is-a-big-thank-you-to-gtec-medical-activity-7193097495894208513-9euk?
View attachment 92732
Several of her colleagues at BrainChip gave her above “thank you” post a like, including our CTO.
While I wasn’t able to find out anything concrete about what Temi Mohandespour may have been working on relating to BCIs during her last nine months at BrainChip post-Spring School, I happened to discover the LinkedIn profile of someone else who worked not only on one, but on two BCI projects utilising Akida -
although not as an employee of BrainChip:
https://www.linkedin.com/in/hammouamri-ilyass/
View attachment 92729
Ilyass Hammouamri, who recently defended his PhD thesis at the Université de Toulouse (
https://doctorat.univ-toulouse.fr/as/ed/cv.pl?mat=140961&site=EDT)
and whose PhD supervisor was Timothée Masquelier (one of the four co-inventors of the JAST patent that BrainChip first licensed and later acquired),
was a part-time research engineer at Neurobus between September 2024 and April 2025.
It was during that time - still under Gregor Lenz as CTO - that he “developed a Proof of Concept solution for motor imagery classification from a Dry EEG Headset using a BrainChip Akida neuromorphic chip for robotic arm control”.
“Motor imagery (MI) is a mental process in which a subject vividly imagines performing a movement without any actual physical execution. MI is widely used in BCI systems to enable control of external devices, such as a cursor on a screen or a robotic arm, through brain activity.”
https://docs.medusabci.com/kernel/1.4/tutorials.php (by the Biomedical Engineering Group at the University of Valladolid, Spain)
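To make “motor imagery classification” a bit more concrete, here is a rough sketch of what such a pipeline typically looks like in the BCI literature. To be clear: this is purely illustrative, runs on synthetic data, and is not Neurobus’s actual Proof of Concept, nor does it involve Akida - the classic recipe is band-power features in the mu and beta bands over sensorimotor channels, a simple classifier, and a mapping from the decoded class to a robot command.

```python
# Illustrative motor-imagery (MI) pipeline sketch - NOT the Neurobus/Akida PoC.
# Assumptions: synthetic 8-channel EEG at 250 Hz, two classes ("left hand" vs
# "right hand" imagery), classic mu/beta band-power features and an LDA classifier.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS, N_CH, TRIAL_S = 250, 8, 3.0          # sampling rate, channel count, trial length (s)

def bandpower_features(trials, band, fs=FS):
    """Log band power per channel after a Butterworth band-pass filter."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, trials, axis=-1)
    return np.log(np.var(filtered, axis=-1))          # shape: (n_trials, n_channels)

def make_synthetic_trials(n_trials, rng):
    """Fake EEG: each class gets extra 10 Hz (mu) power on a different half of the channels."""
    t = np.arange(int(TRIAL_S * FS)) / FS
    X = rng.standard_normal((n_trials, N_CH, t.size))
    y = rng.integers(0, 2, n_trials)
    mu = np.sin(2 * np.pi * 10 * t)                    # 10 Hz mu rhythm
    X[y == 1, N_CH // 2:, :] += 0.8 * mu               # lateralised power difference
    X[y == 0, : N_CH // 2, :] += 0.8 * mu
    return X, y

def features(X):
    """Concatenate mu (8-13 Hz) and beta (13-30 Hz) band power per trial."""
    return np.hstack([bandpower_features(X, (8, 13)), bandpower_features(X, (13, 30))])

rng = np.random.default_rng(0)
X_train, y_train = make_synthetic_trials(120, rng)
X_test, y_test = make_synthetic_trials(40, rng)

clf = LinearDiscriminantAnalysis().fit(features(X_train), y_train)
print("held-out accuracy:", clf.score(features(X_test), y_test))

# In a real system, each decoded label is mapped to a robot command, e.g.:
COMMANDS = {0: "move arm left", 1: "move arm right"}
print(COMMANDS[int(clf.predict(features(X_test[:1]))[0])])
```

On Akida, the classifier stage would presumably be replaced by a (spiking) neural network running on the chip itself, but the overall signal flow - EEG in, features, classification, arm command out - stays the same.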
I wonder whether this project may have been the continuation of the BMI* project that Neurobus’s first employee, Ljubica Cimeša, had developed in collaboration with Airbus, which also used EEG signals for robotic control:
*The terms Brain-Computer-Interface (BCI) and Brain-Machine-Interface (BMI) are often used interchangeably.
https://www.linkedin.com/in/cimesa-ljubica/
View attachment 92736
View attachment 92737
But his part-time contract job with Neurobus was not the first time Ilyass Hammouamri had been involved in BCI research using Akida:
During his time at CNRS (Centre national de la recherche scientifique) CerCo (Centre de Recherche Cerveau et Cognition) in Toulouse, where he was a PhD candidate in Timothée Masquelier‘s NeuroAI lab from September 2021 to February 2025, he “worked on a joint project between different labs and BrainChip: Decoding speech from ECoG brain signals”.
Which means there must have been at least one more lab involved in that project, possibly more.
ECoG stands for electrocorticography. In contrast to EEG, it involves recording electrical activity directly from the surface of the brain and thus requires a craniotomy.
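For a bit of technical context on what “decoding speech from ECoG” usually involves (this is the generic approach found in the published literature - I obviously have no insight into what this particular joint project with BrainChip actually used): the envelope of the so-called high-gamma band (roughly 70–150 Hz) is extracted for each electrode, because it tracks local neural firing quite well, and those feature frames are then fed into a decoder that predicts phonemes, words or speech features. A minimal sketch on synthetic data:

```python
# Sketch of a typical ECoG feature-extraction step for speech decoding
# (generic textbook approach on synthetic data - not this project's actual code).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 1000                                  # ECoG is usually sampled at ~1 kHz or more
N_ELECTRODES, DURATION_S = 64, 2.0

def high_gamma_envelope(ecog, fs=FS, band=(70.0, 150.0)):
    """Band-pass each electrode to the high-gamma range and take the Hilbert envelope."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecog, axis=-1)
    return np.abs(hilbert(filtered, axis=-1))        # shape: (n_electrodes, n_samples)

def window_features(envelope, fs=FS, win_ms=50):
    """Average the envelope in non-overlapping windows -> one feature vector per frame."""
    win = int(fs * win_ms / 1000)
    n_frames = envelope.shape[-1] // win
    trimmed = envelope[..., : n_frames * win]
    return trimmed.reshape(envelope.shape[0], n_frames, win).mean(axis=-1).T  # (frames, electrodes)

rng = np.random.default_rng(1)
ecog = rng.standard_normal((N_ELECTRODES, int(DURATION_S * FS)))   # fake recording

frames = window_features(high_gamma_envelope(ecog))
print(frames.shape)   # (40, 64): 40 time frames of 64 high-gamma features
# These frames would then be fed to a decoder (e.g. a spiking network, if one were
# targeting Akida) that maps neural activity to phonemes, words or speech features.
```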
en.wikipedia.org
View attachment 92731
Here is a good illustration I found online, which happens to be from a video by g.tec medical engineering:
View attachment 92730
I have no idea whether or not any of g.tec medical engineering’s products (such as wearable EEG headsets or biosignal amplifiers) were actually used for either of the two BCI projects that Ilyass Hammouamri was involved in.
What I can tell you, though, is that they list Airbus under “Happy Customers” alongside quite a few other interesting names (
https://www.gtec.at/).
Stumbled across more info today about the “joint project between different labs and BrainChip” (“Decoding speech from ECoG brain signals”) that Ilyass Hammouamri was involved in during his time at CNRS (Centre national de la recherche scientifique) CerCo (Centre de Recherche Cerveau et Cognition) in Toulouse, where he was a PhD candidate in Timothée Masquelier’s NeuroAI lab from September 2021 to February 2025:
The ANR (Agence Nationale de la Recherche / French National Research Agency) BRAIN-Net project started in December 2020 and ran for four years, which means it ended about a year ago.
It was coordinated by Blaise Yvert from the Grenoble Institute of Neuroscience, whose goal it is to “restore speech to people who are paralyzed and who have lost their vocal abilities. Along with his team at the Grenoble Institute of Neuroscience, he is developing a system capable of decoding the brain signals associated with speech, so that it can be produced by an external device. This is referred to as a brain-computer interface.” (quoted from the article on Blaise Yvert below)
The Bioelectronics research group of the IMS Laboratory recently published two articles in the prestigious science journals Nature Electronics and Nature Communications: 🔹 « A ferroelectric–memristor memory for both training and inference », in Nature Electronics. This paper reports a unified...
www.linkedin.com
FYI: The linked article co-authored by researchers from France and Japan and published in Nature Communications -
https://www.nature.com/articles/s41467-025-64231-2 - does not mention BrainChip or Akida.
Large-scale neural recordings using high-density electrode arrays are key to understanding brain dynamics and designing brain-computer interfaces for rehabilitation. These devices produce large data flows that raise new challenges to extract relevant information in real time with limited power...
anr.fr
Here is an interesting six-month-old article about the research conducted by BRAIN-Net project coordinator Blaise Yvert, Inserm* Research Director and head of the Neurotechnologies and network dynamics team at Grenoble Institute of Neuroscience:
*INSERM (= Institut National de la Santé et de la Recherche Médicale) is the French National Institute of Health and Medical Research.
At the Grenoble Institute of Neuroscience, Blaise Yvert is developing a device capable of decoding the brain signals associated with speech, so that it can be produced by an external device. This is referred to as a brain-machine interface.
www.inserm.fr
Blaise Yvert: Getting the Brain to Talk (published on 10/06/2025)
Blaise Yvert has one goal – restore speech to people who are paralyzed and who have lost their vocal abilities. Along with his team at the Grenoble Institute of Neuroscience, he is developing a system capable of decoding the brain signals associated with speech, so that it can be produced by an external device. This is referred to as a brain-computer interface.
Blaise Yvert is an Inserm Research Director and head of the Neurotechnologies and network dynamics team at Grenoble Institute of Neuroscience (unit 1216 Inserm/Grenoble-Alpes University) in Grenoble.
Will Yvert’s research restore speech to those who have lost it? This is what the Inserm Research Director leading the Neurotechnologies and Network Dynamics team at the Grenoble Institute of Neuroscience is hoping. For the past ten years, he has been working on the development of a brain-computer interface to decode the brain signals of speech and reproduce the words of people who are unable to utter them. His project was recently selected as part of the Impact Santé program, funded by the France 2030 investment plan and coordinated by Inserm. Brain Implant, the scientific consortium he has formed for this project, has received three million euros to develop a new brain implant that will improve the accuracy of speech reconstruction based on brain activity.
From engineer to researcher
This desire dates back to his engineering studies at École centrale de Lyon and Cornell University in the US. « I was drawn to research and wanted to develop health technologies, especially for people with disabilities. I knew several people with disabilities when I was younger and it’s a cause I hold dear », explains Yvert.
Once he graduated in 1993, the young engineer was hired by an Inserm human electrophysiology research unit in Lyon. « The team sought to mathematically locate the brain regions responsible for the signals recorded on the surface of the head. This was something I was particularly interested in », he recalls. During two postdocs, one in Finland and the other in Germany, the researcher used this approach to identify the auditory areas. But he realized that, even for very simple sounds, the cortex activation pattern is too complex to be finely understood with non-invasive recordings. « So I thought: let’s develop more sophisticated systems, for a more precise look at what happens in the neural networks. »
Towards a new technology
With this goal in mind, and after obtaining a research fellowship at Inserm, Yvert joined in 2003 a research unit in Bordeaux that focuses on neural networks in the developing spinal cord. There, he initiated a partnership with the French Alternative Energies and Atomic Energy Commission (CEA) in Grenoble and the ESIEE engineering school in Paris, which has academic laboratories, to develop microelectrode networks to enable detailed exploration of neural tissue activity in vitro. An initial prototype was finalized three years later. Through multiple collaborations, he continued to improve this technology, particularly with new materials to increase the performance of electrodes (platinum, diamond and, more recently, graphene).
Then, Yvert wanted to put his research to work for patients. With this project in mind, he spent a year at Brown University in the US, in a research unit that led the way in implantable brain-computer interfaces in humans. Back in France, he joined the Grenoble Institute of Neuroscience and began his project on decoding brain signals of speech. In particular, he collaborated with the Clinatec institute created by the CEA, « a unique environment for creating new rehabilitation strategies for people with paralysis », he believes.
The interface to which Yvert devoted his work is aimed particularly at people with “locked-in syndrome” (LIS). Although they cannot move or speak due to complete paralysis, their cognitive faculties are intact. “The cortical activities produced when they want to say something are always present, so if we can decode them with our implants, we can reproduce what they want to say”, hopes the researcher. An initial clinical trial is expected to start in 2025, “if the regulatory procedures go well”, he warns. This trial will include people with LIS who will be equipped with an implant developed by Clinatec, positioned on the surface of the brain. “This device provides signals that are highly stable over the long term, with wireless transmission through the skin”, he explains.
Pursue and accelerate development
At the same time, the scientist does not forget the fundamental aspect, which has always been a source of motivation in his work. « For example, we’re exploring the brain activity of a new animal model that is very vocal – the pig. This model allows us to test new, more efficient types of implants for potential future use in humans. It will also be possible to see whether there are similarities between the data collected in animals and humans ».
In order to finely decode brain activity, he believes that the devices will still need to be improved, by increasing the number of electrodes, and by innovating in materials and integrated electronics. This is the goal of the Brain Implant project. « We want to create a technological building block that would serve both basic research and to develop brain-computer interfaces for clinical use in different indications: to restore speech or other motor functions », he explains.
These developments and their challenges for people and society are inevitably accompanied by ethical questions around which Yvert has set up processes of reflection, conducted in collaboration with philosophers and patient organizations.
And as if all of this were not enough, the researcher has also led, since early 2025, the Grenoble Initiative in Medical Devices (LabEx GIMeD), a research partnership on medical devices. « The aim is to bring together multidisciplinary units that develop health technologies, including teams specialized in the humanities and social sciences, to reflect on the implications of these technologies. New projects are expected to emerge from this ecosystem », he outlines for the future.
Looking back, Yvert notes that risk-taking during his career has been successful. “Going from non-invasive brain recording in humans to the technological development of in vitro systems took me out of my comfort zone. But in the end, this leap was essential in preparing for the development of an interface that, I hope, will one day be able to provide real services to patients”, he concluded.