I'm reading rumours about the Switch 2 having a chat function. If it also has some sort of voice recognition function, that could be where they utilise Akida?
Love your informative research. Thanks Bravo.
Hi JB49,
Are you talking about the tech details that have been revealed recently in the FCC filings? The filings show there is Near Field Communication (NFC) support through a Radio Frequency Identification (RFID) feature in the right Joy-Con of the console.
I asked ChatGPT for its thoughts and here's the response I received, which should all be taken with a grain of salt until we find out more when it's officially revealed on April 2.
View attachment 79049
View attachment 79050
This link leads me to nowhere…
Could be a good result here: newcomer Anusha Madan, Machine Learning Engineer, has liked the Edge Impulse post on the Qualcomm acquisition.
LinkedIn Login, Sign in | LinkedIn (www.linkedin.com)
Apologies if posted already.
Hope everyone is having a great day!
Ah ok, it was the LinkedIn page for Edge Impulse announcing the Qualcomm acquisition. Sorry the link didn't work. If you have LinkedIn, just locate it and you will see that Anusha Madan from BrainChip liked it.
This link leads me to nowhere… it says this site does not exist.
Ok… so I have to do it by myself? That's kind of… anyway… thanks for NOTHING!!!!!
Interesting like from Mr Brightfield imo
In January 2019, I was given the honor and challenge to lead the newly… | Rafa Camargo | 25 comments
In January 2019, I was given the honor and challenge to lead the newly formed Hardware team at what would later become Meta Reality Labs. From inception, the… | 25 comments on LinkedIn (www.linkedin.com)
How Meta Made Silicon Carbide Waveguides & Unlocked Orion’s Large Field of View
As clear as the choice of silicon carbide as a substrate seems today, when we first started down the road to AR glasses a decade ago, it was anything but. (www.meta.com)
Specialized Silicon: How Meta’s Custom Chips Are Revolutionizing Augmented Reality
In 2017, Reality Labs Chief Scientist Michael Abrash, backed by Meta Founder & CEO Mark Zuckerberg, established a new, secret team within what was then Oculus Research to build the foundation of the next computing platform. Their herculean task: Create a custom silicon solution that could... (www.meta.com)
View attachment 79054 View attachment 79055
Orion's Compute Puck: The Story Behind the Device that Helped Make Meta's AR Glasses Possible
Last year at Connect, we unveiled Orion, our first true pair of AR glasses. Today, we're turning our attention to an unsung hero of Orion: the compute puck. (www.meta.com)
YEAHHHH YOUR STATEMENT IS GENIUS LEVEL THINKING … GOOO BRAINCHIIIIP
1:45 PM - 2:15 PM
Information
Dr. Anthony Lewis
Brainchip
Fast Online Recognition of Gestures using Hardware Efficient Spatiotemporal Convolutional Networks via Codesign
The Temporal Neural Networks (TENNs) developed by Brainchip can be used to tackle a wide range of vision tasks, including object detection, eye tracking and gesture recognition. Here, we will show how the codesign of model architecture, training pipeline and hardware implementation can combine to achieve SOTA performance, using a gesture recognition task example.
The TENNs architecture leverages multiple techniques to improve its efficiency on compatible hardware (such as the Akida chip). First, although effectively offering a 3D convolution, it uses spatially and temporally separable convolutions to make the model lighter in parameter count with equivalent computational power. Second, when deployed on dedicated hardware, temporal inputs are buffered efficiently to minimize memory usage and data movement. Finally, it is possible to reduce model computation even further by adding regularization to boost sparsity of information transiting in the already slim network (achieving more than 90% average activation sparsity in some layers) and thus further improve the efficiency on compatible hardware.
We apply a lightweight TENN model to a gesture recognition task, showing that it can accurately classify the movements performed by a range of actors with SOTA accuracy. The efficiency of the model is then pushed further with virtually no cost to accuracy by applying regularization of activations.
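For anyone wanting a feel for what "spatially and temporally separable convolutions" plus activation-sparsity regularization look like in practice, here is a minimal PyTorch sketch. To be clear, this is not BrainChip's TENNs code; the layer sizes, the ReLU, and the L1 penalty and its weight are illustrative assumptions, just to show the two ideas named in the abstract.

```python
# Minimal sketch of two ideas from the abstract, NOT BrainChip's actual TENNs implementation.
# (1) A "3D" convolution factorized into a temporal-only conv followed by a spatial-only conv.
# (2) An L1 penalty on post-ReLU activations to encourage sparsity.
# All shapes, channel counts and the penalty weight below are made-up assumptions.
import torch
import torch.nn as nn


class SeparableSpatioTemporalBlock(nn.Module):
    """Temporal conv across frames, then a spatial conv within each frame."""

    def __init__(self, in_ch: int, out_ch: int, t_kernel: int = 5, s_kernel: int = 3):
        super().__init__()
        # Temporal-only kernel (t, 1, 1): mixes information across time steps.
        self.temporal = nn.Conv3d(in_ch, out_ch, kernel_size=(t_kernel, 1, 1),
                                  padding=(t_kernel // 2, 0, 0))
        # Spatial-only kernel (1, k, k): mixes information within each frame.
        self.spatial = nn.Conv3d(out_ch, out_ch, kernel_size=(1, s_kernel, s_kernel),
                                 padding=(0, s_kernel // 2, s_kernel // 2))
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time, height, width)
        return self.act(self.spatial(self.act(self.temporal(x))))


def activation_sparsity_penalty(acts: torch.Tensor) -> torch.Tensor:
    """L1 penalty on activations: pushes many post-ReLU values to exactly zero."""
    return acts.abs().mean()


if __name__ == "__main__":
    block = SeparableSpatioTemporalBlock(in_ch=2, out_ch=8)
    clip = torch.randn(1, 2, 16, 64, 64)       # one 16-frame clip; sizes are made up
    feats = block(clip)
    task_loss = feats.pow(2).mean()            # placeholder for a real gesture-classification loss
    loss = task_loss + 1e-4 * activation_sparsity_penalty(feats)
    loss.backward()
    zero_frac = (feats == 0).float().mean().item()
    print(f"fraction of exactly-zero activations: {zero_frac:.1%}")
```

The factorization is where the saving comes from: a full t x k x k kernel needs t*k*k*Cin*Cout weights, while the split version needs roughly t*Cin*Cout + k*k*Cout*Cout. And as I read the abstract, the "buffered efficiently" point is that the temporal conv only ever needs the last t frames, so dedicated hardware can serve it from a small rolling buffer instead of reprocessing the whole clip.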
Looks like Tony will be taking the stage and delivering some magic on behalf of the Brainchip Family!!
In just a little over 7.5 hours from now, so work out your own time zones if you're interested. I am happy he is attending this event;
as our CTO you must respect the fact that, yes, it's a team thing, BUT Tony is now our leader on the technical side, as Peter has retired.
Come on Brainchip, we are worth way more than our current ASX share price indicates, it's an absolute joke.......
Tech x
Great find.... If nothing else, you would hate to be developing the latest Meta glasses and watching your competition advertise that they are implementing the latest cutting-edge neuromorphic tech, regardless of whose tech it is, while you are not... It's a win.
Speaking of smart glasses, what have we here?
Fortiss is actively developing smart glasses prototypes incorporating neuromorphic technology through their EMMANÜELA project, which focuses on energy-efficient human-machine interaction in augmented and virtual reality (AR/VR) settings. This initiative leverages neuromorphic sensor technology and computing to enhance immersive digital experiences.
Can't find anything to suggest a direct collaboration between Fortiss and BrainChip at this time, so will keep it on the watch list.
View attachment 79058
View attachment 79059
Neuromorphic computing creates an immersive digital experience
In the EMMANÜELA project, fortiss is researching the use of neuromorphic sensor technology and neuromorphic computing in AR/VR. The aim is to investigate the potential of these technology-inspired approaches for augmented digital worlds and to open up new application possibilities. (www.fortiss.org)
Same company using Loihi for camera and many other things.
BrainChip is, as we know, involved in the Intel accelerator program… so… who knows… everything is possible. But we will, as always, just speculate…
ELEANOR
Following on from the INRC3 project, where a robotic arm is taught to insert an object using only force feedback, the ELEANOR project (Energy and Latency Efficient Object Insertion Using a Robotic Arm Equipped with an Event-Based Camera and Neuromorphic Hardware) uses an event-based camera to make the arm approach the slot. Optical flow and 3D reconstruction via Intel's Loihi research chip are used for this purpose.
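For anyone unfamiliar with event cameras, here is a toy Python sketch of what the data looks like and how a very crude motion estimate can be pulled out of it. This is not the fortiss/ELEANOR pipeline and has nothing to do with Loihi; the resolution, time windows and the centroid-shift "flow" are made-up simplifications, purely to illustrate the sparse (x, y, timestamp, polarity) stream such a camera produces.

```python
# Toy illustration of event-camera data, NOT the fortiss/ELEANOR or Loihi pipeline.
# An event camera emits sparse (x, y, timestamp, polarity) events instead of full frames.
# We bin events into two time windows and estimate a crude global motion vector from the
# shift of the event centroid -- a stand-in for real optical flow. Everything here is made up.
import numpy as np


def accumulate(events: np.ndarray, t0: float, t1: float, shape=(64, 64)) -> np.ndarray:
    """Histogram the events with t0 <= t < t1 into a 2D count image."""
    win = events[(events[:, 2] >= t0) & (events[:, 2] < t1)]
    img = np.zeros(shape)
    np.add.at(img, (win[:, 1].astype(int), win[:, 0].astype(int)), 1)
    return img


def centroid_shift(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Shift of the event-mass centroid between two windows (very crude 'flow')."""
    def centroid(img: np.ndarray) -> np.ndarray:
        ys, xs = np.nonzero(img)
        weights = img[ys, xs]
        return np.array([np.average(xs, weights=weights), np.average(ys, weights=weights)])
    return centroid(img_b) - centroid(img_a)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stream: a noisy cluster of events drifting about 5 px to the right over 0.2 s.
    n = 2000
    t = rng.uniform(0.0, 0.2, n)
    x = 20 + 5 * (t / 0.2) + rng.normal(0, 1.0, n)
    y = 32 + rng.normal(0, 1.0, n)
    polarity = rng.integers(0, 2, n)                 # unused below, kept to show the event format
    events = np.stack([x, y, t, polarity], axis=1)   # each row: (x, y, timestamp, polarity)

    first = accumulate(events, 0.0, 0.1)
    second = accumulate(events, 0.1, 0.2)
    print("estimated centroid shift (px):", centroid_shift(first, second))
```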
IBM and Loihi are mentioned everywhere on their website, but no BrainChip or Akida.
Let's hope they are trying something new.