BRN Discussion Ongoing

Slade

Top 20
So… basically no denoising in this clip… can someone confirm that?
Jeez, you're annoying. Watch the clip: it provides a great live demo of Akida performing denoising.
 
Reactions: Like · Haha · Love (25 users)

Townyj

Ermahgerd
Reactions: Like · Fire · Love (16 users)

Tothemoon24

Top 20
Nice to see the top brass at Tata posting about us

[Screenshot of the Tata post]
 
Reactions: Like · Fire · Love (53 users)

7für7

Top 20
[Quoting Tothemoon24's post above about Tata]

Hm? What? Wow… another praising post about Akida. Nice. Thank you, Mr. Tata!! Aaaaaaand… good night.



PS: If anyone is wondering what's going on… don't worry, I'm just trying out some reverse-psychology tactics… I call them the Dolcise Principle.

So… GO BRAINCHIP!

Sorry… I mean… no, don't go…
 
Reactions: Like (1 user)

Bravo

If ARM were an arm, BRN would be its biceps 💪!
So weird.

I was just checking out the latest news about Trump trying to support Elon Musk's Tesla brand, as the company has become a target in the last few days for Americans who are incensed about Elon's handling/mishandling of DOGE.

So, Trump has tried "bigging up" Tesla by "purchasing" a Tesla of his very own. LOL.

Anyway, I just wanted to mention this because I was trying to find out more about the maximum distance electric vehicles can achieve per charge. I believe Mercedes holds the current record with the EQXX, in which BrainChip featured.

To cut a long story short, while attempting this research I stumbled upon a previous post from July 2022 (#91,953) and found it very interesting, to say the least. What's interesting is that Trump actually said at the time that Elon's "electric cars don't drive long enough".

Suffice it to say, I am merely trying to point out that Mercedes enlisted BrainChip's help with the EQXX, with the EE Times article linked here stating: "Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years," according to Mercedes. "When applied at scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies."

If Elon doesn't have us on his radar, perhaps he should, since AI will feature heavily in every electric vehicle in the future and therefore has the potential to affect the maximum distance per charge if not managed correctly.
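To put rough numbers on that last point, here is a back-of-the-envelope sketch. Every figure in it (pack size, driving consumption, trip length, the AI power draws) is an assumption I picked for illustration, not a measured number:

```python
# Back-of-the-envelope: how onboard AI power draw eats into EV range.
# All numbers are illustrative assumptions, not measured figures.

battery_kwh = 100.0             # assumed usable pack capacity
consumption_kwh_per_km = 0.15   # assumed driving consumption (15 kWh/100 km)
trip_hours = 4.0                # assumed hours of driving on one full charge

def range_km(ai_power_w: float) -> float:
    """Range left after subtracting the AI stack's draw over the trip."""
    ai_energy_kwh = ai_power_w / 1000.0 * trip_hours
    return (battery_kwh - ai_energy_kwh) / consumption_kwh_per_km

baseline = range_km(0)
for watts in (1000, 100, 1):    # GPU-class vs. low-power vs. neuromorphic-class
    print(f"{watts:>4} W AI load: ~{range_km(watts):.0f} km "
          f"({baseline - range_km(watts):.1f} km lost)")
```

The point survives the crude assumptions: a kilowatt-class compute stack costs tens of kilometres of range per charge, while a watt-class (neuromorphic) one disappears into the noise.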





Trump hits back at Elon Musk, says he could have made him ‘drop to his knees and beg’

By Lee Brown | Published July 13, 2022; updated 9:43 a.m. ET



Former President Donald Trump has hit back at Elon Musk for calling him too old to run again — claiming he could have made the world’s richest man “drop to [his] knees and beg” when he was in the White House.
Trump used his Twitter rival Truth Social to attack Musk, 51, in an ongoing war of words that on Monday saw the Tesla mogul saying it was “time for Trump to hang up his hat & sail into the sunset.”
“When Elon Musk came to the White House asking me for help on all of his many subsidized projects, whether it’s electric cars that don’t drive long enough, driverless cars that crash, or rocketships to nowhere, without which subsidies he’d be worthless, and telling me how he was a big Trump fan and Republican, I could have said, ‘drop to your knees and beg,’ and he would have done it,” the 45th commander-in-chief claimed.
“Now Elon should focus on getting himself out of the Twitter mess because he could owe $44 billion for something that’s perhaps worthless,” Trump wrote of Musk, who faces legal action after pulling out of his much-hyped offer to buy the social media giant.

Former President Donald Trump on Tuesday escalated his feud with billionaire Elon Musk in a series of Truth Social posts.

“Also, lots of competition for electric cars!” Trump insisted of Tesla.

“P.S. Why was Elon allowed to break the $15 million stock purchase barrier on Twitter without any reporting? That is a very serious breach!” the former president added.

“Have fun Elon and [Jack Dorsey] go to it!” he wrote, referring to the Twitter founder who had supported Musk’s plans to take over.



 
Reactions: Like · Fire · Haha (9 users)

Frangipani

Top 20


Some still images from the video clip, showing the three demos:

1. Dynamic Gesture Recognition Demo on the new Akida 2.0 FPGA:


[Still image: dynamic gesture recognition demo]



2. “Wake Word, ASR & LLM - a TENNs Story: Performance on the Edge using TENNs”
(LLM FPGA tech demonstrator, showing a 1B-parameter LLM trained from scratch at BrainChip and running not in software simulation but on an FPGA, unconnected to the internet and requiring so little power it could run on a watch battery; the FPGA is running at about 1/10th the speed that the ASIC would run at)
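Two of those claims lend themselves to a quick sanity check. A minimal sketch, where the FPGA throughput and the inference power draw are invented placeholders rather than figures from the demo:

```python
# Rough checks on two of the demo claims; both inputs are assumptions.

# 1) "The FPGA is running at about 1/10th the speed that the ASIC would":
fpga_tok_s = 5.0              # hypothetical observed FPGA throughput
asic_tok_s = fpga_tok_s * 10  # implied ASIC throughput at full clock
print(f"If the FPGA does {fpga_tok_s} tokens/s, the ASIC should do ~{asic_tok_s:.0f}")

# 2) "requiring so little power it could run on a watch battery":
cr2032_wh = 0.66              # ~225 mAh at 3 V, a typical CR2032 coin cell
load_mw = 50.0                # hypothetical average inference draw
hours = cr2032_wh / (load_mw / 1000.0)
print(f"A CR2032 could sustain a {load_mw:.0f} mW load for ~{hours:.0f} hours")
```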

[Still images: LLM FPGA tech demonstrator]



3. Audio Denoising Demo (TENNs):

[Still images: audio denoising demo]
 
Reactions: Like · Fire · Love (51 users)

Tezza

Regular
[Quoting Frangipani's demo-stills post above]
So you signed some deals whilst all these customers were swamping you?
 
Reactions: Haha · Like (3 users)

Frangipani

Top 20
Hi Pom and all,
I don't quite get the relevance of gesture recognition.
What are the practical applications it addresses?
I can see that it may be a handy aid for deaf people who sign, and perhaps in the vacuum of space, where sound has no medium to propagate in.
Is it something to do with the proposed Nintendo gaming system?
I am not familiar with it, so I may be missing something relevant there.
I'd appreciate it if anyone here could enlighten me.

Here’s what I found to be a good overview:


Some excerpts:

What is Gesture Recognition?

Gesture recognition refers to the technology that interprets human gestures, such as hand movements, facial expressions, or body language, through mathematical algorithms. It enables humans to interact with machines and computers without using mechanical devices like keyboards, mice, or touchscreens. Gesture recognition works by using cameras and sensors to pick up movements from parts of the body like hands or the face. These movements are turned into digital data that computers can understand.

(…)

Gesture Recognition and Detection Technologies

  • Sensor-Based Hand Gesture Recognition: A sensor-based gesture recognition program detects and analyses human gestures. This can be accomplished using a variety of sensors, including cameras, infrared sensors, and accelerometers. These sensors gather information about the movement and location of a person's body or limbs, which the algorithm subsequently utilizes to recognize specific motions.
  • Vision-Based Hand Gesture Recognition: A vision-based gesture recognition system detects and interprets motions using cameras or other visual sensors. The cameras collect photos or videos of the user's gestures, which are then analyzed and identified using computer vision and machine learning techniques.

Gesture Recognition Examples and Uses

  • Smart TVs: Modern smart TVs use gesture recognition, allowing viewers to switch channels, adjust the volume, or browse through menus with simple hand movements. This means you don’t always need to use a remote control, making it more convenient and accessible.
  • Home Automation Systems: In smart homes, gesture recognition enhances user interaction by enabling control over the home environment. For instance, waving your hand can turn lights on or off, adjust the thermostat, or manage your home entertainment systems, integrating seamlessly with smart home technology for improved convenience and energy efficiency.
  • Gaming Consoles: Devices like the Microsoft Kinect have transformed gaming, providing a motion-controlled gaming experience where players use their body movements to interact with the game. This adds a level of physical activity and immersion to gaming, making it more engaging and interactive.
  • Automotive: Modern cars incorporate gesture recognition for safer and more convenient control of various features. Drivers can execute commands like adjusting the stereo volume, changing air conditioning settings, or answering phone calls with simple hand gestures, minimizing distractions and enhancing focus on driving.
  • Virtual Reality (VR) and Augmented Reality (AR): These technologies heavily rely on gesture recognition for user interaction. In VR and AR environments, users can manipulate objects, navigate menus, or control applications through gestures, creating a more immersive and interactive experience without needing physical controllers.
  • Kitchen Appliances: Advanced kitchen gadgets are adopting gesture recognition, allowing for hands-free operation. For example, with a wave of your hand, you can operate microwaves, ovens, or smart faucets, adding convenience and hygiene to cooking and kitchen management.
(…)

Conclusion

Gesture recognition is a technology that allows devices to understand and respond to human movements. Using advanced machine learning algorithms like CNNs and SVMs, it transforms physical gestures into digital commands, making interaction with gadgets more intuitive and seamless. This technology enhances user experience in smart homes, gaming, automotive, and virtual reality, among other areas. As we move towards more interactive and user-friendly technologies, gesture recognition stands out as a key player in bridging the gap between humans and machines, making our interactions more natural and efficient.
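To make the sensor-based pipeline above concrete, here is a minimal sketch of the classic approach the article describes: windowed accelerometer data, a few hand-crafted features, and an SVM classifier. This is my own illustration on synthetic data, not BrainChip or article code; a spiking, event-based implementation like Akida's would look quite different:

```python
# Minimal sensor-based gesture recognition sketch:
# accelerometer windows -> simple statistical features -> SVM.
# Synthetic data stands in for real recordings; illustrative only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_gesture(kind: str, n: int = 100) -> np.ndarray:
    """Fake 3-axis accelerometer window (n samples) for one gesture."""
    t = np.linspace(0.0, 1.0, n)
    if kind == "wave":   # side-to-side oscillation on the x axis
        sig = np.stack([np.sin(8 * np.pi * t), 0.1 * t, 0.0 * t], axis=1)
    else:                # "push": a single forward spike on the z axis
        sig = np.stack([0.0 * t, 0.0 * t, np.exp(-((t - 0.5) ** 2) / 0.01)], axis=1)
    return sig + 0.1 * rng.normal(size=(n, 3))

def features(window: np.ndarray) -> np.ndarray:
    """Per-axis mean, std and peak-to-peak: nine features per window."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           np.ptp(window, axis=0)])

X = np.array([features(make_gesture(k)) for k in ("wave", "push") * 200])
y = np.array([0, 1] * 200)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

clf = SVC(kernel="rbf").fit(Xtr, ytr)
print(f"Held-out accuracy: {clf.score(Xte, yte):.2f}")
```

Swap the synthetic generator for real IMU data and the same shape of pipeline applies; the vision-based variant the article mentions replaces the feature step with a CNN over camera frames.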





Apart from the use cases listed above, human-robot interaction comes to mind - think of the proof-of-concept the researchers from Fraunhofer HHI’s Wireless Communications and Networks Department demonstrated with the help of Spot, the robot dog, as part of 6G-RIC (Research and Innovation Cluster),
funded by Germany’s Federal Ministry of Education and Research:



[Image: the Spot robot dog proof-of-concept]



Human-robot interaction via gesture recognition is also of particular interest in the healthcare sector. Halfway through this August 2024 post
https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-433491, I summarised a fascinating podcast I had listened to.
Here are some excerpts:

“I chanced upon an intriguing German-language podcast (Feb 1, 2024) titled “6G und die Arbeit des 6G-RIC” (“6G and the work of the 6G-RIC”) with Slawomir Stanczak as guest, who is Professor for Network Information Theory at TU Berlin, Head of Fraunhofer HHI’s Wireless Communications and Networks Department as well as Coordinator of the 6G Research and Innovation Cluster (6G-RIC):

https://www.ip-insider.de/der-nutze...ellschaft-a-cf561755cde0be7b2496c94704668417/

(…)

From 17:12 min onwards, the podcast host picks up the topic of connected robotics and mentions a collaboration with Charité Universitätsmedizin Berlin, which is Germany’s biggest (and very renowned) university hospital, regarding the development of nursing robots and their control via 6G.

Stanczak confirms this and shares with his listeners that they are in talks with Charité doctors in order to simplify certain in-hospital processes and especially to reduce the workload on staff. Two new technological 6G features are currently being discussed: 1. collaborative robots and 2. integrated communication and sensing (ICAS).

Stanczak and his colleagues were told that apart from the global nursing shortage we are already facing, it is also predicted that we will suffer a shortage of medical doctors in the years to come, so the researchers were wondering whether robots could possibly compensate for this loss.

The idea is to connect numerous nursing robots in order to coordinate them and also for them to communicate with each other and cooperate efficiently on certain tasks - e.g., comparatively simple ones such as transporting patients to the operating theatre or serving them something to drink [of a non-alcoholic nature, I presume 😉]. But the researchers even envision complex tasks such as several robots collaborating on turning patients in bed.

Telemedicine will also become more important in the future, such as surgeons operating remotely with the help of an operating robot [you may have heard about the da Vinci Surgical System manufactured by Intuitive Surgical], while being in a totally different location.
[Something Stanczak didn’t specifically mention, but came to my mind when thinking of robot-control via gesture recognition in a hospital setting, is the fact that it would be contactless and thus perfect in an operating theatre, where sterile conditions must be maintained.] (…)”


Think of a surgeon using hand gestures during an operation to instruct a medical assistant robot to pass him/her the correct surgical instruments.



Then there is the whole field of industrial robots.
Fortiss, for example, has an ongoing project in collaboration with NEURA Robotics and TU Chemnitz called CORINNE (Cobots’ Relational Interface with Neuromorphic Networks and Events) that “aims to build robots that can recognise and respond to gestures (known or unknown), to interact with humans on welding tasks”. They are using Loihi for that project, which runs from April 2024 to March 2026, in case you wondered.

https://www.fortiss.org/en/research/projects/detail/corinne


[Screenshots: fortiss CORINNE project page]


So while gesture recognition and neuromorphic technology undoubtedly make a fruitful liaison, we as BRN shareholders won’t get to taste the sweetness of that ripe fruit until customers actually start signing on the dotted line.
 
Reactions: Like · Love (18 users)

Frangipani

Top 20
Reactions: Like · Fire · Love (24 users)

Frangipani

Top 20


[Screenshot: LinkedIn post in German]


English translation:

[Screenshot: English translation of the LinkedIn post]




Very positive article in German about NEXA, with some additional info we didn't know yet:

The LinkedIn post's English translation already mentioned the smart glasses' liquid-crystal technology, which protects against light-induced triggers of epileptic seizures by darkening harmful light stimuli within milliseconds.

The article below also says that the current prototype apparently only functions at temperatures below 26 degrees Celsius (78.8 degrees Fahrenheit) and that the Onsor team are working on fixing this.

Yeah, not exactly ideal in Oman’s hot climate, unless you stick to buildings, cars or public transport with AC… 😊



[Screenshots: the German article about NEXA]
 
Reactions: Like · Love · Fire (22 users)

Frangipani

Top 20

[Photos from the BrainChip booth]


The two gentlemen standing next to Alf Kuchenbuch and Gilles Bézard in the top picture are Jules Lecomte and Axel von Arnim from fortiss, by the way!
Great to see them visiting the BrainChip booth!
Coincidentally, I had posted another picture with Jules Lecomte in it just two hours ago, regarding the CORINNE project… Or was it telepathy? 😉

Anyway, today was definitely not Jules Lecomte’s first encounter with Akida:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-447484

[Screenshots from the linked post]
 
Reactions: Like · Fire · Love (32 users)

Iseki

Regular

[Quoting Frangipani's post above about the fortiss visitors at the booth]
Great telepathy! Well done!
BTW, do we know which FPGA chip we are using in our demo?
 
Reactions: Like (2 users)

Iseki

Regular
We have smartwatches that show us how stressed we are already… Moving into an unknown area like seizures is huge; people still take meds and have attacks. These glasses aren't some gimmick like the Ray-Ban Meta ones, which do exactly what a smartphone or smartwatch does.

Proving a tech works from top to bottom is what BrainChip are doing, not making some gimmick that will have a quick win. We want long-term inventions with Akida that can be built upon and last for years and decades to come.

I think you've nailed it. Put Akida into a smartwatch today, and cell phones will follow.
 
Reactions: Like (4 users)

jtardif999

Regular
Hi Smooth,

That just puts into perspective what T......Fact Finder allegedly said: that he wouldn't sell BrainChip shares for anything less than $40.00 a share (I assume that, if that is actually what he said, he was referring to AUD).

That would value our company as a stand-alone operator at around 84 billion AUD (roughly 2.1 billion shares on issue × $40).

As a great Australian quote from a movie many would remember puts it: "Tell him he's dreaming!"

I do know of someone who once upon a time mentioned that 10.5 billion AUD would be fair value based on today's shares listed on market.

We ALL KNOW WE HAVE SOMETHING SPECIAL, but sadly the market is still catching up. Being allegedly 3-plus years ahead has presented our company with some major headwinds; it's never an easy gig being so far ahead of the general masses, but once adoption commences, we are really positioned at the top of the grid... let the race commence.

Regards...........Tech.
I have always thought that "3 years ahead" doesn't really make sense in context. We have differentiated, patented tech that the world is going to need at some stage; "3 years ahead" doesn't fit that narrative, imo.
 
Reactions: Like (3 users)

Dijon101

Regular
Gesture recognition: where science fiction becomes reality…

Can't help but think of Minority Report, and of making traditional interfaces like the keyboard and mouse obsolete.
 
Reactions: Like · Love · Fire (8 users)

Frangipani

Top 20
I noticed Eric Gallo liking one of today’s BrainChip posts on LinkedIn.

Remember him?






Turns out he is no longer with Accenture - instead, he has been working for a company called SpikeSense Labs since December 2024.

(At first, I inferred he must have founded or co-founded SpikeSense Labs, since he describes himself as being “self-employed”; however, wouldn’t labelling himself “staff” in that case be rather puzzling? 🤔)


[Screenshots: Eric Gallo's LinkedIn profile and SpikeSense Labs]
 

Reactions: Like · Fire · Wow (18 users)

HopalongPetrovski

I'm Spartacus!
[Quoting Frangipani's gesture-recognition post above]
Thank you, Frangipani, for sharing this.
Gesture recognition, as another or complementary mode of interface, certainly has a place, and in certain circumstances it will be the best medium of interaction between our organic intelligence and our current silicon manifestations or simulacrums, as well as the advanced and superseding artificial intelligences we are striving to bring into existence.
While I thought just "speaking" to instruct a device would be simpler and less energy-intensive, there will be noisy environments that make this difficult. Here especially, where traditional keypads are not practical or are an impediment, gesture recognition will have a place.
I guess the logical next step is some kind of neural interface, as proposed by Musk with Neuralink (and others). Particularly as robotics advances and becomes more mainstream, the lines between us will tend to blur and merge as we become something beyond merely biological machines and embrace other mediums of being.
 
Reactions: Like (5 users)

manny100

Regular
We already knew that Gen 2/TENNs was well ahead of the original ChatGPT in some facets.
Is the FPGA demo a repeat of that for LLMs, or a significant improvement?
Tony L did not specify.
 
  • Like
Reactions: 2 users

7für7

Top 20
I hope we will give the shorters some gesture recognition soon, if you know what I mean…
 
Reactions: Haha · Like · Fire (13 users)