BRN Discussion Ongoing

Gee, very low volume so far in the first hour of trading. Plus it's only averaging about 5,000 shares per trade, heaps with 3 digits or fewer.

There's also a big 3 million buy order at 18c at the moment, interesting.
 
  • Fire
  • Like
  • Thinking
Reactions: 4 users
Interesting
 

Attachments

  • 8E983C62-8A0F-4693-8A83-2F1E5EAD5D6A.jpeg
    8E983C62-8A0F-4693-8A83-2F1E5EAD5D6A.jpeg
    131.3 KB · Views: 212
  • B3551A3C-AA17-4560-B546-D4DB385B1DB1.jpeg
    B3551A3C-AA17-4560-B546-D4DB385B1DB1.jpeg
    117 KB · Views: 213
  • Like
  • Fire
Reactions: 4 users
Mercedes will be talking about their new software in 48 hours
 

Attachments

  • 102EAFEE-40AB-46FC-82C0-391B53D05263.jpeg
    102EAFEE-40AB-46FC-82C0-391B53D05263.jpeg
    232.3 KB · Views: 149
  • EA394523-FF98-400B-BAA6-DF6BF070072C.jpeg
    EA394523-FF98-400B-BAA6-DF6BF070072C.jpeg
    142.5 KB · Views: 147
  • Like
  • Fire
  • Haha
Reactions: 12 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I'm reading rumours about the Switch 2 having a chat function. If it also has some sort of voice recognition function, that could be where they utilise Akida?


Hi JB49,

Are you talking about the tech details that have been revealed recently in the FCC filings? The filings show there is Near Field Communication (NFC) support through a Radio Frequency Identification (RFID) feature in the right Joy-Con of the console.

I asked ChatGPT for its thoughts and here's the response I received, which should all be taken with a grain of salt until we find out more when it's officially revealed on April 2.



Screenshot 2025-03-12 at 11.21.54 am.png



Screenshot 2025-03-12 at 11.22.02 am.png
 
  • Like
  • Fire
  • Thinking
Reactions: 33 users

MegaportX

Regular
Hi JB49,

Are you talking about the tech details that have been revealed recently in the FCC filings? The filings show there is Near Field Communication (NFC) support through a Radio Frequency Identification (RFID) feature in the right Joy-Con of the console.

I asked ChatGPT for its thoughts and here's the response I received, which should all be taken with a grain of salt until we find out more when it's officially revealed on April 2.



View attachment 79049


View attachment 79050
Love your informative research. Thanks Bravo.
 
  • Like
  • Love
Reactions: 9 users

Rach2512

Regular
Hi JB49,

Are you talking about the tech details that have been revealed recently in the FCC filings? The filings show there is Near Field Communication (NFC) support through a Radio Frequency Identification (RFID) feature in the right Joy-Con of the console.

I asked ChatGPT for its thoughts and here's the response I received, which should all be taken with a grain of salt until we find out more when it's officially revealed on April 2.



View attachment 79049


View attachment 79050

Perhaps!
 

Attachments

  • Screenshot_20250312_103827_Samsung Internet.jpg
    Screenshot_20250312_103827_Samsung Internet.jpg
    506.7 KB · Views: 193
  • Like
Reactions: 8 users

MDhere

Top 20
Could be a good result here: newcomer Anusha Madan, Machine Learning Engineer, has liked the Edge Impulse post on the Qualcomm acquisition.


Apologies if posted already.

Hope everyone is having a great day! :)
 
  • Like
Reactions: 6 users

IloveLamp

Top 20
Interesting like from Mr Brightfield imo



1000022457.jpg
1000022462.jpg
 
  • Like
Reactions: 9 users

7für7

Top 20
Could be a good result here: newcomer Anusha Madan, Machine Learning Engineer, has liked the Edge Impulse post on the Qualcomm acquisition.


Apologies if posted already.

Hope everyone is having a great day! :)
This link leads me nowhere… 😳 it says the site doesn't exist
 

MDhere

Top 20
This link leads me nowhere… 😳 it says the site doesn't exist
Ah ok, it was the LinkedIn post from Edge Impulse announcing the Qualcomm acquisition. Sorry the link didn't work. If you have LinkedIn, just look it up and you will see that Anusha Madan from BrainChip liked it.
 
  • Like
Reactions: 3 users

7für7

Top 20
Ah ok, it was the LinkedIn post from Edge Impulse announcing the Qualcomm acquisition. Sorry the link didn't work. If you have LinkedIn, just look it up and you will see that Anusha Madan from BrainChip liked it.
Ok… so I have to do it by myself? That’s kind of… anyway… thanks for NOTHING!!!!!

1741750625508.gif



Just kidding, I will check later 😂
 
  • Haha
Reactions: 3 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Interesting like from Mr Brightfield imo



View attachment 79054 View attachment 79055


Remember the podcast in January when Steve mentioned he was in ongoing discussions with manufacturers of smart glasses? He said they are working with those manufacturers to integrate more advanced algorithms into their products. 👓
 
  • Like
  • Love
  • Wow
Reactions: 20 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Speaking of smart glasses, what have we here?

Fortiss is actively developing smart glasses prototypes incorporating neuromorphic technology through their EMMANÜELA project, which focuses on energy-efficient human-machine interaction in augmented and virtual reality (AR/VR) settings. This initiative leverages neuromorphic sensor technology and computing to enhance immersive digital experiences.

Can't find anything to suggest a direct collaboration between Fortiss and BrainChip at this time, so will keep it on the watch list.


Screenshot 2025-03-12 at 2.55.07 pm.png




Screenshot 2025-03-12 at 2.55.16 pm.png


 
  • Like
  • Fire
  • Love
Reactions: 24 users

TECH

Regular
1:45 PM - 2:15 PM
Information
Dr. Anthony Lewis
Brainchip

Fast Online Recognition of Gestures using Hardware Efficient Spatiotemporal Convolutional Networks via Codesign​

The Temporal Neural Networks (TENNs) developed by Brainchip can be used to tackle a wide range of vision tasks, including object detection, eye tracking and gesture recognition. Here, we will show how the codesign of model architecture, training pipeline and hardware implementation can combine to achieve SOTA performance, using a gesture recognition task example.
The TENNs architecture leverages multiple techniques to improve its efficiency on compatible hardware (such as the Akida chip). First, although effectively offering a 3D convolution, it uses spatially and temporally separable convolutions to make the model lighter in parameter count with equivalent computational power. Second, when deployed on dedicated hardware, temporal inputs are buffered efficiently to minimize memory usage and data movement. Finally, it is possible to reduce model computation even further by adding regularization to boost sparsity of information transiting in the already slim network (achieving more than 90% average activation sparsity in some layers) and thus further improve the efficiency on compatible hardware.
We apply a lightweight TENN model to a gesture recognition task, showing that it can accurately classify the movements performed by a range of actors with SOTA accuracy. The efficiency of the model is then pushed further with virtually no cost to accuracy by applying regularization of activations.

Looks like Tony will be taking the stage and delivering some magic on behalf of the Brainchip Family !!

The talk is just a little over 7.5 hours from now, so work out your own time zone if you're interested. I am happy he is attending this event; yes, it's a team thing, but as our CTO, Tony is now our leader on the technical side, now that Peter has retired.

Come on Brainchip, we are worth way more than our current ASX share price indicates, it's an absolute joke.......Tech x
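
For anyone wondering what the "spatially and temporally separable convolutions" and activation-sparsity regularisation in that abstract actually mean, here's a minimal PyTorch sketch of the general idea. To be clear, this is my own illustration and not BrainChip's TENNs code; the class and function names are made up, and a real streaming implementation would buffer the temporal dimension rather than padding a whole clip.

```python
import torch
import torch.nn as nn

class SeparableSpatioTemporalBlock(nn.Module):
    """Factorizes a full (T x K x K) 3D convolution into a spatial-only conv
    followed by a temporal-only conv, cutting the parameter count while
    keeping an effective 3D receptive field."""
    def __init__(self, in_ch, out_ch, k_spatial=3, k_temporal=5):
        super().__init__()
        # Spatial-only 3D conv: kernel (1, k, k)
        self.spatial = nn.Conv3d(in_ch, out_ch,
                                 kernel_size=(1, k_spatial, k_spatial),
                                 padding=(0, k_spatial // 2, k_spatial // 2))
        # Temporal-only 3D conv: kernel (t, 1, 1)
        self.temporal = nn.Conv3d(out_ch, out_ch,
                                  kernel_size=(k_temporal, 1, 1),
                                  padding=(k_temporal // 2, 0, 0))
        self.act = nn.ReLU()

    def forward(self, x):
        # x: (batch, channels, time, height, width)
        return self.act(self.temporal(self.act(self.spatial(x))))

def activation_sparsity_penalty(activations, weight=1e-4):
    # L1 penalty pushing most post-ReLU activations to exactly zero,
    # which translates to fewer events/operations on sparsity-aware hardware.
    return weight * activations.abs().mean()

# Toy usage: a 16-frame, 64x64, 2-channel clip
block = SeparableSpatioTemporalBlock(in_ch=2, out_ch=8)
clip = torch.randn(1, 2, 16, 64, 64)
features = block(clip)
reg = activation_sparsity_penalty(features)  # added to the task loss during training
print(features.shape, float(reg))
```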
 
  • Like
  • Love
  • Fire
Reactions: 57 users

7für7

Top 20
1:45 PM - 2:15 PM
Information
Dr. Anthony Lewis
Brainchip

Fast Online Recognition of Gestures using Hardware Efficient Spatiotemporal Convolutional Networks via Codesign​

The Temporal Neural Networks (TENNs) developed by Brainchip can be used to tackle a wide range of vision tasks, including object detection, eye tracking and gesture recognition. Here, we will show how the codesign of model architecture, training pipeline and hardware implementation can combine to achieve SOTA performance, using a gesture recognition task example.
The TENNs architecture leverages multiple techniques to improve its efficiency on compatible hardware (such as the Akida chip). First, although effectively offering a 3D convolution, it uses spatially and temporally separable convolutions to make the model lighter in parameter count with equivalent computational power. Second, when deployed on dedicated hardware, temporal inputs are buffered efficiently to minimize memory usage and data movement. Finally, it is possible to reduce model computation even further by adding regularization to boost sparsity of information transiting in the already slim network (achieving more than 90% average activation sparsity in some layers) and thus further improve the efficiency on compatible hardware.
We apply a lightweight TENN model to a gesture recognition task, showing that it can accurately classify the movements performed by a range of actors with SOTA accuracy. The efficiency of the model is then pushed further with virtually no cost to accuracy by applying regularization of activations.

Looks like Tony will be taking the stage and delivering some magic on behalf of the Brainchip Family !!

The talk is just a little over 7.5 hours from now, so work out your own time zone if you're interested. I am happy he is attending this event; yes, it's a team thing, but as our CTO, Tony is now our leader on the technical side, now that Peter has retired.

Come on Brainchip, we are worth way more than our current ASX share price indicates, it's an absolute joke.......Tech x
YEAHHHH YOUR STATEMENT IS GENIUS LEVEL THINKING … GOOO BRAINCHIIIIP

1741756876388.gif
 
  • Like
  • Haha
Reactions: 5 users

toasty

Regular
Interesting we've closed up while the index is down quite a bit...........and the buy/sell ratio looks very positive..........
 
  • Like
Reactions: 22 users

Labsy

Regular
Speaking of smart glasses, what have we here?

Fortiss is actively developing smart glasses prototypes incorporating neuromorphic technology through their EMMANÜELA project, which focuses on energy-efficient human-machine interaction in augmented and virtual reality (AR/VR) settings. This initiative leverages neuromorphic sensor technology and computing to enhance immersive digital experiences.

Can't find anything to suggest a direct collaboration between Fortiss and BrainChip at this time, so will keep it on the watch list.


View attachment 79058



View attachment 79059

Great find.... If nothing else, you would hate to be developing the latest Meta glasses and watching your competition advertise that they are implementing the latest cutting-edge neuromorphic tech, regardless of whose tech it is, while you are not... It's a win.
 
  • Like
  • Fire
Reactions: 15 users

Rach2512

Regular
A couple of late ones, trying to get in before tomorrow's news!
 

Attachments

  • Screenshot_20250312_155803_Samsung Internet.jpg
    Screenshot_20250312_155803_Samsung Internet.jpg
    327.8 KB · Views: 135
  • Like
Reactions: 3 users

Luppo71

Founding Member
Speaking of smart glasses, what have we here?

Fortiss is actively developing smart glasses prototypes incorporating neuromorphic technology through their EMMANÜELA project, which focuses on energy-efficient human-machine interaction in augmented and virtual reality (AR/VR) settings. This initiative leverages neuromorphic sensor technology and computing to enhance immersive digital experiences.

Can't find anything to suggest a direct collaboration between Fortiss and BrainChip at this time, so will keep it on the watch list.


View attachment 79058



View attachment 79059

The same company is using Loihi for the camera and many other things.

ELEANOR​

Following on from the INRC3 project, where a robotic arm is taught to insert an object using only force feedback, the ELEANOR project (Energy- and Latency-Efficient Object Insertion Using a Robotic Arm Equipped with an Event-Based Camera and Neuromorphic Hardware) uses an event-based camera to make the arm approach the slot. Optical flow and 3D reconstruction via Intel's Loihi research chip are used for this purpose.

IBM and Loihi are mentioned everywhere on their website, but there's no BrainChip or Akida.
Let's hope they are trying something new.
 
  • Like
Reactions: 4 users

7für7

Top 20
The same company is using Loihi for the camera and many other things.

ELEANOR​

Following on from the INRC3 project, where a robotic arm is taught to insert an object using only force feedback, the ELEANOR project (Energy- and Latency-Efficient Object Insertion Using a Robotic Arm Equipped with an Event-Based Camera and Neuromorphic Hardware) uses an event-based camera to make the arm approach the slot. Optical flow and 3D reconstruction via Intel's Loihi research chip are used for this purpose.

IBM and Loihi are mentioned everywhere on their website, but there's no BrainChip or Akida.
Let's hope they are trying something new.
BrainChip is, as we know, involved in the Intel accelerator program… so who knows, anything is possible. But as always, we can only speculate… 🙆🏻‍♂️
 
  • Like
Reactions: 4 users