BRN Discussion Ongoing



1X's 3rd iteration of NEO.
I think it's actually more uncanny the closer it gets to moving like a real person, in a big sock...

These don't use the rigid mechanics of other humanoid robots.

Reminds me of an excellent tv show I watched a few years ago

 
  • Wow
  • Like
Reactions: 4 users

HopalongPetrovski

I'm Spartacus!


1X's 3rd iteration of NEO.
I think it's actually more uncanny the closer it gets to moving like a real person, in a big sock...

These don't use the rigid mechanics of other humanoid robots.

Interesting stuff, Dingo.
I particularly like the fact that it can be "driven" remotely.
This may be a previous version, but even so, they are getting there.

 
  • Wow
  • Like
Reactions: 4 users
Reminds me of an excellent tv show I watched a few years ago


Yeah, watched that and enjoyed it.

It reminds me of a Red Dwarf episode where there was an earlier "model" of Kryten, and Rimmer questioned why the older model was more "realistic" looking.

Kryten explained that they changed direction because people were irked the more lifelike they became.

The only ones that will be made more lifelike are "companion" robots... 😆
 
  • Haha
  • Like
Reactions: 5 users

Esq.111

Fascinatingly Intuitive.
Morning Chippers ,

Buy / Sell spread

Prices table
Buyers Number | Buyers Volume | Buyers Price $ | Sellers Price $ | Sellers Volume | Sellers Number
19 | 339,008 | 0.200 | 0.405 | 149,232 | 4
4 | 95,620 | 0.195 | 0.410 | 742,698 | 14
6 | 82,131 | 0.190 | 0.415 | 220,000 | 5
1 | 100,000 | 0.185 | 0.420 | 219,082 | 17
5 | 48,001 | 0.180 | 0.425 | 10,000 | 1
2 | 10,900 | 0.175 | 0.430 | 204,995 | 5
7 | 70,869 | 0.170 | 0.435 | 357,881 | 8
2 | 56,000 | 0.165 | 0.440 | 498,256 | 15
11 | 105,195 | 0.160 | 0.445 | 125,428 | 4
2 | 25,354 | 0.155 | 0.450 | 447,903 | 29
17 | 410,549 | 0.150 | 0.455 | 20,000 | 1
1 | 15,000 | 0.145 | 0.460 | 250,000 | 1
1 | 7,414 | 0.135 | 0.470 | 6,500 | 1
3 | 305,000 | 0.100 | 0.490 | 4,850 | 1
1 | 6,632 | 0.098 | 0.500 | 232 | 1
1 | 500,000 | 0.096 | 0.510 | 6,620 | 1
- | - | - | 0.600 | 8 | 1
- | - | - | 30.500 | 49 | 1

Early morning buy / sell ......... Sell $30.50 each for 49 units ...... 😁.

Hopefully an eventful week ahead.

Regards ,
Esq.
 
  • Like
  • Haha
  • Fire
Reactions: 14 users

CHIPS

Regular
Morning Chippers ,

Buy / Sell spread

Prices table
Buyers Number | Buyers Volume | Buyers Price $ | Sellers Price $ | Sellers Volume | Sellers Number
19 | 339,008 | 0.200 | 0.405 | 149,232 | 4
4 | 95,620 | 0.195 | 0.410 | 742,698 | 14
6 | 82,131 | 0.190 | 0.415 | 220,000 | 5
1 | 100,000 | 0.185 | 0.420 | 219,082 | 17
5 | 48,001 | 0.180 | 0.425 | 10,000 | 1
2 | 10,900 | 0.175 | 0.430 | 204,995 | 5
7 | 70,869 | 0.170 | 0.435 | 357,881 | 8
2 | 56,000 | 0.165 | 0.440 | 498,256 | 15
11 | 105,195 | 0.160 | 0.445 | 125,428 | 4
2 | 25,354 | 0.155 | 0.450 | 447,903 | 29
17 | 410,549 | 0.150 | 0.455 | 20,000 | 1
1 | 15,000 | 0.145 | 0.460 | 250,000 | 1
1 | 7,414 | 0.135 | 0.470 | 6,500 | 1
3 | 305,000 | 0.100 | 0.490 | 4,850 | 1
1 | 6,632 | 0.098 | 0.500 | 232 | 1
1 | 500,000 | 0.096 | 0.510 | 6,620 | 1
- | - | - | 0.600 | 8 | 1
- | - | - | 30.500 | 49 | 1

Early morning buy / sell ......... Sell $30.50 each for 49 units ...... 😁.

Hopefully an eventful week ahead.

Regards ,
Esq.

Always be prepared! 😁
I think I will place a sell order at 100 EUR tomorrow 🥳. I will sell all my shares at that SP and start a good life.
I have more than 200,000 shares 😅, so a life of luxury should be possible.

Just dreaming.
 
  • Like
  • Love
  • Haha
Reactions: 10 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Screenshot 2025-02-24 at 10.50.50 am.png

EXTRACT

Screenshot 2025-02-24 at 11.05.29 am.png

Screenshot 2025-02-24 at 10.51.52 am.png

Screenshot 2025-02-24 at 10.54.24 am.png

Screenshot 2025-02-24 at 10.54.35 am.png

Screenshot 2025-02-24 at 10.54.51 am.png

Here is a snippet from a podcast published online on 7 August 2024, in which Satalia's CEO & WPP Chief AI Officer, Daniel Hulme, talks about neuromorphic computing and Spiking Neural Networks.




 
  • Like
  • Fire
  • Love
Reactions: 33 users

7für7

Top 20
The fact that robots are built like the human body once again shows how efficiently we are anatomically designed to handle multiple tasks. We may not be able to run like a jaguar, but a jaguar will never be able to create anything with its paws. We may not be able to climb trees like a monkey, but in return we can build intricate clockwork with our hands, and so on. And most importantly, we have intelligence (which, as has been proven, some people clearly need an update for, or at least need their screws tightened, cough cough).

Nevertheless, one might wonder why humans want to artificially create something that humans can already do. Well, it’s meant to make human life easier and to take over tasks that have so far been dangerous for us: in research, on space missions to oxygen-poor planets or airless asteroids, at home in household chores, and so on.

It’s a luxury granted to us by our technological progress in this new millennium and century. We are living in a time that people in the past could only see in science fiction films or read about in books. I’m glad to be living in such an era, one that paves the way for the next technological revolution/evolution. And I’m invested in one of the great companies building that future.





From now on, things will move forward at breakneck speed! I’m excited for what’s to come!
 
Last edited:
  • Like
  • Love
Reactions: 11 users

Attitude estimation system and attitude estimation method​


Current Assignee: MegaChips Corp

Abstract​

To estimate a user's posture, including a direction of the user's body, using a small number of sensors. SOLUTION: A posture estimation system comprises a measurement member 1 located at any part of four limbs of a user, and a posture acquisition part 520 for acquiring the posture of the measurement member. The measurement member includes an acceleration sensor 14 and a gyro sensor 15. The posture acquisition part 520 includes a reference coordinate determination part 521 for setting a reference coordinate system of the measurement member based on the user's operation of making the measurement member face a target 3, and an attitude estimation part 522 for estimating an attitude of the measurement member relative to the target by acquiring detection values Da and Dr output from the acceleration sensor and the gyro sensor in response to the user's operation of changing the attitude of the measurement member.

View attachment 77936

View attachment 77937

GPT analysis:


This patent describes a posture estimation system that determines a user's body orientation using a minimal number of sensors. It is primarily designed for gaming, VR, fitness tracking, and motion-based interaction systems.




1. Purpose & Use


The system aims to estimate the posture and orientation of a user’s body efficiently, using a small number of sensors instead of a full-body motion capture setup. This is particularly useful for:


  • Gaming – Motion-based gameplay using handheld controllers.
  • Virtual Reality (VR) & Augmented Reality (AR) – Enhancing user movement tracking.
  • Fitness & Rehabilitation – Monitoring body movement for training or therapy.
  • Human-Computer Interaction – Intuitive gesture-based controls.



2. Sensor Technologies


The system uses two key inertial sensors, embedded in a measuring device (such as a handheld controller or a wearable limb sensor):


  1. Acceleration Sensor (Accelerometer)
    • Measures movement acceleration in three axes (X, Y, Z).
    • Helps determine tilt and linear motion.
  2. Gyro Sensor (Gyroscope)
    • Measures rotational velocity in three axes (yaw, pitch, roll).
    • Tracks rotational movement and orientation changes over time.
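
To make the two sensor roles just listed concrete, here is a minimal sketch of how such readings are commonly interpreted. This is not code from the patent; the axis convention, function names, and sample values are illustrative assumptions. Tilt (pitch/roll) is recovered from the accelerometer's gravity vector, while the gyroscope's angular rate is integrated over time (and therefore drifts without correction).

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate pitch and roll (radians) from a 3-axis accelerometer.

    Assumes the device is roughly static, so the reading is dominated
    by gravity; the axis convention here is illustrative only.
    """
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

def integrate_gyro(angle, rate, dt):
    """Propagate an angle using one gyroscope rate sample (rad/s).

    Pure integration drifts over time, which is why the accelerometer
    (or another reference) is later used to correct it.
    """
    return angle + rate * dt

# Example: a controller held almost level, rolling slowly, sampled at 100 Hz.
pitch, roll = tilt_from_accel(ax=0.1, ay=0.0, az=9.7)
roll = integrate_gyro(roll, rate=0.05, dt=0.01)
print(round(math.degrees(pitch), 2), round(math.degrees(roll), 3))
```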

These sensors are typically placed in:


  • Handheld controllers (left and right hands).
  • Wearable devices (e.g., strapped to feet or arms).
  • Potential expansion to lower body tracking (e.g., sensors on both hands and feet).



3. Processing Technologies & Processor Locations


The system processes sensor data at multiple levels, using different processors located in the controllers and the game console.


A. Processing at the Controller Level (Embedded Processors)


Each controller (or wearable sensor) contains an onboard processor that performs initial data collection and preprocessing:


  • Location: Inside each controller (or wearable sensor).
  • Functions:
    • Collects acceleration and gyroscope data.
    • Filters raw data to reduce noise.
    • Performs preliminary sensor fusion to combine acceleration and rotational data.
    • Communicates with the game console via wireless or wired connection.
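
As a rough illustration of the controller-side preprocessing described in the list above, here is a minimal sketch. The patent does not specify the filtering algorithm or packet format; the exponential moving average, the `ImuSample` layout, and the 24-byte payload are assumptions for illustration only.

```python
from dataclasses import dataclass
import struct

@dataclass
class ImuSample:
    ax: float; ay: float; az: float   # accelerometer, m/s^2
    gx: float; gy: float; gz: float   # gyroscope, rad/s

class LowPassFilter:
    """Exponential moving average used as a simple noise filter."""
    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha
        self.state = None

    def update(self, s: ImuSample) -> ImuSample:
        if self.state is None:
            self.state = s
            return s
        a, p = self.alpha, self.state
        self.state = ImuSample(
            *(a * new + (1 - a) * old
              for new, old in zip((s.ax, s.ay, s.az, s.gx, s.gy, s.gz),
                                  (p.ax, p.ay, p.az, p.gx, p.gy, p.gz))))
        return self.state

def encode_packet(s: ImuSample) -> bytes:
    """Pack a filtered sample into a fixed-size payload (layout is
    hypothetical) for the wireless/wired link to the console."""
    return struct.pack("<6f", s.ax, s.ay, s.az, s.gx, s.gy, s.gz)

lpf = LowPassFilter(alpha=0.2)
filtered = lpf.update(ImuSample(0.1, 0.0, 9.8, 0.01, 0.0, 0.0))
payload = encode_packet(filtered)   # 24 bytes sent on to the console
```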

B. Processing at the Game Console Level (Central Processing)


The main computational processing happens inside the game console:


  • Location: The game console’s central processor (CPU).
  • Functions:
    1. Reference Coordinate System Setup
      • The user performs a calibration motion, aligning the controllers to a fixed target (e.g., display screen).
      • This sets a baseline reference coordinate system.
    2. Posture Estimation
      • The console’s processor integrates accelerometer and gyroscope data from the controllers.
      • Uses sensor fusion algorithms to track movement and correct drift.
    3. Common Coordinate Conversion
      • Since each controller has an independent coordinate system, the console converts them into a unified coordinate system for consistent tracking.
    4. Machine Learning-Based Full Body Estimation
      • The console’s processor runs a machine learning model to estimate full-body posture based on limited sensor data.
      • The model is trained to predict shoulder, arm, and torso positions from hand-held controllers alone.
    5. Adaptive Motion Correction for Different Users
      • The system adjusts for different body sizes by applying acceleration correction algorithms.
      • Example: A child's arm will have different acceleration characteristics than an adult's, so the system scales acceleration values based on user height.
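
The console-side pipeline above (steps 1, 2, 3 and 5; the learned full-body model of step 4 is omitted) can be sketched as follows. This is a hedged illustration, not the patent's implementation: the calibration routine, the complementary-filter constant, and the linear height-scaling rule are all assumptions.

```python
import math

def capture_reference(pitch, roll):
    """Step 1: while the user points the controller at the target
    (e.g. the display), record the current orientation as the origin
    of the reference coordinate system."""
    return {"pitch0": pitch, "roll0": roll}

def complementary_filter(prev_angle, gyro_rate, accel_angle, dt, k=0.98):
    """Step 2: fuse sensors. The gyro integral tracks fast motion but
    drifts; the accelerometer angle is noisy but drift-free. Blending
    them with weight k corrects drift over time."""
    return k * (prev_angle + gyro_rate * dt) + (1 - k) * accel_angle

def to_reference_frame(angle, origin_angle):
    """Step 3: express the estimate relative to the calibrated origin,
    so both controllers can share a common coordinate system."""
    return angle - origin_angle

def scale_acceleration(accel, user_height_m, reference_height_m=1.75):
    """Step 5: a child's shorter arm sweeps a smaller arc, so the same
    gesture produces smaller accelerations; scale toward a reference
    body size (the linear rule here is an assumption)."""
    return accel * (reference_height_m / user_height_m)

# Usage sketch: calibrate once, then update the roll estimate at 100 Hz.
ref = capture_reference(pitch=0.0, roll=0.0)
roll = 0.0
for gyro_rate, accel_roll in [(0.2, 0.01), (0.25, 0.02), (0.1, 0.02)]:
    roll = complementary_filter(roll, gyro_rate, accel_roll, dt=0.01)
    rel_roll = to_reference_frame(roll, ref["roll0"])
print(round(math.degrees(rel_roll), 3), scale_acceleration(2.0, 1.40))
```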



4. Advantages Over Traditional Systems


  • Fewer sensors required (no need for full-body tracking suits).
  • No waist-mounted sensors needed (orientation is inferred from hand-held devices).
  • Cost-effective and power-efficient (less hardware, lower processing demands).
  • Machine learning integration allows accurate full-body tracking with limited data.
  • Adaptable for different users via automated motion scaling.



Brilliant, uiux. One would have to think MegaChips uses BRN in some form here; otherwise, why invest in BRN back in 2020 if the potential wasn't there for one of their biggest clients, Nintendo?
Go BrainChip
 
  • Like
Reactions: 20 users

inston

Observer
Oh boy - I wish I had the money to buy more right now!
 

Attitude estimation system and attitude estimation method​


Current Assignee: MegaChips Corp

Abstract​

To estimate a user's posture, including a direction of the user's body, using a small number of sensors. SOLUTION: A posture estimation system comprises a measurement member 1 located at any part of four limbs of a user, and a posture acquisition part 520 for acquiring the posture of the measurement member. The measurement member includes an acceleration sensor 14 and a gyro sensor 15. The posture acquisition part 520 includes a reference coordinate determination part 521 for setting a reference coordinate system of the measurement member based on the user's operation of making the measurement member face a target 3, and an attitude estimation part 522 for estimating an attitude of the measurement member relative to the target by acquiring detection values Da and Dr output from the acceleration sensor and the gyro sensor in response to the user's operation of changing the attitude of the measurement member.

View attachment 77936

View attachment 77937

GPT analysis:


This patent describes a posture estimation system that determines a user's body orientation using a minimal number of sensors. It is primarily designed for gaming, VR, fitness tracking, and motion-based interaction systems.




1. Purpose & Use


The system aims to estimate the posture and orientation of a user’s body efficiently, using a small number of sensors instead of a full-body motion capture setup. This is particularly useful for:


  • Gaming – Motion-based gameplay using handheld controllers.
  • Virtual Reality (VR) & Augmented Reality (AR) – Enhancing user movement tracking.
  • Fitness & Rehabilitation – Monitoring body movement for training or therapy.
  • Human-Computer Interaction – Intuitive gesture-based controls.



2. Sensor Technologies


The system uses two key inertial sensors, embedded in a measuring device (such as a handheld controller or a wearable limb sensor):


  1. Acceleration Sensor (Accelerometer)
    • Measures movement acceleration in three axes (X, Y, Z).
    • Helps determine tilt and linear motion.
  2. Gyro Sensor (Gyroscope)
    • Measures rotational velocity in three axes (yaw, pitch, roll).
    • Tracks rotational movement and orientation changes over time.

These sensors are typically placed in:


  • Handheld controllers (left and right hands).
  • Wearable devices (e.g., strapped to feet or arms).
  • Potential expansion to lower body tracking (e.g., sensors on both hands and feet).



3. Processing Technologies & Processor Locations


The system processes sensor data at multiple levels, using different processors located in the controllers and the game console.


A. Processing at the Controller Level (Embedded Processors)


Each controller (or wearable sensor) contains an onboard processor that performs initial data collection and preprocessing:


  • Location: Inside each controller (or wearable sensor).
  • Functions:
    • Collects acceleration and gyroscope data.
    • Filters raw data to reduce noise.
    • Performs preliminary sensor fusion to combine acceleration and rotational data.
    • Communicates with the game console via wireless or wired connection.

B. Processing at the Game Console Level (Central Processing)


The main computational processing happens inside the game console:


  • Location: The game console’s central processor (CPU).
  • Functions:
    1. Reference Coordinate System Setup
      • The user performs a calibration motion, aligning the controllers to a fixed target (e.g., display screen).
      • This sets a baseline reference coordinate system.
    2. Posture Estimation
      • The console’s processor integrates accelerometer and gyroscope data from the controllers.
      • Uses sensor fusion algorithms to track movement and correct drift.
    3. Common Coordinate Conversion
      • Since each controller has an independent coordinate system, the console converts them into a unified coordinate system for consistent tracking.
    4. Machine Learning-Based Full Body Estimation
      • The console’s processor runs a machine learning model to estimate full-body posture based on limited sensor data.
      • The model is trained to predict shoulder, arm, and torso positions from hand-held controllers alone.
    5. Adaptive Motion Correction for Different Users
      • The system adjusts for different body sizes by applying acceleration correction algorithms.
      • Example: A child's arm will have different acceleration characteristics than an adult's, so the system scales acceleration values based on user height.



4. Advantages Over Traditional Systems


  • Fewer sensors required (no need for full-body tracking suits).
  • No waist-mounted sensors needed (orientation is inferred from hand-held devices).
  • Cost-effective and power-efficient (less hardware, lower processing demands).
  • Machine learning integration allows accurate full-body tracking with limited data.
  • Adaptable for different users via automated motion scaling.



Number 1 post for me this year

1740366799411.gif
 
  • Like
  • Haha
  • Fire
Reactions: 9 users

TECH

Regular
View attachment 77971

EXTRACT

View attachment 77983

View attachment 77978

View attachment 77979
View attachment 77980

View attachment 77981

Here is a snippet from a podcast published online on 7 August 2024, in which Satalia's CEO & WPP Chief AI Officer, Daniel Hulme, talks about neuromorphic computing and Spiking Neural Networks.






Hi Bravo...some nice posts over the last few days, great work.

After your very convincing post/s highlighting a possible solid connection to Veritone with its aiWARE product/s, I went back through my LinkedIn conversations with Chad Steelberg, who at the time was the CEO and Chair of the Board of Directors, before handing off both titles during 2023 and 2024 to his brother Ryan.

My few conversations with Chad were in 2019/2020, and in September 2020, when I asked about our potential of being embedded within Veritone's aiWARE product/s, he said at the time......
Chad Steelberg 11:12 PM
  • 👏
  • 👍
  • 😊

"Thanks for the note. Not much engagement at this time, but congrats on the progress"

We had issues with Studio, and I'm reasonably comfortable in saying that I think we aren't involved with Veritone in any way.

BUT I HOPE I'M 100% WRONG, AS HAPPENS OFF AND ON :ROFLMAO:

Another interesting point is that, with our 2024 Annual Report due out any day now, I just wonder if the Top 20 may be coupled with the report.... Hopefully there is nothing to rock the boat; we need surprises on the revenue front.

Tech (I know nothing) (y)
 
  • Like
Reactions: 8 users

MegaportX

Regular
Anyone have an idea when NVIDIA and Intel will be reporting next?
 

Iseki

Regular
Looks like MB.OS is a rebranding of QNX out of Waterloo uni.

QNX is already in use in all the major car makers' vehicles - BMW, Bosch, Continental, Dongfeng Motor, Geely, Ford, Honda, Subaru, Toyota, Volkswagen, Volvo, and more.

So, it looks to me that we've been used by MB in a slimy sort of way.

In fact, QNX is a Unix-like, real-time operating system that is embedded not only in cars, but in medical devices, machinery, and other devices. So I wonder if BRN will produce drivers for QNX as well as Linux?

 
  • Sad
Reactions: 1 users
Anyone have an idea when NVIDIA and Intel will be reporting next?
I believe the NVIDIA report is out on Thursday, USA time, according to the CommBank report.
 
  • Like
  • Fire
Reactions: 2 users

uiux

Regular
Looks like MB.OS is a rebranding of QNX out of Waterloo uni.

QNX is already in use in all the major car makers' vehicles - BMW, Bosch, Continental, Dongfeng Motor, Geely, Ford, Honda, Subaru, Toyota, Volkswagen, Volvo, and more.

So, it looks to me that we've been used by MB in a slimy sort of way.

In fact, QNX is a Unix-like, real-time operating system that is embedded not only in cars, but in medical devices, machinery, and other devices. So I wonder if BRN will produce drivers for QNX as well as Linux?



"Looks like MB.OS is a rebranding of QNX out of Waterloo uni."




Where is this info from?
 
  • Like
Reactions: 1 users

uiux

Regular
Looks like MB.OS is a rebranding of QNX out of Waterloo uni.

QNX is already in use in all the major car makers' vehicles - BMW, Bosch, Continental, Dongfeng Motor, Geely, Ford, Honda, Subaru, Toyota, Volkswagen, Volvo, and more.

So, it looks to me that we've been used by MB in a slimy sort of way.

In fact, QNX is a Unix-like, real-time operating system that is embedded not only in cars, but in medical devices, machinery, and other devices. So I wonder if BRN will produce drivers for QNX as well as Linux?



Found something, all good.

 
  • Like
  • Fire
  • Love
Reactions: 11 users

Iseki

Regular
"Looks like MB.OS is a rebranding of QNX out of Waterloo uni."




Where is this info from
I think it is a valid question to ask: is MB really writing their own OS, or is it a rebranding of an existing OS?
I haven't seen MB answer the question, but I do know that back in the day QNX was the OS for real-time data routing. It's from Waterloo Uni, and a quick Google will show that it's already in about 300 million vehicles plus medical devices, and, lo and behold, the owners of QNX, BlackBerry, are suddenly (last month) pushing its public awareness.

I don't have any proof other than the above, which seems logical, and if it's true, we should understand what it means for us.

There are people here on this forum who are currently writing AKD1000 drivers for Windows.

What do you think?
 

Iseki

Regular
Found something all good

Yes, well. As you are probably aware, much of the performance saving may come down to the simple fact that they are running a form of QNX, a microkernel OS.

On the one hand, MB.OS is just a rebranding of QNX, and so we shouldn't expect too much from them.
On the other hand, if MB are smart enough to select QNX, they might be smart enough to select Akida.

Either way, the question remains: why no Akida driver for QNX?
 
  • Like
Reactions: 1 users

uiux

Regular
Yes, well. As you are probably aware, much of the performance saving may come down to the simple fact that they are running a form of QNX, a microkernel OS.

On the one hand, MB.OS is just a rebranding of QNX, and so we shouldn't expect too much from them.
On the other hand, if MB are smart enough to select QNX, they might be smart enough to select Akida.

Either way, the question remains: why no Akida driver for QNX?


If they have based MB.OS on QNX, they would just write their own proprietary driver.


Mercedes are designing MSOCs. They don't need drivers for PCIe cards; they will likely be working with chips on their own interfaces.
 
  • Like
Reactions: 6 users
1740386723720.png


 
  • Like
  • Fire
  • Wow
Reactions: 26 users