BRN Discussion Ongoing

Guzzi62

Regular
"Below 26C not at 26, there is a huge difference and I think we are talking about the surroundings, not body temp, otherwise the glasses are useless."

Sorry, yes that's what I meant, 26°C and below.
Still not right 😛
25.9°C and below!

How is something sitting on the body "not" going to reach external body temperature, unless it's being cooled?..
Large parts of the glasses aren't touching the body, are they?

The bridge of the nose and over the ears are the normal touching/support points; that's not enough contact area to heat up the glasses.

But okay, if the glasses are out in the sun they will get too hot to work, but as I said earlier they should work fine inside an AC-cooled car.
 
  • Like
Reactions: 1 users
"Below 26C not at 26, there is a huge difference and I think we are talking about the surroundings, not body temp, otherwise the glasses are useless."

Sorry, yes that's what I meant, 26°C and below.
Still not right 😛
25.9°C and below!

How is something sitting on the body "not" going to reach external body temperature, unless it's being cooled?..
My opinion only @DingoBorat but the temperature issue might have to do with the glasses using ”Dry electrodes” when contacting the skin.

Guessing here, but the higher temperature might cause the human to sweat, which may interfere with the readings? Unless you're Prince Andrew of course, who doesn't sweat 😂

Obviously the temp issue is not related to Brainchip’s accelerator or its ability to process the data. The quality of data in (readings) will affect the results.

I could be completely off track but that was my first thought on reading the temp problem to overcome.



:)
 
  • Like
  • Haha
  • Fire
Reactions: 7 users
Large parts of the glasses aren't touching the body, are they?

The bridge of the nose and over the ears are the normal touching/support points; that's not enough contact area to heat up the glasses.

But okay, if the glasses are out in the sun they will get too hot to work, but as I said earlier they should work fine inside an AC-cooled car.
I disagree on your points 😛

The 3 contact points "are" enough to raise the temperature of the glasses, to body temperature, why wouldn't they?
Saying that's not enough contact area, doesn't make sense, as there only needs to be "contact" unless they're made from something that insulates them from the body, or wicks away the body heat, quicker than it's being absorbed.

They will work inside an air conditioned car, if they have it blowing at their face..
 
  • Like
Reactions: 2 users
I disagree on your points 😛

The 3 contact points "are" enough to raise the temperature of the glasses, to body temperature, why wouldn't they?
Saying that's not enough contact area, doesn't make sense, as there only needs to be "contact" unless they're made from something that insulates them from the body, or wicks away the body heat, quicker than it's being absorbed.

They will work inside an air conditioned car, if they have it blowing at their face..

Again @DingoBorat I could be wrong, but I think the glasses helping to prevent epilepsy has to do with light hitting the eyes, which is why lots of TV shows warn against watching, as it could cause an epileptic fit. It's that flashes or flickering light hitting the eyes can trigger an epileptic fit.

I think they would be similar to self-tinting glasses or even welding visors, to help prevent that occurring. I don't think that is the cause of the temperature issue.

But I'm not qualified in these areas. They are just my thoughts on it!
 
  • Like
Reactions: 4 users
My opinion only @DingoBorat but the temperature issue might have to do with the glasses using ”Dry electrodes” when contacting the skin.

Guessing here, but the higher temperature might cause the human to sweat, which may interfere with the readings? Unless you're Prince Andrew of course, who doesn't sweat 😂

Obviously the temp issue is not related to Brainchip’s accelerator or its ability to process the data. The quality of data in (readings) will affect the results.

I could be completely off track but that was my first thought on reading the temp problem to overcome.



:)
You can sweat with exertion, below 26°C though too..

But sounds like it's probably got "something" to do with the electrode tech, like you said..
 
  • Like
Reactions: 4 users

Diogenese

Top 20
My opinion only @DingoBorat but the temperature issue might have to do with the glasses using ”Dry electrodes” when contacting the skin.

Guessing here, but the higher temperature might cause the human to sweat, which may interfere with the readings? Unless you're Prince Andrew of course, who doesn't sweat 😂

Obviously the temp issue is not related to Brainchip’s accelerator or its ability to process the data. The quality of data in (readings) will affect the results.

I could be completely off track but that was my first thought on reading the temp problem to overcome.



:)
Hi SG,

Photochromic glasses are temperature-sensitive.

https://www.sciencedirect.com/science/article/abs/pii/S0143720811000957

I would think Akida would not break into a sweat at 45C.
 
  • Like
Reactions: 6 users

Guzzi62

Regular
I disagree on your points 😛

The 3 contact points "are" enough to raise the temperature of the glasses, to body temperature, why wouldn't they?
Saying that's not enough contact area, doesn't make sense, as there only needs to be "contact" unless they're made from something that insulates them from the body, or wicks away the body heat, quicker than it's being absorbed.

They will work inside an air conditioned car, if they have it blowing at their face..
Wrong!

A much larger area of the glasses is in "free" air, not touching the skin.

Those 3 contact points are far from enough to heat up the glasses.

But never mind, we need more info before coming to premature conclusions.
 
  • Like
Reactions: 2 users
Hi SG,

Photochromic glasses are temperature-sensitive.

https://www.sciencedirect.com/science/article/abs/pii/S0143720811000957

I would think Akida would not break into a sweat at 45C.
Thanks @Diogenese, I should have dug a little deeper. Didn't realise that! Gotta love science!

Probably a better outcome for us if that's the temperature issue vs the dry contacts, as hopefully that means "Our" part of the glasses can continue without issue, e.g. data input and processing is not the issue.

Cheers

:)
 
  • Like
Reactions: 3 users
I don't think they are photochromic, because they work off UV light exposure and would be too slow?..

Auto-darkening welding helmets use sensors and work differently, and they have to use that kind of tech?..

I honestly don’t know @DingoBorat. I was spitballing ideas.

I just hope they can overcome the issue, as the market and revenue would be awesome; let alone being part of a beneficial AI solution to help those who need it!

:)
 
  • Fire
  • Like
Reactions: 3 users
Wrong!

A much larger area of the glasses is in "free" air, not touching the skin.

Those 3 contact points are far from enough to heat up the glasses.

But never mind, we need more info before coming to premature conclusions.
"Those 3 contacts point are far from enough heating up the glasses."

I'm sorry, but you can't just bluntly state that I am wrong, especially when 2 of the contact points you mention "conducting" the heat "contain" the sensors..
 
  • Haha
  • Like
Reactions: 3 users
I honestly don’t know @DingoBorat. I was spitballing ideas.

I just hope they can overcome the issue, as the market and revenue would be awesome; let alone being part of a beneficial AI solution to help those who need it!

:)
 
  • Haha
Reactions: 3 users

jtardif999

Regular
  • Like
Reactions: 4 users

Frangipani

Top 20
"Below 26C not at 26, there is a huge difference and I think we are talking about the surroundings, not body temp, otherwise the glasses are useless."

Sorry, yes that's what I meant, 26°C and below.
Still not right 😛
25.9°C and below!

Good morning from Germany,

not that it makes much of a difference, but I should have translated the German original “bei Umgebungstemperaturen bis zu 26 Grad” literally as “at surrounding/ambient temperatures up to 26 degrees” - my bad.
So depending on whether they mean “less than” or “less than or equal”, the smart glasses might actually still work at 26.0 degrees. 🤪

The article doesn’t specify, though, why NEXA currently won’t function at higher temperatures.
 
  • Like
  • Love
Reactions: 10 users

GazDix

Regular
In the spirit of the establishment of the controversial DOGE department to detect waste and inefficiency in government departments, I would like to propose perhaps a controversial question to BRN shareholders here.
Why do we fund and go to these Embedded World and other exhibitions year after year costing our company a considerable amount of money?

In our Annual Report, under the current liabilities column 'trade and other payables' we have moved from $853,642 in 2023 to $1,373,294 in 2024.

That's a difference of $519,652, over half a million, which is double our receipts from customers.

Why? I can only assume at least 25% of that is these conferences. But as usual, nothing is ever clear.
If we were just beginning, sure, this is a good expense for future growth. But we have been doing these for years on end, gaining 'partnerships' but little to no solidified contracts since 2020. In fact, our marketing and sales team has changed so much in the last few years, why show up with different folks? Great look.

If a reason to do this is to appease shareholders, I'm not buying it. So can anyone with experience in these matters explain why?
 
  • Like
  • Thinking
  • Fire
Reactions: 5 users

Frangipani

Top 20

Nice to see the top brass at Tata posting about us


Reckon that Accenture have put out a similar patent using ultrasound and gesture recognition; one of their patents that surfaced last year?

This week’s LinkedIn posts by TCS Research and Sounak Dey do not refer to a new patent, but instead to the one granted last year that we already knew about.

I believe it was @TECH who first posted about it on 24 April 2024:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-419909

See the link provided by TCS:

[screenshot of the TCS post linking to the patent]


While Sounak Dey specifically refers to the patent idea having been “tried using BrainChip Akida MetaTF platform”, please note that the actual patent, which was filed in December 2022, mentions both Akida and Loihi - like all relevant patents by TCS and Accenture do, if I’m not mistaken.




Sounak Dey also mentions in his post that a related paper was accepted at ISCAS (the IEEE International Symposium on Circuits and Systems), so I reckon he wanted to use this positive notification by the conference organisers as an opportunity to promote their invention.
 
  • Like
  • Love
  • Fire
Reactions: 14 users

Esq.111

Fascinatingly Intuitive.
Evening GazDix,

I thought all junkets, fairs etc. would have fallen under SALES & MARKETING.

Yes, SHIPLOAD GOING OUT FOR SCANT RETURN.

From our last Annual Report, page 40.


Regards,
Esq.
 

  • Like
  • Thinking
Reactions: 5 users

Frangipani

Top 20
C'mon TCS....ya know you want Akida 2.0 and TENNs.......sign here pls :)

Am I the only one who thinks that it is not going to be consultancy companies such as TCS or Accenture who are potentially going to sign an IP license with us, but rather their customers?
 
  • Like
  • Love
Reactions: 8 users

CHIPS

Regular
Tata Consultancy Services with gesture recognition


Didn't BrainChip withdraw Akida 2 in favour of some company's development last year or so? Now they are presenting Akida 2 in Nuremberg and, at the same time, Tata posts about their new patent. Coincidence?
 
  • Like
  • Thinking
  • Wow
Reactions: 12 users

SERA2g

Founding Member
Wonder if we'll get an intro to these guys at some point. Co-funded by the EU, and they have also participated in space programs with ESA and NASA.

Like to think someone at ESA may discuss us with them sometime?





What are the challenges of implementing neuromorphic vision in AI?​

Trends
December 23, 2024
Neuromorphic Vision


Neuromorphic vision is a field that draws on the workings of the human visual system to develop electronic systems that process visual information efficiently and in real time. This approach uses sensors and algorithms designed to mimic the biological properties of the eye and brain.
Instead of capturing data in fixed frames like traditional cameras, neuromorphic sensors record individual events (changes in light intensity) at each pixel. This makes them highly efficient in terms of energy consumption and processing speed.
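To make the "events instead of frames" idea concrete, here is a minimal Python sketch (my own illustration, not from the article or any particular sensor's SDK) that turns the difference between two frames into DVS-style (x, y, timestamp, polarity) events by thresholding the per-pixel change in log intensity. A real event camera does this asynchronously inside each pixel rather than by differencing frames, so treat the threshold value and the frame-differencing approach as assumptions made purely for illustration.

```python
import numpy as np

def frames_to_events(prev_frame, next_frame, t, threshold=0.2):
    """Emit DVS-style events (x, y, t, polarity) wherever the per-pixel
    change in log intensity between two frames exceeds a threshold.
    Illustration only; real event cameras detect the change asynchronously
    in the pixel circuit itself, with no frames involved."""
    log_prev = np.log(prev_frame.astype(np.float64) + 1e-6)
    log_next = np.log(next_frame.astype(np.float64) + 1e-6)
    delta = log_next - log_prev

    ys, xs = np.where(np.abs(delta) > threshold)
    polarity = np.sign(delta[ys, xs]).astype(np.int8)  # +1 brighter, -1 darker
    return [(int(x), int(y), t, int(p)) for x, y, p in zip(xs, ys, polarity)]

# A static scene produces no events; only the pixel that changed fires.
prev = np.full((4, 4), 100, dtype=np.uint8)
nxt = prev.copy()
nxt[1, 2] = 180  # one pixel gets brighter
print(frames_to_events(prev, nxt, t=0.001))  # -> [(2, 1, 0.001, 1)]
```

The point of the format is in the output: unchanged pixels produce nothing at all, which is where the sparsity and energy savings described below come from.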

Origins of Neuromorphic Vision​

The term ‘neuromorphic’ was coined by Carver Mead in the 1980s. Mead, a pioneer in microelectronics, proposed to design electronic systems inspired by the structure and function of the human brain. Since then, research in neuromorphic sensors has evolved, with key milestones such as the development of event cameras (e.g. Dynamic Vision Sensor, DVS) that mimic the behaviour of the human eye.

Relationship with Artificial Intelligence (AI)​

Neuromorphic vision is closely linked to AI, providing highly optimised and relevant data for the training and execution of deep learning and machine learning algorithms. Some of its main contributions are:
  1. Real-time processing: data obtained from neuromorphic sensors allows AI models to react immediately, useful in applications such as autonomous driving and robotics.
  2. Energy efficiency: neuromorphic vision significantly reduces energy consumption compared to traditional cameras, improving the sustainability of AI-based applications.
  3. Non-redundant data: event-specific detection allows AI systems to work with non-redundant data, improving accuracy in tasks such as object recognition or navigation.

Current and Future Impact​

The implementation of neuromorphic vision is highly recommended in sectors where low latency and energy efficiency are essential and robust real-time event processing is required. Therefore, neuromorphic vision has promising applications in sectors such as:
– Robotics: enabling improved visual perception of robots for navigation and manipulation in complex environments.
– Autonomous driving: enables fast and efficient detection of both objects and obstacles.
– Medical devices: supports some technologies such as visual prostheses or biomedical analysis.
– Security and surveillance: provides highly accurate real-time detection of suspicious movements and critical events.
– Industry and automation: aiding quality inspection systems, tracking objects on assembly lines, and industrial IoT systems.
The combination of neuromorphic vision and AI will transform the way machines perceive and understand the environment, bringing them closer to human biological processing.

What are the challenges of implementing neuromorphic vision in AI?​

Implementing neuromorphic vision in artificial intelligence presents several challenges, whether technical, economic or practical:
– Development of specialised hardware: neuromorphic sensors require advanced chips that mimic the neural activity of the brain, which are expensive and technically complex to manufacture.
– Unconventional data processing: instead of conventional images, neuromorphic sensors generate data in event format, which requires specific algorithms and new paradigms to interpret the data.
– Specialised learning algorithms: new algorithms, such as spiking neural networks (SNNs), are needed that are compatible with the asynchronous and event-driven nature of the data (see the toy sketch below).
– Scalability: Algorithms and systems need to be scalable for large-scale applications, which has not yet been fully achieved.
– Lack of expertise and training: There are few experts in neuromorphic vision, and it takes time and resources to build technical teams.
In summary, neuromorphic vision has transformative potential in multiple industries. Its implementation is strategic for companies seeking technological advantage in artificial intelligence applications.
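As a concrete (and heavily simplified) illustration of the "asynchronous and event-driven" point in the list above, the toy sketch below implements a single leaky integrate-and-fire neuron in plain Python: it only does work when an input event arrives, decays its membrane potential by the elapsed time, and emits a spike once a threshold is crossed. This is my own example with made-up parameter values, not how Akida, Loihi, or any production SNN framework actually implements neurons.

```python
import math

class LIFNeuron:
    """Minimal leaky integrate-and-fire neuron driven by timestamped events."""

    def __init__(self, tau=20e-3, threshold=1.0):
        self.tau = tau              # membrane time constant (s)
        self.threshold = threshold  # firing threshold
        self.v = 0.0                # membrane potential
        self.last_t = 0.0           # time of the previous input event (s)

    def receive(self, t, weight):
        # Decay the potential for the time elapsed since the last event,
        # then integrate the weighted input spike.
        self.v *= math.exp(-(t - self.last_t) / self.tau)
        self.last_t = t
        self.v += weight
        if self.v >= self.threshold:
            self.v = 0.0            # reset after firing
            return True             # output spike
        return False

# Sparse, asynchronous input: (timestamp in seconds, synaptic weight).
events = [(0.001, 0.4), (0.003, 0.4), (0.004, 0.4), (0.050, 0.4)]
neuron = LIFNeuron()
for t, w in events:
    if neuron.receive(t, w):
        print(f"output spike at t = {t * 1000:.1f} ms")
```

Three events arriving close together push the neuron over threshold (one output spike at 4 ms), while the later isolated event does not; that temporal behaviour, computed only when events occur, is what frame-based pipelines struggle to reproduce efficiently.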

Neuromorphic Vision and Artificial Intelligence in ARQUIMEA​

ARQUIMEA, from its research center located in the Canary Islands, has one research orbital dedicated to robotics and another to Artificial Intelligence, which develop projects that explore the potential of neuromorphic vision.
In addition, all ARQUIMEA Research Center projects belong to the QCIRCLE project, co-funded by the European Union, which aims to create a center of scientific excellence in Spain.
Found your post searching for Arquimea on Tsex.

An Embedded Software Research Engineer from Arquimea just liked one of Brainchip's recent LinkedIn posts.

Here's the post and like:

https://www.linkedin.com/posts/brai...p&rcm=ACoAABNRhjYBFfSUxFCyQN7T30eFUWBJTh-LcmM



Nothing concrete but thought you'd like to know! Arquimea could be a rabbit hole worth chasing given we now know someone working there is aware of Brainchip.
 
  • Like
  • Love
  • Fire
Reactions: 9 users