BRN Discussion Ongoing

If they can tell red wine from chardonnay, they can tell synthetic fuel from conventional fuel. There is much more water in it, for example. They have the sensors in their laboratory - how else could the videos have been created? One week of testing and training, and the next week producing the video. The other two videos were made to show what is possible. Here is a use case of urgent, worldwide significance - one that politicians will notice. This is a stage. And it's almost free. Just please do it now. Get out the sensors from the last videos and do this with fuels instead of wine or beer. IMO

I'm only telling you this because it's all over the news here. They say it would be technically difficult to realize. But I don't believe that.

I just mean that politicians here say this is technically not possible. And I question that. Of course it is possible. And the equipment is sitting idle. So let's go - fuels instead of wine.
As you are familiar with the debate, have the technical knowledge and insist it can be done, you should contact Tony Dawe directly, as you will be able to tell him the full story.

Someone else here contacting Tony Dawe and saying to him, "Cosors, an anonymous poster whom I don't know personally, believes you should immediately do a demonstration, as he knows you can do it," is going to look silly.

I encourage you to contact him as soon as possible.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Wow
  • Love
Reactions: 19 users

cosors

👀
As you are familiar with the debate, have the technical knowledge and insist it can be done, you should contact Tony Dawe directly, as you will be able to tell him the full story.

Someone else here contacting Tony Dawe and saying to him, "Cosors, an anonymous poster whom I don't know personally, believes you should immediately do a demonstration, as he knows you can do it," is going to look silly.

I encourage you to contact him as soon as possible.

My opinion only DYOR
FF

AKIDA BALLISTA
Such vehicles would have to use technology that would prevent them from driving if other fuels are used, the draft said.

I said above what I would do if I were CTO, and that some of you know the board personally. So how should an email from a tiny retail investor carry more weight than your word? We are not just 1000 eyes. We share, don't we?

___
Dear FF, don't get me wrong. Others here are in a much better position to address this. Nobody takes me seriously. Does that help BRN? I think not.
 
Last edited:
  • Like
Reactions: 7 users
Such vehicles would have to use technology that would prevent them from driving if other fuels are used, the draft said.

I said above what I would do if I were CTO, and that some of you know the board personally. So how should an email from a tiny retail investor carry more weight than your word? We are not just 1000 eyes. We share, don't we?
As I said you should contact Tony Dawe and discuss it with him.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
Reactions: 3 users
I have something for the tech-savvy to answer. SNN models work by converting input data into electrical spikes, with a spike indicating to the model whether that data is significant or not. I know that our hardware is able to process these spikes in a native SNN model or a CNN2SNN-converted model with limited or no need for an external CPU. I'm not sure, though, what is going on if someone only licenses the IP. Are they licensing the design of our hardware to integrate into their own hardware, or are they licensing a software-only version of our process, in which case I'm not sure how it can process an SNN model? I guess what I'm really asking is what the difference is between licensing the IP block and actually integrating our physical chip.
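For anyone wanting to picture what "converting input data into spikes" means in practice, below is a minimal, generic sketch of delta (threshold-crossing) encoding - not BrainChip's actual encoder, just the common idea that only significant changes in a signal generate events. On the licensing question, as I understand it an IP licence normally means the customer receives the hardware block's design to integrate into their own chip, rather than a software-only emulation, so the spikes are still processed in silicon.

```python
# Minimal sketch of delta (threshold-crossing) spike encoding, a common way
# event-based/SNN front ends turn a sampled signal into sparse spikes.
# Generic illustration only -- NOT BrainChip's actual encoder.
import numpy as np

def delta_encode(signal: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Emit +1/-1 when the signal moves more than `threshold` away from the
    last value that triggered a spike; emit 0 (no event) otherwise."""
    spikes = np.zeros_like(signal, dtype=int)
    reference = signal[0]
    for i, x in enumerate(signal[1:], start=1):
        if x - reference >= threshold:
            spikes[i] = 1          # significant upward change
            reference = x
        elif reference - x >= threshold:
            spikes[i] = -1         # significant downward change
            reference = x
    return spikes

# A noisy sine wave produces far fewer spikes than samples, which is where
# the "only compute on significant data" power saving comes from.
t = np.linspace(0, 2 * np.pi, 200)
sig = np.sin(t) + 0.02 * np.random.randn(200)
sp = delta_encode(sig, threshold=0.1)
print(f"{np.count_nonzero(sp)} spikes from {sig.size} samples")
```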
 

Dhm

Regular
Hi Dhm.
How sure are you that BRN was not involved in that demo, and what is your reason?
Up to now there is no knowledge of anyone other than Qualcomm and Prophesee being involved.
On the other hand, we all know Mercedes is using us, through their announcement, and Qualcomm is providing them Snapdragon for the same. Which means there is a possibility Qualcomm is working with our product as well.
So .....
Hi @rgupta I'm not sure either way. However I would suppose that at least one of the 1000 eyes may have found a strand of proof, yet there was none forthcoming. In addition to this, and around the same time, @chapman89 had a text conversation with the CEO of Prophesee and Luca stated that we were a demonstration partner, or words to that effect. So my case rests, your honour - and I wish I was wrong in saying no!
 
  • Like
Reactions: 6 users

Quatrojos

Regular
Such vehicles would have to use technology that would prevent them from driving if other fuels are used, the draft said.

I said above what I would do if I were CTO, and that some of you know the board personally. So how should an email from a tiny retail investor carry more weight than your word? We are not just 1000 eyes. We share, don't we?

___
Dear FF, don't get me wrong. Others here are in a much better position to address this. Nobody takes me seriously. Does that help BRN? I think not.

E-methanol has the same chemical signature as regular methanol; no difference in carbon-hydrogen-alcohol linkages.
 
  • Like
Reactions: 4 users

cosors

👀
E-methanol has the same chemical signature as regular methanol; no difference in carbon-hydrogen-alcohol linkages.
I see that differently, coming from the world of engines and turbines. And why are e-fuels harmful for today's engines if it is identical? We can distinguish it, I am sure - if we want to.

You destroy a combustion engine if you run it on pure e-fuels. That is a fact. So it is not identical.

But I better retire before Dredd shoots me down.
 
  • Haha
  • Like
  • Fire
Reactions: 11 users

Quatrojos

Regular
I see that differently, coming from the world of engines and turbines. And why are e-fuels harmful for today's engines if it is identical? We can distinguish it, I am sure - if we want to.
I don't know about engines/turbines but I know about organic chem. It doesn't matter where individual atoms are sourced from. Once H2O and CO2 are combined to form methanol, there is no signature emitted from them apart from spectroscopy identifying them as 'methanol'.
 
  • Like
Reactions: 4 users

Learning

Learning to the Top 🕵‍♂️
CHICAGO, March 22, 2023 /PRNewswire/ -- The AI sensor market is projected to reach USD 22.1 billion by 2028, from USD 3.0 billion in 2022, at a CAGR of 41.6% according to a new report by MarketsandMarkets™. Key factors driving the growth of the AI sensor market include the growing demand for AI sensors in IoT-based devices across home automation and automotive applications, smart infrastructure, and digitization to facilitate edge computing growth, and the increased use of AI sensor-equipped wireless technologies.


Learning 🏖
 
  • Like
  • Fire
  • Wow
Reactions: 21 users

cosors

👀
I don't know about engines/turbines but I know about organic chem. It doesn't matter where individual atoms are sourced from. Once H2O and CO2 are combined to form methanol, there is no signature emitted from them apart from spectroscopy identifying them as 'methanol'.
I am an engineer from the automotive sector and I can only say: it is not identical. I mentioned that there is much more water in it, for example.
Fill up your car with e-fuels tomorrow and we'll continue the discussion/debate.
Here we already blend some of it in so that the machines don't break down. I don't know how it is in Australia or the USA. For me it is clear and a fact.
 
  • Like
Reactions: 4 users

Tuliptrader

Regular
Something for Akida 2.0 to get its teeth into, perhaps.

https://www.intellisenseinc.com/innovation-lab/augmented-intelligence/real-time-image-enhancement/

Intellisense Systems, Inc.

Real-Time Image Enhancement

No longer the stuff of spy movies, Intellisense Systems is developing deep learning-based, super-resolution algorithms that can enhance and clarify images in almost real time. Learn more about this innovation and how it is making military, rescue, and other hazardous operations safer.

“Zoom in and enhance.” These words have become ubiquitous in nearly every spy movie and TV show. Typically, a group of people will gather in a shadowy room surrounded by dozens of screens showing surveillance footage. One of the monitors will zoom in on the pixelated visage of a passerby. But real-time image enhancement turns the once blurry freeze-frame into a crystal-clear picture, often revealing the face of a brave hero, or a nefarious villain.
Many viewers may roll their eyes at this cliché, but this technology can greatly bolster the safety and efficiency of intelligence or combat operations. Real-time image enhancement can not only help differentiate friend from foe, but it can also identify key items in people’s hands, as well as clarify marks on vehicles or structures. Getting a clearer picture of people or the environment improves U.S. intelligence and ensures that targets are acquired while civilians remain safe.
With this goal in mind, the United States military solicited work through the Department of Defense’s Small Business Innovation Research (SBIR) program to develop a means of advancing real-time image enhancement technology. After successfully proposing a solution to this requirement, Intellisense Systems developed and demonstrated the ability to enhance images in low-light and nighttime conditions based on the novel use of a convolutional neural network (CNN). A CNN consists of layers and algorithms that mimic biological neurons, and it requires relatively little pre-processing compared to other image classification algorithms. This method of processing is ideal for image enhancement and video analysis; thus, the Intellisense team implemented it in embedded hardware to help servicemembers with both recognition/classification and laser pointing.
The machine-learning specialists at Intellisense devised this real-time image enhancement system to detect military-relevant targets in both still images and live video. Using a tablet, the software could automatically identify key elements in either a photo or video and present bounding boxes, icons, or color highlighting to key points of interest. The operator can then select an area of the image or video stream for enhancement. From here, the CNN begins its layer processing, increasing the image’s resolution and improving its contrast, acuity, and stability. It can eliminate motion blur or make out certain items that were previously undetectable, like a weapon in an enemy combatant’s hand or text on a mobile phone.
To further relieve the effort required by personnel, the CNN employs unsupervised learning and processing methodology. This means the network uses datasets as examples. As a result, it processes data without a specific answer or outcome to identify. Instead, the system can automatically determine the structure of the data (in this case, a still image or live video) without human input. This enables the CNN to identify patterns based on the datasets and autonomously enhance pictures and video, taking some of the burden off the system’s human users.
This system’s processing can be completed via a tablet to bolster the ease of use and mobility. It is compatible with the next generation of the U.S. Armed Forces’ handheld targeting system, which can be packaged into compact housing and mounted into a variety of locations. This innovation enables remote viewing and control via radio network. Additionally, the open-architecture approach makes this solution compatible with various software implementations. This enables real-time image enhancement algorithms to be developed independently of the armed forces’ new handheld targeting system.
With this solution, service members can gather intelligence and acquire targets with greater success and efficiency, all while keeping civilians and non-combatants out of harm’s way. And with funding from the SBIR, this solution can be commercialized to serve in other applications, like search-and-rescue missions and emergency response. Over the next few years, Intellisense will continue to train and refine these CNN algorithms so that real-time image enhancement can greatly improve decision-making, reduce users’ workload in detecting key information, and most importantly, save lives.
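The article doesn't disclose Intellisense's actual network, but a minimal SRCNN-style model (the classic three-layer super-resolution CNN) gives a feel for the kind of layer processing described above. A sketch, assuming the standard SRCNN layer sizes rather than anything Intellisense-specific:

```python
# Minimal SRCNN-style super-resolution model (after Dong et al., 2014) in Keras.
# Illustrative only -- not Intellisense's network. Input is a bicubically
# upscaled low-resolution image; the CNN learns to sharpen it.
import tensorflow as tf
from tensorflow.keras import layers

def build_srcnn(channels: int = 1) -> tf.keras.Model:
    return tf.keras.Sequential([
        tf.keras.Input(shape=(None, None, channels)),               # any image size
        layers.Conv2D(64, 9, padding="same", activation="relu"),    # patch extraction
        layers.Conv2D(32, 1, padding="same", activation="relu"),    # non-linear mapping
        layers.Conv2D(channels, 5, padding="same"),                  # reconstruction
    ])

model = build_srcnn()
model.compile(optimizer="adam", loss="mse")  # trained on (blurry, sharp) patch pairs
model.summary()
```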

TT
https://www.intellisenseinc.com/
 
  • Like
  • Fire
  • Love
Reactions: 13 users

Kachoo

Regular
Such vehicles would have to use technology that would prevent them from driving if other fuels are used, the draft said.

I said above what I would do if I were CTO, and that some of you know the board personally. So how should an email from a tiny retail investor carry more weight than your word? We are not just 1000 eyes. We share, don't we?

___
Dear FF, don't get me wrong. Others here are in a much better position to address this. Nobody takes me seriously. Does that help BRN? I think not.
I think if you have ideas and information as a shareholder you should contact them. They will take a look at this and know best, as they have the skills to understand it.

The wine-tasting sensor may be different from the ones you speak of. Sensors are built to look for compound A or B or C, maybe more, and yes, Akida could process the data and predict what fuels are there. Some sensors may not be suitable; there are millions of sensors that do various things.
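To make that concrete: once a sensor array produces a feature vector per sample (as in an electronic-nose setup), telling an e-fuel blend from conventional fuel is an ordinary classification problem. A minimal sketch with scikit-learn, using entirely made-up feature names and synthetic numbers (water content, density, two sensor channels) purely for illustration:

```python
# Minimal sketch: classify fuel type from a sensor-array feature vector.
# Feature names and data are invented for illustration; a real demo would
# use readings from an actual chemical sensor array (and could run the
# trained model on neuromorphic hardware instead of a CPU).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
# Hypothetical features per sample: [water_ppm, density_kg_m3, sensor_ch1, sensor_ch2]
conventional = np.column_stack([
    rng.normal(150, 30, n), rng.normal(745, 5, n),
    rng.normal(0.40, 0.05, n), rng.normal(0.55, 0.05, n)])
synthetic = np.column_stack([
    rng.normal(400, 60, n), rng.normal(735, 5, n),
    rng.normal(0.55, 0.05, n), rng.normal(0.35, 0.05, n)])

X = np.vstack([conventional, synthetic])
y = np.array([0] * n + [1] * n)            # 0 = conventional, 1 = synthetic

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```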

But your idea is valid and yeah bring it up to them. Likely they will thank you and pass this on to the right person that knows if this will work.

To be honest, they would appreciate constructive support over continued questions about why this or that is happening.

Cheers
 
  • Like
  • Fire
  • Love
Reactions: 11 users

Quatrojos

Regular
I am an engineer from the automotive sector and I can only say: it is not identical. I mentioned that there is much more water in it, for example.
Fill up your car with e-fuels tomorrow and we'll continue the discussion/debate.
Here we already blend some of it in so that the machines don't break down. I don't know how it is in Australia or the USA. For me it is clear and a fact.
OK, I thought you meant that, as a result of origin, the methanol molecule was somehow different. If there's a difference in the proportion of free H2O molecules between e-methanol and original-recipe methanol, why couldn't they just dress up original-recipe methanol by adding water?
 

cosors

👀
I think if you have ideas and information as a shareholder you should contact them. They will take a look at this and know best, as they have the skills to understand it.

The wine-tasting sensor may be different from the ones you speak of. Sensors are built to look for compound A or B or C, maybe more, and yes, Akida could process the data and predict what fuels are there. Some sensors may not be suitable; there are millions of sensors that do various things.

But your idea is valid and yeah bring it up to them. Likely they will thank you and pass this on to the right person that knows if this will work.

To be honest, they would appreciate constructive support over continued questions about why this or that is happening.

Cheers
I'm thinking about it. I am just very shy. I was thinking of those who know the board personally - someone who would be listened to. I had exactly this situation with Talga, and I wrote to them then. A year later I realize that exactly that is happening. It is not up to me. But this is not about a year from now; it is about one thing building on another. Where are our sensors from the wine-testing demo?

I am quiet now.
 
  • Like
Reactions: 5 users

cosors

👀
OK, I thought you meant that, as a result of origin, the methanol molecule was somehow different. If there's a difference in the proportion of free H2O molecules between e-methanol and original-recipe methanol, why couldn't they just dress up original-recipe methanol by adding water?
The water comes through the manufacturing process.
 
  • Like
Reactions: 4 users

Frangipani

Top 20
Last February, @butcherano posted:

“So this put me onto looking at implantable medical devices that rely on low power and long battery life and require good pattern identification skills, which led me to pacemakers. I think @Fact Finder has mentioned this before but I don’t recall seeing any articles or research posted (…)
So what are people’s thoughts on seeing Akida inside something like a pacemaker some time soon? Any chance of joining some dots here?...”

@Learning replied to this post at the time and mentioned he knew someone with a pacemaker implanted in his chest to control early Parkinson’s.

Both of them, along with the rest of us, will be pleased to learn about the following hot-off-the-press article, which, boiled down to its essence (you are welcome, @Rocket577 😉), says:
“Researchers at Michigan Technological University are applying neuromorphic computing to improve the effectiveness and energy efficiency of deep brain stimulation systems used to treat Parkinson’s disease.”


The only fly in the ointment is that those MTU researchers have so far been experimenting with Loihi. They are, however, more than aware of Intel’s competitors, as evidenced by the following quote: “We’ve discovered that neuromorphic chips, including Intel Loihi, outperform other computational platforms in terms of energy-efficiency by 109 times,” An said. (…) An and Yu plan to collaboratively design their own memristive neuromorphic chip specifically for closed-loop DBS systems. “Our research on these new, innovative computational paradigms — along with the design of emergent AI chips — will open a new door to greater and faster development of smart medical devices for brain rehabilitation,” said An. “Even wearable medical devices are now well within the realm of possibility.”

Surely there must be other labs around the world doing similar research.
Beneficial AI at its best.
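For anyone wondering what "closed-loop" means here: instead of stimulating at a fixed setting, the device measures a brain signal (typically beta-band power in Parkinson's) and adjusts the stimulation accordingly. A toy sketch of that feedback loop, with a synthetic signal standing in for a recorded LFP and invented thresholds and gains (nothing from the MTU work):

```python
# Toy sketch of closed-loop deep brain stimulation: estimate beta-band
# (13-30 Hz) power from a (here synthetic) local field potential and scale
# the stimulation amplitude proportionally. All numbers are invented.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000                                   # sample rate, Hz
t = np.arange(0, 2.0, 1 / fs)
lfp = 0.5 * np.sin(2 * np.pi * 20 * t) + 0.2 * np.random.randn(t.size)  # fake LFP

b, a = butter(4, [13, 30], btype="bandpass", fs=fs)    # beta-band filter
beta_power = float(np.mean(filtfilt(b, a, lfp) ** 2))  # mean squared beta signal

TARGET = 0.05                               # desired beta power (arbitrary units)
GAIN = 40.0                                 # proportional gain (arbitrary)
stim_ma = float(np.clip(GAIN * (beta_power - TARGET), 0.0, 3.0))  # stimulation, mA
print(f"beta power {beta_power:.3f} -> stimulation {stim_ma:.2f} mA")
```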
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 21 users

Kachoo

Regular
I'm thinking about it. I am just very shy. I was thinking of those who know the board personally - someone who would be listened to. I had exactly this situation with Talga, and I wrote to them then. A year later I realize that exactly that is happening. It is not up to me. But this is not about a year from now; it is about one thing building on another. Where are our sensors from the wine-testing demo?

I am quiet now.
Mate,

Tony and crew are very professional and will accept your inquiry.

The world is not built by one individual; people have all sorts of strengths and weaknesses.

Your idea may be really good, or not - I don't know. Pass it to the pros and they will take your idea, dissect it and see if there is merit and value.

You would never get a rude or negative response from such people.
 
  • Like
  • Fire
Reactions: 22 users

sl.oldi

Member
The water comes through the manufacturing process.

I'm thinking about it. I am just very shy. I was thinking of those who know the board personally - someone who would be listened to. I had exactly this situation with Talga, and I wrote to them then. A year later I realize that exactly that is happening. It is not up to me. But this is not about a year from now; it is about one thing building on another. Where are our sensors from the wine-testing demo?

I am quiet now.
Do it .... just do it ....
 
  • Like
  • Fire
Reactions: 5 users
Hi @rgupta I'm not sure either way. However I would suppose that at least one of the 1000 eyes may have found a strand of proof, yet there was none forthcoming. In addition to this, and around the same time, @chapman89 had a text conversation with the CEO of Prophesee and Luca stated that we were a demonstration partner, or words to that effect. So my case rests, your honour - and I wish I was wrong in saying no!
I think we need to go back to your original statement which was that SynSense had won the first round as they were engaged with Qualcomm and Prophesee on the no blur lenses.

It was pointed out that there was no evidence that SynSense was involved with Qualcomm and in fact all the evidence points to Qualcomm using their own processor with Prophesee’s vision sensor.

I am not sure who it is that you say is still claiming that Brainchip is involved with Qualcomm and Prophesee on the no blur lenses that are coming out this year but I did not think this was a prevailing view on TSEx.

In my opinion the focus here is on the following:

1. Prophesee, prior to Luca Verre appearing on the Brainchip podcast, had public engagements with Intel, SynSense, Qualcomm and Sony, and said that every previous engagement, until they combined with Brainchip, involved compromise and did not allow their vision sensor to achieve its full potential,

2. At CES 2023 Prophesee booked a suite and invited Brainchip to be its demonstration partner,

3. Prophesee announced just prior to CES 2023 commencing that they had over 100 customer meetings scheduled and the majority were subject of Non Disclosure Agreements,

4. Prophesee’s Luca Verre is on record answering a question from Jesse Chapman that they hoped to reach a commercial agreement with Brainchip,

5. The recently updated Brainchip website makes crystal clear that Brainchip and Prophesee are partners.

6. The fact that Sony and Qualcomm have both partnered with Prophesee and are bringing product to market proves that Prophesee has an advanced vision sensor with commercial application even when not performing to its full potential because of being hooked up to less ideal processors.

7. This being the case, logic suggests that if Prophesee can achieve sales engagements with Sony and Qualcomm in less-than-perfect circumstances, then when combined with Brainchip’s AKIDA technology the 100 new NDA-protected customer engagements are highly likely to yield commercial engagements for an optimised Prophesee vision sensor utilising AKIDA technology.

And this is why I am excited and I believe others are excited by the partnership between Brainchip and Prophesee.

Mercedes Benz 12 months before engaging with Brainchip had worked with Intel but clearly moved up to AKIDA. This is exactly the same story as has unfolded with Prophesee.

They have tried the rest but gone with the BE(A)ST being the MIND BOGGLING SCIENCE FICTION that is AKIDA Technology.

My opinion only DYOR
FF

AKIDA BALLISTA
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 57 users

Quatrojos

Regular
The water comes through the manufacturing process.
But it could be added at any stage so that original-recipe methanol 'looks like' e-methanol.
 
  • Fire
Reactions: 1 users