BRN Discussion Ongoing

If I were the CTO of Brainchip, I would publish a video within two weeks showing a test, titled something like "e-fuel detection vs. diesel or gasoline". Some politicians here argue that this is technically difficult to realise. So far there have been impressive fun videos about wine and beer, and I understand the background with food. But I would do this now and release it globally within two weeks at the latest.
Or do you think that Akida can't recognize e-fuels?

____
Decisions will be made soon, whether we like them or not. We can show politicians that it is possible, and with that they can make informed decisions. I just can't imagine that the Brainchip nerds can't prove it. The investment would be very low and the attention maximum.

___
Some of you know them personally. Send them this, and they should also send it directly to the EU Commission.
Hi @cosors
There is no doubt AKIDA could process the data, but as Brainchip states, “we don’t make sensors, we make them smart”.

The issue is: does anyone currently make a sensor capable of detecting these chemicals that will provide reliable data for AKIDA to process every time someone refuels, without having to be cleaned or replaced after every exposure?

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Thinking
Reactions: 12 users

jtardif999

Regular
There are also Hailo and GrAI Matter Labs. They sell chips, so they will be more difficult to scale, as you have mentioned.



Akida appears to be unique with its learning capabilities, and the IP business model can be scaled rapidly.

Akida could become the dominant processor, similar to ARM, due to the lack of competitors and a superior product.

Intel could advance its Loihi chips, and IBM its TrueNorth chips, in a few years, providing more competition along with a few new start-ups, should their technology prove as good as or superior to Akida's. However, Akida should have a commanding market share by the time the others catch up.

In the next few years we could probably allow a 50% market share for BRN due to limited competition. It will also depend on whether or not the majority of AI applications require Akida's learning feature; there may be many basic AI applications that don't need it.

I am being conservative with a 10-20% long-term market share, and I will not be surprised if it's much higher.
Don’t forget that Akida is out there on its own in terms of power consumption. All end-point devices will need, and will eventually be mandated to have, very low power consumption - end-point devices that will number in the billions. They may not all need one-shot learning, but they will all need the low power. Remember also, as NASA and the US DoD have found, that it’s not even just power, it’s SWaP - size, weight and power - that puts Akida out there on its own. AIMO.
 
  • Like
  • Fire
  • Thinking
Reactions: 19 users

cosors

👀
Hi @cosors
There is no doubt AKIDA could process the data, but as Brainchip states, “we don’t make sensors, we make them smart”.

The issue is: does anyone currently make a sensor capable of detecting these chemicals that will provide reliable data for AKIDA to process every time someone refuels, without having to be cleaned or replaced after every exposure?

My opinion only DYOR
FF

AKIDA BALLISTA
If they can tell red wine from chardonnay, they can tell synthetic fuel from conventional fuel. There is much more water in it, for example. They have the sensors in their laboratory - how else could the videos have been created? One week of testing and training, and the next week producing the video. The other two videos were made to show what is possible. Here is an example of urgent significance worldwide - a use case that will be noticed by politicians. This is a stage, and it's almost free. Just please do it now: get out the sensors from the last videos and do this with fuels instead of wine or beer. IMO

I'm only telling you this because it's all over the news here. They say it would be technically difficult to realise. But I don't believe that.

I just mean that politicians here say this is technically not possible, and I question that. Of course it is possible, and the equipment is sitting idle. So let's go: fuels instead of wine.
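To illustrate the kind of demo being proposed, here is a hypothetical sketch with entirely made-up sensor data. The water-content difference is taken from the posts above as an assumption, and nothing here reflects Brainchip's real sensors or models: the point is only that if a chemical sensor's readings differ between fuel types, even a trivial classifier separates them after brief "training".

```python
import numpy as np

rng = np.random.default_rng(0)

def sensor_readings(n, water_ppm_mean):
    """Simulate a 4-channel chemical sensor; channel 0 tracks water content."""
    water = rng.normal(water_ppm_mean, 15.0, size=n)
    other = rng.normal(100.0, 10.0, size=(n, 3))  # unrelated channels
    return np.column_stack([water, other])

# Assumed (illustrative) difference: e-fuel carries more residual water.
conventional = sensor_readings(200, water_ppm_mean=50.0)
efuel = sensor_readings(200, water_ppm_mean=120.0)

# "Training": learn a centroid per class from half the samples.
centroids = {0: conventional[:100].mean(axis=0),   # 0 = conventional fuel
             1: efuel[:100].mean(axis=0)}          # 1 = e-fuel

def classify(x):
    """Nearest-centroid classification of one sensor reading."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# Evaluate on the held-out half.
test_X = np.vstack([conventional[100:], efuel[100:]])
test_y = np.array([0] * 100 + [1] * 100)
predictions = np.array([classify(x) for x in test_X])
accuracy = np.mean(predictions == test_y)
```

With a separation this clean, even a nearest-centroid rule scores near-perfect accuracy; whether real fuel sensors produce such separable data is exactly the open question FF raised.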
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 14 users

stuart888

Regular
Nvidia CEO Jensen Huang has mentioned that "the iPhone moment of AI has started".

Nvidia turns to AI cloud rental to spread new technology


Economy 20 minutes ago (Mar 22, 2023 08:33)

By Stephen Nellis

(Reuters) -Nvidia Corp Chief Executive Jensen Huang on Tuesday laid out the company's plans to make the powerful and expensive supercomputers used to develop AI technologies like ChatGPT available for rent to nearly any business.

While that access will not come cheap - at $37,000 a month for eight of Nvidia's flagship A100 or H100 chips strung together - offering it to a wider swath of business customers could accelerate an AI boom that has driven Nvidia shares up 77% this year, making it about five times more valuable than longtime rival Intel Corp (NASDAQ:INTC).

The Santa Clara, California-based company already dominates the field for artificial intelligence chips and has helped partners like Microsoft Corp (NASDAQ:MSFT) build huge systems for ChatGPT creator OpenAI's services to answer questions with human-like text and generate images from prompts.

At Nvidia's annual software developer conference on Tuesday, Huang said the company was working with partners such as Oracle Corp (NYSE:ORCL) to offer access to Nvidia's DGX supercomputers with as many as 32,000 of Nvidia's chips to anyone who can log on with a web browser.

"The iPhone moment of AI has started," Huang said in the virtual keynote address, referring to how Apple Inc (NASDAQ:AAPL) opened up the market for smartphones.

Huang said Nvidia was also working with Microsoft and Alphabet (NASDAQ:GOOGL) Inc to offer its supercomputers, used to create new AI products, as a service. Nvidia on Tuesday announced new chips and software designed to make products like chatbots much cheaper to operate on a day-to-day basis after they have been created with supercomputers.

Those products "are years ahead of the competition," said Hans Mosesmann, a semiconductors analyst at Rosenblatt Securities. "Nvidia's leadership on the software side of AI is not only monumental - it is accelerating."

Nvidia is also partnering with AT&T Inc (NYSE:T) to make dispatching trucks more efficient, collaborating with quantum computing researchers to speed software development, and working with industry giant Taiwan Semiconductor Manufacturing Co to speed up chip development, Huang added.

Nvidia's new rental service, called DGX Cloud, could give many more developers the chance to access tens of thousands of its chips at once. Biotech firm Amgen Inc (NASDAQ:AMGN) and software firm ServiceNow Inc have started using the service, Nvidia said.

Nvidia also launched a service called AI Foundations to help companies train their customized artificial intelligence models. Several major owners of stock image databases plan to use the service, which would avert legal questions about copyright of images used to generate AI content.

Huang also announced technology to speed up the design and manufacturing of semiconductors. The software uses Nvidia's chips to speed up a step that sits between the software-based design of a chip and the physical fabrication of the lithography masks used to print that design on a piece of silicon.

Those calculations could take a traditional computing chip two weeks to complete, but Nvidia said Tuesday its chips and software can handle the task overnight and reduce the electricity used in the task from 35 megawatts to 5 megawatts.

Nvidia said it was working with ASML Holding (AS:ASML), Synopsys (NASDAQ:SNPS) Inc and TSMC to bring it to market. TSMC will start readying the technology for production in June, Huang said.
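A quick back-of-envelope check of the lithography figures quoted above. "Two weeks" and "overnight" are my assumptions (14 days and 12 hours of continuous compute), not Reuters':

```python
# Assumed durations: "two weeks" = 14 * 24 h, "overnight" = 12 h.
cpu_hours = 14 * 24            # 336 h on a traditional computing chip
gpu_hours = 12                 # overnight on Nvidia's chips

cpu_energy_mwh = 35 * cpu_hours   # at the quoted 35 MW
gpu_energy_mwh = 5 * gpu_hours    # at the quoted 5 MW

speedup = cpu_hours / gpu_hours                  # 28x faster
energy_ratio = cpu_energy_mwh / gpu_energy_mwh   # ~196x less energy overall
```

So under these assumptions the claimed saving is not just the 7x power reduction: combined with the shorter runtime, the total energy per mask computation drops by roughly two orders of magnitude.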
Fantastic 30-minute interview with the Nvidia CEO, standing outside, answering each question smoothly. He shows off the smallish AI hardware boxes. Extremely bullish on all things AI - freaky good, loaded with goodies. A great overview of Nvidia's future, like the AI Factory!

So glad am I that Brainchip and Nvidia have so many clues and partnership-type statements.



Jensen Huang, Nvidia Founder and CEO, joins Yahoo Finance Live to discuss the future of Artificial Intelligence technology and the tech industry at the annual GTC Conference.
 
  • Fire
  • Like
  • Love
Reactions: 9 users
If they can tell red wine from chardonnay, they can tell synthetic fuel from conventional fuel. There is much more water in it, for example. They have the sensors in their laboratory - how else could the videos have been created? One week of testing and training, and the next week producing the video. The other two videos were made to show what is possible. Here is an example of urgent significance worldwide - a use case that will be noticed by politicians. This is a stage, and it's almost free. Just please do it now: get out the sensors from the last videos and do this with fuels instead of wine or beer. IMO

I'm only telling you this because it's all over the news here. They say it would be technically difficult to realise. But I don't believe that.

I just mean that politicians here say this is technically not possible, and I question that. Of course it is possible, and the equipment is sitting idle. So let's go: fuels instead of wine.
As you are familiar with the debate, have the technical knowledge, and insist it can be done, you should contact Tony Dawe directly, as you will be able to tell him the full story.

Someone else here contacting Tony Dawe and saying to him, “Cosors, an anonymous poster whom I don’t know personally, believes you should immediately do a demonstration, as he knows you can do it”, is going to look silly.

I encourage you to contact him as soon as possible.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Wow
  • Love
Reactions: 19 users

cosors

👀
As you are familiar with the debate, have the technical knowledge, and insist it can be done, you should contact Tony Dawe directly, as you will be able to tell him the full story.

Someone else here contacting Tony Dawe and saying to him, “Cosors, an anonymous poster whom I don’t know personally, believes you should immediately do a demonstration, as he knows you can do it”, is going to look silly.

I encourage you to contact him as soon as possible.

My opinion only DYOR
FF

AKIDA BALLISTA
Such vehicles would have to use technology that would prevent them from driving if other fuels are used, the draft said.

I said above "if I were the CTO", and that some of you know the board personally. So how could an email from a tiny retail investor carry more weight than your word? We are not just 1000 eyes. We share. Don't we?

___
Dear FF, don't get me wrong. Others here are in a much better position to address this. Nobody takes me seriously. Does that help BRN? I think not.
 
Last edited:
  • Like
Reactions: 7 users
Such vehicles would have to use technology that would prevent them from driving if other fuels are used, the draft said.

I said above "if I were the CTO", and that some of you know the board personally. So how could an email from a tiny retail investor carry more weight than your word? We are not just 1000 eyes. We share. Don't we?
As I said you should contact Tony Dawe and discuss it with him.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
Reactions: 3 users
I have something for the tech-savvy to answer. SNN models work by converting input data into electrical spikes, with each spike indicating to the model whether that data is significant or not. I know that our hardware is able to process these spikes in a native SNN model, or in a CNN2SNN-converted model, with limited or no need for an external CPU.

I'm not sure, though, what goes on if someone only licenses the IP. Are they licensing the design of our hardware to integrate into their own hardware, or are they licensing a software-only version of our process? In the latter case I'm not sure how it could process an SNN model. I guess what I'm really asking is: what is the difference between licensing the IP block and actually integrating our physical chip?
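On the first part of the question, here is a toy rate-coding sketch, illustrative only: it is not Brainchip's actual pipeline (Akida's internals are not public in this detail), but it shows the basic mechanism of turning analog input values into spike trains that an SNN, whether realised in hardware or simulated in software, can consume.

```python
import numpy as np

rng = np.random.default_rng(42)

def rate_encode(values, n_steps=100):
    """Encode values in [0, 1] as Bernoulli spike trains:
    a higher value gives a higher spike probability per timestep."""
    values = np.asarray(values, dtype=float)
    return (rng.random((n_steps, values.size)) < values).astype(np.uint8)

pixels = [0.05, 0.5, 0.95]     # e.g. normalised pixel intensities
spikes = rate_encode(pixels)   # shape: (timesteps, inputs), entries 0 or 1

# The mean firing rate over time approximates the original analog value.
rates = spikes.mean(axis=0)
```

An IP licensee integrating the hardware design would have circuits consuming such spike events directly; a pure software implementation would have to simulate the same event-driven processing on a conventional CPU, which is where the efficiency difference comes in.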
 

Dhm

Regular
Hi Dhm.
How sure are you that BRN was not involved in that demo, and what is your reason?
Up to now there is no knowledge of anyone other than Qualcomm and Prophesee.
On the other hand, we all know Merc is using us, through their announcement, and Qualcomm is providing them Snapdragon for the same - which means there is a possibility Qualcomm is working with our product as well.
So .....
Hi @rgupta, I'm not sure either way. However, I would suppose that at least one of the 1000 eyes would have found a strand of proof, yet there was none forthcoming. In addition, around the same time @chapman89 had a text conversation with the CEO of Prophesee, and Luca stated that we were a demonstration partner, or words to that effect. So my case rests, your honour - and I wish I were wrong in saying no!
 
  • Like
Reactions: 6 users

Quatrojos

Regular
Such vehicles would have to use technology that would prevent them from driving if other fuels are used, the draft said.

I said above "if I were the CTO", and that some of you know the board personally. So how could an email from a tiny retail investor carry more weight than your word? We are not just 1000 eyes. We share. Don't we?

___
Dear FF, don't get me wrong. Others here are in a much better position to address this. Nobody takes me seriously. Does that help BRN? I think not.

E-methanol has the same chemical signature as regular methanol; no difference in carbon-hydrogen-alcohol linkages.
 
  • Like
Reactions: 4 users

cosors

👀
E-methanol has the same chemical signature as regular methanol; no difference in carbon-hydrogen-alcohol linkages.
I see that differently, having worked on engines and turbines. And why would e-fuels be harmful for today's engines if the fuel were identical? We can distinguish it, I am sure - if we want to.

You destroy a combustion engine if you run it on pure e-fuels. That is a fact. So it is not identical.

But I had better retire before Dredd shoots me down.
 
  • Haha
  • Like
  • Fire
Reactions: 11 users

Quatrojos

Regular
I see that differently, having worked on engines and turbines. And why would e-fuels be harmful for today's engines if the fuel were identical? We can distinguish it, I am sure - if we want to.
I don't know about engines/turbines, but I do know organic chemistry. It doesn't matter where the individual atoms are sourced from. Once H2O and CO2 are combined to form methanol, there is no signature left from them - spectroscopy simply identifies the product as 'methanol'.
 
  • Like
Reactions: 4 users

Learning

Learning to the Top 🕵‍♂️
CHICAGO, March 22, 2023 /PRNewswire/ -- The AI sensor market is projected to reach USD 22.1 billion by 2028, from USD 3.0 billion in 2022, at a CAGR of 41.6% according to a new report by MarketsandMarkets™. Key factors driving the growth of the AI sensor market include the growing demand for AI sensors in IoT-based devices across home automation and automotive applications, smart infrastructure, and digitization to facilitate edge computing growth, and the increased use of AI sensor-equipped wireless technologies.
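As a sanity check on the quoted figures (my own arithmetic, not from the report), compounding from USD 3.0 billion in 2022 to USD 22.1 billion in 2028 over six annual periods gives:

```python
# Implied compound annual growth rate over six periods (2022 -> 2028).
implied_cagr = (22.1 / 3.0) ** (1 / 6) - 1   # ~0.395, i.e. ~39.5% per year
```

That is close to, though slightly below, the report's 41.6% CAGR; the report presumably compounds from a different base year.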


Learning 🏖
 
  • Like
  • Fire
  • Wow
Reactions: 21 users

cosors

👀
I don't know about engines/turbines, but I do know organic chemistry. It doesn't matter where the individual atoms are sourced from. Once H2O and CO2 are combined to form methanol, there is no signature left from them - spectroscopy simply identifies the product as 'methanol'.
I am an engineer from the automotive industry, and I can only say: it is not identical. I mentioned that there is much more water in it, for example.
Fill up your car with e-fuels tomorrow and we'll continue the discussion.
We only blend some of it in here so that the engines don't break down. I don't know how it is in Australia or the USA. For me it is clear fact.
 
  • Like
Reactions: 4 users

Tuliptrader

Regular
Something for Akida 2.0 to get its teeth into, perhaps.

https://www.intellisenseinc.com/innovation-lab/augmented-intelligence/real-time-image-enhancement/

Intellisense Systems, Inc.

Real-Time Image Enhancement

No longer the stuff of spy movies, Intellisense Systems is developing deep learning-based, super-resolution algorithms that can enhance and clarify images in almost real time. Learn more about this innovation and how it is making military, rescue, and other hazardous operations safer.

“Zoom in and enhance.” These words have become ubiquitous in nearly every spy movie and TV show. Typically, a group of people will gather in a shadowy room surrounded by dozens of screens showing surveillance footage. One of the monitors will zoom in on the pixelated visage of a passerby. But real-time image enhancement turns the once blurry freeze-frame into a crystal-clear picture, often revealing the face of a brave hero or a nefarious villain.
Many viewers may roll their eyes at this cliché, but this technology can greatly bolster the safety and efficiency of intelligence or combat operations. Real-time image enhancement can not only help differentiate friend from foe but also identify key items in people’s hands and clarify marks on vehicles or structures. Getting a clearer picture of people or the environment improves U.S. intelligence and ensures that targets are acquired while civilians remain safe.
With this goal in mind, the United States military solicited work from the Department of Defense’s Small Business Innovative Research (SBIR) program to develop a means of advancing real-time image enhancement technology. After successfully proposing a solution to this requirement, Intellisense Systems developed and demonstrated the ability to enhance images in low-light and nighttime conditions based on the novel use of a convolutional neural network (CNN). A CNN consists of layers and algorithms that mimic biological neurons, and it requires relatively little pre-processing compared to other image classification algorithms. This method of processing is ideal for image enhancement and video analysis; thus, the Intellisense team implemented it in embedded hardware to help servicemembers with both recognition classification and laser pointing.
The machine-learning specialists at Intellisense devised this real-time image enhancement system to detect military-relevant targets in both still images and live video. Using a tablet, the software could automatically identify key elements in either a photo or video and present bounding boxes, icons, or color highlighting to key points of interest. The operator can then select an area of the image or video stream for enhancement. From here, the CNN begins its layer processing, increasing the image’s resolution and improving its contrast, acuity, and stability. It can eliminate motion blur or make out certain items that were previously undetectable, like a weapon in an enemy combatant’s hand or text on a mobile phone.
To further relieve the effort required by personnel, the CNN employs unsupervised learning and processing methodology. This means the network uses datasets as examples. As a result, it processes data without a specific answer or outcome to identify. Instead, the system can automatically determine the structure of the data (in this case, a still image or live video) without human input. This enables the CNN to identify patterns based on the datasets and autonomously enhance pictures and video, taking some of the burden off the system’s human users.
This system’s processing can be completed via a tablet to bolster the ease of use and mobility. It is compatible with the next generation of the U.S. Armed Forces’ handheld targeting system, which can be packaged into compact housing and mounted into a variety of locations. This innovation enables remote viewing and control via radio network. Additionally, the open-architecture approach makes this solution compatible with various software implementations. This enables real-time image enhancement algorithms to be developed independently of the armed forces’ new handheld targeting system.
With this solution, service members can gather intelligence and acquire targets with greater success and efficiency, all while keeping civilians and non-combatants out of harm’s way. And with funding from the SBIR, this solution can be commercialized to serve in other applications, like search-and-rescue missions and emergency response. Over the next few years, Intellisense will continue to train and refine these CNN algorithms so that real-time image enhancement can greatly improve decision-making, reduce users’ workload in detecting key information, and most importantly, save lives.
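The convolution step at the heart of such CNN-based enhancement can be sketched in a few lines. This is illustrative only: real systems stack many learned layers, whereas this uses a single fixed sharpening kernel (and omits the kernel flip of true convolution, which makes no difference for a symmetric kernel).

```python
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2-D convolution of a grayscale image with a small kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# A classic sharpening kernel: it boosts the centre pixel against its
# neighbours, increasing local contrast and acuity - one of the effects
# enhancement CNNs learn to produce with trained kernels.
sharpen = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)

img = np.ones((5, 5))
img[2, 2] = 2.0                 # a single bright pixel on a flat background
enhanced = conv2d(img, sharpen)  # the bright pixel stands out more strongly
```

A CNN layer is exactly this operation repeated with many kernels whose weights are learned from data rather than hand-chosen.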

TT
https://www.intellisenseinc.com/
 
  • Like
  • Fire
  • Love
Reactions: 13 users

Kachoo

Regular
Such vehicles would have to use technology that would prevent them from driving if other fuels are used, the draft said.

I said above "if I were the CTO", and that some of you know the board personally. So how could an email from a tiny retail investor carry more weight than your word? We are not just 1000 eyes. We share. Don't we?

___
Dear FF, don't get me wrong. Others here are in a much better position to address this. Nobody takes me seriously. Does that help BRN? I think not.
I think if you have ideas and information as a shareholder, you should contact them. They will take a look at it, and they know better, as they have the skills to understand it.

The wine-tasting sensor may be different from the ones you speak of. Sensors are built to look for compound A or B or C, maybe more. And yes, Akida could process the data and predict what fuels are there. Some sensors may not be suitable - there are millions of sensors that do various things.

But your idea is valid, so yes, bring it up to them. Likely they will thank you and pass it on to the right person, who will know whether this can work.

To be honest, they would appreciate constructive support over continued questions about why this or that is happening.

Cheers
 
  • Like
  • Fire
  • Love
Reactions: 11 users

Quatrojos

Regular
I am an engineer from the automotive industry, and I can only say: it is not identical. I mentioned that there is much more water in it, for example.
Fill up your car with e-fuels tomorrow and we'll continue the discussion.
We only blend some of it in here so that the engines don't break down. I don't know how it is in Australia or the USA. For me it is clear fact.
OK, I thought you meant that, as a result of its origin, the methanol molecule was somehow different. If there's a difference in the proportion of free H2O molecules between e-methanol and the original recipe, why couldn't they just dress up original-recipe methanol by adding water?
 

cosors

👀
I think if you have ideas and information as a shareholder, you should contact them. They will take a look at it, and they know better, as they have the skills to understand it.

The wine-tasting sensor may be different from the ones you speak of. Sensors are built to look for compound A or B or C, maybe more. And yes, Akida could process the data and predict what fuels are there. Some sensors may not be suitable - there are millions of sensors that do various things.

But your idea is valid, so yes, bring it up to them. Likely they will thank you and pass it on to the right person, who will know whether this can work.

To be honest, they would appreciate constructive support over continued questions about why this or that is happening.

Cheers
I'm thinking about it; I am just very shy. I was thinking of those who know the board personally - whom they would actually listen to. I had exactly this situation with Talga, and I wrote to them then; a year later I realised that exactly that was happening. It is not up to me. But this is not about a year - it is about one thing leading to another. Where are our sensors for the wine testing?

I am quiet now.
 
  • Like
Reactions: 5 users

cosors

👀
OK, I thought you meant that, as a result of its origin, the methanol molecule was somehow different. If there's a difference in the proportion of free H2O molecules between e-methanol and the original recipe, why couldn't they just dress up original-recipe methanol by adding water?
The water comes from the manufacturing process.
 
  • Like
Reactions: 4 users