BRN Discussion Ongoing

" Hey Fact Finder "
Looks to be Nvidia, are they working with Brainchip ?

The article you have found is from October 2021 and states that the system is being used in the AMG SL model. The article I am referring to is from this year, 2022, and states: “Mercedes car will start with the market launch of the new A-Class. This is the first model to feature the Mercedes-Benz User Experience (MBUX).” My article states it is a new MBUX system.

So while it is entirely possible that Nvidia is still supplying GPUs, the new system is clearly not the MBUX system that has been in use since 2018 and that appears in the AMG SL released in October last year, 2021.

It has been pointed out by others that Nvidia has for a number of years been Mercedes-Benz's main computing partner. However, it is clear that with the release of the concept vehicle in late January 2022, which named Brainchip in connection with "Hey Mercedes" and other matters, there has been a drift away from Nvidia to some extent.

Rob Telson has spoken to the fact that, rather than seeing Nvidia as a competitor, he sees the relationship as more of a partnership moving forward.

The idea that Nvidia and Brainchip are both working with Mercedes at the same time and not interacting seems fanciful.

Nvidia does not provide the same type of technology as Brainchip, and Brainchip does not provide the same type of technology as Nvidia. So unless the engineers at Mercedes had a main workshop with two breakaway rooms, each with its own external exit, kept the identities of the Nvidia and Brainchip engineers secret from one another, and did all the integration themselves, it seems logical to believe that Nvidia and Brainchip must have worked together on the Mercedes concept vehicle.

Accepting this, and given how Mercedes broke the news about AKIDA's greater efficiency, I would like to think that Nvidia is well aware of just what Brainchip has achieved and will be interested in engaging with Brainchip beyond this single project; as Rob Telson said, more as partners than competitors.

My opinion only DYOR
FF

AKIDA BALLISTA
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 37 users

Bravo

If ARM were an arm, BRN would be its biceps💪!
  • Like
Reactions: 3 users

Diogenese

Top 20
The article you have found is from October 2021 and states that the system is being used in the AMG SL model. The article I am referring to is from this year, 2022, and states: “Mercedes car will start with the market launch of the new A-Class. This is the first model to feature the Mercedes-Benz User Experience (MBUX).” My article states it is a new MBUX system.

So while it is entirely possible that Nvidia is still supplying GPUs, the new system is clearly not the MBUX system that has been in use since 2018 and that appears in the AMG SL released in October last year, 2021.

It has been pointed out by others that Nvidia has for a number of years been Mercedes-Benz's main computing partner. However, it is clear that with the release of the concept vehicle in late January 2022, which named Brainchip in connection with "Hey Mercedes" and other matters, there has been a drift away from Nvidia to some extent.

Rob Telson has spoken to the fact that, rather than seeing Nvidia as a competitor, he sees the relationship as more of a partnership moving forward.

The idea that Nvidia and Brainchip are both working with Mercedes at the same time and not interacting seems fanciful.

Nvidia does not provide the same type of technology as Brainchip, and Brainchip does not provide the same type of technology as Nvidia. So unless the engineers at Mercedes had a main workshop with two breakaway rooms, each with its own external exit, kept the identities of the Nvidia and Brainchip engineers secret from one another, and did all the integration themselves, it seems logical to believe that Nvidia and Brainchip must have worked together on the Mercedes concept vehicle.

Accepting this, and given how Mercedes broke the news about AKIDA's greater efficiency, I would like to think that Nvidia is well aware of just what Brainchip has achieved and will be interested in engaging with Brainchip beyond this single project; as Rob Telson said, more as partners than competitors.

My opinion only DYOR
FF

AKIDA BALLISTA
Hi FF,

For those who missed yesterday's episode, Akida's SNN SoC recognizes and classifies sensor data.

Previously this work was done on programmable CPUs/NPUs using software programs implementing CNNs (Convolutional Neural Networks), which are very heavy on computation, slow, and power-hungry. The CPU/NPU then had another program to utilize the output of the CNNs, for example, generating a voice response or a steering response. A major portion of the power and time is consumed in the CNN process.

Akida is many times faster and more power-efficient than a CNN, mainly because of the sparsity of event (change) processing as compared to processing digital bytes, as well as because it avoids the von Neumann bottleneck of CPUs in retrieving data from memory and in reading/implementing program instructions.

In Akida, on the other hand, the operation time is merely the time for transistor switch operations to propagate through the NN (quasi-)*asynchronously when the input event data triggers a spike in the first layer of digital neurons. There is no need for Akida to continually retrieve and implement instructions.

The output of Akida can be substituted for the output of the CNN program as an input to the programmable processor (CPU/GPU), greatly reducing the overall power requirements and improving the timeliness of the response.

* "quasi" because Akida does packetize the input data.

The programmable processor could be provided by ARM/Intel/Nvidia ...
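
For anyone who thinks better in code, here is a rough sketch of the two processing chains described above. Everything in it (the function and method names, the shape of the calls) is invented for illustration only; it is not BrainChip's, Nvidia's or anyone else's real API.

```python
# Rough sketch only: invented placeholder names, not real BrainChip/Nvidia APIs.

def conventional_pipeline(sensor_frames, cnn, host_app):
    """CPU/NPU runs a software CNN on every full frame, then a second
    program acts on the classification: compute- and power-hungry."""
    responses = []
    for frame in sensor_frames:        # every frame is processed in full
        features = cnn.infer(frame)    # heavy matrix maths plus constant instruction fetches
        responses.append(host_app.act_on(features))
    return responses


def akida_front_end_pipeline(sensor_events, akida, host_app):
    """Akida classifies sparse events in hardware; only its small output
    is handed on to the programmable processor (ARM/Intel/Nvidia/...)."""
    responses = []
    for event in sensor_events:        # only changes (events) arrive at all
        label = akida.classify(event)  # spikes propagate through the digital SNN
        if label is not None:          # most events produce no downstream work
            responses.append(host_app.act_on(label))
    return responses
```

The only point of the sketch is the division of labour: the classification load moves out of the software CNN into the SNN, and the host processor only ever sees the result.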
 
  • Like
  • Fire
  • Love
Reactions: 35 users

Diogenese

Top 20
That's quite a list! From a quick 1 minute perusal it doesn't appear that they have anything relating to edge technologies?
This one could use Akida in speech recognizer 218 and parser 220. Interestingly (or not), the inventors are from Germany.

WO2021138200A1 LOCATION-AWARE REMINDERS

A method for managing location-aware reminders in an automobile includes monitoring a geographic location of the automobile using a computer system installed in the vehicle. The computer system detects that the automobile has entered a geographic region associated with a location-aware reminder and issues a reminder message associated with the location-aware reminder to a driver of the automobile based on the detecting.
[Attached: patent figure]
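
As a minimal sketch of the geofence check the abstract describes (my own illustration; the haversine test, names and numbers are invented, and the patent does not specify an implementation):

```python
import math

EARTH_RADIUS_M = 6_371_000

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes in metres (haversine)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def check_reminders(vehicle_fix, reminders):
    """Issue any reminder whose geographic region the vehicle has entered."""
    lat, lon = vehicle_fix
    for r in reminders:
        if distance_m(lat, lon, r["lat"], r["lon"]) <= r["radius_m"]:
            print(f"Reminder: {r['message']}")

# Invented example: remind the driver when within 200 m of the pharmacy.
check_reminders(
    (48.137, 11.575),
    [{"lat": 48.138, "lon": 11.576, "radius_m": 200, "message": "Pick up prescription"}],
)
```

A speech recognizer and parser (the 218/220 blocks mentioned above) would sit in front of this, turning "remind me when I'm near the pharmacy" into one of those reminder records.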
 
  • Like
  • Fire
Reactions: 23 users
Hi FF,

For those who missed yesterday's episode, Akida's SNN SoC recognizes and classifies sensor data.

Previously this work was done on programmable CPUs/NPUs using software programs implementing CNNs (Convolutional Neural Networks), which are very heavy on computation, slow, and power-hungry. The CPU/NPU then had another program to utilize the output of the CNNs, for example, generating a voice response or a steering response. A major portion of the power and time is consumed in the CNN process.

Akida is many times faster and more power-efficient than a CNN, mainly because of the sparsity of event (change) processing as compared to processing digital bytes, as well as because it avoids the von Neumann bottleneck of CPUs in retrieving data from memory and in reading/implementing program instructions.

In Akida, on the other hand, the operation time is merely the time for transistor switch operations to propagate through the NN (quasi-)*asynchronously when the input event data triggers a spike in the first layer of digital neurons. There is no need for Akida to continually retrieve and implement instructions.

The output of Akida can be substituted for the output of the CNN program as an input to the programmable processor (CPU/GPU), greatly reducing the overall power requirements and improving the timeliness of the response.

* "quasi" because Akida does packetize the input data.

The programmable processor could be provided by ARM/Intel/Nvidia ...
You are the perfect partner to have backing up my fingers crossed understanding of how the AKIDA technology works.

I hope all here show you the proper appreciation. FF

AKIDA BALLISTA hand in hand with ARM/Intel/Nvidia in 2022 and beyond.
 
  • Like
  • Fire
Reactions: 30 users

VictorG

Member
You are the perfect partner to have backing up my fingers crossed understanding of how the AKIDA technology works.

I hope all here show you the proper appreciation. FF

AKIDA BALLISTA hand in hand with ARM/Intel/Nvidia in 2022 and beyond.
Oh I truly appreciate Diogenese for all his posts, the things I understand and the things I pretend to understand. If there were a legend badge award, I would nominate both him and you, FF.

VG
 
  • Like
  • Fire
Reactions: 21 users
Ok here is something that should occupy the 1,000 Eyes for the next few weeks.

Ford entered an EAP (Early Access Program) with Brainchip for ADAS and AV in May 2020.

In late August 2020, Ford filed the following patent:


Everyone here has read about Neuralink and a host of other research relating to brain-machine interfaces, and every single one requires the ability to recognise the signals produced by the brain and/or muscles of the body. In fact we have had a constant flow of "could this be AKIDA?" posts, because spiking neural networks seem to be what is required to process these signals coming from the brain efficiently enough for robotics and artificial limbs.

We know that Ford did not want to be disclosed, but the ASX forced Brainchip's hand on this issue. We know that Ford is often spoken of in hushed tones by Brainchip, or not at all, but Ford continues in the background as an EAP participant.

The above patent is so revolutionary in regard to autonomous vehicle management that it would certainly justify the type of secrecy around what Brainchip and Ford are doing. I will leave it to you to decide: what other company, engaged with Ford in 2020 and still engaged today, could be providing the necessary technology for this purpose?

My opinion only DYOR
FF

AKIDA BALLISTA
Hi everyone and Dio in particular,

I have been marshalling an understanding of this Ford patent and how it works, in order to put forward my own lay explanation of the problem and of Ford's solution.

Not having an autonomous vehicle in my garage, I was unaware of the problem Ford has identified until I read the patent, but I think it is as follows:

I am driving along in my up-to-Level-4 autonomous vehicle, which still requires me to monitor the road ahead.

Currently there are head-tilt camera warning systems that try to detect whether I am carefully observing the road ahead. These systems are not foolproof: I am not a perfect human specimen and can be watching even when, to a head-tilt camera, I appear not to be.

Even radar or lidar systems watching my eyes and facial expressions for fatigue fall down for the same reason.

But, you say, surely it is a good thing that the autonomous vehicle will not depend upon me: it will take action to avoid the unfolding changes in traffic that it thinks I have not seen, issuing me an alert to pay attention while it hits the brakes and pulls to the left, or whatever.

Well, according to Ford, this is not a good thing.

I suspect that when you consider that at highway speeds lots of things can happen very, very quickly, giving me a warning or taking action can interfere with the action I am about to take: unbeknown to the autonomous vehicle, and despite how I appear to its system, I am actually watching the road ahead and have a planned course of action which is at variance with the vehicle's planned course of action. This leads to the autonomous vehicle and myself both acting at the same time, possibly at odds with one another.

This situation, in my mind, is analogous to a passenger in a non-autonomous vehicle, fearing that I was not watching the roadway ahead, grabbing the wheel at the same moment I was taking action myself. That might not be very pretty.

So the revolutionary idea behind this patent is this: if the autonomous system were able to monitor the driver's brain signals or muscle contractions early enough, or in real time, it could tell that, despite my appearance, I was watching the semi-trailer jack-knifing across the three lanes ahead and was already in the process of taking the required action.

Knowing this, the autonomous system would not interfere with what I was doing and thereby distract me, causing an adverse outcome in such circumstances.
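
As a rough sketch of that arbitration idea (my own illustration only; the thresholds, scores and names below are invented and are not taken from the Ford patent):

```python
# Invented illustration of the intervention arbitration described above;
# thresholds and signal names are placeholders, not from the Ford patent.

HAZARD_THRESHOLD = 0.8      # confidence that the traffic ahead is dangerous
REACTION_THRESHOLD = 0.7    # confidence that the driver is already reacting

def should_vehicle_intervene(hazard_score, driver_bio_score):
    """Intervene only when a hazard is detected AND the driver's brain/muscle
    signals show no evidence of a reaction already in progress."""
    hazard_detected = hazard_score >= HAZARD_THRESHOLD
    driver_reacting = driver_bio_score >= REACTION_THRESHOLD
    return hazard_detected and not driver_reacting

# Jack-knifing semi-trailer ahead, but the bio signals say the driver is
# already braking and steering: the system holds off rather than fight them.
print(should_vehicle_intervene(hazard_score=0.95, driver_bio_score=0.90))  # False
# Same hazard, no sign of a driver response: the system acts.
print(should_vehicle_intervene(hazard_score=0.95, driver_bio_score=0.10))  # True
```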

The thing about this system, though, is that it could not possibly depend upon the cloud: the autonomous system is measuring my brain and muscle response immediately before it would intervene, so it all needs to happen in microseconds or less. Even a one-second delay at 110 kilometres per hour on a wet, downhill expressway would be too long to delay action by either myself or the autonomous system.
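
For a sense of scale, the simple arithmetic behind that point (the only input is the quoted 110 km/h):

```python
# Distance covered while a decision is delayed, at the quoted highway speed.
speed_kmh = 110
for delay_s in (1.0, 0.1, 0.001):
    metres = speed_kmh / 3.6 * delay_s      # km/h -> m/s, times the delay
    print(f"{delay_s:7.3f} s delay -> {metres:5.2f} m travelled")
# 1 s ~ 30.6 m, 0.1 s ~ 3.1 m, 1 ms ~ 0.03 m of wet road
```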

My excitement about this patent comes from the fact that in all my reading about autonomous vehicle development I have never seen this problem identified, and hence this Ford patent is the first time I have read of a possible solution.

This is the ultimate wearable technology and would also have applications in workplaces where robots and humans work together to achieve an outcome.

As I said, AKIDA technology is tailor-made for this job and is at least two to three years ahead of any possible competitors, with a solid patent wall.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 44 users

Diogenese

Top 20
Hi everyone and Dio in particular,

I have been marshalling an understanding of this Ford patent and how it works, in order to put forward my own lay explanation of the problem and of Ford's solution.

Not having an autonomous vehicle in my garage, I was unaware of the problem Ford has identified until I read the patent, but I think it is as follows:

I am driving along in my up-to-Level-4 autonomous vehicle, which still requires me to monitor the road ahead.

Currently there are head-tilt camera warning systems that try to detect whether I am carefully observing the road ahead. These systems are not foolproof: I am not a perfect human specimen and can be watching even when, to a head-tilt camera, I appear not to be.

Even radar or lidar systems watching my eyes and facial expressions for fatigue fall down for the same reason.

But, you say, surely it is a good thing that the autonomous vehicle will not depend upon me: it will take action to avoid the unfolding changes in traffic that it thinks I have not seen, issuing me an alert to pay attention while it hits the brakes and pulls to the left, or whatever.

Well, according to Ford, this is not a good thing.

I suspect that when you consider that at highway speeds lots of things can happen very, very quickly, giving me a warning or taking action can interfere with the action I am about to take: unbeknown to the autonomous vehicle, and despite how I appear to its system, I am actually watching the road ahead and have a planned course of action which is at variance with the vehicle's planned course of action. This leads to the autonomous vehicle and myself both acting at the same time, possibly at odds with one another.

This situation, in my mind, is analogous to a passenger in a non-autonomous vehicle, fearing that I was not watching the roadway ahead, grabbing the wheel at the same moment I was taking action myself. That might not be very pretty.

So the revolutionary idea behind this patent is this: if the autonomous system were able to monitor the driver's brain signals or muscle contractions early enough, or in real time, it could tell that, despite my appearance, I was watching the semi-trailer jack-knifing across the three lanes ahead and was already in the process of taking the required action.

Knowing this, the autonomous system would not interfere with what I was doing and thereby distract me, causing an adverse outcome in such circumstances.

The thing about this system, though, is that it could not possibly depend upon the cloud: the autonomous system is measuring my brain and muscle response immediately before it would intervene, so it all needs to happen in microseconds or less. Even a one-second delay at 110 kilometres per hour on a wet, downhill expressway would be too long to delay action by either myself or the autonomous system.

My excitement about this patent comes from the fact that in all my reading about autonomous vehicle development I have never seen this problem identified, and hence this Ford patent is the first time I have read of a possible solution.

This is the ultimate wearable technology and would also have applications in workplaces where robots and humans work together to achieve an outcome.

As I said, AKIDA technology is tailor-made for this job and is at least two to three years ahead of any possible competitors, with a solid patent wall.

My opinion only DYOR
FF

AKIDA BALLISTA
Boeing 737 Max is all over this!
 
  • Like
  • Haha
Reactions: 8 users

equanimous

Norse clairvoyant shapeshifter goddess
Brainchip would be beneficial here
[Attached image]
 
  • Like
Reactions: 8 users

M_C

Founding Member

Joining Hands for an Electric Future​

Electric vehicles have become the foundation of the new mobility paradigm all over the world. Specific segments, such as two-wheelers and light EVs, are gaining traction in emerging markets because they fit multiple use cases.
Tata Elxsi and Renesas have opened a state-of-the-art Next Generation EV Innovation Center (NEVIC) in Bangalore to develop targeted solutions for electric vehicles (EV).
Renesas will provide their state-of-the-art semiconductor devices, software, and expertise. Tata Elxsi will leverage its experience and expertise in hardware and software engineering to create the reference designs and provide design support for customization and system integration testing.
Through NEVIC's product and service offerings, Tata Elxsi and Renesas will collaborate to accelerate eMobility adoption, particularly in the two-wheeler and light EV segments, assisting clients in overcoming time-to-market and technology complexity.
 
  • Like
  • Fire
Reactions: 15 users

M_C

Founding Member
Interesting Forbes article about the foundry state of play in the U.S. and, more specifically, INTEL. Interestingly, Rob Telson is a fan too..........


System on a Package​

Many cloud service providers as well as startups are designing multi-chiplet platforms for workloads such as AI, placing accelerators, CPUs, and I/O dies (possibly from different manufacturing nodes) on a package. “System on a Package” is the new mantra, but there are few companies that have the required technology and available fabrication capacity to make it real. In the old days, Intel saved the underlying technologies such as EMIB for multi-die packages as a differentiator for Intel products. Now Intel is stepping up and making this approach available for all comers. Is Intel becoming Open? Sounds like it to me.

Conclusions​

I’ve been in this industry for over four decades, and I am not exaggerating when I say this is perhaps one of the most impactful strategic announcements I have seen. But it may take years to see it pay off for Intel: will long-time competitors decide to trust Intel for critical services and fab capacity? They may not have much choice. The capacity constraints at TSMC and Samsung for advanced process wafer starts may last years. It takes 3 years, minimum, to build a new fab, so new capacity is slow to come on-line. But nimble companies will see Intel’s commitment and investment in RISC-V as just the force needed to reach a tipping point, and will quickly get in line for wafer start slots at Intel Foundry Services.

[Attached image]
 
  • Like
  • Fire
Reactions: 13 users

TheFunkMachine

seeds have the potential to become trees.
  • Like
Reactions: 6 users

FJ-215

Regular
Weeks, maybe a couple of months at most imo: Russia will withdraw or Ukraine will capitulate.

Really no other outcomes here.

BRN likely going under $1 Monday or Tuesday going by USA markets..
G'day Hittman,

I know this is going to sound heartless and it is not my personal take....BUT!!

The stock market doesn't give a flying F@#% about the people of The Ukraine!!!!

Ever see the movie "Charlie Wilson's War"? And after that, somehow America (and us) got sucked into Afghanistan. Go figure!
How are the people of Syria going after the US Congress voted to bankroll "opposition forces"?

Does this sound familiar?


Did The Ukraine court NATO or did NATO court The Ukraine??? Doesn't really matter now. Russia had a buffer around themselves with the USSR; that's gone, and they don't want NATO's weapons on their doorstep in any of the former states.

And Where the Fuck Is NATO?

Article 5!!!

One in all in???

What is different this time is the captains of industry have pulled the plug on Russia. It's not just State Vs State anymore.

No More Posturing......

The market has taken on a nuclear state that does not care about money; it only cares about its own survival....

The world needs some Akida Logic really quick!!!


Then again, it is the weekend and I've had a few bevvies. Such is life!!
 
  • Like
Reactions: 9 users
Hi FF, in one of your replies to me you talked about AGI (artificial general intelligence). I've read somewhere, in one of Brainchip's articles, of them talking about their tech as encompassing (for lack of a better word) AGI. I'm not sure if you've come across this article, but they are saying that they can't see it happening until the end of the century 😯 so WOW, are Brainchip that far advanced?

How imminent is AGI?
In predicting that AGI won’t arrive until the year 2300, Rodney Brooks, an MIT roboticist and co-founder of iRobot, doesn’t mince words: “It is a fraught time understanding the true promise and dangers of AI. Most of what we read in the headlines… is, I believe, completely off the mark.”

The article 👇

There is an earlier McKinsey report on this point, I think from 2018, which I posted over on HC, and I was trolled relentlessly for it by a sad individual who claimed to be a driving instructor. In that report they suggested that incremental on-chip learning would not occur until 2028.

Of course Brainchip were already claiming they would produce AKD1000, which could do this, which would have placed them 10 years ahead at that point.

I am not sure if you have read the whole of the report you posted, but others predict the date for AGI to be sooner than 2300.

When AKD2000 with LSTM comes out later this year, then based on McKinsey's analysis over time, I personally believe Brainchip will be close to 10 years ahead, rather than the 5 years offered by Brainchip.

My opinion only DYOR
FF

AKIDA BALLISTA
 
Last edited:
  • Like
  • Fire
Reactions: 24 users

Baisyet

Regular
Ok here is something that should occupy the 1,000 Eyes for the next few weeks.

Ford entered an EAP (Early Access Program) with Brainchip for ADAS and AV in May 2020.

In late August 2020, Ford filed the following patent:


Everyone here has read about Neuralink and a host of other research relating to brain-machine interfaces, and every single one requires the ability to recognise the signals produced by the brain and/or muscles of the body. In fact we have had a constant flow of "could this be AKIDA?" posts, because spiking neural networks seem to be what is required to process these signals coming from the brain efficiently enough for robotics and artificial limbs.

We know that Ford did not want to be disclosed, but the ASX forced Brainchip's hand on this issue. We know that Ford is often spoken of in hushed tones by Brainchip, or not at all, but Ford continues in the background as an EAP participant.

The above patent is so revolutionary in regard to autonomous vehicle management that it would certainly justify the type of secrecy around what Brainchip and Ford are doing. I will leave it to you to decide: what other company, engaged with Ford in 2020 and still engaged today, could be providing the necessary technology for this purpose?

My opinion only DYOR
FF

AKIDA BALLISTA
When the question about Ford came up I went and checked all the announcements. Like you said, FF, the ASX forced Brainchip's hand; otherwise, on 24 May 2020, the below was announced.
 
Last edited:
  • Like
Reactions: 3 users

FJ-215

Regular
PS.... Call out to Dingo....

Really missing your contributions.

Or did you change identities?
 
  • Like
  • Fire
Reactions: 9 users

BaconLover

Founding Member
Do you think it was an organised push-down to make sure BRN stays in the Top 300?

Short answer, in my opinion: no. It wouldn't have made much difference anyway in this market environment.

Let us look at an example from last year. On March 22, 2021, BRN was added to the ASX 300 list. It had a run-up, but then settled down a few weeks later. I remember this because I remember waiting, though there was a lot of excitement around this time in terms of buying by the superannuation funds. Though addition to the ASX 300 is great news for the company and shareholders, that alone isn't enough to warrant an organised push-down in my opinion. They probably did some shorting activities etc. as usual, but that wouldn't be to make sure BRN doesn't enter the Top 200. From memory, the new list consisted mainly of commodity stocks, which makes sense in the current market situation.

If anything, I believe the instos would have used this opportunity to do another run up and take profits, but the whole market scenario didn't favour that play for tech sector.

If you look at the photo below, the initial run-up was in March last year, leading up to the ASX 300 inclusion, and you can see what happened soon after. (Yes, I understand any of these buys would be around 100% profit now, but I'm just saying the instos will make money when they can and wouldn't have missed the opportunity to do so if they had the chance.)
[Attached chart]
 
  • Like
  • Thinking
Reactions: 15 users

Diogenese

Top 20
Btw how so two different systems doing the same thing?

One thing I know is that I thought Ford held the most patents for autonomous tech, but Bosch ranked highest for autonomous driving. It's hard to keep up as things are moving exponentially... Brainchip may have put a spanner in the works. 🤔

Bosch 👇


Ford 👇

Hi Frogstar,

The autonomous driving patent stats relate to patent applications filed up to 2017, so will not include any Akida-related application. Not all autonomous driving patents will relate to NNs. Those that include NNs will probably relate to software CNNs, or, if there are any hardware SNNs, they will be analog SNNs which would not have been capable of on-chip learning, and would suffer from the inaccuracies caused by manufacturing inconsistencies. Just because something is patented does not mean it makes it to production.

As patent applications are not published for 18 months, there may be some Akida-related (digital SNN) autonomous driving patent applications in the pipeline now.
 
  • Like
  • Fire
Reactions: 17 users

Diogenese

Top 20
Btw how so two different systems doing the same thing?

One thing I know is that I thought Ford held the most patents for autonomous tech, but Bosch ranked highest for autonomous driving. It's hard to keep up as things are moving exponentially... Brainchip may have put a spanner in the works. 🤔

Bosch 👇


Ford 👇

"Btw how so two different systems doing the same thing?"

As Ella always reminds us:
"'tain't what you do,
it's the way that you do it."
 
  • Like
  • Haha
  • Fire
Reactions: 13 users

Boab

I wish I could paint like Vincent
Woo hoo, check this out JD, Valeo are getting into / are already in the VR/XR space 🤓 Sad also, can't see them rolling them out in old people's homes 😑
Anyway, Valeo on the pulse.. 😉


That is so cool.
 
  • Like
Reactions: 5 users