BRN Discussion Ongoing

TheDon

Regular
It's pretty green on my screen and it's going to finish green today.
My opinion only

TheDon
 

Attachments

  • Screenshot_20220221-124014_Google.jpg
    321.4 KB · Views: 96
  • Like
Reactions: 8 users

Realinfo

Regular
What are your thoughts on BRN making the ASX 200 in March? Our SP has dropped over the last 2 weeks and further red looks likely ATM.
Me myself personally, I believe it's a lay-down misère.

Given it's decided largely on market cap, and that over thirty of the ASX 200 companies have an MC well south of $2B, we could go down into the $0.90s and still qualify ahead of many of these companies.

Correct me if I'm wrong… the big players will then be obliged to hold an index-weighted parcel. Won't that be a hoot!!
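The index-inclusion arithmetic above is easy to sanity-check. Here's a minimal sketch in Python; the share count is my own placeholder assumption, not a company figure, so DYOR:

```python
# Back-of-envelope market-cap check for the ASX 200 discussion above.
# SHARES_ON_ISSUE is a placeholder assumption, not an official figure.
SHARES_ON_ISSUE = 1.7e9  # assumed ~1.7 billion shares on issue


def market_cap(share_price: float, shares_on_issue: float = SHARES_ON_ISSUE) -> float:
    """Market capitalisation = price per share x shares on issue."""
    return share_price * shares_on_issue


ASX200_SOFT_FLOOR = 2.0e9  # the rough "$2B" comparison point from the post

for price in (0.90, 1.28, 2.30):
    cap = market_cap(price)
    flag = "above" if cap > ASX200_SOFT_FLOOR else "below"
    print(f"${price:.2f} -> market cap ${cap / 1e9:.2f}B ({flag} the $2B mark)")
```

On that assumed share count, even $0.90 implies a market cap around $1.5B, which is the point being made: still ahead of many existing ASX 200 constituents.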
 
  • Like
Reactions: 13 users

RobjHunt

Regular
WTF is going on?...$1.28...Should I be worried?...this is killing me.
Well, that's how I feel.
But in reality, if it was back to $2.30 I doubt I'd be selling any.
It's just a feel good thing!
Pantene ole mate.
 
  • Like
Reactions: 15 users

Dhm

Regular
Everywhere I go by following the science I find good news for AKIDA technology. It has been stated by the former CEO Mr. Dinardo and echoed by others that AKIDA has found a sweet spot in Lidar. The following recent news article is a huge positive then:


My opinion only DYOR
FF

AKIDA BALLISTA
I have a 2 yo Tesla 3 and love it. One of the safety selling points was radar technology that was able to 'view' the car two in front of me, and if that car were to stop or at least slow aggressively then the Tesla AI would immediately know about it. For Musk to dismiss radar technology surprises me, as that extra layer of safety should be paramount. Now, I may be looking at this issue too simplistically, but if Musk sees the future as only cameras as defence, I wouldn't be backing that horse.

Edit: US futures up 0.5% at present.
 
Last edited:
  • Like
Reactions: 18 users
Hi all

I don't post much, especially with the HotPot of crap over there. It's nice to be a part of this group and observe its sanity, so thank you for accepting the request.

I'll try and engage with sharing of info and debate along the way, but no guarantees; there are much smarter and more time-rich people than myself.

good luck to us all !

SS
 
  • Like
  • Love
Reactions: 24 users

Newk R

Regular
Hi Newk R,

Where I have always found the positive with Brainchip is in the technology itself and the integrity of the company's personnel.

Earlier I posted the following report from 2019 and suggested that it was a must read for every investor. This is the link:


In reading this report I found the following statement by Brainchip to the report writer:

"Brainchip was involved with a successful proof of concept demonstration with an automotive system supplier who wanted to see if the company's SNN technology could infer the height of objects based on output from ultrasound sensors. It would be helpful to know, for example, if a sensed object is indeed an obstacle or if it is short enough that the vehicle could comfortably back over it."

Taking the report as a whole and knowing the full history of Brainchip, it is a reasonable assumption that Valeo was the automotive system supplier involved. I posed the question in my post regarding how many reversing sensors are being fitted to motor vehicles of every description around the world every single day. I do not know and have not even bothered to look, because it is obviously a massive market, and this is just one type of sensor that will benefit from being made smart.

Think about the other applications. What about such sensors with AKIDA intelligence for the visually impaired? What about such sensors for estimating the height of patients in medical settings? What about pool gates that restrict access to children under a certain height? What about a rental car automatically sensing the height of the next unseen before driver and adjusting the seat as they are about to enter?

Anyway, as I say, if I need to raise my confidence levels I ignore the price and follow the science. When Brainchip shares collapsed to 4 cents, science and my belief in the company's personnel carried me through.

My opinion only DYOR
FF

AKIDA BALLISTA
Thanks FF, my confidence is on an upward trajectory again.
 
  • Like
  • Love
Reactions: 10 users

Baisyet

Regular
Thanks FF, my confidence is on an upward trajectory again.
Hey @Newk R, it happens to me all the time when I hear no news and the SP is going down. But I tend to stay strong and reach out to the forum, and guess who comes to boost our faith: our respected @Fact Finder :)
 
  • Like
  • Love
Reactions: 17 users

mcm

Regular
Dow futures up around 100 points. Perhaps the invasion may still be avoided: https://www.canberratimes.com.au/story/7628109/biden-agrees-to-ukraine-summit-with-putin/?cs=14232

Putin cites security as a reason why he doesn't want a potential NATO member on his border ... however that is nonsense IMO, given Russia is a superpower and will never be invaded, because it would result in WW3 and the destruction of the planet.

If Putin does invade Ukraine it will be purely an ego thing ... a legacy he wants to leave behind as being the one who brought Ukraine back into the folds of Mother Russia ... and he doesn't care how many lives will be lost in the process and the immense suffering he'll wreak on the citizens of Ukraine as well as his own country.

'Tis to be hoped sanity prevails and Putin withdraws his troops.

While BRN's share price is down I think if there is one stock that will benefit from global tensions it will be BRN given the important role Akida is likely to play in the future defence capabilities of the US and its allies. I'm certainly holding tight.
 
  • Like
  • Thinking
  • Fire
Reactions: 30 users
I have a 2 yo Tesla 3 and love it. One of the safety selling points was radar technology that was able to 'view' the car two in front of me, and if that car were to stop or at least slow aggressively then the Tesla AI would immediately know about it. For Musk to dismiss radar technology surprises me, as that extra layer of safety should be paramount. Now, I may be looking at this issue too simplistically, but if Musk sees the future as only cameras as defence, I wouldn't be backing that horse.

Edit: US futures up 0.5% at present.
What you raise about safe driving is in fact what I was taught by the Police Driving Instructors at St Ives a lifetime ago. Never drive just to the signals coming from the vehicle immediately ahead of you but to the signals coming from as many vehicles as you can take notice of ahead of you. This is a core attribute of all defensive driving techniques.

When I watched the Valeo Lidar video the fact that it was capable of monitoring other vehicles not within the driver's direct line of sight was the most impressive part of its capabilities particularly when it showed it was able to keep tabs on the motorcycle that was on the outside of the truck travelling next to the Valeo Lidar protected vehicle.

Ordinary cameras cannot see through solid objects such as semi trailers. I cannot see the rationale for pursuing camera only just because that is how humans do it. I thought the whole idea of autonomous vehicles was to create something safer and better than human drivers??

My opinion only and he is the billionaire not me so DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
Reactions: 35 users

Dhm

Regular
What you raise about safe driving is in fact what I was taught by the Police Driving Instructors at St Ives a lifetime ago. Never drive just to the signals coming from the vehicle immediately ahead of you but to the signals coming from as many vehicles as you can take notice of ahead of you. This is a core attribute of all defensive driving techniques.

When I watched the Valeo Lidar video the fact that it was capable of monitoring other vehicles not within the driver's direct line of sight was the most impressive part of its capabilities particularly when it showed it was able to keep tabs on the motorcycle that was on the outside of the truck travelling next to the Valeo Lidar protected vehicle.

Ordinary cameras cannot see through solid objects such as semi trailers. I cannot see the rationale for pursuing camera only just because that is how humans do it. I thought the whole idea of autonomous vehicles was to create something safer and better than human drivers??

My opinion only and he is the billionaire not me so DYOR
FF

AKIDA BALLISTA
Right this minute I would rather trust your logic than Musk. BTW I always hated Musk sticks. Much preferred Redskins.
 
  • Like
  • Haha
  • Thinking
Reactions: 8 users

Rskiff

Regular
What you raise about safe driving is in fact what I was taught by the Police Driving Instructors at St Ives a lifetime ago. Never drive just to the signals coming from the vehicle immediately ahead of you but to the signals coming from as many vehicles as you can take notice of ahead of you. This is a core attribute of all defensive driving techniques.

When I watched the Valeo Lidar video the fact that it was capable of monitoring other vehicles not within the driver's direct line of sight was the most impressive part of its capabilities particularly when it showed it was able to keep tabs on the motorcycle that was on the outside of the truck travelling next to the Valeo Lidar protected vehicle.

Ordinary cameras cannot see through solid objects such as semi trailers. I cannot see the rationale for pursuing camera only just because that is how humans do it. I thought the whole idea of autonomous vehicles was to create something safer and better than human drivers??

My opinion only and he is the billionaire not me so DYOR
FF

AKIDA BALLISTA
To be fair FF, the rationale behind Tesla's pursuit of vision only being better than a human is that the cars have 8 surround cameras as opposed to just one set of eyes. Not saying that it will be best, especially if not using Akida.
 
  • Like
Reactions: 5 users
To be fair FF, the rationale behind Tesla's pursuit of vision only being better than a human is that the cars have 8 surround cameras as opposed to just one set of eyes. Not saying that it will be best, especially if not using Akida.
That is why I said he is the billionaire. After seeing what Valeo can do with Lidar, leaving aside the inclusion of AKIDA or not, I just don't understand the complete rejection of Lidar technology and how it could not be complementary to cameras, even if only to keep track of out-of-line-of-sight objects. I would love to understand Tesla's reasoning. When I read about the percentage of drivers on our roads affected by drugs, illogical out-of-the-blue actions will and do occur all too often.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
Reactions: 20 users

Terroni2105

Founding Member
Howdy All,

I just wanted to back-track a little on what we know from the Mercedes Vision EQXX reveal and to try to piece together some further possibilities.

When discussing the MBUX, Sean Hehir hinted in his latest presentation that Akida has more than one use case. This marries up very nicely with a statement from Mercedes which indicates they have been working on 'systems' plural.



Extract 1
“Working with California-based artificial intelligence experts BrainChip, Mercedes-Benz engineers developed systems based on BrainChip’s Akida hardware and software,” Mercedes noted in a statement describing the Vision EQXX.

Extract 2
“Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years,” Mercedes said. “When applied at scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.”


In an interview with the Korea JoongAng Daily, Markus Schäfer said that semiconductors that use less energy contributed to the range, and that many of the key components and concepts that have been developed will be incorporated into production models in two or three years.


Extract 3

Semiconductors that use less energy contributed to the range, according to the CTO.

"The plan is to bring most of the elements of the Vision EQXX into real vehicles in a time frame of 2024 to 2025," he said.


In an interview on the 8th January, Markus Schäfer said Mercedes plans to “make sure we have custom, standardized chips in the car", rather than using a thousand different chips.

Extract 4

Mercedes, for example, plans to use fewer specialized chips in upcoming models and more standardized semiconductors and to write its own software, said Markus Schäfer, a member of the German automaker’s board that oversees purchasing.

In the future, Mercedes will “make sure we have custom, standardized chips in the car,” Mr Schäfer said in an interview Wednesday. “Not a thousand different chips.”


To try and work out the quantity of sensors that may be incorporated I thought that the image that was used in some of the press releases was potentially instructive. The image of the brain shows the spiking neural networks in bright blue dots, so it would make sense if the bright blue dots on the car indicated the location and number of SNN sensors required IMO.




View attachment 1481


The other thing that occurred to me is that if Mercedes recognises how important Akida is to radically reduce energy consumption, then what would prevent them from utilising the same sensor solution (Akida) in their automotive production lines? If Akida can do key word spotting 5 to 10 times better than conventional technology whilst consuming ultra low power, then how many other applications can Akida perform to maximize efficiency from an industrial standpoint (i.e defect detection, vibration analysis and much, much more)?

Let's face it, Mercedes understand how good Akida is. In my opinion it would be incredibly unlikely for them not to expand the use cases for it on their assembly lines.



Extract 1 and 2

Extract 3

Extract 4

Thanks Bravo, this is a fantastic post. Could you also put it in the Mercedes thread?
 
  • Like
Reactions: 14 users

Equitable

Regular
There has been a lot of discussion about the current market downturn, inflationary fears and concern about a Russian invasion of Ukraine.

These things will pass. The share market always climbs the 'wall of worry' and powers on to new heights.

Throughout the Twentieth Century we had two world wars, a flu pandemic that killed many more people than did Covid19, the Great Depression, the oil shock, Korea and Vietnam. While the market was sometimes down for a while, it never failed to continue to move on and provide great returns.

1645422163703.png


From 1900 to 2019, the Australian share market returned 11.8% per annum including dividends.

120 Years of Historical Returns

Don't look at what is happening now, look through and beyond. The issues you read about now in the news are not an endgame in themselves, they will pass and the market will move on. It always has and always will. All IMHO of course.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 74 users

GDJR69

Regular
Agreed, nothing has changed with BRN, SP fluctuations are just mad Mr Market having a bad day as Benjamin Graham would say.
 
  • Like
Reactions: 16 users

Bombersfan

Regular
After a while I just wanted to see the drama on the other side, and guess what, the drama never ends. I never saw this person post anything before there. Hot crapper is just unbelievable.


This guy is the poster:
dwuuuu · 4,242 Posts · 195 · 21/02/22 14:50 · Post #: 59708726
BRN Price at posting: $1.30 · Sentiment: Sell · Disclosure: Not Held
Don’t bring that shit over here.
 
  • Like
Reactions: 6 users

Justchilln

Regular
There has been a lot of discussion about the current market downturn, inflationary fears and concern about a Russian invasion of Ukraine.

These things will pass. The share market always climbs the 'wall of worry' and powers on to new heights.

Throughout the Twentieth Century we had two world wars, a flu pandemic that killed many more people than did Covid19, the Great Depression, the oil shock, Korea and Vietnam. While the market was sometimes down for a while, it never failed to continue to move on and provide great returns.

View attachment 1504

From 1900 to 2019, the Australian share market returned 11.8% per annum including dividends.

120 Years of Historical Returns

Don't look at what is happening now; look through and beyond. The issues you read about now in the news are not an endgame in themselves. They will pass and the market will move on. It always has and always will. All IMHO of course.
Absolutely correct. Can I also just add that we are extremely well financed to ride out a few rough years, and I'm also expecting BrainChip to outperform a bearish market due to rapidly increasing revenue growth.
 
  • Like
Reactions: 24 users

Slade

Top 20
The Mantra that keeps me strong:
Mercedes, Valeo, Renesas and MegaChips
Mercedes, Valeo, Renesas and MegaChips
Mercedes, Valeo, Renesas and MegaChips
 
  • Like
  • Fire
  • Love
Reactions: 47 users
After a while I just wanted to see the drama on the other side, and guess what, the drama never ends. I never saw this person post anything before there. Hot crapper is just unbelievable.


This guy is the poster:
dwuuuu · 4,242 Posts · 195 · 21/02/22 14:50 · Post #: 59708726
BRN Price at posting: $1.30 · Sentiment: Sell · Disclosure: Not Held
He could have a valid point, but for the fact that the purchase price of both products includes prepaid face-to-face engineering and design support hours, which from memory was 20 hours and 10 hours respectively. I would expect Brainchip would cost out Kristopher Carlson and his peers at at least $US350.00 to $US500.00 per hour.

This HC investment adviser has also overlooked that these are products aimed at corporations, and that the consumer product is actually $US499.00.

I expect he does not earn anywhere near what the employees at Brainchip earn and there is an obvious reason for this I would suggest.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
Reactions: 17 users
May have been posted previously, as I recall someone posting an interview with Chittipeddi of Renesas, but maybe this is a different one?

Interesting all the same, and I'll bold a couple of points that fit Akida :)



Handling machine-learning chores at the edge is becoming more common as tools and hardware have improved. Editor Bill Wong talks with Dr. Sailesh Chittipeddi, Executive VP & General Manager for IoT & Infrastructure at Renesas, about artificial intelligence in edge-computing nodes. Check out the video or its transcript:



Wong: Well, artificial intelligence and machine learning at edge-computing nodes is becoming more common these days, especially as microcontrollers and SoCs gain support for machine-learning inference models. Today I'm speaking with Dr. Sailesh Chittipeddi, who is Executive Vice President and General Manager of the IoT & Infrastructure business at Renesas.


Hopefully, we'll be getting a better understanding of this trend. So to start with, the advantages of centralized cloud resources have made AI in the cloud very popular. Where do you see AI at the edge or even the endpoint? How does it fit into the mix?

Chittipeddi: Good question, Bill, and thank you for having me. Let me begin by talking about the dominance of AI in the cloud. Primarily, that's been the sweet spot for a long period of time.

Obviously, there are always going to be workloads where it makes a lot of sense for them to stay in the cloud; for example, things like weather forecasting and so on. It would never make sense to move that to the edge of the network. However, computing is becoming much more ubiquitous and powerful at the endpoint.


As we start getting much more capability to run AI and tiny machine-learning workloads at the endpoint of the network, we start to see a significant enhancement of our ability to run some of these inference models at the edge of the network. That's driving a trend of moving certain workloads, though obviously not all, from the cloud to the endpoint of the network, and the trend is driven by the need for very low latency. Obviously, that's a very important factor in the discussion.

The second point also is the security aspect of it, which is equally important as you move towards the endpoint.

And then the third aspect is the need for instantaneous filtering, where you don't want to wait, for example, for the transit time for information to go to the cloud and come back.

So even when you look at things like video surveillance, for example, where the initial trend was having all the processing done at the core of the network, if you will, the trend much more now is to have simpler facial-recognition models embedded in the video camera application itself.


So that's kind of broadly driving some of the trends that we're seeing. And power consumption obviously is another important factor. There is less power consumption at the end of the network, and it actually works a lot better relative to what needs to happen.

And the other thing also that I don't want to underestimate is our networks that are significantly improving quality moving from 4G to 5G networks, right? And that certainly provides us better capability in terms of driving some of these trends that we're seeing in the marketplace.

Wong: Well, what's driving the acceleration to moving AI to the endpoint? Is there additional hardware capabilities that are coming about? Are there improved pieces of software out there? What are the pieces?

Chittipeddi: Yeah. So there's both. Right. So there's two elements to it.

As I briefly mentioned earlier, one is the hardware element, which is a massive amount of compute power. A massive amount of computing at ultra-low power certainly helps; that's from a CPU perspective. But you now also have embedded AI solutions, whether they be spiking neural networks or CNNs. You now have the capability of embedding AI together with a CPU to enhance the capabilities of AI at the endpoint, something you normally did not have before, because the traditional model consumed a significant amount of power to handle the processing.


But now it's transitioning much more towards the ability to serve the needs of the endpoint significantly better than was ever done before. Software certainly is an important factor: the capability to run what I call inference scripts at the endpoint makes a big difference.

Then you have simplified libraries and compilers that are available for doing more AI at the endpoint of the network. So all those factors are driving the move we're seeing.

Wong: OK. So do you anticipate AI at the edge becoming the norm as opposed to the exception as it is right now?

Chittipeddi: I think there will always be a mixed bag, right? It strongly depends on the workloads you're trying to handle; workloads that are compute-intensive or CPU-intensive will always continue to be done in the core of the network.


On the other hand, as workloads get simplified, you'll find them increasingly moving towards the endpoint or towards the edge of the network. And I think you'll see that trend accelerate as a matter of course as the number of devices connected at the endpoint increases.

You'll find this capability going up significantly. By some estimates (I was looking at a piece of paper over here), there will be 55 billion connected devices generating 73 zettabytes worth of data by 2025, and it's certainly not all in the cloud. So that gives you an idea of the growth.
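Those estimates reduce to a striking per-device figure; a quick back-of-envelope in Python (decimal SI units assumed):

```python
# Sanity check of the interview's figures: 55 billion connected devices
# generating 73 zettabytes of data by 2025 (decimal SI units assumed).
ZETTABYTE = 1e21  # bytes

devices = 55e9
total_bytes = 73 * ZETTABYTE

per_device_tb = total_bytes / devices / 1e12  # terabytes per device
print(f"~{per_device_tb:.2f} TB generated per device")
```

That works out to roughly 1.3 TB per device over the period, which underlines why shipping all of it to the cloud for processing is impractical.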

Wong: Could you give some examples of AI endpoints that are starting to emerge?

Chittipeddi: So, voice being the biggest, voice features are certainly the most elementary example of that. But we have other technologies, even on the devices we're working on, that handle everything from video processing to preventive-maintenance technologies using what we call our dynamically reconfigurable processor (DRP) technologies, which is embedded AI using a feedforward neural network together with the core MPU in our devices.


This device, for example, and it is one of the best examples I can give, allows you to do simple facial-recognition technology at the endpoint. It's got so sophisticated right now, Bill, that you have the ability to track certain faces in a crowd and pick them up pretty close to the endpoint of the network.

And that has both its positives and its negatives; of course, we prefer to focus on the positive aspects of it. But certainly that will be a trend that continues. And then there's being able to look for defects within a line; that's something where you don't want to be waiting around for all the data to go up to the cloud. It's in-line that you find out what a defect is. So those are some of the simplest examples.

And then voice, having a subset of certain features being available at the end point is, of course, a trend that's continuing, and we're partnered with a number of other companies in this area to enhance that capability.

Wong: OK, what's the impact of communications bandwidth with respect to AI?

Chittipeddi: I mean, that's a good question. Certainly the endpoint-to-edge link, especially with 5G, becomes far more important, right? You have much more bandwidth now between the edge and the endpoint than you had before, and certainly that's good.

That's going to be a major contributing factor to anything that we're seeing in this particular area. But, also increasingly, even in terms of easier access to the core, the bandwidth is improving, whether it's optical networks or whether it's wireless networks.

There are trends that are driving favorability in that regard, but nonetheless there will always be a need for low latency between the edge and the endpoint, and that will be a contributing factor in helping this trend. I call it Edge II, if you will, because oftentimes it still doesn't make sense to drive everything to the core of the network and then wait for the response, even though latency times are improving significantly.

Wong: There's also low-speed networks like LoRaWAN, for example, where you really can't shove a lot of data back and forth.

Chittipeddi: Exactly, like narrowband IoT and so on. Then, of course, you increasingly have the ability to use Zigbee-based technologies for the simplest of sensors to access information.

So, yeah, obviously that's a very important point as well for devices that only periodically send a burst of data. It is a contributing factor to what we're seeing as well.

Wong: Excellent. Well, thanks for the overview. It was very informative and I appreciate you speaking with us.



Chittipeddi: You're welcome.
 
  • Like
  • Fire
Reactions: 30 users