BRN Discussion Ongoing

Diogenese

Top 20
Leaving aside US retail investors, who have bailed out of US-based tech in significant numbers this year, if I were a large, sophisticated US institutional investor I would have the ability in place to buy directly on overseas markets.

I would also know that if I buy on a market with very, very low liquidity, my buying would likely affect the market for the share I am buying.

A list is published here from time to time showing institutional investors in Brainchip, a couple of which (Vanguard and Blackrock) have US connections, and there are other lesser-knowns from the US.

If these lists are accurate, then it is inaccurate to say US institutions have not taken notice. Some have, but they have chosen to buy directly on the ASX rather than on the OTC or through BONYM via ADRs.

My opinion only DYOR
FF

AKIDA BALLISTA


https://www.bing.com/videos/search?...508549EBBA129C0141430F61&ghsh=0&ghacc=1&ghpl=

Love the dance moves FF.
 
  • Love
  • Like
Reactions: 7 users
The dude next door was watching porn and still had his Bluetooth speaker connected on the front porch; it took him a couple of minutes to work out. :oops:
I wonder if Akida somehow could prevent this happening.
 
  • Haha
  • Like
Reactions: 25 users
I am told that Sally is much loved at the Brainchip US office. I think, after my wife, daughter, son, grandchildren, son-in-law and daughter-in-law, I share their sentiment where Sally is concerned. A must read.
My opinion only DYOR
FF

AKIDA BALLISTA:

Cars That Think Like You​

By Sally Ward-Foxton 07.22.2022
Car makers are checking out neuromorphic technology to implement AI-powered features such as keyword spotting, driver attention, and passenger behaviour monitoring.
Imitating biological brain processes is alluring because it promises to enable advanced features without adding significant power draw at a time when vehicles are trending towards battery-powered operation. Neuromorphic computing and sensing also promise benefits like extremely low latency, enabling real-time decision making in some cases. This combination of latency and power efficiency is extremely attractive.

Here’s the lowdown on how the technology works and a hint on how this might appear in the cars of the future.

The truth is there are still some things about how the human brain works that we just don’t understand. However, cutting-edge research suggests that neurons communicate by sending each other electrical signals known as spikes, and that the sequences and timing of spikes are the crucial factors, rather than their magnitude. The mathematical model of how the neuron responds to these spikes is still being worked out. But many scientists agree that if multiple spikes arrive at the neuron from its neighbours at the same time (or in very quick succession), the information represented by those spikes is correlated, causing the neuron to fire off a spike to its neighbour.
This is in contrast to artificial neural networks based on deep learning (mainstream AI today) where information propagates through the network at a regular pace; that is, the information coming into each neuron is represented as numerical values and is not based on timing.
Making artificial systems based on spiking isn’t easy. Aside from the fact we don’t know exactly how the neuron works, there is also no agreement on the best way to train spiking networks. Backpropagation — the algorithm that makes training deep learning algorithms possible today — requires computation of derivatives, which is not possible for spikes. Some people approximate derivatives of spikes in order to use backpropagation (like SynSense) and some use another technique called spike-timing-dependent plasticity (STDP), which is closer to how biological brains function. STDP, however, is less mature as a technology (BrainChip uses this method for one-shot learning at the edge). There is also the possibility of taking deep learning CNNs (convolutional neural networks), trained by backpropagation in the normal way, and converting them to run in the spiking domain (another technique used by BrainChip).
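
For readers curious what the STDP rule mentioned above looks like in practice, here is a minimal pair-based sketch in Python; the learning rates and time constant are illustrative assumptions only, and nothing here describes BrainChip's actual implementation.

```python
import numpy as np

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP: strengthen the synapse when the presynaptic spike
    precedes the postsynaptic one, weaken it otherwise. Times are in ms;
    all parameter values are illustrative assumptions."""
    dt = t_post - t_pre
    if dt > 0:
        w += a_plus * np.exp(-dt / tau)     # causal pairing: potentiation
    else:
        w -= a_minus * np.exp(dt / tau)     # anti-causal pairing: depression
    return float(np.clip(w, 0.0, 1.0))      # keep the weight bounded

# A presynaptic spike at 10 ms followed by a postsynaptic spike at 15 ms
print(stdp_update(0.5, t_pre=10.0, t_post=15.0))  # slightly above 0.5
```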

SYNSENSE SPECK

SynSense is working with BMW to advance the integration of neuromorphic chips into smart cockpits and explore related fields together. BMW will be evaluating SynSense’s Speck SoC, which combines SynSense’s neuromorphic vision processor with a 128 x 128-pixel event-based camera from Inivation. It can be used to capture real-time visual information, recognize and detect objects, and perform other vision-based detection and interaction functions.
“When BMW replaces RGB cameras with Speck modules for vision sensing, they can replace not just the sensor but also a significant chunk of GPU or CPU computation required to process standard RGB vision streams,” Dylan Muir, VP global research operations at SynSense, told EE Times.
Using an event-based camera provides higher dynamic range than standard cameras, beneficial for the extreme range of lighting conditions experienced inside and outside the car.
BMW will explore neuromorphic technology for car applications, including driver attention and passenger behavior monitoring with the Speck module.
“We will explore additional applications both inside and outside the vehicle in coming months,” Muir said.
SynSense’s neuromorphic vision processor has a fully asynchronous digital architecture. Each neuron uses integer logic with 8-bit synaptic weights, a 16-bit neuron state, 16-bit threshold, and single-bit input and output spikes. The neuron uses a simple integrate-and-fire model, combining the input spikes with the neuron’s synaptic weights until the threshold is reached, when the neuron fires a simple one-bit spike. Overall, the design is a balance between complexity and computational efficiency, Muir said.
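
As a rough illustration of the integrate-and-fire neuron Muir describes, the sketch below accumulates weighted one-bit input spikes into an integer state and fires when the threshold is reached. The bit widths follow the article; the reset-to-zero behaviour and everything else is an assumption for illustration.

```python
import numpy as np

def integrate_and_fire(weights, spike_train, threshold):
    """Integer integrate-and-fire neuron: 8-bit synaptic weights, 16-bit
    state and threshold, single-bit input and output spikes."""
    weights = np.asarray(weights, dtype=np.int8)   # 8-bit synaptic weights
    state = np.int16(0)                            # 16-bit neuron state
    out = []
    for step in spike_train:                       # one row per time step
        active = np.asarray(step, dtype=bool)      # single-bit input spikes
        state = np.int16(state + int(weights[active].sum()))
        if state >= np.int16(threshold):           # 16-bit threshold reached
            out.append(1)                          # fire a one-bit spike
            state = np.int16(0)                    # reset (assumed behaviour)
        else:
            out.append(0)
    return out

# Three synapses, two time steps; threshold 100 is only reached on step one
print(integrate_and_fire([60, 50, -20], [[1, 1, 0], [1, 1, 1]], threshold=100))
# -> [1, 0]: step one integrates 110 and fires, step two integrates only 90
```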

SynSense’s digital chip is designed for processing event-based CNNs, with each layer processed by a different core. Cores operate asynchronously and independently; the entire processing pipeline is event driven.
“Our Speck modules operate in real-time and with low latency,” Muir said. “We can manage effective inference rates of >20Hz at <5mW power consumption. This is much faster than what would be possible with traditional low-power compute on standard RGB vision streams.”
While SynSense and BMW will be exploring neuromorphic car use cases in the “smart cockpit” initially, there is potential for other automotive applications, too.
“To begin with we will explore non-safety-critical use cases,” Muir said. “We are planning future versions of Speck with higher resolution, as well as revisions of our DynapCNN vision processor that will interface with high-resolution sensors. We plan that these future technologies will support advanced automotive applications such as autonomous driving, emergency braking, etc.”

The Mercedes EQXX concept car, which debuted at CES 2022, features BrainChip’s Akida neuromorphic processor performing in-cabin keyword spotting. Promoted as “the most efficient Mercedes-Benz ever built,” the car takes advantage of neuromorphic technology to use less power than deep-learning-powered keyword spotting systems. This is crucial for a car that is supposed to deliver a 620-mile range (about 1,000 km) on a single battery charge, 167 miles further than Mercedes’ flagship electric vehicle, the EQS.
Mercedes said at the time that BrainChip’s solution was 5 to 10× more efficient than conventional voice control when spotting the wake word “Hey Mercedes”.

“Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years,” according to Mercedes. “When applied at scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.”
“[Mercedes is] looking at big issues like battery management and transmission, but every milliwatt counts, and the context of [BrainChip’s] inclusion was that even the most basic inference, like spotting a keyword, is important when you consider the power envelope,” Jerome Nadel, chief marketing officer at BrainChip, told EE Times.
Nadel said that a typical car in 2022 may have as many as 70 different sensors. For in–cabin applications, these sensors may be enabling facial detection, gaze estimation, emotion classification, and more.
“From a systems architecture point of view, we can do it in a 1:1 way, there’s a sensor that will do a level of pre-processing, and then the data will be forwarded,” he said. “There would be AI inference close to the sensor and… it would pass the inference metadata forward and not the full array of data from the sensor.”
The idea is to minimize the size and complexity of data packets sent to AI accelerators in automotive head units, while lowering latency and minimizing energy requirements. With a potential for 70 Akida chips or Akida-enabled sensors in each vehicle, Nadel said each one will be a “low-cost part that will play a humble role,” noting that the company needs to be mindful of the bill of materials for all these sensors.

Looking further into the future, Nadel said neuromorphic processing will find its way into ADAS and autonomous vehicle systems, too. There is potential to reduce the need for other types of power-hungry AI accelerators.
“If every sensor had a limited, say, one or two node implementation of Akida, it would do the sufficient inference and the data that would be passed around would be cut by an order of magnitude, because it would be the inference metadata… that would have an impact on the horsepower that you need in the server in the trunk,” he said.
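
Nadel's order-of-magnitude claim is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below contrasts shipping raw camera frames against shipping only inference metadata; the frame size, rate and metadata layout are illustrative assumptions, not BrainChip figures.

```python
# Raw sensor stream: 640x480 RGB frames at 30 fps
raw_bytes_per_sec = 640 * 480 * 3 * 30        # ~27.6 MB/s to the head unit

# Metadata instead: say 10 detections per frame, ~16 bytes each
# (class id, confidence, bounding box), at the same 30 fps
meta_bytes_per_sec = 10 * 16 * 30             # ~4.8 KB/s

print(f"raw:  {raw_bytes_per_sec / 1e6:.1f} MB/s")           # raw:  27.6 MB/s
print(f"meta: {meta_bytes_per_sec / 1e3:.1f} KB/s")          # meta: 4.8 KB/s
print(f"cut:  ~{raw_bytes_per_sec // meta_bytes_per_sec}x")  # cut:  ~5760x
```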
BrainChip’s Akida chip accelerates spiking neural networks (SNNs) and convolutional neural networks (via conversion to SNNs). It is not tailored for any particular use case or sensor, so it can work with vision sensing for face recognition or person detection, or other audio applications such as speaker ID. BrainChip has also demonstrated Akida with smell and taste sensors, though it’s more difficult to imagine how these sensors might be used in automotive (smelling and tasting for air pollution or fuel quality, perhaps).
Akida is set up to process SNNs or deep learning CNNs that have been converted to the spiking domain. Unlike native spiking networks, converted CNNs retain some information in spike magnitude, so 2- or 4-bit computation may be required. This approach, however, allows exploitation of CNNs’ properties, including their ability to extract features from large datasets. Both types of networks can be updated at the edge using STDP — in the Mercedes example, that might mean retraining the network to spot more or different keywords after deployment.
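
To see why converted CNNs need 2- or 4-bit values where native spiking networks get by with single-bit spikes, consider a toy uniform quantizer that squeezes a post-ReLU activation into a 4-bit "spike magnitude". This is a generic quantization scheme for illustration, not BrainChip's conversion pipeline.

```python
import numpy as np

def quantize_activation(x, bits=4, x_max=6.0):
    """Uniform quantization of non-negative CNN activations (e.g. post-ReLU)
    into small integers; bits and x_max are illustrative choices."""
    levels = 2**bits - 1                      # 15 levels for 4-bit values
    q = np.round(np.clip(x, 0.0, x_max) / x_max * levels)
    return q.astype(np.uint8)                 # low-bit magnitudes to transmit

acts = np.array([0.0, 0.7, 2.3, 5.9, 8.0])    # example post-ReLU activations
print(quantize_activation(acts))              # -> [ 0  2  6 15 15]
```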

Mercedes has confirmed that “many innovations”, including “specific components and technologies” from the EQXX concept car, will make it into production vehicles, reports Autocar. There is no word yet on whether new models of Mercedes will feature artificial brains.
 
  • Like
  • Love
  • Fire
Reactions: 54 users
This is one of the pots of gold everyone has been looking for.... :) Read what's been said in the 4th-last paragraph :)

Interesting article but six words stood out. :ROFLMAO:😂🤣:cool:

"Not everyone is satisfied with the products on sale. Does Mercedes-Benz still have new cards to play to satisfy discerning consumers?

Mercedes-Benz's answer is the VISION EQXX concept car, which is a collection of Mercedes-Benz's latest technologies in the field of pure electricity.

In April this year, the Mercedes-Benz VISION EQXX concept car departed from Sindelfingen, Germany, and arrived in Cassis, France. The whole journey is about 1,008 kilometers, making it the first known product to exceed 1,000 kilometers on a single charge and demonstrating its energy control capability. In addition to battery life, VISION EQXX also uses a 900V high-voltage platform, a 47.5-inch ultra-thin integrated screen, a new generation of the MBUX system with faster response, and the Akida neuromorphic system-on-chip.

This is a product that is more in line with the current electric vehicle evaluation system, and the cutting-edge technology it carries will also be applied to the next generation of Mercedes-Benz electric vehicles."

(Apologies to all those who got in before me. I thought about deleting but I really, really like those six words. FF)
 
  • Like
  • Fire
  • Love
Reactions: 52 users
Something I was not aware of. From the article:

“In July this year, Mercedes-Benz and Tencent joined hands. The two parties will cooperate on high-level autonomous driving.”

Was this known by 1000 eyes?

FF. Your memory is better than most on these relationships!
 
  • Like
Reactions: 6 users
Yes, it was posted about, and if I am not mistaken it was mentioned again last week by someone. Sorry for not remembering who it was. It might have also been mentioned in one of @BaconLover's posts as well.

Regards
FF

AKIDA BALLISTA
 
  • Like
  • Fire
Reactions: 11 users
I think I have paid sufficient respect to Sally, so now I say to her: what about some in-depth investigative journalism, please? Prove that the Valeo ultrasonic sensors being supplied to BMW have AKIDA IP:

“BMW “Neue Klasse” will also feature the next generation of Valeo’s ultrasonic sensors, the full set of surround view cameras, as well as a new multifunctional interior camera that will contribute to improved safety and create a new level of user experience.”

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 31 users

pgetties

Member
Are we linked with SynSense?


 
  • Like
Reactions: 10 users

Proga

Regular
Hi @dippY22,

If I had more money I would buy one of the Honda Legends or Mercedes-Benz S-Classes, which are capable of achieving Level 3 automation. Once purchased, I would then proceed to pull said car apart to locate the sensors so that I could independently verify which ones incorporate AKIDA. And then I might accidentally spill the beans to some media outlets about my discovery.

We know that the Honda Legend and the Mercedes-Benz S-Class are the only two vehicles in the world to have reached Level 3 automation, and Valeo have stated previously: "Valeo widely contributed to this unique achievement, by supplying a high-performance on-board control unit as well as most of the sensors used to achieve Level 3 functionality. No less than five Valeo SCALA® 3D LiDARs and two front cameras observe the car’s surroundings."

I suppose what I'm getting at is that once Valeo lets the cat out of the bag, by which I mean to say, once it is 100% confirmed (by someone other than me) that these mystical sensors that Valeo speaks of are in fact powered by AKIDA, then I expect that there will be a heck of a lot of interest on the USA markets.

Imagine how much media coverage BrainChip could leverage, not just in the US but globally, by being able to affirm that AKIDA is the secret sauce which makes Level 3 autonomous driving a reality, thus leaving Tesla's FSD well and truly in the dust and leaving me less upset that my new car is lying in pieces in my carport next to my dining table that I still haven't managed to get rid of on Gumtree.
@Bravo,

If Akida were in the current versions we'd be seeing that reflected in revenue. We're not.

@dippY22, be careful what you wish for. All those which are down have billion-dollar revenues.

In the last 12 months
BRN up 98.6%
AMD down 26.4%
Intel down 46.1%
Qualcomm down 6.5%
MegaChips down 29%
Valeo down 13.1%
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 20 users
JD...

If word is getting around like you're sure it is, the reaction is a bored "meh" followed by a big yawn, in my opinion.

Everything you say, except that, is true, and I am aligned in my Brainchip investment not unlike you. However, you and I and all our friends and family are doing bupkis to budge the stock price on the OTC marketplace.

To state the obvious which is akin to crashing through open doors.....THERE IS NO INTEREST IN BUYING BRAINCHIP SHARES ON THE USA MARKETS, at least at the moment, and as proven by the ongoing pathetic OTC trading volumes.

And sure, we (and the markets) are mired in a bunch of worldwide macro mud at the moment, but no one in America is going to buy Brainchip. Not anytime soon, and regardless of the USA office location, awkwardly located hundreds of miles south of where it should be (Silicon Valley).

A 400-500% increase in revenues last quarter (year over year) didn't inspire buying. Maybe if they produce 20 million soon, that might nudge the volume upward and the stock price too, but if the chip processor industry all know who Brainchip is, per Sean Hehir at the AGM, what is the reason there is not more interest, let alone excitement, in buying shares? ..... Yeah, ......I don't know, either.

Is this apathy a function of the lethargic price action of BRN on the Australian markets due to a lack of any recent positive headline news releases? Yes, I'm sure that's a part of it. Because to be fair, when the Mercedes news broke the volumes on the OTC were noteworthy. But the action in BRCHF on Friday was what,.... 3,000 shares +/- ? Sad. BCHPY (ADR) volume was, ....wait for it,....are you sitting down,....drum roll .......... 13 shares!!!

But what about some of our recent partnership announcements? Noteworthy? Yes. Stock price moving? Alas, no. To me, the ARM Holdings partnership announcement was (is) HUGE. When I get depressed about the price action (infrequently) I go look at the "our partners" section of the ARM website and smile. I think this partnership with ARM is incredibly significant. Much more significant than the Mercedes announcement, again,....imo. But in America,.......crickets, .......at least on the OTC where BRCHF hovers around mid to low sixty cents.

There is absolutely nothing lighting a fire under American investors, retail or institutional, at the moment. There is NOBODY jumping around about this stock and THAT is a House of Pain that I live in because I am, like Mr. Delekto, an American retail investor hoping and anticipating great things (eventually) and volumes that are skewed toward more buyers than sellers for obvious reasons.

Am I annoyed by the lack of interest? Hell yeah. I find it vexing, and honestly do not understand it. I have previously accused Integrous of not delivering on their strategy to improve institutional and commercial investment interest in Brainchip. I stand by that. What are they doing to generate institutional interest in our investment? Very little as far as I'm concerned, based on the ongoing volumes.

And why are retail investors in BRCHF shares once again paying a $50 foreign transaction fee when it had disappeared for a couple of months? That is a ridiculous charge in this day and age, thank you very much.

I am not a downramper. This Brainchip story is awesome and I feel lucky to have found it, and am long and strong for years to come. I am not impatient. I understand product cycles and new tech implementation and the challenge(s) to introduce new technology to the world.

But come on....seriously, the word is clearly NOT getting out about Brainchip in the USA, because potential investors, retail especially but also institutional (i.e. Ark Invest), are not rushing in as a result, and that causes me some concern. Why? Why am I invested in this company when others, some massively smarter than me, are not? What is holding them back? Which company offers significant ROI, ...Intel (for example) or Brainchip? You know where I stand (...and no, I do not own Intel).

Good news continues developing, lots of presentations, good hires, ...but America seems not to see our unfolding story. And I don't get it.

This American lack of interest in owning Brainchip shares should be keeping Brainchip executives awake at night even if they espouse that the stock will do what the stock will do.

Regards, dippY

My opinions only, ...no stock advice given or intended.


Hi @dippY22

I’ll preface my reply by stating that I enjoy reading your posts and value your thoughts on this, but I’m not sure what the issue is with our SP. It could be higher, but given the macro events and where we are in our commercialisation journey, it’s where we are currently valued. There has been satisfactory growth which beats the banks and my current super fund. It’s not exponential, but it’s growth all the same!

I’ll also state that I am still waiting to buy a significant amount in December in my super, so the SP remaining as it is works in my favour. I own a satisfactory amount at the moment, but I’ve been nervously waiting to buy in, expecting a Murphy’s Law price rise to occur just before I can lock some more away!

I also understand shareholders are at different stages of life, e.g. someone recently posted they were 80 years of age and obviously would like a steep price increase so they can enjoy some profits whilst they still can. That’s fair enough!


My opinion is that we are still in the infancy of commercialisation. Yes, we have some potential big fish on the hook, but we have yet to really see them landed. Therefore the public investor would still see us as a risk, as we are only just starting to see revenues and not yet covering our running costs.

I am strongly of the opinion we are in the Valeo Scala 3 lidar, which will eventually bring us strong revenue growth and is part of the basis of my investment decision. They are pivotal to our success at the moment.

I also note I am very disappointed Nanose hasn’t come through with its product as I first thought it would. It was my prime reason for investing in Brainchip when I first did 18+ months ago. I expected that to push the SP to astronomical heights, but it hasn’t eventuated and they have missed the Covid money train! I am thinking it had something to do with the ability to "clean" the sensor between uses to prevent contamination affecting the results and/or possibly infecting the next user? I thought the actual detection and analysis using Akida produced satisfactory results to enable TGA approval. I am still hopeful of a Nanose instrument of some kind hitting the market in the next few years.

I am glad that Brainchip are not selling a share price or trying to artificially inflate it with fluff announcements. I want them to focus on developing relationships, building the ecosystem and becoming a de-facto standard to increase our customer base and make IP sales. That’s what counts to me but I am comfortable waiting a few more years for that to occur.

Imagine where we would be without the Arm, SiFive, Megachips, Edge Impulse etc announcements?

As reported earlier this week, NVIDIA and Intel are pushing a white paper which, to my limited understanding, will try to develop a standard which could affect Brainchip's ability to be fully utilised, as our "bits" are different to their "8-bit standard." Is this an attempt to derail Brainchip's success, as VHS did with Beta? I am hopeful Brainchip is already too far down the product commercialisation path for that to occur!


So I see our SP as not really increasing significantly until we have product releases, such as MB and others, to provide sustained revenue growth, which might still be "lumpy" for the next few quarters. When/if Brainchip announce a patent regarding Akida 2.0 it might spice things up a bit, so I’m hoping that will be around Xmas or January!

I am confident that if Brainchip continues to work on sales then eventually the SP will go up regardless of limited advertising to investors.

Eventually the financials will push the SP which to me is the way it should be!

I may be thinking differently come January when I am more heavily invested and get anxious for results. I will be interested to see how well I cope in that situation myself.

I am thinking my investment will be settling at its peak range in about 2030, once it is ubiquitous in ADAS/EV cars, phones, laptops, large server centres (looking at you, Dell) and a plethora of defence products. By then I expect dividends, which will time in nicely for my retirement so I can live off them and not touch my holdings, which I will eventually leave to my children. That is my plan, and I enjoy reading this forum and doing my own research whilst I wait patiently for that to occur!

Cheers all!

:)
 
  • Like
  • Love
  • Fire
Reactions: 58 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
@Bravo,

If Akida were in the current versions we'd be seeing that reflected in revenue. We're not.

@dippY22, be careful what you wish for. Not sure you understand global markets and where BRN fits in with our peers. All those which are down have billion-dollar revenues.

In the last 12 months
BRN up 98.6%
AMD down 26.4%
Intel down 46.1%
Qualcomm down 6.5%
MegaChips down 29%
Valeo down 13.1%

I'm not so sure that we haven't already seen some of the revenue, @Proga? According to the articles I've read, there appear to have been only a limited number of both of these L3 vehicles sold at this point in time.


 
  • Like
  • Love
  • Thinking
Reactions: 13 users

Crestman

Regular
I thought I would search up where Bill Gates said that if someone invented artificial intelligence, it would be worth 10 x Microsofts, and found this site:


What I love about this page is this bit:

If anything, Gates was too conservative in his estimates. Experts say the market opportunity is now far, far greater than 10 Microsofts.

Maybe we should change to 20 x or 50 x or 100 x Microsofts!

🚀🚀🚀;)
 
  • Like
  • Love
  • Fire
Reactions: 27 users

Proga

Regular
Wouldn't Honda or Valeo have to pay a licence fee for those 100 vehicles produced last year?
 
Last edited:
  • Like
Reactions: 5 users

Boab

I wish I could paint like Vincent
Last edited:
  • Haha
  • Like
Reactions: 5 users
Are we linked with SynSense?


No, Brainchip is not linked to SynSense, other than that last year SynSense announced a partnering with Prophesee; however, as we know, this year Prophesee announced it had partnered with Brainchip.

As for BMW, SynSense announced the possibility of partnering with them, but since that time BMW, Qualcomm and Valeo have announced the following real agreement:

"Valeo will develop and produce the ADAS domain controller capable of managing all data flows from all ADAS sensors in BMW Group vehicles based on the “Neue Klasse” platform. All driving assistance functions will be hosted and processed by the Valeo ADAS domain controller, which will be powered by Qualcomm Snapdragon SoCs**. The ADAS domain controller will host Valeo’s software platform for low-speed maneuvering, as well as software assets from BMW and Qualcomm for driving automation."

The beauty of being a start-up like SynSense, unlike a company listed on an exchange like Brainchip, is that you can make as many announcements as you like without substance. Compare the headline of the article with the paragraph in the body of the article: what is billed as a partnership to explore integration is in fact a partnering to EXPLORE the POSSIBILITY of using neuromorphic chips in smart cockpits.

SynSense has a product that falls incredibly short when compared with the Brainchip AKIDA offering. Prophesee, on the face of their announcement, has worked that out, and likely BMW has as well, or soon will.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 37 users

buena suerte :-)

BOB Bank of Brainchip

🔥🔥Maybe we should change to 20 x or 50 x or 100 x Microsoft 🔥🔥 o_Oo_O Happy with matching it!!!! :cool::cool: $$$$$$$$$$$$$$


MSFT
Rank: #3
Market cap: $1.825 Trillion
Country: USA
Share price: $244.74
Change (1 day): -0.26%
Change (1 year): -18.38%
Categories: Software, Tech, Dow Jones, Video games

Microsoft is an American company that develops and distributes software and services such as a search engine (Bing), cloud solutions and the computer operating system Windows.

Market capitalization of Microsoft (MSFT)​

Market cap: $1.825 Trillion

As of September 2022 Microsoft has a market cap of $1.825 Trillion. This makes Microsoft the world's third most valuable company by market cap according to our data. The market capitalization, commonly called market cap, is the total market value of a publicly traded company's outstanding shares and is commonly used to measure how much a company is worth.

(Chart: market cap history of Microsoft from 2001 to 2022)

 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 15 users

Diogenese

Top 20
Hi SG,

My understanding is that VHS and Beta were in the same ball park performance-wise.

8-bit floating point will not come close to the performance of Akida. It will necessitate MAC operations, which would be several times slower and more power hungry than Akida before taking into account Akida's N-of-M coding. So, where battery power is concerned, Akida is orders of magnitude ahead. Where power and time are not critical, the inferior 8-bit FP may be a substitute, but it will be slower and more power hungry and only marginally more accurate at best, since 8-bit FP has significant inaccuracies itself.

The human nervous system does not use 8 bits FP, it uses spikes ... and the stronger spikes which carry the bulk of the information are processed first. The loss of accuracy by ignoring the later spikes in N-of-M coding is not really significant.
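
As a concrete illustration of the N-of-M idea Diogenese refers to, here is a toy rank-order sketch that keeps only the N strongest of M candidate inputs and ignores the weak tail; it is a generic illustration of the coding principle, not a description of Akida's internals.

```python
import numpy as np

def n_of_m(values, n):
    """Rank-order / N-of-M coding: keep only the n strongest of m inputs;
    the weaker events, which carry little extra information, are dropped."""
    values = np.asarray(values, dtype=float)
    keep = np.argsort(np.abs(values))[-n:]    # indices of the n strongest
    coded = np.zeros_like(values)
    coded[keep] = values[keep]                # everything else is ignored
    return coded

m_inputs = np.array([0.05, 2.4, -1.9, 0.1, 0.02, 3.1, -0.3, 0.8])
print(n_of_m(m_inputs, n=3))                  # keeps 3.1, 2.4 and -1.9
```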
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 58 users
I love to watch you post.😎
 
  • Like
  • Love
  • Fire
Reactions: 21 users