BRN Discussion Ongoing

BigDonger101

Founding Member
I cannot believe our senior management would just raise funds (sell shares) out of the blue at these low prices. There must be an agenda we are not privy to. When the reasoning for this will be released, who knows? Tomorrow? The AGM? Probably not then, as it would be too late to supercharge the SP. Whatever it is will just drop out of the blue, as the Intel news did. And VVDN. All I know is that our senior management are smart cookies.

Re the Anastasi interview with Mike Davies, it was as though he were talking about Loihi 2 as the market leader. We all know this isn't correct, as Loihi 2 is still a research chip, so just what is going on? I was thinking Intel may abandon Loihi 2 for Akida, which in turn would supply the reasoning for the share sales in my first point. So much we don't know and are only speculating on.
I am confused by this, because how else does the entity support its ability to continue without capital?

They have two options:
1. Continue using LDA as a capital source, with no need to ask the wider public.
2. Raise capital from the wider public, maybe through an SPP or a normal capital raise.

It doesn't really matter which one they do, but LDA is expecting to make money... so it's not all that bad.
 
  • Like
Reactions: 2 users

BaconLover

Founding Member
Quite possibly this has been shared already, but I see this particular video was uploaded 8 hours ago, so some may not have watched it yet.

Worth a watch for the ''Hey ........... '' part around the 8:30 min mark, where the lady says they're using AI tech that ONLY responds to the user's voice, and imposters will be recognised, etc.

Akida or not, some exciting times ahead.

 
  • Like
  • Fire
  • Love
Reactions: 21 users

HopalongPetrovski

I'm Spartacus!
Hi Hoppy.. I think Akida 2.0, as I understand it, is being geared towards optimisation for LSTM and transformers, as there has been a lot of talk about this recently, and as I said, all Akida versions will have their own use cases!
As to your comment "Akida 1 will be capable of activity that Akida 2 is not?" - hard to comment on this until the Grand Reveal, maybe @Diogenese can HELP! But I think they are geared for specific applications, and manufacturers can choose which one best suits their products.

Peter is a very modest man and a great guy to chat to, and correct, he doesn't drink alcohol; he is happy with his sparkling water.
We met in Perth at a few get-togethers arranged by @Earlyrelease & TD, and have also had the pleasure of meeting Ken & Pia, also great people to chat with.

Cheers
Great that you Guys and Gals get together on a regular basis and have the opportunity to have informal time with the Perth-based crew.
Thanks for the response, buena suerte, and good luck to us all. 🤣
 
  • Like
  • Love
  • Fire
Reactions: 10 users

hotty4040

Regular
When you see a post and you nod, agreeing with what it says, but you're scratching your head at the same time



Agree entirely, equanimous, and I've no idea why! Just another head-scratching moment for me, unfortunately, but I'm assuming this is another Ballista indication of the momentous future of this ground-breaking tech, yet to be revealed in its entirety, moving forward.

Let's hope there is a smidgeon (at least) of what's in store for the future in this next 4C offering.

I'm still excited about our future, can't help myself, because all the insights we've been experiencing over the last many years (much head scratching along the way) should/must amount to something that will bring a huge smile to my face.

Would be nice to hear from FF sometime soon, hope all's well with him and family.

Thanks to all contributors for their great insights and the knowledge boost, for my consumption and everyone else's on these threads. Just amazing reading. Please continue, it's just awesome.

Akida Ballista >>>>>> The 4C will offer something/anything that will please us, surely? <<<<<<

hotty...
 
  • Like
  • Love
  • Fire
Reactions: 15 users

Diogenese

Top 20
BigDonger101 said: "I cannot believe our senior management would just raise funds (sell shares) out of the blue at these low prices..." (quoted in full above)



Very much looking forward to the Grand Reveal of the next-gen Akida! Hopefully very soon!!

I have met and chatted with Peter on a few occasions now, and when we last spoke a few months ago he made the point that Akida 1.0 will not be obsolete when Akida 2.0 (or its new name?) comes to the market, because they all have their own unique functions, so they will always have their own place within different applications. Thought I would share that so people didn't think the next-gen Akidas will be superseding previous versions! (They won't be!)

Have a great day whatever you do 🏊‍♂️⛱️🍻 Cheers

We have been told what the rise is for:

https://cdn-api.markitdigital.com/a...access_token=83ff96335c2d45a094df02a206a39ff4

[screenshot: extract from the capital raise announcement]

i. "tape out another chip"
ii. "release significant enhancements to our IP offering"
iii. "hiring personnel in key international markets"
iv. "increase our domestic sales and marketing headcount"

The choice of words for item (ii) is interesting, in particular "release" rather than "develop". To me, this suggests that the enhancements are well in hand.

We all know what Akida 1 can do. One of its capabilities is key-word spotting, as used in the EQXX.

With the proviso that I don't know how transformers work (apologies to Ella):

LSTM is useful for processing sequences of data in context, be it speech, video, ... One suggested use was in predicting the trajectory of moving objects, e.g. in ADAS/AV. But it can only handle relatively short sequences of data because its memory is comparatively "short".
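For a feel of what that means, here's a toy PyTorch sketch (my own illustration, nothing to do with Akida's implementation): the LSTM walks the sequence one step at a time and squeezes everything it has seen into a fixed-size hidden state, which is why context from many steps back tends to fade.

```python
# Toy illustration only: an off-the-shelf LSTM consuming a sequence step by
# step. The entire history is compressed into a fixed-size hidden state,
# which is why long-range context fades.
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

# 100 timesteps of 8-dimensional features (speech frames, track points, ...)
seq = torch.randn(1, 100, 8)

outputs, (h_n, c_n) = lstm(seq)  # h_n, c_n summarise everything seen so far
print(outputs.shape)             # torch.Size([1, 100, 16]) - one output per step
print(h_n.shape)                 # torch.Size([1, 1, 16])   - the whole "memory"
```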

Transformers can handle longer sequences of data and "understand" or interpret the context, and can be used in translation and text prediction. Apparently they do this by extracting the "essence" of the context (don't ask me - I'm just making this up as I go along), so they can "understand" the context (and syntax - noun, verb, adjective, adverb, subject, object, ...) of longer data sequences, and do this more efficiently than LSTM. For instance, if you use a noun in one sentence, and then use the word "it" in a following sentence, the transformer may infer that "it" is the noun from the earlier sentence - spooky stuff.
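The mechanism behind that "it" trick is attention: every word scores its relevance to every other word and borrows meaning from the words it attends to. A back-of-the-envelope sketch (toy vectors invented for the demo, not a trained model):

```python
# Toy demo of scaled dot-product attention, the core transformer mechanism:
# each word scores every other word, so "it" can put most of its attention
# weight on the earlier noun. Vectors are fabricated; real models learn them.
import torch
import torch.nn.functional as F

tokens = ["the", "chip", "works", "because", "it", "spikes"]
d = 4
torch.manual_seed(0)
emb = torch.randn(len(tokens), d)
emb[4] = emb[1] + 0.1 * torch.randn(d)  # nudge "it" towards "chip" for the demo

scores = emb @ emb.T / d ** 0.5         # similarity of every word to every word
weights = F.softmax(scores, dim=-1)     # each row is a probability distribution

for tok, w in zip(tokens, weights[4]):  # row 4 = what "it" attends to
    print(f'"it" -> {tok}: {w:.2f}')    # "chip" (and "it" itself) score highest
```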

On the software side, we've recently been exposed to the prowess of ChatGPT (Generative Pre-trained Transformer). Incorporating that capability in silicon (without the "imaginative" results) will be no mean feat.

So, getting back to the EQXX and its derivatives, key-word spotting is all very well, but the next bit is "understanding" what the driver says after "Hey Mercedes!" ... and every driver may use different syntax to express their requirements. So transformers may be advantageous in this context.
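Sketched as a hypothetical two-stage pipeline (all names and the tiny intent table are invented for illustration): a cheap, always-on keyword spotter gates a heavier model that has to cope with whatever phrasing the driver uses.

```python
# Hypothetical sketch of the two-stage voice pipeline described above.
# Stage 1 is the low-power, always-on job key-word spotting already handles;
# stage 2 stands in for the transformer-style model that would map free-form
# phrasing to an intent. The intent table and names are invented.
INTENTS = {
    "set_temperature": ["warmer", "cooler", "temperature", "degrees"],
    "navigate":        ["navigate", "directions", "take me", "route"],
}

def spot_wake_word(transcript: str) -> bool:
    # Stage 1: only wakes the rest of the system on the key phrase.
    return transcript.lower().startswith("hey mercedes")

def interpret(transcript: str) -> str:
    # Stage 2 stand-in: a real system needs a language model here, because
    # every driver expresses the same request with different syntax.
    text = transcript.lower()
    for intent, cues in INTENTS.items():
        if any(cue in text for cue in cues):
            return intent
    return "unknown"

request = "Hey Mercedes, it's freezing, a few degrees warmer please"
if spot_wake_word(request):
    print(interpret(request))  # -> set_temperature
```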


As for key markets, we've recently seen advertisements for marketing personnel in Japan, whose key industries include automotive and consumer electronics.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 77 users

stuart888

Regular
Synsense and Loihi were mentioned, but I didn't hear any references to Akida. But I was multitasking, so my attention was not totally focused.
Well @dippY22, it seemed clear to me he was told not to mention BrainChip or Akida. Same with the presenter, despite all the spiking questions.

Don't you think it highly odd that just a couple of weeks ago the following happened?

BrainChip joins Prophesee at The Venetian Hotel as part of a technology showcase highlighting Prophesee’s Metavision platform. BrainChip partners with Prophesee to optimize computer vision AI performance and efficiency to deliver next-generation platforms for OEMs interested in integrating event-based vision systems with high levels of AI performance coupled with ultra-low power technologies.

Management on all sides might be in a quiet period, prior to some sort of reveal.
Just a hunch, with obvious clues presented.👨‍🎓👨‍🎓👨‍🎓
 
  • Like
  • Fire
  • Thinking
Reactions: 23 users

Dhm

Regular
Diogenese said: "We have been told what the rise is for..." (quoted in full above)
Thank you, super hero @Diogenese. I must have been asleep at the wheel of life and failed to read that. Whilst our friend below bears no resemblance to you, Dio, his dancing actions will be familiar when A2 drops.

[Superman dancing GIF]
 
  • Haha
  • Like
  • Love
Reactions: 7 users

buena suerte :-)

BOB Bank of Brainchip
Diogenese said: "We have been told what the rise is for..." (quoted in full above)
Dio saves the day again! Thanks for your very knowledgeable and comprehensive responses to many of us here! 🙏 Cheers Dio
 
  • Like
  • Love
  • Fire
Reactions: 11 users

HopalongPetrovski

I'm Spartacus!
Just playing "Devil's Advocate"... what if the money raised via LDA is partially put towards a company acquisition, or maybe a new start-up... who or what could it be???!!!
 
  • Like
  • Fire
  • Haha
Reactions: 4 users

Dozzaman1977

Regular
Just as a matter of interest, could one of our posters ask ChatGPT the following question and post the results?

"As of January the 26th 2023, how many IP licence deals has Brainchip sold to customers"

I know the answer will not be reliable or accurate, I'm just curious what will come up. You would hope Renesas and MegaChips (the known knowns).

Thanks in advance
 
Last edited:
  • Like
  • Fire
Reactions: 3 users

buena suerte :-)

BOB Bank of Brainchip
Great that you Guys and Gals get together on a regular basis and have the opportunity to have informal time with the Perth-based crew.
Thanks for the response, buena suerte, and good luck to us all. 🤣
It is great catching up with these guys; we have formed some really good relationships. Looking forward to the next one! And one day we will have an 'All State party' 🥂🍾🍾🍾🥂:cool:

And yep GOOD LUCK to us all :love:
 
  • Like
  • Fire
  • Love
Reactions: 7 users

TECH

Regular
Both Renesas and Megachips have IP Licenses.

Both can obviously produce their own products or produce products for their clients.

The next cab off the rank would be engineering service fees: if BrainChip is approached to assist with any design/technical expertise, our gun engineers would be engaged under some contractual agreement, as explained on our website.

If said products were OTS, our royalty fees would be subject to customers actually buying enough to satisfy the royalty payment agreement, which is confidential and on a case-by-case basis, I would suggest.

As for all of the 100% known engagements to this point: someone has to buy an IP licence to move forward on any project that involves mass production. Some party, third or otherwise, MUST BUY an IP licence; there is no other way around it, unless of course all known engagements deal only through Renesas and/or MegaChips.

We are first and foremost an IP supplier... yes, we sell development boards etc. for around $499 USD.

How many IP Licenses has ARM sold over the last 10 years, for example? Just a handful?

Intel is on that slippery slope; the latest press out of the US is very damning, profit- and market-share-wise. Trying to remain relevant is an uphill battle. Welcoming BrainChip into their environment was a very smart play, which may bear some real fruit for us over the next 5 years. I'd watch this space.

My opinion and views only, could well be totally wrong. :ROFLMAO::ROFLMAO:;)
 
  • Like
  • Fire
  • Thinking
Reactions: 26 users

alwaysgreen

Top 20
Quite possibly this has been shared already, but I see this particular video was uploaded 8 hours ago, so some may not have watched it yet.

Worth a watch for the ''Hey ........... '' part around the 8:30 min mark, where the lady says they're using AI tech that ONLY responds to the user's voice, and imposters will be recognised, etc.

Akida or not, some exciting times ahead.



Hopefully it's not my mate's IP above. She worked for Qualcomm prior to and after working at BrainChip. Let's hope she wasn't sent on a reconnaissance mission 🤣🤣.........🤔......
 
  • Like
  • Haha
Reactions: 3 users

HopalongPetrovski

I'm Spartacus!
Dozzaman1977 said: "Just as a matter of interest, could one of our posters ask ChatGPT..." (quoted in full above)

From my recollection we have sold 2 directly (to MegaChips and Renesas), and maybe 2 via a third party (MegaChips): 4 in total.
Cheers
I have heard that ChatGPT's knowledge base is currently limited to info available up to some point in 2021, so beyond its already notorious unreliability and the opaque and possibly dubious nature of its source material, it may also be out of date.

Garbage In, Garbage Out.

I wonder if at some point they will refine the program so that it verifies its sources of information rather than giving all data equal weighting as fact. My elder brother (living off-grid as a hermit), who has only fairly recently been exposed to the WWW, tends to believe anything he reads on the internet is factual.
He assumes and allows indiscriminate credence in his child-like exuberance for the medium.
I have confiscated his credit card. 🤣
 
  • Haha
  • Like
Reactions: 7 users

BaconLover

Founding Member
Hopefully it's not my mates IP above. She worked for Qualcomm prior to and after working at Brainchip. Let's hope she wasn't sent on a reconnaissance mission 🤣🤣.........🤔......

Don't think so mate.

A couple of years ago, she came up during discussion at the crapper, and from memory someone contacted BrainChip and we were told we were on a completely different path.

Might add, it's a great question you asked regarding her ties with BRN, and something a few of us have asked before. Thanks for bringing it up for the new investors.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 14 users

Dozzaman1977

Regular
HopalongPetrovski said: "I have heard that ChatGPT's knowledge base is currently limited to info available up to some point in 2021..." (quoted in full above)
Cheers, I didn't know that. With Microsoft investing heavily in ChatGPT, it will eventually be a serious competitor to Google's search engine. I was just curious what answer it would spew out regarding IP customers. Whatever the answer, it would not change my opinion of BrainChip.
Cheers
 
  • Like
Reactions: 2 users

Andy38

The hope of potential generational wealth is real
Diogenese said: "We have been told what the rise is for..." (quoted in full above)
You're starting to sound a lot like our long-lost friend Fact Finder with this sort of reasoning! Great pick-up mate, thanks for clarifying 👌
 
  • Like
  • Fire
  • Thinking
Reactions: 12 users

Diogenese

Top 20
Hopefully it's not my mate's IP above. She worked for Qualcomm prior to and after working at BrainChip. Let's hope she wasn't sent on a reconnaissance mission 🤣🤣.........🤔......
Mouna left BrainChip in April 2016.

At the time, we had just announced the achievement of unsupervised learning and autonomous feature extraction for SNAP (Spiking Neuron Adaptive Processor), the latter in cooperation with the DAVIS artificial retina (DVS event camera) from Inilabs.

https://brainchip.com/brainchip-announces-unsupervised-visual-learning-achieved/

BrainChip Announces: Unsupervised Visual Learning Achieved

ALISO VIEJO, CA — (Marketwired) — 02/23/16 —
BrainChip Holdings Limited (ASX: BRN), developer of a revolutionary new Spiking Neuron Adaptive Processor (SNAP) technology that has the ability to learn autonomously, evolve and associate information just like the human brain, is pleased to report that it has achieved a further significant advancement of its artificial intelligence technology.

The R&D team in Southern California has completed the development of an Autonomous Visual Feature Extraction (AVFE) system, an advancement of the recently achieved and announced Autonomous Feature Extraction (AFE) system. The AVFE system was developed and interfaced with the DAVIS artificial retina purchased from its developer, Inilabs of Switzerland. DAVIS has been developed to represent data streams in the same way as BrainChip's neural processor, SNAP.

Highlights


  • Capable of processing 100 million visual events per second
  • Learns and identifies patterns in the image stream within seconds — (Unsupervised Feature Learning)
  • Potential applications include security cameras, collision avoidance systems in road vehicles and Unmanned Aerial Vehicles (UAVs), anomaly detection, and medical imaging
  • AVFE is now commercially available
  • Discussions with potential licensees for AVFE are progressing
AVFE is the process of extracting informative characteristics from an image. The system initially has no knowledge of the contents of an input stream. The system learns autonomously by repetition and intensity, and starts to find patterns in the image stream. BrainChip’s SNAP learns to recognize features within a few seconds, just like a human would when looking at a scene. This image stream can originate from any source, such as an image sensor like the DAVIS artificial retina, but also from other sources that are outside of human perception such as radar or ultrasound images.
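As an aside, for intuition about that "learns by repetition and intensity" line, here is a loose numerical sketch of a generic winner-take-all learner (my own toy, not the actual SNAP/AVFE algorithm): the neuron whose weights best match an incoming event pattern fires and is nudged further towards that pattern, so a motif that repeats carves out its own detector.

```python
# Generic winner-take-all toy, NOT BrainChip's algorithm: repetition alone
# turns a recurring event pattern into a dedicated feature detector.
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_neurons, lr = 64, 8, 0.05
W = rng.random((n_neurons, n_inputs))          # random initial weights

def present(events: np.ndarray) -> int:
    """events: vector of spike activity from one time window."""
    winner = int(np.argmax(W @ events))        # best-matching neuron fires
    W[winner] += lr * (events - W[winner])     # nudge it towards the pattern
    return winner

motif = (rng.random(n_inputs) > 0.7).astype(float)  # one recurring visual motif
for _ in range(50):
    present(motif + 0.05 * rng.standard_normal(n_inputs))

# After enough repetition, the same neuron reliably wins for this motif.
print(present(motif))
```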

Then came:
Spikenet 20160630
JAST 20170320
BrainChip Studio facial recognition software 20170719
BrainChip Accelerator 201710 (hardware accelerator for Studio)
Akida 20180910.

So the putative Mata Hari did not have access to the creme de la creme or the secret sauce.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 39 users
Dozzaman1977 said: "Just as a matter of interest, could one of our posters ask ChatGPT..." (quoted in full above)
The only issue with this is that ChatGPT only has data up until 2021.

And today this is the only message you will receive... too many people are interested in it :)
[screenshot: ChatGPT "at capacity" notice]
 
  • Like
Reactions: 4 users

Diogenese

Top 20
The only issue with this is that ChatGPT only has data up until 2021.

And today this is the only message you will receive... too many people are interested in it :)
[screenshot: ChatGPT "at capacity" notice]
Maybe it could benefit from a hardware transformer?
 
  • Like
  • Fire
  • Haha
Reactions: 16 users