BRN Discussion Ongoing

Deena

Regular
Last year the annual report and Appendix 4E were released on Friday 24th February. One could assume that this year's reports may be released around Friday 23rd February.

These reports essentially give us a summary of what has happened over the past year, so there is unlikely to be any significant new information, which of course would need to be released via an ASX announcement. If you are expecting that, you will be disappointed.

Having said that, the summary of what has happened over the past 12 months is not insignificant. If the current surge in the share price is a result of the cumulative good news released in the last 12 months, then all I can say is that the current buyers are slow learners. The sheer volume of partnerships and adopters of the Brainchip story is now staggering.

Things have been going exceedingly well, and they will only get better as revenue starts to kick in and build.
What a great company to hold shares in!

CUSTOMERS + ECOSYSTEM + PARTNERS + PATENT PROTECTION + RESEARCH + MANAGEMENT + QUALITY EMPLOYEES + PUBLICITY + UNIQUE OFFERING.
What more could you want? Go Brainchip / Akida.
Cheers, Deena
 
  • Like
  • Love
  • Fire
Reactions: 66 users

Tels61

Member
Whilst you're down there Pom, see if you can find that Defence. Sorry, couldn't help myself, but you must have known it was coming. LOL
 
  • Haha
Reactions: 1 users

TheDrooben

Pretty Pretty Pretty Pretty Good
Speeding ticket today IMO if this volume continues
 
  • Like
Reactions: 5 users

Esq.111

Fascinatingly Intuitive.
Morning TheDrooben ,

Just need someone to dump a $10 to $20 mill buy order straight onto the market, at market.

Wipe the lot and most if not all of the hidden orders.

Would / Will be rather exciting to watch.

😙.

Regards,
Esq.
 
  • Like
  • Fire
  • Love
Reactions: 21 users

HarryCool1

Regular
Just need someone to dump a $10 to $20 mill buy order straight onto the market, at market. ...

Regards,
Esq.

I'm out I'm afraid.. I'll chip in the $1-$2 orders..
 
  • Haha
  • Like
Reactions: 7 users

Esq.111

Fascinatingly Intuitive.
But you know you want to.

Esq.
 
  • Like
Reactions: 2 users
Whilst you're down there Pom, see if you can find that Defence. ... LOL
Glad I never got up to watch it COYI
 
  • Like
Reactions: 1 users
There is only just over 4% short interest against BRN, or just under 80 million shares.



In the past year, they have always just taken out a big mallet and smashed little "uprisings" like this back down.

They will likely do it again, with 4 to 10 million freshly borrowed shares, over the next day or two.

They are creatures of habit, and they're not just shorting BRN but hundreds of other companies.
Their algorithmic programs may just do it automatically (most likely).

I believe that's how they get burnt when there is genuine strong demand for a company's stock.

Let's see how they fare this time.
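
As a quick back-of-envelope check, those two figures hang together (a minimal sketch; the exact percentage below is my reading of "just over 4%", not a stated number):

```python
# Back-of-envelope: what share count do "just under 80 million" shorted shares
# at "just over 4%" short interest imply?
shorted_shares = 79e6      # "just under 80 million" per the post
short_interest = 0.043     # "just over 4%" per the post (exact figure assumed)

implied_shares_on_issue = shorted_shares / short_interest
print(f"~{implied_shares_on_issue / 1e9:.2f} billion shares on issue")  # ~1.84 billion
```

Roughly 1.8 billion shares is in the right ballpark for BRN's issued capital, so the 4% and 80 million figures are consistent with each other.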
 
  • Like
  • Fire
  • Haha
Reactions: 28 users

ndefries

Regular
There is only just over 4% short interest against BRN, or just under 80 million shares. ... Let's see how they fare this time.
Hopefully they have less money at the moment to do this.
 
  • Like
Reactions: 3 users

equanimous

Norse clairvoyant shapeshifter goddess
Got to love a resurrection.
 
  • Like
  • Haha
  • Love
Reactions: 9 users

equanimous

Norse clairvoyant shapeshifter goddess
  • Haha
  • Like
Reactions: 12 users

Exclusive: Nvidia pursues $30 billion custom chip opportunity with new unit

By Max A. Cherney and Stephen Nellis
February 10, 2024, 9:12 AM


https://www.reuters.com/technology/...chip-market-with-new-unit-sources-2024-02-09/
...
"If you're really trying to optimize on things like power, or optimize on cost for your application, you can't afford to go drop an H100 or A100 in there," Greg Reichow, general partner at venture capital firm Eclipse Ventures said in an interview. "You want to have the exact right mixture of compute and just the kind of compute that you need."
Nvidia does not disclose H100 prices, which are higher than for the prior-generation A100, but each chip can sell for $16,000 to $100,000 depending on volume and other factors. Meta plans to bring its total stock to 350,000 H100s this year.

Nvidia officials have met with representatives from Amazon.com (AMZN.O), Meta, Microsoft, Google and OpenAI to discuss making custom chips for them, two sources familiar with the meetings said. Beyond data center chips, Nvidia has pursued telecom, automotive and video game customers.
...
In 2022, Nvidia said it would let third-party customers integrate some of its proprietary networking technology with their own chips. It has said nothing about the program since, and Reuters is reporting its wider ambitions for the first time.
A Nvidia spokesperson declined to comment beyond the company's 2022 announcement.

We recall RT's "partners rather than competitors" remark, in the context of MB.OS, but are there wider implications?

Nvidia "would let 3rd party customers integrate some of its (Nvidia's) proprietary networking technology with their (the customers) own chips." Is this a 2-way street?
Hi Diogenese

Well I don’t think you need to look very hard to find that two-way street on a map.

Just put Edge Impulse into your SatNav and you will find it runs straight past their office.

Nvidia Tao can be trialled and applications built directly on AKIDA using Edge Impulse and MetaTF.

The interesting thing to ponder is whether Nvidia’s commercial interest would involve keeping a partnership with Brainchip a secret.

ARM most certainly did not see the need for secrecy and in fact advertises AKIDA's compatibility with its processors. It even wrote of the fact that AKIDA made their processors more powerful and efficient.

So if a potential customer can go onto Edge Impulse, run Tao on AKIDA and benchmark it against Jetson Nano, as Tata Consulting Services and Quantum Ventura have, why would Nvidia not adopt the ARM approach of boasting that AKIDA is available through them and compatible with their entire GPU range?
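
For anyone wondering what "building directly on AKIDA using MetaTF" looks like in practice, here is a minimal sketch of the conversion flow. The quantize/convert calls are my recollection of BrainChip's cnn2snn/akida packages and may differ by MetaTF version, and the toy Keras network simply stands in for a model exported from Tao or Edge Impulse:

```python
# Minimal sketch of the MetaTF flow: Keras model -> quantized model -> Akida model.
# Assumes BrainChip's MetaTF packages (cnn2snn, akida) are installed; the exact
# quantize/convert signatures follow my reading of the MetaTF docs and may differ
# between versions, and the tiny CNN below is illustrative only.
import numpy as np
import tensorflow as tf
from cnn2snn import quantize, convert

# Toy Keras CNN (real models must meet Akida layer constraints).
keras_model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(16, 3, strides=2, activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])

# Quantize weights/activations to Akida-friendly bit widths, then convert to an
# event-based Akida model that can run in software or be mapped to AKD hardware.
quantized = quantize(keras_model, weight_quantization=4, activ_quantization=4)
akida_model = convert(quantized)
akida_model.summary()

# Inference on dummy uint8 input (on real silicon you would map the model to a device).
dummy = np.random.randint(0, 256, (1, 32, 32, 3), dtype=np.uint8)
print(akida_model.forward(dummy).shape)
```

The appeal of that flow is that an existing Keras/TensorFlow model gets quantized and converted rather than rewritten from scratch, which is presumably what makes the kind of benchmarking exercise mentioned above relatively painless.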

Now here is something I have been pondering. Brainchip is heavily promoting its Edge credentials, yet Dr. Anthony Thorne, the new CTO, asked the question: can it scale up for large language models?

Up is the Cloud/Data Centre.

Nvidia has a stranglehold on GPUs for Cloud/Data Centre applications.

Sam Altman is looking to make the power consumption of ChatGPT affordable and environmentally acceptable.

This pursuit underpinned the RainAi $51 million exercise.

Nvidia would need to be asleep at the wheel not to recognise the whole world is chasing a solution to this power consumption issue.

If Nvidia had a quick fix, it would overnight ensure its stranglehold on the Cloud/Data Centre market continued.

If AKIDA can "scale up" and plug into Nvidia GPUs, this emergency would be averted.

Peter van der Made and Anil Mankar have always said that AKIDA is scalable and that it could take over all the AI inference in Data Centres.

Indeed this was always Peter van der Made’s ambition.

AKIDA 2.0 P was initially 50 TOPS when first announced, but barely 5 months later it had been scaled up to 131 TOPS, roughly 2.6 times what was originally planned.

WHY???

Tapeout was close according to Tod Vierra, so in about three months AKIDA 2.0 will be available to put on PCIe boards and plugged in alongside someone's GPU to offload AI inference.

Lots of questions to consider, starting with: why did Rob Telson suggest Nvidia was more likely to be a partner than a competitor???

My opinion only DYOR
Fact Finder
 
  • Like
  • Fire
  • Love
Reactions: 112 users

IloveLamp

Top 20
Hi Diogenese ...

My opinion only DYOR
Fact Finder

 
  • Like
  • Haha
  • Love
Reactions: 22 users

7für7

Regular
Hi Diogenese ...

My opinion only DYOR
Fact Finder

 
  • Like
  • Fire
  • Haha
Reactions: 9 users

miaeffect

Oat latte lover
Hi Diogenese ...

My opinion only DYOR
Fact Finder

Dr. Anthony Thorne the new CTO
AL + ST
 
  • Haha
  • Like
  • Fire
Reactions: 13 users

7für7

Regular
Possible scenario in "the dean's" and "tom&jerry's" brains at the moment!

 
  • Haha
Reactions: 12 users

MDhere

Regular
Just need someone to dump a $10 to $20 mill buy order straight onto the market, at market. ...

Regards,
Esq.
Yep, only a few hours left to head straight to the top of the ladder, the $4.50 board. How hard can that be?
@Pom down under, dig deeper into that couch of yours or up in the attic.
Surely someone can make the full wipe to the top in just a few hours :)
 
  • Haha
  • Like
  • Love
Reactions: 14 users

Esq.111

Fascinatingly Intuitive.
BOOM
 
  • Like
  • Fire
  • Wow
Reactions: 27 users