BRN Discussion Ongoing

The Pope

Regular
It's embarrassing for BRN to go from the ASX200 back to the ASX300.
Are we still waiting for 2023, or 2024, or 2025, or 2030?
If I said the latter (2030), could you wait until then?
Imagine reaching 20,000 more pages of research posts on TSEx by then and still dot-joining to Apple, Samsung, Tesla etc.
While I would not be happy to wait until then, I will wait if the SP goes up to $10 or more. I try to maintain a glass-half-full approach to all things linked to BRN. Onwards and upwards!!!
 
  • Like
  • Haha
Reactions: 15 users

Ian

Founding Member
I believe you're just embarrassing yourself
It's embarrassing for BRN to go from the ASX200 back to the ASX300.
Are we still waiting for 2023, or 2024, or 2025, or 2030?
 
  • Like
  • Fire
  • Haha
Reactions: 14 users

MDhere

Top 20
Afternoon KKFoo,

I would be guessing, say, 350,000-ish shares.

Alternatively, phone the share registry.

The photo below is from the latest Annual Report, with Boardroom Pty Ltd's (share registry) phone number.

Alternatively, simply join the directors at BRN and such a wish shall be granted swiftly.





Regards,
Esq.
There are only 1,480 people in the world with 100k of shares in this company? Thank you, Lord, for allowing me to be one of those. That leaves how many people? :) Kudos to the others too that jumped on board below 100k. But I bet the top 20 will be happy soon :) Now is a good time to be "stardust", who happens to be top 20. Who amongst us is stardust? :)
 
  • Like
  • Love
  • Fire
Reactions: 27 users

HopalongPetrovski

I'm Spartacus!
It's embarrassing for BRN to go from the ASX200 back to the ASX300.
Are we still waiting for 2023, or 2024, or 2025, or 2030?
No.
We are just waiting for some sustained revenue.

IMO, and with the benefit of hindsight, the MB announcement, which popped and caught most of us by surprise, skyrocketed our share price to what turned out to be an unsustainable height.
Whilst we all enjoyed the ride, I'm sure, without follow-up 'in kind' news or confirmation from other similarly influential sources, our valuation decayed.
Then, amid consequent unanticipated but significant global shocks, and smelling potential carrion, manipulators, shorts and other assorted bottom feeders began feasting on our carcass, which has been our unpleasant experience for the last 18 months or so.

The company has used this time in reshaping itself, putting new muscle on bone, honing and expanding both our skill sets and alliances with promising contemporary players and significant but ageing marque holders who have amassed and exert gravitic influence.

Enduring these setbacks and coming to terms with the realities of our environment has made us stronger, leaner and hungrier.

Whilst our revised strategy is longer in the making and requires more initial resilience, it is also liable to reward with both broader and deeper revenue streams, once sufficient momentum is generated.

Up until now we have only heard our motor cough a few times and emit some blue smoke, but we, here, are aware of a powerful rumbling in the distance, and soon, soon, we hope to hear that mighty Merlin roar into life.

 
  • Like
  • Love
  • Fire
Reactions: 46 users

Makeme 2020

Regular
I believe you're just embarrassing yourself
Thanks Buddy
Please explain.
 

Zedjack33

Regular
There are only 1,480 people in the world with 100k of shares in this company? Thank you, Lord, for allowing me to be one of those. That leaves how many people? :) Kudos to the others too that jumped on board below 100k. But I bet the top 20 will be happy soon :) Now is a good time to be "stardust", who happens to be top 20. Who amongst us is stardust? :)
Me too MD. 😀
 
  • Like
  • Fire
Reactions: 6 users

Makeme 2020

Regular
  • Haha
  • Like
Reactions: 5 users

Cartagena

Regular
There are only 1,480 people in the world with 100k of shares in this company? Thank you, Lord, for allowing me to be one of those. That leaves how many people? :) Kudos to the others too that jumped on board below 100k. But I bet the top 20 will be happy soon :) Now is a good time to be "stardust", who happens to be top 20. Who amongst us is stardust? :)

The 30-cent share price is completely disconnected from the potential applications and the imminent news we are yet to see. We are a work-in-progress start-up laying the foundations. Once we move to the next stage, with newly secured contracts or IP agreements followed by revenue, I think it will be interesting to watch the re-rate of the SP. Patience is needed. We just need one major household name to sign to send this into the dollars again; just my belief as a tech investor.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 29 users
Apologies in advance for the multiple posts; however, it was the only way of dropping the whole article/review in full via screen grabs.

I could have posted just the link but thought some may appreciate being able to read it via TSE when scrolling through.

While we had the Tata Elxsi announcement and it covered a couple of areas of interest, there is far more opportunity with this partnership imo, and this info covers quite a bit and is definitely worth a read... couple of interesting things in there for mine ;)


[Screen grabs attached: TATA ELXSI ARTICLE JAN 23, parts A–D]
 
  • Like
  • Love
  • Fire
Reactions: 46 users

wilzy123

Founding Member
It's embarrassing for BRN to go from the ASX200 back to the ASX300.
Are we still waiting for 2023, or 2024, or 2025, or 2030?

BRN hasn't gone "back to ASX300". When did they leave it?

Great post!!!
 
  • Haha
  • Fire
  • Like
Reactions: 6 users
Apologies in advance for the multiple posts; however, it was the only way of dropping the whole article/review in full via screen grabs.

I could have posted just the link but thought some may appreciate being able to read it via TSE when scrolling through.

While we had the Tata Elxsi announcement and it covered a couple of areas of interest, there is far more opportunity with this partnership imo, and this info covers quite a bit and is definitely worth a read... couple of interesting things in there for mine ;)


View attachment 44682 View attachment 44683 View attachment 44684 View attachment 44685
Part 2


[Screen grabs attached: TATA ELXSI ARTICLE JAN 23, parts E–H]
 
  • Like
  • Love
  • Fire
Reactions: 39 users

yogi

Regular
There are only 1,480 people in the world with 100k of shares in this company? Thank you, Lord, for allowing me to be one of those. That leaves how many people? :) Kudos to the others too that jumped on board below 100k. But I bet the top 20 will be happy soon :) Now is a good time to be "stardust", who happens to be top 20. Who amongst us is stardust? :)
I'm somewhere around there, but the humbling thing is I met someone at the AGM who belongs in the top 20, and he is so humble he doesn't show any sign of how much he holds and keeps quiet.
BTW, I am going through some health issues and it's hard for me to type, so forgive me for any typos.
 
  • Love
  • Like
  • Sad
Reactions: 33 users

Tothemoon24

Top 20
Environmental impact of Generative AI products:

AI, with the recent arrival of ChatGPT, Bard, etc., has sparked massive interest among the public and industries alike. These NLP tools, built on large language models, assist in customer support, help develop and debug code, and serve as conversational agents, aiming to boost our productivity and hence economic value.

But would these benefits justify the harm done to our planet?

1. Electricity consumption for training humongous language models is through the roof. For example, training a 13B-parameter LLM on 390B text tokens on 200 GPUs for 7 days costs $151,744 (source: Hugging Face's new training cluster service page - https://lnkd.in/g6Vc5cz3). Even larger models with 100+B parameters cost $10M+ just to train. Then you pay for inference every time a new prompt request arrives.

2. Water consumption for cooling: researchers at the University of California, Riverside estimated the environmental impact of a ChatGPT-like service and say it gulps up 500 milliliters of water (close to what's in a 16-ounce water bottle) every time you ask it a series of between 5 and 50 prompts or questions. The range varies depending on where its servers are located and the season. The estimate includes indirect water usage that the companies don't measure, such as cooling the power plants that supply the data centers with electricity. (Source: https://lnkd.in/gybcxX8C)
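The two figures above reduce to simple arithmetic; a quick sanity check in Python (the per-GPU-hour rate and per-prompt range are derived from the quoted numbers, not stated in the sources):

```python
# Back-of-the-envelope checks of the figures quoted above.

# 1. Training cost: 200 GPUs for 7 days, $151,744 total.
gpu_hours = 200 * 7 * 24
usd_per_gpu_hour = 151_744 / gpu_hours
print(gpu_hours)                    # 33600 GPU-hours
print(round(usd_per_gpu_hour, 2))   # ~4.52 USD per GPU-hour

# 2. Water use: ~500 ml per session of 5 to 50 prompts.
ml_per_prompt_range = (500 / 50, 500 / 5)
print(ml_per_prompt_range)          # (10.0, 100.0) ml per prompt
```

So the quoted training run implies roughly $4.50 per GPU-hour, and the water estimate works out to somewhere between 10 and 100 ml per individual prompt.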

Given the benefits of AI/GenAI, its demand is only bound to go up, and so will its side effects on our planet. How can we reduce or neutralize the side effects of AI on our planet? Carbon capture and nuclear power are steps in the right direction. But we need to fundamentally rethink the way we do AI: is doing tonnes of matrix multiplications the wrong way?

Our brain can learn and do many tasks in parallel on under 10 W, so why do these AI systems consume tens of megawatts to train models?

⬇️⬇️⬇️⬇️

Perhaps the future holds energy-efficient architectures such as neuromorphic architectures and spiking neural network-based transformers that are closest to the human brain, which might consume 100-1000x lower energy, hence reducing the cost of using AI, thereby democratizing it and saving our planet.
 
  • Like
  • Love
  • Fire
Reactions: 37 users

Learning

Learning to the Top 🕵‍♂️
Apologies in advance for the multiple posts; however, it was the only way of dropping the whole article/review in full via screen grabs.

I could have posted just the link but thought some may appreciate being able to read it via TSE when scrolling through.

While we had the Tata Elxsi announcement and it covered a couple of areas of interest, there is far more opportunity with this partnership imo, and this info covers quite a bit and is definitely worth a read... couple of interesting things in there for mine ;)


View attachment 44682 View attachment 44683 View attachment 44684 View attachment 44685
Thank you FMF,

Fantastic post.
Greatly appreciating the time and effort 👌

Learning 🪴
 
  • Like
  • Love
Reactions: 26 users
No.
We are just waiting for some sustained revenue.

IMO, and with the benefit of hindsight, the MB announcement, which popped and caught most of us by surprise, skyrocketed our share price to what turned out to be an unsustainable height.
Whilst we all enjoyed the ride, I'm sure, without follow-up 'in kind' news or confirmation from other similarly influential sources, our valuation decayed.
Then, amid consequent unanticipated but significant global shocks, and smelling potential carrion, manipulators, shorts and other assorted bottom feeders began feasting on our carcass, which has been our unpleasant experience for the last 18 months or so.

The company has used this time in reshaping itself, putting new muscle on bone, honing and expanding both our skill sets and alliances with promising contemporary players and significant but ageing marque holders who have amassed and exert gravitic influence.

Enduring these setbacks and coming to terms with the realities of our environment has made us stronger, leaner and hungrier.

Whilst our revised strategy is longer in the making and requires more initial resilience, it is also liable to reward with both broader and deeper revenue streams, once sufficient momentum is generated.

Up until now we have only heard our motor cough a few times and emit some blue smoke, but we, here, are aware of a powerful rumbling in the distance, and soon, soon, we hope to hear that mighty Merlin roar into life.


Damn well said! @HopalongPetrovski

Perfectly worded and the true and current landscape we are in.

I will add that those "global shocks" you mentioned have compounded the shorting effect, and the "follow-up" big news that never fully arrived just smashed the share price over time, unfortunately.

Various tech (IT, finance and bio) stocks that I watch that have missed a goal or two in this "global shock" economic environment have also been decimated from a share-price perspective, so it's important to know that Brainchip is not alone in having its SP smashed down. Economically, risk or growth stocks are just not in favour right now; that is pure investment planning and risk management by the institutions presently.

I look forward to a time in the economic cycle where growth tech stocks are backed firmly again in the share market, as that will also help us greatly with this investment and limit some of the manipulation.

I loved seeing the tens of millions of shares with buyers lining up to buy all that stock at the market close, happy to take each and every insto sale. A great sign this afternoon to know we are not crazy and that the company is wanted heavily in the market when BRN stock is there to buy in bulk!
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 15 users

Murphy

Life is not a dress rehearsal!
  • Like
  • Love
Reactions: 23 users

Murphy

Life is not a dress rehearsal!

Could Akida be used here?​

I do know we have a partnership with Cadence.
Main parts highlighted in orange!

Cadence Accelerates On-Device and Edge AI Performance and Efficiency with New Neo NPU IP and NeuroWeave SDK for Silicon Design​

SAN JOSE, Calif.— September 14, 2023 -- Cadence Design Systems, Inc. (Nasdaq: CDNS) today unveiled its next-generation AI IP and software tools to address the escalating demand for on-device and edge AI processing. The new highly scalable Cadence® Neo™ Neural Processing Units (NPUs) deliver a wide range of AI performance in a low-energy footprint, bringing new levels of performance and efficiency to AI SoCs. Delivering up to 80 TOPS performance in a single core, the Neo NPUs support both classic and new generative AI models and can offload AI/ML execution from any host processor—including application processors, general-purpose microcontrollers and DSPs—with a simple and scalable AMBA® AXI interconnect. Complementing the AI hardware, the new NeuroWeave™ Software Development Kit (SDK) provides developers with a “one-tool” AI software solution across Cadence AI and Tensilica® IP products for no-code AI development.
“While most of the recent attention on AI has been cloud-focused, there are an incredible range of new possibilities that both classic and generative AI can enable on the edge and within devices,” said Bob O’Donnell, president and chief analyst at TECHnalysis Research. “From consumer to mobile and automotive to enterprise, we’re embarking on a new era of naturally intuitive intelligent devices. For these to come to fruition, both chip designers and device makers need a flexible, scalable combination of hardware and software solutions that allow them to bring the magic of AI to a wide range of power requirements and compute performance, all while leveraging familiar tools. New chip architectures that are optimized to accelerate ML models and software tools with seamless links to popular AI development frameworks are going to be incredibly important parts of this process.”
The flexible Neo NPUs are well suited for ultra-power-sensitive devices as well as high-performance systems with a configurable architecture, enabling SoC architects to integrate an optimal AI inferencing solution in a broad range of products, including intelligent sensors, IoT and mobile devices, cameras, hearables/wearables, PCs, AR/VR headsets and advanced driver-assistance systems (ADAS). New hardware and performance enhancements and key features/capabilities include:
  • Scalability: Single-core solution is scalable from 8 GOPS to 80 TOPS, with further extension to hundreds of TOPS with multicore
  • Broad configuration range: supports 256 to 32K MACs per cycle, allowing SoC architects to optimize their embedded AI solution to meet power, performance and area (PPA) tradeoffs
  • Integrated support for a myriad of network topologies and operators: enables efficient offloading of inferencing tasks from any host processor—including DSPs, general-purpose microcontrollers or application processors—significantly improving system performance and power
  • Ease of deployment: shortens the time to market to meet rapidly evolving next-generation vision, audio, radar, natural language processing (NLP) and generative AI pipelines
  • Flexibility: Support for Int4, Int8, Int16, and FP16 data types across a wide set of operations that form the basis of CNN, RNN and transformer-based networks allows flexibility in neural network performance and accuracy tradeoffs
  • High performance and efficiency: Up to 20X higher performance than the first-generation Cadence AI IP, with 2-5X the inferences per second per area (IPS/mm2) and 5-10X the inferences per second per Watt (IPS/W)
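For readers wondering how the MAC counts and TOPS figures in the list above relate, peak throughput for a MAC array is just MACs per cycle, times two operations per MAC (multiply plus accumulate), times clock rate. The 1.22 GHz clock below is a hypothetical value chosen so the top 32K-MAC configuration lands near the quoted 80 TOPS; Cadence does not state a frequency, and the 8 GOPS floor presumably corresponds to a much lower clock:

```python
# Peak-throughput sketch for a MAC-array NPU:
# ops/s = MACs_per_cycle * 2 (multiply + accumulate) * clock rate.
def peak_tops(macs_per_cycle: int, clock_ghz: float) -> float:
    """Tera-operations per second, counting a MAC as 2 ops."""
    return macs_per_cycle * 2 * clock_ghz / 1000.0  # Gops -> TOPS

print(round(peak_tops(32 * 1024, 1.22), 1))    # ~80.0 TOPS, top config
print(round(peak_tops(256, 1.22) * 1000))      # ~625 GOPS, smallest config
```

A sketch only, but it shows why the single-core range spans roughly four orders of magnitude: the configurable MAC count (256 to 32K per cycle) does most of the scaling, with clock frequency covering the rest.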
Since software is a critical part of any AI solution, Cadence also upgraded its common software toolchain with the introduction of the NeuroWeave SDK. Providing customers with a uniform, scalable and configurable software stack across Tensilica DSPs, controllers and Neo NPUs to address all target applications, the NeuroWeave SDK streamlines product development and enables an easy migration as design requirements evolve. It supports many industry-standard domain-specific ML frameworks, including TensorFlow, ONNX, PyTorch, Caffe2, TensorFlow Lite, MXNet, JAX and others for automated end-to-end code generation; Android Neural Network Compiler; TF Lite Delegates for real-time execution; and TensorFlow Lite Micro for microcontroller-class devices.

“For two decades and with more than 60 billion processors shipped, industry-leading SoC customers have relied on Cadence processor IP for their edge and on-device SoCs. Our Neo NPUs capitalize on this expertise, delivering a leap forward in AI processing and performance,” said David Glasco, vice president of research and development for Tensilica IP at Cadence. “In today’s rapidly evolving landscape, it’s critical that our customers are able to design and deliver AI solutions based on their unique requirements and KPIs without concern about whether future neural networks are supported. Toward this end, we’ve made significant investments in our new AI hardware platform and software toolchain to enable AI at every performance, power and cost point and to drive the rapid deployment of AI-enabled systems.”

“At Labforge, we use a cluster of Cadence Tensilica DSPs in our Bottlenose smart camera product line to enable best-in-class AI processing for power-sensitive edge applications,” said Yassir Rizwan, CEO of Labforge, Inc. “Cadence’s AI software is an integral part of our embedded low power AI solution, and we’re looking forward to leveraging the new capabilities and higher performance offered by Cadence’s new NeuroWeave SDK. With an end-to-end compiler toolchain flow, we can better solve challenging AI problems in automation and robotics—accelerating our time to market to capitalize on generative AI-based application demand and opening new market streams that may not have been possible otherwise.”

The Neo NPUs and the NeuroWeave SDK support Cadence’s Intelligent System Design™ strategy by enabling pervasive intelligence through SoC design excellence.

Availability

The Neo NPUs and the NeuroWeave SDK are expected to be in general availability beginning in December 2023. Early engagements have already started for lead customers. For more information, please visit www.cadence.com/go/NPU.

About Cadence

Cadence is a pivotal leader in electronic design, building upon more than 30 years of computational software expertise. The company applies its underlying Intelligent System Design strategy to deliver software, hardware and IP that turn design concepts into reality. Cadence customers are the world’s most innovative companies, delivering extraordinary electronic products from chips to boards to systems for the most dynamic market applications, including consumer, hyperscale computing, 5G communications, automotive, mobile, aerospace, industrial and healthcare. For nine years in a row, Fortune magazine has named Cadence one of the 100 Best Companies to Work For. Learn more at cadence.com.
Bravo had some big connections outlined with Cadence, I reckon. Sounds promising!


If you don't have dreams, you can't have dreams come true
 
  • Like
  • Love
Reactions: 9 users