BRN Discussion Ongoing

D

Deleted member 118

Guest
Just got this email, not sure if posted, but too busy working to look

4A098489-AFC5-4831-9093-06712DE6ABC6.png
 

buena suerte :-)

BOB Bank of Brainchip
A thought retrospective:

“Teksun focuses on end to end IoT product development and enabling intelligent solutions, such as predictive and preventative maintenance devices, analytics and diagnostics for portable healthcare, and vision based devices for security and surveillance. The partnership between BrainChip and Teksun proliferates intelligence through the Teksun product development channels.”

The above quote is from the Partners page on the Brainchip website. Teksun is categorised under the same heading as MegaChips, of which Peter van der Made, as acting CEO, said that the market did not understand the significance of the partnership to Brainchip’s commercial success.

In the website quote above Brainchip claims that with Teksun they will proliferate intelligence through Teksun development channels.

What does proliferate mean?

“proliferate \pruh-LIF-uh-rayt\ verb. 1 : to grow or cause to grow by rapid production of new parts, cells, buds, or offspring. 2 : to increase or cause to increase in number as if by proliferating : multiply.”

So what are Teksun’s development channels?

Too HUGE to fit here, so enjoy its YouTube presentations:


If you don’t have the rest of your life (minus sleep) to spend watching all of the above, here is a brief summary:

Our Domains

“Teksun helps businesses, technology providers, and start-ups build products in the domains of:

Home Automation,
Wearable,
Consumer Electronics,
Industrial Automation,
Semiconductor,
Aerospace,
Automotive,
Healthcare,
Agritech, and
more - to be found here:

We provide consulting, development, testing, support, and maintenance across the mentioned domains.”

So consider that Teksun was so keen to get the word out about Brainchip that it put up an unauthorised statement on its website, revealing not-yet-announced Brainchip customers in Cisco and Toshiba, which was quickly removed once Brainchip discovered the breach. How do you think Peter van der Made would describe the partnership with Teksun, if he were still the Acting CEO, by reference to how he rated the partnership with MegaChips?

Is that the sound of fireworks I can hear in the background?

My opinion only DYOR
FF

AKIDA BALLISTA
 

Lex555

Regular
Should mention that the five largest microcontroller suppliers develop and sell ARM-based MCUs.

Perfect recipe for BRN large market share.

Full article here: https://www.emsnow.com/the-five-biggest-mcu-suppliers-accounted-for-82-of-2021-sales/
Great stuff @Steve10

If we take the mid-range estimate of 50bn microcontrollers by around 2026, we could be looking at a market cap of $45bn for MCUs alone.

50b units
x $1 per MCU price
x $0.10 or 10% royalty
x 30% BRN market share
= $1.5bn revenue
x 30 price to sales
= $45bn market cap
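The back-of-envelope estimate above can be sanity-checked in a few lines of Python. Note that every input (unit count, ASP, royalty rate, market share, and the 30x price-to-sales multiple) is the poster's assumption, not a published figure:

```python
# Back-of-envelope MCU market-cap estimate (all inputs are assumptions)
units = 50e9          # MCUs shipped per year by ~2026 (mid-range guess)
price = 1.00          # assumed average selling price per MCU, USD
royalty_rate = 0.10   # assumed 10% royalty on the selling price
market_share = 0.30   # assumed BRN share of MCU units
ps_multiple = 30      # assumed price-to-sales multiple

revenue = units * price * royalty_rate * market_share
market_cap = revenue * ps_multiple

print(f"revenue = ${revenue/1e9:.1f}bn, market cap = ${market_cap/1e9:.0f}bn")
# revenue = $1.5bn, market cap = $45bn
```

Changing any one of those assumptions scales the result linearly, which is worth keeping in mind when comparing the more conservative scenarios later in the thread.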
 

mcm

Regular
(quoting the MCU market-cap estimate above)
Looks good to me! 😎
Over 1.4 million shares on the buy at 46c, and a few want to sell their precious BRNs for 46.5c? I just don't get their thinking. I do get the sense, however, that the share price is set to explode upwards very soon with all the ducks that keep lining up.
 
Still, what is AGI?

I suppose that a human is GI, but what about a dolphin? A pig? A dog? A cat? At what level does General Intelligence stop?

Where does AGI start?

Could Akida-P maybe compete with a cat in regards to intelligence, although it's a significantly different kind of intelligence?

I understand that wet-ware neural networks operate at very low frequencies, something like 50-70 Hz, but have many more neurons. Supposedly a cat has a bit more than 500 million neurons. The previous Akida 1.0 could be connected to have up to around 70 million neurons, but operating at hundreds of MHz. How about Akida 2.0 P, performing 50 TFLOPS (the Akida 2.0 kind)?
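As a very naive illustration of the frequency-versus-neuron-count trade-off described above (the neuron counts are the post's own estimates, the 300 MHz clock is an assumed stand-in for "hundreds of MHz", and raw update rate of course says nothing about what either network actually computes):

```python
# Naive "neuron updates per second" comparison (illustrative only)
cat_neurons, cat_hz = 500e6, 60        # ~500M neurons firing at ~50-70 Hz
akida_neurons, akida_hz = 70e6, 300e6  # ~70M neurons, assumed ~300 MHz clock

cat_rate = cat_neurons * cat_hz        # 3.0e10 updates/s
akida_rate = akida_neurons * akida_hz  # 2.1e16 updates/s

print(f"Akida raw rate is ~{akida_rate / cat_rate:,.0f}x the cat's")
# Akida raw rate is ~700,000x the cat's
```

The raw-throughput gap is enormous, but biological neurons do far more per "update" than a digital spiking neuron, so this is at best a conversation starter, not a comparison of intelligence.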

I think the debate about whether AGI is here starts now and will last until long after AGI has surpassed GI.
Hey, I like cats, but any company that develops "cat-like" intelligence is going to be a failure.

How often is something like a C.A.I. doorbell going to "feel" like working?

I can already imagine the frustration 🤣


"Hey! I know you know it's me, let me in damn it!"


_20230321_110838.JPG
 

Tothemoon24

Top 20
Carnegie Mellon University / NASA = Brainchip



CMU To Lead NASA Space Technology Research Institute

By Monica Cooney
In space travel, custom parts for vehicles such as rockets and satellites are often needed quickly to accommodate changes in design, as well as for repair and functionality purposes. Additive manufacturing is an ideal technology to meet these needs, as components can be made through a relatively short cycle of design, build and test. However, this cycle must be continually refined to ensure the quality and reliability of the 3D-printed parts.
A new NASA Space Technology Research Institute (STRI) led by Carnegie Mellon University seeks to shorten the cycle required to design, manufacture and test parts that can withstand the conditions of space travel through the development of models for qualification and certification (Q&C).
First set up in 2016, the STRI program aims to strengthen NASA's ties to the academic community through long-term, sustained investment in research and technology development, while also fostering talent among highly skilled engineers, scientists and technologists.
The Institute for Model-based Qualification & Certification of Additive Manufacturing (IMQCAM), a $15 million project spread over five years, will be co-directed by Tony Rollett, a professor of materials science and engineering at CMU, and Michael Callas, a professor of civil and systems engineering at Johns Hopkins University.
"In order to make a printed product have predictable properties, we need to understand more about what its internal structure is, how it depends on the printing process and what properties it has," said Rollett. "The STRI affords us an opportunity for a major collaboration through which we can construct the models that our partners at NASA very much need in order to do their work."
Over the course of five years, the institute will develop detailed computer models, or digital twins, for additively manufactured parts that have been validated against experimental data, verified against physical mechanisms and subjected to rigorous uncertainty quantification protocols. The models will evaluate response to fatigue in spaceflight materials that are currently used for 3D printing, as well as introducing and qualifying new materials.
The project outcomes will serve as a vital resource for partners at NASA, as the models will enable them to better predict the parts' performance abilities.
The institute will also serve as a catalyst for recruiting and training students and post-docs to have a comprehensive understanding of the additive manufacturing Q&C process and be the future leaders in the field. Students from institutional partners will be mentored by both STRI team members and NASA researchers throughout the project.
Carnegie Mellon faculty members Sneha Prabha Narra, Mohadeseh Taheri-Mousavi, and Bryan Webler will also contribute their expertise to the institute.
Additional institutional partners on the project include Vanderbilt University, University of Texas at San Antonio, University of Virginia, Case Western Reserve University, Johns Hopkins University Applied Physics Laboratory, Southwest Research Institute, and Pratt & Whitney.

 

Steve10

Regular
(quoting the MCU market-cap estimate above)

Not all of the MCUs will be for AI applications.

I would allow approx. 1% for AI enabled MCU's x 30B in 2023 = 300M chips TAM.

In three years I would allow 5-10% for AI enabled MCU's x 42B = 2.1-4.2B chips TAM.

2.1-4.2B chips x 10% BRN market share = 210-420M chips x 30c BRN royalty = $63-126M revenue.
(I have allowed 30c for royalty, however, AI enabled MCU's appear to be priced at around $20 x 2-3% royalty = 40-60c)

With 30% market share = $189-378M revenue x 60% EBITDA = $113.4-226.8M x 0.7 if taxed in Australia = $79.4-158.8M NPAT x PE60 =$4.76-9.52B MC.

It appears that most of the MCU market future growth will be in AI enabled MCU's.

By 2030 there will be approx. 70B MCU's per year - 30B in 2023 = 40B per year growth within 7 years.

Most likely 80% of the 40B MCU future growth will be for AI applications = 32B / 70B total = 45.7% of MCU's will be AI enabled by 2030 when there will be mass adoption.

So in 2030 if 32B MCU's are AI enabled x 10% BRN market share = 3.2B x 30c royalty = $960M revenue x 60% EBITDA = $576M x 0.7 ATO = $403.2M NPAT x PE60 = $24.2B MC. With 30% market share MC will be $72.6B.
(PE could be as high as 100. Nvidia is currently trading at PE 148.65)

Revenue growth will be similar to J-curve as mass adoption takes place. We are at the early adoption phase at the moment with early mass adoption expected within 3-6 years.
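The 2030 scenario above can be laid out step by step in Python. As with the earlier estimate, every input (32B AI-enabled MCUs, 10% share, 30c royalty, 60% EBITDA margin, 30% Australian tax, PE of 60) is the poster's assumption:

```python
# Steve10's 2030 scenario, step by step (every input is an assumption)
ai_mcus_2030 = 32e9     # AI-enabled MCUs shipped per year by 2030
brn_share = 0.10        # assumed BRN market share of those units
royalty = 0.30          # assumed royalty per chip, USD
ebitda_margin = 0.60    # assumed EBITDA margin
tax_factor = 0.70       # ~30% Australian company tax retained as NPAT
pe = 60                 # assumed price-to-earnings multiple

revenue = ai_mcus_2030 * brn_share * royalty   # $960M
npat = revenue * ebitda_margin * tax_factor    # $403.2M
market_cap = npat * pe                         # ~$24.2bn

print(f"revenue ${revenue/1e6:.0f}M, NPAT ${npat/1e6:.1f}M, MC ${market_cap/1e9:.1f}bn")
# revenue $960M, NPAT $403.2M, MC $24.2bn
```

Tripling the assumed market share to 30% simply triples each line, which is where the $72.6B market-cap figure comes from.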
 
D

Deleted member 118

Guest
Maybe a few $$$ for us on future sales
 

Attachments

  • 6A1141143_YPB.pdf

MDhere

Regular
On driving holiday to Melbourne and couldn't help myself to this little man at Parkes Observatory :)
20230321_123935.jpg
 

HopalongPetrovski

I'm Spartacus!

Steve10

Regular
With the lack of many competitors offering neuromorphic hardware I would expect BRN to have a good chunk of market share.

The top 5 players in the MCU market account for 82.1% of all MCUs. The top 4 have 16.7-18.8% market share each, whereas number 5 has 11.8%.

BRN should be in the top 5 for neuromorphic hardware with at least 15-20% market share.

Hailo & SynSense appear to be producing & selling neuromorphic chips whereas BRN is selling IP.

The IP business model scales faster & can attain high market share as witnessed by ARM. It is also more flexible as customers can incorporate the IP within their own custom designs. BRN will also have the ARM based markets covered as well as the emerging RISC-V based markets for AI.

It doesn't appear that customers can incorporate Hailo or SynSense IP into their designs, as they sell chips and not IP. They have partnered with other companies to develop/produce 'one-off' products for particular applications, such as SynSense partnering with iniVation (Prophesee's competitor) to develop Speck.
 

Esq.111

Fascinatingly Intuitive.
(quoting Steve10's AI-enabled MCU revenue estimate above)
Good Afternoon Steve10,

Love all of your workings.

I hereby bestow the prestigious abacus award upon thee.

😊.

Top job.

Regards,
Esq.
 

Foxdog

Regular
(quoting Esq.'s award post above)
The PAA can only be bestowed with an estimate of SP for the corresponding periods Dec 2023, 2028, 2030 🤣🤔
 

Steve10

Regular
Seriously SynSense are dreaming with their pricing for Speck.

Somebody requested pricing for one unit for home hobby & price is $4k. Try putting that into a smartphone. LOL

Small market only for very expensive, low-volume industrial applications. It appears to do object classification, object detection, object tracking etc., but no learning, and it uses CNN (not ideal for Prophesee).

jason

10 days ago (edited)
Can you purchase one (speck dvs camera) at the hobbyist level (a few hundred dollars?)
Open Neuromorphic
8 days ago
You mean the Speck SoC? Unfortunately, the price for that one is set to 4k$, since it is an ASIC recently introduced.

For those that don't know what Speck is here are details:

Speck™ is a fully event-driven neuromorphic vision SoC. Speck™ is able to support large-scale spiking convolutional neural networks (sCNN) with a fully asynchronous chip architecture. Speck™ is fully configurable with a spiking neuron capacity of 320K. Furthermore, it integrates a state-of-the-art dynamic vision sensor (DVS) that enables a fully event-driven, real-time, highly integrated solution for various dynamic visual scenes. For classical applications, Speck™ can provide intelligence about the scene at only milliwatts, with a response latency of a few ms.

1679364494399.png


 

Steve10

Regular
The PAA can only be bestowed with an estimate of SP for the corresponding periods Dec 2023, 2028, 2030 🤣🤔

Who's first?

1679364664300.png
 
(quoting Steve10's SynSense Speck post above)

Actually, they might be onto something here. Brainchip's Studio, if memory serves, was priced at $3,800 and failed to sell in a solid world economy, without wars all over the shop and with Europe having plenty of energy.

In today's world, with banking collapses, wars and trade sanctions, rounding up to $4,000 might very well be a winner, as it prevents hobbyists and universities from trialling it in a side-by-side comparison with AKIDA and Prophesee, who are giving theirs to universities along with free course materials and guest lecturers. Not being a listed company anywhere also allows them to produce material and make claims that might not pass muster for a listed company. Probably a win-win for them. :ROFLMAO::devilish::ROFLMAO::devilish:😂😂😂🤣

My opinion only DYOR
FF

AKIDA BALLISTA
 

Diogenese

Top 20
(quoting Steve10's SynSense Speck post above)

A really exciting parlour game is to combine darts with pin-the-tail-on-the-donkey.
 

wasMADX

Regular
I was reading Science Illustrated Issue#96 p.38 https://pubhtml5.com/kvtj/jffv/Science_Illustrated_Australia_2023/.

It says animals that, given a mirror to look in, try to remove a mark put on their forehead (applied under prior sedation) are judged to be self-aware.
Macaques don't try to remove the mark unless they are rewarded for touching it. Before that, they don't care. Other tests show they can be clever, e.g. they can remember 80% of numbers in a test vs 40% by humans.

I was wondering if the same principle can be applied, or already is, to Akida, i.e. whether it can learn by being rewarded for attending to a previously ignored piece of important data.
 

Diogenese

Top 20
Popular Science:

Why the Air Force wants 1,000 new combat drones​

Story by Kelsey D. Atherton • Yesterday 10:00 pm

https://www.msn.com/en-au/news/tech...1&cvid=004d1971adbe4ab88b61e54026f6a087&ei=76

Scroll down to the video:
1:25 - "a high performance aircraft capable of autonomous action"

Will this use the brains of the Boeing "Loyal Wingman" developed in Australia?

https://en.wikipedia.org/wiki/Boeing_MQ-28_Ghost_Bat

The Boeing MQ-28 Ghost Bat, previously known as the Boeing Airpower Teaming System (ATS) and the Loyal Wingman project, is a stealth, multirole, unmanned aerial vehicle in development by Boeing Australia for the Royal Australian Air Force (RAAF). It is designed as a force multiplier aircraft capable of flying alongside manned aircraft for support and performing autonomous missions independently using artificial intelligence.[4]
 