BRN Discussion Ongoing

This is going to be my biggest issue in life in a couple years time. I'm on track to retire over a decade early, and BRN could make my retirement a bit more comfy if I pressed the button then. OR... hold a few years more and be what I never dreamed of being ever in my life - a multi millionaire! Agonising decision it will be :-(

Sounds like you’re a winner either way @Derby1990 😉
 
  • Like
  • Love
  • Fire
Reactions: 15 users
Is there still a way to tune in to the AGM if you don’t have the official link? I am basically a full time employee/scout 🤣🤣🤣

Work From Home Reaction GIF
 
Last edited:
  • Like
Reactions: 3 users

Evermont

Stealth Mode
Relevant article following the recent Embedded Visions Summit.


Computing at the Edge and More for Embedded Vision​

May 17, 2022
Definable or not, end users continue deploying embedded vision in more and more applications.
Chris Mc Loone



In part...

Emerging Applications​

Just as it is difficult to come up with an all-encompassing definition for embedded vision, so is it difficult to compile a comprehensive list of applications. Although embedded vision has roots on the factory floor, its use today goes well beyond the confines of industrial automation. “The industrial field should be regarded as the starting point of embedded vision,” Huang says. “Embedded vision was widely used on production lines in factories and was fixed on equipment to complete the tasks of positioning, measurement, identification, and inspection. Today, embedded vision walks out of the ‘box’ as an industrial robot or an autonomous mobile robot (AMR). It is widely used in a variety of sectors ranging from agriculture, transportation, automotive, mining, medical to military.”

Mentioning AMRs means the discussion turns toward more dynamic environments. For example, Ni cites warehouses as one of these more dynamic environments. Unlike embedded vision systems deployed on production lines, which feature well-configured lighting, warehouse applications feature autonomous guided vehicles (AGVs)/AMRs that use vision for localization and obstacle detection. Another example is intelligent agricultural machinery that uses cameras for autonomous guidance. “These applications bring new challenges since it’s almost impossible to have a stable, well-set-up illumination condition,” he says. “Thus, we may need new cameras, new computing hardware, and new software to get the job done.”

Schmitt says Vision Components sees growing demand for embedded vision in all fields of business—both consumer and industrial. One recent application used AMRs for intralogistics and manufacturing processes. MIPI camera modules provide reliable and precise navigation, especially when collaborating with other robot systems and when positioning and aligning in narrow passages.

Taking industrial automation as an example for AMRs, Kenny Chang, vice president of system product BU at ASRock Industrial (Taipei City, Taiwan; www.asrockind.com), explains that AMRs employ embedded vision to sense their surroundings in the factory. “In addition, AI-implemented Automated Optical Inspection is another big trend for smart manufacturing, delivering huge benefits to manufacturers.”

Speaking of AI, Jeremy Pan, product manager, Aetina (New Taipei City, Taiwan; www.aetina.com), says that other AI applications for embedded vision include AI inference. One example is a virtual fence solution to detect if factory staff is entering a working robotic arm’s movement/motion radius to force the robotic arm to stop. Additionally, AI visual inspection can be used for defect detection in factories.
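The virtual-fence idea Pan describes boils down to a geometric check: if any detected person overlaps a predefined exclusion zone around the arm, stop the arm. A minimal sketch of that logic (the box format, zone coordinates, and function names are illustrative assumptions, not Aetina's actual solution):

```python
# Hypothetical sketch of a "virtual fence": if a detected person's
# bounding box overlaps the robot arm's exclusion zone, command a stop.

def boxes_overlap(a, b):
    """Axis-aligned boxes as (x1, y1, x2, y2); True if they intersect."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def arm_should_stop(person_boxes, exclusion_zone):
    """True if any detected person intrudes on the exclusion zone."""
    return any(boxes_overlap(p, exclusion_zone) for p in person_boxes)

ZONE = (100, 100, 300, 300)  # the arm's movement radius, as a box
print(arm_should_stop([(50, 50, 90, 90)], ZONE))      # False: clear
print(arm_should_stop([(250, 250, 400, 400)], ZONE))  # True: intrusion
```

In a real deployment the person boxes would come from an AI inference model running on the edge device, with the stop signal wired to the arm's safety controller.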

Ed Goffin, Marketing Manager, Pleora Technologies Inc. (Kanata, ON, Canada; www.pleora.com), adds, “Probably like many others, we’re seeing more emerging applications integrating AI and embedded processing for vision applications. For offline or manual inspection tasks, there are desktop systems that integrate a camera, edge processing, display panel, and AI apps to help guide operator decision-making. The next step in the development of these systems is to integrate the processing directly into the cameras, so they are even more compact for a manufacturing setting. For manual inspection tasks, these desktop embedded vision applications help operators quickly make quality decisions on products. Behind-the-scenes, the AI model is consistently and transparently trained based on user actions. These systems really take advantage of ‘the best of’ embedded processing, in terms of compact size, local decision making, and powerful processing—plus cost—to help manufacturers leverage the benefits of AI.”

Charisis cites smart cities as an emerging application for embedded vision. “What we recognize as an emerging trend is the increasing adoption of embedded vision at scale in civilian and commercial applications on smart cities and smart spaces,” he says. Applications here include smart lighting poles that sense the roads and adapt luminance to vehicle traffic and pedestrian usage, smart traffic lights that optimize traffic management and minimize commute or dwell times by adjusting to traffic conditions in real time, and smart bus stops that sense people and improve queues through better planning and routing. There are even smart trash bins that optimize waste management and maintenance scheduling.

Basler’s (Ahrensburg, Germany; www.baslerweb.com) Florian Schmelzer, product marketing manager, Embedded Vision Solutions, explains that high dynamic range (HDR) is opening up applications for its dart camera series, for example intelligent light systems. These systems must operate reliably in highly variable conditions—there could be glistening daylight or there could be twilight situations. “This is just one scenario where Basler’s embedded camera dart with HDR feature is able to deliver the required image quality so that the embedded vision system as a whole functions,” he says.

Lansche cites a recent example from MATRIX VISION where two remote sensor heads with different image sensors—each for day or night use—were part of a license plate recognition system for a British traffic monitoring company. “Also, multicamera applications in agriculture, in the food sector, medical technology, or in production are predestined for embedded vision,” he says.

“The most exciting innovation in embedded vision is currently happening with the combination and optimization of the edge and the cloud,” says Sebastien Dignard, president, iENSO (Richmond Hill, ON, Canada; www.ienso.com). “That means, embedding a camera is no longer about taking a great picture. It’s about how you analyze the picture, what vision data you can extract, and what you do with that data. The System on Chip (SoC), which is the ‘brain’ of the camera, is driving this new reality. SoCs are becoming smaller, more powerful, and more affordable. Optimization from the edge to the cloud, aggregating and analyzing vision data from multiple devices, and making business decisions based on this analysis—these are all elements of embedded vision that are taking center stage. We see this being deployed in applications from home appliances to robotics and precision farming.”

Subramanyam states that embedded vision has been revolutionizing retail, medical, agricultural, and industrial markets. He also says that some of the cutting-edge applications across these markets where embedded vision is creating a wave of change are automated sports broadcasting systems, smart signage and kiosks, autonomous shopping systems, agricultural vehicles and robots, point-of-care diagnostics, life sciences and lab equipment.
 
  • Like
  • Fire
  • Love
Reactions: 13 users

Mugen74

Regular
OT arrghh politics 🤮 they're all bigger bullshitters than TMF 🤢
 
  • Like
  • Love
Reactions: 9 users
Pretty quiet since this announcement...

Very quiet.

However, this is on SoftCryptum's webpage. They are obviously still interested in AI.

Video and Image Content Analytics​

Complete video investigation platform for rapid CCTV analysis, documentation and report-production for internal briefings and court presentation materials.

Artificial intelligence (AI) technology to aid law enforcement and intelligence organizations rapidly search vast amounts of video footage and identify patterns or faces using advanced facial detection, extraction and classification algorithms.

Speech and Language Processing​

Robust technologies to automatically detect spoken languages and recognize the same speaker across a broadcast or a telephone conversation. Automatic transcription and translation in thirty different languages.
 
  • Like
  • Love
  • Fire
Reactions: 6 users
Hi FF,

You've set my hobbyhorse rocking, so now it's my turn to dissent.

I think that, in light of the GFC, we should take the ratings agency evaluations with a blood-pressure endangering load of salt.

We have a record-breaking deficit, and unemployment comparisons with the 1970s are quite misleading because of changes in the measure of "employment", which back then did not count one hour of work a week as "employed".

Here is a graph of Australia's employment in manufacturing since we signed up to WTO.
View attachment 7488

From 1989 to 2017, manufacturing employment declined from 1.16 million to 890,000 - a decrease of almost 25%, and at an accelerating rate - and that was before Abbott oversaw the nailing down of the coffin lid of vehicle manufacture. Were it not for the rise in China's demand for coal and iron we would be a banana republic, much to the delight of a certain Asian leader. (I'll refrain from commenting on the equine dental inspection of our largest customer/poking the sleeping panda in the eye with a burnt stick.)
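For the record, the decline quoted above can be checked with a one-line back-of-the-envelope calculation:

```python
# Checking the figure quoted above: manufacturing employment falling
# from 1.16 million (1989) to 890,000 (2017).
before, after = 1_160_000, 890_000
decline = (before - after) / before
print(f"{decline:.1%}")  # 23.3% - in the ballpark of "almost 25%"
```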

Labor's world champion treasurer was instrumental in bulldozing our playing field flat ahead of all other countries (advanced or not so advanced), while failing to get agreement on the industry where we had a major advantage - agriculture - "a courageous action, minister" - not that we could have expected any more enlightenment from the other team.

Then Honest John strongarmed/blackmailed the states into privatizing public utilities - that worked well for electricity, not to mention water, rail, telecoms... The fatuous idea of splitting up the utilities for the sake of the appearance of "economic efficiency" (generation, poles and wires, distribution) ignores the reality that economic efficiency can only follow engineering efficiency. Had the power companies remained in government control we would not now have the chaos of maintaining base-load supply. All that was achieved was to create separate profit centres which each had to generate a profit, and it may have created some wealthy companies - but where does the profit come from? And, on top of that, the taxpayers have to pay for the lost revenue from the previously government-owned utilities. Free market fundamentalist economists are the first up against the wall come the revolution.

It's funny to see the Libs boasting about getting us through the epidemic by implementing Keynesian economics on a previously unimaginable scale, when they have spent all their lives seeking to enforce "fiscal rectitude".

So the advent of Akida is sorely needed to provide some real economic sustenance for our enfeebled economy.
Hear Hear!
_20220523_184117.JPG

There was a documentary a few years back, called "Australia the Lucky Country"
I never actually saw it, but the gist of it was: yes, we are incredibly lucky here - natural resources, vast arable land, strong manufacturing, and skilled workers, artisans and scientists in nearly every field.

Then politicians did everything in their power to strip away those advantages, by slashing tariffs and opening up free world trade agreements with countries with whom we could never compete, etc. etc.

Now they push the idea of a "healthy economy" with close to full employment, in a World where people will become increasingly unemployable...

What a joke..

Politicians, make me want to be sick!

And Hey, go BRN! 😛
 
  • Like
  • Love
  • Fire
Reactions: 20 users
Sums up perfectly why we have a young virile bull by the tail with Brainchip:

“The most exciting innovation in embedded vision is currently happening with the combination and optimization of the edge and the cloud,” says Sebastien Dignard, president, iENSO (Richmond Hill, ON, Canada; www.ienso.com).

“That means, embedding a camera is no longer about taking a great picture.

It’s about how you analyze the picture, what vision data you can extract, and what you do with that data.

The System on Chip (SoC), which is the ‘brain’ of the camera, is driving this new reality. SoCs are becoming smaller, more powerful, and more affordable.

Optimization from the edge to the cloud, aggregating and analyzing vision data from multiple devices, and making business decisions based on this analysis—these are all elements of embedded vision that are taking center stage.

We see this being deployed in applications from home appliances to robotics and precision farming.”

Subramanyam states that embedded vision has been revolutionizing retail, medical, agricultural, and industrial markets.

He also says that some of the cutting-edge applications across these markets where embedded vision is creating a wave of change are automated sports broadcasting systems, smart signage and kiosks, autonomous shopping systems, agricultural vehicles and robots, point-of-care diagnostics, life sciences and lab equipment.

I have added breaks to make it really hit home. So don’t let go.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 32 users
D

Deleted member 118

Guest
  • Haha
  • Like
  • Wow
Reactions: 17 users

SERA2g

Founding Member
  • Haha
  • Like
  • Wow
Reactions: 40 users

Evermont

Stealth Mode
Playing catch-up, couldn't resist. 🙃

1653299486293.png
 
  • Haha
  • Like
  • Love
Reactions: 23 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Haha
  • Like
  • Love
Reactions: 23 users

Deadpool

hyper-efficient Ai
Is there still a way to tune in to the AGM if you don’t have the official link? I am basically a full time employee/scout 🤣🤣🤣
If you're a SH and you don't have a paper copy of the BRN notice of meeting, log in to InvestorServe and you will find it there. There's a link in fine print to https://web.lumiagm.com/ and there is a number, but I don't know if it is unique to me?
 
  • Like
  • Love
  • Fire
Reactions: 5 users

uiux

Regular
  • Like
  • Haha
  • Love
Reactions: 35 users
If you're a SH and you don't have a paper copy of the BRN notice of meeting, log in to InvestorServe and you will find it there. There's a link in fine print to https://web.lumiagm.com/ and there is a number, but I don't know if it is unique to me?

Thanks @HALMAN, really appreciate it. I’ll give it a whirl as I can’t bear the FOMO 😃 I’m in NZ so not familiar with InvestorServe.

I’ve also flicked Tony Dawe an email but got an out of office - due to him attending AGM in Sydney then the Investor Roadshow in Melbourne. Thanks again 🤙🏼
 
  • Like
Reactions: 5 users
  • Like
  • Fire
Reactions: 18 users

Labsy

Regular
Today was only the first day of our association with ARM. This ARM partnership is going to have legs. I’m expecting another very green day tomorrow.
I'm expecting a few days for it to sink in with serious investors and then wings!
 
  • Like
  • Fire
Reactions: 11 users

Diogenese

Top 20
Here's a f'rinstance - CSL is the 5th largest company on the ASX, and the highest ranked "industrial".

It has a market cap of $89B.


https://www.asxshareprice.com/top-10-asx-listed-companies-in-australia/

5. CSL Limited (CSL)​

CSL is a leader in Australia’s burgeoning biotechnology industry. Valued by market capitalization at $89 billion, the company develops, manufactures and markets human pharmaceutical and diagnostic products made from human plasma. CSL trades presently at a P/E multiple of 36.69, making it an attractive investment for investors searching for capital appreciation.



CSL is pretty much a one-trick pony in terms of the business sector it serves, and, while they do excellent work, they are only a medium-sized fish in the global pool.

There are almost no confines to Akida's potential market sectors, and Akida is 2 or 3 years ahead of the field, a field which is experiencing exponential growth.

So what are the implications for BRN's market cap?
I just had a browse through CSL's financials:

https://www.intelligentinvestor.com.au/shares/asx-csl/csl-limited/financials

What struck me was that a $132B company which is highly profitable has $7B debt (a bit over twice the net profit), and pays a quarter of a billion in interest which could otherwise go to shareholders as dividends.
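Running those numbers through a rough sanity check (the net profit figure below is only inferred from the "a bit over twice the net profit" remark, not a reported result):

```python
# Back-of-the-envelope check on the CSL figures quoted above.
# All numbers come from the post, rounded; not audited financials.
debt = 7e9          # $7B debt
interest = 0.25e9   # a quarter of a billion in interest
net_profit = 3.2e9  # implied: debt should be "a bit over twice" this
print(debt / net_profit)      # about 2.2x net profit
print(interest / net_profit)  # interest eats roughly 8% of net profit
```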

Then I think back to the 80s and the days of the corporate raiders unleashed by the deregulation part of the Davos/WTO mantra (privatization, deregulation, incentivization, globalization), who took over cash-rich companies just to plunder the assets and discard the corporate shell, wreaking havoc in western economies.

So companies need to be highly geared to avoid that trap, and the Davos boys get the interest payments for free - a nice protection racket - leaving the companies performing a tightrope act balancing profit and debt.

So, once the money starts to flow, BrainChip will need to distribute a lot of the profit to keep the sharks at bay - the rainy-day-sock-under-the-bed will only attract them - or they could go on a buying spree.
 
  • Like
  • Thinking
  • Fire
Reactions: 17 users

Diogenese

Top 20
I joked a while ago I was going to subscribe to MF and AFR so I could invest against their advice................can't go wrong I reckon
You weren't Alan Jones' grade 8 teacher by any chance?
 
  • Haha
  • Like
Reactions: 9 users

FJ-215

Regular
I just had a browse through CSL's financials:

https://www.intelligentinvestor.com.au/shares/asx-csl/csl-limited/financials

What struck me was that a $132B company which is highly profitable has $7B debt (a bit over twice the net profit), and pays a quarter of a billion in interest which could otherwise go to shareholders as dividends.

Then I think back to the 80s and the days of the corporate raiders unleashed by the deregulation part of the Davos/WTO mantra (privatization, deregulation, incentivization, globalization), who took over cash-rich companies just to plunder the assets and discard the corporate shell, wreaking havoc in western economies.

So companies need to be highly geared to avoid that trap, and the Davos boys get the interest payments for free - a nice protection racket - leaving the companies performing a tightrope act balancing profit and debt.

So, once the money starts to flow, BrainChip will need to distribute a lot of the profit to keep the sharks at bay - the rainy-day-sock-under-the-bed will only attract them - or they could go on a buying spree.
Just watching Bloomberg TV. As luck would have it, currently live from Davos.

All the talk there at the moment is de-globalization/regionalization
 
  • Like
Reactions: 6 users