BRN Discussion Ongoing

Blessed Good Friday to all here, especially those who celebrate it, a good reminder of the basis of our beliefs. Enjoy your holiday; I'm sure you all deserve the rest.
 
  • Like
  • Love
  • Fire
Reactions: 17 users

Jamescan

Member
What a calculation, and no one can find fault with it.
On the other hand, everyone including me is excited to see that the share price can bounce back very quickly.
One thing I want to add is that this price drop is mainly because of shorters' activity, and if we keep our faith in the technology and keep holding the stock until positive news starts flowing, I assume $3 is not that far off either.
To me the main things to look out for are:
1. The technology is real and everyone is starting to talk about it.
2. We have a growing list of partners, and we are definitely past the stage where everyone asks whether they have heard of BrainChip.
3. Qualcomm is promoting a similar technology, which means even if they are a competitor (if at all; more probably a friend), there is a market and we can sell our product quickly enough.
And the next milestones for me are:
1. Reaching break-even (expenses and income equal).
2. Announcement of a licence agreement by one or two EAP partners.
3. Start of the first royalty income.
4. The company being able to make forward-looking projections.
5. One of the major brokers starting coverage on BRN.
So to me, until those things start happening, we may keep getting the same fluctuations based on assumptions.
On top of that, a NASDAQ listing could be the icing on the cake.
Hi,
over the years the "mainly because of shorter's activity" reason has been used many times. This seems to excuse and or ignore what is the real issue, IMO, which is no news, and a wider fall in sentiment of some tech plays.
Do you have the data to show that traders have been actively shorting BRN in the last 6 months? I'm not meaning to sound belligerent, just interested
 
  • Thinking
  • Haha
Reactions: 4 users

rgupta

Regular
Hi,
over the years the "mainly because of shorter's activity" reason has been used many times. This seems to excuse and or ignore what is the real issue, IMO, which is no news, and a wider fall in sentiment of some tech plays.
Do you have the data to show that traders have been actively shorting BRN in the last 6 months? I'm not meaning to sound belligerent, just interested
Very easy to check here
 
  • Like
  • Fire
Reactions: 15 users

equanimous

Norse clairvoyant shapeshifter goddess
Hi,
over the years the "mainly because of shorter's activity" reason has been used many times. This seems to excuse and or ignore what is the real issue, IMO, which is no news, and a wider fall in sentiment of some tech plays.
Do you have the data to show that traders have been actively shorting BRN in the last 6 months? I'm not meaning to sound belligerent, just interested
Screenshot_20230407_154702_Brave.jpg
 
  • Like
  • Wow
  • Sad
Reactions: 17 users

equanimous

Norse clairvoyant shapeshifter goddess
Around 120 million shares

About 3.38 million shares up for sale on CommSec. If something really goes BRN's way, well, shit, I wouldn't want to be left holding any shorts...
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 16 users

McHale

Regular
Let's just forget for a moment about Intel and IBM and their research projects; let's instead focus our attention on what we all know as fact.

We are, and have been, working with or advising NASA, when approached, with regard to all things AKIDA.

We do have proven technology, performing better in silicon than was initially thought possible, and we are involved with known space projects.

We know that some of these projects are highly confidential, extremely sensitive on a US Federal level, as in, top secret with or
without any sauce.

The link below may well have already been posted, but it just doesn't matter... novel ways of performing tasks in space are all about "Beneficial AI" or "Essential AI", you choose.


P.S. About time Brisbane lifted their game last night; there are still about six players who are well below the basic standard of skill and decision-making at this AFL level.

Regards....Tech (Perth) :coffee::giggle:
Yes Tech, I was very happy to see the mighty Lions do a number on Collingwood last night. Agree about more upside to come, just like BRN.
 
  • Like
  • Sad
  • Love
Reactions: 9 users

rgupta

Regular
Another reason to smile

 
  • Like
  • Fire
  • Love
Reactions: 12 users

Easytiger

Regular
This was a recent Qualcomm presentation about AI liked by our own RT.

I‘m sure it’ll be over my head but @Diogenese or others with technical knowledge might enjoy it.

Transformers are discussed, which is interesting given our Gen 2.

About this Episode​

Today we’re joined by Vinesh Sukumar, a senior director, and head of AI/ML product management at Qualcomm Technologies. In our conversation with Vinesh, we explore how mobile and automotive devices have different requirements for AI models and how their AI stack helps developers create complex models on both platforms. We also discuss the growing interest in text-based input and the shift towards transformers, generative content, and recommendation engines. Additionally, we explore the challenges and opportunities for ML Ops investments on the edge, including the use of synthetic data and evolving models based on user data. Finally, we delve into the latest advancements in large language models, including Prometheus-style models and GPT-4.




:)
 
  • Like
  • Love
Reactions: 10 users

Straw

Guest
Given the share price fall over the period, the short sellers are doing very nicely - hard lessons?
Not really.
I'd say it's merely further confirmation that there are far too many damaging financial derivatives, systematically employed to distort sentiment and increase volatility to the benefit of those with the greatest financial/regulatory influence.
 
Last edited by a moderator:
  • Like
  • Thinking
  • Love
Reactions: 15 users

Boab

I wish I could paint like Vincent
The interesting bit
The passenger display supports gaming and streaming video via the internet. It can be operated on the run, but only for viewing by the front seat passenger. A camera continuously monitors the eye movement of the driver; when he or she attempts to view the passenger display, the content is hidden.
If you were not happy with your passenger, you could easily make them very annoyed 😂
 
  • Like
  • Haha
Reactions: 19 users

Tothemoon24

Top 20
  • Like
  • Fire
  • Love
Reactions: 5 users

zeeb0t

Administrator
Staff member
Hi all

To address some complaints or concerns recently regarding off-topic moderation.

I understand that friendly banter can be enjoyable, but it's important to keep in mind that our forum's rules exist to ensure that discussions remain focused and informative for everyone. I enforce these rules evenly for all members, even for those who are regulars or have good intentions, to ensure a fair and consistent experience for everyone. I do have an off-topic section called The Lounge, where members are free to chat about anything they like. Please feel free to continue your friendly banter there. Here's the link: https://thestockexchange.com.au/forums/the-lounge-where-anything-goes.15/. Thank you for your understanding.
 
  • Like
  • Love
  • Fire
Reactions: 47 users

TheDon

Regular
Hi all

To address some complaints or concerns recently regarding off-topic moderation.

I understand that friendly banter can be enjoyable, but it's important to keep in mind that our forum's rules exist to ensure that discussions remain focused and informative for everyone. I enforce these rules evenly for all members, even for those who are regulars or have good intentions, to ensure a fair and consistent experience for everyone. I do have an off-topic section called The Lounge, where members are free to chat about anything they like. Please feel free to continue your friendly banter there. Here's the link: https://thestockexchange.com.au/forums/the-lounge-where-anything-goes.15/. Thank you for your understanding.
Hi zeeb0t, my post was removed and I got a warning about the post I made. I'm only a simple guy and not the sharpest tool in the shed, so please explain it to me in a simple way that I can understand.
Thank you

TheDon
 

zeeb0t

Administrator
Staff member
Hi zeeb0t, my post was removed and I got a warning about the post I made. I'm only a simple guy and not the sharpest tool in the shed, so please explain it to me in a simple way that I can understand.
Thank you

TheDon
Please read the message that was sent to you. It seems pretty straightforward, but if there is any concern or confusion, please send me a message.
 
  • Like
Reactions: 4 users

Tothemoon24

Top 20

Announcing Support for Seeed Studio SenseCAP A1101 LoRaWAN Vision AI Sensor​

TINYML, MACHINE LEARNING
Edge Impulse
8 April 2023


Edge Impulse is partnering with Seeed Studio to launch official support for the SenseCAP A1101 LoRaWAN Vision AI Sensor, enabling users to use it to acquire, develop, and deploy vision-based ML applications with Edge Impulse Studio or the new Edge Impulse Python SDK.
The SenseCAP A1101 is a smart image sensor, supporting a variety of AI models such as image recognition, people counting, target detection, and meter recognition. Equipped with an IP66 enclosure and an industrial-grade design, it can be deployed in challenging conditions such as places with extreme temperatures or difficult accessibility. It combines tinyML and LoRaWAN® to enable local inferencing and long-range transmission, the two major needs for outdoor use.
What's more, the sensor is battery-powered. This means that data can be collected in remote locations and transmitted over long distances, without requiring access to an AC power source. This makes it ideal for remote monitoring applications, such as agricultural automation or smart city projects, where power sources might not be easily accessible. This product is designed to deploy widely and be used in distributed monitoring systems without the user needing to worry about maintaining power sources across multiple sites. It is open for customization to meet your unique requirements, including camera, enclosure, transmission protocols, and more. You can also use the SenseCAP app for quick configuration with just three steps — scan, configure, done. Easy peasy.

How do I get started?

You can purchase the SenseCAP A1101 here. Then, follow the SenseCAP A1101/Edge Impulse documentation page for instructions on how to quickly develop and deploy your first vision-based ML application!
SenseCAP A1101 with Edge Impulse in action
After connecting the SenseCAP A1101 with Edge Impulse Studio, the board will be listed as follows:
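For anyone curious about the Python SDK route mentioned above, here is a rough sketch of what profiling and packaging a small Keras model through the Edge Impulse Python SDK could look like. This is only a sketch based on the SDK's published examples; the exact function names, arguments, the "cortex-m4f-80mhz" device string and the API-key handling are assumptions to verify against the official documentation, and you need your own project API key.

```python
# Rough sketch only: assumes the Edge Impulse Python SDK exposes
# model.profile() / model.deploy() roughly as shown in its documentation.
import edgeimpulse as ei
import tensorflow as tf

ei.API_KEY = "ei_..."  # placeholder: your Edge Impulse project API key

# A tiny Keras model standing in for your trained vision model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(96, 96, 1)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(3, activation="softmax"),
])

# Estimate RAM/flash/latency for an MCU-class target (device name is illustrative).
profile = ei.model.profile(model=model, device="cortex-m4f-80mhz")
print(profile)

# Package the model for deployment, e.g. as an archive you can drop into
# firmware for a device such as the SenseCAP A1101.
ei.model.deploy(
    model=model,
    model_output_type=ei.model.output_type.Classification(),
    deploy_target="zip",
    output_directory="deploy",
)
```

Treat this as a starting point only; the SenseCAP-specific flashing steps are covered in the documentation page linked above.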
 
  • Like
Reactions: 6 users

Sirod69

bavarian girl ;-)
Imaging and Machine Vision Europe

In our latest article, Luca Verre, Co-founder and CEO of PROPHESEE, highlights how event-based vision is set to revolutionise mobile photography.

"By combining an event-based sensor with a frame-based system, effective deblurring can be achieved, making photography and video capture more precise and lifelike."

 
  • Like
  • Love
  • Fire
Reactions: 21 users

Tothemoon24

Top 20
Great read.
I couldn't copy the original article in its entirety; the link provides the full read 🐰





Montaseri said one factor driving intelligence at the edge is that connectivity is not a hundred percent available, which means delays getting information back from the cloud. Another is that microprocessors and microcontrollers are becoming more powerful. This enables the necessary data management at the edge, he said, and allows devices to quickly analyze and make sense of data.

ITTIA is focused on providing the software necessary for edge data streaming, analysis and management for embedded systems and IoT devices – robust data management is foundational for doing AI and machine learning in embedded systems at the edge, Montaseri said.


ITTIA provides software for edge data streaming, analysis and management for embedded and IoT for uses such as transportation when it's not feasible to depend on a central cloud. (ITTIA)
Reliability is critical for smart edge devices, he added, whether it’s for industrial robotics, medical or transportation applications. “You want to make sure that they don't go down.”

What’s also becoming apparent is that not all edges are created equal – some will be smarter sooner than others depending on the use case and industry, such as robotics and medical devices. Montaseri said today’s embedded systems that gain intelligence through IoT deployments will be doing the jobs needed for the next generation of computing. “The nature of everything is changing,” he said. “We are seeing more security, more safety, and more functionality, like the ingestion rate and the query rate. Our focus is safety, security, and reliability.”


Not all edges are created equal

What makes the smart edge murky is the definition of edge, which means different things to different people, Nandan Nayampally, CMO at BrainChip, said in an interview with Fierce Electronics. He was previously at ARM for more than 15 years when the edge was viewed as primarily sensor driven. “That's how IoT kind of cropped up,” he said. “IoT is a sensor plus connectivity plus processing.” While a Dell or an Intel might think of the smart edge as another giant box that’s now smaller, the right starting point to him is IoT with AI.

AI on the edge is a step forward from a sensor just doing one function, with devices now having more storage, memory, and processing power. Nayampally said this battle between the cloud and the edge has been going on for a while, going back to the days of a terminal connected to a mainframe before the move to a client/server model. "What you realize is however much we think that latency to cloud or connectivity to cloud is guaranteed, and the bandwidth assured, it's never going to be the case," he said. "You need that intelligence and computational power at the edge."


BrainChip's Akida processor can learn at the edge to address security and privacy while limiting network congestion. (BrainChip)
Having the smarts at the edge is beneficial for preventative maintenance in factories and patient monitoring, Nayampally said, both in terms of latency and privacy. “Anytime you send raw data or sensitive data out, you are obviously going to have challenges.” Privacy and security have become especially important to the general public, he added. BrainChip was started with the idea that edge computing was necessary and that any approach to AI at the edge had to be different from the cloud. “The cloud kind of assumes almost infinite resources and infinite compute.”

While compute resources at the edge are rather finite, more AI is possible due to advances in low-power hardware including memory and systems on chip (SoC), which means not all training and machine learning need be shipped back to the cloud. Nayampally said it's a matter of scaling, with neuromorphic computing offering inspiration for how to achieve low-power intelligence at the edge. "Let's try to emulate the efficiency of it and start from there."

Machine learning will increasingly happen at the edge, both because of inherent capability and out of necessity. Nayampally said some applications that require a real-time response can't afford the delay between the edge and the cloud, or the power. "Any time you use radio and connectivity, especially to cloud, that burns a lot of power," he said. "Radios are the most expensive parts of devices." Smaller, more cost-effective devices may not be able to afford connectivity and need to do more compute locally.

Nayampally said the neuromorphic nature of BrainChip’s Akida platform allows it to learn at the edge, which also addresses security and privacy and reduces network congestion – today’s autonomous vehicles can generate a terabyte of data per day, he noted, so it makes sense to be selective about how much data needs to travel the network.

For the smart edge, simple is better and BrainChip’s processor does that from a computational standpoint, as well as from a development and deployment standpoint, Nayampally said. “It's almost like a self-contained processor.” It is neuromorphic and event driven, so it only processes data when needed, and only communicates when needed, he said.
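To make the "only processes data when needed" idea concrete, here is a deliberately simplified, purely illustrative Python sketch of event-driven gating: a cheap activity check decides whether the expensive inference step runs at all. This is not BrainChip's implementation; the threshold, the activity measure and the dummy classifier are all hypothetical.

```python
import numpy as np

def frame_activity(prev: np.ndarray, curr: np.ndarray) -> float:
    """Cheap proxy for 'events': mean absolute change between two frames."""
    return float(np.mean(np.abs(curr.astype(np.int16) - prev.astype(np.int16))))

def run_if_active(prev, curr, classify, threshold=4.0):
    """Only invoke the expensive classifier when the scene actually changed.
    `classify` is any callable taking a frame; `threshold` is illustrative."""
    if frame_activity(prev, curr) < threshold:
        return None          # nothing happened: no compute, nothing transmitted
    return classify(curr)    # an event occurred: spend the inference budget

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prev = rng.integers(0, 255, (96, 96), dtype=np.uint8)
    still = prev.copy()                # unchanged scene
    moved = np.roll(prev, 5, axis=1)   # something moved
    dummy = lambda frame: "object detected"   # stand-in for an on-device model
    print(run_if_active(prev, still, dummy))  # None -> inference skipped
    print(run_if_active(prev, moved, dummy))  # "object detected"
```

The point of the sketch is simply that the per-frame cost collapses to a subtraction and a comparison whenever nothing is happening, which is where the power savings of event-driven designs come from.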

Being event driven is an excellent example of how basic machine learning may express itself in a single device for the user or the environment, which is what Synaptics is calling the “edge of the edge,” said Elad Baram, director of product marketing for low-power edge AI. The company has a portfolio of low power AI offerings operating at a milliwatt scale which is enabling machine learning using minimal processing and minimal learning – an initiative in line with the philosophy of the tinyML Foundation, he said. While an ADAS uses gigabytes of memory, Synaptics is using megabytes of memory.

Baram’s expertise is in computer vision, and Synaptics sees a lot of potential around any kind of sensing and doing the compute right next to where the data is coming from. Moving data requires power and increases latency and creates privacy issues. Organizations like tinyML are an indicator of how smart the edge could get. “We are at an inflection point within this year or next year,” he said. “This inflection point is where it's booming.”


Synaptics has a context aware Smart Home SoC with an AI accelerator for 'edge at the edge'. (Synaptics)
Context-aware edge remains application-specific

Baram said just as the GPU boom occurred in the cloud five years ago, the same evolution is happening with TinyML. Workloads at the edge that previously required an Nvidia processor, such as detection and recognition, can now be done on a microcontroller. “There is a natural progression.”

Sound event detection is already relatively mature, Baram said, starting with Alexa and Siri and progressing to detecting glass breaking or a baby crying. “We are seeing a lot of use cases in smart home and home security around the audio space.” In the vision space, he said, Synaptics is supporting “context awareness” for laptops so they can detect whether a user is present or not, and to ensure privacy, any imaging stays on the on-camera module – it’s never stored on the computer’s main processor.

Power, of course, is important for context-awareness applications, Baram said. "You don't want the power to the battery to drain too fast." But having this awareness actually extends battery life, he said, because now the system understands if the user is engaged with the device and its content and can respond accordingly. "You approach the laptop, it's turned on and you log in and it's like magic. The machine just knows what you want to do, what you are doing, and it can adapt itself."

Similarly, an HVAC system could adapt based on the number of occupants in a room, or a dishwasher could let you know how full it is. Baram said a fridge could be smart enough so you can know whether or not you need to pick up milk on the way home. Aside from smart laptops and home appliances, there are many safety applications in construction, manufacturing and agriculture that could benefit from context awareness. “The amount of use cases out there in the world is pretty amazing.”

Baram said the hardware is coming together to enable the smart edge, including machine learning, while algorithms and networking are also improving significantly. “The neural networks are way more efficient than they were a decade ago.” As compute capabilities advance, devices will be able to have more general purposes, but for now processor and algorithm constraints mean smart edge devices will have targeted applications.

In the long run, making the edge smarter is ultimately contingent on successfully pulling all of these elements together, which requires an AI framework, Baram said, such as TensorFlow, an open-source ML platform. "Those frameworks make it much easier to deploy a neural network into edge devices."
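That last point is easy to picture with TensorFlow's own tooling: a trained Keras model can be converted to a TensorFlow Lite flatbuffer, optionally quantised, that is small enough for microcontroller-class edge devices. A minimal sketch, with the model itself just a stand-in for whatever network you have trained:

```python
import tensorflow as tf

# Stand-in for a trained network; in practice you would load your own model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(4, activation="softmax"),
])

# Convert to a TensorFlow Lite flatbuffer suitable for edge deployment.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # post-training quantisation
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)

print(f"TFLite model size: {len(tflite_model)} bytes")
```

The resulting .tflite file can then be run on-device with the TensorFlow Lite (or LiteRT) interpreter rather than in the cloud, which is exactly the framework-assisted deployment path Baram is describing.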
 
Last edited:
  • Like
  • Fire
Reactions: 26 users