BRN Discussion Ongoing

GDJR69

Regular


Brothers In ARMs 😉 another favorite

:ROFLMAO::ROFLMAO::ROFLMAO: oh that is brilliant, that gets my ARM prize!
 
  • Like
Reactions: 5 users

Kozikan

Regular
Not sure if this PVDM hypothesis has been mooted? Is it feasible that the resistance to his reappointment comes from those who look favourably at a future takeover bid? Has PVDM inserted himself to prevent the possible sale of everything he has worked to create? Although such a sale would have made him rich beyond his wildest dreams, he does not seem like the type of character with whom this would sit well.
Hi tls,
1: yes it has been
2: no idea
3: Absolutely 100%
4: imo, I agree. Way too soon for such thoughts.
 
  • Like
  • Love
Reactions: 6 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
By Jove, I think I've cracked it!

Beyond the edge! This has to be a reference to smart wearable devices, which can monitor human body signals ranging from heart rate, respiration and movement to brain activity. AKIDA should be able to detect, predict, and analyze physical performance, physiological status, biochemical composition, mental alertness, etc.
View attachment 7594




Some examples of smart wearables are:

  1. Smart Rings
  2. Smart Watches
  3. Smart Glasses, Light-Filtering Glasses, Google AR Glasses (conceptual translating eyewear)
  4. Smart Clothing with Sensors
  5. Smart Earphones
  6. Medical Wearables - blood pressure monitors, ECG monitors, etc.
  7. Smart Helmets
  8. Biosensors



Wearable sensors with machine learning to monitor ECG, EMG, etc., for applications including cancer detection, heart disease diagnosis and brain-computer interfaces.
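As an illustration of the event-driven, neuromorphic style of processing these wearables imply, here is a toy leaky integrate-and-fire neuron in plain Python. This is not AKIDA's actual API; the synthetic trace, parameters and function name are all invented for illustration. It converts an ECG-like signal into sparse spike events, the representation spiking hardware operates on:

```python
def lif_spikes(signal, leak=0.9, threshold=1.5):
    """Leaky integrate-and-fire: accumulate input, leak each step,
    emit a spike and reset when the membrane potential crosses threshold."""
    v, spikes = 0.0, []
    for t, x in enumerate(signal):
        v = v * leak + x
        if v >= threshold:
            spikes.append(t)
            v = 0.0
    return spikes

# Synthetic "ECG-like" trace: small baseline noise plus periodic sharp peaks.
trace = [0.1] * 100
for beat in range(10, 100, 20):
    trace[beat] = 2.0

print(lif_spikes(trace))  # → [10, 30, 50, 70, 90], the peak positions
```

Note how the steady baseline never fires (it leaks away before reaching threshold), so only the heartbeat-like peaks produce events; that sparsity is where neuromorphic hardware saves power.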


View attachment 7595




Hey Brain Fam,

Yesterday I posted the above graph which mentions an application for wearables called "brain-computer interface" and it must have lodged itself in my subconscious, because last night I had a series of very weird dreams about "brain-computer interface" and... MERCEDES.

This morning, despite feeling a bit tired and rough around the 'edges' (he-he-he!), I got up and typed "Mercedes + brain-computer interface" into the search bar, and look at what Dr Google spat back at me!

Is it merely a coincidence that the "Hey Mercedes" voice assistant technology is discussed in tandem with the introduction of the first BCI application in the Mercedes VISION AVTR concept vehicle, or are we involved in this too? I know what my dreams were indicating...💕🧠🍟



Screen Shot 2022-05-25 at 10.45.47 am.png



 
  • Like
  • Fire
  • Love
Reactions: 48 users

equanimous

Norse clairvoyant shapeshifter goddess
With ARM in place and Nvidia accelerating, I bet AMD are scrambling to stay GPU- and CPU-competitive. AMD will be trying to maintain its advantage over Intel.
 
  • Like
  • Love
Reactions: 6 users

Quercuskid

Regular
With all the talk of birds, I have to show you the bird which came into my garden today.

8E82F8F9-E0DB-4192-AF2F-617DFCE72728.jpeg
 
  • Like
  • Wow
  • Love
Reactions: 38 users
Yes, bad, bad form... and so many large parcels bought prior to the AGM = voting rights!?
Nothing will stop this from being the world changing tech that it is.
 
  • Like
  • Love
  • Fire
Reactions: 12 users

Boab

I wish I could paint like Vincent
I must have signed up to receive email updates and received the below this morning.
I don't think I'll have time to read it all.
Enjoy, I hope.

edge-ai-vision.com

A NEWSLETTER FROM THE EDGE AI AND VISION ALLIANCE
Late May 2022 | VOL. 12, NO. 10
LETTER FROM THE EDITOR

Dear Colleague,
2022 Embedded Vision Summit

Last week's Embedded Vision Summit was a resounding success, with more than 1,000 attendees learning from 100+ expert speakers and 80+ exhibitors and interacting in person for the first time since 2019. Special congratulations go to the winners of the 2022 Edge AI and Vision Product of the Year Awards and to the winners of this year's Vision Tank Start-up Competition.
2022 Embedded Vision Summit presentation slide decks in PDF format are now available for download from the Alliance website; publication of presentation videos will begin in the coming weeks. See you at the 2023 Summit!
Brian Dipert
Editor-In-Chief, Edge AI and Vision Alliance
ROBUST REAL-WORLD VISION IMPLEMENTATIONS

Optimizing ML Systems for Real-World Deployment
iRobot

In the real world, machine learning models are components of a broader software application or system. In this talk from the 2021 Embedded Vision Summit, Danielle Dean, Technical Director of Machine Learning at iRobot, explores the importance of optimizing the system as a whole, not just optimizing individual ML models. Based on experience building and deploying deep-learning-based systems for one of the largest fleets of autonomous robots in the world (the Roomba!), Dean highlights critical areas requiring attention for system-level optimization, including data collection, data processing, model building, system application and testing. She also shares recommendations for ways to think about and achieve optimization of the whole system.
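To make the system-level point concrete, here is a minimal profiling sketch. The stage names and workloads are hypothetical stand-ins, not iRobot's pipeline; the idea is simply that timing every stage end to end reveals where optimization effort should actually go, which is often not the model itself:

```python
import time

def profile_pipeline(stages, frame):
    """Run each pipeline stage in order, recording wall-clock time per stage,
    so optimization targets the real bottleneck rather than the model alone."""
    timings = {}
    for name, fn in stages:
        start = time.perf_counter()
        frame = fn(frame)
        timings[name] = time.perf_counter() - start
    return frame, timings

# Hypothetical stages; each receives the previous stage's output.
stages = [
    ("decode",      lambda f: f),
    ("preprocess",  lambda f: [x / 255.0 for x in f]),
    ("infer",       lambda f: sum(f)),          # stand-in for the ML model
    ("postprocess", lambda f: round(f, 3)),
]

result, timings = profile_pipeline(stages, list(range(256)))
print(result, max(timings, key=timings.get))  # final output and slowest stage
```

In a real deployment the same wrapper around capture, preprocessing, inference and postprocessing often shows that data movement or image decoding, not inference, dominates latency.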

A Practical Guide to Implementing Machine Learning on Embedded Devices
Chamberlain Group

Deploying machine learning onto edge devices requires many choices and trade-offs. Fortunately, processor designers are adding inference-enhancing instructions and architectures to even the lowest cost MCUs, tools developers are constantly discovering optimizations that extract a little more performance out of existing hardware, and ML researchers are refactoring the math to achieve better accuracy using faster operations and fewer parameters. In this presentation from the 2021 Embedded Vision Summit, Nathan Kopp, Principal Software Architect for Video Systems at the Chamberlain Group, takes a high-level look at what is involved in running a DNN model on existing edge devices, exploring some of the evolving tools and methods that are finally making this dream a reality. He also takes a quick look at a practical example of running a CNN object detector on low-compute hardware.
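One of the "refactoring the math" tricks alluded to above is post-training quantization. Below is a minimal sketch of symmetric per-tensor int8 quantization in plain Python; it is illustrative only (real toolchains quantize per layer or per channel and use calibration data), and the example weights are invented:

```python
def quantize_int8(weights):
    """Symmetric post-training quantization: map float weights onto the
    int8 range [-127, 127] using a single per-tensor scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the quantized integers."""
    return [v * scale for v in q]

w = [0.42, -1.27, 0.003, 0.9]
q, scale = quantize_int8(w)
approx = dequantize(q, scale)
print(q)  # → [42, -127, 0, 90]
print(max(abs(a - b) for a, b in zip(w, approx)))  # worst-case rounding error
```

The payoff on an MCU is that the weights shrink 4x versus float32 and the multiply-accumulates run on fast integer units, at the cost of a bounded rounding error (at most half the scale per weight).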
CAMERA DEVELOPMENT AND OPTIMIZATION

How to Optimize a Camera ISP with Atlas to Automatically Improve Computer Vision Accuracy
Algolux

Computer vision (CV) works on images pre-processed by a camera’s image signal processor (ISP). For the ISP to provide subjectively “good” image quality (IQ), its parameters must be manually tuned by imaging experts over many months for each specific lens / sensor configuration. However, “good” visual IQ isn’t necessarily what’s best for specific CV algorithms. In this session from the 2021 Embedded Vision Summit, Marc Courtemanche, Atlas Product Architect at Algolux, shows how to use the Atlas workflow to automatically optimize an ISP to maximize computer vision accuracy. Easy to access and deploy, the workflow can improve CV results by up to 25 mAP points while reducing time and effort by more than 10x versus today’s subjective manual IQ tuning approaches.
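The core idea, tuning ISP parameters against a downstream vision metric rather than subjective image quality, can be sketched as a black-box search. The snippet below is illustrative only: the parameter names and the `fake_map` metric are invented stand-ins, not Algolux's Atlas workflow, which uses far more sophisticated optimization than random search:

```python
import random

def tune_isp(score_fn, param_ranges, iterations=200, seed=0):
    """Black-box search over ISP parameters: sample candidate settings and
    keep whichever maximizes the downstream vision metric (e.g. mAP)."""
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(iterations):
        params = {k: rng.uniform(lo, hi) for k, (lo, hi) in param_ranges.items()}
        score = score_fn(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Stand-in detection metric: peaks when denoise ≈ 0.3 and sharpen ≈ 0.7.
def fake_map(p):
    return 25 - 40 * (p["denoise"] - 0.3) ** 2 - 40 * (p["sharpen"] - 0.7) ** 2

best, score = tune_isp(fake_map, {"denoise": (0.0, 1.0), "sharpen": (0.0, 1.0)})
print(best, round(score, 2))
```

The key design choice mirrors the talk: the objective is detector accuracy on labeled images, so the tuner may converge on settings that look subjectively "wrong" to a human but feed the CV algorithm better.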

10 Things You Must Know Before Designing Your Own Camera
Panopteo

Computer vision requires vision. This is why companies that use computer vision often decide they need to create a custom camera module (and perhaps other custom sensors) that meets the specific needs of their unique application. This 2021 Embedded Vision Summit presentation from Alex Fink, consultant at Panopteo, will help you understand how cameras are different from other types of electronic products; what mistakes companies often make when attempting to design their own cameras; and what you can do to end up with cameras that are built on spec, on schedule and on budget.
FEATURED NEWS

Intel's oneAPI 2022.2 is Now Available
FRAMOS Makes Next-Generation GMSL3 Accessible for Any Embedded Vision Application
AMD's Robotics Starter Kit Kick-starts the Intelligent Factory of the Future
iENSO Makes CV22 and CV28 Ambarella-based Embedded Vision Camera Modules Commercially Available
Imagination Technologies and Visidon Partner for Deep-learning-based Super Resolution Technology
More News

About This E-Mail
LETTERS AND COMMENTS TO THE EDITOR: Letters and comments can be directed to the editor, Brian Dipert, at insights@edge-ai-vision.com.

 
  • Like
  • Fire
Reactions: 5 users

buena suerte :-)

BOB Bank of Brainchip
  • Like
  • Fire
Reactions: 8 users

Shadow59

Regular
Can a speeding ticket be requested for a short attack like this?
 
  • Like
  • Fire
Reactions: 5 users
So annoying, dropping faster than we went up.
 
  • Sad
  • Like
Reactions: 4 users
  • Haha
  • Like
  • Sad
Reactions: 5 users
Shorters would do that to create panic and a sell-off; so simple, and unfortunate for us genuine BRN shareholders. Pretty smart move if you ask me. They need to close their positions and move on to the next company, but for how long can they play these games? We need to wait for announcements and future revenues, which will release the share price in an almighty push. I'm waiting patiently. 🙂
All they do is create opportunities for us long-term believers, and for the new holders, to pick up some bargains. Once she pops there will be no holding us back. Of course the manipulation will still be there, but by that time we'll be laughing all the way to the bank, so to speak.
 
  • Like
  • Fire
Reactions: 15 users
People have no respect for this share; millions in orders just being wiped.
 
  • Like
Reactions: 2 users

Yak52

Regular
MANIPULATION

IRON GRIP by the same organization as yesterday's manipulated dump, using their buying/selling power via the bots to control the SP today.
They had knowledge of the ASX release (performance rights/shares to CEO & Board) just before the market did, made that hard dump from $1.17 down to $1.13, and are now forcing the SP down further.

THEY... have vacuumed up 12 million shares so far today.


Yak52
 
  • Like
  • Wow
  • Fire
Reactions: 22 users

JK200SX

Regular
A few ANNs have just come out...
 
  • Fire
Reactions: 1 users
Hey Brain Fam,

Yesterday I posted the above graph which mentions an application for wearables called "brain-computer interface" and it must have lodged itself in my subconscious, because last night I had a series of very weird dreams about "brain-computer interface" and... MERCEDES.

This morning, despite feeling a bit tired and rough around the 'edges' (he-he-he!), I got up and typed "Mercedes + brain-computer interface" into the search bar, and look at what Dr Google spat back at me!

Is it merely a coincidence that the "Hey Mercedes" voice assistant technology is discussed in tandem with the introduction of the first BCI application in the Mercedes VISION AVTR concept vehicle, or are we involved in this too? I know what my dreams were indicating...💕🧠🍟



View attachment 7649


Can we make you a glass of hot milk and put you back to bed? Clearly you do your best work while asleep.

Some wonderful dots exposed here Bravo, Bravo, Bravo.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Haha
  • Fire
Reactions: 22 users

Makeme 2020

Regular
Another post from Professor Hossam Haick, who has been posting quite frequently on Twitter lately. Could we be close to getting an announcement regarding NaNose?
 
  • Like
  • Fire
Reactions: 22 users
So are we still looking at 10x Microsoft, given the share price will do what the share price does?
 
  • Like
  • Fire
Reactions: 2 users

JK200SX

Regular
I'm just curious, what did Pia do to acquire 823K shares?
 
  • Like
  • Thinking
  • Fire
Reactions: 13 users
Another post from Professor Hossam Haick, who has been posting quite frequently on Twitter lately. Could we be close to getting an announcement regarding NaNose?

Been very active on LinkedIn also
 
  • Like
  • Fire
Reactions: 13 users