BRN Discussion Ongoing

Boab

I wish I could paint like Vincent
  • Haha
  • Like
Reactions: 3 users

HopalongPetrovski

I'm Spartacus!
The traders and shorters will do their thing, trying to panic investors into dropping our cookies, citing interest rate changes, inflation fears, gas shortages, lithium downgrades, index restructures, wars, monkeypox pandemics and invaders from Mars!
We are all well aware of the games, and that once they have run the cycle one way they'll throw it into reverse and try to play the run up as well. This is all secondary to the real story being run by Peter and our magnificent management team. We are on track: entrenching ourselves in myriad technological ecosystems, well funded, patent protected, with a future runway of incremental products in various states of development, following a proven and commercially viable sales model, and run by people who have played this game before. If they manage to push it down to my further trigger points I will happily buy more. Bring it. :cool:
 
  • Like
  • Love
  • Fire
Reactions: 26 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
SAY WHAT? 😫



Prophesee releases all its neuromorphic AI software for free

Business news | June 1, 2022

By Nick Flaherty


French neuromorphic sensor developer Prophesee is to make its latest software entirely free.

The newest release of the event-driven AI Metavision Intelligence suite will be offered complete with all modules at no cost for development and commercial use based around the Prophesee image sensor.
The suite of software tools and code samples supports engineers developing event-driven computer vision applications on a PC for a wide range of markets, including industrial automation, IoT, surveillance, mobile, medical, automotive and more. Using an event-driven approach to computer vision requires significantly less power and compute, but a completely different design approach and software libraries.
The aim of making the code free to use is to increase the number of users tenfold in the next two years, says Luca Verre, co-founder and CEO of Prophesee. This comes as Intel launches its mainstream neuromorphic chip, Loihi 2, on a 4nm process technology, which will be in production in that two-year timeframe.

Related Prophesee articles​

“We have seen a significant increase in interest and use of Event-Based Vision and we now have an active and fast-growing community of more than 4,500 inventors using Metavision Intelligence since its launch,” he said. “As we are opening the event-based vision market across many segments, we decided to boost the adoption of MIS throughout the ecosystem targeting 40,000 users in the next two years. By offering these development aids, we can accelerate the evolution of event-based vision to a broader range of applications and use cases and allow for each player in the chain to add its own value.”
The free modules in Metavision Intelligence 3.0 are available through C++ and Python APIs and include a comprehensive machine learning toolkit. The suite also offers a no-code option through the Studio tool, which lets users play back pre-recorded datasets, provided for free, without owning an event camera; with a camera, users can stream or record events in seconds.
In total, the suite consists of 95 algorithms, 67 code samples and 11 ready-to-use applications. Plug-and-play algorithms include high-speed counting, vibration monitoring, spatter monitoring, object tracking, optical flow, ultra-slow-motion, machine learning and others. It provides both C++ and Python APIs, as well as extensive documentation and a wide range of samples organized by implementation level to incrementally introduce the concepts of event-based machine vision.
The latest release includes enhancements to help speed up time to production, allowing developers to stream their first events in minutes, or even build their own event camera from scratch using the provided camera plugins under open-source license as a base.
Developers also have the tools to port their developments to Windows or Ubuntu operating systems. Metavision Intelligence 3.0 also gives access to the full potential of advanced sensor features (e.g. anti-flickering, bias adjustment) by providing source-code access to key sensor plugins.
The Metavision Studio tool has also enhanced the user experience with improvements to the onboarding guidance, UI, ROI and bias setup process.
The core ML modules include an open-source event-to-video converter, as well as a video-to-event simulator. The event-to-video converter uses a pretrained neural network to build grayscale images from events. This allows users to make the best use of their existing development resources to process event-based data and build algorithms on top of it.
The video-to-event pipeline breaks down the barrier of data scarcity in the event-based domain by enabling the conversion of conventional frame-based datasets to event-based datasets.
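The video-to-event idea can be sketched in a few lines of NumPy. This is a toy illustration only, not Prophesee's simulator: the contrast threshold, the log-intensity model and the (t, x, y, polarity) tuple layout are all assumptions made for the example. A pixel emits an event whenever its log-intensity has changed by more than the threshold since the last event it fired.

```python
import numpy as np

def frames_to_events(frames, timestamps, threshold=0.15):
    """Toy video-to-event converter: emit (t, x, y, polarity) events for
    pixels whose log-intensity changed by more than `threshold` since the
    last event they fired (mimicking the per-pixel memory of an event sensor)."""
    events = []
    log_ref = np.log1p(frames[0].astype(np.float64))  # per-pixel reference level
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_cur = np.log1p(frame.astype(np.float64))
        diff = log_cur - log_ref
        ys, xs = np.nonzero(np.abs(diff) > threshold)
        for y, x in zip(ys, xs):
            events.append((t, int(x), int(y), 1 if diff[y, x] > 0 else -1))
        log_ref[ys, xs] = log_cur[ys, xs]  # only firing pixels update their reference
    return events

# Two 4x4 frames in which a single pixel brightens: only that pixel fires,
# whereas a frame-based pipeline would re-transmit all 16 pixels.
f0 = np.zeros((4, 4), dtype=np.uint8)
f1 = f0.copy()
f1[2, 1] = 200
print(frames_to_events([f0, f1], timestamps=[0, 10_000]))  # [(10000, 1, 2, 1)]
```

Static scenes produce no events at all, which is the data-reduction argument the article makes for the event-based approach.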
Developers can easily download the Metavision Intelligence Suite and begin building products leveraging Prophesee sensing technologies for free.

 
  • Like
  • Wow
  • Thinking
Reactions: 11 users

VictorG

Member
Sometimes we must drive through hell to get to heaven. Some will fall off the BRN bus, but many will find strength and courage to complete the journey with a resolute determination that will shape the person they become. Stay firm and do not sell but above all buy more BRN (if you can). My opinion only, DYOR.

 
  • Like
  • Fire
  • Love
Reactions: 25 users
10:01:39 am, 1.0200
Looks like you got it cheaper mcm, I am fuming 😤😂😂😂





In other news, a Hotcrapper poster is on the naughty list again. Someone sent this to me and I am just posting it for all to see.
If you think market manipulation doesn't happen, think again.

There is also a link in that article where you can submit suspicious activities. So, if you see illegal activity, or P&D posts by traders who use them to increase volatility to their advantage, please report them.
Hopefully this acts as a warning to all those traders who spread misinformation against companies to help their cause.
In the good old days, small fish or not, where an offence was commonly committed but seldom prosecuted, courts would send a message by handing down a solid sentence as a warning to those who thought they were safe.

The idea being to up the risk ante in the risk-versus-reward equation.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
Reactions: 14 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
SAY WHAT? 😫



Prophesee releases all its neuromorphic AI software for free
Business news | June 1, 2022, by Nick Flaherty (full article quoted above)

Bit late. BrainChip released MetaTF years ago, the moment it was ready, free to all the world.

A bit of a beat-up, or a spin on the negative that they have not been able to sell their software, so they are going to try giving it away.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Haha
Reactions: 21 users
Can someone please send the New Jersey Institute of Technology an AKIDA or two?

AND THE ANSWER IS:

"And the answer is event-based cameras and spiking neural networks," he added. "When we don't have to sample repeated parts of the scene that aren't moving, we save a lot in terms of power and data storage. It has this cascading effect of reducing so much load on the network. It's really attractive to computer vision."
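The quote's pairing of event cameras with spiking neural networks can be made concrete with a textbook leaky integrate-and-fire neuron. This is a generic model, not the NJIT work or any Akida implementation, and the weight, time constant and threshold are arbitrary example values: the point is that state is only touched when an event arrives, which is where the power saving comes from.

```python
import math

def lif_on_events(event_times, weight=0.6, tau=20.0, threshold=1.0):
    """Event-driven leaky integrate-and-fire neuron: the membrane potential
    decays with time constant `tau` but is only updated when an input event
    arrives, so silent intervals cost no computation at all."""
    v, t_last, out_spikes = 0.0, 0.0, []
    for t in event_times:
        v *= math.exp(-(t - t_last) / tau)  # decay accrued since the last event
        v += weight                          # integrate the incoming spike
        t_last = t
        if v >= threshold:                   # fire and reset
            out_spikes.append(t)
            v = 0.0
    return out_spikes

# Two closely spaced inputs drive the neuron over threshold; a lone input
# 500 time units later has decayed away and produces no output spike.
print(lif_on_events([1.0, 2.0, 500.0]))  # [2.0]
```

Note that the long quiet gap between t=2 and t=500 costs a single multiply, regardless of its length; a frame-clocked pipeline would have ticked the whole way through it.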

Their opinion and mine only so DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 14 users

sleepymonk

Regular
  • Haha
  • Like
Reactions: 20 users

Esq.111

Fascinatingly Intuitive.
Afternoon Chippres,

Tomorrow, Just a little reminder ...

$, LDMicro International, Sean Hehir presenting at Westlake Village, California, USA.

&

$$, Six Five Summit. The growth goes on.
Sailesh Chittipeddi, EVP of Renesas Electronics, giving a rundown of happenings.

Regards,
Esq.
 
  • Like
  • Love
  • Fire
Reactions: 46 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Haha
  • Fire
Reactions: 18 users

ndefries

Regular
It would be interesting to know how many shares the 1000 Eyes community caretakes altogether. Each day it gets bigger. It's good to know they are in good hands. I'm hoping it's at least 300M.
 
  • Like
  • Love
Reactions: 19 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Hi @Fact Finder, I think this would be the perfect way for BrainChip to get into bed with Seeing Machines (if they aren't already)!!!! Seeing Machines has recently been awarded a Linkage Project grant with The Australian National University to develop AI systems that monitor human behaviour while driving!!!!

The project will run for four years and will support three ANU PhD students and one Research Fellow, who will use advanced artificial intelligence methods to infer and predict dangerous driver and passenger behaviour.

🤩🥳



IT Brief Australia
ANU and Seeing Machines to use AI to improve driver safety


By Jessie Chiang
Fri 13 May 2022




The Australian National University (ANU) and Seeing Machines have been awarded a Linkage Project grant to develop AI systems that monitor human behaviour while driving.

The project, ‘Towards in-vehicle situation awareness using visual and audio sensors’, will run for four years and bring together Professor Stephen Gould and Dr Liang Zheng from ANU with Seeing Machines’ Dr Akshay Asthana, Professor Mike Lenn and company co-founder Dr Sebastien Rougeaux.
Seeing Machines says the grant will support three ANU PhD students and one Research Fellow to use advanced artificial intelligence methods to infer and predict dangerous driver and passenger behaviour.
The company says that in 2015, road accidents in Australia cost an estimated $30 billion, resulting both in the tragic loss of thousands of lives and in severe injury and trauma to survivors.
Seeing Machines says it is widely recognised that many of these accidents were, and continue to be, due to human error, including driver fatigue and distraction.
The company says understanding how and when humans become tired or distracted and using technology to mitigate the resulting risks will significantly benefit society, especially as automotive technology continues to develop.
Founded at ANU in 2000, Seeing Machines has made advances in driver and occupant monitoring system technology, providing solutions to the automotive, commercial transport and aviation industries, including for customers such as General Motors, Daimler, Toll and Air Ambulance Victoria.
Seeing Machines’ Professor Mike Lenn says R&D is fundamental to the ongoing innovation of its driver and operator monitoring systems.
“We’re delighted to have been successful in accessing the funding provided by the Australian Research Council and to be working with our esteemed colleagues from the ANU to achieve this,” he says.
“Being at the forefront of driver and occupant technology and having unrivalled understanding of human behaviour and access to the data behind that, is key to our ongoing success. Programs such as this grant help Seeing Machines maintain our leadership position and most importantly, ensure our customers can deliver leading features in their vehicles and get people home safely.”
Seeing Machines’ technology includes AI algorithms, embedded processing and optics, and power products that deliver reliable real-time understanding of vehicle operators. The technology spans from the critical measurement of where a driver is looking through to their cognitive state as it applies to accident risk. Reliable “driver state” measurement is the end goal of Driver Monitoring Systems (DMS) technology.
The company has offices in Australia, the USA, Europe and Asia and supplies technology solutions and services to industry leaders in each market.

 
  • Like
  • Fire
  • Thinking
Reactions: 16 users

TheFunkMachine

seeds have the potential to become trees.
  • Like
  • Fire
  • Love
Reactions: 24 users
10:01:39 am, 1.0200
Looks like you got it cheaper mcm, I am fuming 😤😂😂😂

In other news, a Hotcrapper poster is on the naughty list again. If you think market manipulation doesn't happen, think again. (Full post quoted above.)
I remember that guy. Geez, that took a while for that case to conclude.
 
  • Like
  • Fire
Reactions: 5 users

Labsy

Regular
It would be interesting to know how many shares the 1000 eyes community caretake all together. Each day it gets bigger. Its good to know they are in good hands. I'm hoping its at least 300M.
Could be about 200 mill I reckon... if half of us have over 100k, that's 50 mill... if 100 of us have over 500k, that's another 50 mill... throw in a couple of whales over 1 mill, and 350-odd with between 25k and 50k... very close. ;)
 
  • Like
  • Fire
Reactions: 17 users

Labsy

Regular
Could be about 200 mill I reckon... if half of us have over 100k, that's 50 mill... if 100 of us have over 500k, that's another 50 mill... throw in a couple of whales over 1 mill, and 350-odd with between 25k and 50k... very close. ;)
Assuming we all only have 1 eye :/
 
  • Haha
  • Like
  • Love
Reactions: 10 users
Would suggest we are now confirmed as moving to Phase II with Intellisense Systems on the NECR proposal :D

P2 NECR 0.jpg
P2 NECR 1.jpg
P2 NECR 2.jpg
 
  • Like
  • Fire
  • Love
Reactions: 95 users
But what about the Woolies commercials - ‘clean, green e-lec-tricity in stores by 2025’ 😎
Hi @jtardif999

The following link explains that Woolworths is not approaching this problem from both ends.

Their plan is to source all their power from renewables but not one word about reducing their consumption.

At the moment as a retail energy consumer you can elect to source your electricity from renewables but at the end of the day this does not make your fridge or air conditioning more efficient.

Base load power consumption remains the same.

The cynical side of me wonders if Woolworths are just trying to cash in on another source of capital:


AS POINTED OUT IN BRN’s WHITE PAPER THEY REDUCE YOUR CARBON FOOTPRINT AT SOURCE BY DRAMATICALLY REDUCING THE POWER YOU USE REGARDLESS OF WHETHER IT COMES FROM COAL, DIESEL, WIND, SOLAR OR GREEN HYDROGEN.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 23 users

rayzor

Regular
Today's Australian Newspaper

Hotcrapper pump and dump trader faces jail time

The securities watchdog has secured a guilty finding against a Victorian man who used 13 aliases to manipulate the share price of 20 listed companies.


Would be an interesting read, not that it ever happens


DYOR
 
  • Like
  • Fire
  • Love
Reactions: 31 users