BRN Discussion Ongoing

MDhere

Regular
  • Haha
  • Like
Reactions: 16 users

RobjHunt

Regular
Nice start this morning.

C'mon Dover....move ya bloody arse!
 
  • Like
Reactions: 4 users

mrgds

Regular
Today is 6 July, 2022. The 4C is likely 2 weeks away and no one who has done their research will be expecting big things.

BrainChip, in my opinion, definitely does not leak; however, can we be as sure of all the companies they are now definitely engaged with, some of which are venture capital backed?

We of course would all trust venture capitalists never to abuse their positions and trade on insider information.😂😞

Is there a known unknown influencing what is happening at the moment?

My speculation only DYOR
(and take care)
FF

AKIDA BALLISTA
CW ARK, ............... is my hope of the "known unknowns" :cool:🥰
 
  • Love
  • Like
  • Fire
Reactions: 9 users

Tysons

Member
Well, it's nice being back over here on this site - very positive and informative, to say the least. I might have to stay for a while lol
 
  • Like
  • Love
  • Fire
Reactions: 13 users
Great find @chapman89 - ever since seeing cars being counted at extremely high speed from a bridge over a Californian highway back in early 2016, I always thought this kind of application - simply being able to count things with high accuracy at extreme speed - would be one of the first types of products the technology would find itself in. Very lucrative, since so many things need counting on production lines. 😎
Imago and Prophesee released this camera in 2019 as far as I can tell. Unless it is an upgrade, I don't think it is us. YET!

SC
 
  • Like
Reactions: 4 users

RobjHunt

Regular
Well, it's nice being back over here on this site - very positive and informative, to say the least. I might have to stay for a while lol
The question remains, why would you leave in the first place? 😉 Welcome back.
 
  • Like
Reactions: 6 users

Lex555

Regular
LinkedIn post from NVISO - ARK providing information on the AI market.
FF and 1% calculator to come out again.


View attachment 10743
I'll bite: 1% of the $8 trillion AI hardware market = $0.08 trillion in revenue, multiplied by a price-to-sales ratio of 15.
That results in a market cap of $1.2T USD, which is heading in the direction of 10x Microsoft
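
For anyone who wants to sanity-check that back-of-envelope maths, here's a minimal sketch (purely illustrative, assuming the $8 trillion market size, 1% share and price-to-sales ratio of 15 quoted above - not company guidance):

```python
# Back-of-envelope "1% calculator" check (illustrative only; the figures are
# the assumptions quoted above, not company guidance).
ai_hardware_market_usd = 8e12   # assumed total AI hardware market: $8 trillion
market_share = 0.01             # the oft-quoted 1% capture
price_to_sales = 15             # assumed price-to-sales multiple

revenue = ai_hardware_market_usd * market_share        # $80 billion
implied_market_cap = revenue * price_to_sales          # $1.2 trillion

print(f"Revenue: ${revenue / 1e9:.0f}B, implied market cap: ${implied_market_cap / 1e12:.1f}T")
```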

 
  • Like
  • Haha
  • Love
Reactions: 39 users

Filobeddo

Guest
Old school maths, off the top of my head: 8% annual growth equals approx a $2.90 s/p.

"TELL 'EM THEY'RE DREAMIN'"
I expect the decimal point to move a minimum of one place to the right, ........... possibly two,
:cool:

$29 would do me
 
  • Like
  • Love
Reactions: 16 users

alwaysgreen

Top 20
$29 would do me
$10 is my sell price but if it starts speeding past it, I might need to make it $20.
 
  • Like
  • Haha
  • Love
Reactions: 14 users

gex

Regular
  • Like
  • Love
  • Fire
Reactions: 8 users

miaeffect

Oat latte lover
  • Haha
  • Like
  • Love
Reactions: 40 users

Filobeddo

Guest
  • Haha
  • Like
Reactions: 7 users
Someone mentioned that Prophesee had not yet amended its website to include BrainChip on its partners page. Neither has BrainChip yet. However, Prophesee does have the following on its website, as does BrainChip, which is a bit more substantial than just the respective company names and nothing more.
My opinion only DYOR
FF

AKIDA BALLISTA

BrainChip Partners with Prophesee Optimizing Computer Vision AI Performance and Efficiency


Laguna Hills, Calif. – June 14, 2022 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of neuromorphic AI IP, and Prophesee, the inventor of the world’s most advanced neuromorphic vision systems, today announced a technology partnership that delivers next-generation platforms for OEMs looking to integrate event-based vision systems with high levels of AI performance coupled with ultra-low power technologies.
Inspired by human vision, Prophesee’s technology uses a patented sensor design and AI algorithms that mimic the eye and brain to reveal what was invisible until now using standard frame-based technology. Prophesee’s computer vision systems open new potential in areas such as autonomous vehicles, industrial automation, IoT, security and surveillance, and AR/VR.
BrainChip’s first-to-market neuromorphic processor, Akida, mimics the human brain to analyze only essential sensor inputs at the point of acquisition, processing data with unparalleled efficiency, precision, and economy of energy. Keeping AI/ML local to the chip, independent of the cloud, also dramatically reduces latency.

“We’ve successfully ported the data from Prophesee’s neuromorphic-based camera sensor to process inference on Akida with impressive performance,” said Anil Mankar, Co-Founder and CDO of BrainChip. “This combination of intelligent vision sensors with Akida’s ability to process data with unparalleled efficiency, precision and economy of energy at the point of acquisition truly advances state-of-the-art AI enablement and offers manufacturers a ready-to-implement solution.”

“By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings,” said Luca Verre, CEO and co-founder of Prophesee.
 
  • Like
  • Fire
  • Love
Reactions: 47 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Fire
  • Love
Reactions: 34 users

Bloodsy

Regular
Pure, unashamed, baseless conspiracy/speculation here.

Nintendo is currently trending on Twitter after they made their latest console videos 'private' on YouTube. WTF does that mean, I hear you ask?

Last time they did that, they announced a new console.

We are going into the second half of the year, which is fabled to bring increased revenues, and MegaChips is a chip supplier for Nintendo.

Akida to be on board a possible Switch 2 / Switch Pro / all-new console???

Might explain a bit of strength in the SP??

Go take a look at Twitter trending for more speculation from fans!!
 
  • Like
  • Love
  • Fire
Reactions: 76 users

Boab

I wish I could paint like Vincent
Latest newsletter from Edge AI


edge-ai-vision.com

A NEWSLETTER FROM THE EDGE AI AND VISION ALLIANCE
Early July 2022 | VOL. 12, NO. 13
EDGE AI DEVELOPMENT AND DEPLOYMENT

Deploying PyTorch Models for Real-time Inference On the Edge - Nomitri
In this 2021 Embedded Vision Summit presentation, Moritz August, CDO at Nomitri GmbH, provides an overview of workflows for deploying compressed deep learning models, starting with PyTorch and creating native C++ application code running in real-time on embedded hardware platforms. He illustrates these workflows on smartphones with real-world examples targeting Arm-based CPUs, GPUs, and NPUs as well as embedded chips and modules like the NXP i.MX8+ and NVIDIA Jetson Nano. August examines TorchScript, architecture-side optimizations, quantization and common pitfalls. Additionally, he shows how the PyTorch deployment workflow can be extended to conversion to ONNX and quantization of ONNX models using ONNX Runtime. On the application side, he demonstrates how deployed models can be integrated efficiently into a C++ library that runs natively on mobile and embedded devices and highlights known limitations.
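
As a rough illustration of the workflow the talk describes (not Nomitri's actual code; the toy model and file names below are placeholders), the PyTorch-to-TorchScript/ONNX export and post-training quantization steps look roughly like this:

```python
# Minimal, illustrative sketch of the deployment workflow described above:
# PyTorch model -> TorchScript (for native C++/libtorch) -> ONNX -> quantized ONNX.
# The toy model and file names are placeholders, not taken from the talk.
import torch
import torch.nn as nn
from onnxruntime.quantization import quantize_dynamic, QuantType

class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.fc = nn.Linear(8 * 32 * 32, 10)

    def forward(self, x):
        x = torch.relu(self.conv(x))
        return self.fc(x.flatten(1))

model = TinyNet().eval()
example = torch.randn(1, 3, 32, 32)

# 1) TorchScript: the traced module can be loaded from C++ via torch::jit::load
traced = torch.jit.trace(model, example)
traced.save("tinynet_scripted.pt")

# 2) ONNX export: lets runtimes such as ONNX Runtime execute the model on embedded targets
torch.onnx.export(model, example, "tinynet.onnx",
                  input_names=["input"], output_names=["logits"],
                  opset_version=13)

# 3) Post-training dynamic quantization of the ONNX model (weights stored as int8)
quantize_dynamic("tinynet.onnx", "tinynet_int8.onnx", weight_type=QuantType.QInt8)
```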

Streamlining the Development of Edge AI Applications - NVIDIA
Edge AI provides benefits for cost, latency, privacy, and connectivity. Developing and deploying optimized, accurate and effective AI on edge-based systems is a time-consuming, challenging and complex process. In this talk from the 2021 Embedded Vision Summit, Barrie Mullins, former Director of Technical Product Marketing at NVIDIA, explains how the company makes it easier for developers to build, deploy, maintain and manage embedded edge products. NVIDIA Jetson brings accelerated AI performance to the edge in a power-efficient and compact module form factor. Together with NVIDIA pretrained models, Transfer Learning Toolkit, DeepStream and JetPack SDK, these Jetson modules open the door for you to develop and deploy innovative products across all industries.

SECURITY AND SURVEILLANCE APPLICATIONS

Developing and Deploying a Privacy-preserving Vision-based Sensor System for Commercial Real Estate - XY Sense
In the United States alone, commercial buildings contain roughly 70 billion square feet of space. Constructing, operating and maintaining this space consumes a tremendous amount of resources. Yet, facility operators typically have little insight into how their spaces are utilized. XY Sense set out to develop a solution that allows facilities operators to obtain accurate, real-time information on how spaces are utilized, enabling informed decisions about how to adjust and allocate their spaces. XY Sense’s solution uses wide-angle cameras to collect real-time occupancy information over large areas while preserving privacy and security. In this 2021 Embedded Vision Summit interview conducted by Jeff Bier, Founder of the Edge AI and Vision Alliance, Luke Murray, the co-founder and CTO of XY Sense, introduces the key requirements of the application, and explores some of the challenges that XY Sense overcame in developing and deploying its solutions—including monitoring people’s movements without compromising privacy—and the approaches the company employed to overcome these challenges.

Challenges in Vision-based Adaptive Traffic Control Systems - Sahaj Software Solutions
Adaptive traffic control systems (ATCSs) adjust traffic signal timing based on demand. Venkatesh Wadawadagi, Solution Consultant at Sahaj Software Solutions, begins this 2021 Embedded Vision Summit talk by presenting the main building blocks of a vision-based ATCS, including pre-processing, vehicle detection, vehicle classification and vehicle tracking. Next, he examines several of the key technical challenges in developing a computer vision-based ATCS and explores approaches for overcoming these challenges. These challenges stem from the need for an ATCS to perform accurate person and vehicle detection despite a huge variety of vehicle types, occlusion of objects of interest and difficult lighting conditions.
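
To make those building blocks a little more concrete, here is a bare-bones skeleton of such a pipeline (purely illustrative; every class and function name below is a placeholder of mine, not from the talk):

```python
# Illustrative skeleton of a vision-based adaptive traffic control pipeline:
# pre-processing -> vehicle detection/classification -> tracking -> signal timing.
# All names are placeholders, not taken from the talk.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Detection:
    box: tuple                     # (x, y, w, h) in pixels
    vehicle_class: str             # e.g. "car", "bus", "truck"
    track_id: Optional[int] = None

def preprocess(frame):
    # e.g. resize, normalise, compensate for difficult lighting
    return frame

def detect_and_classify(frame) -> List[Detection]:
    # placeholder for a detector/classifier (e.g. an SSD- or YOLO-style model)
    return [Detection(box=(0, 0, 10, 10), vehicle_class="car")]

def track(detections: List[Detection], previous_tracks: List[Detection]) -> List[Detection]:
    # placeholder for an IoU- or Kalman-based tracker assigning persistent IDs
    for i, det in enumerate(detections):
        det.track_id = i
    return detections

def green_phase_seconds(tracks: List[Detection]) -> int:
    # naive demand estimate: more tracked vehicles -> longer green phase
    return min(30 + 2 * len(tracks), 90)

def process_frame(frame, previous_tracks: List[Detection]):
    detections = detect_and_classify(preprocess(frame))
    tracks = track(detections, previous_tracks)
    return tracks, green_phase_seconds(tracks)
```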

UPCOMING INDUSTRY EVENTS

Accelerating TensorFlow Models on Intel Compute Devices Using Only 2 Lines of Code - Intel Webinar: August 25, 2022, 9:00 am PT

More Events

FEATURED NEWS

e-con Systems Launches a Multi-camera Solution for the NVIDIA Jetson AGX Orin

New Arm Total Compute Solutions Redefine Visual Experiences

The VVDN-QCS610/410 Development Kit Meets Next-generation Visual AI Intelligence Application Requirements

IDS Imaging Development Systems' uEye XC Closes the Market Gap Between Industrial Cameras and Webcams

Qualcomm's New Unified AI Stack Portfolio Revolutionizes Developer Access and Extends AI Leadership Across the Connected Intelligent Edge

More News

EDGE AI AND VISION PRODUCT OF THE YEAR WINNER SHOWCASE

Luxonis OAK-D-Lite (Best Camera or Sensor) - Luxonis
Luxonis’ OAK-D-Lite is the 2022 Edge AI and Vision Product of the Year Award winner in the Cameras and Sensors category. OAK-D-Lite is Luxonis’ next-generation spatial AI camera. It can run AI and CV on-device and fuse these results with stereo disparity depth perception to provide spatial coordinates of the objects or features it detects. OAK-D-Lite combines the power of the Intel Myriad X Visual Processing Unit with a 4K (13 Mpixel) color camera and 480P stereo depth cameras, and can produce 300k depth points at up to 200 FPS. It has a USB-C connector for power delivery and communication with the host computer, and its 4.5 W max power consumption is ideal for low-power applications. It has a baseline distance of 7.5 cm, so it can perceive depth from 20 cm up to 15 m. OAK-D-Lite is an entry-level device designed to be accessible to anyone, from corporations to students. Its tiny form factor can fit just about anywhere, including in your pocket, and it comes with a sleek front gorilla-glass cover. OAK-D-Lite is offered at an MSRP of $149.

Please see here for more information on Luxonis’ OAK-D-Lite. The Edge AI and Vision Product of the Year Awards celebrate the innovation of the industry's leading companies that are developing and enabling the next generation of edge AI and computer vision products. Winning a Product of the Year award recognizes a company's leadership in edge AI and computer vision as evaluated by independent industry experts.
 
  • Like
  • Fire
Reactions: 16 users

MDhere

Regular
Seems Shikino High Tech, which has recently acquired a stake in Magik-eye, has a lot to do with Renesas :)

Renesas and Shikino appear to be joined at the hip on intelligent cameras - Shikino High-Tech cameras handling object detection, facial recognition, etc.

I will be looking forward to seeing what comes out of this. They appear to have a tie with Daitron Japan as well, but I haven't been able to find too much on that connection yet.
 
  • Like
  • Fire
Reactions: 24 users

KMuzza

Mad Scientist
Hi - I just want to mention: 1) the Brainchipinc.com monthly newsletter is due for publication - the last one was 7th June 22 - so it's out any day now.

and 2) the soon-to-be-released podcast, only hours away - VPWS Rob Telson with CMO Jerome Nadel to
"discuss the opportunities for BrainChip’s neuromorphic technology to become the de facto standard in AI at the Edge."

We have all heard those words before - so no bones about it - and the only commercially available neuromorphic SoC.
Should be interesting - both the monthly newsletter and the podcast.

AKIDA BALLISTA UBQTS.
 
  • Like
  • Love
  • Fire
Reactions: 35 users