Just bringing these up front as VVDN was mentioned. Latest newsletter from Edge AI:
edge-ai-vision.com
A NEWSLETTER FROM THE EDGE AI AND VISION ALLIANCE
Early July 2022 | VOL. 12, NO. 13
EDGE AI DEVELOPMENT AND DEPLOYMENT
Deploying PyTorch Models for Real-time Inference on the Edge (Nomitri)
In this 2021 Embedded Vision Summit presentation, Moritz August, CDO at Nomitri GmbH, provides an overview of workflows for deploying compressed deep learning models, starting with PyTorch and creating native C++ application code running in real-time on embedded hardware platforms. He illustrates these workflows on smartphones with real-world examples targeting Arm-based CPUs, GPUs, and NPUs as well as embedded chips and modules like the NXP i.MX8+ and NVIDIA Jetson Nano. August examines TorchScript, architecture-side optimizations, quantization and common pitfalls. Additionally, he shows how the PyTorch deployment workflow can be extended to conversion to ONNX and quantization of ONNX models using ONNX Runtime. On the application side, he demonstrates how deployed models can be integrated efficiently into a C++ library that runs natively on mobile and embedded devices and highlights known limitations.
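For readers who want to try the two deployment paths mentioned above, here is a minimal sketch of exporting a model to TorchScript (for LibTorch/C++) and to ONNX followed by post-training quantization with ONNX Runtime. The MobileNetV2 model, file names and quantization settings are illustrative assumptions, not details from the talk.

```python
# Minimal sketch of the PyTorch -> TorchScript and PyTorch -> ONNX deployment paths.
# Model choice (MobileNetV2) and file names are illustrative only.
import torch
import torchvision

model = torchvision.models.mobilenet_v2().eval()
example = torch.randn(1, 3, 224, 224)  # dummy input defining the traced shape

# Path 1: TorchScript, loadable from LibTorch in a native C++ application
scripted = torch.jit.trace(model, example)
scripted.save("mobilenet_v2.pt")

# Path 2: ONNX export, then post-training dynamic quantization with ONNX Runtime
torch.onnx.export(model, example, "mobilenet_v2.onnx", opset_version=13,
                  input_names=["input"], output_names=["logits"])

from onnxruntime.quantization import quantize_dynamic, QuantType
quantize_dynamic("mobilenet_v2.onnx", "mobilenet_v2.int8.onnx",
                 weight_type=QuantType.QInt8)
```

Dynamic quantization is shown here only as one illustrative option; convolution-heavy models are often better served by static (calibrated) quantization, one of the pitfalls the talk discusses.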
Streamlining the Development of Edge AI Applications (NVIDIA)
Edge AI provides benefits for cost, latency, privacy, and connectivity. However, developing and deploying optimized, accurate and effective AI on edge-based systems is a time-consuming, challenging and complex process. In this talk from the 2021 Embedded Vision Summit, Barrie Mullins, former Director of Technical Product Marketing at NVIDIA, explains how the company makes it easier for developers to build, deploy, maintain and manage embedded edge products. NVIDIA Jetson brings accelerated AI performance to the edge in a power-efficient and compact module form factor. Together with NVIDIA pretrained models, the Transfer Learning Toolkit, DeepStream and the JetPack SDK, these Jetson modules open the door for you to develop and deploy innovative products across all industries.
SECURITY AND SURVEILLANCE APPLICATIONS
Developing and Deploying a Privacy-preserving Vision-based Sensor System for Commercial Real Estate (XY Sense)
In the United States alone, commercial buildings contain roughly 70 billion square feet of space. Constructing, operating and maintaining this space consumes a tremendous amount of resources. Yet facility operators typically have little insight into how their spaces are utilized. XY Sense set out to develop a solution that gives facility operators accurate, real-time information on how spaces are utilized, enabling informed decisions about how to adjust and allocate them. XY Sense's solution uses wide-angle cameras to collect real-time occupancy information over large areas while preserving privacy and security. In this 2021 Embedded Vision Summit interview conducted by Jeff Bier, Founder of the Edge AI and Vision Alliance, Luke Murray, co-founder and CTO of XY Sense, introduces the key requirements of the application and explores some of the challenges the company faced in developing and deploying its solution, such as monitoring people's movements without compromising privacy, along with the approaches it employed to overcome them.
Challenges in Vision-based Adaptive Traffic Control Systems (Sahaj Software Solutions)
Adaptive traffic control systems (ATCSs) adjust traffic signal timing based on demand. Venkatesh Wadawadagi, Solution Consultant at Sahaj Software Solutions, begins this 2021 Embedded Vision Summit talk by presenting the main building blocks of a vision-based ATCS, including pre-processing, vehicle detection, vehicle classification and vehicle tracking. Next, he examines several of the key technical challenges in developing a computer vision-based ATCS and explores approaches for overcoming these challenges. These challenges stem from the need for an ATCS to perform accurate person and vehicle detection despite a huge variety of vehicle types, occlusion of objects of interest and difficult lighting conditions.
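As a rough illustration of those building blocks, the sketch below outlines a hypothetical vision-based ATCS loop. The function names, the tracker stand-in and the demand-based timing heuristic are assumptions for illustration, not details from the talk.

```python
# Hypothetical skeleton of the vision-based ATCS building blocks described above:
# pre-processing, vehicle detection, classification, tracking, then signal timing.
from dataclasses import dataclass

@dataclass
class Track:
    track_id: int
    vehicle_class: str   # e.g. "car", "bus", "truck", "two-wheeler"
    bbox: tuple          # (x, y, w, h)

def preprocess(frame):
    # e.g. resize, normalize, compensate for difficult lighting
    return frame

def detect_vehicles(frame):
    # stand-in for a DNN detector robust to occlusion and vehicle variety
    return []  # list of bounding boxes

def classify(frame, boxes):
    # stand-in for a per-box vehicle-type classifier
    return ["car" for _ in boxes]

def update_tracks(boxes, classes, tracks):
    # stand-in for an IoU/Kalman-style tracker maintaining identities across frames
    return tracks

def green_time_seconds(tracks, base=10, per_vehicle=2, max_green=90):
    # simple demand-based heuristic: more queued vehicles -> longer green phase
    return min(base + per_vehicle * len(tracks), max_green)
```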
UPCOMING INDUSTRY EVENTS
Accelerating TensorFlow Models on Intel Compute Devices Using Only 2 Lines of Code - Intel Webinar: August 25, 2022, 9:00 am PT
More Events
FEATURED NEWS
e-con Systems Launches a Multi-camera Solution for the NVIDIA Jetson AGX Orin
New Arm Total Compute Solutions Redefine Visual Experiences
The VVDN-QCS610/410 Development Kit Meets Next-generation Visual AI Intelligence Application Requirements
IDS Imaging Development Systems' uEye XC Closes the Market Gap Between Industrial Cameras and Webcams
Qualcomm's New Unified AI Stack Portfolio Revolutionizes Developer Access and Extends AI Leadership Across the Connected Intelligent Edge
More News
EDGE AI AND VISION PRODUCT OF THE YEAR WINNER SHOWCASE
Luxonis OAK-D-Lite (Best Camera or Sensor)
Luxonis’ OAK-D-Lite is the 2022 Edge AI and Vision Product of the Year Award winner in the Cameras and Sensors category. OAK-D-Lite is Luxonis’ next-generation spatial AI camera. It can run AI and CV on-device and fuse those results with stereo disparity depth perception to provide spatial coordinates of the objects or features it detects. OAK-D-Lite combines the power of the Intel Myriad X Vision Processing Unit with a 4K (13 Mpixel) color camera and 480p stereo depth cameras, and can produce 300k depth points at up to 200 FPS. It has a USB-C connector for power delivery and communication with the host computer, and its 4.5 W maximum power consumption suits low-power applications. Its 7.5 cm baseline lets it perceive depth from 20 cm up to 15 m. OAK-D-Lite is an entry-level device designed to be accessible to anyone, from corporations to students. Its tiny form factor can fit just about anywhere, including in your pocket, and it comes with a sleek Gorilla Glass front cover. OAK-D-Lite is offered at an MSRP of $149.
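For a rough idea of how such a device is driven in software, below is a minimal sketch that reads one stereo depth frame using Luxonis’ open-source DepthAI Python API (the depthai package, v2-style node graph). The queue settings and the specific nodes shown are assumptions for illustration, not an official example.

```python
# Minimal sketch: read a stereo depth frame from an OAK device with DepthAI (v2 API).
import depthai as dai

pipeline = dai.Pipeline()

left = pipeline.create(dai.node.MonoCamera)
right = pipeline.create(dai.node.MonoCamera)
stereo = pipeline.create(dai.node.StereoDepth)
xout = pipeline.create(dai.node.XLinkOut)

left.setBoardSocket(dai.CameraBoardSocket.LEFT)
right.setBoardSocket(dai.CameraBoardSocket.RIGHT)
xout.setStreamName("depth")

left.out.link(stereo.left)
right.out.link(stereo.right)
stereo.depth.link(xout.input)

with dai.Device(pipeline) as device:          # connects to the camera over USB-C
    q = device.getOutputQueue("depth", maxSize=4, blocking=False)
    depth_frame = q.get().getFrame()          # uint16 depth map in millimeters
    print(depth_frame.shape)
```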
Please see here for more information on Luxonis’ OAK-D-Lite. The Edge AI and Vision Product of the Year Awards celebrate the innovation of the industry's leading companies that are developing and enabling the next generation of edge AI and computer vision products. Winning a Product of the Year award recognizes a company's leadership in edge AI and computer vision as evaluated by independent industry experts.
About This E-Mail
LETTERS AND COMMENTS TO THE EDITOR: Letters and comments can be directed to the editor, Brian Dipert, at insights@edge-ai-vision.com.
PASS IT ON... Feel free to forward this newsletter to your colleagues. If this newsletter was forwarded to you and you would like to receive it regularly, click here to register.
Edge AI and Vision Alliance, 1646 N California Blvd, Suite 360, Walnut Creek, California 94596, United States, +1 925.954.1411