Relevant article following the recent Embedded Vision Summit.
www.vision-systems.com
Computing at the Edge and More for Embedded Vision
May 17, 2022
Definable or not, end users continue deploying embedded vision in more and more applications.
Chris McLoone
In part...
Emerging Applications
Just as it is difficult to come up with an all-encompassing definition for embedded vision, so is it difficult to compile a comprehensive list of applications. Although embedded vision has roots on the factory floor, its use today goes well beyond the confines of industrial automation. “The industrial field should be regarded as the starting point of embedded vision,” Huang says. “Embedded vision was widely used on production lines in factories and was fixed on equipment to complete the tasks of positioning, measurement, identification, and inspection. Today, embedded vision walks out of the ‘box’ as an industrial robot or an autonomous mobile robot (AMR). It is widely used in a variety of sectors ranging from agriculture, transportation, automotive, mining, medical to military.”
Mention of AMRs turns the discussion toward more dynamic environments. Ni cites warehouses as one such environment. Unlike embedded vision systems deployed on production lines, which feature well-configured lighting, warehouse applications feature autonomous guided vehicles (AGVs)/AMRs that use vision for localization and obstacle detection. Another example is intelligent agricultural machinery that uses cameras for autonomous guidance. “These applications bring new challenges since it’s almost impossible to have a stable, well-set-up illumination condition,” he says. “Thus, we may need new cameras, new computing hardware, and new software to get the job done.”
Schmitt says Vision Components sees growing demand for embedded vision in all fields of business—both consumer and industrial. One recent application used AMRs for intralogistics and manufacturing processes. MIPI camera modules provide reliable and precise navigation, especially when collaborating with other robot systems and when positioning and aligning in narrow passages.
Taking industrial automation as an example for AMRs, Kenny Chang, vice president of system product BU at ASRock Industrial (Taipei City, Taiwan;
www.asrockind.com), explains that AMRs employ embedded vision to sense their surroundings in the factory. “In addition, AI-implemented Automated Optical Inspection is another big trend for smart manufacturing, delivering huge benefits to manufacturers.”
Speaking of AI, Jeremy Pan, product manager, Aetina (New Taipei City, Taiwan;
www.aetina.com), says that other AI applications for embedded vision include AI inference. One example is a virtual fence solution that detects when factory staff enter a working robotic arm’s range of motion and forces the arm to stop. Additionally, AI visual inspection can be used for defect detection in factories.
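The virtual-fence idea can be sketched in a few lines: a detector (not shown) supplies person bounding boxes per frame, and the system stops the arm when any box enters a keep-out zone around it. The function names, zone geometry, and thresholds below are illustrative assumptions, not Aetina’s actual implementation.

```python
# Hypothetical virtual-fence check; geometry and names are illustrative only.

def in_keepout(box, center, radius):
    """Return True if a person bounding box (x1, y1, x2, y2) intersects
    a circular keep-out zone around the robot arm."""
    x1, y1, x2, y2 = box
    cx, cy = center
    # Closest point on the box to the zone center, then compare distances.
    nx = min(max(cx, x1), x2)
    ny = min(max(cy, y1), y2)
    return (nx - cx) ** 2 + (ny - cy) ** 2 <= radius ** 2

def arm_should_stop(detections, center=(320, 240), radius=120):
    """Stop the arm if any detected person enters the keep-out zone."""
    return any(in_keepout(box, center, radius) for box in detections)

if __name__ == "__main__":
    frame_detections = [(500, 100, 560, 220),   # person far from the arm
                        (280, 200, 340, 330)]   # person inside the zone
    print(arm_should_stop(frame_detections))    # True -> issue stop command
```

In practice the detections would come from an AI inference model running on the edge device, and the stop signal would go to the arm controller over the plant network; the geometric check itself is cheap enough to run on every frame.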
Ed Goffin, Marketing Manager, Pleora Technologies Inc. (Kanata, ON, Canada;
www.pleora.com), adds, “Probably like many others, we’re seeing more emerging applications integrating AI and embedded processing for vision applications. For offline or manual inspection tasks, there are desktop systems that integrate a camera, edge processing, display panel, and AI apps to help guide operator decision-making. The next step in the development of these systems is to integrate the processing directly into the cameras, so they are even more compact for a manufacturing setting. For manual inspection tasks, these desktop embedded vision applications help operators quickly make quality decisions on products. Behind-the-scenes, the AI model is consistently and transparently trained based on user actions. These systems really take advantage of ‘the best of’ embedded processing, in terms of compact size, local decision making, and powerful processing—plus cost—to help manufacturers leverage the benefits of AI.”
Charisis cites smart cities as an emerging application for embedded vision. “What we recognize as an emerging trend is the increasing adoption of embedded vision at scale in civilian and commercial applications on smart cities and smart spaces,” he says. Applications here include smart lighting poles that sense the roads and adapt luminance to vehicle traffic and pedestrian usage, smart traffic lights that optimize traffic management and minimize commute or dwell times by adjusting to real-time traffic conditions, and smart bus stops that sense people and improve queues through better planning and routing. There are even smart trash bins that optimize waste management and maintenance scheduling.
Basler’s (Ahrensburg, Germany;
www.baslerweb.com) Florian Schmelzer, product marketing manager, Embedded Vision Solutions, explains that high dynamic range (HDR) is opening up applications for its dart camera series, for example intelligent light systems. These systems must operate reliably in highly variable conditions—there could be glistening daylight or there could be twilight situations. “This is just one scenario where Basler’s embedded camera dart with HDR feature is able to deliver the required image quality so that the embedded vision system as a whole functions,” he says.
Lansche cites a recent example from MATRIX VISION where two remote sensor heads with different image sensors—each for day or night use—were part of a license plate recognition system for a British traffic monitoring company. “Also, multicamera applications in agriculture, in the food sector, medical technology, or in production are predestined for embedded vision,” he says.
“The most exciting innovation in embedded vision is currently happening with the combination and optimization of the edge and the cloud,” says Sebastien Dignard, president, iENSO (Richmond Hill, ON, Canada;
www.ienso.com). “That means, embedding a camera is no longer about taking a great picture. It’s about how you analyze the picture, what vision data you can extract, and what you do with that data. The System on Chip (SoC), which is the ‘brain’ of the camera, is driving this new reality. SoCs are becoming smaller, more powerful, and more affordable. Optimization from the edge to the cloud, aggregating and analyzing vision data from multiple devices, and making business decisions based on this analysis—these are all elements of embedded vision that are taking center stage. We see this being deployed in applications from home appliances to robotics and precision farming.”
Subramanyam states that embedded vision has been revolutionizing the retail, medical, agricultural, and industrial markets. He also says that some of the cutting-edge applications across these markets where embedded vision is creating a wave of change are automated sports broadcasting systems, smart signage and kiosks, autonomous shopping systems, agricultural vehicles and robots, point-of-care diagnostics, and life sciences and lab equipment.