Greetings Groovy People,
Check out this article published a few hours ago. It mentions the partnership between Prophesee and Datalogic and also discusses other companies using event-based vision systems, such as Sony and Nikon. Because such systems can significantly improve efficiency and increase the amount of data collected, they are going to be indispensable in automating a range of manufacturing processes, including counting, quality inspection, and predictive maintenance. It says here: "As investment in this space increases, the market is expected to drive growth in other industries at an exponential rate through at least 2030."
Sweet!
Event-Based Vision: Where Tech Meets Biology
January 17, 2023
By Brad Marley, Contributing Editor
Machine vision systems are helpful in automating a range of manufacturing processes, including counting, quality inspection, and predictive maintenance. However, most vision systems in use today rely on frame-based image capture technology that has been around for more than a hundred years.
The next iteration of vision systems responds to what changes in a scene, that is, to specific "events." The technology takes its cues from human biology, namely how efficiently the eye processes massive amounts of visual data. Event-based vision is built on neuromorphic computing, in which machines process information much as the brain does. This can significantly improve efficiency and increase the amount of usable data collected.
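The frame-versus-event distinction can be illustrated with a minimal sketch, not any vendor's actual pipeline: here an event stream is simulated by comparing two frames and reporting only the pixels that changed, with the threshold and frame size chosen purely for illustration.

```python
import numpy as np

def frame_to_events(prev_frame, curr_frame, threshold=15):
    """Emit an 'event' for each pixel whose brightness changed enough.

    Returns a list of (x, y, polarity) tuples: polarity +1 for a
    brightness increase, -1 for a decrease. Unchanged pixels emit
    nothing, which is where the efficiency gain comes from.
    """
    diff = curr_frame.astype(int) - prev_frame.astype(int)
    ys, xs = np.where(np.abs(diff) >= threshold)
    return [(x, y, 1 if diff[y, x] > 0 else -1) for x, y in zip(xs, ys)]

# A static scene produces no events; only the changed pixel reports.
prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[2, 3] = 200  # one pixel brightens
events = frame_to_events(prev, curr)
```

A frame-based camera would re-transmit all sixteen pixels every cycle; the event-based view transmits a single tuple for the one pixel that changed.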
A Smart Catalyst for Change
“The introduction of 3D- and AI- (artificial intelligence) based vision systems have really changed the game when it comes to event-based vision systems in the manufacturing space,” said Dan Simmons, senior sales engineer, Datalogic, which has its U.S. headquarters in Eugene, Ore. “Before you had to know how to program a system to do what you wanted it to do. The introduction of AI helps create a vision system that helps you learn what is ‘good’ and what is ‘bad.’”
Simmons explained that AI learns the deviations from good and bad, and from there it makes better determinations without having to wait for a human operator to step in and make changes as it goes. But an AI system is only as good as the images you give it, he added.
On the factory floor, there are three application categories for event-based vision systems. The first is optical character recognition. In this scenario, the camera is used for traceability. The AI can read characters that a standard OCR-capable vision system normally cannot, anything from characters on a non-flat surface to characters with very low print quality.
The second category focuses on error proofing. When it comes to quality, mistakes aren’t tolerated if a company wants to ensure success and produce the products its customers have come to expect—even if we’re just talking about junk food.
“I remember hearing about a use case where a food manufacturer wanted to ensure the right cookie was being placed in the right bag during its trip down the factory line,” Simmons said. “The company taught the vision system to simply recognize that there was writing on the package.”
He explained that it came down to the contrast between black and white, with the system able to decipher black writing on the white bag to let it proceed. If no writing was detected, the process was stopped.
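The check Simmons describes boils down to a threshold on contrast. A minimal sketch of that idea follows, with made-up threshold and pixel-count values, since the real system's parameters aren't given in the article:

```python
import numpy as np

# Hypothetical thresholds: pixels darker than 80 count as "writing",
# and a bag needs at least 50 such pixels to pass.
DARK_LEVEL = 80
MIN_WRITING_PIXELS = 50

def has_writing(gray_bag_image):
    """Let the bag proceed if enough dark pixels stand out against
    the white packaging; otherwise the line would be stopped."""
    dark_pixels = np.count_nonzero(gray_bag_image < DARK_LEVEL)
    return dark_pixels >= MIN_WRITING_PIXELS

# A blank white bag fails the check; one with dark print passes.
blank = np.full((100, 100), 255, dtype=np.uint8)
printed = blank.copy()
printed[40:50, 20:80] = 10  # a band of dark "writing"
```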
The third application falls within cameras that require calibration, such as vision-guided robotics or applications that require measurement, such as an outer diameter measurement.
For example, machines can be taught to view the gap between spark plugs to ensure width accuracy. Then it becomes a simple pass or fail report if the gap isn’t accurate. The manufacturer can then archive that data to use when teaching a next-generation system what it needs to know for a similar job.
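The pass-or-fail gap check described above can be sketched in a few lines. The nominal gap and tolerance here are hypothetical, not actual spark-plug specifications:

```python
# Hypothetical spec: nominal gap 1.00 mm with a ±0.05 mm tolerance.
NOMINAL_GAP_MM = 1.00
TOLERANCE_MM = 0.05

def inspect_gap(measured_mm):
    """Return a simple pass/fail record the line can archive
    for training a next-generation system on a similar job."""
    ok = abs(measured_mm - NOMINAL_GAP_MM) <= TOLERANCE_MM
    return {"measured_mm": measured_mm, "result": "pass" if ok else "fail"}
```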
Understanding that manufacturers don’t always have the time or capability to facilitate these teachable moments, Datalogic rolled out its IMPACT Robot Guidance system, which helps customers take advantage of smart robots by quickly and easily interfacing with any smart camera or vision processor.
“Our IMPACT software is proven to let users solve not only guidance, but many other machine vision applications with an intuitive drag and drop interface,” said Simmons. “With more than 100 vision tools, our customers won’t have to fret about not finding a guidance system that fits their needs.”
The APDIS Laser Radar is a fast, fully automated, non-contact inspection replacement for traditional CMMs for automotive quality control on the shop floor. (Provided by Nikon Metrology)
Making an Impact on Metrology
Proper measurement is vital when it comes to quality assurance and part calibration to help mitigate risk and ensure parts are built to proper specifications.
One company thriving in the metrology space is Nikon, a name most would recognize as a producer of high-end cameras, camera lenses, and microscopes. Whereas once you saw cameras hanging around the necks of tourists and amateur photographers, the phones in our pockets have taken over, leaving Nikon with a gap to fill in its offerings.
Nikon brings more than one hundred years of experience in lenses and scopes. The company revolutionized quality control and metrology across a wide range of clients, using innovative techniques such as laser-radar systems to help automotive companies, for instance, measure gaps between door frames and window holes in automobile frames.
“The benefits of an event-based vision system are very similar to what we offer our manufacturing customers,” said Pete Morken, senior application engineer, Nikon Metrology, which has a U.S. office in Brighton, Mich. “Our systems help to measure whether or not a part is good or bad simply by scanning a car body on the assembly line.”
With Nikon’s laser-radar stations, manufacturers can measure the geometry of parts—car doors, whole car chassis, etc.—as an alternative to slow, lumbering horizontal-arm coordinate measuring machine (CMM) systems.
In a typical CMM system, information is gathered slowly offline by the software and stored in a database, where it can be accessed later when decisions about non-conformances need to be made. Where such a system falls short of laser radar is speed: a company like Nikon can make sure that information is put to use more efficiently.
“With our laser-radar system, the measurement that our customers obtain can be collected, analyzed, and reported more quickly, using more data, to see improved process quality,” Morken said. “The use of pre-defined positions eliminates the requirement for further programming after installation, so the measurement program can happen immediately and continuously.”
The camera company is well-positioned to improve measurement possibilities for customers.
“Nikon has always lived at the cutting edge of technology, even as far back as its photography advances that re-shaped how we take pictures,” added Morken. “Bringing in an event-based vision system could do for metrology what the company once did for budding photographers.”
As technology advances, companies are starting to see how combining artificial intelligence with vision systems represents that next iteration of this process, and how it can re-shape how manufacturers are able to view products and parts.
At its core, a vision system enables machines to “see” necessary objects, whether it’s a part in a bin or package of cookies. In the past, companies would have to teach the machine the parts or products it needed to scan, and the machine was then limited by what it had learned. If there was a flaw in the product, the machine might not know it was an imperfection because it wasn’t taught to recognize it.
As previously noted, artificial intelligence and machine learning can be used to teach manufacturing systems the difference between good and bad parts. As a result, hand-written rules become less important as the AI does most of the work.
Nikon Metrology’s APDIS Laser Radar.
Seeing is Collecting
Audio-video giant Sony is working to take vision systems to the next level. Similar to Nikon’s transformation, Sony aims to carve out its spot in the event-based vision system industry by creating sensors that act like retinas in the human eye.
The tiny sensors are becoming ever smaller, which allows more of them to be fitted on a device to boost data collection volumes. The use of these sensors goes far beyond the manufacturing floor. As the technology improves, Sony sees deployment within collision avoidance systems, drones, and event-based 3D cameras.
Sony recently introduced what it touts as the world’s first intelligent vision sensors equipped with AI processing functionality. One highlight: The new chip will be able to identify people and objects.
This would allow cameras with the chip to identify stock levels on a store shelf or use heat maps to track and analyze customer behavior. It could even count and forecast the number of customers in a given location, providing valuable data to calculate when foot traffic is highest.
Where the technology stands to shine the most in manufacturing is around data management. Advanced sensors can identify objects and send a description of what they see without having to include an accompanying image that takes up space in the database. This could reduce storage requirements by a factor of up to 10,000, leaving companies with more space to gather critical data that they previously haven’t been able to access, while giving AI a looser leash to capture relevant information.
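The storage arithmetic behind that claim is easy to sketch: compare the size of one uncompressed grayscale HD frame with a short JSON description of what the sensor saw. The detection record below is invented for illustration, but even so the ratio lands in the tens of thousands:

```python
import json

# One uncompressed 1920x1080 8-bit grayscale frame: one byte per pixel.
FRAME_BYTES = 1920 * 1080  # 2,073,600 bytes

# A hypothetical detection record replacing that image in the database.
detection = {"label": "person", "confidence": 0.97, "bbox": [412, 230, 88, 190]}
record_bytes = len(json.dumps(detection).encode())

reduction_factor = FRAME_BYTES / record_bytes
```

Compressed video narrows the gap, but the order of magnitude explains why description-only storage is attractive.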
Working Together
As technology evolves, partnerships between companies in the event-based vision system space and those that want to deploy across other industries will become commonplace.
Datalogic is joining forces with Paris-based Prophesee, a company that invented advanced neuromorphic vision systems and is working to build the next generation of industrial products.
“We are conducting a very fruitful partnership with Prophesee,” said Michele Benedetti, chief technology officer at Datalogic. “Neuromorphic vision is a fascinating technology inspired by the behavior of the human biological system, exactly like neural networks. We believe that the combination of these technologies will provide innovative solutions to our customers.”
As investment in this space increases, the market is expected to drive growth in other industries at an exponential rate through at least 2030, according to a Grand View Research report on the U.S. machine vision market. The increasing demand for quality inspection, as well as the need for vision-guided robotic systems, is expected to fuel that growth.
While long-term forecasts for emerging technologies are far from an exact science, the future for event-based vision systems looks promising—giving manufacturers cause to be fitted with a pair of 20/20 rose-colored specs.
www.sme.org