Snapdragon Smart Vision Improves Robotic Material Handling & Tracking
With the recent changes to workplaces brought on by the coronavirus pandemic, the need for solutions capable of supporting reduced workforces and socially distant operations has taken on an unprecedented level of urgency. Industry has been on a steady march toward automation for decades, incorporating robotics, sensors, and automatic data capture and processing tools to reduce reliance on large workforces and improve quality. We are now on the cusp of a new revolution in automation driven by improvements in computer vision technology and corresponding reductions in the cost and complexity of the sensors and computing platforms required to power it. Automation solutions are now more broadly capable, cost less, and can be implemented as system-level solutions that integrate seamlessly alongside traditional workforces.
To improve the quality and speed of production, it is important to measure throughput and yield at as many points in the process as possible. This makes it possible to zero in on potential sources of slowdowns or quality issues, while also measuring the effects of implemented changes in near real time. Traditional solutions required intensive integration efforts: costly hardware and custom software to link disparate databases and present usable data. The availability of high-performance computing hardware with integrated wireless connectivity now allows automation solutions to be conceived, designed, and implemented at the system level. All equipment endpoints can be designed from the ground up to communicate with one another and with a central hub over a wireless or wired network while natively capturing, storing, and presenting data in a cross-compatible format. User-level applications can then display this data in an actionable way in real time.
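As a simple illustration, the sketch below shows how an equipment endpoint might report its throughput and yield counts to a central hub in a shared, cross-compatible format. The JSON schema, hub URL, and station name are hypothetical assumptions for illustration; an actual deployment would use whatever message transport and schema the system architect selects (MQTT, OPC UA, a message queue, and so on).

```python
import json
import time
import urllib.request

# Hypothetical hub endpoint; a real system might use MQTT, OPC UA, or a
# message queue instead of plain HTTP.
HUB_URL = "http://factory-hub.local/api/telemetry"

def report_station_counts(station_id: str, units_in: int, units_out: int, rejects: int) -> None:
    """Package throughput and yield counts in a shared JSON format and post them to the hub."""
    payload = {
        "station_id": station_id,
        "timestamp": time.time(),
        "units_in": units_in,
        "units_out": units_out,
        "rejects": rejects,
        "yield_pct": round(100.0 * units_out / units_in, 2) if units_in else None,
    }
    request = urllib.request.Request(
        HUB_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        response.read()  # hub acknowledges receipt

# Example: a labeling station reporting one minute of production
report_station_counts("label-station-3", units_in=120, units_out=118, rejects=2)
```

Because every endpoint emits the same schema, a dashboard application can aggregate throughput and yield across stations without custom per-machine adapters.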
As an example, component and finished-goods inventory is often tracked manually, requiring workers to scan barcodes, verify counts, and monitor the progress of packaged goods during transport. At each step there is a risk of missing or inaccurate data (missing, damaged, or incorrect labels and counts), and during transport goods can become waylaid or even lost. Automated systems can help reduce these risks without increasing operational complexity. Traditional barcode labels can be combined with inexpensive machine-readable RFID tags. Factory automation equipment can simultaneously read the barcode, match it against the RFID tag, and use computer vision to confirm that the packaged goods match the labeled contents. Once confirmed at a checkpoint, computer vision techniques and long-range RFID can track the movement and location of goods as they move through the factory. Goods are verified through redundant systems and tracked from the moment they arrive on the loading dock to the time they are loaded for shipment to a customer. This kind of synergistic redundancy and system-level cooperation is only possible now that computationally intensive computer vision techniques can be implemented reliably on low-cost hardware such as single-board computers (SBCs) and system-on-modules (SOMs) built around Qualcomm's Snapdragon processors.
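The cross-check at a checkpoint can be expressed very compactly. The sketch below assumes the barcode, RFID, and vision readings have already been collected by their respective drivers; the field names and confidence threshold are illustrative, and the point is the redundancy logic rather than any particular device API.

```python
from dataclasses import dataclass

@dataclass
class CheckpointResult:
    barcode_sku: str          # SKU decoded from the printed barcode label
    rfid_sku: str             # SKU read from the RFID tag
    vision_sku: str           # SKU predicted by the vision classifier
    vision_confidence: float  # classifier confidence, 0..1

def verify_checkpoint(result: CheckpointResult, min_confidence: float = 0.85) -> bool:
    """Pass a pallet only if barcode, RFID, and vision agree on the same SKU."""
    ids_agree = result.barcode_sku == result.rfid_sku == result.vision_sku
    vision_ok = result.vision_confidence >= min_confidence
    return ids_agree and vision_ok

# Example readings from one pallet arriving at a checkpoint
reading = CheckpointResult(
    barcode_sku="SKU-4471",
    rfid_sku="SKU-4471",
    vision_sku="SKU-4471",
    vision_confidence=0.93,
)

if verify_checkpoint(reading):
    print("Pallet confirmed; hand off to RFID/vision tracking")
else:
    print("Mismatch detected; divert pallet for manual inspection")
```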
While traditional factories present a clear case for cost savings through automation, the calculus has not been as clear-cut for industries where labor is inexpensive and the application of automation is significantly more complex due to more dynamic environments. Grocery and retail stores are one example. While automatic verification of shelf stock and refilling of missing stock would be helpful, implementing it in a retail environment is challenging. Solutions must deal with moving objects (grocery carts), people (and their pets!), potential hazards (spills and construction), and dynamic lighting conditions (skylights and reflective surfaces such as high-gloss flooring). Until now, this has been a major barrier to developing battery-powered solutions at a reasonable cost. Mobile-focused chipsets like Snapdragon are now making this possible by enabling the fusion of sensor information from visible light, IR, and depth mapping through LIDAR or sonar, cross-checked against known RF location beacons. Combining this sensor information allows for reliable identification of store products while navigating constantly shifting hazards.
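A much-simplified sketch of that kind of sensor fusion is shown below: each detector contributes a confidence score, and the nearest RF beacon is used as a sanity check on location before the detection is trusted. The weights, threshold, and beacon zones are illustrative assumptions, not a description of any specific Snapdragon pipeline.

```python
from dataclasses import dataclass

@dataclass
class ShelfObservation:
    product_id: str
    rgb_confidence: float    # visible-light classifier score, 0..1
    ir_confidence: float     # IR consistency score, 0..1
    depth_confidence: float  # LIDAR/sonar shape-match score, 0..1
    beacon_zone: str         # zone reported by the nearest RF beacon
    expected_zone: str       # zone where the planogram says this product lives

def fuse_observation(obs: ShelfObservation,
                     weights=(0.5, 0.2, 0.3),
                     threshold: float = 0.8) -> bool:
    """Accept a shelf detection only if the weighted sensor score is high
    and the RF beacon places the robot in the expected aisle."""
    w_rgb, w_ir, w_depth = weights
    fused = (w_rgb * obs.rgb_confidence
             + w_ir * obs.ir_confidence
             + w_depth * obs.depth_confidence)
    in_expected_zone = obs.beacon_zone == obs.expected_zone
    return fused >= threshold and in_expected_zone

# Example: a cereal box spotted under harsh skylight glare
obs = ShelfObservation(
    product_id="cereal-oat-500g",
    rgb_confidence=0.72,   # glare lowers the visible-light score
    ir_confidence=0.90,
    depth_confidence=0.95,
    beacon_zone="aisle-7",
    expected_zone="aisle-7",
)
print("stock confirmed" if fuse_observation(obs) else "flag for restock check")
```

Here the IR and depth channels compensate for the glare-degraded visible-light score, which is exactly the kind of multi-sensor redundancy that makes the retail environment tractable.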
As hardware costs continue to drop and the computing power and machine vision technology in mobile platforms continue to improve, we will see these situationally aware, general-purpose automation systems grow in capability and spread in application. Small industries that have long relied on low-cost labor will increasingly be able to afford automation and reap the corresponding rewards of improved quality and increased productivity. Penguin Edge, with its Qualcomm Snapdragon-based products, is perfectly poised to address this segment.