{"id":125136,"date":"2023-02-22T22:40:59","date_gmt":"2023-02-22T22:40:59","guid":{"rendered":"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/"},"modified":"2024-12-06T21:05:04","modified_gmt":"2024-12-06T21:05:04","slug":"vision-and-discrete-sensors","status":"publish","type":"page","link":"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/","title":{"rendered":"Vision and Discrete Sensors"},"content":{"rendered":"<h4 data-uw-styling-context=\"true\">Machine Vision Insights<\/h4>\n<ul>\n<li data-uw-styling-context=\"true\">Staffing shortages and major supply chain issues are pushing manufacturers to place greater emphasis on automation to meet surging demand. Machine vision and sensing technologies are a major part of this trend.<\/li>\n<li data-uw-styling-context=\"true\">Autonomous mobile robots (AMRs) and automated guided vehicles (AGVs) use machine vision and sensing technologies to navigate freely and are a key part of automation in logistics.<\/li>\n<li data-uw-styling-context=\"true\">Machine learning (ML) and deep learning (DL) on embedded platforms are solving applications that were once very challenging and are opening up new insights from the sensing and machine vision advancements being developed on the manufacturing floor.<\/li>\n<\/ul>\n<hr \/>\n<p data-uw-styling-context=\"true\">Worldwide, 131 billion parcels were shipped in 2020, according to the 2021 Pitney Bowes Parcel Shipping Index. By 2026, that number is expected to more than double, accelerated by a global pandemic and a growing e-commerce industry. 
With the dramatic increase in retail purchases made online, the need to automate logistics, warehouse, and shipping processes has become a top priority.<sup data-uw-styling-context=\"true\">1<\/sup><\/p>\n<p data-uw-styling-context=\"true\">Package measurement, quality inspection, barcode reading, optical character recognition\/optical character verification (OCR\/OCV), and material handling optimization, which many companies currently carry out manually, are key stages of the shipping industry value chain that lend themselves to automation.<\/p>\n<p data-uw-styling-context=\"true\">\u201cLogistics, warehousing, and shipping organizations are struggling to operate faster. But speeding things up means accuracy and precision are imperative because there\u2019s no time to deal with errors. And then there are the staffing issues,\u201d said Mark Wheeler, director of supply chain solutions, <a href=\"https:\/\/www.automate.org\/companies\/zebra-technologies\" target=\"_blank\" rel=\"noopener\" data-uw-styling-context=\"true\">Zebra Technologies<\/a>. \u201cWhen you put those three things together, what you get is a market that\u2019s very open to trying new things by combining existing and new technologies in innovative ways.\u201d<\/p>\n<p data-uw-styling-context=\"true\">Much of this innovation centers on machine vision.<\/p>\n<h2 data-uw-styling-context=\"true\">Vision-guided robotics<\/h2>\n<p data-uw-styling-context=\"true\">In a warehouse or distribution center, pallet loads typically mark the beginning and end of the warehousing process. Upon entry to a facility, pallet loads are either depalletized into individual cases or stored as full pallets. Depalletizing applications have transitioned from using mostly manual labor to relying on vision-guided robotics. 
Machine vision accelerates this process by localizing the next package to pick while the robot is placing the previous load on the conveyor.<\/p>\n<p data-uw-styling-context=\"true\">\u201cMost packages arrive at, and leave from, warehouses as pallet loads,\u201d said Garrett Place, business development, robotics perception, <a href=\"https:\/\/www.automate.org\/companies\/ifm-efector-inc\" target=\"_blank\" rel=\"noopener\" data-uw-styling-context=\"true\">ifm efector<\/a>, inc. \u201cTheir journey through the modern warehouse is at the heart of most machine vision applications in logistics.\u201d<\/p>\n<p data-uw-styling-context=\"true\">Ben Carey, senior manager, logistics vision products, <a href=\"https:\/\/www.automate.org\/companies\/cognex-corporation\" target=\"_blank\" rel=\"noopener\" data-uw-styling-context=\"true\">Cognex Corporation<\/a>, agreed. \u201cMachine vision applications in logistics span four areas: gauging, inspection, guidance, and identification. Each of these areas is present from the inbound receiving processes through sorting all the way to outbound check points.\u201d<\/p>\n<h2 data-uw-styling-context=\"true\">Autonomous mobile robots (AMRs)<\/h2>\n<p data-uw-styling-context=\"true\">Ask a machine vision solution developer about the best way to bring repeatability to a use case, and they will likely say something about limiting the number of variables. After all, variables create edge cases. But most warehousing and logistics operations move packages that can be any color, size, shape, and material. This degree of variability makes technology selection \u2014 and solution creation \u2014 extremely difficult.<\/p>\n<p data-uw-styling-context=\"true\">\u201cThe Amazon Pick challenge in years past is a perfect example of this,\u201d Place said, \u201cand a primary reason most machine vision use cases in logistics are multicamera and multimodal. 
One camera and one technology are just not enough to manage the variability in these types of applications.\u201d<\/p>\n<p data-uw-styling-context=\"true\">John Leonard,\u00a0<a href=\"https:\/\/www.automate.org\/companies\/zivid\" target=\"_blank\" rel=\"noopener\" data-uw-styling-context=\"true\">Zivid<\/a> product marketing manager, concurred. \u201cThe major applications are depalletization and palletization of boxes entering and leaving a facility. In between these in\/out operations are mostly piece-picking operations and order picking to fulfill orders,\u201d he said. \u201cThese are accomplished using different methods, which vary from place to place.\u201d<\/p>\n<p data-uw-styling-context=\"true\"><a href=\"https:\/\/www.automate.org\/events\/autonomous-mobile-robots-and-logistics-week-2022\" target=\"_blank\" rel=\"noopener\" data-uw-styling-context=\"true\">These methods include autonomous mobile robots (AMRs)<\/a>\u00a0guided by onboard 3D vision. AMRs can, for instance, travel autonomously to walls of bins to find and select items. Robots can also pick items fed by a conveyor. Other mobile robots may carry items to vision stations so that the type and amount of goods can be inspected.<\/p>\n<p><img decoding=\"async\" class=\"size-full wp-image-521512\" src=\"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/11\/CTL2207_WEB_IMG_A3_Zebra_MachineVision.jpg\" alt=\"Zebra's FlexShelf Guide, which provides flexible configurations for bin sizing and spacing, expands the types of items that can be picked using AMRs. Courtesy: Zebra Technologies\/A3\" \/> Zebra&#8217;s FlexShelf Guide, which provides flexible configurations for bin sizing and spacing, expands the types of items that can be picked using AMRs. 
Courtesy: Zebra Technologies\/A3<\/p>\n<h2>Automatic guided vehicles (AGVs)<\/h2>\n<p data-uw-styling-context=\"true\">Alternatively, for full pallet load storage, many warehouses deploy automatic guided vehicles (AGVs) to pick and store pallets for retrieval. During travel, AGVs rely on machine vision for pallet pose and obstacle detection. Machine vision code reading tracks pallet and caseloads throughout the process.<sup data-uw-styling-context=\"true\">4<\/sup><\/p>\n<p data-uw-styling-context=\"true\">When full pallets are ready to leave a facility, AGVs manage the movement while robotic arms convert caseloads to full pallets. These ready-to-ship pallets are then weighed and measured before entering the truck, making pallet dimensioning another strong use case for machine vision in logistics.<\/p>\n<p data-uw-styling-context=\"true\">\u201cThe industry has undergone a shift, moving from assessing shipping fees strictly by weight to charging by dimensional weight\u2013\u2013making accurate dimensional measurement more critical than ever,\u201d said Daniel Howe, regional development manager &#8211; Americas, <a href=\"https:\/\/www.automate.org\/companies\/lmi-technologies-inc\" target=\"_blank\" rel=\"noopener\" data-uw-styling-context=\"true\">LMI Technologies<\/a>. \u201cSmart 3D sensors are a key driver for greater automation for processes in packaging and logistics, including volume dimensioning, sizing, sorting, and surface defect detection.\u201d<\/p>\n<p data-uw-styling-context=\"true\">Many AMRs and AGVs rely on the ifm efector O3R platform for robotic perception. It consists of compact camera heads (VGA cameras and time-of-flight sensors) and a vision processing unit (VPU) with NVIDIA Jetson TX2 for the evaluation of the data. 
Up to six camera heads can be connected to the Linux-based device, including sensors from other companies.<sup data-uw-styling-context=\"true\">3<\/sup><\/p>\n<h2>High demand for increased speed, throughput<\/h2>\n<p data-uw-styling-context=\"true\">While there are many challenges in logistics and warehousing applications, the demand for greater speed and increased throughput is constant. Challenges include items wrapped in transparent poly bags, which are difficult to image because of how they reflect light. Other piece-picking operations may require color as part of the item detection process, which may necessitate 3D vision that supports color information in the image.<\/p>\n<p data-uw-styling-context=\"true\">Calibration is a big challenge for all 3D cameras: they are engineered to work in the range of micrometers, and the knocks, temperature fluctuations, and vibrations common in industrial settings can easily affect their calibration, and thus their accuracy, according to Leonard.<\/p>\n<p data-uw-styling-context=\"true\">\u201cSome cameras, such as Zivid 3D cameras, are specifically designed and built to operate in industrial settings, are rated to IP65, and have automatic calibration features,\u201d Leonard said. \u201cThis means if the temperature changes by, say, 5 degrees due to a large roller door being opened and closed, a very common occurrence in a logistics warehouse, then the camera adjusts for this to remain perfectly calibrated.\u201d<\/p>\n<h2>Box volume dimensioning and void filling<\/h2>\n<p data-uw-styling-context=\"true\">LMI has developed the ultrawide field of view (FOV) Gocator 2490 sensor, which is specifically designed to provide fast and accurate parcel dimension measurement for shipping. Another application measures boxes to provide an accurate volumetric measurement for determining dimensional weight. Boxes may be traveling on a conveyor at speeds of 2 m\/s. 
A single wide field of view Gocator 2490 smart sensor can scan and measure complete box dimensions (W x H x D) over a 1 m x 1 m scan area at a rate of 800 Hz and provide resolutions of 2.5 mm in all three dimensions (X, Y, Z), according to Howe.<\/p>\n<p data-uw-styling-context=\"true\">\u201cCompeting camera-based systems typically offer just 3-to-5-millimeter resolution in the X, Y, and Z axes. However, each of our sensors varies in measurement range and resolution, so it is essential to pick the correct one for your application,\u201d Howe said. \u201cThe Gocator 2490 has a high enough resolution to measure not only the dimensions of a variety of parcel sizes but even detect subtle defects in the packaging. This in-line inspection functionality allows a pass\/fail decision to be triggered if a package with a defect is detected.\u201d<\/p>\n<p data-uw-styling-context=\"true\">The Gocator 2490 has also opened up opportunities to solve more advanced packaging applications like void filling, which involves scanning an open package with items in it and determining how much packaging material is required to fill the empty space. For this application, a dual camera configuration helps avoid occlusion within the box or tote.<\/p>\n<h2>Deep learning on the edge<\/h2>\n<p data-uw-styling-context=\"true\">Challenges for machine vision in logistics arise when complexity multiplies in an application \u2014 for example, detecting different types of objects of varying dimensions in random orientations on a high-speed conveyor \u2014 and traditional rules-based machine vision for detection\/inspection struggles in these situations.<\/p>\n<p data-uw-styling-context=\"true\">However, easy-to-use machine learning (ML) and deep learning (DL) on embedded platforms are emerging to solve previously challenging applications. For example, Cognex recently launched the In-Sight 2800 with edge learning, which is easy to set up with no programming required. 
The In-Sight 2800 gives fast and accurate classification of everything from boxes to totes to poly bags and runs entirely onboard the smart camera, according to Carey.<\/p>\n<p data-uw-styling-context=\"true\">\u201cTechnologies such as edge learning on the In-Sight 2800 increase package detection rates, leading to less manual rework and enabling better order accuracy through more advanced material handling automation,\u201d Carey said. \u201cOur customers benefit from increased processing speed with less manual interaction, allowing these companies to manage fluctuating demand without changing headcount, which continues to be a challenge in today\u2019s labor-constrained environment.\u201d<\/p>\n<h2>Democratizing machine vision<\/h2>\n<p data-uw-styling-context=\"true\">Most of the technologies being deployed in the modern warehouse, including 2D and 3D cameras and increased compute power, are iterations of previously known approaches. What is somewhat new is the use of all these technologies in multicamera, multimodal strategies with large processing capability, in combination with ML, to manage the application.<\/p>\n<p data-uw-styling-context=\"true\">\u201cWe used to see single vendor solutions in the warehouse,\u201d Place said. \u201cWe now see a combination of vendors and technologies, each with their own strengths, deployed in unison to solve the challenge. This approach will continue to unlock use cases previously untouched by machine vision. Think of it as a democratizing of machine vision in warehousing and logistics.\u201d<\/p>\n<p data-uw-styling-context=\"true\">It\u2019s difficult to put a finger on a single technology advance that is unlocking new use cases for machine vision in warehousing and logistics. Of course, cameras are providing better, more repeatable data and compute is faster, but nothing has changed the game. 
The biggest advance is in how easy the components are to use in a multi-technology approach to solving problems in the warehouse.<\/p>\n<p data-uw-styling-context=\"true\">\u201cLogistics is moving toward robotics as a primary method to manage the massive growth in the industry,\u201d concluded Place. \u201cRobotics is an integration problem. Machine vision, with all of its complexities, is moving from a single camera focus to one that reduces friction on the integration of all of the components required for the modern warehouse. This approach will take us to the next step in this journey.\u201d<\/p>\n<p data-uw-styling-context=\"true\">&#8211; This originally appeared on the <a href=\"https:\/\/www.automate.org\/industry-insights\/machine-vision-and-automation-streamline-logistics-and-warehousing-operations\">Association for Advancing Automation&#8217;s (A3) website<\/a>. A3 is a CFE Media and Technology content partner. Edited by Chris Vavra, web content manager,\u00a0<em>Control Engineering<\/em>, CFE Media and Technology, <a href=\"mailto:cvavra@cfemedia.com\">cvavra@cfemedia.com<\/a>.<\/p>\n<h2>References:<\/h2>\n<ol>\n<li data-uw-styling-context=\"true\"><em data-uw-styling-context=\"true\">LMI Technologies &#8211; LMI Technologies: 3D Smart Sensors for &#8230;<\/em>, https:\/\/www.manufacturingtechnologyinsights.com\/lmi-technologies.<\/li>\n<li data-uw-styling-context=\"true\"><em data-uw-styling-context=\"true\">Zebra Technologies Corporation &#8211; Zebra Technologies &#8230;<\/em>, https:\/\/investors.zebra.com\/news-and-events\/news\/news-details\/2021\/Zebra-Technologies-Expands-Fetch-Robotics-Portfolio-with-Solution-to-Optimize-Fulfillment-Workflows\/default.aspx.<\/li>\n<li data-uw-styling-context=\"true\"><em data-uw-styling-context=\"true\">ifm efector O3R Democratizes Robotic Perception &#8211; Robotics &#8230;<\/em>, https:\/\/www.roboticsbusinessreview.com\/rbr50-company-2022\/ifm-efector-o3r-democratizes-robotic-perception-2\/.<\/li>\n<li 
data-uw-styling-context=\"true\"><em data-uw-styling-context=\"true\">Automated high-speed pallet storage and data capture<\/em>. (n.d.). Www.Cognex.Com. Retrieved April 20, 2022, from https:\/\/www.cognex.com\/applications\/customer-stories\/logistics\/automated-high-speed-pallet-storage-and-data-capture<\/li>\n<li data-uw-styling-context=\"true\"><em data-uw-styling-context=\"true\">Zebra adds to fulfillment stripes with Fetch Robotics AMRs &#8230;<\/em>, https:\/\/www.freightwaves.com\/news\/zebra-adds-to-fulfillment-stripes-with-fetch-robotics-amrs-workflows.<\/li>\n<\/ol>\n","protected":false},"excerpt":{"rendered":"<p>With the magnitude increase in retail purchases made online, the need to automate logistics and shipping processes has become a top priority.<\/p>\n","protected":false},"author":6,"featured_media":118173,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_acf_changed":false,"pgc_sgb_lightbox_settings":"","footnotes":""},"wtwh-gf-sponsor":[],"class_list":{"2":"type-page"},"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v25.9 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Vision and Discrete Sensors - Control Engineering<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Vision and Discrete Sensors - Control Engineering\" \/>\n<meta property=\"og:description\" content=\"With the magnitude increase in retail purchases made online, the need to automate logistics and shipping processes has become a top priority.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/\" \/>\n<meta property=\"og:site_name\" 
content=\"Control Engineering\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/ControlEngineeringMagazine\" \/>\n<meta property=\"article:modified_time\" content=\"2024-12-06T21:05:04+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/11\/CTL2207_WEB_IMG_A3_Zebra_MachineVision.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1000\" \/>\n\t<meta property=\"og:image:height\" content=\"668\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:site\" content=\"@controlengtips\" \/>\n<meta name=\"twitter:label1\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data1\" content=\"9 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/\",\"url\":\"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/\",\"name\":\"Vision and Discrete Sensors - Control 
Engineering\",\"isPartOf\":{\"@id\":\"https:\/\/www.controleng.com\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/11\/CTL2207_WEB_IMG_A3_Zebra_MachineVision.jpg\",\"datePublished\":\"2023-02-22T22:40:59+00:00\",\"dateModified\":\"2024-12-06T21:05:04+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/#primaryimage\",\"url\":\"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/11\/CTL2207_WEB_IMG_A3_Zebra_MachineVision.jpg\",\"contentUrl\":\"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/11\/CTL2207_WEB_IMG_A3_Zebra_MachineVision.jpg\",\"width\":1000,\"height\":668},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.controleng.com\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Vision and Discrete Sensors\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.controleng.com\/#website\",\"url\":\"https:\/\/www.controleng.com\/\",\"name\":\"Control Engineering\",\"description\":\"Control Engineering covers and educates about automation, control and instrumentation 
technologies\",\"publisher\":{\"@id\":\"https:\/\/www.controleng.com\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.controleng.com\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/www.controleng.com\/#organization\",\"name\":\"Control Engineering\",\"url\":\"https:\/\/www.controleng.com\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.controleng.com\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/12\/ce_logo.png\",\"contentUrl\":\"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/12\/ce_logo.png\",\"width\":300,\"height\":93,\"caption\":\"Control Engineering\"},\"image\":{\"@id\":\"https:\/\/www.controleng.com\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.facebook.com\/ControlEngineeringMagazine\",\"https:\/\/x.com\/controlengtips\",\"https:\/\/www.linkedin.com\/company\/control-engineering-magazine\/\"]}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Vision and Discrete Sensors - Control Engineering","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/","og_locale":"en_US","og_type":"article","og_title":"Vision and Discrete Sensors - Control Engineering","og_description":"With the magnitude increase in retail purchases made online, the need to automate logistics and shipping processes has become a top priority.","og_url":"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/","og_site_name":"Control Engineering","article_publisher":"https:\/\/www.facebook.com\/ControlEngineeringMagazine","article_modified_time":"2024-12-06T21:05:04+00:00","og_image":[{"width":1000,"height":668,"url":"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/11\/CTL2207_WEB_IMG_A3_Zebra_MachineVision.jpg","type":"image\/jpeg"}],"twitter_card":"summary_large_image","twitter_site":"@controlengtips","twitter_misc":{"Est. 
reading time":"9 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/","url":"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/","name":"Vision and Discrete Sensors - Control Engineering","isPartOf":{"@id":"https:\/\/www.controleng.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/#primaryimage"},"image":{"@id":"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/#primaryimage"},"thumbnailUrl":"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/11\/CTL2207_WEB_IMG_A3_Zebra_MachineVision.jpg","datePublished":"2023-02-22T22:40:59+00:00","dateModified":"2024-12-06T21:05:04+00:00","breadcrumb":{"@id":"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.controleng.com\/vision-and-discrete-sensors\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/#primaryimage","url":"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/11\/CTL2207_WEB_IMG_A3_Zebra_MachineVision.jpg","contentUrl":"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/11\/CTL2207_WEB_IMG_A3_Zebra_MachineVision.jpg","width":1000,"height":668},{"@type":"BreadcrumbList","@id":"https:\/\/www.controleng.com\/vision-and-discrete-sensors\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.controleng.com\/"},{"@type":"ListItem","position":2,"name":"Vision and Discrete Sensors"}]},{"@type":"WebSite","@id":"https:\/\/www.controleng.com\/#website","url":"https:\/\/www.controleng.com\/","name":"Control Engineering","description":"Control Engineering covers and educates about automation, control and instrumentation 
technologies","publisher":{"@id":"https:\/\/www.controleng.com\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.controleng.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.controleng.com\/#organization","name":"Control Engineering","url":"https:\/\/www.controleng.com\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.controleng.com\/#\/schema\/logo\/image\/","url":"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/12\/ce_logo.png","contentUrl":"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/12\/ce_logo.png","width":300,"height":93,"caption":"Control Engineering"},"image":{"@id":"https:\/\/www.controleng.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/ControlEngineeringMagazine","https:\/\/x.com\/controlengtips","https:\/\/www.linkedin.com\/company\/control-engineering-magazine\/"]}]}},"_links":{"self":[{"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/pages\/125136","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/comments?post=125136"}],"version-history":[{"count":0,"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/pages\/125136\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/media\/118173"}],"wp:attachment":[{"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/media?parent=125136"}],"wp:term":[{"taxonomy":"wtwh-gf-sponsor","embeddable":true,"href":"https:\/\/www.controleng.com\/wp-json
\/wp\/v2\/wtwh-gf-sponsor?post=125136"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}