{"id":131012,"date":"2025-05-23T09:35:24","date_gmt":"2025-05-23T14:35:24","guid":{"rendered":"https:\/\/www.controleng.com\/?p=131012"},"modified":"2025-05-23T09:35:25","modified_gmt":"2025-05-23T14:35:25","slug":"transformative-robot-tech-uses-touch-vision-and-sound","status":"publish","type":"post","link":"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/","title":{"rendered":"Transformative robot tech uses touch, vision, and sound"},"content":{"rendered":"\n<p>Sensory input\u2014including touch, smell, hearing, and balance\u2014helps the brain interpret and respond to environmental conditions. These functions are essential for navigating seemingly simple settings, such as walking on uneven terrain.<\/p>\n\n\n\n<p>Perception of the canopy overhead provides visual cues that help with navigation. The sound of snapping branches or the feel of moss underfoot offers information about ground stability. Noises from falling trees or branches moving in strong winds may indicate nearby hazards.<\/p>\n\n\n\n<p>Robots typically rely on visual inputs like cameras or lidar for navigation. Multisensory navigation remains a complex task for machines. Forests, with dense vegetation, physical obstacles, and irregular terrain, present significant challenges for conventional robotic systems.<\/p>\n\n\n\n<p>Researchers from Duke University have developed a framework named&nbsp;WildFusion&nbsp;that combines visual, tactile, and vibrational data to support robotic navigation in outdoor environments. The work was recently accepted to the IEEE International Conference on Robotics and Automation (ICRA 2025), which will take place May 19-23, 2025, in Atlanta, Georgia.<\/p>\n\n\n\n<p>\u201cWildFusion&nbsp;opens a new chapter in robotic navigation and 3D mapping,\u201d said Boyuan Chen, the Dickinson Family Assistant Professor of Mechanical Engineering and Materials Science, Electrical and Computer Engineering, and Computer Science at Duke University. 
\u201cIt helps robots to operate more confidently in unstructured, unpredictable environments like forests, disaster zones and off-road terrain.\u201d<\/p>\n\n\n\n<p>&#8220;Typical robots rely heavily on vision or LiDAR alone, which often falter without clear paths or predictable landmarks,&#8221; added Yanbaihui Liu, the lead student author and a second-year Ph.D. student in Chen\u2019s lab. \u201cEven advanced 3D mapping methods struggle to reconstruct a continuous map when sensor data is sparse, noisy or incomplete, which is a frequent problem in unstructured outdoor environments. That\u2019s exactly the challenge&nbsp;WildFusion&nbsp;was designed to solve.\u201d<\/p>\n\n\n\n<p>WildFusion,&nbsp;built on a quadruped robot, combines several sensors, including an RGB camera, LiDAR, inertial sensors, contact microphones and tactile sensors. The camera and LiDAR record the environment\u2019s geometry, color and distance. WildFusion&nbsp;also uses acoustic vibrations and touch.<\/p>\n\n\n\n<p>As the robot moves, contact microphones record vibrations produced by each step, helping distinguish surface types such as dry leaves or mud. Tactile sensors measure the force applied to each foot to detect surface stability. These inputs are used alongside an inertial sensor that tracks acceleration and body motion, such as tilt and roll, while moving over uneven terrain.<\/p>\n\n\n\n<p>Each type of sensory data is processed through specialized encoders and combined into a single, comprehensive representation. The core component of&nbsp;WildFusion&nbsp;is a deep learning model based on implicit neural representations. 
Unlike traditional methods that treat the environment as discrete points, this approach models surfaces and features continuously, allowing the robot to make more informed decisions about foot placement, even with limited or uncertain visual input.<\/p>\n\n\n\n<p>\u201cThink of it like solving a puzzle where some pieces are missing, yet you&#8217;re able to intuitively imagine the complete picture,\u201d explained Chen. \u201cWildFusion\u2019s multimodal approach lets the robot \u2018fill in the blanks\u2019 when sensor data is sparse or noisy, much like what humans do.\u201d<\/p>\n\n\n\n<p>WildFusion&nbsp;was tested at the Eno River State Park in North Carolina near Duke\u2019s campus, where it enabled a robot to navigate environments including forests, grasslands and gravel paths. \u201cWatching the robot confidently navigate terrain was incredibly rewarding,\u201d Liu shared. \u201cThese real-world tests proved&nbsp;WildFusion\u2019s remarkable ability to accurately predict traversability, significantly improving the robot\u2019s decision-making on safe paths through challenging terrain.\u201d<\/p>\n\n\n\n<p>The team plans to expand the system by adding sensors such as thermal or humidity detectors to improve how the robot responds to different environmental conditions. Its modular design supports applications beyond forests, including disaster response in different terrain, inspection of remote infrastructure and navigation in unstructured areas.<\/p>\n\n\n\n<p>This research was supported by DARPA (HR00112490419, HR00112490372) and the Army Research Laboratory (W911NF2320182, W911NF2220113).<\/p>\n\n\n\n<p>\u201cWildFusion: Multimodal Implicit 3D Reconstructions in the Wild.\u201d Yanbaihui Liu and Boyuan Chen. 
IEEE International Conference on Robotics and Automation (ICRA 2025)<\/p>\n\n\n\n<p>Project Website:&nbsp;<a href=\"http:\/\/generalroboticslab.com\/WildFusion\" target=\"_blank\" rel=\"noreferrer noopener\">http:\/\/generalroboticslab.com\/WildFusion<\/a><\/p>\n\n\n\n<p>General Robotics Lab Website:&nbsp;<a href=\"http:\/\/generalroboticslab.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">http:\/\/generalroboticslab.com<\/a><\/p>\n\n\n\n<p>Edited by Puja Mitra, WTWH Media, for <em>Control Engineering<\/em>, from a <a href=\"https:\/\/duke.edu\/\" target=\"_blank\" rel=\"noreferrer noopener\">Duke University<\/a> news release.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Duke University researchers developed WildFusion, a framework combining visual, tactile, and vibrational data for outdoor robot navigation.<\/p>\n","protected":false},"author":679,"featured_media":131013,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"pgc_sgb_lightbox_settings":"","footnotes":""},"categories":[104149],"tags":[109341,109340,110171,110267,109513,110182],"tracking-metrics":[],"display-location":[109353],"class_list":{"2":"type-post"},"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v25.9 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Transformative robot tech uses touch, vision, and sound - Control Engineering<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Transformative robot tech uses touch, vision, and sound - Control Engineering\" \/>\n<meta property=\"og:description\" content=\"Duke University researchers 
developed WildFusion, a framework combining visual, tactile, and vibrational data for outdoor robot navigation.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/\" \/>\n<meta property=\"og:site_name\" content=\"Control Engineering\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/ControlEngineeringMagazine\" \/>\n<meta property=\"article:published_time\" content=\"2025-05-23T14:35:24+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-05-23T14:35:25+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.controleng.com\/wp-content\/uploads\/2025\/05\/wildfusion-terrain.gif\" \/>\n\t<meta property=\"og:image:width\" content=\"640\" \/>\n\t<meta property=\"og:image:height\" content=\"360\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/gif\" \/>\n<meta name=\"author\" content=\"Duke University\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@controlengtips\" \/>\n<meta name=\"twitter:site\" content=\"@controlengtips\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Duke University\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"4 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/\"},\"author\":{\"name\":\"Duke University\",\"@id\":\"https:\/\/www.controleng.com\/#\/schema\/person\/3a60ffb5fa9b8394ed3d125fa8d8d1ec\"},\"headline\":\"Transformative robot tech uses touch, vision, and sound\",\"datePublished\":\"2025-05-23T14:35:24+00:00\",\"dateModified\":\"2025-05-23T14:35:25+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/\"},\"wordCount\":696,\"publisher\":{\"@id\":\"https:\/\/www.controleng.com\/#organization\"},\"image\":{\"@id\":\"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.controleng.com\/wp-content\/uploads\/2025\/05\/wildfusion-terrain.gif\",\"keywords\":[\"control engineer\",\"control engineering\",\"engineering\",\"LiDAR\",\"robotics\",\"sensor\"],\"articleSection\":[\"Robotics\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/\",\"url\":\"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/\",\"name\":\"Transformative robot tech uses touch, vision, and sound - Control 
Engineering\",\"isPartOf\":{\"@id\":\"https:\/\/www.controleng.com\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.controleng.com\/wp-content\/uploads\/2025\/05\/wildfusion-terrain.gif\",\"datePublished\":\"2025-05-23T14:35:24+00:00\",\"dateModified\":\"2025-05-23T14:35:25+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/#primaryimage\",\"url\":\"https:\/\/www.controleng.com\/wp-content\/uploads\/2025\/05\/wildfusion-terrain.gif\",\"contentUrl\":\"https:\/\/www.controleng.com\/wp-content\/uploads\/2025\/05\/wildfusion-terrain.gif\",\"width\":640,\"height\":360,\"caption\":\"WildFusion uses a combination of sight, touch, sound and balance to help four-legged robots better navigate difficult terrain like dense forests. 
(image source Duke University)\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.controleng.com\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Transformative robot tech uses touch, vision, and sound\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.controleng.com\/#website\",\"url\":\"https:\/\/www.controleng.com\/\",\"name\":\"Control Engineering\",\"description\":\"Control Engineering covers and educates about automation, control and instrumentation technologies\",\"publisher\":{\"@id\":\"https:\/\/www.controleng.com\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.controleng.com\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/www.controleng.com\/#organization\",\"name\":\"Control Engineering\",\"url\":\"https:\/\/www.controleng.com\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.controleng.com\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/12\/ce_logo.png\",\"contentUrl\":\"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/12\/ce_logo.png\",\"width\":300,\"height\":93,\"caption\":\"Control Engineering\"},\"image\":{\"@id\":\"https:\/\/www.controleng.com\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.facebook.com\/ControlEngineeringMagazine\",\"https:\/\/x.com\/controlengtips\",\"https:\/\/www.linkedin.com\/company\/control-engineering-magazine\/\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.controleng.com\/#\/schema\/person\/3a60ffb5fa9b8394ed3d125fa8d8d1ec\",\"name\":\"Duke 
University\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.controleng.com\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/2b8cfb10aad4e0d53da9c0bceb84ef3999364eee86d1fb0fb859f31518279647?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/2b8cfb10aad4e0d53da9c0bceb84ef3999364eee86d1fb0fb859f31518279647?s=96&d=mm&r=g\",\"caption\":\"Duke University\"},\"url\":\"https:\/\/www.controleng.com\/author\/duke-university\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Transformative robot tech uses touch, vision, and sound - Control Engineering","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/","og_locale":"en_US","og_type":"article","og_title":"Transformative robot tech uses touch, vision, and sound - Control Engineering","og_description":"Duke University researchers developed WildFusion, a framework combining visual, tactile, and vibrational data for outdoor robot navigation.","og_url":"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/","og_site_name":"Control Engineering","article_publisher":"https:\/\/www.facebook.com\/ControlEngineeringMagazine","article_published_time":"2025-05-23T14:35:24+00:00","article_modified_time":"2025-05-23T14:35:25+00:00","og_image":[{"url":"https:\/\/www.controleng.com\/wp-content\/uploads\/2025\/05\/wildfusion-terrain.gif","width":640,"height":360,"type":"image\/gif"}],"author":"Duke University","twitter_card":"summary_large_image","twitter_creator":"@controlengtips","twitter_site":"@controlengtips","twitter_misc":{"Written by":"Duke University","Est. 
reading time":"4 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/#article","isPartOf":{"@id":"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/"},"author":{"name":"Duke University","@id":"https:\/\/www.controleng.com\/#\/schema\/person\/3a60ffb5fa9b8394ed3d125fa8d8d1ec"},"headline":"Transformative robot tech uses touch, vision, and sound","datePublished":"2025-05-23T14:35:24+00:00","dateModified":"2025-05-23T14:35:25+00:00","mainEntityOfPage":{"@id":"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/"},"wordCount":696,"publisher":{"@id":"https:\/\/www.controleng.com\/#organization"},"image":{"@id":"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/#primaryimage"},"thumbnailUrl":"https:\/\/www.controleng.com\/wp-content\/uploads\/2025\/05\/wildfusion-terrain.gif","keywords":["control engineer","control engineering","engineering","LiDAR","robotics","sensor"],"articleSection":["Robotics"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/","url":"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/","name":"Transformative robot tech uses touch, vision, and sound - Control 
Engineering","isPartOf":{"@id":"https:\/\/www.controleng.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/#primaryimage"},"image":{"@id":"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/#primaryimage"},"thumbnailUrl":"https:\/\/www.controleng.com\/wp-content\/uploads\/2025\/05\/wildfusion-terrain.gif","datePublished":"2025-05-23T14:35:24+00:00","dateModified":"2025-05-23T14:35:25+00:00","breadcrumb":{"@id":"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/#primaryimage","url":"https:\/\/www.controleng.com\/wp-content\/uploads\/2025\/05\/wildfusion-terrain.gif","contentUrl":"https:\/\/www.controleng.com\/wp-content\/uploads\/2025\/05\/wildfusion-terrain.gif","width":640,"height":360,"caption":"WildFusion uses a combination of sight, touch, sound and balance to help four-legged robots better navigate difficult terrain like dense forests. 
(image source Duke University)"},{"@type":"BreadcrumbList","@id":"https:\/\/www.controleng.com\/transformative-robot-tech-uses-touch-vision-and-sound\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.controleng.com\/"},{"@type":"ListItem","position":2,"name":"Transformative robot tech uses touch, vision, and sound"}]},{"@type":"WebSite","@id":"https:\/\/www.controleng.com\/#website","url":"https:\/\/www.controleng.com\/","name":"Control Engineering","description":"Control Engineering covers and educates about automation, control and instrumentation technologies","publisher":{"@id":"https:\/\/www.controleng.com\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.controleng.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.controleng.com\/#organization","name":"Control Engineering","url":"https:\/\/www.controleng.com\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.controleng.com\/#\/schema\/logo\/image\/","url":"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/12\/ce_logo.png","contentUrl":"https:\/\/www.controleng.com\/wp-content\/uploads\/2024\/12\/ce_logo.png","width":300,"height":93,"caption":"Control Engineering"},"image":{"@id":"https:\/\/www.controleng.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/ControlEngineeringMagazine","https:\/\/x.com\/controlengtips","https:\/\/www.linkedin.com\/company\/control-engineering-magazine\/"]},{"@type":"Person","@id":"https:\/\/www.controleng.com\/#\/schema\/person\/3a60ffb5fa9b8394ed3d125fa8d8d1ec","name":"Duke 
University","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.controleng.com\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/2b8cfb10aad4e0d53da9c0bceb84ef3999364eee86d1fb0fb859f31518279647?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/2b8cfb10aad4e0d53da9c0bceb84ef3999364eee86d1fb0fb859f31518279647?s=96&d=mm&r=g","caption":"Duke University"},"url":"https:\/\/www.controleng.com\/author\/duke-university\/"}]}},"_links":{"self":[{"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/posts\/131012","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/users\/679"}],"replies":[{"embeddable":true,"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/comments?post=131012"}],"version-history":[{"count":0,"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/posts\/131012\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/media\/131013"}],"wp:attachment":[{"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/media?parent=131012"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/categories?post=131012"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/tags?post=131012"},{"taxonomy":"tracking-metric","embeddable":true,"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/tracking-metrics?post=131012"},{"taxonomy":"display-location","embeddable":true,"href":"https:\/\/www.controleng.com\/wp-json\/wp\/v2\/display-location?post=131012"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}