Nowadays, aerial robots of very limited cost can be equipped with a sensing suite that may contain visible-light cameras, thermal imaging, Light Detection and Ranging sensors and more, allowing them to inspect structures and targets of interest in possibly cluttered, challenging and previously unknown environments.
At the same time, modern AR devices are miniaturized, low-cost systems which provide live, high-quality direct views of the real-world environment, fused and augmented with computer-generated sensory input such as sound, video, graphics, GPS data and more.
Market forecasts indicate that within the next few years, AR devices will become prominent among computer users and will be employed for a variety of use cases including gaming, social interaction and more.
Within this work, we investigate the potential that arises when automated robotic inspection is combined with augmented reality technologies. Through this combination, the path planning capabilities and cognitive understanding of the robot are shared with the human, while at the same time human intelligence empowers the robot's reactive path planning through head motion-based intuitive teleoperation.
Within the scope of this work, the problem of Augmented Reality-enhanced structural inspection path planning is considered in a bounded environment that may contain obstacle regions.
The structural inspection planner is utilized to derive the initial inspection path, while during the operation a human operator uses the head-mounted AR interface to keep track of the camera data collected by the robot as well as of the progress of the online 3D reconstruction.
Fig. 5 illustrates the path computed by the robot as well as the human operator commands provided based on the data of the AR interface and the tracked head motion, which is then mapped to lateral, vertical and yaw deviations of the reference trajectory of the robot.
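As a rough illustration of this mapping step, the sketch below converts tracked head angles into lateral, vertical and yaw offsets applied around a reference point. The gains, deadband value and axis assignments are assumptions made for illustration only, not the parameters of the actual system:

```python
import math
from dataclasses import dataclass


@dataclass
class ReferencePoint:
    x: float
    y: float
    z: float
    yaw: float


def head_motion_to_deviation(head_pitch, head_yaw, head_roll,
                             k_lat=0.5, k_vert=0.5, k_yaw=1.0,
                             deadband=0.05):
    """Map tracked head angles (rad) to path deviations.

    Angles inside the deadband are ignored to suppress sensor jitter;
    gains and axis assignments are illustrative assumptions.
    """
    def db(angle):
        return 0.0 if abs(angle) < deadband else angle

    d_lat = k_lat * db(head_roll)      # head roll  -> lateral offset (m)
    d_vert = -k_vert * db(head_pitch)  # head pitch -> vertical offset (m)
    d_yaw = k_yaw * db(head_yaw)       # head yaw   -> heading offset (rad)
    return d_lat, d_vert, d_yaw


def apply_deviation(ref, d_lat, d_vert, d_yaw):
    """Shift a reference point laterally in its own heading frame."""
    return ReferencePoint(
        x=ref.x - d_lat * math.sin(ref.yaw),
        y=ref.y + d_lat * math.cos(ref.yaw),
        z=ref.z + d_vert,
        yaw=ref.yaw + d_yaw,
    )
```

A deadband of this kind is a common teleoperation design choice: without it, small involuntary head movements would continuously perturb the trajectory.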
The operator can thus direct attention to areas of interest and other important features. The method employs a previously proposed structural inspection planner, which derives the initial inspection trajectory given any prior knowledge of the environment. A live feed of the stereo camera frames, combined with the real-time 3D reconstruction of the environment, is provided to the human operator via the Augmented Reality interface. Head motion may then be employed to adjust the path of the robot and collect the required viewpoints. The overall framework is demonstrated with the use of an aerial robot capable of GPS-denied navigation and online mapping of its environment.
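The steps described above (initial planning, AR feedback to the operator, head-motion adjustment of each reference point) can be sketched as a simple control loop. All class and method names below are hypothetical stand-ins used to show the data flow, not the actual system's API:

```python
class StructuralInspectionPlanner:
    """Stub planner: returns a fixed list of viewpoints (x, y, z, yaw).

    A real planner would compute these from a prior model of the structure.
    """
    def compute_path(self, prior_model):
        return [(0.0, 0.0, 1.0, 0.0), (1.0, 0.0, 1.0, 0.0)]


class ARInterface:
    """Stub AR interface: replays recorded head-motion deviations."""
    def __init__(self, deviations):
        self._deviations = iter(deviations)

    def show(self, frames, reconstruction):
        pass  # would render stereo frames + 3D reconstruction on the headset

    def head_deviation(self):
        # (lateral, vertical, yaw) offset; zero once the recording ends
        return next(self._deviations, (0.0, 0.0, 0.0))


def run_mission(planner, ar, prior_model=None):
    """Follow the planned path, adjusting each point by the operator input."""
    executed = []
    for x, y, z, yaw in planner.compute_path(prior_model):
        ar.show(frames=None, reconstruction=None)
        d_lat, d_vert, d_yaw = ar.head_deviation()
        executed.append((x, y + d_lat, z + d_vert, yaw + d_yaw))
    return executed
```

In the real system the loop would run at the trajectory-tracking rate and the deviations would come from live head tracking rather than a recording; the sketch only shows how operator input composes with the planned references.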