While a large number of teams used MoveIt!, an integrated
motion planning and visualization framework, none of the
top three performers used such software. This may suggest
that, as with perception software, prepackaged toolkits for
these complex behaviors help teams get started rapidly [18]
but do not make it equally easy to access and improve
lower-level functionality.
Generally, MoveIt! and other prepackaged motion planning
software solutions share three problems: 1) the robot is not
allowed to exploit contact, 2) uncertainty is not taken into
account during planning, and 3) sensor-based feedback is
not straightforward to incorporate. This approach stands in
contrast to the winning team’s architecture, which consisted
of a hybrid automaton connecting a variety of feedback
controllers [39] through event-based state transitions. Here, sensors included
object position provided by the camera, contact via pressure
sensors, and actual torques. Since motion planning appears
important in general in geometrically more complex scenes
to navigate around obstacles, a potentially important topic
of future research is how to better integrate planning with
feedback to make up for inaccurate sensing and actuation.
There currently exist no high-level tools that combine these
approaches in a user-friendly way. The community would
greatly benefit from manipulation planning tools that better
support reasoning over contacts, sensor-based feedback, and
uncertainty.
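The hybrid-automaton pattern described above can be sketched as a small state machine in which each discrete mode runs a feedback controller until a sensor-derived event triggers a transition. The mode names, controllers, and event thresholds below are illustrative assumptions for exposition, not the winning team's actual implementation:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

@dataclass
class Mode:
    """One discrete state: a feedback controller plus event-based transitions."""
    controller: Callable[[dict], float]                        # sensor readings -> actuation command
    transitions: Dict[str, str] = field(default_factory=dict)  # event name -> next mode name

def detect_events(sensors: dict) -> List[str]:
    """Map raw sensor readings to discrete events (thresholds are illustrative)."""
    events = []
    if sensors.get("pressure", 0.0) > 0.5:       # contact detected via pressure sensor
        events.append("contact")
    if sensors.get("object_dist", 1.0) < 0.05:   # camera-based object position estimate
        events.append("near_object")
    return events

def step(mode_name: str, modes: Dict[str, Mode], sensors: dict) -> Tuple[str, float]:
    """Run one control cycle: compute the command, then take any enabled transition."""
    mode = modes[mode_name]
    command = mode.controller(sensors)
    for event in detect_events(sensors):
        if event in mode.transitions:
            return mode.transitions[event], command
    return mode_name, command

# Illustrative automaton: approach the object, then close the gripper on contact.
modes = {
    "approach": Mode(controller=lambda s: -2.0 * s.get("object_dist", 1.0),  # P-control on distance
                     transitions={"near_object": "grasp"}),
    "grasp":    Mode(controller=lambda s: 1.0,                               # constant closing command
                     transitions={"contact": "hold"}),
    "hold":     Mode(controller=lambda s: 0.2),                              # maintain grip force
}

mode, cmd = step("approach", modes, {"object_dist": 0.04})
# the "near_object" event fires, so the automaton transitions to "grasp"
```

The key design point, in contrast with open-loop plan execution, is that transitions are driven by sensed events (contact, proximity) rather than by a precomputed trajectory completing, which is what lets such an architecture absorb inaccurate sensing and actuation.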