Analyzing Sketched Route Maps

Being able to interact and communicate with robots in the same way we interact with people has long been a goal of AI and robotics researchers. Much robotics research in the past has emphasized the goal of achieving fully autonomous agents. In our research, we are less concerned with creating autonomous robots that can plan and reason about tasks; instead, we view robots as semi-autonomous tools that assist a human user. The robot may have some perception capabilities, reactive behaviors, and enough reasoning ability to handle an unstructured and dynamic environment, but the user supplies the difficult, high-level strategic planning. The goal of this work, then, is to create a robot interface that allows a user (even a novice user) to guide, control, and/or program a mobile robot to perform a task.

What could be more natural than sketching a map?

As one strategy for addressing this goal, we have been investigating the use of hand-drawn route maps, in which the user sketches an approximate representation of the environment and then sketches the desired robot trajectory with respect to that environment. The objective of the sketch interface is to extract spatial information about the map and a qualitative path through the landmarks drawn on the sketch. This information is used to build a task representation for the robot, which operates as a semi-autonomous vehicle. Note that the task representation is based on sensing and relative position, not absolute position. Spatial modeling is accomplished using the Histogram of Forces [1], [2].
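
To make the spatial-modeling step concrete, the sketch below is a deliberately simplified stand-in: a plain histogram of angles over point pairs, scored with triangular fuzzy memberships, rather than the actual Histogram of Forces of [1], [2]. The function names, bin count, and membership widths are illustrative assumptions, not part of the published method.

```python
import numpy as np

def angle_histogram(a_pts, b_pts, nbins=72):
    """Normalized histogram of the angles of all vectors from points of B
    to points of A (a crude stand-in for the F-histogram of [1], [2])."""
    a = np.asarray(a_pts, dtype=float)
    b = np.asarray(b_pts, dtype=float)
    d = a[:, None, :] - b[None, :, :]                 # all pairwise B -> A vectors
    ang = np.arctan2(d[..., 1], d[..., 0]).ravel()    # angles in (-pi, pi]
    hist, edges = np.histogram(ang, bins=nbins, range=(-np.pi, np.pi))
    return hist / hist.sum(), edges

def directional_scores(hist, edges):
    """Fuzzy degree to which A is right of / above / left of / below B,
    using triangular memberships centered on the four cardinal angles.
    Axes follow the math convention (y up); swap 'above' and 'below'
    for image coordinates where y grows downward."""
    centers = (edges[:-1] + edges[1:]) / 2
    def support(ref):
        diff = np.abs(np.angle(np.exp(1j * (centers - ref))))   # wrapped distance to ref
        mu = np.clip(1 - diff / (np.pi / 2), 0.0, 1.0)
        return float(np.sum(mu * hist))
    return {"right of": support(0.0),
            "above":    support(np.pi / 2),
            "left of":  support(np.pi),
            "below":    support(-np.pi / 2)}
```

Applied to two sketched landmark blobs, the largest score yields a linguistic relation such as "mostly to the right of", which is the kind of relative, sensing-friendly information the task representation relies on.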

Figure: a route map sketched on a PDA; the robot path is shown in red.
Animation: the spatial route information extracted from the sketch.
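
Downstream of the spatial relations, the sketched route itself has to become a sequence of qualitative states that the robot can match against its sensing while driving. The fragment below is a minimal sketch of that idea, assuming the path is a polyline and each landmark is a labeled set of points; the sector labels, sampling stride, and the name qualitative_route are illustrative assumptions, not the representation used in [3], [4].

```python
import numpy as np

def qualitative_route(path_pts, landmarks, stride=5):
    """Reduce a sketched path to a sequence of qualitative states.
    At sampled path points, each landmark is classified as ahead / left /
    behind / right of the robot's current heading, and a new state is kept
    only when that description changes (y is assumed to grow upward)."""
    path = np.asarray(path_pts, dtype=float)
    sectors = ["ahead", "left", "behind", "right"]
    states, last = [], None
    for i in range(0, len(path) - 1, stride):
        pos, nxt = path[i], path[i + 1]
        dx, dy = nxt - pos
        heading = np.arctan2(dy, dx)                  # robot heading along the path
        snapshot = {}
        for name, pts in landmarks.items():
            centroid = np.asarray(pts, dtype=float).mean(axis=0)
            vx, vy = centroid - pos
            bearing = np.arctan2(vy, vx) - heading    # landmark bearing in robot frame
            bearing = (bearing + np.pi) % (2 * np.pi) - np.pi
            snapshot[name] = sectors[int(((bearing + np.pi / 4) % (2 * np.pi)) // (np.pi / 2))]
        if snapshot != last:                          # record only qualitative changes
            states.append(snapshot)
            last = snapshot
    return states

# e.g. [{'tree': 'left', 'gate': 'ahead'}, {'tree': 'behind', 'gate': 'right'}, ...]
```

Because each state is expressed in the robot's own frame, it can be matched against the landmarks the robot actually senses, which is what lets the task representation depend on relative rather than absolute position.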

Possible applications include the following:

  • Military applications, where the user looks at a scene and sketches a route through landmarks or programs site-specific strategic behaviors, such as how to search or how to escape
  • Programming large construction or mining equipment
  • Guiding planetary rovers
  • Personal robots

For more information on the techniques used, see references [3], [4], and [5], or contact Prof. Skubic at skubicm@missouri.edu.

References
1. P. Matsakis and L. Wendling, "A New Way to Represent the Relative Position between Areal Objects", IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 21, no. 7, pp. 634-643, 1999.
2. P. Matsakis, J. Keller, L. Wendling, J. Marjamaa, and O. Sjahputera, "Linguistic Description of Relative Positions in Images", IEEE Trans. on Systems, Man, and Cybernetics, Part B, vol. 31, no. 4, pp. 573-588, 2001.
3. M. Skubic, P. Matsakis, B. Forrester, and G. Chronis, "Extracting Navigation States from a Hand-Drawn Map", in Proceedings of the 2001 IEEE International Conference on Robotics and Automation, Seoul, Korea, May 2001.
4. M. Skubic, S. Blisard, A. Carle, and P. Matsakis, "Hand-Drawn Maps for Robot Navigation", AAAI 2002 Spring Symposium on Sketch Understanding, Stanford University, March 2002 (Technical Report SS-02-08).
5. J. Keller, P. Matsakis, and M. Skubic, "Beyond 2001: The Linguistic Spatial Odyssey", to appear as a book chapter in Computational Intelligence Beyond 2001: Real and Imagined, C. Robinson, ed., Wiley, 2002. Also presented by J. Keller as a plenary address at the World Congress on Computational Intelligence, Honolulu, Hawaii, May 2002.

Funded by the Naval Research Lab

