Using Spatial Language for Human-Robot Communications

In conversation, people often use spatial relationships to describe their environment, e.g., “There is a desk in front of me and a doorway behind it”, and to issue directives, e.g., “Go around the desk and through the doorway”. Recent cognitive models suggest that people use these types of relative spatial concepts to perform day-to-day navigation tasks and other spatial reasoning [1,2], which may explain why spatial language is so prevalent and how it developed. In our research, we have been investigating the use of spatial relationships to establish a natural communication mechanism between people and robots; in particular, we are striving for an intuitive interface that novice users can easily understand.

In previous work, we developed two modes of communication that utilize spatial relationships. First, using sonar sensors on a mobile robot, we built a model of the environment and generated a spatial description of it [3]. Second, we explored the idea of sketching a map on a PDA as a means of communicating a navigation task to a robot [4]. The sketch, which represented an approximate map, was analyzed using spatial reasoning, and the navigation task was extracted as a sequence of spatial navigation states, sketched below. We also compared the results of these two modes in similar, though not identical, environments and found them to agree [5].
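As a toy illustration of the second mode, the structure extracted in [4] can be thought of as an ordered list of qualitative states. The encoding below is an illustrative assumption, not the paper's actual representation; the landmark names and relation strings are invented for the example.

    # Illustrative only: one possible encoding of "spatial navigation states"
    # extracted from a sketched route; names and relations are assumptions.
    from typing import NamedTuple

    class NavState(NamedTuple):
        landmark: str   # object recognized in the sketch
        relation: str   # qualitative relation the robot should observe

    # The directive "Go around the desk and through the doorway" might
    # compile to a sequence such as:
    route = [
        NavState("desk", "object on right"),      # desk comes up on the right
        NavState("desk", "object behind"),        # desk has been passed
        NavState("doorway", "object in front"),   # doorway now ahead
        NavState("doorway", "object behind"),     # doorway traversed
    ]

    for state in route:
        print(f"reach state: {state.landmark} -> {state.relation}")

The robot then satisfies each state in order, which is what lets an approximate, hand-drawn map drive navigation without metric accuracy.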

In our most recent work, robot spatial reasoning is combined with a multimodal robot interface developed at the Naval Research Laboratory (NRL). Spatial information is extracted from an evidence grid map in which information from multiple sensors is accumulated over time. A short-term map is built from all grid cells that show a reasonable probability of being occupied; this map is then filtered, processed, and segmented into environment objects. Using linguistic spatial terms, a high-level description is generated for the overall group of objects, along with a detailed description of each individual object. The spatial reasoning capabilities are implemented as a server, so a client can request a spatial description of the environment at any time. In addition to objects identified in the short-term sensor map, we have also created a second class of objects, which are given persistent locations in the map and labels provided by a user.
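The sketch below illustrates the shape of this pipeline under heavy simplification. The actual system accumulates an evidence grid from multiple sensors and generates its descriptions with the histogram of forces [3,6]; here, as a stand-in, a toy occupancy grid is thresholded, segmented into objects by connected components, and each object's centroid bearing is mapped to a linguistic direction term. The threshold value, the angular cutoffs, and the exact phrasing are all assumptions, not the implemented method.

    # Simplified sketch of the map-to-description pipeline; the real system
    # uses the histogram of forces [3,6], not centroid bearings.
    import math

    OCCUPIED = 0.5  # assumed probability cutoff for the short-term map

    def segment_objects(grid, threshold=OCCUPIED):
        """Group occupied cells into objects via 4-connected flood fill."""
        rows, cols = len(grid), len(grid[0])
        seen = [[False] * cols for _ in range(rows)]
        objects = []
        for r in range(rows):
            for c in range(cols):
                if grid[r][c] >= threshold and not seen[r][c]:
                    stack, cells = [(r, c)], []
                    seen[r][c] = True
                    while stack:
                        y, x = stack.pop()
                        cells.append((y, x))
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < rows and 0 <= nx < cols
                                    and grid[ny][nx] >= threshold
                                    and not seen[ny][nx]):
                                seen[ny][nx] = True
                                stack.append((ny, nx))
                    objects.append(cells)
        return objects

    def direction_term(rel_deg):
        """Map a relative bearing (0 = dead ahead, positive = left, by the
        convention chosen here) to a coarse linguistic direction."""
        a = abs(rel_deg)
        side = "left" if rel_deg > 0 else "right"
        if a <= 22.5:
            return "in front of me"
        if a <= 67.5:
            return f"mostly in front of me but somewhat to the {side}"
        if a <= 112.5:
            return f"mostly to the {side} of me"
        if a <= 157.5:
            return f"mostly behind me but somewhat to the {side}"
        return "behind me"

    def describe(cells, robot_yx, heading_deg):
        """Describe one object by the bearing of its centroid from the robot
        (standard math convention: angles counterclockwise from the x axis)."""
        cy = sum(y for y, _ in cells) / len(cells)
        cx = sum(x for _, x in cells) / len(cells)
        bearing = math.degrees(math.atan2(cy - robot_yx[0], cx - robot_yx[1]))
        rel = (bearing - heading_deg + 180) % 360 - 180  # wrap to [-180, 180)
        return direction_term(rel)

Distance terms such as “close” and “very close” could be produced the same way, by thresholding the range to each object's nearest cell; the histogram of forces replaces the centroid bearing with a treatment of the objects' full spatial extent.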

An Example:

DETAILED SPATIAL DESCRIPTIONS:
The #1 object is mostly to the left of me but somewhat forward (the description is satisfactory). The object is close.

The #2 object is mostly in front of me but somewhat to the left (the description is satisfactory). The object is close.

I am surrounded from the rear (surrounded by the #3 object). The object is very close.

The #4 object is mostly to the left of me but somewhat forward (the description is satisfactory). The object is close.

HIGH LEVEL DESCRIPTION:
There are objects on my front left. I am surrounded from the rear. The pillar is mostly in front of me.
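Because the spatial reasoner runs as a server, output like the example above can be requested by any client on demand. A minimal client sketch follows, assuming a plain TCP text interface; the host, port, and request verb are hypothetical placeholders, not the actual NRL/MU protocol.

    # Hypothetical client for the spatial description server; the port and
    # the "DESCRIBE" request verb are placeholders, not the real protocol.
    import socket

    def request_description(host="localhost", port=7777):
        """Open a connection, send one request, and return the reply text."""
        with socket.create_connection((host, port)) as sock:
            sock.sendall(b"DESCRIBE\n")
            return sock.recv(4096).decode()  # one read suffices for a sketch

    if __name__ == "__main__":
        print(request_description())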

MU Collaborators: Pascal Matsakis and Jim Keller

References
1. F.H. Previc, “The Neuropsychology of 3-D Space”, Psychological Bulletin, 1998, vol. 124, no. 2, pp. 123-164.
2. C. Schunn and T. Harrison, personal communication, 2001.
3. M. Skubic, G. Chronis, P. Matsakis and J. Keller, “Generating Linguistic Spatial Descriptions from Sonar Readings Using the Histogram of Forces”, in Proceedings of the IEEE 2001 International Conference on Robotics and Automation, May, 2001, Seoul, Korea, pp. 485-490.
4. M. Skubic, P. Matsakis, B. Forrester and G. Chronis, “Extracting Navigation States from a Hand-Drawn Map”, in Proceedings of the IEEE 2001 International Conference on Robotics and Automation, May, 2001, Seoul, Korea, pp. 259-264.
5. M. Skubic, G. Chronis, P. Matsakis and J. Keller. “Spatial Relations for Tactical Robot Navigation”, in Proceedings of the SPIE, Unmanned Ground Vehicle Technology III, April, 2001, Orlando, FL.
6. M. Skubic, P. Matsakis, G. Chronis and J. Keller, “Generating Multi-Level Linguistic Spatial Descriptions from Range Sensor Readings Using the Histogram of Forces”, submitted to Autonomous Robots.

Funded by the Office of Naval Research (ONR) and the Naval Research Laboratory.


skubicm@missouri.edu
October, 2001