The visual control of locomotion

Our interests in the visual control of locomotion span several levels, ranging from the perception of affordances and the selection of safe and efficient routes, to the use of visual information (e.g., optic flow) to continuously guide locomotion. We focus on tasks such as avoiding collisions with stationary and moving obstacles, intercepting moving targets, and guiding foot placement when walking over rough terrain – all of which are characterized by a tight coupling of perception and action. One aim of our research is to identify the visual information and control strategies that make it possible to perform such tasks with stability and efficiency. Another aim is to understand the remarkable flexibility that is needed to control locomotion in the presence of real-world variability that affects the dynamics of the body and the environment. Our research contributes to a basic understanding of how locomotion is guided by perception and is supported by the National Institutes of Health and the National Science Foundation. The findings will help to better anticipate the complex behavioral consequences of impairments that affect locomotion, and may also inspire new ideas about how to design robots to navigate complex environments.
The role of affordance perception in selecting actions and guiding locomotion

When people navigate through complex, dynamic environments, they select actions and guide locomotion in ways that take into account their body dimensions and movement capabilities. For example, when stepping off a curb, a pedestrian may need to decide whether to cross now, ahead of an approaching vehicle, or wait until it passes. Similarly, a child playing a game of tag may need to decide whether to go to the left or right around a stationary obstacle to intercept another player. In such situations, the possible actions (i.e., the affordances) are partly determined by the person’s body dimensions and locomotor capabilities. If people were unable to perceive affordances, they would sometimes choose actions that are beyond their capabilities and therefore have no chance of succeeding, and at other times fail to choose beneficial actions that are within their capabilities. In this project, we are studying affordance perception and its role in selecting actions and guiding locomotion.
Learning, adaptation, and the visual control of locomotion

The dynamics and dimensions of our bodies are not fixed. People grow, gain and lose weight, wear clothing that restricts movement, and carry backpacks that alter the distribution of mass. People’s movement capabilities are affected by factors such as injury, disease, aging, and neurological damage, all of which have non-trivial consequences for the control of action. Each of these factors affects (oftentimes in complex ways) the energetic and mechanical costs and performance limits associated with moving. If people were unable to adapt to these changes, their ability to select appropriate actions and to safely and efficiently guide locomotion would be significantly compromised. The aim of this project is to investigate how people adapt to changes in the dynamics and dimensions of their bodies or the systems that they control.
The rough terrain problem: Guiding foot placement over rough terrain

The ability to modulate gait to step over obstacles and land on safe footholds is what allows humans and other animals to navigate terrain that would otherwise be inaccessible (e.g., by means other than legged locomotion, such as a vehicle or wheelchair). In this project, we are exploring how visual information is used to walk over rough terrain with irregularly spaced safe footholds. We developed a novel experimental paradigm in which subjects walk over an array of randomly distributed virtual obstacles that are projected onto the floor by an LCD projector while their movements are recorded using a full-body motion capture system. This setup allows us to synchronize the appearance of obstacles with the movement of the subject. For example, in one experiment, obstacles appear only when they lie within a moving window of visibility centered on the subject. By manipulating the size of the window of visibility, we can measure how far along the future path visual information is needed to control foot placement.
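The core of the window-of-visibility manipulation can be sketched in a few lines: on each rendering frame, an obstacle is shown only if it falls within a fixed radius of the walker's tracked position. The following Python sketch is illustrative only; the function and parameter names are our own, not those of the lab's actual experimental software.

```python
import numpy as np

def visible_obstacles(obstacle_positions, walker_position, window_radius):
    """Return a boolean mask marking which obstacles fall inside the
    moving window of visibility centered on the walker.

    obstacle_positions : (N, 2) array of floor-plane coordinates (meters)
    walker_position    : (2,) array, current tracked position of the walker
    window_radius      : radius of the visibility window (meters)
    """
    # Euclidean distance from the walker to each obstacle
    distances = np.linalg.norm(obstacle_positions - walker_position, axis=1)
    return distances <= window_radius

# Example: three virtual obstacles projected on the floor
obstacles = np.array([[0.5, 0.0], [1.5, 0.2], [3.0, -0.5]])
walker = np.array([0.0, 0.0])

mask = visible_obstacles(obstacles, walker, window_radius=2.0)
print(mask)  # only obstacles within 2 m of the walker would be rendered
```

Re-evaluating the mask on every motion-capture frame yields the synchronized, walker-contingent display described above; shrinking `window_radius` across conditions restricts how far ahead along the path obstacle information is available.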
The behavioral dynamics of steering, obstacle avoidance, and route selection

Information about this project will be added soon.