
Haptics and Sight in Action: a new way to grasp the human brain

The availability of both visual and haptic information for a target object significantly improves reach-to-grasp actions, demonstrating that the nervous system utilizes both types of information to optimize movement execution.

by Ivan Camponogara, Postdoctoral Research Fellow, and Robert Volcic, Assistant Professor, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates. Both are authors of the original article.

Edited by Massimo Caine, Founder and Director

Published on Aug 28, 2019

In everyday life, actions are directed not only toward objects we see but also toward objects we already hold in one hand. For instance, even without vision, we can easily grasp the lid of a marmalade jar, open a bottle, or fill a candy box while holding these objects with the other hand. How do we do that? Every time we touch an object, receptors inside the muscles (so-called proprioceptors) become active. These inputs, together with those coming from the tactile receptors in the skin, form our "haptic" sense.

Thus, haptic information can tell us where a held object is (position) and how big it is (size), and so guide the other hand toward it. But are these actions as good as movements toward objects we see? And can we perform even better actions when we can simultaneously see and touch the object we are about to grasp? Unfortunately, only a handful of studies have investigated whether grasping haptically, visually, or visuo-haptically sensed objects differs. These studies usually found that when the object was only felt with the other hand, grasping was worse than when it was seen or both seen and felt. In contrast, when the object was both seen and felt with the other hand, actions were merely as good as when the object was only seen, which raises the question of why multiple sources of sensory information would not be exploited during grasping movements.

In our study, we shed some light on the role of haptic information in grasping by comparing the grasping behaviour of young healthy participants in three experimental conditions. In one, participants were blindfolded and asked to touch an object with their left hand and then grasp it with their right. In another, participants were only allowed to look at the object and then asked to grasp it with their right hand. In a third, participants were asked to both look at the object and touch it with their left hand before grasping it with their right. Hand and finger movements were recorded in real time using a motion capture system.
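To give a concrete (and purely illustrative) idea of how such recordings are turned into the measures discussed below, the following Python sketch computes two standard grasping measures, grip aperture and hand speed, from 3D fingertip trajectories. The marker layout, variable names, and 200 Hz sampling rate are our assumptions for this example, not the study's actual analysis pipeline.

```python
import numpy as np

def grasp_metrics(thumb, index, fs=200.0):
    """Illustrative analysis of one reach-to-grasp trial.

    thumb, index : (n_samples, 3) arrays of 3D marker positions in mm
                   for the thumb and index fingertip (hypothetical data).
    fs           : sampling rate in Hz (200 Hz is an assumption here).
    """
    # Grip aperture: distance between thumb and index markers over time.
    aperture = np.linalg.norm(index - thumb, axis=1)

    # Hand speed, approximated from the midpoint of the two fingertips.
    midpoint = (thumb + index) / 2.0
    velocity = np.gradient(midpoint, 1.0 / fs, axis=0)
    speed = np.linalg.norm(velocity, axis=1)

    return {
        "max_grip_aperture_mm": aperture.max(),  # smaller peak aperture = more confident grasp
        "peak_speed_mm_s": speed.max(),          # higher peak speed = faster reach
    }

# Example with fabricated data: a straight 600 ms reach sampled at 200 Hz.
t = np.linspace(0.0, 0.6, 120)[:, None]
thumb = np.hstack([300.0 * t, np.zeros_like(t), np.zeros_like(t)])
index = thumb + np.array([0.0, 60.0, 0.0])
print(grasp_metrics(thumb, index))
```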

We found that the simultaneous availability of both vision and haptics substantially improves reach-to-grasp actions, suggesting that haptics plays an important role in the constant calibration of our movements. When participants could see the object and feel it with the other hand, they performed faster movements and formed smaller, more appropriate grip apertures than when their actions were guided by either haptics or sight alone: a clear sign of increased movement confidence. Our findings also show that each modality contributes to a different extent in different phases of the movement, with haptics being more crucial in the initial phases and vision being more important for the final control. Thus, vision and haptics can be flexibly combined to optimize the execution of grasping movements.
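One standard way to formalize this kind of flexible combination, though not a model our study itself tests, is reliability-weighted (maximum-likelihood) cue integration, in which each sense is weighted by how reliable it is. The toy sketch below uses invented reliability numbers to show the key idea: the combined visuo-haptic estimate is more reliable than either sense alone.

```python
# Minimal sketch of reliability-weighted (maximum-likelihood) cue combination.
# The estimates and variances below are invented for illustration only.

def combine(est_v, var_v, est_h, var_h):
    """Combine a visual and a haptic size estimate, weighting each
    cue by its reliability (the inverse of its variance)."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_h)  # weight on vision
    w_h = 1 - w_v                                # weight on haptics
    est = w_v * est_v + w_h * est_h              # combined estimate
    var = 1 / (1 / var_v + 1 / var_h)            # combined variance is lower than either cue's
    return est, var

# Example: vision slightly more reliable than haptics for object size.
size, uncertainty = combine(est_v=50.0, var_v=4.0, est_h=54.0, var_h=9.0)
print(size, uncertainty)  # ~51.2 mm, variance ~2.77 (< 4.0 and < 9.0)
```

The point of the toy numbers is simply that the combined variance is smaller than either single-sense variance, mirroring the improved performance we observed when both senses were available.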

These findings could help in developing new rehabilitation protocols for patients who have difficulty using sensory inputs to guide their movements. For example, people with treatable congenital blindness who receive treatment only later in life already know how to use touch to guide their actions, but they have to learn how to see from scratch. Similarly, Parkinson's patients and stroke survivors often struggle to perform basic reaching and grasping movements when these are guided by vision alone. Thus, including haptic information in training protocols might speed up rehabilitation.

The next aim is to further study the role that sensory integration plays in guiding our movements, a research field that can also inform the robotics community about how to build robots that learn by interacting with the environment.

Original Article:
I. Camponogara, R. Volcic, Grasping movements toward seen and handheld objects. Sci. Rep. 9, 3665 (2019).

