Chloe Gonseth, Anne Vilain, & Coriandre Vilain (GIPSA-Lab, Speech & Cognition Department, Grenoble University)
Friday, May 18th
Deixis is a communicative process of showing and sharing information. Its specificity is that it involves both the vocal and the gestural modalities, which makes it an ideal case of speech/gesture interaction. The close relationship between speech and gesture has been demonstrated many times, from phylogenetic [1, 2] to ontogenetic [3, 4] perspectives, not to mention neurophysiological data [5, 6]. Furthermore, experimental studies reveal that these two systems interact with a specific temporal synchronization [7].
The aim of this work is to experimentally study the spatial features of deictic pointing,
especially the way it encodes distance information. The distance of a designated object is
indeed a spatial feature that we believe can be encoded both through vocal and gestural
modalities. In a previous study [8], we investigated the articulatory and acoustic correlates of distance in vocal pointing. Our first results showed that the distance of the target influenced both acoustic values (first formant) and articulatory gestures (lip opening). The present
paper aims at characterizing the kinematic properties of distance encoding in manual pointing:
does manual pointing also carry distance information and thus vary with the target distance?
This hypothesis has been tested with the following experimental situation. Participants were
seated in a soundproof room, in front of three light-emitting diodes (LEDs). LEDs were
placed at 55 cm (peri-personal space), 140 cm (extra-personal close space), and 425 cm (extra-personal far space). The two LEDs out of reach of the hand were aligned with respect to the subject's eyes, so that the angle of the arm could not be used to disambiguate which target was pointed at. Participants were required to name and simultaneously point at the illuminated LED with an index-finger pointing gesture. We recorded index finger movements with the motion
capture system Optotrak-3020, which allowed us to measure the duration (s) and amplitude
(mm) of the pointing gesture, as well as its stroke speed (mm/s). A repeated-measures
analysis of variance was conducted on 18 participants, with target distance as a fixed factor.
For each measurement, the median value was used for the statistical analysis, with a
significance level fixed at p<0.05. The results show that target distance has a significant
effect on the duration, amplitude, and stroke speed of the manual pointing. Participants produce longer (F(2, 34)=34.9, p<0.05), wider (F(2, 34)=83.7, p<0.05), and faster (F(2, 34)=51.43, p<0.05) pointing gestures to designate a distant target (see Figures 1, 2 & 3). Post-hoc analyses on each measurement show significant differences between each pair of distances.
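For illustration, the one-way repeated-measures ANOVA behind these F values can be sketched in a few lines of Python. This is a minimal sketch, not the study's actual analysis pipeline (which would use standard statistical software); the function name and the input layout, one row of per-condition median values per subject, are our own. With 18 subjects and 3 target distances it yields the reported degrees of freedom (2, 34).

```python
def rm_anova(data):
    """One-way repeated-measures ANOVA.

    data[s][c]: measurement (e.g. median stroke speed) for subject s
    in condition c (here: the three target distances).
    Returns (F, df_conditions, df_error).
    """
    n = len(data)        # number of subjects
    k = len(data[0])     # number of conditions
    grand = sum(sum(row) for row in data) / (n * k)
    cond_means = [sum(row[c] for row in data) / n for c in range(k)]
    subj_means = [sum(row) / k for row in data]

    # Partition the total sum of squares into condition, subject,
    # and residual (error) components.
    ss_cond = n * sum((m - grand) ** 2 for m in cond_means)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_error = ss_total - ss_cond - ss_subj

    df_cond, df_error = k - 1, (k - 1) * (n - 1)
    f = (ss_cond / df_cond) / (ss_error / df_error)
    return f, df_cond, df_error
```

With n = 18 subjects and k = 3 conditions, df_cond = 2 and df_error = 34, matching the F(2, 34) statistics reported above.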
Our results show that, in a communicative situation, people convey spatial information
about the referent position. To designate a distant referent, they use not only a reinforced
vocal pointing (i.e. higher first formant values and larger lip opening values), but also a
specific manual pointing. Taking these findings into account could improve the modelling of conversational agents designed to communicate with humans in a more natural way.
(See attached PDF below for the figures).
[1] Arbib, M.A. (2005). From Monkey-like Action Recognition to Human Language: An Evolutionary Framework for Neurolinguistics. Behavioral and Brain Sciences, 28: 105-167.
[2] Leavens, D.A., Hopkins, W.D., & Bard, K.A. (2005). Understanding the point of chimpanzee pointing: Epigenesis and ecological validity. Current Directions in Psychological Science, 14: 185-189.
[3] Volterra, V., Caselli, M.C., Capirci, O., & Pizzuto, E. (2005). Gesture and the emergence and development of language. In M. Tomasello & D. Slobin (Eds.), Beyond Nature-Nurture: Essays in Honor of Elizabeth Bates. Mahwah, NJ: Lawrence Erlbaum Associates, 3-40.
[4] Özçalışkan, S. & Goldin-Meadow, S. (2005). Gesture is at the cutting edge of early language development. Cognition, 96(3): 101-113.
[5] Hickok, G., Bellugi, U., & Klima, E.S. (1998). The neural organization of language: Evidence from sign language aphasia. Trends in Cognitive Sciences, 2: 129-136.
[6] Loevenbruck, H., Baciu, M., Segebarth, C., & Abry, C. (2005). The left inferior frontal gyrus under focus: an fMRI study of the production of deixis via syntactic extraction and prosodic focus. Journal of Neurolinguistics, 18: 237-258.
[7] Levelt, W.J.M., Richardson, G., & La Heij, W. (1985). Pointing and voicing in deictic expressions. Journal of Memory and Language, 24: 133-164.
[8] Gonseth, C., Vilain, A., & Vilain, C. (2012). Deictic Pointing: How do Speech and Gesture Cooperate to Encode Distance Information? Submitted.