The semantic specificity hypothesis: when gestures do not depend upon the presence of a listener
Humans gesture even when their gestures can serve no communicative function (e.g., when the listener cannot see them). This study explores the intrapersonal function of gestures and the semantic content of the speech they accompany. Sixty-eight adults participated in pairs, communicating in an object description task. Partner visibility was manipulated: participants completed half of the task behind a screen. Participants produced significantly more iconic gestures for praxic items (i.e., items with physically manipulable properties) than for non-praxic items, regardless of partner visibility. These findings support the semantic specificity hypothesis, whereby a gesture is integrally associated with the semantic properties of the word it accompanies. Where those semantic properties include a high motor component, the likelihood of a gesture being produced is increased, irrespective of communication demands.