Invited Symposia

Four Invited Symposia were submitted and accepted for ICPA12.
Each submission consists of an integrated theme and four speakers.


Cognitive Constraints on Coordination Dynamics

Organisers: Michael A. Riley & Kevin D. Shockley

In recent years there has been a growing interest in the relation between cognitive activity and coordination dynamics. Interactions between concurrent cognitive activity and coordination are most typically studied using dual-task methodology. This work has been motivated by the recognition that cognitive constraints are a major factor in shaping the assembly and activity of perception-action synergies. Accordingly, it is important to determine how concurrent cognitive demands affect coordination patterns and their stability. The determination of specific patterns of interference (or facilitation) between cognitive and coordination tasks will provide insights into the extent to which cognitive and coordination tasks share a common neuro-cognitive basis. Furthermore, the move to understand cognitive tasks in the language of dynamics, rather than in the more typical information processing framework, may eventually serve to enrich the set of constraints required for modeling cognitive dynamics.

In this symposium we will address interactions between various forms of cognitive activity and interlimb rhythmic coordination. The symposium draws together a progression of ideas surrounding concurrent cognitive activity and interlimb coordination. The first speaker, Jean-Jacques Temprado, will introduce the first attempt to evaluate the attentional demands of 1:1 interlimb coordination patterns of varying stability using the chronometric measure of reaction time (RT). Temprado et al. (1999) demonstrated that the less stable, anti-phase coordination pattern results in longer concurrent RTs than the more stable, in-phase pattern. That is, the stability of interlimb coordination carries differential attentional costs, as indexed by RT. Furthermore, when attentional focus was manipulated by directing attention to one of the tasks, there was a trade-off between pattern stability and RT. This finding provided an important link between the more traditional measure of attentional demands (RT) and the dynamical measure of relative phase by demonstrating that the attentional costs of dynamic coordination patterns can be measured chronometrically. The second speaker, Michael Turvey, will present the research of Pellecchia and Turvey (2001), which is closely related to the results of Temprado et al. Whereas Temprado's focus was on whether interlimb coordination influences RT, Pellecchia and Turvey asked how a cognitive load influences interlimb coordination. By varying the difficulty of a cognitive task (e.g., simple arithmetic), Pellecchia and Turvey found that cognitive activity magnified the absolute deviation of measured relative phase from intended phase to a degree related to the difficulty of the cognitive task.
This shift in attractor location was an important finding because it provided a conceptual link between cognitive activities (typically evaluated within the framework of information processing) and coordination dynamics (necessarily evaluated within the theoretical framework of self-organization). Its special significance is that any general theory of performance must be able to explain why a concurrent cognitive load should shift the attractor location of the coordination dynamics. Building on the two prior studies, the third and fourth speakers make more explicit the appropriateness of evaluating interlimb coordination performance during concurrent cognitive tasks. Miguel Moreno will consider the interplay between another traditionally cognitive task, lexical decision making, and interlimb coordination. Once again, traditionally cognitive tasks are shown to influence coordination dynamics: the degree of similarity between nonwords (illegal, legal, and pseudohomophones) and words systematically influenced the mean relative phase and the variability of relative phase. The results are interpreted and discussed in terms of strategically assembled cognitive agents grounded in dynamical systems theory. Finally, Kevin Shockley will introduce bimanual rhythmic coordination to the study of human memory (Shockley, 2002). Memory is a classic domain for evaluating dual-task performance, and the corresponding paradigms provide a useful framework for evaluating attentional trade-offs in memory tasks of differing cognitive demand (e.g., encoding vs. retrieval). In addition to comparing Shockley's results to traditional findings, types of performance modulation are reported that are not recognized in information-processing accounts. Constraints on modeling concurrent cognitive and coordination tasks from a dynamical perspective are considered.
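The stability contrast between in-phase and anti-phase coordination that runs through these studies can be illustrated with the Haken-Kelso-Bunz (HKB) relative-phase equation. The sketch below is purely illustrative: the parameter values (a, b) and noise level are assumptions for demonstration, not values fitted to any of the studies discussed. It simulates the relative phase near each attractor and shows that fluctuations are larger around the shallower anti-phase attractor:

```python
import math
import random

# HKB relative-phase dynamics (minimal sketch; a, b, and noise are
# illustrative assumptions, not empirical estimates):
#   dphi/dt = -a*sin(phi) - 2*b*sin(2*phi) + noise
# phi = 0 is the in-phase attractor; phi = pi the weaker anti-phase one.

def phase_sd(phi0, a=1.0, b=0.5, noise=0.3, dt=0.01, steps=20000, seed=1):
    """Simulate relative phase near an attractor; return the SD of deviations."""
    rng = random.Random(seed)
    phi = phi0
    deviations = []
    for _ in range(steps):
        drift = -a * math.sin(phi) - 2.0 * b * math.sin(2.0 * phi)
        phi += drift * dt + noise * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        deviations.append(phi - phi0)  # deviation from the attractor
    mean = sum(deviations) / len(deviations)
    return math.sqrt(sum((d - mean) ** 2 for d in deviations) / len(deviations))

sd_in = phase_sd(0.0)        # fluctuations around in-phase
sd_anti = phase_sd(math.pi)  # fluctuations around anti-phase (larger)
```

Because the restoring force near anti-phase is weaker (the attractor is shallower), the same noise produces larger phase fluctuations there, matching the observation that anti-phase coordination is the less stable pattern.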


Interpersonal Perception-Action Systems

Organisers: Michael J. Richardson & Kerry L. Marsh

The proposed symposium includes a number of novel projects examining the perception and action processes of interpersonal systems. Until recently, interpersonal and social interaction have been largely ignored within the fields of ecological and perception-action psychology. However, drawing on the shared recognition that some of the most significant environmental interactions an organism has are those with conspecifics, researchers from coordination dynamics, psycholinguistics, and ecological and social psychology have started to examine interpersonal systems in an attempt to broaden our understanding of how we, as social organisms, coordinate with the world.

At its essence, interpersonal and social behavior can be deconstructed into a number of elements that are essential for coordinating oneself with the social environment. These elements are: the physical presence of two or more people, visual or verbal information that is emitted from one or several individuals to another, a set of task constraints that either facilitate or impede co-action, and the presence of dyadically defined goals, whereby the actions of one organism bring about changes (both psychologically and behaviorally) in the actions of another. With this in mind, each speaker will present recent work that tackles one or several of these elements, highlighting the importance of studying perception and action at a level above that of the individual, as well as describing new methodologies that facilitate the investigation of interpersonal perception-action systems and the phenomena that surround them (e.g., coordination dynamics, social and interpersonal affordances, and communication systems).

Kevin Shockley will begin by presenting recent findings demonstrating how physical entrainment emerges between co-actors engaged in cooperative conversation, and how differing task constraints and individual goals (degree of cooperation) affect this entrainment. Such findings are intriguing because they appear to index the level of interpersonal coordination that must occur if joint activities are to be completed. Using this work as its foundation, Michael J. Richardson will then discuss a number of findings that uncover how the interpersonal synchrony of locally controlled movements emerges unintentionally by means of visual and verbal coupling. In particular, this presentation will attempt to answer the question of how visual and verbal couplings affect rhythmic movements and set the stage for interpersonal synchrony. In like fashion, Kerry L. Marsh will introduce a new line of research aimed at identifying how the possibilities for action (affordances) can be affected by the presence of others in ways that differ quantitatively as well as qualitatively from those possible in solo action. She will present research that examines the scaling relationship and intrinsic dynamics of affordances at the individual and interpersonal levels, examining whether the same perception-action coupling that constrains individual action also operates at the interpersonal level. Finally, Bruno Galantucci will focus on how communication systems emerge during social interaction. More specifically, drawing from his research into the emergence of natural languages, he will present results suggesting that this process of emergence, far from being accidental or due to a biological endowment peculiar to humans, is an unavoidable consequence of the joint increase in complexity of an animal's interpersonal and social actions.
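The dependence of interpersonal synchrony on coupling strength can be sketched with a minimal Kuramoto-style model of the relative phase between two rhythmically moving people. The detuning and coupling values below are illustrative assumptions for demonstration, not parameters drawn from the studies described:

```python
import math

# Relative phase theta between two people moving at slightly different
# preferred frequencies (detuning delta_omega), coupled -- e.g., visually --
# with strength K:  dtheta/dt = delta_omega - K * sin(theta).
# All parameter values are illustrative, not empirical estimates.

def final_relative_phase(delta_omega=0.5, K=0.0, dt=0.01, steps=10000):
    theta = 0.0
    for _ in range(steps):
        theta += (delta_omega - K * math.sin(theta)) * dt  # Euler step
    return theta

uncoupled = final_relative_phase(K=0.0)  # no coupling: relative phase drifts
coupled = final_relative_phase(K=1.0)    # K > delta_omega: phase-locks
```

When the coupling exceeds the detuning, the relative phase settles at a fixed lag (here asin(0.5), about 0.52 rad) rather than drifting without bound, which is the signature of entrainment; when the coupling is too weak, the relative phase wraps continuously and coordination is at best intermittent.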

By combining the above research projects in a single symposium, it is hoped that other researchers will begin to see how perception and action can be studied at a social or interpersonal level. More importantly, the symposium seeks to demonstrate how such investigation can lead to a greater understanding of epistemic, intentional perception-action systems and organism-environment (organism-organism) mutuality.


Multimodal Dynamical Gestures

Organisers: Paul Treffner & Nobuhiro Furuyama

When a person speaks, movements can be seen in various parts of the body and face. Some of these movements are connected to articulation directly, others less directly or only incidentally. The former are called articulatory gestures and the latter simply gestures (broadly construed here to include so-called manual gestures, facial expressions, head nodding, etc.). The boundary between the two is not clear, however, either in terms of perception-action coupling or in terms of communicative function. Regarding perception-action coupling, it is undeniable, for one thing, that the articulatory system overlaps with the gesticulatory system (e.g., in the respiratory and postural systems). With respect to communicative function, it is well documented that speech and gestures, by meaningfully mediating each other, co-express similar or related aspects of one and the same state of affairs or events: they are spatiotemporally coordinated and constitute two poles or aspects of a single coordinative system of communication. But what is the ecological or dynamical basis of such multimodal coordination of speech and gesture? How are these two modalities coordinated with one another? To what extent and in what way are they perceived by listeners and viewers? More specifically, what are the invariants in communication? This symposium discusses recent research on multimodal communicative systems from ecological and dynamical systems approaches.

The first speaker, Nobuhiro Furuyama, will describe his recent research revealing a relation between hand gestures, speech gestures, and respiration. Of interest is how this relates to recent discoveries of how attention underlies multimodal gestures, such as in the speech-hand coordination task of Treffner and Peter (Hum. Mov. Sci. 21, 641-697). The second speaker, Denis Burnham, will present data showing that an understanding of the perception and production of speech is to be found when the problem is approached as a multimodal auditory-visual phenomenon; that is, speech perception by eye and ear is not only possible but also the norm. The third speaker, Miyuki Kamachi, will seek to demonstrate that there may be bimodal invariant information sufficient to specify speaker identity across the two modalities of moving faces and moving voices, and that performance is more a function of the way in which people speak than of what they say. Thus, the dynamic invariants of speech production (i.e., articulatory gestures) are shown also to underlie speech perception. The fourth presentation, by Harold Hill and Eric Vatikiotis-Bateson, investigates multimodal speech perception and production, focusing on the existence of invariant patterns underlying facial spatio-temporal dynamics. It asks, first, to what extent the physical structure of the face shapes the development and execution of a given speaker's behavior, and second, to what extent perceivers can recover structural information from behavior.



Functional Architecture of the Visual System

Organiser: Phil Sheridan

This symposium explores a set of related models of the primate retina and the primary visual cortex. The first speaker, Kazuhiko Takemura, will discuss the perceptual implications of light input to the primate retina and how a hexagonal lattice of light detectors improves signal detection. The second speaker, David Alexander, will discuss the mapping of the visual field at different scales of primary visual cortex and what these mappings reveal about invariants in optic flow. The third speaker, Phil Sheridan, will discuss the structure of the retina and primary visual cortex in relation to the detection of pseudo-invariants in optic flow, relating the structures described by the first two speakers to computational issues pertinent to the development of artificial vision systems. The fourth speaker, Mark Chappell, continues the theme of invariants in a manner that is independent of architecture.
