Effect- and Performance-Based Auditory Feedback on Interpersonal Coordination

authored by
Tong Hun Hwang, Gerd Schmitz, Kevin Klemmt, Lukas Brinkop, Shashank Ghai, Mircea Stoica, Alexander Maye, Holger Blume, Alfred O. Effenberg
Abstract

When two individuals interact in a collaborative task, such as carrying a sofa or a table, spatiotemporal coordination of their individual motor behaviors usually emerges. In many cases, interpersonal coordination can arise independently of verbal communication, based on observation of the partner's movements and/or the object's movements. In this study, we investigate how social coupling between two individuals can emerge in a collaborative task under different modes of perceptual information. A visual reference condition was compared with three conditions providing new types of additional auditory feedback in real time: effect-based auditory feedback, performance-based auditory feedback, and combined effect/performance-based auditory feedback. We developed a new paradigm in which the actions of both participants continuously result in a seamlessly merged effect on an object simulated by a tablet computer application. Participants had to synchronize their movements temporally with a 90° phase difference and precisely adjust their finger dynamics in order to keep the object (a ball) rotating accurately on a given circular trajectory on the tablet. The results demonstrate that different kinds of additional auditory information can alter interpersonal coordination in a joint task in distinct ways.
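To make the coordination target concrete, the sketch below shows one hypothetical way to quantify how well a dyad maintains the required 90° phase offset from two angle time series (one per participant). This is an illustration only, not the authors' analysis pipeline; the function names and the simulated trial data are assumptions.

```python
import math

def relative_phase_deg(theta_a, theta_b):
    """Circular difference theta_a - theta_b in degrees, wrapped to (-180, 180]."""
    d = (theta_a - theta_b) % 360.0
    return d - 360.0 if d > 180.0 else d

def mean_abs_error_from_target(phases_a, phases_b, target=90.0):
    """Mean absolute deviation of the dyad's relative phase from the target offset."""
    errors = [abs(relative_phase_deg(a, b) - target)
              for a, b in zip(phases_a, phases_b)]
    return sum(errors) / len(errors)

# Hypothetical trial: participant A leads participant B by exactly 90 degrees,
# both rotating at 6 degrees per sample.
a = [(6.0 * t) % 360.0 for t in range(100)]
b = [(6.0 * t - 90.0) % 360.0 for t in range(100)]
print(mean_abs_error_from_target(a, b))  # 0.0 for a perfect 90-degree lead
```

A larger mean error would indicate poorer temporal coordination relative to the instructed 90° phase difference; the wrapping in `relative_phase_deg` keeps the measure well defined across the 0°/360° boundary.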

Organisation(s)
Institute of Sports Science
Institute of Microelectronic Systems
External Organisation(s)
Universität Hamburg
Type
Article
Journal
Frontiers in Psychology
Volume
9
Pages
404
ISSN
1664-1078
Publication date
03.2018
Publication status
Published
Peer reviewed
Yes
ASJC Scopus subject areas
Psychology (all)
Electronic version(s)
https://doi.org/10.3389/fpsyg.2018.00404 (Access: Open)
https://doi.org/10.15488/3187 (Access: Open)