4.2 Article

Multifractal foundations of visually-guided aiming and adaptation to prismatic perturbation

Journal

Human Movement Science
Volume 55, Pages 61-72

Publisher

Elsevier
DOI: 10.1016/j.humov.2017.07.005

Abstract

The visually-guided action of tossing to a target allows examining the coordination between mechanical information for maintaining posture while throwing and visual information for aiming. Previous research indicates that relationships between visual and mechanical information persist in tossing behavior long enough for mechanical cues to prompt recall of past visual impressions. Multifractal analysis might model the long-term coordination among movement components as visual information changes. We asked 32 adult participants (6 female, 25 male, one not conforming to the gender binary; aged M = 19.77, SD = 0.88) to complete an aimed-tossing task in three blocks of ten trials each. Block 1 oriented participants to the task. Participants wore right-shifting goggles in Block 2 and removed them for Block 3. Motion-capture suits recorded movement data from the head, hips, and hands. According to regression modeling of tossing performance, multifractality at the hand and at the hips together supported the use of visual information, and adaptation to wearing and removing the goggles depended on multifractality across the hips, head, and hands. Vector-autoregression modeling showed that hip multifractality promoted head multifractality but that hand fluctuations drew on both head and hip multifractality. We propose that multifractality could be an information substrate whose spread across the movement system supports the perceptual coordination needed for the development of dexterity.
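The abstract does not specify the authors' estimators or software, so the sketches below are only illustrative. The first estimates a single-series multifractal spectrum width with multifractal detrended fluctuation analysis (MFDFA), one common way to quantify multifractality; the second fits a vector autoregression across hypothetical per-window width series for the head, hips, and hand. The synthetic data, segment names, scale and q ranges, and lag order are all assumptions for illustration, not the paper's pipeline.

```python
import numpy as np

def mfdfa_hurst(series, scales, q_values, order=1):
    """Generalized Hurst exponents h(q) via multifractal DFA (illustrative sketch)."""
    x = np.asarray(series, dtype=float)
    profile = np.cumsum(x - x.mean())          # integrate the mean-centered series
    n = len(profile)
    h_q = []
    for q in q_values:
        log_fluct = []
        for s in scales:
            n_seg = n // s
            variances = []
            for i in range(n_seg):
                window = profile[i * s:(i + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, window, order), t)
                variances.append(np.mean((window - trend) ** 2))
            variances = np.asarray(variances)
            if np.isclose(q, 0.0):
                # q = 0 uses a logarithmic average rather than a power mean
                fq = np.exp(0.5 * np.mean(np.log(variances)))
            else:
                fq = np.mean(variances ** (q / 2.0)) ** (1.0 / q)
            log_fluct.append(np.log(fq))
        slope, _ = np.polyfit(np.log(scales), log_fluct, 1)   # h(q) is the log-log slope
        h_q.append(slope)
    return np.asarray(h_q)

def spectrum_width(h_q, q_values):
    """Singularity-spectrum width, a common single-number index of multifractality."""
    q = np.asarray(q_values, dtype=float)
    tau = q * np.asarray(h_q) - 1.0            # mass exponents tau(q) = q*h(q) - 1
    alpha = np.gradient(tau, q)                # Legendre transform: alpha = d(tau)/dq
    return alpha.max() - alpha.min()

# Synthetic stand-in for one body segment's displacement series (not real data)
rng = np.random.default_rng(0)
series = np.cumsum(rng.standard_normal(4096))
q_values = np.linspace(-3, 3, 13)
scales = np.unique(np.logspace(np.log10(16), np.log10(len(series) // 8), 10).astype(int))
width = spectrum_width(mfdfa_hurst(series, scales, q_values), q_values)
print(f"estimated spectrum width: {width:.3f}")
```

A vector autoregression over such width series can then ask the kind of directional question the abstract describes, e.g., whether hip and head fluctuations help predict hand fluctuations. The sketch below uses statsmodels with a fixed lag order of 2 chosen only for the example.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Hypothetical per-window multifractal-width series for each body segment;
# in practice these would come from the motion-capture recordings.
rng = np.random.default_rng(1)
widths = pd.DataFrame(rng.standard_normal((60, 3)), columns=["head", "hips", "hand"])

var_fit = VAR(widths).fit(2)                   # fixed lag order of 2 for the sketch
print(var_fit.summary())

# Granger-style test: do head and hip fluctuations help predict hand fluctuations?
print(var_fit.test_causality("hand", ["head", "hips"], kind="f").summary())
```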
