BODYinTRANSIT

LATEST NEWS

bodyintransit.eu

Participation in DIS 2024

Laia Turmo Vidal and José Vega-Cebrián participated in DIS 2024, the ACM Conference on Designing Interactive Systems, which took place at ITU Copenhagen, Denmark.

Participation in ESCAN 2024

Amar D’Adamo, Marte Roel and Karunya Srinivasan participated in ESCAN 2024, the 7th biennial meeting of the European Society for Cognitive and Affective Neuroscience (ESCAN).

Participation in CHI 2024

Amar D’Adamo participated in the CHI Conference on Human Factors in Computing Systems (CHI ’24) in Honolulu, Hawaii (May 11-16), where he presented a paper.

Participation in TEI’24

José Vega-Cebrián participated in the Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’24) in Cork, Ireland (February 11-14), where he presented a demo.

Academic Mindtrek 2023: On Futuring Body Perception Transformation Technologies: Roles, Goals and Values

We are presenting our paper On Futuring Body Perception Transformation Technologies: Roles, Goals and Values at the 26th International Academic Mindtrek conference, between the 3rd and 6th of October 2023 at the Nokia Arena in Tampere, Finland. We will be part of Session 6: Fictional, Speculative and Critical Futures, on Thursday, October 5th, from 13:45 to 15:10. See the full program here.

The paper’s abstract:

Body perception transformation technologies augment or alter our own body perception beyond our usual bodily experience. As these technologies are still emerging, research on them is limited to proofs-of-concept and lab studies. Consequently, their potential impact on the way we perceive and experience our bodies in everyday contexts is not yet well understood. Through a speculative design inquiry, our multidisciplinary team envisioned utopian and dystopian technology visions. We surfaced potential roles, goals and values that current and future body perception transformation technologies could incorporate, including non-utilitarian purposes. We contribute insights on such roles, goals and values to inspire current and future work. We also present three provocations to stimulate discussion. Finally, we contribute methodologically with insights into the value of speculative design as a fruitful approach for articulating and bridging diverse perspectives in multidisciplinary teams.

BRNet 5: Body Representation Network Conference 2023

We co-organized the Body Representation Network Conference 2023: The interactive body: Multisensory and embodied signatures of bodies interacting in the world, held in Mallorca, Spain, on Thursday 14th and Friday 15th September 2023.

From the BRNet 5 conference site:

We usually take the ability to identify our body as our own for granted, but empirical research in the past few decades has shown that our body representations rely on the cognitive ability to combine information about the body originating from different sensory modalities. Indeed, the development of our models of the self and the world around us relies on the contribution of both exteroceptive (e.g., visual, tactile and auditory cues) and interoceptive (e.g., physiological) bodily signals.

This conference aims to discuss recent lines of research that, albeit different, converge on the idea that the experience of our body is not a fixed phenomenon but rather an active, ever-changing and dynamic process. It relies on the interplay between environmental, interoceptive, and exteroceptive bodily signals received during motor, social and technological interactions with the world and with others. The conference will be of broad interest to researchers from different communities, including cognitive, social, and affective neuroscientists, experimental psychologists, neurophysiologists, and human-computer interaction researchers interested in the acting, sensing and feeling body in general, and to researchers studying body representation, multisensory integration, interoception, virtual reality, robotics, embodied cognition, and the sensorimotor system in particular.