Li Junting Benjamin

Assistant Professor
Wee Kim Wee School of Communication and Information
NTU Singapore

Dr. Benjamin (Benjy) Li is currently an Assistant Professor in the Wee Kim Wee School of Communication and Information. He received his PhD in Communication Studies from Nanyang Technological University. He was a recipient of the inaugural Humanities, Arts and Social Sciences Postdoctoral Scholarship in 2014 and spent two years at Stanford University (USA) as a postdoctoral researcher at the Virtual Human Interaction Lab in the Department of Communication.
Dr. Li studies the effects of virtual reality (VR) on human behavior and psychology, in particular the interplay of the human senses (sight, sound, smell, touch, taste) in virtual food experiences and their effects on satiation and consumption behavior. He also has a keen interest in the use of VR as an emerging technology for digital interventions, especially in the areas of weight management, physical rehabilitation, and elderly wellbeing. His other research interests include the associations between head movements and emotions, human-computer interaction, augmented reality, and 360-degree immersive videos.
Dr. Li has collaborated with several institutions, including the Stanford Prevention Research Center (USA), the Ministry of Health (Singapore), the Health Promotion Board (Singapore), and the KK Women’s and Children’s Hospital (Singapore) in the development and testing of digital health interventions. His work has been published in journals such as Computers in Human Behavior, Games for Health Journal, PLOS One, Frontiers in Psychology, and Journal of Adolescence. His paper entitled “Impact of Visual and Social Cues on Exercise Attitudes and Behavior of Overweight Children Playing an Exergame” received a Top Paper award at the 2011 International Communication Association Conference.

Of Virtual Foods And Emotions: Two Studies That Explore The Sensory Dimensions Of Virtual Reality In Satiation And Mood Induction

Virtual reality (VR) has been proposed as a methodological tool for studying the basic science of psychology and other fields. Two studies that explore the sensory dimensions of VR are presented. Olfactory research in immersive virtual environments (IVEs) has often examined the addition of scent as part of the environment or atmosphere, acting as an experimental stimulus. There appears to be a lack of research on the influence of virtual foods in IVEs on human satiation. Studies based on situational cues or self-perception theory support the hypothesis that touching and smelling a virtual food item may lead to increased consumption as a result of modelling expected behavior. On the other hand, studies grounded in embodied cognition suggest that satiation may take place as a result of mental simulation that resembles actual consumption behavior. In the first study, we sought to explore the effects of haptic and olfactory cues delivered through virtual food on human satiation and eating behavior. A total of 101 participants took part in a 2 (touch: present vs. absent) × 2 (scent: present vs. absent) experiment in which they interacted with a donut in an IVE. Findings showed that participants in the touch-present and scent-present conditions ate significantly fewer donuts than those who were not exposed to these cues, and reported higher satiation than their counterparts. However, findings were less clear for participants who received both haptic and olfactory cues. As a whole, the results provide preliminary support for satiation effects arising from sensory simulation.
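As a rough illustration of how a 2 × 2 between-subjects design like this might be analyzed, the sketch below runs a two-way ANOVA on consumption counts. The test, the variable names, and the toy data are all assumptions for illustration; the abstract does not specify the analysis the authors actually used.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical data: number of donuts eaten by each participant under one
# of the four cue combinations (touch x scent). All values are made up.
df = pd.DataFrame({
    "touch":  ["present", "present", "absent", "absent"] * 4,
    "scent":  ["present", "absent", "present", "absent"] * 4,
    "donuts": [1, 2, 2, 3, 1, 3, 1, 4, 2, 2, 1, 3, 0, 3, 2, 4],
})

# Two-way ANOVA: main effects of touch and scent, plus their interaction
# (the interaction term speaks to the less clear "both cues" condition).
model = ols("donuts ~ C(touch) * C(scent)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```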
The second study examines the ability of VR to trigger strong emotions in individuals. Head postures can reflect emotional states: when people are happy, they tend to hold their heads up high; when they are sad, their heads tend to hang low. Few studies have examined the relationship between head movements in VR and emotions. In this study, we first sought to establish a publicly accessible database of immersive VR video clips that can serve as a resource for studies on emotion induction using VR. Second, given the large sample of participants needed to obtain reliable valence and arousal ratings for each video, we were also able to explore possible links between observers' head movements and the emotions they feel while viewing immersive VR. To accomplish these goals, we sourced and tested 73 immersive VR clips, which participants rated on the valence and arousal dimensions using self-assessment manikins. We also tracked participants' rotational head movements as they watched the clips, allowing us to correlate head movements with affect. Based on past research, we predicted relationships between the standard deviation of head yaw and both valence and arousal ratings. Results showed that the stimuli varied reasonably well along the dimensions of valence and arousal. The standard deviation of yaw correlated positively with valence, while a significant positive relationship was found between head pitch and arousal. These findings may have interesting implications for filmmakers interested in producing immersive VR films.
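To make the head-movement analysis concrete, here is a minimal sketch of the kind of correlation described: summarize each viewing by the standard deviation of its yaw trace, then test the linear association with the clip's valence rating. Pearson's r is an assumption for illustration (the abstract does not describe the actual pipeline), and all data below are synthetic.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Synthetic stand-ins: one yaw trace (degrees sampled over time) per clip
# viewing, plus the valence rating given for that clip (SAM scale, 1-9).
n_clips = 73
yaw_traces = [rng.normal(0.0, rng.uniform(5, 40), size=600) for _ in range(n_clips)]
valence = rng.uniform(1, 9, size=n_clips)

# Summarize head movement as the standard deviation of yaw per viewing,
# then test its linear association with valence.
yaw_sd = np.array([trace.std() for trace in yaw_traces])
r, p = pearsonr(yaw_sd, valence)
print(f"SD of yaw vs. valence: r = {r:.3f}, p = {p:.3f}")
```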
