Perceptual psychology can readily be transferred to a design product when creating Virtual Reality environments, as Aaron Schwerdtfeger and Paul Morat from AIKEN CURA discovered during the making of “In Case You Don’t Know What To Play”.
The VR application, which can manipulate time perception, was awarded the nextReality.Contest – Young Talent Award 2021. Paul Morat recaps their process.
Virtual Reality (VR) technology enables humans to experience computer-generated Immersive Virtual Environments (IVEs) using Head-Mounted Displays (HMDs). IVEs are accordingly fully artificial and designed. This also means that, audiovisually, they need only be minimally tied to the physical realities of the real world.
This rather trivial observation led Aaron and me to a new connection while developing our application “In Case You Don’t Know What to Play”: IVEs support naturalistic audiovisual stimuli – similar to those we find in the physical world – without being bound to it. This connection is the premise on which we can create perceptual conflicts using VR.
Virtual Reality is a medium that creates an intersection between the digital and the analog, real physical space. VR should therefore not only be seen as a “simulator” of the analog world; it could actually influence our physical senses virtually and put both sensory worlds in conflict, we concluded, and we were very excited.
But let’s take a step back. We were sitting in Prof. Boris Müller’s course “Time Machines” and had given ourselves the task of investigating time perception in VR. How could we approach this topic as interaction designers?
Our first research led us to “VIRTUAL TIMES”, a European research project that deals with exactly this topic. From there, a universe of neuroscientific, cognitive, and computer-science content opened up, all of which converges in the research of human perception within VR. To be able to give form to these findings as designers, we first had to enter this universe theoretically.
We talked to some of the people responsible for the research project: Dr. Marc Wittmann from the Institute for Frontier Areas of Psychology and Mental Health in Freiburg, as well as Prof. Dr. Dr. Kai Vogeley and Sebastian Lammers from the Jülich Research Centre. They gave us insights into what time perception is (or can be) and into the parameters they worked with in VIRTUAL TIMES that can be used to realize sensory manipulation in VR.
From this, we discovered Human-Computer Interaction research, particularly by Prof. Dr. Frank Steinicke of the University of Hamburg and Prof. Dr. Gerd Bruder of the University of Central Florida, who have collaborated to investigate time and space perception in VR.
In summary, we came to these observations at the end of our initial research phase:
- Time perception is composed of external time sources, the so-called “zeitgebers”, and the resulting internal sense of time.
- Zeitgebers include the position of the sun – and the resulting position of shadows, light temperature, and brightness – but also artificial orientation marks such as clocks or pendulums.
- The internal clock, the sense of time, initially synchronizes with these external reference points. In addition, it relates to one’s own body perception and “ego perception”: for those who are more focused on themselves, the perception of time shifts as well. This also includes “optimal effort”, the so-called flow. Cognitive load, i.e. the utilization of working memory, is directly related to one’s ego perception and accordingly to one’s time perception.
- Colors also influence the perception of time: green environments compress perceived time, for example, while red ones stretch it.
- Finally, spatiality provides a temporal orientation aid, for example when distances are estimated in units of time. Wide and narrow spaces also influence the perception of time.
Thus, there are visual and motor parameters that can influence how humans perceive time. We can manipulate time perception externally by shaping visual perception.
This becomes particularly clear in connection with spatial orientation and spatial perception: we can visually estimate how much “time away” an object is in space, and while moving we can evaluate the distance covered in temporal terms. For this, the brain draws on visual depth perception and proprioceptive body perception, among other things.
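To make this relation explicit: the implicit “time away” estimate boils down to distance divided by movement speed. A minimal, purely illustrative sketch in Python – the function name and the 1.4 m/s walking speed are assumptions for illustration, not values from the project:

```python
# Purely illustrative: the implicit "time away" estimate is distance over speed.
def estimated_travel_time(perceived_distance_m: float, walking_speed_mps: float = 1.4) -> float:
    """Rough travel-time estimate for a visually perceived distance."""
    return perceived_distance_m / walking_speed_mps

# A target that looks about 7 m away, at a typical walking speed of ~1.4 m/s,
# is perceived as roughly 5 seconds away.
print(round(estimated_travel_time(7.0), 1))  # 5.0
```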
But why should visual perception in VR have to correspond to physical reality at all?
Shortly before, Aaron and I had stumbled across the work on so-called “Redirected Walking” by Prof. Dr. Frank Steinicke and Prof. Dr. Eike Langbehn from the University of Hamburg. Redirected Walking is a virtual mechanism that allows VR developers to make a small real motion range appear larger within VR. The virtual space is distorted in the process, but without users being able to perceive this visually. Our visual perception dominates the proprioceptive one, and this is how very concrete perceptual conflicts can be designed: when the two disagree, the brain tends to trust the visual information.
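To illustrate the basic mechanism – a simplified sketch under our own assumptions, not the researchers’ published implementation – a translation gain scales how far each real step moves the user through the virtual space. As long as the gain stays within perceptual detection thresholds, the mismatch goes unnoticed:

```python
# Simplified sketch of a translation gain in Redirected Walking.
# The gain value and step lengths below are illustrative assumptions.

def apply_translation_gain(real_step_m: float, gain: float) -> float:
    """Map a real-world step onto a longer (or shorter) virtual step."""
    return real_step_m * gain

gain = 1.2  # the virtual space appears ~20% larger than the space actually walked
real_steps = [0.5, 0.5, 0.5]  # metres physically walked
virtual_steps = [apply_translation_gain(step, gain) for step in real_steps]

print(f"real: {sum(real_steps):.2f} m, virtual: {sum(virtual_steps):.2f} m")
# real: 1.50 m, virtual: 1.80 m
```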
In VR, Aaron and I now understood, space can be distorted at will, and since spatial orientation and temporal perception are cognitively related, we could use spatial distortion to build an application that would manipulate time perception.
“But now you have to decide,” Prof. Boris Müller brought us back from conceptualization, “do you want to work out an experimental design for a cognitive experiment, or do you want to design a VR application?”
“Well, both,” we thought at first… But Boris was right: we should now think about our aesthetics and the effect-oriented design of our application’s appearance. Building on the observations from our research, together we formulated four design principles for a VR application that can manipulate users’ perception of time via spatial distortion:
- Cues that suggest to users that their perception is being influenced must be minimized. As an audiovisual and interactive experience, the application must lead users away from the underlying perceptual manipulation as much as possible. An environment that is unmistakably experimental must therefore be avoided. The design decisions resulting from this condition the other elements of the application.
- To increase the feeling of immersion and presence, a task and an avatar should be integrated. This also motivates movement that requires users to cover distances and helps bring them into a state of flow. This mental state is characterized, among other things, by a loss of the sense of time and ego – which obscures the actual growth of distance (and thus of the duration needed to cover it) within the application.
- Zeitgebers are orientation marks for the sense of time, also within VR, and in IVEs they can be adjusted at will. When the rhythm of a zeitgeber varies, the sense of time varies with it. This effect is counterbalanced when cognitive load is high, but such distraction cannot always be guaranteed, for example when the user pauses to look around. Accordingly, variable zeitgebers must be implemented to make it harder to break the illusion. A rhythmic sound can also function as a zeitgeber.
- The cognitive connection between time and space perception must be undermined. Spatial orientation that simplifies depth estimation should be made as difficult as possible by avoiding depth cues and colors. In particular, the floor of the IVE must be considered as an orientation factor.
The result – that is, our proposal for the form of such an application – is “In Case You Don’t Know What To Play”: an application that appears to users as a game but in fact plays with their perception itself.
We decided on a completely black-and-white environment whose spatiality only becomes recognizable through the user’s own movement. If you stand still, the environment appears almost two-dimensional. We thus invalidate depth perception and encourage movement purely through the visual appearance. In addition, we integrated the game characteristics to generate a state of flow in users and, once again, to steer them away from the actual manipulation.
The spatial distortion itself is imperceptible to users, visually (through the head-mounted display) and consequently proprioceptively: the visually detectable virtual distance always remains the same, while the proprioceptively detectable real distance grows. In other words, we distort the virtual space and in this way manipulate the perception of the real space.
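A rough, hypothetical sketch of this kind of mapping – the round count, growth factor, and names are assumptions, not the project’s actual values: the virtual distance shown per round stays fixed while the real distance to be walked grows, so the effective gain between virtual and real metres shrinks round by round.

```python
# Hypothetical sketch: the virtual distance per round is constant, while the
# real distance grows slightly each round. All values are assumptions.

VIRTUAL_DISTANCE_M = 3.0  # the distance the head-mounted display always shows
GROWTH_PER_ROUND = 1.05   # the real path grows by 5% per round, kept small so
                          # the change stays hard to notice

def real_distance_m(round_index: int) -> float:
    """Real metres the user must walk in a given round for the same virtual path."""
    return VIRTUAL_DISTANCE_M * GROWTH_PER_ROUND ** round_index

def translation_gain(round_index: int) -> float:
    """Virtual metres rendered per real metre walked; shrinks over the rounds."""
    return VIRTUAL_DISTANCE_M / real_distance_m(round_index)

for r in range(4):
    print(f"round {r}: walk {real_distance_m(r):.2f} m, gain {translation_gain(r):.2f}")
# round 0: walk 3.00 m, gain 1.00
# round 1: walk 3.15 m, gain 0.95
# round 2: walk 3.31 m, gain 0.91
# round 3: walk 3.47 m, gain 0.86
```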
Furthermore, we integrated a zeitgeber: a rhythmic sound pulse that also adapts to the distance covered by the user. Julian de Freitas (<3) wrote this mechanism for us, and it again makes it impossible to notice any change: the sound is always triggered at the same point, and its length scales relative to the distance covered, without the sound itself changing audibly.
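A minimal sketch of how such an adaptive pulse could behave, assuming the pulses are tied to fixed virtual positions – the spacing, walking speed, and function names are assumptions for illustration, not Julian’s actual code:

```python
# Hypothetical sketch of the adaptive zeitgeber: pulses are bound to fixed
# virtual positions, so the interval between them stretches with the real path
# while the sound itself never changes. All names and values are assumed.

VIRTUAL_PULSE_SPACING_M = 0.5  # a pulse every 0.5 virtual metres
WALKING_SPEED_MPS = 1.4        # assumed average walking speed

def seconds_between_pulses(real_m_per_virtual_m: float) -> float:
    """Interval between two pulses at the current spatial distortion."""
    real_metres_between_pulses = VIRTUAL_PULSE_SPACING_M * real_m_per_virtual_m
    return real_metres_between_pulses / WALKING_SPEED_MPS

# Early round (no stretch) vs. a later round where 1 virtual metre = 1.16 real metres:
print(f"{seconds_between_pulses(1.00):.2f} s")  # 0.36 s
print(f"{seconds_between_pulses(1.16):.2f} s")  # 0.41 s
```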
For Aaron and me, this project was an important exploration of the medium of VR and of human perception, which we eventually described in detail in a publication for the 2021 Mensch und Computer conference. We would also like to take this opportunity to once again express our sincere thanks for the nextReality.Contest – Young Talent Award.