vr & biosignals: increasing interpersonal understanding via a vr co-op driving game

Team: Hyun Woo Paik, Wei Gong
(October 2018 - December 2018 [Development: December])

independent study, ux research & design, game design, unity, C# programming



UX Researcher
Game Designer
Programmer, Audio & Visual (C#)


Rapid Prototyping
Speed Dating

Tools used

Unity
Oculus Rift
Visual Studio (C#)
Fraps (for recording)


Research Question

Guided by Geoff Kaufman and under the supervision of Fannie Liu and Anna Kasunica in the Human-Computer Interaction Institute Lab, my team sought to answer the following question:

How can we increase the emotional and social interpersonal understanding between two players via biosignals (such as heart-rate) in a VR environment?

conceptual answer in unity (v0.5)

We created the initial version of a first-person co-op timed driving game to answer the above question.

In this game, the driver and passenger must work together to navigate around obstacles and reach the finish line in the allotted time. The driver cannot see the objects in the road, while the passenger can. If the driver hits an object, the car slows down.

When the driver’s heart-rate increases, the volume of the music in the car also increases, making it more difficult to communicate; through this, the passenger gets a sense of what the driver is currently experiencing. Both the music volume and the color of the sky change with the driver’s heart-rate, reminding the driver to calm down in order to communicate better with the passenger, and serving as an additional indicator of the driver’s stress level to the passenger.


Emotion & Collaborative Illustration, created by several students in the independent study


One stimulus-focused story: the characters experience fear, and the stimulus is a knock on their door


exploring emotions in games & social context

Our initial work involved exploring emotions in games, from a third-person and first-person perspective. The task of combining narrative and gameplay was a challenging one, especially considering the constraints of VR. Therefore, my first few storyboards focused more on narrative and stimuli: specifically, what prompted characters to behave in certain ways.

As a team, we took emotional concepts and built scenes together, with each one of us adding an element to a particular scene. This helped us come up with different ideas collaboratively.


In order to finalize a concept, we started with many storyboards (16+, a sample of which is shown above, illustrated by Hyun Paik and Wei Gong) and narrowed them down to one: a driver and passenger in a car who have to sync their biosignals (in this case, heart-rate BPM) to successfully reach the finish line.

Storyboarding the final game concept



Once we narrowed down our concept, we created further storyboards for speed-dating purposes (depicted storyboards by Hyun Paik, me, and Wei Gong). Speed-dating was done to identify user needs around interpersonal understanding. We experienced a lot of difficulty in designing a game that also addresses emotional needs in a VR context.

prototyping & game design

When we were designing the rules for the game and prototyping scenarios, we tried to answer the following questions:

  1. What are the prompts for the two players to communicate with each other?

  2. How would they communicate and interact with each other? Are there any obstacles for the communication?

  3. How can biosignal synchrony help with their communication and interpersonal understanding?


I was in charge of audio and of displaying information to the player. I added the music and sound-effect files to the game in Unity using C#. I set up the music to start and stop with a key command (if not playing at the start of the game), along with controls to manually adjust the volume, since the first version was more of a Wizard-of-Oz concept. I also set up sound effects and microphone input, with the ability to change the mic’s volume and pitch (the pitch shifting was buggy in Unity), as well as a BPM counter that mimicked integration with biosignals: it rose when the player hit an obstacle, and could also be adjusted manually.


Controls (version 0.5):

  • Up arrow: Increase radio volume

  • Down arrow: Decrease radio volume

  • Spacebar: Turn radio On/Off

  • Q: Decrease BPM

  • W: Increase BPM

  • I: Driver POV

  • O: Passenger POV

  • P: Toggle high-BPM sky filter


  • Left arrow: Decrease mic pitch

  • Right arrow: Increase mic pitch
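As a sketch, the Wizard-of-Oz state behind these key bindings could be modeled in plain C# as below. The class and member names, and the value ranges, are hypothetical illustrations rather than the actual project code; in the Unity scene, a MonoBehaviour's Update() would call these methods in response to Input.GetKeyDown checks and apply the state to the AudioSource and sky material.

```csharp
using System;

// Hypothetical model of the manually driven (Wizard-of-Oz) game state.
// Names and ranges are illustrative assumptions, not the project's code.
public class WizardState
{
    public bool RadioOn = true;       // toggled with Spacebar
    public float RadioVolume = 0.5f;  // 0..1, like a Unity AudioSource
    public int Bpm = 80;              // simulated driver heart-rate (Q/W)
    public float MicPitch = 1.0f;     // 1.0 = unshifted (Left/Right arrows)

    public void ToggleRadio() => RadioOn = !RadioOn;

    // Up/Down arrows: adjust radio volume, clamped to the valid range.
    public void AdjustVolume(float delta) =>
        RadioVolume = Math.Clamp(RadioVolume + delta, 0f, 1f);

    // Q/W: adjust the simulated heart-rate within plausible bounds.
    public void AdjustBpm(int delta) =>
        Bpm = Math.Clamp(Bpm + delta, 40, 200);

    // Left/Right arrows: shift the mic pitch up or down.
    public void AdjustMicPitch(float delta) =>
        MicPitch = Math.Clamp(MicPitch + delta, 0.5f, 2f);

    // The sky filter (P key) corresponds to a high simulated heart-rate.
    public bool HighBpmSkyFilter => Bpm > 110;
}
```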

detailed how-to-play

There are two players in the game, a passenger and a driver. They load into the same car, which is already moving. To win, the two players must reach the end of a road covered with random obstacles as quickly as possible. Each time the car hits an obstacle, it slows down, so the driver’s goal is to steer around the obstacles and keep the car at high speed. However, the driver cannot see the obstacles, whereas the passenger can; the passenger’s goal is therefore to guide the driver to turn left and right to avoid them. The two must communicate well to achieve their goals.
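The hit-slows-the-car rule can be sketched as a simple multiplicative penalty. The specific speed and penalty constants here are illustrative assumptions, not the project's tuned values:

```csharp
// Illustrative sketch of the obstacle-hit slowdown rule: each hit
// multiplies the car's speed by a fixed penalty factor. The constants
// are assumptions, not the values used in the project.
public static class CarSpeed
{
    public const float MaxSpeed = 20f;    // assumed top speed
    public const float HitPenalty = 0.7f; // each hit keeps 70% of speed

    public static float AfterHits(int hits)
    {
        float speed = MaxSpeed;
        for (int i = 0; i < hits; i++)
            speed *= HitPenalty;
        return speed;
    }
}
```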

Music plays while the game is in progress, and its volume changes with the driver’s heart-rate: the higher the heart-rate, the louder the music, which impedes the players’ communication. To hear the passenger’s guidance clearly and control the car well, the driver needs to calm themselves down when the music gets louder. In addition to an onscreen BPM indicator, the color of the sky changes when the driver’s heart-rate exceeds 110 BPM, which is another reminder for the driver to calm down.
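As a sketch, the heart-rate-to-feedback mapping might look like the following. The linear 60-140 BPM volume ramp is an assumed curve for illustration; the >110 BPM sky threshold is the one described above:

```csharp
using System;

// Illustrative heart-rate feedback mapping. The linear 60-140 BPM
// volume ramp is an assumed curve; the >110 BPM sky-filter threshold
// is the one described in the how-to-play section.
public static class BiosignalFeedback
{
    // Scale music volume from 0.2 at 60 BPM up to 1.0 at 140 BPM.
    public static float MusicVolume(int bpm)
    {
        float t = Math.Clamp((bpm - 60) / 80f, 0f, 1f);
        return 0.2f + 0.8f * t;
    }

    // Tint the sky when the driver's heart-rate is high.
    public static bool SkyFilterActive(int bpm) => bpm > 110;
}
```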

reflection & future direction

This was my first time working with Unity, as it was for my teammates. We learned a lot about developing in Unity within the short timespan of a month, and still have a lot to learn. The project is unfortunately on hold for the moment; I am currently taking another Unity game course and honing my skills so I can address these issues in the future.

Some obstacles we encountered:

  • Our Unity project did not transfer cleanly between Mac and Windows. We had a lot of trouble importing from Mac to Windows while working in the lab, and ended up having to manually re-enter all of the code.

  • We spent a lot of time debugging and fixing driver display issues before we could continue with the project, as the lab computer did not meet the minimum spec requirements for VR development.

For future versions:

  • Implement networked multiplayer control. The two players’ views are currently switched manually (with a keypress).

  • Integrate a heart-rate-tracking bracelet. The heart-rate is currently adjusted manually (with a keypress).

  • Clarify that the BPM on the screen indicates the driver’s heart-rate. We may also need a clearer label than “BPM,” as the term may not be common knowledge.

Considering the short timespan we were given to learn Unity development and to create and import assets and audio, I believe we ended up with a fresh concept to iterate on further.