VR development blog 3 – Making the cut

In previous development blogs I wrote about technical solutions to problems that came up during my four-week game project at school. This time I would like to write about the design process during the project from the perspective of a single feature, namely the movement of the player: how we designed it in the beginning, how it changed during the project, and what finally made it into the game.

When creating a game, a designer often needs to reassess previous design decisions. Maybe something didn’t work as intended after testing, unknown technical limitations may be discovered, or the team may not have time to implement a designed feature. Any number of things may occur that force the designers to adapt and evolve the game as development continues. This happened to my team while we developed our VR game “Night Terrors”.


Night Terrors is a VR puzzle horror game where the player takes the role of a preschooler, afraid of the dark, who wakes up in the middle of the night from nightmares. The player’s goal is to reach the safety of the parents’ bedroom by finding light sources along the way, which illuminate the path forward and drive away the monsters hiding in the dark corners of the house.

Screenshot from the final build of the game.

We knew right from the start that our game needed movement of some kind. The core game experience was to explore a child’s playroom, turned scary after dark, through the eyes of a preschooler. The core loop of the game would be to explore the environment and solve puzzles. To avoid the problems of inertia and motion sickness when moving in VR, we decided that the movement would be teleportation of some kind. What we needed to solve now was:

  1. How to explain the teleportation to the player.
  2. How it would be controlled and used by the player.

Iteration 1: Throwing the teddy

Concept art by Johannes Palmblad

In our game you would play as a child scared of the dark. We thought that we could build on this by adding a teddy bear that the player would need to use throughout the game. That would tie into the theme of the scared child and the nightmare scenario very well. In the first iteration of the teleportation we experimented with making the teddy bear the center of this mechanic.

The child would be too scared of the monsters to walk freely. When the player selected a spot to teleport to, the game would close the player’s “eyes” and play a running/tapping sound to indicate that the child was running to the spot with his or her eyes closed. The game would then open the player’s eyes at the new location.
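The transition above is essentially a small scripted sequence. A minimal sketch of that flow (we built it in Unreal Blueprints; the Python class and method names here are invented stand-ins, not our actual implementation):

```python
class Fade:
    """Stand-in for a screen fade: 'closing' and 'opening' the child's eyes."""
    def __init__(self):
        self.black = False
    def fade_out(self):
        self.black = True
    def fade_in(self):
        self.black = False

class Audio:
    """Stand-in for the game's sound system."""
    def __init__(self):
        self.played = []
    def play(self, cue):
        self.played.append(cue)

class Player:
    def __init__(self, position):
        self.position = position

def blink_teleport(player, destination, fade, audio):
    """Fade to black, play footsteps, move the player, fade back in."""
    fade.fade_out()                  # close the child's "eyes"
    audio.play("running_footsteps")  # suggest the child is running
    player.position = destination    # relocate while the screen is black
    fade.fade_in()                   # open the eyes at the new spot
```

The key point is that the actual position change happens while the screen is black, which is what hides the instantaneous movement from the player.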

We really liked this: it explained why the player “teleported” around instead of moving freely, and it tied into the theme of the game very well. What we needed to solve now was how the player would initiate a teleportation and how to use it.

We decided to use a teddy bear. If the explanation for why the player could not move around freely was that the child was scared, the teddy bear would be a good fit as an aid for teleportation. The player would throw the teddy bear to where they wanted to go. The teddy bear would make that area safe so that the child could go there. The child threw the teddy bear and then ran to it. It seemed like a good idea…

… until a couple of questions came up. If the teddy bear made the child feel safe, why would he or she throw it away? What would happen if the player threw the teddy bear somewhere the player could not teleport? For example, high up on some unreachable furniture, or into a dark part of the room where the monster may lurk? How would we communicate to the player where it was and wasn’t ok to teleport? If the player threw the teddy to a spot where it wasn’t ok to teleport, how would teddy return to the player? Also, the physics engine of Unreal made it very hard to throw things with any precision, making it hard to hit the spot where you would like to go. We needed to rework this in some way.

Nothing in this iteration was implemented. The problems that we discovered came up in discussions and meetings and we decided to iterate further before we started to implement.

Iteration 2: Using the teddy

Concept art by Johannes Palmblad

We still liked the idea of using the teddy to move around, so we wanted to keep that. What we decided to cut was the throwing part. The player would now need to pick up and hold the teddy to be able to teleport. When the player held the teddy and pulled the trigger on the controller, a teleportation indicator would appear where the player was aiming the teddy. The player could then aim and select a spot to teleport to. The indicator would change colour and show other effects to indicate whether it was ok to teleport to that spot or not.
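The validity check behind the indicator is simple in principle: a target spot is only ok if it is both somewhere the player can stand and somewhere lit. A toy sketch of that logic (the colour values and spot names are invented for illustration, not taken from our Blueprint scripts):

```python
# Hypothetical indicator logic: green means the spot is a valid teleport
# target, red means it is blocked (unwalkable or in the dark).
VALID_COLOR = "green"
INVALID_COLOR = "red"

def indicator_state(target_spot, lit_spots, walkable_spots):
    """Return the indicator colour and whether teleporting is allowed."""
    ok = target_spot in walkable_spots and target_spot in lit_spots
    return (VALID_COLOR if ok else INVALID_COLOR), ok
```

In the real game the two conditions would come from a navigation query and a light-volume check rather than set lookups, but the shape of the decision is the same.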

This seemed like a good compromise. We kept what we liked about using the teddy and solved most of our earlier problems. There were no more problems with the physics engine, and the player would get clear feedback on where they could and couldn’t teleport. However, other problems quickly arose. We still had the problem of the player throwing the teddy into a dark spot and how to return it to the player. The player could also drop the teddy and lose it, or simply misplace it and forget it. We didn’t think these problems invalidated our solution, and we still liked it very much, so we decided to create a support system to solve them. The player would be able to press a button to get an indicator of where the teddy was if they lost it. The teddy would return if the player threw it into dark parts of the room. We were confident we could solve any such problems that arose. What ultimately changed our minds completely was testing this mechanic in the game.

What we discovered when playtesting was that many players misjudge distances in VR, especially since all assets in our game were scaled up to make the player feel small and childlike. When testers played the game, they would often misjudge distances and sizes of objects. For example, testers would often teleport too far away from a door when first trying to open it. The player would teleport to a spot where they thought they could reach the door, only to discover that it was an inch too far away. They would then need to readjust their position with a new teleportation to be able to open the door. This was particularly common in the beginning of the game with testers inexperienced with VR. As the tests went on they got better and better at judging distances, but every so often they would need to readjust when performing some task.

When teleportation was tied to holding and using a specific item, this adjustment and repositioning became a huge problem. Testers would often pick up the teddy with their dominant hand and teleport. Then they would drop the teddy to try to open a door or something else, and discover that they couldn’t reach. Then they would need to find where they dropped the teddy and pick it up, only to have the whole process repeat. Movement and teleportation became something of a hassle and took too much focus and time away from solving puzzles and taking in the environments.

In this iteration, we implemented mechanics of the player using the teddy bear to teleport around. We didn’t implement any of the support system that we designed to solve the problems.

Iteration 3: Being close to teddy

We decided to cut the requirement that the player hold the teddy to be able to teleport. Instead, the player would be able to teleport freely when in close proximity to the teddy. That way we would solve the problem of repositioning while still keeping the mechanic of the safe haven around the teddy. This iteration didn’t even survive a whole design meeting, and here’s why. Many problems quickly came to mind. We now had two systems that restricted the player’s movement: the proximity to the teddy, and the dark and light areas of the game. This felt weird and created a huge problem of how to communicate to players where and why they could and couldn’t teleport. Also, the indicator for the proximity to the teddy would need to be present at all times, making the game feel more “game-y” and less immersive.

This iteration didn’t get implemented at all.

Iteration 4: Cutting the teddy

By now you may wonder why we persisted in using the teddy, why we kept adding bandages to the feature and still kept it. We started wondering that too. The whole point of the feature was to create a bond between the player and the teddy, so that the player wanted to have the teddy close at hand. What we had created was a system where the player was annoyed with the teddy. It was a thing that the player needed, but didn’t want. The teddy had become a source of frustration rather than a source of safety.

Final teleportation

What we ended up doing was cutting the teddy from the mechanic altogether. The player can teleport freely without holding it and is only restricted by the light and dark areas of the game. This worked very well and kept the flow of the game going. The player could focus on moving around, exploring and solving the puzzles. The teddy is still in the game and the player can still pick it up and carry it, but it is no longer tied to teleportation or movement.

Kill your teddies, what we learned

What did we learn from this? Don’t be afraid of killing your darlings, or in this case your teddies. In hindsight, we may have clung to the idea of the teddy a little too long and should have cut it earlier than we did, but we did cut it when it felt necessary. We could also have playtested things earlier. Most problems were discovered during playtesting, and in this case we could have discovered the repositioning problems even before we implemented teleportation through the teddy. These are classic lessons in game design: fail early, test often, and don’t be afraid of cutting features if they don’t fit.

You have heard this many times, but it’s good to experience it first-hand. It also taught me the value of hard deadlines. If we hadn’t had a hard deadline, the danger of us continuing to add bandages to features would have been very high. The hard part is not recognizing that a feature is a problem; the hard part is deciding when to cut it and when to keep trying to fix it. A deadline forces you to think about these problems more critically, and to assess the value of features differently than you otherwise would.

All concept art in this post was made by Johannes Palmblad. His excellent portfolio can be found here.

Best regards!

Simon Engqvist

VR development blog 2 – The physics of doors

I have previously written about my approach to scripting how to pick things up in VR in Unreal Engine 4 for my four-week school project. This time I will write about how I took that approach and applied it to scripting a door with a door handle.

The goal is to create a believable door in VR, with the features of a real door: the player must pull down the handle and push or pull to open the door.

To reiterate the previous post: I used physics handles so that the player can pick up objects and move them around without losing collision on the object and without bad clipping problems. The physics handle “pinches” the object at the location where the player holds their hand and then follows the player’s hand. The handle drags the object along with it, allowing it to simulate collision and physics along the way.

I’m using the same technique here. When the player interacts with the door a physics handle will pull down the handle and the door.


To simulate the rotation of the door in its hinges and the handle in the latch bolt, I used the physics constraint component in Unreal. This component is used, as the name suggests, to constrain physics actors in different ways. The component has support for limiting an object’s motion both linearly and angularly. For the door I used only the angular limitation. I tested this out and got some nice results.
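Conceptually, the angular limitation just keeps the door's swing angle inside the range the hinge allows. A toy sketch of that clamping (not Unreal's actual component, and the limit values are invented example numbers):

```python
def clamp_hinge_angle(angle_deg, min_deg=-90.0, max_deg=90.0):
    """Clamp a door's swing angle to the hinge's allowed range, the way
    an angularly limited physics constraint keeps the door from swinging
    past its limits. Limits here are hypothetical example values."""
    return max(min_deg, min(max_deg, angle_deg))
```

Unreal's constraint does this inside the physics solver, with forces rather than a hard clamp, but the observable effect on the door is the same: however hard the player pushes, the angle stays in range.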


To keep the door handle upright I used a feature in the physics constraint component called motion. That feature applies a force on the constrained object in a specified direction. I applied an angular motion on the handle, pushing it back upward, and it was done.
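The effect of that drive is a restoring force: the further the handle is pulled down, the harder it is pushed back toward its resting angle. A crude numeric sketch of one such step (the stiffness and timestep values are invented, and this proportional model is my simplification, not Unreal's solver):

```python
def handle_drive_step(angle_deg, stiffness=1.0, dt=0.5):
    """One explicit integration step of a restoring drive that pushes the
    handle back toward its resting angle of 0 degrees. The torque is
    proportional to how far the handle has been pulled down."""
    torque = -stiffness * angle_deg  # pulls back toward 0
    return angle_deg + torque * dt
```

Stepping this repeatedly drifts any pulled-down handle back toward horizontal, which is all the constraint's drive needed to do once the player let go.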


What I did next was to apply the physics handle to the blueprint the same way as before. When the player puts their hand on the door handle, the physics handle “pinches” the door handle and moves it alongside the player’s hand, with the same approach as in the previous post.

Fine tuning

I now had the basics of the door down, but it still needed some fine-tuning to feel right in the game. The first task was adding a script so that the door can’t be opened unless the handle is pulled down; the second was giving the player some feedback that the door can be opened. I solved both problems with the same solution.

I added a little collision box at the top of the door, slightly to the side. This collision box only collides with the door and prevents it from being fully opened. When the handle is pulled down, the box is deactivated and the door can be opened. By placing it slightly to the side of the door, I created a little wiggle room for the door to move in while it is closed. This signals to the player that the door is interactable and movable when they grab the handle.
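The blocking-box trick boils down to a conditional limit on how far the door can open. A minimal sketch of that state (the wiggle angle is an invented example value, and the class is illustrative, not our Blueprint):

```python
class DoorLatch:
    """Sketch of the blocking-box trick: a collider near the top of the
    door stops it from opening until the handle is pulled down. Placing
    the blocker slightly to the side leaves a little wiggle room, which
    signals to the player that the door can be opened."""

    WIGGLE_DEG = 3.0  # hypothetical slack while the blocker is active

    def __init__(self):
        self.handle_down = False

    def max_open_angle(self):
        # Handle up: the blocker limits the door to a slight wiggle.
        # Handle down: the blocker is deactivated, the door swings freely.
        return 90.0 if self.handle_down else self.WIGGLE_DEG
```

In the engine, "deactivating the blocker" simply means disabling the collision box's collision when the handle passes its pulled-down threshold.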


More fine tuning

After this I needed to fine-tune the door even more. Lots of parameters, like the damping of the door and the specific angular motion of the handle, needed to be calibrated to get the right feeling. Most surprising to me was that the door felt much better when disabling gravity on both the handle and the door. I theorize that removing the extra downward force made the physics simulation of the door less complex when pulling or pushing it, and therefore more responsive.

That’s all for now!

Best Regards!

Simon Engqvist

VR development blog 1 – Picking things up

At my school we are currently doing a four-week VR game project. My group has chosen to make an atmospheric horror escape room game where you play as a kid trying to escape from the monster in the closet. In this post I’m going to talk about my system and implementation for picking up items and objects in the game and interacting with them. We are using Unreal Engine in this project, and hence we are using the Blueprint scripting language for almost all scripting.

My team consists of three designers. I have a lot of experience with coding and programming, but my teammates are less inclined to do the heavy lifting in the scripting department. Therefore it was natural that I was selected to be the lead programmer for this project.

The first thing I would like to write about is how I solved the problem of picking things up and interacting with them in VR.

The goals

I would like interaction with objects in our game to feel similar to the interaction in Job Simulator.

First approach

The first, simple approach to picking things up in VR is to attach the actor that you would like to pick up to the motion controllers themselves. Then, when you move the motion controllers, you move the actors as well.

The problem with this approach is that it messes up physics quite a bit. The motion controllers effectively teleport each frame, and since the game can’t stop the player from physically moving their hands, it must allow the player to move their hands through and within objects. The attached actor may then clip through other objects in the game. This breaks immersion for the player, but it also introduces game design problems. For example, the player could reach into a locked vault and pull things out. It introduces physics problems as well, since the player can move items inside each other, which causes all sorts of trouble for the physics engine in Unreal.

I needed to find another way.

Second approach

The second approach I decided to try was to take full advantage of the physics engine in Unreal and make the engine do all the heavy lifting for me when it comes to collision and physics. When I researched my options I stumbled onto this video that demonstrates physics handles in Unreal.

Physics handles in Unreal are a way of moving physics objects around; it’s precisely what I needed! It works like this: you specify to the physics handle the component you would like to grab and where you would like to grab it. The physics handle then “pinches” the object and holds it in place. When you move the physics handle, it moves the “pinched” object along with it, fully simulated by the physics engine! I experimented with this for a while and ended up with the following solution for moving items:

  1. Find the location where the player holds their hand on the object
  2. “Pinch” the object with a physics handle at that location
  3. Move the physics handle every frame alongside the motion control
  4. Release the “pinch” when the player drops the object
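The four steps above can be sketched as a tiny state machine. This toy version just mirrors the grab/tick/release flow in plain Python; Unreal's real physics handle applies forces through the physics solver, and all names here are invented for illustration:

```python
class PhysicsHandleSketch:
    """Toy version of the pinch-move-release loop. Positions are simple
    (x, y) tuples; objects are dicts with a 'pos' entry."""

    def __init__(self):
        self.grabbed = None  # (object, offset from object origin to grab point)

    def grab(self, obj, grab_point):
        # Steps 1-2: pinch the object at the spot where the hand touches it
        ox, oy = obj["pos"]
        self.grabbed = (obj, (grab_point[0] - ox, grab_point[1] - oy))

    def tick(self, hand_pos):
        # Step 3: every frame, move the pinched object so the grab point
        # stays under the player's hand
        if self.grabbed:
            obj, (dx, dy) = self.grabbed
            obj["pos"] = (hand_pos[0] - dx, hand_pos[1] - dy)

    def release(self):
        # Step 4: drop the object back to the physics engine
        self.grabbed = None
```

The offset stored at grab time is what makes the object follow the hand from the pinched spot rather than snapping its origin to the controller.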

This resulted in much better-feeling interaction with objects. The objects behave really well when colliding with other physical objects, and clipping is kept to a minimum. The implementation may still need some fine-tuning, but it will work for now!

Drawers and more…

I tried the same approach when implementing other objects that the player could interact with. The first of these was a drawer. I used a physics handle to “pull” the drawer along a physics constraint track. This resulted in some smooth drawers!
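The constraint track reduces the drawer to a single degree of freedom: however the physics handle pulls, the drawer only slides along its rail, and only within the rail's limits. A toy sketch of that clamping (the 0.4 m track length is an invented example value):

```python
def slide_drawer(pull_distance, track_min=0.0, track_max=0.4):
    """Sketch of a drawer on a linearly constrained track: the physics
    handle may pull in any direction, but only the component along the
    track matters, and the drawer stays within the track's limits."""
    return max(track_min, min(track_max, pull_distance))
```

Unreal resolves this with constraint forces rather than a clamp, but the resulting behaviour is the same: pulling past the end of the rail just leaves the drawer fully open.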

I think the combination of physics handles and physics constraints can be a powerful solution for future implementations of, for example, levers, buttons, steering wheels, etc.

I hope someone will find this interesting! Next time I will probably write about the interface connecting the hands with the interactable objects.

Best regards!

Simon Engqvist


My name is Simon Engqvist and this is the first entry in my development blog. I’m a game designer and scripter in training, and I attend the Future Games program for game design. Before that I worked as a programmer and IT consultant for four years.

Right now at school we are doing the second of three planned game projects. The first one was a two-week project, and you can read about the results of my group’s work in my portfolio. For this second project, we just finished a two-week pre-production and are going into a four-week production phase.

The theme for the second project is virtual reality, with the Oculus Rift headset and Touch controllers, using Unreal Engine. It’s a very exciting and challenging theme that will require a lot from everyone in the groups. My group has decided to make an atmospheric horror escape room game where you play as a scared preschooler trying to escape from the monster in the closet. Over the following weeks I will try to post some development diaries with implementation examples and thoughts about the development process of the game. The first post, which I will publish shortly, will be about my system for picking up items and props to use and interact with them. Please stay tuned.

Best regards!