Fire Planet: Documentation

Project Description
Fire Planet is a small narrative experience that places users in the role of a firefighter on an alternate world engulfed by wildfire. In this world, civilization has been reduced to living under a protective dome, with water sprinkler units guarding its outer perimeter. Placed in the midst of a catastrophe – one of the sprinklers has malfunctioned and wildfire is approaching the city – the user must use their powers (in the form of magical projectiles) to extinguish the flames and fix the sprinkler. This reflects an alternate-world activity: it is an everyday action for the firefighter/protector of this world, yet it takes place in a setting quite distinct from the reality we know.

Process and Implementation
Brainstorming
The brainstorming process for this project with Will and Steven was time-consuming yet fruitful. We started by pitching any action that occurred to us and that we thought would be interesting to explore in our VR game. After much deliberation – and after considering several crazy options that in retrospect would have been too ambitious to complete – we realized that what we were missing was an experience that would fully capitalize on the advantages and affordances of the VR medium. This realization eventually led us to our chosen concept: a protector/firefighter who must defend their city from flames. The user's action in this scenario was fanning two large fans to extinguish the flames, and we believed fanning something in the air would be an interesting game mechanic, especially in VR.

In terms of what would be everyday in this world, we decided that the action of extinguishing flames (suggesting that fire is dangerous in this world too) would work well to convey the user's objective. The alternate-ness would then emerge through the means of putting out the fire, along with the setting itself (a vast, desert-like planet with a dome-encased city in the distance). Adapting this concept to a non-VR game was easy – instead of fanning the fire out, the user would now throw an orb at it. We made this change because we realized it would be more intuitive for users to shoot something with the keyboard than to fan.

This storyboard was the result of our brainstorming session, showing the location of the user in between the wildfire and the city.

Implementation

The first step we took once we finalized this concept was to divide up the components we had to build to carry the game out successfully. These were the components:

1. Environment

The environment was the most straightforward part, as we all had experience with it from Project #1. Steven was in charge of bringing it to life – adding the dome and the city, using a rugged landscape hinting at the alternate-ness of the environment, adding fog, and finally adding the sprinklers and the fire. Although this environment didn't require many components, their careful placement was crucial in framing our narrative. For instance, having a wall of fire behind the water particles suggests that those flames are contained to their space and are thus safe, while placing other fires in front of the user signals that these are the ones dangerous to the city and must be extinguished.

Rugged terrain surrounded the user
A distant city surrounded by a dome could be seen directly opposite from the fires
Opposite the city the user could see the fire wall, stopped by the sprinklers

2. Interaction 

The interaction was further divided into several parts: rendering the predicted projectile path for the magic sphere and shooting an object along that same trajectory, detecting collisions between the user's projectiles and the wildfire, and triggering the water sprinkler's reboot when the user gets close to it. Of these, my main role was rendering the predicted projectile path and enabling shooting along it. To do this, I followed a tutorial that demonstrated how to create a line-rendering script as well as a spawning script that shoots projectiles along that same trajectory. The script was also flexible enough to fully customize the look, location, and angle of the line, as well as to change which object would be shot, making these components easy to combine with Will and Steven's work.
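The math behind a predicted projectile path like this is simple kinematics: each point on the arc is the launch position plus the launch velocity scaled by time, plus the gravity term. A minimal Python sketch of the idea (the parameter values are assumptions; our actual implementation is a Unity LineRenderer script based on the tutorial, not this code):

```python
# Sketch of predicted-path sampling: p(t) = p0 + v0*t + 0.5*g*t^2.
# Works in 2D (x = horizontal, y = vertical) for illustration.

def predict_path(p0, v0, g=-9.81, step=0.1, n_points=20):
    """Return (x, y) points along the projectile's predicted arc."""
    points = []
    for i in range(n_points):
        t = i * step
        x = p0[0] + v0[0] * t                      # constant horizontal speed
        y = p0[1] + v0[1] * t + 0.5 * g * t * t    # gravity bends the arc down
        points.append((x, y))
    return points

# Example launch: 1.5 units above the ground, thrown up and forward.
path = predict_path(p0=(0.0, 1.5), v0=(8.0, 6.0))
```

Feeding these sampled points to a line renderer produces the visible arc, and spawning the projectile with the same initial velocity guarantees it follows the rendered prediction.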

Line render showing the predicted projectile path hitting another object

3. Character 

Since we were using a first-person non-VR character, we also wanted to show the character's hands and a throwing motion hinting that the user themselves was generating these magic spheres. When we combined our work, we made sure to sync the hand animation with the shooting of the orbs.

Throwing animation with orb

4. Combining all of these to create our story

After getting the foundational interactions done, a lot of time was spent tweaking the experience and adding enough information and hints for users to understand the story they were placed in the midst of, while also being able to apply the game mechanics to fulfill their objective. Due to the linearity of our experience, this was an aspect we really struggled with in the later stages of development. At first, we weren't sure it was obvious to the user which fires they were meant to extinguish and which were actually contained by the sprinklers. At one point we even changed the story completely, removing the sprinklers entirely so that the only goal was to protect a tree that had caught fire. In the end, after much deliberation, we decided to stick with our original idea while making small yet effective design changes to make the story clearer and more intuitive. We replaced our sprinkler object with a flashier one (that even included an animation!) and added a large, bright cylinder around it that would always indicate to the user which direction to go.

A closer look at the water turret with the surrounding cylinder

To keep components separate from the environment's objects visually consistent, we matched the look and color of the cylinder to that of the line rendering. We were also very careful with our choice of narration – we didn't want it to sound like instructions being read on screen; we wanted the speaker to feel like another character in the story, building out the universe they belong to. Through a carefully written script, we tried to give enough context about what was happening while also tying in components that would otherwise have seemed random or misplaced (such as the water fountains). We also edited the narration in Audacity to make the voice sound as if it were coming from a communication device, an effect enhanced by the white noise and static we mixed in.

Reflection/Evaluation
Overall, I feel that we did achieve an alternate version of this activity, even if it was a very specific one: putting out fires by throwing a magical spell at them. As mentioned earlier, much of the latter part of development was spent ensuring that the experience offered enough affordances for players to carry out their mission. Obvious indicators, such as the turquoise cylinder and the color-matched projectile line rendering, were key in establishing a relationship between the short-term objective of putting out the fires by aiming at them and the long-term goal of reaching the broken water sprinkler unit. Placing the extinguishable flames in a loose line leading toward the broken sprinkler was also an effective choice that naturally guided users to their end goal. Though the transition away from VR was hard at first, I'd say the medium ultimately didn't majorly affect the implementation of our idea. Our story was there – we just had a slightly different way of telling it. Finally, I feel that the end result really reflects what most (if not all) of our team members had pictured. Initially, we each had our own mental image of how the experience would look and feel, but our game combined all of these conceptions really well, which I am very happy about.

Agency question: 

In Fire Planet, a “meaningful” action we designed for the user is the ability to throw magical orbs, particularly for extinguishing fires. This action is triggered by aiming with the mouse and pressing the spacebar to shoot. Its design incorporates several aspects beyond this input mechanic. The projectile path render facilitates aiming at different objects, since the position of the mouse on the screen does not necessarily reflect the game's raycast aim. The positioning and throwing animation of the hands that gets triggered when the player shoots further situates the player in their character. We didn't want the magic orbs to seem as if they were appearing out of nowhere, which is why this hand motion was crucial to situating users in the role of our firefighter. The meaningfulness of the throwing action comes from what it enables: reaching the broken sprinkler and fixing it. In a way, this action is a crucial plot device for our story. Additional feedback from throwing the orb – smoke emerging from the hit location and a sizzling sound when a fire is extinguished – also enhances the experience of carrying out this action.
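The hit-feedback loop described above can be sketched as a small event handler: when an orb collides with a burning fire, the game plays the feedback effects and puts the fire out. This Python sketch is purely illustrative (the class and effect names are hypothetical; the real project implements this with Unity collision callbacks, particle systems, and audio sources):

```python
# Hypothetical sketch of the orb-hit feedback loop.

class Fire:
    """A fire object that can be extinguished by an orb."""
    def __init__(self):
        self.burning = True

    def extinguish(self):
        self.burning = False

def on_orb_hit(target, effects):
    """Collision handler: play feedback, then extinguish the fire."""
    if isinstance(target, Fire) and target.burning:
        effects.append("smoke")    # smoke plume at the hit location
        effects.append("sizzle")   # sizzling sound cue
        target.extinguish()

fire = Fire()
effects = []
on_orb_hit(fire, effects)
```

Tying the feedback directly to the collision event keeps the cause-and-effect of the throw legible: the player sees and hears the result of their action at the exact spot they aimed at.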

Game demo:
