Documentation Project 2

Project Description

The activity we chose as the basis for the interaction, and the environment in which it is carried out, is pressing a light switch. It is an action most people become familiar with early in their lives, and they naturally develop the assumption that the lighting of a room should change when a switch is pressed. We decided to work with this natural, intuitive interaction that most people have grown up performing and add an element of surprise and unexpectedness. After a switch is pressed, the surroundings and the environment suddenly change. Although the user begins inside a regular kitchen and dining area, after exploring the room and interacting with the two switches placed in it, the user learns that the space becomes irregular once the switches have been pressed. While one switch turns the lights in the room on and off, indicating the more natural and common interaction, the other causes fireflies to emerge from the plants and gradually fill the room. This response transforms the room into an alternate world and gives the simple interaction of pressing a switch more capability, as the user now learns that there are more possibilities than the commonly assumed one.


Process and Implementation

Our brainstorming sessions started while we were still preparing the project for a virtual world using a VR headset and controllers. The initial idea was to illustrate a morning routine and the effect a morning coffee has on a person's mind and imagination as it comes back to life after a night's sleep. We were first asked to come up with three interactions, and all three we chose were part of a regular morning routine that many people share. At first, users would find themselves in a dark hallway where the only thing in front of them would be a door. The back of the hallway would be dark, and access to the rest of it would be restricted in the VR environment. Using visual cues such as an apparent handle, we would invite users to interact with the door, eventually opening it and finding themselves inside a kitchen. There would be different appliances and furniture in the kitchen, yet some objects would have stronger visual cues than others, guiding users to explore them in more depth. The most apparent object would be a light switch responsible for turning on the lights in the room, just as a person does when walking into a dark kitchen in the early morning. The next and last interactive object would be a coffee machine. The interaction would be to grab a coffee cup, and at that moment the environment would suddenly change: the color ambience of the room would become more vibrant, plants in the room would go from dead to green, and so on. This reaction would represent the effect of the morning coffee on a person's mind.


Unfortunately, we all had to adapt our initial ideas and narrow them down to just one interaction that would no longer be carried out using a VR headset. After discussing with my groupmates, we decided to keep the interaction of pressing a light switch, as we thought it was captivating and could still carry out some of the ideas we had previously brainstormed. We therefore changed the concept of the project and decided to make an alternate switch world where, instead of only the regular action, pressing a light switch would also trigger more unexpected and unusual events, such as turning fireworks outside the window on or off, or filling the room with fireflies. We carried on with the idea of the fireflies instead of the initial idea of a color change or dead plants, as we liked the visual effect and feel that the fireflies added to the room. We also added fireworks outside the window, which can be turned off with the light switch that also controls the lighting, again suggesting an alternate action triggered by an everyday object. The user starts in a room that is lit up and has fireworks, but can then turn off the lighting and the fireworks and move on to the next switch to fill the now dark room with fireflies.


The everyday component in our world is the initial environment and the room itself, as well as the action carried out in this setting. The kitchen and dining area was made to look as realistic as possible, with normal, everyday appliances and furniture. The light switches look like regular switches, and we worked hard to make the visual effect of pressing a switch look natural: the switch flips up or down when the interaction happens. Up to the moment the user realizes that unexpected reactions are happening in this world, they would simply think it is a regular everyday setting. However, once they notice their control over the fireflies and the fireworks, they realize that the world is more alternate than everyday.


It was definitely a challenge to implement our interaction using a keyboard instead of a Vive system. When we were thinking about how the user could use the Vive controllers to press the light switch, it seemed a more intuitive interaction, as the user could literally reach toward the switch in the world and then physically press a button on the controller. Because of the change to a keyboard, we had to come up with a different solution. For moving around we used the WASD keys, as they are very intuitive in the gaming world. To guide the user on how to press the switch, we included some visuals, and a text box pops up on the screen with instructions to press 'e' on the keyboard to flip the light switch.

initial moodboard before project alterations
initial storyboard before project alterations
initial game scene after project alterations
the switch asset
updated game scene with ceiling lighting on – kitchen area
dining area
ceiling lighting off – kitchen area
before entering the switch area
when entering the switch area
the fireflies starting to fill in the room


Reflection/Evaluation

I think we managed to achieve an alternate world version of our chosen activity, because the results of the interaction are very different and unusual compared to what a user would normally expect from pressing a light switch. To make the experience more intuitive for the player, we included a small but, in my opinion, necessary visual cue: small lights on the switches that draw the player's attention to the switch area. In that way, we tried to design a more intuitive experience for the player, just as with the text that appears when the user enters the switch area and is able to press a key to turn the switch on or off. Overall, I think the end result was in line with how we had envisioned the revised version of the project. We used the keys we thought would be most intuitive in a keyboard world. However, I think that if we had been able to use the Vive headset and controllers, the medium would have allowed for a more thorough and memorable experience, both in terms of controlling the interactions and in terms of immersion within the space.


Agency Question

The action we designed for the user is the interaction with the light switch. We searched thoroughly for the most appropriate light switch asset so that it would look intuitive and natural and the user would not confuse it with something else. We wanted the user to start off in the kitchen, look around, and immediately notice that there are two light switches located on either side of the room. We then added the small lights that indicate to the player that something awaits them in the switch area, subtly inviting them to move toward the switches. Once the user enters the area, a text box appears on the screen with instructions on what to do next. Ideally, if we had been using the Vive setup, we would have wanted the user to move their hand toward the switch and then press a button on the controller. However, as this option was no longer possible, we took a step back and looked at the simplest conventions of game design. In our opinion, an effective and intuitive way to communicate something in games is a subtle text box with instructions, so that is what we chose. Lastly, we made sure that the switch changes position after it is pressed, so the user feels that some action actually took place. We anticipated that the user would otherwise feel a sense of disappointment and incompleteness if the state of the switch did not change. We think these steps allow the user to feel that the action was meaningful: they are guided toward the switches by visual cues, and they anticipate something happening because of their real-life expectations when seeing and pressing a light switch.

Fire Planet | Documentation

Description: Fire Planet is an experience/game that takes place on a planet engulfed by never-ending fires. To ensure the security of the mysterious civilization on this planet, a dome has been erected around their city, along with powerful water sprinklers to fight off the constantly encroaching fires. The user assumes the role of a protector of the planet, using mysterious magical powers to shoot projectiles that put out the planet's flames. In this scenario, one of the sprinklers has malfunctioned, causing the fires to move toward the dome. The user is tasked with putting out the fires and reaching the broken sprinkler to fix it and ensure the security of the city.

Brainstorming: We started brainstorming by discussing which everyday actions would be interesting to replicate in VR in an alternate reality. We discussed ideas such as picking up trash and throwing balls, and we eventually settled on designing an experience around the action of using a hand fan. This would entail swinging the VR controller from left to right or up and down. From this, we came to the concept of using a powerful hand fan to blow away or put out fires.

Initial scene brainstorm
I envisioned the fan to resemble something like this (not sure if this is what my group mates envisioned)
This is a character from the anime Naruto who swings a powerful fan

Because a hand fan powerful enough to put out large fires is not realistic, we created a concept revolving around firefighting in an alternate scenario. We decided to create a narrative about a civilization on another planet constantly threatened by approaching fires.

Due to the changes we had to make to the project, we decided to simplify it from waving a fan, which we believed would not be as compelling an experience with a mouse, keyboard, and computer screen, to simply aiming and throwing orbs of particles that put out the fires. Because of the disconnect between the motion of throwing or aiming and using a keyboard/mouse to do it, we found it hard to make the action intuitive beyond most people's ingrained experience of using a mouse and keyboard to play video games. But in this alternate scenario, the action of pushing the hand forward mimics that of throwing something, and we found it vital to add this hand animation to the experience.

Process: We divided the work, with me doing the scene design and character animation, Will the particle interactions, and Mari the projectiles. We then worked together to bring all of our parts together and finish the project.

For the scene design, we all had a desolate planet in mind with no vegetation. From this, I decided to make the environment dark and the terrain rocky and dark. I designed the city in the background using a sphere with a transparent material, and used some building assets along with bright lights to contrast it against the desolate environment the user is standing in. I also added a red spotlight emitting from the base of the dome to tint the environment red and add some urgency to the actions required of the user.

Animating the hands was something we believed had to be done so that the user would feel like the firefighter. I initially started experimenting with models from Mixamo but found them difficult to control, and I could not figure out how to remove certain parts of the avatar so that it would not obstruct the camera's main focus on the hands. With Sarah's advice, I found a way to use the VR hands and figured out how to add simple animations (point, closed fist, open hand) to them. I also felt that the default skin of the hands fit nicely, with red and black gloves resembling the outfit a sci-fi game character would wear.

Lastly, I worked on creating the orb that would be shot at the fires. For this, I created a simple particle system using Unity’s Visual Effects Graph and attached this to the projectile game object that Mari made.

To tie this all together, we added a voiceover to greet the user and provide some context for the experience, as well as a closing voiceover when the user completes the task. We also played around with how we could best guide the user towards the sprinkler they needed to fix. We did this by adding a semi-transparent blue cylinder with the sprinkler inside. We also added an animation for when the user reaches it: the sprinkler springs up and jets of water begin shooting out of it to reaffirm that the user completed the task.

Screenshots

Gameplay:

Reflections:

Overall, I had a great time doing this project and I am happy with how it turned out. It was difficult to translate motions designed for the VR controllers into a mouse/keyboard experience, and I am glad that we still mostly stuck with our initial concept, because it was something I really liked. I am happy with how our game design turned out and with the cues and instructions we provided the user. Positioning the fires and the cylinder ahead of the player made it intuitive that the user must go forward to complete some objective.

Agency Question:

I believe we gave the user agency that compelled them to act in a certain way by giving them the ability to react to the environment; in this case, they have the complete ability to extinguish the fires. Because fire evokes an immediate response of fear and danger, we believe a user's immediate reaction would be to put out as many fires as quickly as possible. We spawned the user right next to the fires, facing the objective, so that their instinct, even without any cues from the voiceover, would be to put out the fires. Furthermore, we placed the fires more or less in a line toward the objective, so the user extinguishes them one by one until they reach the blue cylinder. Lastly, we wanted the final interaction of entering the cylinder to be a rewarding one, delaying the animation of the sprinkler turning on and the ending voiceover so that it is apparent the user accomplished their job and successfully controlled the environment's dangerous fires.

Project 2 Documentation: Boxing with Ares

PROJECT DESCRIPTION

  • Team members: Neyva, Nhi, and Vince
  • Environment: dark and ominous, yet there is still an indication of hope
  • Boxing with Ares is an interactive experience that aims to immerse users in a bloody reality. We focused on creating an environment similar to the one in the movie The Matrix: simple, yet giving users a sense of exploration by interacting with what lies in the scene. Our scene consists of plain ground with foggy surroundings, a red, bloody sky scattered with small floating punching bags, and the main punching bag for interaction. Our basic interaction is a punching action with an emphasis on unexpectedness: doves fly out at different angles, speeds, and positions around the punching bag. While the interaction is an analogy for finding hope in the darkest time, it is also open to the interpretation that hope is escaping the user's reach, impossible to capture. In either interpretation, this serves as an alternate reality experience that deals with different manifestations of hope, be it the shining beacon in the darkest of times or the flickering illusion that forever remains out of human reach.
Dark, ominous, and bloody environment
Movie scene in The Matrix

PROCESS & IMPLEMENTATION

  • Our initial idea for the project was completely different from our final one, though the punching action remained the same. In our first sketch, we planned three interactions:
  1. Punching the bag: every time the user punches the bag, besides the normal oscillation, white or black doves would magically fly out. The white doves represent peace, and the black doves represent the concept of war.
  2. Pressing a button: this button would change the color of the doves. Every time the user presses it, the color changes from white to black and vice versa.
  3. Theater stage: at the back of the stage, we considered putting words/colors/pictures to reflect the theme of our project.
Our first sketch for our project
  • After our first idea presentation and the switch to online classes, we decided to constrain our project to one interaction, the punching bag with doves flying out, and replaced the theater stage with an ominous, empty environment.
  • Our idea for the environment was inspired by the photo below.
Environment
  • After deciding on the final idea for our project, we divided the work as follows: Neyva worked on building the environment, while Vince and I worked on developing the interaction.
  • Our final environment reflects the identity we wanted to give this project. Using a grid to lay out small punching bags dotted across the dark, bloody sky, and using particle systems to create the fog effects in the scene, we built the environment as an evocation of the darkest times, creating a feeling of loneliness as the player stands and observes the scene.
Red bloody sky
Small punching bags
  • For building the interactions, we wrote five main scripts (a simplified sketch of the collision logic appears below):
  1. animationController: triggers the punch animation whenever a punch happens. We also added a Punch class that returns anim.GetCurrentAnimatorStateInfo(0).normalizedTime in its getter.
  2. collisionDetector: if we detect a collision while anim.GetCurrentAnimatorStateInfo(0).normalizedTime < 1, we play the punching sound as if the player punched the bag.
  3. RaycastTracking: we used a raycast to brighten the color of the punching bag whenever the player looks at it. This aims to attract the player's attention and serves as an invitation to interact.
  4. ChangeColor: changes the emission color. In the end, however, we decided not to use this feature, because the fog we added with the particle system made the change no longer obvious.
  5. BirdGenerator: we created a Bird class and encapsulated all the attributes in it; when we dynamically create the birds, we assign random values to their attributes. This keeps things consistent and makes it easier to add attributes in the future.
  • Camera: we used a First Person Controller camera, and the boxing man is a child object of the First Person Controller. By doing so, when we move the mouse, the boxing man moves accordingly. We also positioned the camera and limited the looking angle (x rotation: -60 to 45) so that the player can only see the hands of the boxing man and the space above.
Camera
  • Animation: we have two animations in this project: boxing man animation and bird animation.
Bird prefab
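Of these scripts, the collisionDetector is the most central to the punch interaction. A simplified, hypothetical sketch of how it might look in Unity (component names, the tag, and the bird-spawning step are illustrative assumptions, not our exact code):

```csharp
using UnityEngine;

// Simplified sketch of the collisionDetector idea: when the glove hits
// the bag while the punch animation is still playing, play the punch
// sound (and spawn doves). Names here are illustrative placeholders.
public class CollisionDetector : MonoBehaviour
{
    public Animator anim;          // animator running the punch clip
    public AudioSource punchSound;

    void OnCollisionEnter(Collision collision)
    {
        if (!collision.gameObject.CompareTag("PunchingBag")) return;

        // normalizedTime < 1 means the current punch clip has not finished
        if (anim.GetCurrentAnimatorStateInfo(0).normalizedTime < 1f)
        {
            punchSound.Play();
            // ...here the BirdGenerator would spawn a batch of doves
        }
    }
}
```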

Sound effects: we added a background sound (https://www.youtube.com/watch?v=Qm-El3qztgw) and a punching sound to make the user experience more immersive and responsive.

REFLECTION & EVALUATION

This project was an amazing learning process and a great team collaboration. Though our final project looks different from what we envisioned at the beginning, in my opinion it is a successful adjustment to the feedback and improvements made during development. The process of writing and debugging these scripts was frustrating but rewarding in the end. Being able to understand the resources we found in class and online and adjust them for our project was definitely a great learning experience. While we could not use VR for this project, it still reflects what we wanted to deliver, and it turned out even more successful than we thought it would be.

AGENCY

Agency has been described as "the satisfying power to take meaningful action and see the results of our decisions and choices" (Murray) and as present when "the actions players desire are among those that they can take" (Wardrip-Fruin). In our project, we chose to create an environment where only a punching bag faces the player upon entering the scene precisely to invite them to interact with it. While the punching bag immediately captures the player's attention and hints at the punching action, it also triggers confusion and hesitation in this ominous environment. Every time the player punches the bag, a punching sound effect gives the player a more real and powerful experience. The more they punch, the larger the number of birds flying out. Yet the birds fly away from the player, out of reach. This reflects the interpretations mentioned in the Project Description, prompting the player's own thoughts and reflections.

Documentation for Project 2: Switch World

The activity we used as the basis for the interaction was the act of turning a light on and off. To put this interaction to use, we needed an environment in which such an action would be normal. Thus, we came up with a simple room/kitchen with two light switches on the side. One light switch is completely normal: flipping it turns the light on and off. The other light switch, however, prompts fireflies to appear from the plants. This is our primary alternate-world activity, where flipping a light switch creates something unusual. The light switch that controls the lighting also controls the fireworks that the user can see outside the window.

When brainstorming, we wanted to create an environment that incorporated our daily interactions in the real world. We started off with actions such as turning off the alarm in the morning, waking up and opening the door, turning the light on/off, and grabbing a cup of coffee. In the beginning, we assumed we would be using the Vive system; however, due to outside circumstances, we had to stick with Unity only. We ultimately decided to choose turning the light on/off as the main interaction. We felt this action created more possibilities for exploring an alternate reality: by flipping a particular light switch, we could have different events show up. In our alternate world, most things were just like normal. The interaction itself was also ordinary, but what happened as a result of it was not. Since our main mode of interaction with the world was, unfortunately, the keyboard, the interaction was simply pressing 'e' when the user was near the light switch.

Figure 1: Initial Idea with 3 interactions
Figure 2: First take on kitchen

The steps to implement the interaction were quite simple. We primarily used the OnTriggerEnter and OnTriggerExit callbacks to determine whether the user was near the light switch. Once the user was within the switch's box collider, simply pressing 'e' would turn the light switch on or off. The fireflies fell under a similar script, but we used a particle system to create the effect. A simplified sketch of the switch logic follows.
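A minimal sketch of what this logic might look like (object names, the tag, and the UI reference are placeholders, not our exact script):

```csharp
using UnityEngine;

// Hypothetical reconstruction of the switch logic described above:
// a trigger collider around the switch tracks whether the player is
// in range, and pressing 'e' toggles the light. Names are illustrative.
public class LightSwitch : MonoBehaviour
{
    public Light ceilingLight;          // the light this switch controls
    public GameObject promptUI;         // "press 'e'" text shown on screen
    private bool playerInRange = false;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            playerInRange = true;
            promptUI.SetActive(true);   // show the instruction text box
        }
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            playerInRange = false;
            promptUI.SetActive(false);
        }
    }

    void Update()
    {
        // toggle the light only while the player stands in the switch area
        if (playerInRange && Input.GetKeyDown(KeyCode.E))
            ceilingLight.enabled = !ceilingLight.enabled;
    }
}
```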

I definitely feel that we achieved our goal of creating an alternate world. Turning on a light to generate fireflies or to start fireworks in the middle of nowhere is not something we see in our world. The design was meant to create a serene experience in which the user is simply there to observe. There is no real goal for the user; they have the freedom to explore the room and check out the effects of the lights. To guide the user, there are GUIs asking them to press 'e' on the keyboard when they are near the light switch. Additionally, there is a light on the light switch to prompt the user to explore. I felt the end result coincided with what we originally envisioned: we took an everyday interaction and turned it into something unordinary. However, I felt the medium itself took away from the implementation of the idea. The motion of flipping a light switch is quite distinct; it is something we are familiar with. We could not quite encapsulate the action with our medium, as we could only use mouse clicks or the keyboard. Perhaps if we were using the Vive, we could make the action much more realistic. Pressing 'e' on the keyboard and seeing a change does not create the same sense of bewilderment as actually flipping the switch would.

Our primary action was turning the light switch on/off. We gave the user freedom to move around within the room; there is nothing else they can do except move around or flip the switches. To prompt the user to carry out the action, a GUI appears in the top-left corner of the screen when the user approaches a light switch. Additionally, there is a light on each light switch to spark curiosity.

Figure 3: GUI when near light switch
Figure 4: Completed look of kitchen
Figure 5 : Fireworks
Figure 6: Fireflies
Figure 7: Lights turned off

Documentation: Project 2

Description

Link to Github repo

Link to executables

For this project, we decided to use the act of calculating as a basis for our interactions. The idea is that the user starts in an everyday setting — in our case, a bedroom — but upon interacting with a calculator on the desk, the user is transported to an alternate world where the calculations will occur. In this world, calculations are performed by dragging and snapping blocks together rather than pressing keys in a “2D” setting.

Process & Implementation


Conceptually, the ideas were more or less there from the start. Most of the difficulty came in implementing the systems and dividing up the work. Tiger and Keyin worked on the bedroom world, Yeji worked on the alternate world, and I worked on scripting/back-end. Most of the interactions were made with Rigidbody physics and MonoBehaviour.OnMouseDown. The world switching was achieved by placing both worlds in the same position and toggling their enabled state when the calculator was clicked (a minimal sketch of this is below). To achieve more visually pleasing graphics, we used LWRP and the Post-Processing Stack, as well as several PBR shader graphs.
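A minimal sketch of that world-toggling approach, assuming the two worlds are parent GameObjects and the calculator has a collider (names are placeholders, not our exact code):

```csharp
using UnityEngine;

// Minimal sketch of the world-switching approach described above: both
// worlds occupy the same position, and clicking the calculator toggles
// which one is active. Object names are illustrative assumptions.
public class CalculatorSwitch : MonoBehaviour
{
    public GameObject bedroomWorld;
    public GameObject calculatorWorld;

    // Built-in MonoBehaviour callback: fires when this object's collider
    // is clicked with the mouse.
    void OnMouseDown()
    {
        bool inBedroom = bedroomWorld.activeSelf;
        bedroomWorld.SetActive(!inBedroom);
        calculatorWorld.SetActive(inBedroom);
    }
}
```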

The development journal contains videos of the process of creating the dragging/snapping interactions.

Reflection/Evaluation

I feel that we successfully provided an alternative interpretation of the act of calculating. Although the idea itself is relatively novel, the implementation works well because the actions used (dragging, clicking, moving) are intuitive and easy to learn. The end product was much more robust than we initially expected, despite the fact that we couldn't use VR. For example, we initially never thought about including interactions in the bedroom world because we were primarily focused on interactions in the calculator world. Perhaps the lack of VR even allowed for more freedom of implementation; after all, we no longer had to worry about VR-induced motion sickness and limitations in movement when programming the worlds.

Agency

The interactions that best enable user agency are those found in the bedroom world. Upon entering the bedroom, the user is first encouraged to walk around. The user will then likely bump into the chair, which will move in response. The user is then encouraged to interact with other objects in the scene by dragging and clicking around; by making most of the items in the scene interactive or responsive in some way, the user is able to feel that they have a great degree of influence over the environment. This helps contextualize the sense of freedom that the user is meant to feel in the alternate world too and perhaps serves as a way to let users lose themselves in this environment.

Development Journal – Project 2: 3D Calculator

For this project, I teamed up with Ben, Keyin, and Yeji. Our first discussion led to quite a few different ideas (as shown in the bottom-left corner of the picture below), among which we settled on one about a "3D calculator".

Storyboard by Ben

The idea originated from our brainstorm of everyday activities, where Ben came up with coding and programming. He suggested we could alter the action of programming in an alternate reality by making it more intuitive and graphic. Instead of typing, the programmer could drag around cubes that represent different functions or values and put them in sequence to express algorithms. I liked the idea but thought programming wasn't "everyday" enough, so we later switched to the idea of calculating with a calculator, which resembles programming in its mathematical and logical nature.

Basically, the core idea is to reimagine the interface of logically creative processes within a VR context; we are only using a calculator as an example to present it. In order for the experience to feel "everyday" in the alternate reality, the user will first find themselves in a very normal bedroom scene with a calculator in front of them. Once they touch it, the scene switches to a sci-fi-ish environment where cubes float in the air, tempting the user to drag them around, combine them, or separate them.

3/16 Edit: I saved this as draft but forgot to post it on the due date

Development Journal: Interaction Project

For this interaction project, we initially had several outdoors-oriented ideas to work from, including camping, gardening, and rock climbing. Eventually, however, after some consideration, we decided to go with a 3D block/cube-based calculator. In this environment, the user would be able to drag and snap blocks together to calculate results from the blocks' contents. The idea was to focus not only on interactions between the user and the blocks (e.g. dragging, as sketched below), but also on interactions between the blocks themselves. For example, if you were to drag blocks with 1, +, and 1 together, you would get a result of 2. We also decided to have some sort of teleportation/environment-switching mechanic in which a small calculator could be clicked to move from a "normal" office environment to the block-snapping environment.
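A rough sketch of how the dragging and snapping could work, under the assumption that blocks are dragged at a fixed depth from the camera and snap next to the nearest block on release (thresholds and offsets are illustrative, not our final implementation):

```csharp
using UnityEngine;

// Hypothetical drag-and-snap sketch. Requires a Collider on the block
// so the OnMouse* callbacks fire. Values are placeholder assumptions.
public class DraggableBlock : MonoBehaviour
{
    public float snapDistance = 0.5f;   // how close counts as "snapped"
    private float depth;                // distance from camera while dragging

    void OnMouseDown()
    {
        depth = Vector3.Distance(Camera.main.transform.position, transform.position);
    }

    void OnMouseDrag()
    {
        // follow the mouse at a constant depth from the camera
        Vector3 screenPos = new Vector3(Input.mousePosition.x, Input.mousePosition.y, depth);
        transform.position = Camera.main.ScreenToWorldPoint(screenPos);
    }

    void OnMouseUp()
    {
        // snap beside the closest other block within range, if any
        foreach (DraggableBlock other in FindObjectsOfType<DraggableBlock>())
        {
            if (other == this) continue;
            if (Vector3.Distance(other.transform.position, transform.position) < snapDistance)
            {
                transform.position = other.transform.position + Vector3.right;
                break;
            }
        }
    }
}
```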

Update 1

Video

Basic block snapping is functional, but has several glaring issues with collision.

Update 2

Video

Block creation and deletion have been implemented. Some of the collision issues have been solved, but new issues with raycasting and positioning decisions are now present.

Update 3

Video

Circuit-style interaction between blocks has been implemented. Most of the issues regarding collision, raycasting, and positioning have been solved.

Update 4

Video

Figured out and finalized interactions between player and blocks. Also added mechanics for adding different types of blocks as well as an output block.

Project 2: Development Journal

For Project 2, we decided to choose interactions that seem to be part of people's "everyday" routine. Initially, we set the scenario to be a regular morning. The interactions we came up with were getting out of bed, turning off the alarm, and drinking a cup of coffee made by the character. After talking to Sarah, we realized that waking up in bed and changing posture could be challenging to implement, since we don't have a detailed model for the character. Therefore, we changed the setting and tried to make the interactions differ in how they are triggered by the controller. Eventually, the three interactions were:

  • Opening the door to the kitchen (with the trigger on the controller)
  • Turning on the light switch and adjusting brightness (with the touching pad)
  • Making coffee and drinking it (with buttons on the controller)

Here's the whole-scene storyboard of our kitchen area. On the left side is a window through which dim light comes into the dark room. On the right side is the light switch, with a small glowing cue on it so that the player knows it's interactable.


Update Mar.3rd

Ganjina and I started working on the environment setup, and we chose to use a low-poly kitchen asset. We think it creates a homey feeling, and some props come with animations, such as the microwave opening. However, these could only play automatically. We will work further on modifying the animations and try to trigger them with our desired interactions.


Update Mar.10th

Ganjina has finished setting up the kitchen scene, and we like the space left for the player to walk around. The dining area looks warm and cozy. Hopefully, we will only need to make some minor changes to the indoor decorations later.


Update Mar.12th

Luize is working on adding Rigidbody components to the GameObjects and adjusting the indoor lighting. Chris got the script for the light switch working.

As for the prop animations, I unfortunately couldn't get the animator working. Therefore, I decided to get rid of them and write scripts for the fridge and microwave instead. The principle is rotating the door around an axis when an interaction is activated; a rough sketch of the idea is below.
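A rough sketch of this door-rotation principle (hinge placement, angle, and speed values are placeholder assumptions, not the actual script):

```csharp
using UnityEngine;

// Sketch of the door-rotation idea described above: rotate the fridge
// or microwave door around its hinge axis when the interaction fires.
public class DoorRotator : MonoBehaviour
{
    public Transform hinge;             // empty object placed on the hinge axis
    public float openAngle = 90f;
    public float speed = 120f;          // degrees per second
    private bool open = false;
    private float currentAngle = 0f;

    // called by the interaction script when the player activates the door
    public void Toggle() { open = !open; }

    void Update()
    {
        float target = open ? openAngle : 0f;
        float next = Mathf.MoveTowards(currentAngle, target, speed * Time.deltaTime);
        // rotate the door around the hinge's vertical axis by this frame's delta
        transform.RotateAround(hinge.position, Vector3.up, next - currentAngle);
        currentAngle = next;
    }
}
```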


Update Mar.14th

Luize has finished updating the kitchen, and we like the wall color and lighting she chose for the scene.


I have been working on the firework animation. Each firework emission uses a sparkle shader for its appearance. Each emission carries a trail and sub-emissions of small sparkles, and each burst at the end of a trail comes with a set of delayed sub-emissions so that the bursts linger longer.


Update Mar.15th

I finished the firefly visual effect and managed to get the switch mechanism right. For the fireflies, their sizes change along a curve over their lifetime. The curves are random, but all fall gradually to zero by the time a particle disappears. Noise is added to the flight path so that it looks more realistic. A script-based sketch of this setup is below.
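The same setup could be expressed from script roughly as follows; the actual curve shapes and noise values were tuned in the editor, so these numbers are placeholders:

```csharp
using UnityEngine;

// Illustrative configuration of the firefly particle behavior described
// above: size follows a curve that fades to zero over each particle's
// lifetime, and noise perturbs the flight path. Values are placeholders.
[RequireComponent(typeof(ParticleSystem))]
public class FireflySetup : MonoBehaviour
{
    void Start()
    {
        var ps = GetComponent<ParticleSystem>();

        // size over lifetime: rise briefly, then fade gradually to zero
        var sizeModule = ps.sizeOverLifetime;
        sizeModule.enabled = true;
        AnimationCurve curve = new AnimationCurve(
            new Keyframe(0f, 0f),
            new Keyframe(0.2f, 1f),
            new Keyframe(1f, 0f));
        sizeModule.size = new ParticleSystem.MinMaxCurve(1f, curve);

        // noise on the trajectory so the flight looks less mechanical
        var noiseModule = ps.noise;
        noiseModule.enabled = true;
        noiseModule.strength = 0.3f;
        noiseModule.frequency = 0.5f;
    }
}
```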


Development Journal – Project 2

3 Mar

For the interaction project, our group discussed several possibilities, among which we finally chose the idea of 3D cubes. The other ideas, for example building an alternate backyard, could be fun as well, but we preferred to play with the flexibility of simple 3D shapes in a limited space and make things creative but also simpler and clearer.

The scene starts in an ordinary room, and the participant is able to move around at room scale. When the participant picks up a calculator or turns on the computer in the room, they are transported to an alternate world with only the calculator or the computer still in sight. The background differs from the initial room view: instead, a lot of 3D cubes float in the dark, representing operands and operators or programming statements. The participant can drag and throw these cubes to control them and get the result of a calculation or run certain code. The result might be dropped from above in front of them. By reimagining the process of using a calculator and a computer in this 3D way, we would like to create a totally different experience that is more involving and more visual. And here is our storyboard by Ben.

10 Mar

We started from the first realistic bedroom scene. We built the room from scratch, picking proper materials and importing furniture models with a consistent style. We also added a Rigidbody and a Collider to the chair so that it could move and interact with the participant.

After gathering things together, we started to design the lighting to create a warm and cozy feeling. We made the whole environment relatively dark, like a sunset, with light in the room that is dim but warm. To highlight the calculator on the desk, we chose a lamp that projects light right onto the calculator. The lamp itself was not lit at first, so we put a bulb in it by adding a sphere with an emissive material to make it look natural. We also put a staircase in the room to extend the space and create more layers in the scene.

The window is basically an empty object with a collider, because we didn't find a proper glass material. Later we also added curtains to make it look more like a window. As for the skybox, we chose a sunset scene to match the warm atmosphere, and we adjusted the shadows to make the whole thing more coherent.

14 Mar

For the interaction of changing scenes, Tiger and I first used SceneManager, as shown below, to shift between two different scenes. It required building two scenes at the same time, and we added a white dot in front of the camera as the cursor. But since Ben and Yeji used OnMouseDown to toggle visibility within the same scene, we later went with their solution, considering it more convenient for matching the camera settings between the realistic world and the alternate world.

previous code using SceneManager
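A minimal sketch of that SceneManager approach (the scene name is a placeholder, and both scenes must be added to Build Settings for LoadScene to work):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch of the earlier scene-switching idea: clicking the calculator
// loads the alternate world as a separate scene. Name is a placeholder.
public class SceneSwitcher : MonoBehaviour
{
    void OnMouseDown()
    {
        SceneManager.LoadScene("CalculatorWorld");
    }
}
```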

When combining our work, we settled on the light effect for the calculator and made the two scenes more consistent in terms of object positions and the way of interaction. We also spent time fixing problems like lost materials and textures, as well as some awkward movement of our character. Finally, we thought more carefully about some design details and did a little user testing within our group to make the project more complete.

Project 2 Development Journal |Fire Planet

For Project #2, Will, Steven, and I brainstormed a variety of concepts, trying to balance actions that were everyday yet alternate and different from what we already experience in real life. We considered different stories, settings (both in time and space), and actions (squeezing, catching, grabbing, flicking, etc.) and ended up with this mind map:

The concept: In the end, after a long discussion of concepts that took advantage of VR as a medium, we decided on one where the user is a firefighter on a planet where random fires are part of the natural ecosystem. In an effort to make the ecosystem livable, humans have placed sprinklers around the planet. In our VR game's scenario, the user finds themselves between a large wall of fire and a city. Via a radio (recorded and edited audio we create), the user is instructed to use a fan they are holding to push back the fire in front of them in order to move an asteroid that has fallen on one of the sprinklers, thereby stopping the fire from spreading into the city.

The city

The experience can be broken down as such:

  • User hears audio instructing them to complete the mission
  • User fans fire away from the sprinkler
  • Fire gets smaller/disappears as the user's fan collides with it
  • User walks towards the sprinkler with the asteroid on top
  • User uses their free hand to push/move the rock away
  • User turns on the sprinkler
  • Audio congratulates the user

For now, we’ve found various fire and particle system assets. We also found an asset that allows fire to propagate from a certain location, which could be useful for us in this case. Here are some samples of potential assets we could use:

Propagating Fire
Other example of Propagating Fire asset pack
Steam could be used in other areas where the sprinklers are putting out the fire
Magic particle system pack that could be useful if we want to go for a more surreal feel

March 11

So far, we have divided our work as follows:

  • Steven: work on the character mechanics (showing hands, triggering an animation whenever the user clicks) and start working on the environment
  • Will: figure out collision detection for the fire particle system, to detect when a fire should be put out (one plausible shape for this is sketched after this list)
  • Mari: render a projectile path that allows the user to aim; when the user clicks, a water particle system is shot out following the set projectile path
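One plausible shape for the fire-collision detection mentioned above, assuming the water orb's particle system has its Collision module set to send collision messages (names and damage values are illustrative, not the final implementation):

```csharp
using UnityEngine;

// Hypothetical sketch: each fire object takes damage and shrinks as
// water particles hit it, and is disabled once "put out".
public class FireHealth : MonoBehaviour
{
    public float health = 10f;
    public float damagePerHit = 1f;

    // Unity calls this on the struck object when a particle system with
    // "Send Collision Messages" enabled hits this object's collider.
    void OnParticleCollision(GameObject other)
    {
        health -= damagePerHit;
        // shrink the flames as they take damage
        transform.localScale *= 0.95f;
        if (health <= 0f)
            gameObject.SetActive(false);   // fire is put out
    }
}
```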

Two of these aspects, showing a hand animation whenever the user shoots and rendering the path of the projectile, are key to enhancing this non-VR experience. If this project were for the HTC Vive, we wouldn't have to show either of these, as the controllers would naturally be shown (so no pre-set animation would be required). With a simple motion such as throwing, the user also wouldn't need a projectile path to estimate where the object would fall. As such, even though these two things might initially seem inconsequential, they are actually key to providing a more enhanced and intuitive experience on the laptop.

For my part, I've been able to successfully render the predicted projectile path according to where the mouse is moved, show an additional radius on the floor where the projectile will hit, and shoot an object on mouse click.
I followed this very useful tutorial, which walked me through the whole process, including the scripting of the projectile path. Essentially, I created an empty "Path" object with a script that renders the path. I can fully customize the color, width, and initial velocity of this line. I attached it to the Main Character and offset it from the center, simulating how the line comes out of the player's hand. With a script called "Spawn on Button", I can also choose which object is thrown when the user clicks. A hedged sketch of the core path calculation is below.
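The path rendering boils down to sampling the ballistic arc p = p0 + v0·t + ½·g·t² at fixed time steps and feeding the points to a LineRenderer. This is a sketch of that idea, not the tutorial's exact script (parameter values are placeholders):

```csharp
using UnityEngine;

// Sketch of the projectile-path rendering: sample the kinematic arc at
// fixed time steps and draw it with a LineRenderer.
[RequireComponent(typeof(LineRenderer))]
public class ProjectilePath : MonoBehaviour
{
    public float initialSpeed = 10f;
    public int points = 30;
    public float timeStep = 0.1f;
    private LineRenderer line;

    void Start()
    {
        line = GetComponent<LineRenderer>();
    }

    void Update()
    {
        line.positionCount = points;
        Vector3 origin = transform.position;
        Vector3 velocity = transform.forward * initialSpeed;

        for (int i = 0; i < points; i++)
        {
            float t = i * timeStep;
            // kinematic equation for a point on the arc at time t
            Vector3 p = origin + velocity * t + 0.5f * Physics.gravity * t * t;
            line.SetPosition(i, p);
        }
    }
}
```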

The line shows the projectile path, while the sphere shows the collision point
The path also accounts for other collide-able objects
3rd person view of how these mechanics look

March 14
As of right now, the project is almost done: the environment is mostly built, and we have been able to combine all our different parts (listed above) into one. We play-tested with Carlos without giving him any context, and it went mostly well – he brought up points on how we could improve the gameplay and add more urgency to what the user has to do. Some of what he mentioned included creating more cohesion between what is being shot and the fire itself, adding a bit more background to the story so the urgency of the mission is communicated, and generally guiding the user more throughout the experience.

Due to the scale of the project, we won't be able to implement everything we could potentially add. However, this feedback was still great in helping us make more conscious decisions and in directing us on what to include in the narration played at the beginning of the experience. One change involved the objective: we had previously altered the project so that the user saves a tree that is the last of its species, meaning that instead of fixing any sprinklers, the user just had to put out the fires surrounding the large tree. After playtesting with Carlos, however, we decided to go back to our original concept of extinguishing the fires in order to fix the broken sprinkler. To make this clearer, we found a more obvious and flashy sprinkler that would catch the user's attention at the end. This is the model we ended up using:

Carlos testing our project!

Based on this feedback, another decision we made was to add an indicator showing users the location of the turret. This way, they would not lose sight of the objective as they extinguished the flames:

The large turquoise cylinder does not get lost amid the busyness of the flames, and it also matches the look of the projectile path

Some more photos of what the environment currently looks like:

Shown: user’s hands, projectile path, far-away city with dome
The user finds themselves between the city and this fire wall (with water turrets stopping the fire from getting closer). The propagating flames will come in from the area where the turret is broken.
Closer look at the water sprinklers without the fire

March 15 
Today was entirely dedicated to doing the finishing touches on the project. This included:

  • Writing, recording, editing, and adding the narration to the project: Since the beginning of this process, we knew we wanted a narration that would provide the necessary context to place the user in this new situation. Since our project was so story-heavy, we wanted to do this part properly, which is why we asked Carlos Páez to be our voice actor. I wrote a script that places the player in the situation of a person with powers who is given this particular mission. I then added a radio-like filter and white noise to the audio so it would sound as if the person were talking over a radio-like device.
  • Adjusting the beginning and ending sequences: This ended up not taking as much time as we thought. We synced up the narrations for both parts. We also added an ending animation where, as soon as the player enters the cylinder surrounding the turret, the turret becomes animated and starts shooting water. Simultaneously, the voice from the radio congratulates the player on completing the mission.
  • Doing final tweaks on the location of the player, the number of flames, etc. We made these changes based on two more playtests and the small improvements we found we could make to the project.