Project 2: Development Journal

March 2, 2020

For the second project, my group ultimately decided on creating interactions that occur in our daily lives. The first three interactions that we discussed were turning off an alarm, grabbing a coffee cup, and turning on the light. After some discussion, we landed on the theme of actions we do in the morning.

At first, we wanted to have the user wake up in bed and then turn off an alarm. However, after discussing this with Sarah we realized that it would require a lot of work, as we would need to account for the height difference and provide props. We scrapped that idea and went for interactions that we would do while standing: opening the door (after you get out of bed), turning on the light (since it’s dark outside), and pouring coffee into a cup.

The story proceeds in this manner: the user first notices the door, and we visually cue them to open it via the handle; the user then finds themselves in a dark room with a light switch on the side wall; lastly, with the lights on, the user heads toward the coffee machine to grab a cup of coffee.

March 13, 2020

Due to COVID-19, our plans for the project had to change. However, our core interaction remained the same: turning a light switch on/off. Ganjina finished up most of the indoor decoration, and I worked primarily on the script that turns the lights on and off. At first, I couldn’t get OnTriggerEnter to work, but Sarah helped me out there. Ultimately, I got the light switch to work, and for now I designated the sun (directional light) as the light source.
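Roughly, the switch script works like the sketch below (a simplified reconstruction; the names, the tag, and the toggle behavior are my assumptions rather than our exact code). One common reason OnTriggerEnter silently fails is a missing Rigidbody, noted in the comments.

using UnityEngine;

// Simplified sketch of a trigger-based light switch.
public class LightSwitch : MonoBehaviour
{
    public Light roomLight; // assigned in the Inspector (currently the "sun" directional light)

    // Requires this object's collider to have "Is Trigger" checked,
    // and a Rigidbody on either the switch or the player.
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            roomLight.enabled = !roomLight.enabled; // toggle on/off
        }
    }
}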

Room
The lighting when the light is on.
The light switch when the light is turned off.
The lighting when the light is turned off.

Project 2 // Development Journal (3D calculator)

02.28.2020 // First Meeting: Brainstorming Ideas 

Our team met up on Friday, each with potential ideas developed since last class. Several ideas came up, including further developing the 3D drawing idea, a simulation of throwing glass, cooking food over a campfire, falling from the sky, a pet simulator, backyard work, etc. While collectively brainstorming and developing ideas, we ultimately came to the idea of re-imagining the calculator interface in a three-dimensional space.

As the project’s emphasis is on the mix of both the usual (everyday) setting and a sense of the alternate, we decided to have two settings for the project. The first, the user’s spawn location, is an everyday setting: a person’s bedroom. The bedroom will be furnished with objects an average person would possess – bed, drawer, bookshelf, etc. Non-textual elements – such as lighting and color – would be used to naturally direct the user towards the calculator.

An interaction with the calculator would teleport the user to the second, alternate setting. This space is where the calculator’s conventional interface would be re-interpreted in 3D space. In the meeting, we discussed portraying numbers and symbols as cubes floating through space. A calculation would occur once a user drags the cubes and combines them. The result of the calculation could be represented as another cube of a different size or color, to distinguish it from the other cubes, which drops from the sky. The background would ideally be dark, to create a clear contrast with the first setting, but the specific aesthetics of the scenes are still in development.

The calculator would function as a portal connecting the two different worlds. Hence, the user would have access to either world at any time by interacting with the object.

Meeting Record (02.28.2020)

03.01.2020 – 03.13.2020 // Development Process

In one of the last classes we had in person, we decided to divide the responsibilities, allowing each of the members to work remotely. As we had two different scenes, it made sense for two people to be in charge of developing each scene (Keyin and Tiger – scene 1 (bedroom); Ben and I – scene 2 (calculator)).

We needed cubes representing the different numbers (0–9) and operators. Text Mesh Pro was used to display these numbers, which also allowed the script to access and alter them.
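A rough sketch of what such a cube might carry (the component and field names are placeholders, not our exact code):

using TMPro;
using UnityEngine;

// Each cube displays its digit/operator with a Text Mesh Pro label,
// which a script can read and rewrite at runtime.
public class NumberCube : MonoBehaviour
{
    public TMP_Text label; // the Text Mesh Pro component on the cube
    public int value;

    public void SetValue(int v)
    {
        value = v;
        label.text = v.ToString(); // keep the visible text in sync
    }
}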

As the background of the alternate world was darker in color, we applied a glow effect to the cubes. This was achieved using the Lightweight Render Pipeline, which includes a bloom effect.

For teleporting between the two scenes, at Sarah’s suggestion, we decided to place the two worlds in a single scene and teleport the user’s position.

To trigger teleportation when the player clicked an object, I initially used a Raycast: it fired only when the user pressed the mouse, and when the object the ray pointed at was the desired one, the user was teleported to an assigned location. The script was applied to the player.

From feedback, the script could be simplified significantly by writing it from the perspective of the calculator. Another issue arose from the First Person Controller: the player was returned to their default location shortly after being teleported. Ben helped fix this issue by temporarily disabling the First Person Controller while the teleportation took place.
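The simplified approach might look like the sketch below (an approximation, not our exact script; the field names are mine, and I use Unity's built-in CharacterController to stand in for the First Person Controller component we actually disabled):

using System.Collections;
using UnityEngine;

// Lives on the calculator itself; OnMouseDown removes the need for a manual Raycast.
public class CalculatorPortal : MonoBehaviour
{
    public Transform player;       // the First Person Controller object
    public Transform destination;  // marker for the other world's spawn point

    void OnMouseDown() // requires a collider on the calculator
    {
        StartCoroutine(Teleport());
    }

    IEnumerator Teleport()
    {
        // Disable the controller so it cannot snap the player back (Ben's fix).
        CharacterController controller = player.GetComponent<CharacterController>();
        controller.enabled = false;
        player.position = destination.position;
        yield return null; // wait one frame
        controller.enabled = true;
    }
}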

03.14.2020 // Second Meeting: Assembling all parts

The second meeting took place to put the scenes together in a single file. Teleportation and adjustments to visuals and shaders were made to finalize the project.

Project 2 Development Journal | Fire Planet

For this project, Mari, Will, and I wanted to create an enjoyable, short, and repeatable experience that engaged the user in something compelling and unique to VR. We brainstormed many ideas, including flinging orbs at the environment to bring about setting changes, furnishing a home, and experiences revolving around interacting with the environment as an omniscient, floating, god-like character. The idea that led us to this final concept was our discussion of setting the project in a home and how we could create trials and puzzles to achieve a certain outcome. We discussed how a greater urgency could be instilled in the user so they would not mess up or mindlessly wander around the house searching for various objects. One of those ideas was fighting a fire with a fan if the user was not able to perform set actions correctly. From this we attempted to elaborate on how this could be considered an alternate reality and how the action of using a fan to repel a fire could be an experience in itself.

Ideas brainstormed

From this we developed the idea of having to protect something, and yourself, from fire, and of fire as a realistic feature of the environment. After some brainstorming we arrived at the idea of protecting a city on an alien planet, characterized by its continually burning fires, from the flames creeping up on it.

The concept is that the city has been forced to install sprinklers that shoot water to keep the encroaching fires from getting to the city. The user takes the role of a “firefighter,” as they are forced to put out these fires.  However, one of the sprinklers has been hit by an asteroid, causing it to be crushed and wedged into the ground. The user has a giant hand fan which they are supposed to swing to generate wind to push the fire back enough so they can get to the sprinkler, push off the asteroid, and pull the sprinkler out of the ground. This is all explained to the user through a radio voiceover that plays at the beginning of the game. 

Scene: the user faces the fires they need to put out with their fan in order to reach the sprinkler in front of them

What we want to do next is research assets and particle systems. We believe that the characteristics of the fire we are able to create will determine how the rest of the environment is styled. We will also research how best to create the interactions between the wind from the fan and the fire.

Updated: March 11

I was able to add a simple animation with hands attached to the first person controller. What this does is create a short animation of the hand going forward and opening the fist. I intend for the “spell” (orb of water) to be emitted from the hand at the peak of this animation.

I used the hand models from Oculus along with the SteamVR package. I simply followed this tutorial for adding a simple close-fist/open-fist animation to the hand. I modified the script from the tutorial so a key triggers the changes, and I added another simple script that animates the right hand to move out as the fist opens and closes.
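The key-triggered version is essentially the sketch below (a guess at its shape; the Animator parameter name "Grip" and the key are placeholders for whatever the tutorial's script used):

using UnityEngine;

// Drives the fist open/close animation from the keyboard instead of a controller.
public class HandKeyInput : MonoBehaviour
{
    public Animator handAnimator; // Animator on the Oculus hand model

    void Update()
    {
        // Holding the key closes the fist; releasing it opens the hand,
        // assuming a float parameter blends between the two poses.
        float grip = Input.GetKey(KeyCode.F) ? 1f : 0f;
        handAnimator.SetFloat("Grip", grip);
    }
}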

This was quite challenging, as I initially looked at using 3D animated models but found it hard to coordinate them with the keys. I also found it difficult to get the camera position just right for these models, as the head or torso would sometimes come into view of the camera.

Here is a video of the hand animation in action:

Update: March 13

I have been working on the scene design of the fire planet. Yesterday I used the terrain tool to create the rocky, hilly terrain of the planet. I also created a fog particle system to represent the constant smoke on the planet, and a dome with a few buildings in it, located behind the player as they enter the scene, so that the user feels they have to defend the city from the advancing fires. The dome is shining white, while a red spotlight emits from the base of the city onto the rest of the scene; I did this to add to the fiery environment and the sense of emergency as the user is required to put out the fires. Lastly, I created water sprinklers using this tutorial, as well as fires pulled from the Unity Particle Pack.

A few things that I want to do with the scene:

  • Make the lighting around the dome and the city more realistic. I am not really sure how to do this without creating a design for a live, dynamic city.
  • Fix the issue with the fog impacting the contours of the terrain viewed through the fire.

Today we hope to merge all of our scenes together and have the animation of the arms coordinate with the release of a water grenade. We also hope to add Will’s script to allow the detection of collisions between the water and fire particles in the scene.

Here is the video of the scene:

Update: March 15

Today we continued to work on tying all the parts together and finalizing the style and interactions. On Friday, we successfully let the user put out the fires by shooting the particle orb out of the gloves of the on-screen character. On Saturday, we finally fleshed out our narrative and decided to stick with the original idea of a civilization living under a dome on a planet plagued by constant fires. We debated changing the premise of the experience to protecting the civilization’s final tree from forest fires, but scrapped this in favor of the original idea. We discussed how we could make this narrative apparent in order to provide some context for the scene as well as motivation for the user to focus on the objectives. To do this, we decided to record a voiceover giving the player explicit instructions on what to do.

We also decided to style the orb/grenade like a magical ball, using a simple particle system made with Unity’s Visual Effect Graph. This let the effect appear more magical, tying into the narrative’s description of the character as someone with “powers.”

Today we added a final script that detects when the character enters a transparent cylinder marking the final destination. Upon entering, a voiceover congratulating the user on their efforts plays, and the sprinkler comes to life and shoots a stream of water away from the dome.
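In outline, the script does something like this (a simplified sketch with assumed names; the real version may differ):

using UnityEngine;

// Sits on the transparent cylinder (with a trigger collider) that marks the goal.
public class GoalTrigger : MonoBehaviour
{
    public AudioSource voiceover;         // congratulatory voice line
    public ParticleSystem sprinklerWater; // the sprinkler's water stream
    private bool triggered;

    void OnTriggerEnter(Collider other)
    {
        if (triggered || !other.CompareTag("Player")) return;
        triggered = true; // fire only once
        voiceover.Play();
        sprinklerWater.Play();
    }
}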

Development Journal Project 2

02/03/2020

For the second project, Ganjina, Ellen, Christopher, and I first brainstormed different ideas for the interactions we would create. We all wanted to create interactions in a more real-life setting to make the experience more intuitive for the user. We shared the interactions we had come up with for class and found a common thread among most of them – Ellen suggested flipping a light switch, Christopher switching off an alarm, and I suggested grabbing a coffee cup. These interactions reminded us of waking up in the morning, so we decided to explore this theme. At first, we imagined the user waking up in a bed, then reaching for the alarm to turn it off and getting out of the bed. However, after discussing this initial idea with Sarah, we realized that this experience wouldn’t translate easily to the VR environment, as the user would also need to feel like they are lying down in real life in order to believe they are lying down in the VR world. Therefore, we started playing out different scenarios of actions that people do in the morning and prototyping them in real life.

*

We liked the idea of being in a room and seeing a door in front. Using visual cues such as a handle on the door, we would direct the user to the door, have them open it, and walk into a dark room. After locating a light switch they could press it, turn on the light, and find themselves in a kitchen in the morning with a coffee machine on the table. The room would have kitchen furniture such as a countertop, different appliances, a dining table, pendant lamps, and lastly the coffee machine. After exploring the kitchen we would use some kind of visual cue to guide the user to the coffee machine and have them make a coffee to complete the morning routine.

*

In the upcoming week we will explore how this scenario might be feasible in a VR setting, and how to combine it with a realistic feeling in the actual physical space where the user will be standing with the VR headset on.

moodboard with inspiration for the kitchen layout, coffee machine, door and light switch
storyboard of the user’s point of view

*

12/03/2020 UPDATE

Before project alterations: (before March 4)

*

To further polish our project idea, Ellen, Chris, Ganjina, and I discussed a more solid storyline for our project. We liked the idea of illustrating the effect that a morning coffee has on a person’s mind and imagination. We wanted the user to enter a kitchen in the morning through a door, switch on the lights by pressing a light switch, and lastly grab a coffee cup. When the coffee cup was brought near the user’s mouth, the surroundings of the kitchen would change – the plants around would become green again (they would be dead before) and the lighting would shift to bright, happy, enhanced colors, representing how a sip of coffee in the morning makes a person’s inner world happier and brighter.

*

After project alterations: (after March 4)

*

However, due to the new circumstances, we unfortunately all had to alter our project ideas. Given the challenges of working remotely and not being able to use the Vive headset, we reduced the number of interactions to just one main interaction. In our group, we discussed which interaction resonated the most with us and which we wanted to continue working on. We all agreed that our core interaction would now be the interaction with the light switch. We decided to work on an alternate world which modifies what happens when light switches are turned on/off. Instead of having just one light switch, we will have two or three, and the main character (now controlled by the keyboard instead of the Vive headset) will be able to walk around the room and flip the switches. The first switch would have a realistic consequence, such as simply making the lighting brighter. However, as the user advances to new switches, the actions will become more alternate and unexpected. For example, when the next switch is pressed, the plants in the room would go from dead to green. If we have time, we will also add at least one more switch which would change the overall color ambience of the room to something more unusual and alternate than a regular light switch would.

light switch close up

We decided to keep the room as a kitchen, as we had already found the necessary assets and had started working on its design. We also liked the available assets for a kitchen and the overall environment that could be created. Therefore, the user will start from already being located inside of the kitchen instead of walking into it through a door, as we had to remove the door interaction. The light switches will be located around the room and not all in the same place, thus the user would have to move around instead of just standing in one place.

kitchen overview
initial point of view
green plant
light switch on one side of the kitchen

*

Now the course of action is to finish the design and layout of the kitchen, work on the physics of the room and the character, add the character, and polish its interaction with the light switches. We will also finish the scripts for changing the lighting of the room and reviving the plants, and lastly, if we have time, we will add another light switch with another action: changing the color ambience of the kitchen.

*

15/03/2020 UPDATE

During the past days, my group members and I had assigned tasks that we each needed to complete, so we concentrated on our separate tasks. Ganjina had already worked on the initial scene of the kitchen; in the past days, Chris was working on the interaction of pressing the light switch and the script for turning the light on and off when the switch is pressed. I was updating the scene, the physics of the furniture, and adding the light source, and Ellen was working on the script for manipulating the plants when another switch is pressed, as well as adding fireworks outside the window when pressing the switch responsible for the room lighting.

*

For my part, I updated the outside structure of the room: it turned out to work better if the walls were cubes instead of planes, because some of the planes were not showing up in the game view. I first tried reversing the normals of the planes in order to view them inside out in game play; however, that did not work properly, and adding cubes instead of planes worked as an alternative solution. I replaced some of the furniture, as some of the previous assets were giving errors. I also added a ceiling, and lastly I worked on adding a light source for the room. I wanted to include a lamp, preferably a ceiling lamp, to replicate more realistic kitchen lighting. I tried looking for lamp assets; however, there were no free assets available for a ceiling lamp, so I decided to make the lamp myself. I experimented with different shapes, the different types of light in Unity, and their combinations. After creating several lamps that I did not like and that didn’t look much like a lamp, I finally found the combination I had been imagining: a sphere and a point light, which I attached to the ceiling. I made this light the light source referenced by the script attached to one of the light switches, so the lighting in the room can be controlled with the switch.

updated dining area
updated kitchen area and light on
light off

*

After my updates, Ellen also finished the script for manipulating the plants. Instead of having dead plants go green again after pressing a switch, we decided to have firefly-like particles come out of the plants and fill up the room, creating a magical and alternate feeling. There were no proper assets available for different stages of plants, so we found this solution to be an effective alternative.

Project 2 – Development Journal

Project theme: Peace & War

Project members: Neyva, Vince, and Nhi

Idea development: 

In our first project meeting, we built upon our class discussion of the Lo-Tech VR Interaction Exercise, choosing boxing/punching as one of the two main interactions in the environment we want to create. I personally like this idea because it reflects a strong action, and we can integrate unexpected responses into a punch. The users may expect the punching bag to oscillate back and forth, but would have no idea what effects or responses might occur beyond the norm. Having said that, we also hope to develop our project to resonate with one of our class readings, “Responsive Environments” by Myron Krueger, making the user experience and environment more interactive and responsive. For the second interaction, we first came up with the light switch, an everyday activity. We tried to connect the light switch with the punching action to make a cohesive story in an artificial reality. In the end, we slightly modified the idea and chose to go with a button on a pedestal. With these two actions, punching a bag and pressing a button, both nearly part of our everyday life, we hope to transform daily interactions into dreamlike effects in our project.

Specifically, the setting would be a theater stage where the punching bag and the button would be located on two opposite sides. We alternate the reality by:

  • Punching the bag: every time the user punches the bag, besides the normal oscillation, there would be white/black doves flying out magically. The white doves represent peace, and the black doves represent the concept of war. 
  • Pressing the button: this button would change the color of the doves. Every time the user presses the button, the color will change from white to black and vice versa.
  • The stage: in the back of the stage, we are considering putting words/colors/pictures to reflect the theme of our project.

The main idea behind our project is to demonstrate how fragile the peace, and the world we currently live in and take for granted, really are. When punching the bag, users tend to punch harder and harder to see if the effects change. Of course, they would: more doves would fly out. However, if we take a closer look, such an increase in the intensity of each punch represents the rising conflicts among individuals and nations. Moreover, just by pressing one button, we are able to enter a state of war. The button is also a symbol of war threats such as nuclear weapons – by just pressing a button, peace no longer exists; a war has begun (the white dove of peace changes to the black dove of war).

Project development:

We first sketched the initial 360 view of our project.

Our first sketch of the project

We have found a few free assets that could be beneficial for our project, including a stage, a punching bag, and gloves.

March 9, 2020

After our team discussion, we decided to reduce our interactions to only the punching bag – birds flying out. The environment would be similar to the one below.

March 10, 2020

Vince and I worked on interactions of the punching bag.

The first problem we encountered was controlling the boxing man when we moved the mouse. I wrote a separate script for the camera controller so that camera position = box_man position + (proper) offset, and smoothed the motion using Vector3.Lerp(). However, after a few trials, we decided to use the FPSController, which already has this script built in and makes our life easier. We used the first person controller (FPSController) prefab (renamed “player”) and made the box_man prefab a child of the FPSController. By doing so, when we move the mouse, the boxing man moves accordingly. We also positioned the camera and limited the looking angle so that we only see the hands of the boxing man.
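For the record, the hand-rolled camera controller we tried first looked roughly like this (a reconstruction; the offset and smoothing values are illustrative):

using UnityEngine;

// Follows the boxing man at a fixed offset, smoothed with Lerp.
public class CameraFollow : MonoBehaviour
{
    public Transform boxMan;                              // the boxing character
    public Vector3 offset = new Vector3(0f, 1.7f, -0.3f); // tuned by hand
    public float smoothing = 5f;

    void LateUpdate()
    {
        // camera position = box_man position + offset
        Vector3 target = boxMan.position + offset;
        transform.position = Vector3.Lerp(transform.position, target, smoothing * Time.deltaTime);
    }
}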

The other part we worked on was collision detection. We decided to put the collision detection script on the punching bag – every time there is a punch, the collision is detected and we trigger the animation. Besides detecting the collision, we also need to check whether it comes from a proper punching action or not (since a collision can also happen when the boxing man just walks toward the punching bag and touches it). We have two scripts for this: animationController and collisionDetector. In animationController, besides triggering the animation, we also added a Punch getter that returns anim.GetCurrentAnimatorStateInfo(0).normalizedTime. In collisionDetector, if we detect a collision and the time returned by Punch is < 1, we play the punch sound as if the person punched the bag.
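The core of the two scripts is sketched below (condensed into one snippet; in the project they are separate files, the member names are approximations, and the click-triggered animation logic is omitted):

using UnityEngine;

// Exposes how far the punch animation has played.
public class AnimationController : MonoBehaviour
{
    public Animator anim;

    public float Punch
    {
        // normalizedTime < 1 means the punch state is still mid-animation
        get { return anim.GetCurrentAnimatorStateInfo(0).normalizedTime; }
    }
}

// On the punching bag: only a collision during a punch counts.
public class CollisionDetector : MonoBehaviour
{
    public AnimationController animationController;
    public AudioSource punchSound;

    void OnCollisionEnter(Collision collision)
    {
        // Ignore contacts from the boxer merely walking into the bag.
        if (animationController.Punch < 1f)
        {
            punchSound.Play();
        }
    }
}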

March 11, 2020

After finishing the collision detection for the punching bag, we decided to use raycast to lighten the color of the punching bag whenever the player looks at it (to attract attention).

First, we reused the code we learned in class. However, the color did not change even though the debugging message indicated that the player was looking at the punching bag. We tried switching the tag and the layer of the different components related to the punching bag to see where the problem was. After around an hour, we found the problems.

First, hit.transform.gameObject did not work since the punching bag is a child component, and we needed to use hit.collider.transform.gameObject instead.

Second, we could change the color of the prefab; however, the change was not obvious to the player. Hence, we decided to change the emission color so the bag brightens whenever the player looks at it.
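Put together, the highlight logic is roughly the following (a sketch; the tag name and glow color are placeholders, and resetting the color when the player looks away is omitted):

using UnityEngine;

// Casts a ray from the center of the view and brightens the bag's emission.
public class BagHighlighter : MonoBehaviour
{
    public Camera cam;
    public Color glow = Color.white * 0.3f;

    void Update()
    {
        Ray ray = cam.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit))
        {
            // Use the collider's object: hit.transform would return the parent.
            GameObject target = hit.collider.transform.gameObject;
            if (target.CompareTag("PunchingBag"))
            {
                Material mat = target.GetComponent<Renderer>().material;
                mat.EnableKeyword("_EMISSION");
                mat.SetColor("_EmissionColor", glow);
            }
        }
    }
}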

March 12, 2020

We started working on generating birds every time the player punches the bag. We wrote the script birdGenerator.cs to handle this behavior.

In this script, we initialize an arraylist to store the birds. If we receive a hit signal from collisionDetector, we instantiate a clone of the bird with a random speed and position (using Random.Range) and add it to the arraylist. Later we just loop through the arraylist and set the moving direction and speed for each bird.

March 13, 2020

We worked on changing the angle of the bird animation and setting a flying path for each bird. Since we wanted the birds to fly out at a random angle, speed, and position around the punching bag, we decided to create a class Bird and encapsulate all attributes in it; when we dynamically create the birds, we assign random values to their attributes. This keeps things consistent and makes it easier to add attributes in the future.
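The refactored generator looks something like the sketch below (approximate; the ranges and names are placeholders, and the movement shown here is the simple per-frame version before Vince switched the birds to physics):

using System.Collections.Generic;
using UnityEngine;

// Plain data class holding each clone's randomized attributes.
public class Bird
{
    public GameObject obj;
    public Vector3 direction;
    public float speed;
}

public class BirdGenerator : MonoBehaviour
{
    public GameObject birdPrefab;
    public Transform punchingBag;
    private List<Bird> birds = new List<Bird>();

    // Called by collisionDetector when a punch lands.
    public void SpawnBird()
    {
        Bird b = new Bird();
        Vector3 offset = new Vector3(Random.Range(-0.5f, 0.5f),
                                     Random.Range(0.5f, 1.5f),
                                     Random.Range(-0.5f, 0.5f));
        b.obj = Instantiate(birdPrefab, punchingBag.position + offset, Quaternion.identity);
        b.direction = offset.normalized; // fly outward from the bag
        b.speed = Random.Range(1f, 3f);
        birds.Add(b);
    }

    void Update()
    {
        foreach (Bird b in birds)
        {
            b.obj.transform.position += b.direction * b.speed * Time.deltaTime;
        }
    }
}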

March 14, 2020

We worked on changing the color of the birds. The bird prefab was originally red. Vince and I decided to experiment with the color once we received the environment from Neyva.

First, we tried giving each bird a random color (random values for R, G, B). However, this variety of colors made the scene look extremely low-poly and did not fit with what we had in mind for our final scene and concept. Hence, we went back to our original idea of doves with a consistent white color.

We also decided to add background music to this project, since we want to convey a dark and ominous environment.

March 15, 2020

We worked on the final touches for our project. After Neyva added the particle systems to create the fog effects, we decided not to use the raycast anymore. The original idea of the raycast was to brighten up the punching bag a little to invite the player toward a potential interaction. However, the fog makes the brightening of the punching bag hard to notice, so in the end we decided not to keep it. We also worked on other small fixes and on our presentation:

  • Fix the position, density, and scale of the particle systems
  • Fix the color of the clouds and the skybox
  • Work on the presentation and divide the parts among team members

Project 2 Dev Journal | Fire Planet

For this project, I’m working with Mari and Steven. After a few in-class discussions, we met again for a brainstorming session. We started with coming up with different environments, then moved to thinking of different everyday actions. Throughout the process, we continuously returned to the question, “Why VR?” We considered what VR could do, and what everyday means. As we navigated the balance between compelling and feasible, we realized our ideas revolved around the themes of playing with known roles, perception of scale, and customization.

brainstorm mind map

As usual, guiding our thinking with even this messy mind map helped us find our way to a good idea. In the end, we landed on what it would be like to be a firefighter on a fiery planet. Drawn in by a nice balance between feasibility and appeal, we developed a few versions of the narrative and started to storyboard.

storyboard 1
storyboard 2

In this experience, the user finds themselves on a fiery planet. In front of them, a tall fire burns. Sprinklers in the near foreground keep the flames at bay, but a small asteroid has landed on one of the sprinklers, pushing it into the ground. Because of this, the fire is slowly creeping through the gap towards the user. Behind them, a dome extends to the left and right horizons and soars into the sky. Beyond the barrier, a city rises up. A voice tells the user, who holds a fan in one hand, to try to reach the broken sprinkler and repair it. The user has to wave the fan to push the flames back and, when they reach the sprinkler, reach down, remove the rock, and pull the sprinkler out.

So far, the idea is fairly well developed. However, we still have to work out the specifics of the interactions and the cues which direct the user’s attention.

March 14 | Update

After the assignment changed from VR to computer-based, we decided to focus on the interaction of the user putting out the fire with a tool in their hand. Because of the medium change, we changed this tool from a fan to a sort of water spell. From there, we split up the components of the interaction and started to work. Steven put together the character and the first-person animation of the arm throwing the water balloons. Mari developed a raycast system that moves the spell from the user’s hand to where they are aiming in the distance. I developed the water–fire interaction.

For my tasks, I ended up writing two simple scripts. We wanted the fire to burn continuously but to disappear when the water touched it. We also wanted it to reignite after a time so the user constantly has to put the fires out. One script lets the water particle system disable a fire particle system upon collision. The other script re-enables the disabled fire after a delay; this latter script manages all the fires in the scene.
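The pair of scripts could be sketched like this (a sketch under assumptions of mine: the names, the "Fire" tag, and the delay value; the water system needs its collision module and "Send Collision Messages" enabled):

using System.Collections;
using UnityEngine;

// On the water particle system: disables any fire it hits
// and asks the manager to relight it later.
public class WaterCollision : MonoBehaviour
{
    public FireManager manager;

    void OnParticleCollision(GameObject other)
    {
        if (other.CompareTag("Fire"))
        {
            manager.Extinguish(other);
        }
    }
}

// Manages all fires in the scene and re-enables them after a delay.
public class FireManager : MonoBehaviour
{
    public float reigniteDelay = 5f;

    public void Extinguish(GameObject fire)
    {
        fire.SetActive(false);
        StartCoroutine(Reignite(fire));
    }

    IEnumerator Reignite(GameObject fire)
    {
        yield return new WaitForSeconds(reigniteDelay);
        fire.SetActive(true);
    }
}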


Project 2 | Document Journal

<Scroll down for the oldest post>

[Updated] March 15, 2020

We began to integrate our interactions (the camera, the punching bag, the boxing man, the birds) into the environment that Neyva created. Naturally, there were some difficulties putting everything together, and we had to manually place every object we had created. There were some compile errors and some glitches from Unity with the fog particles, but we overcame them after all. I found a sound piece that fits nicely with the ambient environment of the experience here. We also played with the color palettes of the environment and its associated objects and ended up coloring the birds white-gray-ish, the clouds red, the punching bags red, and the punching gloves red.

[Updated] March 13, 2020

We wrote a script attached to an empty game object that clones a new bird prefab every time the player punches the bag. We created a class Bird that holds the initial conditions of the bird (position, speed, angle, animation…) and the bird GameObject itself. Initially, we used a Character Controller and its Move function to move the birds; however, the Character Controller component comes with a collider by default, and no matter how hard I tried to disable it, it still collided with the punching bag, causing an unwanted bag swing that didn’t look good.
Therefore, I did some research and came up with the idea of using a Rigidbody and AddForce to move the birds instead (without adding a collider, to avoid collisions with the punching bag), which works like a charm. For some reason, the volumetric spotlight above the fog worked in the Unity editor but was nowhere to be seen in the exported app, which we could not fix and had to say goodbye to :'(
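The Rigidbody version boils down to something like this (a sketch, not our exact script; whether the prefab already carries the Rigidbody is a detail I'm glossing over):

using UnityEngine;

// Launches a bird clone with physics instead of a Character Controller,
// so there is no collider to knock the punching bag around.
public class BirdLauncher : MonoBehaviour
{
    public GameObject birdPrefab;

    public void Launch(Vector3 position, Vector3 direction, float strength)
    {
        GameObject bird = Instantiate(birdPrefab, position, Quaternion.LookRotation(direction));
        Rigidbody rb = bird.AddComponent<Rigidbody>();
        rb.useGravity = false; // doves glide rather than fall
        rb.AddForce(direction * strength, ForceMode.Impulse);
    }
}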

[Updated] March 11, 2020

We added a raycast script to the camera so that if the player is looking at the punching bag, it becomes brighter, signalling an invitation for possible interaction. Along the way, we encountered some challenges that weren’t brought up in the original raycast example.
Firstly, because the punching bag is a child of some other components, we had to use hit.collider.transform.gameObject instead of just hit.transform.gameObject (which returns the parent of the punching bag instead).
Secondly, because the material used for the punching bag is not a simple color, we had to use the emission color to alter the brightness of the punching bag.

[Updated] March 10, 2020

Neyva is working on the environment while Nhi and I work on the player and the interaction with the punching bag.

Below is a video of the working punching bag (the environment is a placeholder, not the environment we are creating for this project):

After some experimenting with the character controller, we decided to settle on the first person controller from the Standard Assets and make the boxer a child of the controller so that he and the camera follow the controller’s movement.
The Unity package comes with a boxer and his associated boxing animation. As this is a first-person experience, we intended to keep only his two gloved hands in front of the camera as an affordance for the interaction with the punching bag. However, we couldn’t dissect those from his entire body rig, which is read-only; therefore, we put the camera just outside the body and restricted it so that the player can only see the hand movements (not the entire body).
Having some experience with Unity animation, I did some digging and found a way to trigger the boxing animation on click. I also slowed down the animation and configured it so that it can only be triggered again once it finishes (to avoid the animation restarting when the player clicks too fast). We also added a collision detector script on the punching bag and passed some variables between scripts to determine whether a collision came from the punching action rather than an accidental touch (in case the player gets too close to the punching bag and brushes it with the hands). We also added a sound effect upon collision.

[Updated] March 09, 2020

After some more meetings with our team, taking into consideration the feedback we received in class and the constraints we are now facing due to *cough* COVID-19, here are some revised aspects:

  • We will ditch the button; rather, we will focus on the main interaction between the player and the punching bag. The dove will fly out of the bag upon being punched as before.
  • We will ditch the theater. The environment will be simplified down to its barest elements. We took inspiration from the scene below:
  • The main cylindrical punching bag will be in the middle of the scene, surrounded by smaller punching bags held in place in the sky. Fog will be used to create a mysterious atmosphere. The environment will remain mostly dark and unlit; there will be a volumetric light illuminating the main punching bag.

March 4, 2020

For the second project, we (Neyva, Nhi, and I) adopted the struggle between war and peace as the central theme of our experience.

Storyboard

The experience is set on a theater stage where the user is placed next to a punching bag and a red button on a pedestal, corresponding to the two everyday activities of punching/boxing and pressing buttons. What makes the environment alternate, or surreal, is the way those two objects and the environment respond to the user’s interaction:

  • To punch the bag, the user will need to press the trigger to form a fist with their hand and accelerate their fist towards the bag. Upon being punched, white doves will magically appear from the punching bag and fly around the stage.
  • To press the button, the user will need to press the trigger while aiming at the button. Upon being pressed, the button will magically turn all the doves black.

The theater stage thus presents the user with an alternate space in which to perform their relationship with war and peace. The violent action of punching is counterbalanced by white doves, which have long symbolized peace and the aspiration for peace. On a similar note, the symbolic action of pressing the red button, often invoked as the threat of global war, is materialized in the transformation of white doves (peace) into black doves (war).

Below are some Unity assets and some reference images we’ve found so far:

A cinema theater asset in the Unity store. We will try to convert it into a theater stage.
A punching bag asset in Unity Store.
A pair of boxing gloves found on Free3D. We will try to replace the Vive’s default controllers with these.

Project 2 – Development Journal

Group Members: Ellen, Christopher, Luize and Ganjina

Project Theme:  Student’s everyday life

Storyline: 

For this project, we decided to create some simple interactions that take place in our everyday lives. At the beginning, the interactions we had in mind were turning off an alarm, turning on the light, and preparing a coffee. We wanted the user to get out of bed and then turn off an alarm; however, after talking to Sarah, we realized that at this stage it would be hard to implement waking up in bed and changing posture in VR, so we decided to change some interactions.

Interactions:

  • opening the door
  • turning on the lights
  • making a coffee

Setting:

Room 

  • Door
  • Kitchen area: plants, coffee machine, glasses, kettle, microwave, fridge
  • room decor
  • light switch

We also decided to use some sound effects such as:

  • door opening
  • light switch
  • pouring coffee into cup

Storyboard: 

Update 1:

Unfortunately, due to the coronavirus we did not have access to VR headsets, so we had to make some changes to our project by reducing the number of interactions and using the WASD keys instead of a VR headset. We finally opted for the light switch as our main interaction. We decided to add multiple light switches; the character walks around the room and turns them on/off. We decided that turning the switches on/off would trigger various actions that alternate the place. Our initial idea was that the character walks through a door into the kitchen, but we ended up with the character already being in the kitchen: they walk around and explore the space while playing with the light switches, with different actions occurring on each switch to make the place more alternate.


Interactions (light switch):

  • turning on/off the light
  • fireflies emit over time and fill the room until the switch is turned off
  • fireworks – outside the window (to make the place more alternate)

Update 2:

Tasks for each member to complete:

Ganjina – works on designing the scene of the kitchen

Chris – works on turning on/off the light (when pressing the light switch)

Luize – physics of the furniture, adding ceiling and a light source for the room

Ellen – works on adding effects: fireflies, and fireworks

Update 3:

Final Result:

  • the character stands inside of a room (kitchen)
  • multiple light switches across the room
  • when approaching a light switch -> instructions appear on the screen to turn it on/off
  • different actions occur when light switches are on/off – to make the place alternate


Response as a Medium; Reading 2

In “Responsive Environments,” Krueger describes his development of responsive environments and installations designed to procure a certain response from his audience. He gives some guidelines on what he learned after his project GLOWFLOW, and I want to break these down. (423)

He sets these as a precedent/series of goals for establishing an interactive or “responsive” environment. While I believe that some of these still stand the test of time and remain applicable to people within the interactive media/multimedia field, I think our advancements in technology have allowed for an expansion of certain rules. I believe that number 3 no longer holds true, as we can see examples of large groups of people participating in interactive art (e.g., teamLab Borderless: https://borderless.teamlab.art/). I also believe that number 6 is a matter of preference, as these aspects are important in creating an entire environment. I find it hard to cultivate an environment of interaction when key elements of the art piece are ignored.

Response as a medium should take into account the constant input of its users, generating new output within an environment that lends itself to this constant cycle.

Reading Response on Interaction

In “Responsive Environments,” Myron Krueger introduces several interactive media projects that collectively present part of how his exploration of responsive environments has shaped his understanding of response as a medium. So how does response act as a medium? Without a doubt, response carries messages and/or information and passes them from one end to another, just as books, radio, or any other media do. At the end of the day, any medium is a means of human communication, whose job is to connect its artist and its audience. In this case, the artist makes the rules for a responsive environment to obey, so that the audience’s behavior will generate particular actions by the environment, which will hopefully allow the audience to understand and/or get interested in their communication with the environment (essentially, with the artist). The rules are algorithms programmed by the artist. This way, the artist does not necessarily know what action of the environment will be generated by a given behavior of the audience; instead, the artist takes care of the algorithms to ensure that whatever actions they generate bring pleasure to the audience. It is interesting to notice that “pleasure” is a loosely defined term that can be achieved through many more specific emotions, such as curiosity, the desire to generate a certain action of the environment, or even empathy from successful immersion.