Invisible Cities Response

A city that especially stood out to me was Esmeralda, for several reasons. Firstly, the name of the city reminds me of a childhood game that I used to play with my family. It was a card game in which each player had to predict how many card combinations they would be able to play based on the cards in their hand. The combinations were set by the game’s rules, and each person’s turn was affected by the decisions and actions of the other players, so it was not easy to predict the right number of combinations you would be able to play. At the end of each round, points were calculated based on how many predictions each player got right. In addition to counting points among the players, my godfather had come up with an additional set of rules which included Esmeralda – an invisible player who also scored points after each round based on the performance of the others. The ultimate goal of the game was to beat Esmeralda and score more points than her. I remember being bewildered about who Esmeralda was when I first started playing the game as a child. I always associated her with a mystical creature and was almost afraid of her “presence” in the game. Only when I grew up did I learn that Esmeralda was purely an invention to make the game more dynamic, interesting and challenging, yet my strong perception of her as a mystical creature never went away.

*

Secondly, out of all the cities described in the book, Esmeralda spoke to me the most because it immediately reminded me of New York – a city that I lived in for four months during my junior year spring semester. “And so Esmeralda’s inhabitants are spared the boredom of following the same streets every day…each inhabitant can enjoy every day the pleasure of a new itinerary to reach the same places” – this quote resonated with my personal experience in New York, where I would always try to take a new route to my destination, as there are countless parallel streets in the city. Although they may seem similar from the outside, each street is different from the previous one – I always discovered a new coffee shop, a new park or a dog day care on each street I walked. I loved exploring New York on foot, as the city seemed so walkable despite the occasional bike rider nearly crashing into me and cars honking everywhere. Whenever I had to go somewhere further, walking was often not an option, especially if I was in a hurry. Then I had to explore the other transportation methods that New York offers. “…the network of routes is not arranged on one level, but follows instead an up-and-down course of steps, landings, cambered bridges, hanging streets” – this quote reminds me of the type of public transport I used the most in New York – the iconic subway. The subway system seemed so vast that I could almost always count on finding a station nearby and teleporting one level down to continue my journey underground. The subway was also always full of rats, which the book also mentions about Esmeralda: “below, the rats run in the darkness of the sewers, one behind the other’s tail, along with conspirators and smugglers”. The second part of the quote also reminds me of New York, a city where I sometimes experienced dangerous and frightening moments because of people’s strange and unpredictable behavior.

*

Lastly, another quote about Esmeralda that reminded me strongly of New York was: “a map of Esmeralda should include, marked in different colored inks, all these routes, solid and liquid, evident and hidden”. The part about different colored inks reminded me of the New York subway map, which is filled with seemingly tangled colored lines, each color representing a metro line. The map was especially hard to navigate in the beginning, when I first moved to New York. In addition, New York’s map is filled with tangible and intangible routes. For example, the dotted ferry lines on the Hudson River seemed far less tangible than the solid streets represented on the map, and took more time to learn to navigate. However, once mastered, they opened up new and unexpected ways to discover the never-ending city of New York.

New York City subway map

Project 2 – Documentation

For the second project I worked with Ellen, Chris and Luize. The inspiration for this project was loosely based on interactions that take place in our daily lives, as we wanted to set our interactions in a more real-life environment to make the user’s experience feel natural. We started brainstorming different scenarios of actions people perform every day, and ended up making a kitchen because everyone is familiar with the place. We thought about various interactions that might occur while the character is in the kitchen and decided to make the light switch our main interaction. The user understands that the place is very unusual after playing with the switches: the first switch is responsible for turning the light on and off, while the other makes the place feel alternate by causing fireflies to emerge from the plants over time. This gives the user the idea that switches can do something more than just turning the light on or off.

Image 1 – the switch asset

Our initial idea was to create a bedroom where the user wakes up by turning the alarm off, then turns on the light and goes to the kitchen to make coffee, but we decided to use the light switch as the only interaction because of limitations in the available assets and some other circumstances.

I worked on setting up the room and adding decorations and furniture. Adding walls, furniture and some other stuff was very time consuming; however, I really enjoyed the process. When I finished setting up the room, Luize updated the scene by adding the physics and the ceiling with the indoor light. Afterwards Chris worked on implementing the interaction with the switch, turning the light on and off, and Ellen worked on adding effects that make the place feel alternate, such as fireworks and fireflies.

We added small lights on the switches to draw the user’s attention to the switch area (Image 9). That way we tried to make things clearer and more understandable. We also added text such as “press E to turn off the switch”, so when the user enters the switch area the text appears to help them understand how to navigate and how to turn the switch on and off.
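As a rough illustration, the switch-area behavior described above could be scripted in Unity along these lines. This is only a minimal sketch, not our actual code; the class and field names (`SwitchArea`, `roomLight`, `promptText`) are hypothetical.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch: a trigger collider around the switch shows an
// instruction text while the player is inside it, and pressing E
// toggles the light the switch controls.
public class SwitchArea : MonoBehaviour
{
    public Light roomLight;   // light controlled by this switch
    public Text promptText;   // e.g. "press E to turn off the switch"
    private bool playerInside;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player")) { playerInside = true; promptText.enabled = true; }
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player")) { playerInside = false; promptText.enabled = false; }
    }

    void Update()
    {
        // toggle the light only while the player is in the switch area
        if (playerInside && Input.GetKeyDown(KeyCode.E))
            roomLight.enabled = !roomLight.enabled;
    }
}
```

The second switch could reuse the same component, with its handler additionally enabling the firefly particle effect instead of only toggling the light.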

Image 2 – environment setup
Image 3 – fireworks
Image 4 – fireflies
Image 5 – top view

Image 6 – dining area
Image 7 – kitchen area – light on
Image 8 – kitchen area – light off
Image 9 – fireflies with light off
Image 10 – fireworks

The only actions the user performs are walking around the room and flipping the switches. We wanted the user to start by just looking around the room and then realizing that there are two light switches. The reason we wanted the user to start next to the first switch, by the window, is so that they can see one of the effects we added, the fireworks (Image 10). After seeing the fireworks, the user can flip the first switch, which is right next to the window. When turning off the switch, the user realizes that not only did the lights go off, but the fireworks are also gone. As the user moves around the dark room, they see another switch on the other side of the room with a little light on it, inviting them over to the next switch area. When the user enters the second switch area, a text box appears on the screen with instructions on what to do next. When the user then tries to turn on the lights, fireflies start coming from the plants and filling the room (Image 9). We also tried to download proper switch assets, in which the user would see how the switch changes while turning it on or off, to satisfy users’ expectations (Image 1).

Initially we wanted the user to use VR controllers to play with the switches; however, since we didn’t have access to a VR headset, we had to use the keyboard. The reason we chose to add a text box with instructions is that it makes the user feel comfortable navigating and makes the next steps easier to understand.

Although some challenges arose throughout the project, such as not finding proper assets, lighting difficulties and putting the furniture together, I am extremely pleased with the overall work we have done in such a challenging situation. The end product came very close to what we first envisioned for our project: it feels natural, but unusual and surprising at the same time. I think we managed to achieve the goal of an alternate world, because the results of the interactions are very different from our normal everyday life and quite extraordinary.

Project 2 Documentation | Boxing with Ares

[Updated March 28 2020] Added to Documentation Category

Project Description

Boxing with Ares is an immersive experience in Unity that invites players into a dark and eerie world where they have to navigate their internal conflict of peace and war, of hope and sorrow. An inviting big red punching bag is placed in the center of a gloomy, obscure, and desolate ground that is actively contrasted by a sky filled with grids of smaller punching bags seemingly blending with bloody cloud streaks: what could go wrong, what other ominous thing could happen here?

Unknown to the players, dozens of doves fly out from the punching bag whenever it is punched. That is simply not how punching a bag works in real life. The act of punching something is supposed to be a violent act: how could this make sense with such a symbol of peace, and how could two such antipodes co-exist in the same world, let alone in the same interaction? Taken aback by the unexpected interaction, the players then face their internal struggle of interpreting such encounters: whether to keep punching or to stop the violent act, whether to spread peace by setting the doves free or to let hope die out by chasing the doves away…

Process and Implementation

The very first step of brainstorming for this project was to come up with an everyday activity that we would modify to fit the alternate world. Someone yelled “let’s do boxing” – I do not remember who – but the idea was so captivating that we went full force with it. The word “magic” somehow popped up in the conversation, and I said, “What if birds fly out of the punching bag, like when they magically fly out of a magician’s hat?”. Instantaneously, something clicked: we realized that if the birds were doves, which have long symbolized peace, they would unexpectedly counterbalance the suggestively violent act of boxing, and that they would open up many questions revolving around peace, war, and the agency of the player and his/her internal struggle between the good and the bad. There would also be a button that could be pushed to change the color of the doves flying out (this, however, quickly proved to be a hasty idea that did not blend well with the rest of the experience).

Initially envisioned for the Vive system, we intended the interaction to be organic, analogous to punching a bag in real life: the player would have to pull the trigger while holding tight to the controller (which resembles the act of clenching a fist) and accelerate their controller/hand forward towards the punching bag.

The act of punching a bag. This is also the asset we found on Unity for the experience.

We also envisioned a theater environment in which the player would be given a stage to perform their internal struggle between peace and war.

This is our first sketch of the experience

However, after receiving feedback from Professor Sarah and our classmates, as well as the breaking news of the coronavirus, which had a big impact on how we designed the experience, we revised and narrowed down our initial idea, specifically:

  • The interaction would only involve the action of punching the bag
  • The theater environment would be changed to a less context-based and more provocative space. We took inspiration from this scene from The Matrix, in which the environment does not give any concrete clues as to where the player is currently situated – a place that is not defined by conventions.
  • We also took more inspiration from this set-up. We wanted some fog in the environment, as well as smaller punching bags randomly hung from the sky, without really making any sense as to why they are there in the first place, opening up possibilities for self-interpretation and self-reflection by the player.

With the developed idea in mind, we started to work on the project. Neyva took charge of the environment, while Nhi and I worked together on the camera, the character, the interactables, and the interactions.

We decided to mount the main camera on the character in such a way that the player can see his/her hands. As we could no longer use the Vive, and thus its associated in-screen controllers, being able to look down and see one’s hands provides a visual cue that interactions through the hands are possible. We did, however, limit the angle to which the player can rotate the view downward, as we did not want the player to be able to look through the boxing man’s body. Lastly, we made the camera and the boxing man children of the First Person Controller so that they move in tandem with the player’s inputs. This is about as far as we could take the experience we intended without the Vive.
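The limited look-down angle mentioned above can be sketched as a simple clamp on the camera’s pitch. This is an illustrative sketch only, not our actual script, and the clamp values are assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch: clamp the camera's vertical rotation so the
// player cannot pitch far enough down to see through the boxing
// man's body. Attach to the camera child of the controller.
public class ClampedLook : MonoBehaviour
{
    public float sensitivity = 2f;
    public float maxLookUp = 80f;    // degrees the player may look up
    public float maxLookDown = 45f;  // limited so the body stays hidden
    private float pitch;             // accumulated vertical angle

    void Update()
    {
        // moving the mouse up decreases pitch (looking up is negative X)
        pitch -= Input.GetAxis("Mouse Y") * sensitivity;
        pitch = Mathf.Clamp(pitch, -maxLookUp, maxLookDown);
        transform.localEulerAngles = new Vector3(pitch, 0f, 0f);
    }
}
```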
We then added a script to detect the collision between the boxing man’s hands and the bag. We could not rely on the default Collider alone because we had to check that the collision was indeed caused by the punching action, not by accidental touches due to proximity to the bag. After detecting the collision, we added a script containing a class Bird to generate birds flying out of the bag. The birds are generated with random positions, random angles, random velocities (using a Rigidbody and the AddForce function) and a correspondingly scaled animation speed (the faster the velocity, the faster the wing-flapping animation).
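The spawning logic described above might look something like the following sketch. The names (`DoveSpawner`, `OnPunched`, `birdsPerPunch`) are hypothetical, and the numbers are assumptions; it only illustrates the idea of randomized launch velocities tied to animation speed.

```csharp
using UnityEngine;

// Hypothetical sketch: when a punch collision is confirmed, spawn a
// batch of doves at randomized positions and angles, launch each with
// a random velocity, and scale its flapping animation to that speed.
public class DoveSpawner : MonoBehaviour
{
    public GameObject birdPrefab;   // dove prefab with Rigidbody + Animator
    public int birdsPerPunch = 12;

    // called by the punch-detection script with the contact point
    public void OnPunched(Vector3 hitPoint)
    {
        for (int i = 0; i < birdsPerPunch; i++)
        {
            Quaternion angle = Quaternion.Euler(0f, Random.Range(0f, 360f), 0f);
            Vector3 pos = hitPoint + Random.insideUnitSphere * 0.3f;
            GameObject bird = Instantiate(birdPrefab, pos, angle);

            // random outward-and-upward launch speed
            float speed = Random.Range(3f, 8f);
            Vector3 dir = (angle * Vector3.forward + Vector3.up).normalized;
            bird.GetComponent<Rigidbody>().AddForce(dir * speed, ForceMode.VelocityChange);

            // faster launch -> faster wing-flapping animation
            bird.GetComponent<Animator>().speed = speed / 5f;
        }
    }
}
```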
The bird prefab up close. We played around with its color and ended up choosing a white-grey-ish tone that suited the monotonous tone of the environment without overpowering the experience. The moving birds provide a lot of contrast to the scene, which is predominantly made up of stationary or slow-moving components.
The punching bags in the sky placed by Neyva.
The clouds in the sky placed by Neyva. Originally they were white; however, after toying with the skybox a bit, we decided to set them to red, which made the environment even more mysterious, hellish, dark, and cruel.
The ground fog effect (particle system) created by Neyva

Project Reflection

Overall, I am satisfied with what we were able to achieve in such a changing and challenging situation.

First of all, I can feel a sense of an alternate world being presented in the experience. From the very prominent cue of a dark, ominous sky without any sunlight, dotted with bloody streaks of clouds, to the less prominent desolate layout of the immediate environment with only a lone punching bag on the ground (or even the lack of a definitive ground – only a featureless plane that extends and seamlessly melts into the horizon), to the omnipresent ambient sound that suggests a rather unsettling tone: everything works together to transport the player into an alternate world that one might imagine but is too scared to face.

Moreover, being able to see his/her hands (or rather, hands with a pair of red gloves on them) right from the very beginning of the experience immediately reminds the player of the possibility of using them for potential interactions. Apart from the smaller punching bags, which are placed way too high for any conceivable interaction, the one and only thing within the player’s reach is the inviting big red punching bag a few feet away. It is obvious that something, unexpected or not, will happen upon interaction between the hands and the punching bag.


One small thing to note, however, is the missing volumetric spotlight that shines above the punching bag. While it was functional in the Unity project, when we exported it into an executable app the volumetric effect was nowhere to be seen. While it originally served to further emphasize the importance of the punching bag and act as an invitation to it, its absence in the final app did not really have a big impact on the experience as a whole: the aforementioned features are enough to act as affordances for the experience.

The end product came quite close to what we envisioned for our project. In some ways it exceeds my expectations: it feels both more real and more alien than I could have imagined. While the immersiveness of the medium lends itself nicely to the experience, giving the player the freedom to explore and interact, it also presented us with the challenge of putting things where they should be. For example, we decided to offset the punching bag quite a bit from the player’s initial position so that the player can get a grasp of the environment as a whole before delving into the interaction. This was met with positive comments from our classmates, who said it gave them a pause to think about their actions – whether or not to incite more violence by punching the bag in an already violent environment.

Agency Question

The very first thing that gives the player a sense of agency in this experience is the ability to see his/her hands right from the beginning. What’s more, the hands are not bare: they are inside a pair of red boxing gloves, which imbues the player with an elevated kind of agency, the kind that comes with the capabilities specific to boxing gloves. The sight of a matching red punching bag afar immediately after that inevitably invites the player to come closer and materialize the actions that were triggered earlier upon seeing the boxing gloves for the first time. The satisfying sensation is derived from the ability to punch the bag (either through a mouse click, which is implemented here, or with an actual forward movement of the controller while clenching a fist, as initially imagined for the Vive) and to see the bag respond to the action through its change of position and speed in space and time as well as auditory cues (impact sound). Moreover, beyond the expected displacement of the punching bag, the player is surprised by doves flying out of the bag every single time it is punched. It is at this moment that the player realizes they can not only influence the world physically, but also extend their bestowed agency to the innocent doves somehow “trapped” in the bag – deciding either to set them free with a view to spreading hope outwards, or to keep them inside, trying to hold on to the last bit of hope in this dark environment.

Documentation – Project 2: 3D Calculator

Project Description

Demo of final version

With the idea to make a “3D Calculator,” Ben, Keyin, Yeji and I worked together on this project. Its purpose is to imagine how the everyday activity of using a calculator could happen in a very different way with the help of VR.

Upon entering the scene, the user finds themselves in a bedroom (scene 1). In the center of the room there are a bed and a desk, on which sits a calculator, lit up by the lamp beside it. Once the user clicks on the calculator, they are taken into an alternate world (scene 2) where the calculation takes place.

Here, the user can form a formula by creating and dragging around cubes that represent operators and operands. The cubes that make up the formula remain red while not connected to the green equal-sign cube; once connected, they turn blue and the result pops out automatically.
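The connection check described above could be sketched as a simple graph traversal starting from the equal-sign cube. This is purely illustrative, not our actual implementation; `FormulaCube`, `SetColor` and `SnappedNeighbors` are hypothetical names.

```csharp
using UnityEngine;
using System.Collections.Generic;

// Hypothetical sketch: walk the chain of snapped cubes starting from
// the equal-sign cube, coloring every reachable cube blue; cubes not
// connected to '=' stay red.
public class FormulaChain : MonoBehaviour
{
    public FormulaCube equalsCube;      // the green '=' cube
    public List<FormulaCube> allCubes;  // every cube in the scene

    public void Recolor()
    {
        foreach (var cube in allCubes)
            cube.SetColor(Color.red);   // default: not connected

        // depth-first walk over snapped neighbors from '='
        var visited = new HashSet<FormulaCube>();
        var stack = new Stack<FormulaCube>();
        stack.Push(equalsCube);
        while (stack.Count > 0)
        {
            var cube = stack.Pop();
            if (!visited.Add(cube)) continue;
            cube.SetColor(Color.blue);  // part of a connected formula
            foreach (var neighbor in cube.SnappedNeighbors())
                stack.Push(neighbor);
        }
    }
}
```

Running `Recolor()` after every drag-and-snap would keep the red/blue feedback in sync with the formula’s actual connectivity.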

It feels very alternate because instead of pushing buttons on a traditional calculator, the user places numbers and operators in whichever way they want on a 3D canvas that can contain infinitely long formulas. They can actually place the cubes in specific ways to embody the logic behind the calculation, which digits on a small screen cannot do. Moreover, VR technology allows us to turn an electronic device into an entire environment.

Process and Implementation

Before making up our minds on the 3D calculator, our team discussed a few ideas. From pet keeping and programming to barbecuing, we more or less traversed the spectrum of interaction: human-animal interaction, human-computer interaction, and human activities involving non-responsive objects.

Draft of ideas and storyboards

We finally settled on the 3D calculator, as all of us agreed that the idea of being able to create and snap together virtual cubes sounded cool and futuristic. Personally, I was interested in how it could embody both everyday-ness and alternate-ness. A calculator is a common device that people are very familiar with, and the action of using it takes place in many scenarios in our daily life; but by reimagining the calculator within a VR context, we introduce the user to a brand new interface, where they get to experience a different set of operations. These new operations might open the door to many new possibilities for how everyday activities could be done in easier ways. Specifically, with this project we want to demonstrate that activities requiring a strong sense of logic, such as mathematical calculation and even programming, could take on a more graphic and intuitive approach. Once users get familiar with this kind of operation, tasks may be done more efficiently.

We divided the work in developing this project into two parts: building the everyday environment, and building the core interaction with cubes. Keyin and I took on the former, so I will first elaborate on the design of the bedroom scene (scene 1) with respect to the following aspects, and then introduce briefly the calculator scene (scene 2) Ben and Yeji built.

  • Purpose
  • Identities
  • Design: furniture, room structure and lighting
  • Interactions

Purpose The question to answer before doing anything is: what is the purpose of scene 1? I expected our scene to act as a friendly starting point for the user, where they would instantly feel familiar.

Identities This required us to create a homely and realistic atmosphere for the scene, to be as immersive as possible. We therefore decided that building a bedroom would be an easy way to achieve the purpose, as a strong feel of warmth and coziness is often attached to bedrooms, and this identity wouldn’t be too hard to realize with appropriate design.

Furniture We started with the bed at the center of the room, which has a very realistic texture that gives it a cozy feel. A nightstand was placed right next to the bed. Then we found a desk asset with incredible detail, and put it in front of the bed as the platform where the calculator would sit. Alongside the calculator, we placed some other stationery as well as a lamp on the desk in an untidy manner to reflect the casual bedroom identity. The chair came into the scene for a similar purpose. Other decorations include the curtains, a plant in the corner, and three pictures on the wall.

Room structure A significant feature of the room is the staircase on one side. The idea began with me suggesting that there certainly needed to be a door in the room, but Keyin added a staircase instead, which I later found to be a better idea. If we had a door there, it would most likely not be functional; the staircase, despite not being functional (walkable) either, opens up the space and extends the user’s perception of the house. Since there is a second floor in the house, the user doesn’t feel isolated in an enclosed space. The large French window, which provides a view of the sunset, serves that purpose as well.

Lighting We chose dusk as the time of day for the background because it brings out a serene feel and leaves much room for us to create our own lighting. The bedroom contains two spotlights casting light from above, each with a gentle intensity and a warm yellowish hue. The light on the second floor is in fact brighter, so that it may leave the user imagining a brighter space in other parts of the house even though they cannot go upstairs. Finally, the desk lamp casts a spotlight on the calculator, highlighting the most important object in the room and encouraging the user to interact with it.

Interactions There aren’t many interactions in this scene; the only ones are clicking on the calculator and pushing around the chair. Later, after we combined our scene with Ben and Yeji’s, Ben added the interaction of picking things up and throwing them around. We initially made the chair movable in order to enrich the gaming aspect of the project; after other objects became movable as well, I found the entire scene a lot more interactive, realistic and engaging.

Demo of interactions in first version of scene 1

The calculator clicking interaction was made using a raycast, which basically means that as the user looks around, the center of their view acts as a cursor that can be used to click on things. When the cursor points at the calculator, the calculator turns blue, suggesting that it is clickable. Once it is clicked, the user is teleported to scene 2. From the desk lamp to the color change, we tried to make it as intuitive as possible for the user to navigate and access all the interactions.
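A center-of-view raycast cursor like the one described above could be sketched as follows. This is a minimal illustration rather than our actual script; the names (`CenterCursor`, `"CalculatorScene"`) and the 3-unit ray length are assumptions.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical sketch: cast a ray from the center of the camera's
// view each frame; tint the calculator blue while it is hovered, and
// load the calculator scene when it is clicked.
public class CenterCursor : MonoBehaviour
{
    public Camera cam;
    public Renderer calculator;        // renderer of the calculator object
    public Color hoverColor = Color.blue;
    private Color normalColor;

    void Start() { normalColor = calculator.material.color; }

    void Update()
    {
        // ray through the exact center of the viewport
        Ray ray = cam.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));
        bool hovering = Physics.Raycast(ray, out RaycastHit hit, 3f)
                        && hit.collider.gameObject == calculator.gameObject;

        calculator.material.color = hovering ? hoverColor : normalColor;

        // left click while hovering teleports the user to scene 2
        if (hovering && Input.GetMouseButtonDown(0))
            SceneManager.LoadScene("CalculatorScene");
    }
}
```

The same pattern could also handle the return trip, since clicking the calculator in scene 2 brings the user back to the bedroom.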

Scene 2 When the user clicks on the calculator, everything in the room disappears except the calculator, which remains in the same place, while a sci-fi-ish scene appears and replaces the bedroom. The calculator effectively acts as a portal between the real world and the alternate reality, and the reason it is still present in the latter is that by clicking on it the user can return to the bedroom scene – something its unchanged position suggests.

Since we had to compromise and build the project for PC instead of the Vive, this core interaction feels a lot different from how we intended it, but the scene per se very much lives up to my expectations. (Thanks to Ben and Yeji’s hard work!) The glow-y cubes work well with the blue/black grid floor and the darkened background, creating a futuristic feel that contrasts greatly with scene 1 and presents the alternate-ness as expected.

Demo of an early version of scene 2

The interaction is not as intuitive as it would be with a VR device. Due to the number of operations needed in this scene, it involves several keys on the keyboard and all three mouse buttons. These operations include choosing an operand or operator; creating, moving, and deleting a cube; and walking around.

Reflection/Evaluation

One thing that’s been bugging me is how intuitive our project really is for the user. In other words, is this way of doing calculation simpler than the regular way? How is it more appealing? The core problem is: how many old conventions are we taking into consideration, and what new conventions are we trying to build with such an interface? In this sense, I think this project stayed far away from many conventions of using a calculator. Our interface does not include a digit keypad, which would be very familiar to most people. We don’t even have all the digits and operators laid out for the user to choose from; they have to scroll the mouse wheel to find what they want instead. Moreover, placing a cube at a desired position within a 3D space is difficult through a 2D display, because the sense of depth is largely missing.

But in the meantime, I can’t help but wonder what the project would be like in VR. Obviously, the user would have more dexterity in VR, with both hands available for more instinctive movements. It would probably come in handy and be faster than a traditional calculator once the user got used to the operations. Like I said, the ultimate goal is to explore the possibilities for 3D interfaces to become a common operating language that could serve more purposes than just mathematical calculation.

Agency Question

As Murray said, the pleasure of agency can come from double-clicking a file icon and seeing it open, or obtaining the desired output from a piece of code. In our project, creating cubes to form an equation provides the same type of pleasure, equivalent to pushing buttons on a calculator and getting the correct result. I think the summit of pleasure comes when the user places the equal-sign cube at the end of their formula. Whether they get the desired result or not, it is meaningful to see that result come out and fulfill their expectation, because it is generated from something they built. The colors of the cubes (red when incorrect or not connected, blue otherwise) as well as the grid on the floor help inspire the user to create things their own way and anticipate the result.

Boxing with Ares: Project 2 Documentation

Boxing With Ares Final

Project Description

Group Members: Nhi, Vince, Neyva

“Boxing with Ares” gets its name from Ares, the Greek god of war, who represents violence. Ares was chosen over the Roman god Mars because Mars represents valor in war rather than the violent aspect we wanted to show in our project.

Within our project, we really wanted to use the motion of “punching” in the boxing sense. While originally this was going to be done with VR controllers, which would give us feedback, we needed to scale it back to a mouse click within Unity. Something was lost in translation, but we were able to achieve the boxing animation that we wanted.

The boxing motion takes place in a dark and bleak environment. Red clouds fill a black sky, and the character stands in a light fog. Small punching bags just beyond reach are spread throughout the sky, filling the space above the user’s head. An ominous soundtrack (which can be found here: https://www.youtube.com/watch?v=Qm-El3qztgw ) plays on loop in the background. The hellish landscape contains nothing the user can interact with except for the one large punching bag situated in the middle of an open area.

The most important part of the project is that once the user punches, white doves appear out of the bag. Since the project had to make an unexpected action occur from an everyday motion, we found the idea of doves emerging from a punching bag to be one of the least expected outcomes. The dove is meant to represent a fleeting beacon of hope within the dark and hostile environment. Furthermore, this gesture of hope is juxtaposed with the violent action that causes the doves to spawn from the punching bag.

The thought was to create an alternate universe in which to situate this punching bag. We did not want a conventional setting, such as a gym or a forest, but created our own imaginary scene.

Process and Implementation

When we first began with the project, we thought to put the punching bag in a theater space.

Large punching bag situated on a theater stage, with an audience of chairs.

Our thought process was that any kind of prop placed on a theater stage looks natural, so we wanted to plop the large punching bag on the stage instead of using something like a gym environment. In a gym environment, the emergence of the doves would have seemed more nonsensical than surprising.

After talking to Sarah, we arrived at the idea of creating our own unique environment. One influence was the Matrix scene (shown below), in which there is nothing to interact with in the space besides the necessary objects.

Matrix scene after Neo first takes the red pill to see the truth

Building from this idea, we then began to look into imaginary dreamscapes that did not fit into a conventional kind of environment.

These were some of our earlier ideas, but finally we found one image which was a huge influence in our project.

We then took elements from this image, such as the lightbulbs (replaced in our scene by the smaller punching bags) and the fog. The image also leans heavily on black and gray, which we followed in our landscape as well.

Luckily, we were able to find a punching bag/boxing asset in the Unity store for free. It came with a large and small punching bag, plus a character wearing boxing gloves. This was the perfect fit for our project.

Here you can see the 200 punching bags we placed in the sky in order to recreate the lightbulb feeling from the image above.

We then added multiple clouds that were dark red and gray in the sky.

Then, within the floor, we created two particle systems to produce the fog effect around the user. (We followed this tutorial: https://www.youtube.com/watch?v=DvKRGwCImJ4)

We used a sheet of textures in order to create the smooth transition within the fog, shown below.

The second particle system was a lot simpler, as it was just floating dust motes. This was done simply using the Unity Particle pack.

Lastly, as part of the main interaction, we took the character from the boxing package and mounted the camera on him so the user would experience the scene in first person. The boxing package came with the scripts and sound effects for punching, and from there we built a script to spawn the white doves whenever a collision was detected between the user’s punch and the punching bag.
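As a minimal sketch of what that spawn script could look like (the actual script is not shown in this write-up; the prefab reference, the `"Glove"` tag, and the tuning values are all assumptions):

```csharp
using UnityEngine;

// Hypothetical sketch: attached to the large punching bag. When the glove's
// collider hits the bag, a handful of doves is spawned near the contact point.
public class DoveSpawner : MonoBehaviour
{
    public GameObject dovePrefab;    // assumed dove prefab with its own fly-away behavior
    public int dovesPerPunch = 3;
    public float spawnRadius = 0.5f;

    // Unity calls this when another collider touches the bag's collider.
    private void OnCollisionEnter(Collision collision)
    {
        if (!collision.gameObject.CompareTag("Glove")) return; // assumed tag on the boxing glove

        for (int i = 0; i < dovesPerPunch; i++)
        {
            // Scatter the doves slightly around the first contact point.
            Vector3 offset = Random.insideUnitSphere * spawnRadius;
            Instantiate(dovePrefab, collision.GetContact(0).point + offset, Quaternion.identity);
        }
    }
}
```

In the desktop build the "punch" was a mouse click, so the same spawn logic could equally be triggered from a raycast hit rather than a physics collision.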

Reflection

I believe we were able to achieve an alternate reality: the setting is very ominous and meant to seem like it came from a hellish dream. That, coupled with the music playing on loop and the only interaction being to punch a punching bag, definitely places the user in a completely new environment. I believe it was a good choice to follow the Matrix scene example, in which the only objects the user can interact with are the necessary items. The large punching bag, the only touchable object and placed all alone, is a very clear invitation for the user to come and punch it, and the player’s hands being boxing gloves is another clear affordance to punch the bag and see what happens.

Overall, we were able to create the project we had originally envisioned, except for the controller feedback aspect. We thought it would have been very fun to punch in VR, and the mouse click on our computers did not match the motion we had originally wanted. But other than that, we were able to create the scene we had originally envisioned.

On Agency…

Originally, through the punching motion with the VIVE controllers, the user was meant to feel satisfaction at seeing the punching bag move and react to their touch, along with satisfaction at seeing birds spawn from their actions. This could have been improved in our project by letting the mouse click be held to “charge a punch,” with a corresponding effect when punching. I also spoke above of the clear affordances in our project, as the punching bag was the only thing to interact with within the environment.

Project 2: Documentation

Project Description

The everyday activity we chose to reimagine is pressing a light switch, and it happens in a homey environment consisting of a kitchen and a dining area. We limited the room size so that the project could be carried out with the Vive, but due to some special circumstances we had to set it up as a regular desktop app instead. We have envisioned what the project would be like if carried out in a VR setting, and I will discuss the difference in the reflection section. To recreate the interactions triggered by pressing a light switch, we had to break the convention that a light switch only controls the lights. To realize the alternateness of this activity, we designed it to reflect the inner world of the audience, or more precisely the character in our first-person story.

In our scene, the audience finds themselves in a room and can only explore the space within it. The main interactions are with the two switches on the wall. To make the transition to an alternate aspect of reality go smoothly, the first switch the audience sees actually controls the light, though some changes to the environment come along with it. The second switch cannot be used to turn on the lights; its special responses are activated only when the lights in the room are off.


The demo below makes the story clearer. After entering the game scene, the audience finds themselves in the room. The door cannot be opened if one tries. Apparently the character is locked in the room for some reason (maybe self-quarantine). When the audience turns around and sees the window, they will see fireworks outside. But when they turn off the lights with the switch next to the window, they will notice the fireworks are also gone. So here we are, alone in this dark room with only the cold moonlight.


However, when the audience turns around, they will see another switch with a small light glowing. Turning this switch on, the audience will find themselves surrounded by fireflies that fly out from the plants. Even though we cannot get out of the room, we can still have our own light within this tiny space.



Process and Implementation

We went through a long brainstorming process to get to the final idea presented in our project. From the beginning, our goal was to recreate a regular event in which the audience interacts with household objects, and this main idea remained unchanged till the end. Initially, our character was a sleepy person in the early morning. After a sip of coffee made with a coffee machine, the lights in the room would automatically turn on, meaning the coffee lights him/her up. However, due to the limitation of available assets, we couldn’t get a satisfying model for the coffee machine and had to give up this idea.

Later on, since we couldn’t use the VR device and had to focus on one core interaction, we decided to build around the switch and implement some unexpected responses triggered by pressing it. Finally, we settled on two switches: one for controlling the lights, and another for making changes to the plants in the room.


  • Environment and Aesthetics

To begin with, Ganjina set up the room with furniture and decorations, and I helped with the arrangement of things in the room. In the initial environment, some space is left for the audience to walk around. One switch is on the wall next to the audience’s starting position so that it is easily seen. The other switch is on the opposite wall, with plants nearby.

Initial environment setup

Afterwards, Luize updated the scene by choosing a suitable color tone for the walls and floor. She also added the ceiling with the indoor light and more decorations in the dining area. The environment became warmer and cozier with the yellowish lighting.


Updated room scene with ceiling and indoor lighting
  • Scripts and Visual Effects

The scripts for this project are mainly about changing the states of the objects involved when a switch is pressed. The switch itself consists of two parts, one of which rotates when turned on/off. Chris implemented the interaction with the switch and the corresponding light changes.
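A minimal sketch of that switch behavior in Unity C# might look like the following (the field names, the angle value, and the assumption that a separate interaction handler calls `Toggle()` are all illustrative, not the actual script):

```csharp
using UnityEngine;

// Hypothetical sketch: flips the switch's lever and toggles the room light together.
public class LightSwitch : MonoBehaviour
{
    public Transform lever;        // the rotating part of the two-part switch
    public Light roomLight;        // the ceiling light this switch controls
    public float leverAngle = 30f; // how far the lever tilts when flipped

    private bool isOn = true;

    // Assumed to be called from a click/interaction handler elsewhere.
    public void Toggle()
    {
        isOn = !isOn;
        roomLight.enabled = isOn;
        // Rotate the lever around its local X axis to show the new state.
        lever.localRotation = Quaternion.Euler(isOn ? leverAngle : -leverAngle, 0f, 0f);
    }
}
```

The same rotate-one-part pattern extends naturally to the fridge and microwave doors mentioned later, with a hinge axis in place of the lever pivot.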

I was mainly responsible for the scripts and visual effects after the switches are triggered. Initially, we wanted to turn the plants from dead to alive to reflect how the character is feeling. But again, the available plant assets were not satisfying enough. Therefore, we changed our idea and instead created the story and fireflies mentioned in the project description section. In fact, I was inspired by the extraordinary situation in which many people around the world have had to lock themselves in their houses. I know how depressing and lonely it can be to spend days and nights in a limited space, not to mention with concerns about the unprecedented situation. However, I somehow have the feeling that eventually everything will come to an end. It’s not even going to take a miracle; it is simply the fate of us all. So I used the image of fireflies as a metaphor for a subtle but certain faith. The fireworks outside enhance the contrast before and after the light goes off: the fireworks only go off while the light is on, while the firefly effect can only be triggered when the light is off.


Fireflies with the room light off
Fireworks with the room light on

For the visual effects of the fireflies and fireworks, I basically played around with the ParticleSystem in Unity. The fireflies have a relatively long lifetime so that they can fill the room before disappearing. To simulate the motion of fireflies, noise is added to the particle emission trail. As shown by the bounding boxes of the particles below, the size of the fireflies varies with their speed and lifetime. Their color (or glow) dims over their lifetime as well.
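These settings were presumably made in the Inspector, but the same configuration can be expressed in script, which makes the described choices concrete. This is only a sketch mirroring the description above; every numeric value here is an assumption:

```csharp
using UnityEngine;

// Hypothetical sketch: configures a firefly-like ParticleSystem from script,
// mirroring the settings described above (long lifetime, noise, fading glow).
[RequireComponent(typeof(ParticleSystem))]
public class FireflyEffect : MonoBehaviour
{
    void Start()
    {
        var ps = GetComponent<ParticleSystem>();

        var main = ps.main;
        main.startLifetime = 8f;                      // long lifetime so fireflies fill the room
        main.startSpeed = 0.3f;                       // slow drift
        main.startColor = new Color(1f, 0.9f, 0.4f);  // warm firefly glow

        var noise = ps.noise;                         // wandering, insect-like motion
        noise.enabled = true;
        noise.strength = 0.5f;

        var colorOverLifetime = ps.colorOverLifetime; // glow dims over lifetime
        colorOverLifetime.enabled = true;
        var gradient = new Gradient();
        gradient.SetKeys(
            new[] { new GradientColorKey(Color.yellow, 0f), new GradientColorKey(Color.yellow, 1f) },
            new[] { new GradientAlphaKey(1f, 0f), new GradientAlphaKey(0f, 1f) });
        colorOverLifetime.color = gradient;
    }
}
```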

The fireworks each consist of emission trails, small sparkles (as sub-emitters) along the trail, and bursts at the ends of the trails. The trails and sparkles have gradient colors so that the progression from birth to death of each emission looks natural.


In addition, the audience can also open the fridge and the microwave; the mechanism is similar to flipping a switch. The only difference is that the doors rotate around an axis rather than around a point, as the switch does.

Reflection

From my perspective, we have realized the alternateness in our project. Conventionally, people don’t expect a switch in their room to be able to start or stop fireworks outside. Furthermore, we don’t usually see fireflies indoors, so that is also an unexpected event.

As for the medium, we initially envisioned the project being implemented with a VR setup. We think it would be quite different with hand gestures for pressing the switch or interacting with the microwave and fridge. I added colliders to the fireflies because I think it would make for a better experience with VR devices when the fireflies fly into the character (even better if a sound effect were added).

Besides, the long brainstorming process of this project was especially inspiring to me. Before the actual implementation, we went through several rounds of idea sharing and feedback collecting with the whole class. At the very beginning, we didn’t even have any alternate elements in our design. The three interactions we had chosen were also random and unrelated. We got the suggestion that, for the entire activity to make sense, we needed a story that links everything together. That’s when I realized the meaning and importance of coherence, and I’ve kept it in mind throughout the project.


Agency

There are two things we designed regarding agency. First of all, in the first version the fireworks were far from the room, and one had to walk close to the window to see them. We thought they were not noticeable compared to the first switch from the starting position. Thus, we adjusted the position of the fireworks so that the player could see them at first glance. Moreover, we moved the switch closer to the window so that the player could more easily discover its function.

The second detail is the small glowing light on the second switch. It only lights up in the dark so that it catches the attention of the player to press it when the light is off.


Fire Planet: Documentation

Project Description
Fire Planet is a small narrative experience that places users in the position of a firefighter in an alternate world/planet engulfed by wildfire. In this world, civilization has been reduced to living under a protective dome, with water sprinkler units protecting its outside perimeter. Placed in the midst of a catastrophe – one of the sprinklers has malfunctioned and wildfire approaches the city – it is the user’s role to use their powers (in the form of magical projectiles) to extinguish the flames and fix the sprinkler. This reflects an alternate-world activity: it is an everyday action for the firefighter/protector of this world, yet it takes place in a setting quite distinct from the reality we know.

Process and Implementation
Brainstorming
The brainstorming process for this project with Will and Steven was actually quite time-consuming yet fruitful. We started by pitching any action that occurred to us, and which we thought would be interesting to explore and use in our VR game. After much deliberation, and after considering a lot of crazy options that in retrospect would have been too ambitious to successfully complete, we realized that what we were missing was deciding on an experience that would fully capitalize on the advantages and affordances of the VR medium. This realization eventually led us to our chosen concept: that of a protector/firefighter who must protect their city from flames. The action that the user would be doing in this scenario was fanning two large fans to extinguish the flames. We believed that the action of fanning something in the air would be an interesting game mechanic to use, especially in VR. In terms of what would be everyday in this world – we decided that the idea/action of extinguishing flames (and thus suggesting that in this world fire is also dangerous) would work well to suggest the user’s objective. However, the alternate-ness would then emerge through the means of putting out the fire, along with the setting itself (which is on a vast, desert-like planet with a dome-encased city in the distance). Making the decision of how we would switch this concept to a non-VR game was easy – instead of fanning the fire out, the user would now throw an orb at it. This decision was made since we realized it would be more intuitive for users to use the keyboard to shoot something, rather than to fan.

This storyboard was the result of our brainstorming session, showing the location of the user in between the wildfire and the city.

Implementation

The first step we made once we finalized this concept was to divide the different components we had to work on to carry the game out successfully. These were the components:

1. Environment

The environment was the most straightforward part, as we all had experience with it from Project #1. Steven was in charge of bringing it to life – adding the dome and the city, using a rugged landscape hinting at the alternate-ness of the environment, adding fog and then finally adding the sprinklers and the fire. Although this environment didn’t require too many components, the careful placement of them was crucial, as they were key in framing our narrative. For instance: having a wall of fire behind the water particles suggests that these flames are somehow contained to their space, and are thus safe. Placing other fires in front of the user would suggest that these are the ones that are dangerous for the city and must be extinguished. 

Rugged terrain surrounded the user
A distant city surrounded by a dome could be seen directly opposite from the fires
Opposite the city the user could see the fire wall, stopped by the sprinklers

2. Interaction 

The interaction was further divided into several parts: rendering the predicted projectile path for the magic sphere and shooting an object along that same trajectory, detecting collisions between the user’s projectiles and the wildfire, and finally having the water sprinkler reboot when the user gets close to its vicinity. Out of all these, my main role was to render the predicted projectile path and enable shooting along it. To do this, I followed this tutorial, which demonstrated how to create a line-rendering script as well as a spawning script that allowed projectiles to be shot along that same trajectory. The script was also flexible enough to fully customize the look, location, and angle of the line, as well as to change which object would be shot, making these components easy to combine with Will and Steven’s work.
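The core of such a path renderer is sampling the closed-form ballistic equation p(t) = p0 + v0·t + g·t²/2 into a LineRenderer. The sketch below illustrates the idea under assumed names and values; it is not the tutorial's actual script:

```csharp
using UnityEngine;

// Hypothetical sketch: plots the predicted ballistic path of the orb
// by sampling the projectile equation into a LineRenderer.
[RequireComponent(typeof(LineRenderer))]
public class ProjectilePathRenderer : MonoBehaviour
{
    public Transform launchPoint;   // assumed spawn point/orientation of the orb
    public float launchSpeed = 12f;
    public int samples = 30;
    public float timeStep = 0.1f;

    void Update()
    {
        var line = GetComponent<LineRenderer>();
        line.positionCount = samples;

        Vector3 v0 = launchPoint.forward * launchSpeed;
        for (int i = 0; i < samples; i++)
        {
            float t = i * timeStep;
            // Closed-form position under constant gravity at time t.
            line.SetPosition(i, launchPoint.position + v0 * t + 0.5f * Physics.gravity * t * t);
        }
    }
}
```

Spawning the projectile with the same `v0` as its initial Rigidbody velocity is what guarantees the orb actually follows the rendered line.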

Line render showing the predicted projectile path hitting another object

3. Character 

Since we were using a first-person non-VR character, we wanted to show our character’s hands and a shooting motion that would hint that the users themselves were generating these magic spheres. When we combined the project, we made sure to sync the activation of the hand animation with the shooting of the orbs.

Throwing animation with orb

4. Combining all of these to create our story

After getting the foundational interactions done, a lot of time was spent tweaking the experience and adding enough information/hints for users to understand the story they were placed in the midst of, while also being able to apply the game mechanics to fulfill their objective. Due to the linearity of our experience, this was an aspect we really struggled with in the later stages of the project’s development. At first, we weren’t sure it was obvious to the user which fires they were meant to extinguish and which were actually contained by the sprinklers. We even changed the story completely at one point, removing the sprinklers entirely so that the only goal was to protect a tree that had caught fire. In the end, after much deliberation, we decided to stick to our original idea while making small yet effective changes to the game’s design that would make the story clearer and more intuitive. For this, we ended up swapping our sprinkler object for a flashier one (which even included an animation!) and added a large, flashy cylinder surrounding it that would always indicate to the user which direction they had to go.

A closer look at the water turret with the surrounding cylinder

To be more consistent on components that were separate from the environment’s objects, we matched the look and color of the cylinder with that of the line rendering. We were also very careful with our choice of narration – we didn’t want it to just sound like instructions being read on screen, we wanted the person talking to feel like another character in the story, thus building the universe they belong in. Through a carefully made script, we tried to give enough context about what was happening in the story while also tying in a lot of the components that would have otherwise seemed a bit random and misplaced (such as the water fountains). We also edited it in Audacity to create the feel as if the voice was coming from a sort of communication device – enhanced by the white noise and static we added into it. 

Reflection/Evaluation
Overall, I feel that we did achieve an alternate version of this activity, even if it was a very specific one like putting out fires by throwing a magical spell at them. As mentioned earlier, a lot of the latter part of the development process was spent ensuring that the experience offered enough affordances for players to carry out their mission. Obvious indicators, such as the turquoise cylinder and the color-matching projectile line rendering were key in establishing a relationship between the short term objective of putting out the fires by aiming at them, and the long-term goal of reaching the broken water sprinkler unit. Placing the extinguishable flames in a loose line going towards the broken sprinkler was also an effective choice that naturally led users towards their end goal. Though at first it was hard to do the transition away from VR, I’d say that the medium ultimately didn’t majorly affect the implementation of our idea. Our story was there – we just had a slightly different way of telling it now. Finally, I feel that the end result really reflects the mental image most (if not all) of our team members had of the experience. Initially, we each had our own mental image of how the experience would look and feel, but I’d say our game combined all these conceptions we had really well, which I am really happy about. 

Agency question: 

In Fire Planet, a “meaningful” action we designed for the user is the ability to throw magical orbs, particularly for the purpose of extinguishing fires. This action is triggered by aiming with the mouse and pressing the spacebar to shoot. Its design incorporates various aspects beyond this pressing mechanic. The projectile path render facilitates aiming at different objects, since the position of the mouse on the screen does not necessarily reflect the raycast aim of the game. The positioning and throwing animation of the hands, triggered when the player shoots, is an additional element that aims to situate the player more in their character. We didn’t want it to seem like the magic orbs were appearing out of nowhere, which is why this hand motion was crucial to situating users in the character of our firefighter. The meaningfulness of this throwing action comes from what it allows: reaching the broken sprinkler and fixing it. In a way, this action is a crucial plot device for our story. Additional outputs from throwing the orb – smoke emerging from the hit location and a sizzling sound when the fire is extinguished – are also choices meant to enhance the experience of carrying out this action.

Game demo:

Project 2 Documentation

3D Calculator

Documented by: Keyin

Group Member: Tiger, Ben, Yeji

Project Description

This project reimagines using a calculator in three-dimensional space. Instead of pressing buttons at fixed positions to get a result on a fixed screen, doing calculations in our project offers more freedom, and the visualized process in 3D breaks the usual mode while creating a functional, portal-like experience with a calculator. We built the calculator world as an alternate world composed of blue grids and glowing cubes. The participant can move, turn around, and even float in the sky while manipulating the cubes to do calculations. Besides the calculator world, we also made a realistic bedroom scene in which the alternate calculator world can be triggered by the normal calculator on the desk. The rules of the two worlds are quite different, and we tried to use the realistic bedroom world to exaggerate the alternate quality of the calculator world.

Process and Implementation

Initially, after brainstorming, we had quite a few ideas that could have been developed further. We finally chose the one built around 3D cubes. The other ideas, for example building an alternate backyard, could have been fun as well, but we preferred to play with the flexibility of simple 3D shapes in a limited space and make things creative but also simpler and clearer. We considered applications of 3D cubes including doing calculations and doing programming, because most of our group has a CS background and these are indeed everyday activities for us. In this project, we implemented the 3D calculator.

The scene starts in an ordinary bedroom, and the participant is able to move around at room scale. When the participant picks up a calculator on the desk, they are transported to an alternate world with only the calculator and glowing cubes in sight. Originally, in the VR environment, we thought about making the cubes throwable, with the result dropping down from somewhere in the sky, so that it would feel very immersive and alternate. But later, since the project was redirected to be built on PC, we simplified the effects to suit a PC experience in which we mostly interact with the mouse, keyboard, and screen, and dragging objects became our main way of interacting in both scenes.

We first split into two pairs to build the two scenes separately and then combined our work into a complete project. While working on our own scenes, we also thought about consistency, such as how to change scenes and how interactions should feel. Tiger and I worked on the first scene and initially used SceneManager to shift between two different scenes. Ben and Yeji instead played with visibility within the same scene. Both approaches can work, and we went with the latter, because it made it more convenient to match the camera settings across the two scenes. We met some problems when we combined our work, mostly missing materials and textures. But we fixed them one by one and also improved the effect on the calculator during the transition.

The calculator scene uses snapping to connect cubes and perform calculations. By applying different movement speeds and rules, the participant gains more freedom in this scene. The cubes involved in the calculation turn blue while the others remain red, and the result pops up on top of the “=” cube. If there is anything wrong with the arrangement of the cubes, the result shows “Error”. The mouse interaction includes left clicking, right clicking, middle clicking, and scrolling. It may take time to get used to this 3D calculator, but it indeed provides a totally different experience of this everyday activity.

More detailed process can be found in Development Journal. And here are demos of our two scenes.

First Realistic Scene
Second Alternate Scene

Evaluation

The alternate part of our project lies in the calculator world, which is readily perceptible thanks to its environment, so different from the real world, and the unlimited movement directions available to the participant. To guide the participant toward the calculator, we made it glow whenever the center of the screen is pointed at it. Correspondingly, in the alternate world we have a big glowing cube functioning as a desk, with the calculator placed at its center and given the same glow effect whenever targeted. This indicates that the calculator is a switch for shifting between the two scenes. However, compared with the natural interaction of throwing and moving objects in the first, realistic scene, the interaction in the second, alternate scene may not be as intuitive because it relies on different mouse clicks; we think adding some instructions could be useful. In general, our project has successfully achieved what we wanted to create. Even though we deviated from the very original idea, which was set in a VR world, we still preserved the core interaction of our project and made the experience as rich as possible on PC. The different choices made under different environmental constraints were also part of our learning experience.

Agency

The main interaction of our project is, no doubt, playing with the 3D calculator, where the participant can click, drag, and move with the keyboard. The action is meaningful because it ultimately shows the result a person using a calculator expects. As I mentioned in the evaluation, we could make the interaction more obvious by adding some instructions. But from another angle, the participant has a chance to explore the alternate world through their own understanding by figuring out how to make it work, considering that WASD keyboard and mouse interaction are very intuitive on PC. With the very simple setup at the beginning, where there is only an “=” cube and a cube functioning as a controller close at hand, it should be easy to find out how to create new cubes and connect them. Besides, we also designed an action for shifting between the two worlds: we fixed the position of the calculator and added a highlight effect when it is hovered over, to indicate that it is the medium connecting the two scenes. It also makes sense in that the participant will want to see the working space of a 3D calculator when they choose to click the glowing calculator on the desk. Moreover, we picked a starting point right in front of the staircase in the first scene because we wanted to encourage the participant to move around and explore on their own. And that is how we thought about the idea of agency along the way.

Project 2 Documentation | Fire Planet

Description

The user stands on a barren planet, fires burning all around, with mechanical sprinklers keeping most of the flames at bay. Behind the user towers a city in the distance, surrounded by a dome. A voiceover calls upon the user to use their powers to fight the fires and repair a sprinkler that has broken. Using their abilities as the designated protector, the user can fire projectiles from their hands to extinguish the fires and clear a path to the sprinkler. When the user reaches it, the sprinkler reactivates, and the voiceover congratulates them. In short, the user fights fires on a different planet.

Process

We started by thinking about interesting, observable, everyday actions. Building upon our ideas from the in-class activity and considering the controllers of the HTC Vive, we landed on the action of waving a fan. Simultaneously and iteratively, we considered in what situations waving a fan does occur, and could occur, inside an alternate reality. This led us to think about firefighting with a fan. With this action and character context, we built a narrative of a civilization under constant threat from the fires of the planet it lives on.

We imagined what the life of a firefighter in such a civilization would be like. A civilization under constant threat of fires would establish more permanent and secure defenses against the ever-present threat of fire. Hence, the city of the civilization is covered by a dome, and the dome is surrounded by large mechanical sprinklers. The role of a firefighter as a first-responder shifts slightly in this reality, then, as they are tasked with repairing and maintaining the defenses rather than fighting the fires themselves. This translated into the user repairing a broken sprinkler, using a fan to fight the fires which had encroached inside the boundary in the meantime.

storyboard

Due to the change from a VR system to a computer system, we had to reconsider the fan interaction. While an HTC Vive controller mimics the handle of a fan and can be waved independently of the user turning their head, waving a mouse around in first person would force the user’s head and aim to move erratically, or require an unconventional restriction of movement. Because of these affordance problems, we changed to an earlier idea: throwing water balloons. The FPS-reminiscent setup allows a clearer connection to what users already expect from a mouse-and-keyboard game. Upon in-class feedback, we abstracted the water balloons into spells.

With the world, character, and interaction worked out, we moved to divide up the work and start building. Steven worked on the scene design and hand animation. Mari worked on the projectiles. I worked on the particle system interaction between the extinguishing projectile and the fires.

To create the effect of an endless fire, the fires had to burn continuously and reignite if extinguished. The reignition had to be delayed in order for the user to clear a path. After exploring the asset store for different fire systems, I started to use the Fire Propagation System. I followed the usage instructions, learned exactly how all of the parts worked, and realized it would not work for us. The fire system was too realistic: its fires burned based upon available fuel and different material and climate properties. After the calculated “HP” of a material is reached by the ignition system, a fire starts, and then burns until the material’s fuel value is exhausted. Unfortunately, we did not want a fire that would always die.

I read through the sections of the Unity reference on particle systems and collision, then found tutorials on scripting delayed actions and on disabling and re-enabling game objects. With this research, I was able to write two scripts. One detects collisions between the extinguishing particle system and a fire particle system and disables the fire upon collision. The other detects when fires have been disabled and re-enables each one after a given delay. With this much simpler method, the affordances of the extinguishing spells and the response of the environment are much clearer.
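The pair of scripts could be sketched roughly as follows. This is a minimal sketch rather than our exact code: the component names, the `ExtinguishingSpell` tag, and the five-second delay are illustrative, and it assumes the spell’s particle system has collision messages enabled and each fire object carries a collider.

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Attached to each fire object: disables the fire when hit by
// particles from the extinguishing spell.
public class FireExtinguisher : MonoBehaviour
{
    void OnParticleCollision(GameObject other)
    {
        if (other.CompareTag("ExtinguishingSpell"))
            gameObject.SetActive(false);
    }
}

// Attached to a manager object: watches the fires and re-enables
// each one a few seconds after it has been disabled.
public class FireReigniter : MonoBehaviour
{
    public GameObject[] fires;        // all fire objects in the scene
    public float reigniteDelay = 5f;  // seconds before a fire returns

    private readonly HashSet<GameObject> pending = new HashSet<GameObject>();

    void Update()
    {
        foreach (GameObject fire in fires)
            // Add() returns false if already pending, so each disabled
            // fire only gets one reignition coroutine.
            if (!fire.activeSelf && pending.Add(fire))
                StartCoroutine(Reignite(fire));
    }

    IEnumerator Reignite(GameObject fire)
    {
        yield return new WaitForSeconds(reigniteDelay);
        fire.SetActive(true);
        pending.Remove(fire);
    }
}
```

Running the coroutine on the manager rather than on the fire itself matters here: Unity cannot run coroutines on a disabled GameObject, so the fire object could never time its own return.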

The scripts were combined with Mari and Steven’s work. As final touches, a voiceover was added to clarify the narrative and contextualize the user’s role in the experience, and a cylindrical indicator was added to guide the user to the disabled sprinkler.

Scripts

Script Demo Images

Reflection

In the end, this experience successfully establishes an everyday activity in an alternate world. It approaches this challenge by considering a specific role, firefighter, and restructuring that role in an alternate reality, the fire planet. The voiceover, hand animation, particle path, and fire system all clearly demonstrate or reinforce the different affordances of the experience. While the world and the character matched what we envisioned early on, the interaction itself lacks much of the seamlessness afforded by a VR system and its controllers. While the materiality and contextualization of the interaction are unique, we did not have the opportunity to explore the potentialities of a new medium the way creating a fanning interaction in VR would have allowed.

Agency

The main action designed for the user in Fire Planet is extinguishing fires. Its meaningful quality relies upon the instinctive perception of uncontained fire as a threat, the intuitive perception of a city as something to be defended, and the provided (via voiceover) impetus of the user’s role as a protector. The planet’s features and the orientation of the user help establish these perceptions. The uncontrolled fires contrast against the fires controlled by the functioning sprinklers. The user starts close to and facing the uncontrolled fires, with the city behind them. Placing them between danger and the endangered, and having them face the danger rather than face away from it, suggests the user’s role as a guardian about to defend rather than a victim about to flee. The voiceover then makes all of this explicit through narrative, establishing a motive and a specific task, repairing the sprinkler, that will allow them to fulfill their role completely. All of these elements combine to drive the user to extinguish the fires.

Project 2 Documentation

Project Description

The main activity used as the basis of the project was using a calculator. The goal was to re-imagine the calculator interface in a VR setting. The calculator most people are familiar with is either a physical calculator or an app on a phone or computer. Both versions display results on a two-dimensional screen, and input is usually restricted to pressing buttons, whether physical buttons or those on a screen. With this in mind, we wanted to make use of the VR platform to design a functioning interactive calculator in three-dimensional space.

As the project’s emphasis is on the mix of both the usual (everyday) setting and a sense of the alternate, we decided to have two settings. The first, the user’s spawn location, is an everyday setting: a person’s bedroom. The bedroom is furnished with objects an average person would possess – bed, drawer, bookshelf, etc. Non-textual elements – such as lighting and color – are used to naturally direct the user towards the calculator. The interactions in this room follow the conventions of the real world, gravity for example, to provide a clear distinction between this world and the alternate. The alternate world is accessed by clicking the calculator. The interactions in this virtual calculator interface have qualities that break with the everyday scene – flying, an endless black background, cubes that float in space and snap to each other – creating a clear sense of an alternate reality.

Brainstorming

Project Two’s brainstorming process was different due to its collaborative nature. The team met up in person, each member with potential ideas in mind. Several ideas came up: further developing the 3D drawing idea, simulating throwing glass, cooking food over a campfire, falling from the sky, a pet simulator, backyard work, etc. Ultimately, Ben mentioned re-imagining the programming interface in VR, and discussing the potential of this idea led to re-imagining the calculator interface in three-dimensional space.

Implementing two different scenes made the division between the everyday and the alternate very clear. The everyday surroundings of the bedroom were achieved by placing familiar furniture – desk, bed, plants, etc. – under warm lighting that created a comforting ambience. Furthermore, Ben allowed the user to interact with the objects in the space, making it more interactive. The alternate aspect of the project comes in when the user clicks the calculator. While clicking on other objects, such as the pencil or plant, allows the user to grab and throw them, clicking the calculator teleports the user inside the 3D calculator interface. The alternate environment is distinct, evident in the endless black background, the glowing objects, the grid-like floor, and the user’s freedom to fly. The way objects – cubes, in this case – interact is also different: the cubes are not influenced by gravity and snap to each other.

Process / Implementation

Due to the cancellation of physical classes, and partly for convenience, our team divided the work, allowing each member to work remotely. As we had two different scenes, it made sense for two people to take charge of each: Keyin and Tiger on scene 1 (the bedroom), and Ben and I on scene 2 (the calculator).

I was responsible for designing the necessary assets in the alternate world, cubes being the main one. We needed cubes representing the single digits (0–9) and the operators (+, -, *, etc.). TextMeshPro was used to display these symbols; TMP exposes the text through script, which was convenient for Ben when altering the numbers and operators. As the background of the alternate world was dark, a glow effect was applied to the cubes. This was achieved using the Lightweight Render Pipeline, which includes a bloom effect.
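Changing a cube’s symbol through TextMeshPro takes only a few lines. The sketch below is illustrative, not our actual script; the `CubeLabel` component and `SetSymbol` method are hypothetical names.

```csharp
using TMPro;
using UnityEngine;

// Attached to a digit/operator cube; lets other scripts change
// the symbol shown on the cube.
public class CubeLabel : MonoBehaviour
{
    public TMP_Text label; // the TextMeshPro component on the cube

    public void SetSymbol(string symbol)
    {
        label.text = symbol; // e.g. "7" or "+"
    }
}
```

Exposing the `TMP_Text` reference as a public field is what makes the symbols scriptable from elsewhere, rather than baked into the prefab.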

After finishing the design for the cubes, I worked on the teleportation script to familiarize myself with C#. Following Sarah’s suggestion, we decided to place the two worlds in a single scene and teleport the user’s position instead of switching scenes. I initially used a raycast to identify where the cursor was pointing. The raycast was only fired when the mouse was pressed; when the ray hit the desired object, the user was teleported to an assigned location. The script was applied to the player.

From feedback, the script was simplified significantly by applying it to the calculator itself. The raycast was no longer needed, as teleportation was initiated when the calculator was clicked. Another issue arose from the First Person Controller: shortly after being teleported, the player was immediately returned to their default location. Ben fixed this by temporarily disabling the First Person Controller while the teleportation took place.
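The simplified teleport, including the controller workaround, might look roughly like this. It is a sketch under stated assumptions, not our exact code: the names are illustrative, it assumes the calculator has a collider (so Unity’s `OnMouseDown` fires on click), and it assumes the controller is a component that can simply be toggled via `enabled`.

```csharp
using System.Collections;
using UnityEngine;

// Attached to the calculator object. OnMouseDown fires when the
// object's collider is clicked, so no manual raycast is needed.
public class CalculatorTeleport : MonoBehaviour
{
    public GameObject player;
    public Transform destination;               // spawn point in the calculator world
    public MonoBehaviour firstPersonController; // the FPS controller component

    void OnMouseDown()
    {
        StartCoroutine(Teleport());
    }

    IEnumerator Teleport()
    {
        // Disable the controller so it cannot snap the player back.
        firstPersonController.enabled = false;
        player.transform.position = destination.position;
        yield return null; // wait one frame before re-enabling
        firstPersonController.enabled = true;
    }
}
```

Moving the script onto the calculator is what removes the raycast: Unity already resolves which collider was clicked before calling `OnMouseDown`.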

After each person completed their part, the project was assembled in person. Teleportation was implemented to connect the two scenes. Most of the time was spent adapting the Lightweight Render Pipeline to the first scene.

Reflection / Evaluation

The project managed to create an alternate version of both the interaction with the calculator and its internal interface. The alternate version of the interaction itself was achieved through prompting a different response when clicking the calculator as opposed to clicking the other objects in the room. When clicking the other objects, the user can grab and throw them whereas clicking the calculator teleports the user. This contrast clearly distinguishes the calculator from the rest of the objects. The alternate version of the interface is achieved through the visual and physical elements that clearly differ from the everyday scene. The relative flexibility in the alternate scene allows interactions – such as flying, floating cubes – to occur. 

The primary visual cue prompting the user to interact with the calculator is the desk light: the calculator is placed in the center of a large desktop under a bright spotlight. When the user hovers the mouse over the calculator, the object lights up to indicate a special interaction and encourage the user to click.

In the alternate world, colors represent the different statuses of the cubes and the expression. Moving around and placing objects follow the conventions of many computer games (WASD and mouse click), allowing the user to quickly adapt to the controls. The “=” cube, which exists by default, along with the block in the right corner, signals the user to place a block and add to the expression.

Different colored cubes

The end result turned out similar to the original designs. The alterations that were made were primarily due to the change in platform from the Vive to the computer. The keyboard allowed more freedom in user movement, such as flying, and provided more input buttons, which were necessary in the alternate world to support the operations. As most interactions in the bedroom were limited to holding and releasing the mouse, they could easily be adapted to the Vive.

Agency Question

Calculators allow for the computation of infinite expressions, and this function itself creates a high-agency experience for the user. The project simply provided another visual interpretation of that function. Hence, the interactive aspects (pull and drag, snapping, flying), in addition to the user’s freedom to compute any expression and see results in all cases, provide a high-agency environment. This allows the user to focus more of their attention on formulating expressions and watching for the right answer and feedback, and less on the physical interactions themselves.