Galaxyze: the final documentation

For our group project, Shenuka and I worked on Galaxyze, an open-ended VR environment that allows the following interaction/engagement:

  • The user controls the stars in the night sky by rearranging pebbles on a beach.
  • A trivial action on the ground makes an impact on a cosmic scale.

The goal of this project is to let the user become a creator in a fictional world that is also quite relatable, in the sense that it’s a deserted island surrounded by the sea (much like the Saadiyat bubble). The user can draw patterns in the sky with trails and constellations, feeling powerful yet also isolated by their limited mobility in the world.

This idea first arose from a chapter of Italo Calvino’s Invisible Cities, and we wanted to explore the “stuff of story” – what makes a story, the lack of a narrative, and the open-endedness of it.

The environment consists of the following, based on the idea that the user must feel small in the vastness of the cosmos above and the ocean below.

  • Island: Terrain tool
  • Pebbles/ Water: Unity Asset Packs
  • Stars: Spheres with glowing material
  • Sky: Night sky skybox
  • Trails: Particle systems set to fade with time
the environment

The scripts we wrote are the following:

  • Mapping the position of the stones (x and z values) to the positions of the stars.
  • Detecting when all the stones have been thrown in order to play the ending animation.
  • Switching from the ambient music to the ending music.
Galaxyze in its final stage
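
In rough sketch form, the logic behind these scripts looked something like the following. (This is an illustrative sketch only – the class names, fields, and numbers are placeholders, not our exact code.)

```csharp
using UnityEngine;

// Sketch: maps a stone's x/z position to its paired star,
// and reports the stone to a manager once it has been thrown into the water.
public class StoneToStar : MonoBehaviour
{
    public Transform star;           // the star paired with this stone
    public GalaxyzeManager manager;  // counts stones and triggers the ending
    public float starHeight = 60f;   // fixed height of the star layer (placeholder)
    public float scale = 5f;         // how far the star moves per unit of stone movement (placeholder)
    public float waterLevel = 0f;    // y value below which the stone counts as "thrown" (placeholder)

    private bool counted = false;

    void Update()
    {
        // Mirror the stone's x/z movement up in the sky.
        star.position = new Vector3(transform.position.x * scale,
                                    starHeight,
                                    transform.position.z * scale);

        // Once the stone sinks below the water, count it toward the ending.
        if (!counted && transform.position.y < waterLevel)
        {
            counted = true;
            manager.StoneUsed();
        }
    }
}

// Sketch: once every stone has been used, switch from the ambient music
// to the ending music and play the ending animation.
public class GalaxyzeManager : MonoBehaviour
{
    public int totalStones = 20;     // placeholder count
    public AudioSource ambientMusic;
    public AudioSource endingMusic;
    public Animator endingAnimation;

    private int stonesUsed = 0;

    public void StoneUsed()
    {
        stonesUsed++;
        if (stonesUsed >= totalStones)
        {
            ambientMusic.Stop();
            endingMusic.Play();
            endingAnimation.SetTrigger("PlayEnding");
        }
    }
}
```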

Looking back, our project came out differently from how we first imagined it. We were able to complete it thanks to multiple rounds of play testing, experiments, and feedback from peers and Sarah.

Here’s footage of Craig testing it out at the IM Show exhibition (the sound is a little noisy – you may want to mute it):

Galaxyze in IM showcase

The main challenges we faced were in writing the scripts, since it was the first time either of us had used C#. However, after ample research, late-night experiments, a crazy amount of debug logs, tests, and help from friends, we were able to solve them all!

If we had more time, I think we could work more on the environment and make it feel more alive. This could involve adding:

  • splashing noises and wave sounds
  • dolphins jumping out of the water from afar
  • more stars in the sky
  • shooting stars here and there
  • different types of assets to pick up – not just pebbles, but also shells and other broken/abandoned remnants that we find on the beach these days. I think this could add more to the backstory and give more contextual evidence as to where they are

Galaxyze: Development Blog 02

Following my previous post, we made more progress on making the VR environment more immersive and aesthetically pleasing.

The VR environment now looks more realistic and complete with the addition of these assets:

  • rocks of different sizes and shapes
  • island terrain
  • water surrounding the island


When we were play-testing our prototype last time, we noticed how people were throwing away the rocks into the water (of course they’d do this…). Several solutions were proposed, such as making ripple effects or making the rocks bounce back at the player if thrown too far. We decided to add trails to the stars as they move so players can feel free to “draw and paint” on a blank canvas in the sky.

We achieved this by adding a particle system to each star. We then tested it out in the Game view, trying out different settings to decide on the best effect. Here’s a video of what it first looked like:

the star trail…

As you can see in the footage, the trail at first looked like little blobs multiplying over time. To make it look more appealing, we made the following changes:

  • color: changed from yellow to white
  • opacity: made the particles fade out slowly over time
  • emission: capped the total number of particles at x, created at a rate of y each time
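
Roughly speaking, these tweaks correspond to ParticleSystem settings like the ones below (a sketch only – the specific numbers are placeholders, not the values we ended up using):

```csharp
using UnityEngine;

// Sketch: configures a star's particle system so the trail starts white,
// fades out over its lifetime, and emits at a steady capped rate.
public class StarTrailSettings : MonoBehaviour
{
    void Start()
    {
        ParticleSystem ps = GetComponent<ParticleSystem>();

        // Color: white instead of yellow.
        var main = ps.main;
        main.startColor = Color.white;
        main.startLifetime = 3f;      // how long each particle lives (placeholder)
        main.maxParticles = 200;      // cap on the total number of particles (placeholder)

        // Emission: particles created per second (placeholder).
        var emission = ps.emission;
        emission.rateOverTime = 20f;

        // Opacity: fade each particle out slowly over its lifetime.
        var colorOverLifetime = ps.colorOverLifetime;
        colorOverLifetime.enabled = true;
        Gradient fade = new Gradient();
        fade.SetKeys(
            new[] { new GradientColorKey(Color.white, 0f), new GradientColorKey(Color.white, 1f) },
            new[] { new GradientAlphaKey(1f, 0f), new GradientAlphaKey(0f, 1f) });
        colorOverLifetime.color = fade;
    }
}
```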

We also edited the code for the rock-star movement so that the star’s z value is multiplied by a factor, making it move not directly above the rock (on the same axis) but slightly to the side. This resulted in the star moving along a much more natural, realistic-looking trail.
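
In sketch form, the change amounts to scaling the z value before assigning it to the star (the names and the factor here are placeholders, not our exact code):

```csharp
using UnityEngine;

// Sketch of the tweak: scale the rock's z value so the star ends up
// slightly to the side rather than directly above the rock.
public class RockToStarOffset : MonoBehaviour
{
    public Transform star;
    public float starHeight = 60f;   // placeholder height of the star layer
    public float zFactor = 1.5f;     // placeholder multiplier for the sideways offset

    void Update()
    {
        star.position = new Vector3(transform.position.x,
                                    starHeight,
                                    transform.position.z * zFactor);
    }
}
```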

more realistic star trail! (ft. our midnight happy squeals)

We then proceeded to add music in the background. This contributed a lot to making the experience feel more mysterious and adventurous. Here’s a video of Max play-testing it:

Max having the time of his life with not one but TWO controllers!
Junior fascinated by the experience!

We also fixed a few bugs here and there, such as some stars not corresponding correctly to their rocks’ movements (getting stuck within one area) and some stars not having the right trails.

More things to work on:

  • adding texture to the moon
  • the final scene when all rocks/pebbles have been used up
  • sound

Galaxyze: Development Blog 01

Shenuka and I decided to name our project Galaxyze. We thought it was a fitting title considering how our VR environment features stars, the universe, and connecting with the world “out and above.” We also thought the “-yze” ending made it look like an action verb, suggesting that the players have to do something in the game – to draw and create constellations, patterns, and trails in the sky with little rocks on the ground.

We started out by creating a small plane for the ground and adding a star-filled sky for the skybox, as well as adding little cubes that we’ll later transform into rocks and stars:

Galaxyze in its initial stage

We also worked on a prototype of the first interaction between the player and the world. We imagined the player being able to pick up rocks on the ground and then see a corresponding star copying and reflecting their movement. We achieved this by essentially creating two cubes – one on the ground and one elevated in the sky – and sending the y position of the cube below to the one above. You can see Junior playing with our prototype in the video below:

Junior testing out our interaction prototype
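
In essence, the prototype script boiled down to something like this (the names and offset value are illustrative, not our exact code):

```csharp
using UnityEngine;

// Sketch of the prototype: the elevated "star" cube copies the y movement
// of the "rock" cube on the ground, offset upward into the sky.
public class CopyHeight : MonoBehaviour
{
    public Transform rockCube;     // the cube the player picks up
    public float skyOffset = 30f;  // placeholder: how far above the ground the star sits

    void Update()
    {
        Vector3 position = transform.position;
        position.y = rockCube.position.y + skyOffset;
        transform.position = position;
    }
}
```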

I was also delighted to find that the players who tested our prototype began sitting down to observe the object in the sky better (see pictures below), as this was closer to the experience we were going for – sitting on an island and looking up into the sky to watch the impact one is having and creating. It was great to see people do this because we didn’t instruct them to in advance – the environment and interaction had apparently made it intuitive and natural for them to play it this way.

Junior sitting down to enjoy stargazing and galaxyzing to the fullest! 🙂

After presenting this to the class, we received some feedback that we’ll be working on next:

  • how to signal players to look up at the stars? – sound? visuals? shadow on the ground?
  • create more rocks and stars
  • more environment visuals
  • how to make interaction more interesting – more than just the star copying the rock’s movement – creating trails?

VR Park review

My trip to the VR Park was a very interesting one, to say the least. When we first arrived, I was super excited and amazed by the number of different rides available. Like a couple of other people, my initial thought was “why aren’t there more people here?!” Everything looked so fun and I imagined it to be bustling with people. But funnily enough, halfway through my time there, I slowly realized why this was: there are definitely drawbacks to this new technology, and I got to experience several of them firsthand.

My favorite experience was the Burj drop. It was the first ride that I went on, and at this point I was filled with excitement and expecting the best to come. I put on my headset and immediately saw what you would see if you were up in the Burj Khalifa. I was apparently with another man, and by the look of his helmet and outfit, we were doing some construction work up in the air. The ride began and we were going up and down at varying speeds. I felt very immersed in it because the movement felt very real. What I was seeing in the headset corresponded to how my body felt, which added to the adrenaline. For me, the light and the sound did play a role in making the experience feel fuller, but neither was extraordinary.

However, I realized this wasn’t the case for some other rides: even though my vision was aided by movement in the ride, I wasn’t as immersed and was sometimes even thrown out of the experience. I think the key is to find the right balance between the visuals and the physical movement – if there’s too much of any one aspect, it immediately takes me out of the experience. Most of the other rides, for example, had way too much movement. My body was being thrown from side to side, and it was very hard to focus on the game as I began feeling dizzy and sick. The headset also felt very heavy, so when there were sharp turns in some of the rides, I felt a lot of pain and pressure on my neck. Even some of the props that we had to hold, like guns, were quite heavy, and holding them for prolonged periods of time would tire me out by midday.

Overall, I had fun going on different rides and I had never experienced anything like it. However, I doubt I’ll be back anytime soon, given the headaches, neck strains, and dizziness the rides caused!

Final project: Lauren & Shenuka

For our final project, we’ve decided to create an environment where the player can move pebbles on the ground of a deserted island surrounded by sea, with the stars up in the sky reflecting the movement of those pebbles as they are moved. After a certain time has passed, the sun rises and sets, “renewing” the sky and giving the player a new blank canvas to create another constellation on.

This idea came from reading the chapter from Invisible Cities where things happening on the ground affect what’s above them in a similar manner.

Here’s what we imagine the environment to look like:

courtesy of Shenuka Corea

*note: the sea surrounding the little deserted island provides a natural boundary that constrains where the player can go.

And a storyboard of how the world would work:

courtesy of Shenuka Corea

Our project aims to use the space, the objects within it, the relationship between them, and the sense of time in relation to cause and effect to convey “the stuff of story.” The interaction between the player and the world lends itself to discoveries and experiments.

Assets we’ll need:

  • stars in the sky
  • island terrain and water for the ground
  • night sky as skybox
  • animation of sun rising and setting to restart the sky

Interactions to design (+code):

  • moving of pebbles – objects with gravity, and responsive to where the player is moving them
  • similar reflection in the moving of stars
  • extra time element (delay) added to stars so they leave a trail behind as they’re moving into places corresponding to pebbles, creating the effect of shooting stars!
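
One way the delay could be sketched out is by easing each star toward the position derived from its pebble instead of snapping it there. Everything below – names, height, smoothing time – is a placeholder for a first experiment, not a final implementation:

```csharp
using UnityEngine;

// Sketch: the star drifts toward the position derived from its pebble
// instead of snapping there, so quick pebble movements read like shooting stars.
public class DelayedStar : MonoBehaviour
{
    public Transform pebble;
    public float starHeight = 60f;   // placeholder height of the star layer
    public float smoothTime = 0.8f;  // placeholder: larger values = more lag behind the pebble

    private Vector3 velocity = Vector3.zero;

    void Update()
    {
        Vector3 target = new Vector3(pebble.position.x, starHeight, pebble.position.z);
        transform.position = Vector3.SmoothDamp(transform.position, target, ref velocity, smoothTime);
    }
}
```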

Sound/light:

  • ambient during nighttime – see first photo referenced above
  • calming sounds of ocean waves in the background
  • stars above sparkling a little

Midterm project: final documentation

Recycling @NYUAD is a project that strives to bring awareness to the environmental and waste issues at NYU Abu Dhabi. It seems that many of our students lack the knowledge to recycle properly and frequently, so we want to use this project to address that problem.

The look & feel:

the environment that I made!
look from another angle

One important decision we made about the campus was to focus on just one portion of it – not too much space, but enough to move around and see trash littered about, while still feeling like an enclosed space so players don’t wander off and instead stay within the designated area. Our chosen area was the space right outside D2, where there are grass patches, with A6 and The Arts Center off to the side.

My job was to create the actual space. I gathered references by taking photos of the real space and by looking at Google Maps to see what the player would actually be seeing if they were standing there.

Initially I tried looking for prefabs that could be used for this project, but because our campus is very unique in design, it was difficult to find anything similar. So I started building the buildings from scratch in Unity using 3D shapes. The key was to layer them together to mimic the building structures and add elements for detail.

On my part, I’m pretty satisfied with how the environment turned out. It was my first time building assets from scratch and it took a lot of trial and error, but I enjoyed the process and liked the result. I also spent a while experimenting with different skyboxes and eventually settled on a bright, cloudy sky, which fit the environment quite well. The main things I learned in the process of building the 3D space were 1) using the right colors, 2) getting the relative sizes of the buildings correct, and 3) adding small but important details that make the space look more realistic and accurate.

After I completed all the buildings and the environment was finished, I passed it on to Ju Hee, who incorporated prefabs of objects that populate the space, such as chairs, tables, and trash.

For the interaction, Simran and Yufei worked on how the player would pick up the trash. The pieces of trash glow yellow when the player is nearby, indicating that they can be picked up and then dumped in the recycling bin. One sound plays if the trash is recycled properly and another plays if it’s not.
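
A minimal sketch of how these cues could be wired up in Unity might look like this. (I wasn’t the one writing this part, so the names, tags, and distances below are illustrative, not Simran and Yufei’s actual code.)

```csharp
using UnityEngine;

// Sketch of the pick-up cues: tint the trash yellow when the player is close,
// and play one of two sounds depending on which bin the trash lands in.
public class TrashItem : MonoBehaviour
{
    public Transform player;
    public float glowDistance = 3f;          // placeholder proximity threshold
    public Color glowColor = Color.yellow;
    public AudioSource correctSound;         // plays when recycled into the right bin
    public AudioSource wrongSound;           // plays otherwise
    public string correctBinTag = "RecyclingBin";  // placeholder tag on the correct bin
    public string anyBinTag = "Bin";               // placeholder tag on the other bins

    private Renderer rend;
    private Color baseColor;

    void Start()
    {
        rend = GetComponent<Renderer>();
        baseColor = rend.material.color;
    }

    void Update()
    {
        // Highlight the trash whenever the player is within reach.
        bool nearby = Vector3.Distance(player.position, transform.position) < glowDistance;
        rend.material.color = nearby ? glowColor : baseColor;
    }

    void OnTriggerEnter(Collider other)
    {
        // Play feedback depending on which bin the trash ended up in.
        if (other.CompareTag(correctBinTag))
            correctSound.Play();
        else if (other.CompareTag(anyBinTag))
            wrongSound.Play();
    }
}
```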

On reflection, if we had more time I think we could have made the interaction more sophisticated – for instance, making the trash come to life and react angrily if the player chooses to ignore it and not pick it up to recycle it. It could shake and make a roaring sound until the player actually picks it up. I think this would have made the experience more engaging and interesting. Making the trash come more alive would also take advantage of VR as a medium, since it’s not bound by how things work in the real world.

We also had issues re-scaling the environment for the interaction, as the space itself was pretty big. Looking back, I think we could have spent more time adjusting the size and scale.

I would also work more on the space, decorate the buildings a little more, and maybe even add animations of people sitting around and chatting to each other near the Dining Hall. All of these contributions would add to the experience when the player is in the space, making it engaging and immersive.

After user-testing & presentation in class:

I was very delighted to find that a lot of my classmates found our project very fun to play. To our surprise, people started throwing the trash around to see if they could throw it into the trash can from afar. It was interesting to see how our supposed weakness of having a huge space contributed to the fun. Moving forward, we could make use of this behavior – if the player throws the trash from afar and misses the trash can, it comes flying back and doubles in number! To add to the educational element, we could also have words pop up onscreen, giving numbers and facts about waste management at NYU Abu Dhabi and how different kinds of trash should be properly recycled.

I was also pleased that people found the environment very familiar. I spent a lot of time trying to build The Arts Center, The Dining Hall, A6 as well as the grass patches from scratch, so it was very rewarding to hear my friends telling me that they could immediately recognize the space.

Midterm project: development 02

Moving on, I built out the space around The Arts Center more. It took a lot of trial and error to get the right look and to see which details would add to it. I realized that getting the overall shape of the buildings and the colors right really makes them look realistic. Adding windows was another fine detail that made the buildings come more alive.

Screenshots from work:

building more grass patches around the area

I also looked to see what other structures I could add to make the space more recognizable and realized that there were long, rectangular direction platforms around the grass patches. I went ahead and recreated this in the space:

After that I started building the Dining Hall as well as A6 on the other side.

I identified A6 as being the tallest building among the three, and made sure that was clearly visible in the space.

the initial stage of A6
A6 rising up from the ground!

And finally, here we have the final look of the actual environment!

Midterm project: development 01

For our group, we decided to make a project about bringing awareness to the environmental and waste issues at NYU Abu Dhabi. It seems that many of our students lack the knowledge to recycle properly and frequently, so we want to use this project to address that problem.

The interaction we want to recreate in this VR project is the act of picking up trash on the ground and recycling it into the right bin. We decided to use our own campus as the environment in the project. One important decision we made about the campus was to focus on just one portion of it – not too much space, but enough to move around and see trash littered about, while still feeling like an enclosed space so players don’t wander off and instead stay within the designated area. Our chosen area was the space right outside D2, where there are grass patches, with A6 and The Arts Center off to the side.

So far, we have divided the work into the following duties:

Building the environment – Lauren (making buildings and environment) & Ju Hee (adding objects, trash and cats to the space)

Making the interaction – Yufei & Simran (picking up of the trash, recycling into the bin)

For the environment, I decided to recreate the actual campus space (buildings, composition, staging, and lighting). Initially, I was going to decide on a color palette and build a space based on the campus using it. We thought maybe we could build an “angular” version of our campus using simple shapes. But after trying that out with simple cubes, I decided that the key was for players to be able to identify it as our campus, so it had to look more realistic. I then went to take some reference photos of that area and looked at Google Maps to get an accurate sense of the space.

referencing Google Maps for accurate details

I started working on The Arts Center first because it’s big and easy to recognize; once I had it finished, I could build the space around it. I also searched for a lot of textures that I could use for the buildings as well as the ground and environment.

The Arts Center in the making!

The Arts Center with more details

a close up of The Arts Center

I’m happy with the progress I made for today. More coming!

Forms of representation in VR

I think one of the biggest strengths of VR is the fact that the player feels immersed in an alternative environment and feels an intimate connection to that world and the objects within it.

Given this feature, I think this medium is fitting to represent building of relationships between people – a simulation of how people interact, connect, and bond.

multinational friends in VR talking to the player

I first thought about this idea because of the environment we’re in at NYU Abu Dhabi, where we have students and faculty coming from literally everywhere around the world. I recently served as a Peer Ambassador at the last Candidate Weekend, where I talked to prospective students and realized how overwhelming it can sometimes be for people who have never travelled abroad before. I think simulating human interactions through VR could help students like these learn how to situate their thoughts and conversations in a global context.

In this sense, VR could be used as an educational tool for people wanting to learn about people hailing from all over the world. For this specific representation, it could incorporate intricate details specific to different cultures and societies by simulating the different reactions and responses from characters within the VR world coming from different countries. The player can meet and converse with these characters and be informed about the nuances of different cultures.

Such representation in VR provides an interface for interaction where people learn to approach people of different backgrounds, ask informed questions, and know what factors to keep in mind when conversing.

But this leads to questions such as

  • Why not just interact with real people?

I think such a representation of multicultural and multiethnic interaction in VR is definitely in no way suggesting that this is the only or most accurate manifestation of what these interactions would look like; rather, it is a way for people to start thinking about how to act, approach, and behave when they are in such situations. There are benefits to meeting people like these characters in person, but for those who fear making mistakes and want to avoid being overwhelmed, this is a good option to try out.

  • What does VR bring to this specific form of representation?

I think having realistic characters to guide you through human interactions can prove very valuable, not just for people who want to “practice” and get to know other cultures before meeting people in person, but also for those who struggle to socialize or are introverted – VR can strive to provide an experience as close to reality as possible.

Google Cardboard VR Experience Review

I chose to watch a 360 video featuring dinosaurs in a jungle in hopes of experiencing what it would be like to be in Jurassic Park.

One thing I noticed about the composition of the environment is the use of positive and negative space. I think it’s important that the view is not saturated with so many assets that the player/viewer doesn’t know how to move about the place. There needs to be enough empty space for them to figure out how to navigate through the environment. At the same time, the design needs to be clear so that the player/viewer can recognize navigation points like roads. In other words, it should be obvious that there’s a road ahead so the player/viewer knows that they have to go forward (see picture #1).

Some of these “navigation points” are clear in the game’s use of coins placed on the road ahead. The coins appear out of nowhere, bright and glowing, prompting the player to approach them (see picture #2). When you get close enough, a coin disappears, suggesting that you’ve successfully acquired it.

picture #2: a glowing coin/diamond!

This specific game seems to also use arrows to direct the player (see picture #3). When the player is lost and looks around, arrows appear on the screen to guide the player. These are all different ways a VR environment tries to communicate information to the player.