Development Journal: Project 3

Storyboard
Moodboard
Proposed environment layout

For this project, Keyin, Chris, and I decided to go with the Escape Room idea. We had several specific ideas in mind, but we eventually settled on one where the user is wheelchair-bound. The reasoning is that by keeping the player seated while playing/viewing in VR, we not only reinforce immersion more easily but also counteract the motion sickness often felt while moving through a VR world.

In terms of scenario, we eventually settled on a hospital setting, mainly because we felt it would create a compelling narrative around the wheelchair as well as a suitable environment for a wheelchair-based user to navigate. We also decided to go for a “fake-happy”, light-hearted low-poly theme rather than an outright horror theme, so that the contrast gives our game more of a psychological-horror ambience.

My portion of development

https://streamable.com/gs96wb
Extremely barebones first prototype showcasing movement + interaction mechanics using only one button (a rough sketch of this kind of one-button control appears after these clips).
https://streamable.com/1ae7el
Collision-based pushing and physics interactions between objects within the world.
Wheelchair and player model.
POV perspective of player model.
https://streamable.com/tiypks
Wheelchair pushing animation.
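
A rough sketch of that one-button control could look something like the script below; the button name, fields, and tuning values are placeholders rather than what the prototype actually uses. Holding the button pushes the wheelchair's Rigidbody forward (so props with their own Rigidbodies get shoved aside by its collider), while a quick tap raycasts from the camera to interact with whatever is in front of the player:

```csharp
using UnityEngine;

// Hypothetical sketch: hold one button to push the wheelchair forward,
// tap the same button to interact with the object the player is facing.
public class OneButtonWheelchair : MonoBehaviour
{
    public Rigidbody chairBody;        // wheelchair's Rigidbody
    public Transform viewCamera;       // the player's head/camera
    public float pushForce = 4f;       // forward push strength (tuning value)
    public float interactRange = 2f;   // how far a tap can reach

    private float holdTime;

    void Update()
    {
        if (Input.GetButton("Fire1"))
        {
            holdTime += Time.deltaTime;
        }
        else if (Input.GetButtonUp("Fire1"))
        {
            // Short press: interact with whatever is in front of the camera.
            if (holdTime < 0.2f &&
                Physics.Raycast(viewCamera.position, viewCamera.forward,
                                out RaycastHit hit, interactRange))
            {
                hit.collider.SendMessage("OnInteract",
                                         SendMessageOptions.DontRequireReceiver);
            }
            holdTime = 0f;
        }
    }

    void FixedUpdate()
    {
        // Long press: keep applying a forward force so objects with Rigidbodies
        // get pushed aside by the chair's collider as it rolls into them.
        if (holdTime >= 0.2f)
        {
            chairBody.AddForce(viewCamera.forward * pushForce, ForceMode.Acceleration);
        }
    }
}
```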

Invisible Cities Response: Maurilia

Maurilia strongly reminds me of many of the European cities/towns that I used to visit. In Maurilia, travelers are encouraged to glorify what Maurilia used to be: a quaint rural town with no particular distinctions. This old Maurilia is preserved and portrayed through postcards showing what used to stand where things are now — for example, a hen in the place of a bus stop. The idea is that the contrast between the modernity of present-day Maurilia and the rural feel of old Maurilia evokes a strong sense of nostalgia. However, the two versions of Maurilia are arguably too different to be considered the “same” Maurilia; it would be more suitable to consider them two cities that happen to share a name.

In terms of a real-life equivalent to Maurilia, the city of Graz, Austria comes to mind. Graz is now the second-largest city in Austria behind Vienna, and it is often characterized as an odd combination of future and past. One notable area surrounds the Kunsthaus Graz, a strangely shaped art museum that runs on solar power. The museum is a stark contrast to the more conventional, traditional buildings around it and serves as a distinct example of the aforementioned “future meets past.” The city's tourism marketing takes a similar approach: guides often point out what landmarks “are” as opposed to what they “used to be.” It seems that, from the perspective of Graz's inhabitants, there is a clear divide between Graz now and Graz before, even though the two intermingle within the same space.

Graz - Cities of Design Network
Landscape of Graz
Kunsthaus Graz - Wikipedia
Kunsthaus Graz

Note: Reposted due to odd error/oversight.

Documentation: Project 2

Description

Link to Github repo

Link to executables

For this project, we decided to use the act of calculating as a basis for our interactions. The idea is that the user starts in an everyday setting — in our case, a bedroom — but upon interacting with a calculator on the desk, the user is transported to an alternate world where the calculations will occur. In this world, calculations are performed by dragging and snapping blocks together rather than pressing keys in a “2D” setting.

Process & Implementation


Conceptually, the ideas were more or less there from the start; most of the difficulty came from implementing the systems and dividing up the work. Tiger and Keyin worked on the bedroom world, Yeji worked on the alternate world, and I worked on scripting/back-end. Most of the interactions were built with Rigidbody physics and MonoBehaviour.OnMouseDown. The world switching was achieved by placing both worlds in the same position and toggling which one was active when the calculator was clicked. To achieve more visually pleasing graphics, we used LWRP and the Post-Processing Stack, as well as several PBR shader graphs.
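
As a rough sketch of that world-switching setup (object and field names here are placeholders rather than the ones in our repo), a script on the calculator's collider can simply toggle which world root is active:

```csharp
using UnityEngine;

// Minimal sketch of the click-to-switch idea: both world roots sit at the
// same position, and clicking the calculator toggles which one is active.
// Names are placeholders, not the actual objects in the project.
public class CalculatorSwitch : MonoBehaviour
{
    public GameObject bedroomRoot;     // everyday bedroom world
    public GameObject calculatorRoot;  // block-snapping world

    // Unity calls OnMouseDown when the Collider on this object is clicked.
    void OnMouseDown()
    {
        bool bedroomActive = bedroomRoot.activeSelf;
        bedroomRoot.SetActive(!bedroomActive);
        calculatorRoot.SetActive(bedroomActive);
    }
}
```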

The development journal contains videos of the process of creating the dragging/snapping interactions.

Reflection/Evaluation

I feel that we successfully provided an alternative interpretation of the act of calculating. Although the idea itself is relatively novel, the implementation works well because the actions used (dragging, clicking, moving) are intuitive and easy to learn. The end product turned out to be much more robust than we initially expected, despite the fact that we couldn't use VR. For example, we initially hadn't thought about including interactions in the bedroom world because we were primarily focused on interactions in the calculator world. Perhaps the lack of VR even allowed for more freedom of implementation; after all, we no longer had to worry about VR-induced motion sickness or limitations on movement when programming the worlds.

Agency

The interactions that best enable user agency are those found in the bedroom world. Upon entering the bedroom, the user is first encouraged to walk around. They will then likely bump into the chair, which moves in response. The user is then encouraged to interact with other objects in the scene by dragging and clicking around; because most of the items in the scene are interactive or responsive in some way, the user feels that they have a great degree of influence over the environment. This also helps contextualize the sense of freedom the user is meant to feel in the alternate world, and perhaps lets users lose themselves in the environment.

Development Journal: Interaction Project

For this interaction project, we initially had several outdoors-oriented ideas to work from, including camping, gardening, and rock climbing. Eventually, however, after some consideration, we decided to go with a 3D block/cube-based calculator. In this environment, the user would be able to drag and snap blocks together to calculate results from the blocks' contents. The idea was to focus not only on interactions between the user and the blocks (e.g. dragging), but also on interactions between the blocks themselves. For example, if you were to drag blocks with 1, +, and 1 together, you would get a result of 2. We also decided to have some sort of teleportation/environment-switching mechanic in which a small calculator could be clicked to move from a “normal” office environment to the block-snapping environment.
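
A minimal sketch of the drag-and-snap mechanic could look like the script below; the class name, token field, and distance thresholds are illustrative placeholders, not the implementation shown in the update videos:

```csharp
using UnityEngine;

// Illustrative sketch: drag a block with the mouse, and on release snap it
// next to the nearest other block if it is close enough. Values are made up.
public class SnappableBlock : MonoBehaviour
{
    public string token = "1";        // block contents, e.g. "1", "+", "2"
    public float snapDistance = 0.6f; // how close counts as "touching"
    public float blockWidth = 0.5f;   // spacing used when snapping side by side

    private Camera cam;
    private float dragDepth;

    void Start()
    {
        cam = Camera.main;
    }

    void OnMouseDown()
    {
        // Remember the block's distance from the camera so dragging stays in plane.
        dragDepth = cam.WorldToScreenPoint(transform.position).z;
    }

    void OnMouseDrag()
    {
        Vector3 screen = new Vector3(Input.mousePosition.x, Input.mousePosition.y, dragDepth);
        transform.position = cam.ScreenToWorldPoint(screen);
    }

    void OnMouseUp()
    {
        // Find the nearest other block and snap beside it if within range.
        SnappableBlock nearest = null;
        float best = snapDistance;
        foreach (SnappableBlock other in FindObjectsOfType<SnappableBlock>())
        {
            if (other == this) continue;
            float d = Vector3.Distance(transform.position, other.transform.position);
            if (d < best) { best = d; nearest = other; }
        }
        if (nearest != null)
        {
            transform.position = nearest.transform.position + Vector3.right * blockWidth;
        }
    }
}
```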

Update 1

Video

Basic block snapping is functional, but has several glaring issues with collision.

Update 2

Video

Block creation and deletion have been implemented. Some of the collision issues have been solved, but new issues with raycasting and block positioning have appeared.

Update 3

Video

Circuit-style interaction between blocks has been implemented. Most of the issues regarding collision, raycasting, and positioning have been solved.

Update 4

Video

Figured out and finalized interactions between the player and blocks. Also added mechanics for creating different types of blocks, as well as an output block.

Intelligence in a Responsive Environment

In the fields of accessibility and childcare, responsive environments refer to settings that allow a child or a person with a disability to gain a sense of influence over their surroundings. The idea is that these settings motivate the person to learn and, more importantly, to interact with the environment. Carried over to general user-experience design, a responsive environment serves a similar purpose: it motivates the end user to interact with the environment by giving them a sense of influence. Through intuitive cues placed in the environment, the user should ideally be able to navigate it and gradually learn its rules and quirks along the way.

Given that machine learning and AI have progressed so much within the past few years, the logical next step would be to build some form of learning into responsive environments. As a small-scale example, a responsive drawing program might learn which tools the user picks most often and rearrange the toolbar accordingly, or suggest tools to use at certain times. From my perspective, giving responsive environments the ability to learn makes them more responsive in turn. After all, if a responsive environment is meant to motivate a user to interact with it through intuitive cues, then a learning responsive environment would (hopefully) make those cues even more intuitive. This would further motivate user interaction; the idea of a learning responsive environment opens many opportunities for designers to enhance the environments they create for users.
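
As a toy illustration of the drawing-program example above (purely hypothetical code, not tied to any real application), the "learning" could be as simple as counting how often each tool is used and reordering the toolbar by frequency:

```csharp
using System.Collections.Generic;
using System.Linq;

// Toy sketch of a "learning" responsive toolbar: record tool usage and
// reorder the toolbar so frequently used tools drift toward the front.
public class AdaptiveToolbar
{
    private readonly Dictionary<string, int> usage = new Dictionary<string, int>();

    // Call this whenever the user picks a tool.
    public void RecordUse(string tool)
    {
        usage[tool] = usage.TryGetValue(tool, out int count) ? count + 1 : 1;
    }

    // Return the tools sorted by how often they have been used.
    public List<string> OrderedTools(IEnumerable<string> allTools)
    {
        return allTools
            .OrderByDescending(t => usage.TryGetValue(t, out int c) ? c : 0)
            .ToList();
    }
}
```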

Documentation for Project 1: Synthwave

Source can be found here.

APK can be found here.

For Project 1, I decided to create a retrofuturistic environment inspired by the music/art genres of synthwave and vaporwave. By synthesizing and expanding on ideas, elements, and emotions expressed in existing synthwave media, I sought to create a minimalist, nostalgic space where the viewer could escape from “real” reality. Ideally, this space would be constructed in a way that steered the viewer away from questioning elements of the surroundings and toward simply accepting their “random” nature for what it was. Achieving this required getting both the design and the implementation of the synthwave space right.

Throughout the design/implementation process, I experimented with a variety of things. I initially started with a static environment containing only the sun, sky, mountains, and a grid, but decided that this environment was too boring. To make the experience a bit more dynamic, I wrote a script that scrolls the grid texture to create the illusion of forward movement. This also had the unintended but welcome effect of making the terrain seem much more expansive than it really is. Since music is a defining characteristic of vaporwave and synthwave, I added a statue behind the camera that serves both as an audio source situated in the 3D space and as a point of interest that gives the viewer a reason to look behind them. To complement the music, I added planets orbiting the sun, then made the sun and planets vibrate along with the high and low frequencies of the music. Finally, I added shaders, lighting, and post-processing (bloom and chromatic aberration) to truly bring out the neon and glossy elements.
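
Both the scrolling grid and the audio-reactive vibration boil down to a few lines each; the sketch below approximates those two effects (field names and tuning values are placeholders, not my actual scripts):

```csharp
using UnityEngine;

// Approximate sketch of two effects described above: scrolling the grid
// texture to fake forward motion, and pulsing the sun with the music's
// low-frequency energy. Field names and numbers are placeholders.
public class SynthwaveEffects : MonoBehaviour
{
    public Renderer gridRenderer;     // the neon grid plane
    public float scrollSpeed = 0.5f;  // texture offset per second
    public Transform sun;             // object that pulses with the bass
    public AudioSource music;         // synthwave track

    private readonly float[] spectrum = new float[256];
    private Vector3 sunBaseScale;

    void Start()
    {
        sunBaseScale = sun.localScale;
    }

    void Update()
    {
        // Slide the grid texture toward the camera to suggest forward movement.
        Vector2 offset = gridRenderer.material.mainTextureOffset;
        offset.y -= scrollSpeed * Time.deltaTime;
        gridRenderer.material.mainTextureOffset = offset;

        // Sample the playing audio and sum the lowest bins as a rough bass level.
        music.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);
        float bass = 0f;
        for (int i = 0; i < 8; i++) bass += spectrum[i];

        // Scale the sun with the bass so it appears to vibrate with the music.
        sun.localScale = sunBaseScale * (1f + bass * 2f);
    }
}
```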

Overall, I am quite proud of the end result. Despite having little experience in Unity, I managed to learn Unity's C# framework, basic shader programming, and some useful quirks and intricacies that I hope to exploit in my next projects. The visual simplicity of the environment made it quite suitable for mobile devices, and I ran into little to no issue while play-testing. Most importantly, I feel that I successfully brought to life my own interpretation of a synthwave world that I could get lost in.

Primary view.
Closer look at the statue.
Development angle which shows how the scene is constructed.

Development Journal for Project 1: Synthwave


Inspired by synthwave/vaporwave art and music styles, I decided to bring the pictured retro-futuristic landscapes from 2D into 3D. Common themes among most of these synthwave landscapes include a forward-moving perspective, a vibrant sun on the horizon, wireframe terrain, and neon gradients. The compositions tend to evoke the retro, nostalgic feeling often associated with synthwave; some might describe it as akin to “riding into the sunset in a DeLorean.”

However, these landscapes are limited to 2D representations; we do not get a 360-degree view of the entire environment. Thus, I hope to create an interpretation of a synthwave environment in 3D. Key goals I wish to accomplish include making the environment behind the viewer — facing away from the sun — interesting in some way so as to promote more active head-turning, as well as creating a more dynamic environment with subtle movements (e.g. swaying palm trees). In addition, I hope to incorporate synthwave/vaporwave music, likely as ambient sound. Lastly, I hope to add some personal touches that expand on the ideas and feelings surrounding the composition while staying true to the retro-futuristic vibe of the environment.

UPDATE 2/15/2020 6:13AM

Working draft of the basic synthwave environment is more or less finished.

Features:

  • Neon grid
  • Subtle (mobile-friendly) bloom
  • Distant mountains
  • Massive flat sun
  • Dope color scheme

Findings:

  • Chromatic aberration seems to be already present in the Cardboard (visual artifact or intentional? Either way, it works in my favor)

UPDATE 2/16/2020 3:13AM

Managed to obtain an Android to live-test the environment.

New features:

  • Grid now moves
  • Added statue as audio source + fitting music

Fixes:

  • Tweaked colors to compensate for mobile screens
  • Adjusted skybox rotation for better aesthetics

Response: Hamlet on the Holodeck, Ch. 3

I found it interesting to consider VR as a conceptual descendant of sorts of ELIZA and other archaic technologies that were not as visually intensive. Although one might not immediately see the nontrivial similarities between a chatbot and an audiovisual framework, both technologies are designed to fool the end user's perception of reality. In ELIZA's case, this meant fooling the user into believing ELIZA was human; in the case of VR, it means fooling the user into believing that the virtual world they are seeing and hearing is real. What made ELIZA so impressive was that despite its eerie humanity, its implementation was quite small and simple — even current graphing calculators can run the full implementation flawlessly. In a way, this was yet another proof that the human senses are easily fooled.

Likewise, VR headsets are portable and compact; although they are still rather computationally intensive, they can run reasonably well on most consumer-grade computers. Without prior exposure to VR headsets, one might not expect these glorified goggles to already simulate reality so well — and yet they do. In fact, there exist many videos online of people who, while watching a VR roller-coaster ride, have toppled over due to the disorienting incongruence between their “real” and “virtual” spatial perception. Thus, it might not be so far-fetched to call VR an ELIZA for the modern age.