This past winter break, I visited Marpi’s New Nature exhibition at ARTECHOUSE DC, an art gallery that focuses on immersive interactive art pieces.
The part of this exhibit I liked most was a room full of screens, each containing a “creature” you could interact with by waving your hand over a motion sensor. The sensor tracked your hand, displayed on screen as a small hand symbol, and different movements drew different reactions from the creature. The creatures also incorporated AI, adapting their reactions to visitors’ interactions over time. Essentially, the way it was explained to us, the reactions we were getting from the creatures were not the same as the reactions the very first exhibition visitors received. This video I took during my visit shows just one of the many creatures you could interact with:
I really liked this interaction for many reasons. First, the music, lighting, and graphics all complemented each other nicely. The music was calming yet futuristic, and it inspired curiosity. Second, the interaction was easy to understand: your hand makes a digital hand appear on the screen, which gets an immediate reaction from the creature – it’s all very intuitive. Third, it was fun to play with the creatures in different ways, since each was designed differently – this one just happened to have many balls that bounced around in different ways. I also really liked the use of AI because it made the creatures more “real,” in a sense; they learned, just like living things do. There was one creature whose sensor didn’t seem to be working properly, so the digital hand wouldn’t move the way you wanted it to – though, looking back on it, that could have been on purpose.