Play and Body
For the theme 'Play and Body', I wanted to explore ways of bridging the gap between physical body movement and interactive animation: a piece of interactive artwork that the user can play with without touching the digital screen at all.
My inspiration comes from 'Super Nintendo World', an interactive theme-park project that Universal Studios Japan planned to open this year. It creates a connection between the digital and the physical space by using a watch to detect the player's location and a real-life brick button for the player to trigger game events.
When I researched the technology, I realised that I couldn't do face or finger detection in AR without Vuforia (an external plugin that is not free). I brainstormed for a little while, tried out other free plugins, and eventually settled on image tracking as a creative solution to the problem.
After reflecting on the workflow I adopted last week, I decided to start this week's tinkering exercise with the AR aspect of the interactive artwork. That way, I could confirm at an early stage that my plan was feasible; if not, I could always stop development and either start something fresh or brainstorm solutions.
Doing AR image tracking first taught me a few things about the project setup. For the reference image library to compile properly, the Unity project has to live on the desktop. The image file has to be a .jpg with no spaces in its name, and the image has to have exactly the same name as the prefab, so that the object is spawned when the image is detected. The matching is also case sensitive, so I had to be really careful when naming them. There is one problem I haven't figured out yet: some images, even at the same size, cause errors when the reference image library compiles. That's why I settled on an image that would go through rather than spending time creating something else.
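The name-matching behaviour described above can be sketched roughly as follows. This is a minimal sketch assuming AR Foundation's `ARTrackedImageManager`; the component name, the `prefabs` array, and the dictionary are my own illustrative additions, not the project's actual code:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Spawns the prefab whose name matches the detected reference image.
// The comparison is case sensitive, which is why the image file and
// the prefab must be named identically.
public class ImagePrefabSpawner : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager trackedImageManager; // assumed assigned in the Inspector
    [SerializeField] GameObject[] prefabs;                      // e.g. a prefab named "MarkerImage"

    readonly Dictionary<string, GameObject> spawned = new Dictionary<string, GameObject>();

    void OnEnable()  => trackedImageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => trackedImageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var trackedImage in args.added)
        {
            string imageName = trackedImage.referenceImage.name;
            foreach (var prefab in prefabs)
            {
                // Case-sensitive match: "marker" will NOT match "Marker".
                if (prefab.name == imageName && !spawned.ContainsKey(imageName))
                {
                    spawned[imageName] = Instantiate(prefab, trackedImage.transform);
                }
            }
        }
    }
}
```

Parenting the instance to `trackedImage.transform` keeps the spawned creature following the printed image as it moves.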
At first, I wanted to do multi-image tracking for the project to add to the interaction level of the artwork. However, after realising how difficult setting up even one image was, and how laggy Unity's image tracking can be, I chose to focus on developing the main feature: poking the creature on the screen to interact with it.
For the poking to work, I played with the logic behind the code for a bit. At first, I set the poking boolean to become true whenever both the 'isNear' and 'isFar' booleans were true. However, this registered a poke even when the creature was far from the finger, because at the end of a poke the finger is so close that 'isNear' would turn true again within a second. Therefore, I changed the logic so that 'isFar' has to be true before 'isNear' can become true. This way, a poke is only registered when the finger actually moves in from far away to touch the creature. Also, to stop the animation from looping forever, I used an 'Invoke' function to turn the poking boolean back to false after 3 seconds of animation.
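The far-then-near gating described above can be sketched as a small Unity component. This is only a sketch of the logic: the field names, the distance thresholds, and the `Transform` references are assumptions of mine; the 3-second reset uses Unity's `Invoke`, as in the text:

```csharp
using UnityEngine;

// Registers a poke only when the finger has first been far from the
// creature and then comes near. This prevents a false poke at the end
// of an animation, when the finger is still close to the creature.
public class PokeDetector : MonoBehaviour
{
    [SerializeField] Transform creature;    // assumed assigned in the Inspector
    [SerializeField] Transform fingerTip;   // the tracked-image marker taped to the finger
    [SerializeField] float nearDistance = 0.05f; // thresholds in metres (assumed values)
    [SerializeField] float farDistance  = 0.20f;

    bool isFar;
    bool poking;

    void Update()
    {
        float distance = Vector3.Distance(creature.position, fingerTip.position);

        if (distance > farDistance)
            isFar = true; // arm the poke: the finger has moved away first

        // 'isNear' only counts if 'isFar' was already true.
        if (isFar && distance < nearDistance && !poking)
        {
            poking = true;
            isFar = false; // the finger must move far away again before the next poke
            // Trigger the creature's poke animation here.
            Invoke(nameof(StopPoking), 3f); // end the poking state after 3 seconds
        }
    }

    void StopPoking() => poking = false;
}
```

Resetting `isFar` inside the poke branch is what enforces the ordering: a new poke cannot start until the finger has left the far zone again.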
I am happy with how it eventually turned out. The image tracking works even though I stuck the picture to my finger with tape, and that allows my poking gesture to work properly. This is the best I could do without extra equipment or software. To be honest, this project surprised me with another way to interact with it: the user can treat the computer screen as a canvas and click and drag the picture around to interact with the creature on the phone screen. Done this way, a connection between the two digital spaces is created.
I have recorded a demo as usual. The response is uploaded as a zip file containing the game .apk and the image-tracking print. Please don't use the print for anything else; it is more or less the logo for my website.
To explore this week's theme of 'Play and Body', I also wanted to explore play as an action that brings pleasure and joy. I used the pleasure framework from 'A Study in Play Pleasure and Interaction Design' (Costello & Edmonds 2007) for the initial concept planning and the mid-stage artist's reflection. If we were working in the studio, I would have liked to run a formal user evaluation for this project too. For reference, the pleasure framework synthesises ideas from various theorists and relates each of them to thirteen categories of pleasure.
Among all the categories of pleasure, I decided on exploration, discovery, sensation, and simulation as my key pleasures. I wanted the user to have fun exploring and triggering the animation with their finger movement, simulating the action of petting or poking an animal.
'A participant who experiences displeasures is liable to become distracted and to stop exploring an artwork' (Costello & Edmonds 2007). This is something I have noticed in designing interactive experiences too. I wrote down the points that caused me discomfort when playing with the interactive artworks I created over the last two weeks, and I noticed that the response is one of the things I'm not quite happy about. I had constantly been using spheres and cubes in Unity to build the prototypes. They work, but they don't provide much pleasure when I play with them. That's why I decided to make a proper creature animation for this interactive artwork; it can be simple, as long as it's no longer a primitive Unity object. I also noticed that the instant response of the previous interactive artworks is something that brings me pleasure. Hence, I tried my best to trigger the creature's animation as soon as the poke happens, so the user won't get bored while repeatedly poking.
When considering the shape of the creature, I decided on an oval base. It is a shape that reads as soft, smooth, and non-aggressive in visual language, which I thought would also lower the user's level of discomfort. The user-driven shape, however, is a cube; it is deliberately designed with edges to excite the user.
I wish I had more time to explore the project, as I would add more to the interaction level for the user. I would like to add multiple different poking reactions for the creature, and to create a semi-gameplay in which the user needs to perform the right 'poking ritual' to trigger the correct animation. That would build on top of exploration and discovery and, hopefully, bring more pleasure to the user. Another interesting concept would be changing the interactive animation based on the size of the spawned prefab, which would vary with the distance between the finger and the camera; this would add to the sensation aspect of the interactive artwork.
Overall, I am happy with how this project turned out. On the technical side, although I was not able to do proper finger detection and tracking without equipment, I still figured out a substitute for finger tracking. As for the creation of interactive artwork, I am glad to have been introduced to the pleasure framework. It is something I can always use to examine my workflow and my projects, making sure they create a pleasurable experience for the user.
Costello, B. & Edmonds, E. (2007). A Study in Play Pleasure and Interaction Design. In Proceedings of the 2007 Conference on Designing Pleasurable Products and Interfaces (DPPI '07).