Play and Forces
'The Playable City concept has captured the imagination of cities across the globe, offering a new way of connecting people, and thinking about the city.' (Playable City n.d.) As we are currently stuck at home, I have been thinking more about the possibility of viewing the room we are in from a different perspective. I want to explore the possibility of a Playable Room, which leads to one question: is there any way to engage with the room, or the places we are in, besides our normal activities?
Technology has undoubtedly become our source of input from the outside world during this period. I was therefore wondering whether technology could be the bridge between the room and the user. I would like to work on an interactive artwork that lets the user play with the environment in some way. As I really like the idea of data-driven animation (inspired by Margaret Lapanski's animation in Matt's presentation), I would like the interaction between the user and the environment to become the input that changes the animation on the screen.
I came up with a few ideas, but they were all large-scale projects (e.g. AR that detects shapes in the real world and triggers animations when the user taps on those shapes). So I decided to make a prototype instead: a small-scale project with the main mechanism of the interactive animation working.
I decided to go for the idea of using colour information from the real world as the data input that changes the rotation of an object in the middle of the screen, producing a simple animation. This basic mechanism is the prototype I am going to work on. My original idea was to have multiple pole objects across the screen, each rotating when it detects colour information based on the user's tap count. A creature of the detected colour would then walk out from either side of the pole. It would be a soothing animation, with the user constantly tapping on the screen and exploring the different colours their room has to offer.
This was the first time in two years that I tried to code in Unity, and also my first time coding for AR. Although I ended up not using many AR features (e.g. plane detection, 3D environment, image tracking), it was still a learning process for me.
I want to work on a phone-based, tap-driven interactive animation that lets the user affect the rotation of the pole and, therefore, the direction of the sphere rolling on top of it. I also want the user to interact with the real world to gather the information that affects the rotation value, which is where the phone-tapping mechanism and AR come in. Each user will then get their own set of pole rotations and sphere-rolling animations against a unique background.
I started by connecting my Android phone to my laptop to get a real-time build and check whether the interactive artwork would work. However, a problem with my USB cable stopped me from doing that. I then tried to use my iPad instead, which also failed because I did not have an Apple provisioning profile for app distribution and testing. In the end, I went the long way round: properly building the game, sending it to my phone, downloading it, and installing it. I was then able to test the AR artwork on my phone.
The basic mechanism of the game is that, when the user taps on the screen, the artwork reads the colour information at the point of the tap. The colour is then converted to an HSV value, and the V (brightness) component determines the pole's rotation. The colour information is also passed on to the sphere object, changing the sphere's colour.
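The colour-to-rotation step above could be sketched in Unity C# roughly as follows. This is a minimal illustration, not the project's actual code: `Color.RGBToHSV` splits a colour into hue, saturation and value (brightness), and the brightness then drives the pole's rotation angle. The -45..45 degree range is an assumed, illustrative choice.

```csharp
using UnityEngine;

public static class ColourMapping
{
    // Map a tapped colour to a pole rotation angle via its HSV brightness.
    public static float RotationFromColour(Color tapped)
    {
        Color.RGBToHSV(tapped, out float h, out float s, out float v);
        // v is in 0..1; interpolate it linearly onto a tilt angle in degrees.
        return Mathf.Lerp(-45f, 45f, v);
    }
}
```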
To do so, I first had to build a functional rotation game that gathers colour information from a plain mouse click, before transferring the game to the phone. I then changed the code so that mouse clicking was substituted by tapping. This saved me a lot of time compared with sending every build to my phone just to check whether a function worked.
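The mouse-to-tap substitution could be handled with a small helper along these lines (a sketch, with assumed names): the same code path then runs in the Unity editor with a mouse and on the phone with touch, so the mechanism can be tested on the laptop before building to the device.

```csharp
using UnityEngine;

public static class TapInput
{
    // Returns true on the frame a tap or click begins, and outputs
    // its position in screen coordinates.
    public static bool TapBegan(out Vector2 screenPos)
    {
        // On the phone: check the first touch.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            screenPos = Input.GetTouch(0).position;
            return true;
        }
        // In the editor: fall back to the left mouse button.
        if (Input.GetMouseButtonDown(0))
        {
            screenPos = Input.mousePosition;
            return true;
        }
        screenPos = Vector2.zero;
        return false;
    }
}
```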
I encountered a few problems along the way of coding:
- An AR interactive artwork means that the point of tapping is essentially a point in the world, which consists of x, y and z values. I had to convert the point from 3D coordinates to a point in 2D; in Unity this is called a world-to-screen function.
- To accurately get the colour information at the tap point at the moment of tapping, I had to use a screenshot function to save the image of the screen when the user taps, as the visuals are constantly updated. I tried a few approaches and, in the end, found that Unity's screen capture function was the only one that worked.
- As this is still an AR interactive artwork, rotating my phone affected the pole's rotation as well. I wanted the pole to always stay in the middle of the screen, unaffected by rotation around the x- and y-axes. I therefore parented the pole to the AR camera and changed the code from the pole's global rotation to its local rotation. That fixed the problem.
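The screenshot sampling and the local-rotation fix might look roughly like this in Unity C#. It is a sketch under assumptions: a `pole` Transform parented to the AR camera, a tap position already in screen coordinates, and illustrative angle values. `ScreenCapture.CaptureScreenshotAsTexture` returns a readable copy of the frame, so the pixel under the tap can be sampled, and `localRotation` is relative to the parent camera.

```csharp
using System.Collections;
using UnityEngine;

public class TapSampler : MonoBehaviour
{
    public Transform pole; // parented to the AR camera in the hierarchy

    // Run as a coroutine: StartCoroutine(SampleAndRotate(tapPos));
    public IEnumerator SampleAndRotate(Vector2 tapScreenPos)
    {
        // Screenshots are only valid once rendering for the frame finishes.
        yield return new WaitForEndOfFrame();

        Texture2D shot = ScreenCapture.CaptureScreenshotAsTexture();
        Color tapped = shot.GetPixel((int)tapScreenPos.x, (int)tapScreenPos.y);
        Destroy(shot); // avoid leaking one texture per tap

        Color.RGBToHSV(tapped, out _, out _, out float v);

        // Local rotation is relative to the parent AR camera, so moving
        // the phone no longer changes the pole's apparent orientation.
        pole.localRotation = Quaternion.Euler(0f, 0f, Mathf.Lerp(-45f, 45f, v));
    }
}
```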
There were still a few minor problems in the code, but the basic prototype was complete.
There are three main things that happen in the back end of this AR interactive animation:
- Tap to detect the Colour.
- Change the Colour of the Sphere.
- Rotate the Pole and dictate the direction of the Sphere.
To play the interactive animation, the user just has to tap on the screen to explore their room.
I uploaded a version for user testing. Unfortunately, it works only on Android phones. So, HAPPY exploring your room!
As Alex Pang mentioned in his article about tinkering, 'tinkering is customizing software and stuff; making new combinations of things that work better than their parts; and discovering new capabilities in or uses for existing products.' (Pang 2009) I tried to tinker with the software Unity and with the connection between a room and technology. When we play AR games, the room or place the game is located in is a mere canvas for the player to build something on; it often does not provide any data input to the game. That is why I want to reconstruct the connection between the environment and AR technology. I want the environment to be more than just an environment. I want it to interact with the user's actions. I want that data input to be my take on Play and Forces.
That is why I came up with the idea of an AR interactive animation that gathers information from the environment through tapping and consequently changes what happens in the middle of the screen.
The prototype took me a day and a half to reach its current state. Coding an interactive artwork involves a lot of trial and error and debugging, which took a long time, especially as my phone could not perform a real-time build and run. Although I was using AR Foundation to build this project, I did not get to play with many AR features. I wonder if I could re-establish the connection between the environment and AR as data input or output while maintaining the environment's status as a canvas. I wish to explore more AR features in my next project.
Moreover, I think the Playable Room idea mentioned above could be explored further in my next project too. I want to explore the possibility of making a simple room fun, so that we do not feel so stressed while stuck at home. Reconstructing the experience we have with the room might help provide a mental escape and some relief. I also want to make it as accessible as possible, so that people can play it in their room without much setup.
Playable City (n.d.). Cities [Online]. Available from: https://www.playablecity.com/cities/ [Accessed 12 September 2020].
Pang, A. (2009). Tinkering and the Future [Online]. Available from: https://www.iftf.org/future-now/article-detail/tinkering-and-the-future/ [Accessed 10 August 2020].