Today I worked on an intro suggestion for the mobile version: a text prompt that hints at what action the user must take to adjust the filter. While it works exactly the way conventional filters do, I feel I'm trying to go in completely the opposite direction. After discussing it with Evan, it feels like I will be removing it.
I then attempted to make the multiple-user script work, but the face index seems to break and the file crashes. Maybe setting the file up for a bigger screen space would fix this. It also pushes me to start taking this project into the next phase of consideration: the exhibition. The endgame for this project is to put it in a public space for people to interact with. I have a fair number of working prototypes ready and they are mostly functional, so this is a good point to look at the experience side of the project.
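For my own reference, here is a rough sketch (not my actual project code) of how per-face events can be wired up in Lens Studio's scripting API, which is roughly the area where my multi-user version breaks. The `MAX_FACES` value and the callback bodies are placeholders, and the `declare` lines only exist so the snippet stands on its own outside Lens Studio.

```typescript
// Minimal sketch: reacting to each tracked face separately in a Lens Studio script.
// `script` and `print` are Lens Studio globals; declared here just for type-checking.
declare const script: any;
declare function print(msg: string): void;

const MAX_FACES = 2; // placeholder cap for a two-user prototype

for (let i = 0; i < MAX_FACES; i++) {
    // FaceFoundEvent / FaceLostEvent fire per tracked face when faceIndex is set.
    const found = script.createEvent("FaceFoundEvent");
    found.faceIndex = i;
    found.bind(() => {
        print("Face " + i + " found - enable that user's filter layer here");
    });

    const lost = script.createEvent("FaceLostEvent");
    lost.faceIndex = i;
    lost.bind(() => {
        print("Face " + i + " lost - reset or hide that user's layer");
    });
}
```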
My next step is to export and certify the filters with LensStudio, and to create an offline backup as well. I will also start working on expanded screen sizes to compensate for the lack of space for multiple users.
I also tried my hand at a bit of coding and got some basic functions working, but that's the extent of it.