(To view, please adjust Vimeo viewing settings to 4K)
Play and Sound
I will tinker in a 3D program to explore audio-reactive motion, creating a form of visual music through motion graphics.
“Music is not limited to the world of sound. There exists a music of the visual world” — Oskar Fischinger, 1951
Throughout this week’s lecture, we explored the world of sound and the many ways it can play a part in design. Oskar Fischinger’s famous idea that music is not limited to sound, but alive in the visual world, stood out to me and got me thinking about motion graphics and abstract visuals: the history of motion graphics, past influences on my practice such as Saul Bass and Norman McLaren, and how the idea of abstract motion and “visual music” has developed over the years through different design trends and technologies. It is now present across a wide variety of platforms and disciplines, including animation, UI/UX design, and digital design for broadcasts, events, film, games and interactive works.
Keeping Oskar Fischinger in mind, I read an article that Matt shared with the group, ‘Composers as Gardeners’ by Brian Eno. Brian describes a gardener as ‘someone who plants seeds and waits to see exactly what will come up’, and suggests this can be a useful concept for rethinking our own positions as ‘creator’ and the need to feel, or be, in control of every aspect and element. He contrasts the gardener with the architect, who holds a clear, detailed concept and final result in their head, where everything is controlled from the outset.
In past explorations of motion graphics in both 2D and 3D, everything I have done has been carefully planned down to every angle, keyframe and final output. This week I discovered a desire to play and explore new methods of creating abstract motion as a visual representation of music, through solitary play and tinkering with some of the latest technology. I aimed for a more free-form approach: planting some seeds without full control over the final outcome, testing ideas and experimenting with motion rather than engineering a predetermined result.
My initial response to this week was a plan to advance my skills in Cinema 4D, pushing the latest updates and learning different methods of creating motion to music, which is something I hadn’t done before. Unfortunately I wasn’t able to access a trial version of the software (oops), but this forced me to think past my comfort zone. Given the growing demand for real-time rendering across events, VFX and games, I decided to have a crack at some motion graphics in Unreal Engine 4, playing with the Niagara particle system built into the engine. Playing with particles is something I haven’t had the chance to do in the past due to hardware limitations, so I decided to just tinker and play this week, exploring the combination of abstract visuals and music without any sort of plan.

There is still a small level of planning, which I like to think of as the initial ‘planting of the seed’, that needs to occur even when approaching the task as a gardener rather than an architect; however, the way the particles release each time (although beating to the music) is unique every single time. It took a quick tutorial to understand how to set the particles up to react to the music, but after that it was just playing around with particle settings and trialling different things to observe the outcomes. I explored different lighting options, although this wasn’t really working out as planned; with more time and setup it could be more interesting, with a proper environment, fog added, the particle system multiplied around the space, and different particle objects swapped in to see how busy it ended up visually.
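In Unreal itself this wiring happens inside Niagara’s audio modules rather than in code, but the underlying idea (sample the track’s loudness each frame and drive a particle parameter from it) can be sketched in plain Python. Everything here is my own illustration, not anything taken from the engine: the function names, the base/peak spawn rates, and the synthetic “beat-heavy” test signal are all invented for the example.

```python
import numpy as np

def amplitude_envelope(samples, sample_rate, fps=30):
    """Per-frame RMS loudness of an audio signal, normalised to 0..1."""
    hop = sample_rate // fps
    frames = [samples[i:i + hop] for i in range(0, len(samples) - hop, hop)]
    rms = np.array([np.sqrt(np.mean(f ** 2)) for f in frames])
    return rms / rms.max() if rms.max() > 0 else rms

def spawn_rate(envelope, base=50, peak=500):
    """Map the 0..1 envelope to a particles-per-frame spawn rate,
    so louder moments release more particles."""
    return base + (peak - base) * envelope

# A one-second 220 Hz tone pulsed at 4 Hz, standing in for a beat-heavy track.
sr = 44100
t = np.linspace(0, 1, sr, endpoint=False)
audio = np.sin(2 * np.pi * 220 * t) * (np.sin(2 * np.pi * 4 * t) ** 2)

env = amplitude_envelope(audio, sr)
rates = spawn_rate(env)
```

The same mapping could just as easily drive particle size or colour instead of spawn rate; the point is simply that one scalar sampled from the music per frame is enough to make the system feel like it is beating to the track.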
The piece of music I chose deliberately had a lot of beats, to get the full effect of the audio reaction; this could easily be toned down to suit different needs. The crazy, completely unexpected thing that came out of this was that the project was accidentally set up from a VR blueprint rather than a plain one. My initial plan was to create a rendered movie with different depths of field and camera angles, but when I went to export, it appeared to be VR. I had no idea how I did it at first, but on further review it turned out I had set up the wrong UE4 blueprint to start with!

This was a happy accident, though, as it gives a lot more depth to the concept of abstract visuals reflecting the music: the ‘player’ can walk around within the space of the particles and either stand inside it with the particles all beating around their head, or stand far away and play more of an observer role within the environment. Although it is a simple test, and something quite frequently achieved these days around live events and motion design, this was a new method and approach for me in the field of motion graphics and interactivity. Five years ago I never would have thought I would be using a game engine to create this sort of thing, let alone learning to program and setting up a VR experience, all of which I essentially discovered and learnt in a day’s session of playing and tinkering. It provides a really interesting foundation with the potential to be expanded upon, visually and interactively, quite greatly.
Real-time rendering is amazing to work in and speeds up the workflow dramatically, since you avoid waiting hours for a render that didn’t produce the outcome you were hoping for. There were so many potential approaches to sound this week, but I wanted to try something a little different from the ways I have already used sound over the past six weeks.
https://www.edge.org/conversation/composers-as-gardeners
By Amber Stacey
Published On: 21/04/2021