Categories
Blog posts

Examples of Animation principles

Exaggeration, Slow in & Slow out, Arcs, Anticipation;

Here we can see, in two clips at the end of the video, an excerpt of Thumper from Bambi, where exaggeration and arcs are used in Thumper's ears and facial expressions to amplify his emotions as he talks to his mother.

All 12 principles of animation;

This is an excellent example of all 12 principles, with animations specifically crafted to demonstrate each of their uses.


My VRMV

My VRMV Plan;

Song choice

For my VRMV I had a few songs in mind. I listened to them all and this was my final list;

Sarah Cothran – As The World Caves In

John K – ilym

Kodaline – All I Want

Alec Benjamin – Beautiful Pain

Jordan Suaste – Body

Corpse – Agoraphobic

Tommyinnit – CG5

In the end, I decided to go with the song Sarah Cothran – As The World Caves In (cover)

I chose this over the others because its lyrics tell a strong narrative that will be moving as an experience and will also make it easier to plan the events of the MV.

Sarah Cothran – As The World Caves In – (cover)

The song tells the story of two lovers finding comfort and peace in one another, as the world ends.

I will most likely use a literal interpretation of the lyrics for the narrative. I will follow the scenes of the song, focusing on the couple in their home as they move through its rooms.

I think the final scene should show the world ending and a blinding light coming from outside as it all fades to white.

It would be powerful to include an interactive element at the line “I weep and say goodnight, love – While my organs pack it in”. The song would pause before the final chorus and a button would appear; the user would have to press it, launching the bombs mentioned in the song that end the world.

Locations in my scene

-The home of the couple in the song, and the many rooms inside.

-The street that the house is on, for panning shots / ending or intro.

Items / details in the scene (using lyrics for imagery)

-Bottles / glasses on a table

-Doomsday newspapers

-TV

-Fancy outfits for the characters

-Nail polish for painting nails

-Nuking animation and button to press

Why is VR useful in storytelling and in this type of MV?

When this sort of content is watched in VR, the scene is viewed in 180 or 360 degrees, which allows the experience to unfold around you as the song goes on.

A picture from when my girlfriend and I went to a VR world that immersed you in a music video.

This is one of the most immersive forms of communication: the story is not just presented as a window into the universe, as it would be on a screen; the user is actually in the scene, with the characters, in their environment.

In the case of this music video, it will create a greater emotional connection to the story and the characters, and greater impact from the song's dark message. I can use audio prompts, physical interactions and events in the scene around the viewer to further the immersion and add detail that would otherwise not be possible.

Filming the characters animations

To create the music video’s character animations, I will be using Mocap Fusion. This is free motion capture software, currently in beta testing, that allows you to use consumer VR hardware to create animation files, along with many other features.

It allows you to record many different VR events, but importantly, it can use VR headsets, controllers and trackers to record full humanoid pose data with proper inverse kinematics ready to be imported into projects and applied to models.

I will be using my own VR hardware to create the animations for the characters in the music video. The characters will feature full body presence and details such as finger tracking. This will bring lifelike realism to the models, and will reduce the production load, since I will not have to hand-animate the models.

Character models

The models for the characters in the music video are the avatars that my girlfriend and I use online, and I thought it would be fitting to use them in a love story song.

They are already rigged and ready for animation files from Mocap Fusion.

Scene arrangement

For the scene, I think I might have an apocalyptic home, where the environment has fallen into disuse. Given the theme of the song, the couple are at the end of the world, so the home will be a visualisation of their breakdown and grief.

I want the scene to have a dark and morbid tone, so I want to give the effect of lots of dust with low lighting.

Story plan

This is my plan for the scenes in the video. I broke the song down into narrative sections and planned what we would see in each one. I then used these plans to create the character animations for my character models.

When I had finished the mocap animation, I also noted, next to each set of lyrics, which animation file goes with each section of the song, the duration of each animation, and the duration of each stretch of music. This way, when I arrange the animations in my Unity project, I can reference each section of the song and know where my models need to be, what animation they are playing, and at what time.

Creating the animations

Some of my VR hardware

As the narrative is largely about the two main characters, it is important that the models representing them have animations to give them life. They are the main storytelling element, so I wanted to give them handmade animations to tell the narrative and convey their emotional significance and life.

I used software called Mocap Fusion, which allows you to use consumer VR hardware to record your own acting and create animation files. I used my own HTC Vive for head tracking, and Valve Index controllers for hand and arm position using inverse kinematics, and for finger tracking. I also used six Vive trackers for foot, knee, waist and chest position. I recorded the animations for each scene of the song as I had planned previously, and exported them ready to apply to my models in Unity.

The desktop view of Mocap Fusion

Here you can see the imported animations in my Unity project, and a freeze frame of the different events that I will use as applied to the models I am using in my video.

Creating the environment

I wanted to create an end-of-the-world atmosphere, but inside an everyday couple's home. I knew I wanted the home to be busy and full of belongings to show that it is lived in and make it feel more realistic. I knew my 3D modelling skills were not fast enough to create an entire realistic home interior, so I decided to use assets from online to build my scene. I started with a house that had a reasonable interior and then started to flesh it out with objects and belongings you might find in a home. I went for a more vintage feel with the aesthetic of the house.

Here are some images of the inside of the house after adding lots of items and furniture, such as beds, chairs, a kitchen, sofas, and more.

I also included a variety of pictures that my girlfriend and I have taken together using our avatars, the models I am using as the characters in the video. I imported them from our pictures and scaled some picture frame assets to make it seem as though the characters have hung up pictures of themselves in their home.

I continued to add assets, including ones that parallel the lyrics of the song and will be focussed on, such as some nail polish and newspapers. To finish the scene, I added lighting using a 3D model of a light and an area light to add natural lighting to the inside of the home. I also added clouds to the sky so that looking out the window makes the world outside seem more real and less like a void.

To finish the environment I just needed a moody tone, so I changed the directional light, serving as the sun, to a warmer tone and turned it away so the scene looks darker.

Sequencing the character animations

Most of the work on this project went into the character animation sequencing. I had many animations for two characters, and I needed them to play at specific times in the song and in specific locations in the house. I also needed the VR view to move the user to wherever the models went whenever the location changed.

To manage all of these changes and timings, I used scripting. Scripting all of the events took a long time, as there were many animations and locations, with six unique values for each one, and a lot of trial and error to get things to line up correctly. I got some coding help from Antoine, and it took me a while to write a script that would work, but eventually I had all of the animations, locations and camera arrangements properly lined up and in time.
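A sequencer of this kind might be sketched roughly as below. This is my own illustration of the approach, not the actual project script: every name in it, and the choice of exactly which values each event holds, are assumptions.

```csharp
using UnityEngine;

// One timed event from the story plan (a sketch; the real script's
// six values per event may differ from the ones chosen here).
[System.Serializable]
public class SequencedEvent
{
    public float startTime;           // seconds into the song
    public string animationState;     // Animator state to play
    public Vector3 characterPosition; // where the model should stand
    public Vector3 cameraPosition;    // where to snap the VR rig
}

public class MVSequencer : MonoBehaviour
{
    public AudioSource songSource;    // the song, playing from the start
    public Animator character;        // one of the two character models
    public Transform xrRig;           // the VR camera rig to teleport
    public SequencedEvent[] events;   // filled in from the story plan, in time order

    private int next = 0;

    void Update()
    {
        // Fire each event once, when the song reaches its start time.
        while (next < events.Length && songSource.time >= events[next].startTime)
        {
            var e = events[next];
            character.transform.position = e.characterPosition;
            character.Play(e.animationState);
            xrRig.position = e.cameraPosition; // snap cut, no interpolation
            next++;
        }
    }
}
```

Driving everything off `songSource.time` rather than real time keeps the animations and camera cuts locked to the music even if a frame stutters.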

Final touches

Also using scripting, I triggered a light change at the end of the song, running a pre-recorded animation of an area light getting brighter. This represents the bombs going off outside, and it fittingly shows the outside world slowly getting brighter and brighter as the song ends and the couple look on.
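The trigger for that light change could look something like this; again a hedged sketch rather than the project's actual script, with the state name, trigger time and field names all assumed.

```csharp
using UnityEngine;

// Fires the pre-recorded brightening animation near the end of the song.
public class EndingLight : MonoBehaviour
{
    public AudioSource songSource;
    public Animator lightAnimator;    // plays the area light's brightening animation
    public float triggerTime = 180f;  // seconds into the song (placeholder value)

    private bool fired = false;

    void Update()
    {
        if (!fired && songSource.time >= triggerTime)
        {
            lightAnimator.Play("Brighten"); // assumed Animator state name
            fired = true;
        }
    }
}
```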

I also had to add the song. I imported it, added it to an AudioSource on an empty game object, and selected Play On Awake, so the music plays as the events go on.
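The same setup can be done from a script instead of the inspector checkbox, which is handy if the song's start ever needs to be delayed or synchronised with something else (a sketch, with assumed names):

```csharp
using UnityEngine;

// Script equivalent of ticking Play On Awake on the AudioSource.
public class PlaySong : MonoBehaviour
{
    public AudioSource songSource; // the AudioSource holding the imported song

    void Start()
    {
        songSource.Play(); // start the music as soon as the scene begins
    }
}
```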

The last element was the VR camera. I wanted to use the XR plugin for headset compatibility and more open, user-friendly use, but I couldn’t get the XR rig to recognise my headset, and nothing would happen when I hit start. So, in an effort to have my MV actually running in VR, I used the Oculus OpenXR rig instead. Because of the size of the project and the complexity and concentration of assets, I decided to leave this project running on PC as opposed to building for Android and putting it on a Quest. This let me retain the complex scene I had created and still get high frame rates, as the processing power available when rendering on PC is far greater than on a Quest. I decided against animating my camera, as I was cautious of putting the user on rails and making the video less accessible to those who easily experience motion sickness, so I just used snap cuts between the different areas of the music video.

Things I didn’t quite get to do

I wanted to have some interactive elements in the MV, perhaps having the user launch the missiles that end the world at the end of the song, with the song pausing and a big red button for them to press.

There were also a few interesting issues with the character animations. Because of the way I used mocap, I had to record the animations for the two characters at different times, and in some scenes there is a little noticeable separation and misalignment between the two characters. In the future it would be interesting to see if I could use multiplayer mocap to record person-to-person interaction with another person at the same time.

I also wanted better jump cuts between the events in the MV, with the camera view briefly fading to black between jumps to make them feel less sudden. However, this was a little too complex and I was already low on time.

I wanted to have more animations and have the characters interact with the environment a little more, for example one chopping food in the kitchen, or another putting the hi-fi on when they go to dance together. However, due to time constraints and the complexity of recording all the mocap in time, I ended up using fewer animations overall.

I also wanted to model the street outside their home, so that you could see more than just an endless expanse outside the windows. However, as I was low on time and couldn’t find any good free models of residential streets that fit the aesthetic, I left the outside as a large dark plane, which does somewhat add to the creepy, dark tone.

Project overview

Overall, I am really proud of how the music video turned out. It took a lot of work and a few sleepless nights, but I really like the final outcome. I think I really hit my goals for the scope and concept of the project. I really enjoyed learning a little more about Unity, coding for Unity and using VR in Unity. It’s taught me a lot of new skills and definitely pushed me out of my comfort zone all for the better.