How I Created a Music Video in a Game Engine (Unreal Engine Cinematic)
This article is a follow-up to How To Write and Produce a Song From Home (During Lockdown), which details the songwriting process and production of a song called ‘We Feel It Too’ by YagmanX (me), written during the pandemic while exploring my initial feelings surrounding it. Please do read that first if you have any interest in music creation, because in this one we’re going to get techy!
When I have produced a song that I feel is worthy of releasing as a single I often like to create a music video, like the one above, to release it with. “But, why do you need to release a song with a music video?” Great question. Simple answer: You don’t. However, I firmly believe in delivering the best narrative experience possible through my creations. If I’m able to deliver both a visual and auditory experience then I will, because it enhances my overall vision and is entirely possible. Let me share with you how…
Filming Within a Game Engine
I choose to create videos within a game engine, in this case Unreal Engine 4, because it gives me more freedom and opportunity to experiment with new camera angles, scenes and effects than I would be able to if I were to film this physically. I don’t have the money for new cameras, lenses and lighting equipment, and most certainly can’t take enough time out of my day job to scope out filming locations. But with a game engine you can create what you see within your imagination and apply it digitally without as many restrictions.
You want to change the time of day? No problem, let me change the sky sphere. You want dust in your environment? Set up some particle systems and Exponential Height Fog and you’ll have a nice textured scene. You want a different type of lens? Change the Filmback dropdown in the Cine Camera Actor’s Details panel. Want to change the colour grading of the shot live? Try out different post processes and see what looks best.
Within this article I’ll focus on the steps that I took to create my music video for the song ‘We Feel It Too’, including:
- Creative Direction
- Preparing Art for a Digital Environment
- Developing the Video within Unreal Engine 4
- Final Touches in Premiere Pro
Step 1: Creative Direction
Music Video Narrative
A video can build upon the narrative of the song, providing visuals that can enhance the meaning and feeling of it. When creating a music video I always ask myself, how can my song benefit from this video and how best can I portray its narrative?
This song’s theme was isolation so I wanted to keep it very stripped back to help give the audience the feeling that they were listening to something very intimate. I imagined it as though the listener would be right there, in my room with me.
During the production process of the song I had already envisioned the theme and narrative for it, so I already knew the location that it needed to be shot in: a bedroom. The music video needed to fulfil the feeling of being within a bedroom while listening to the song, mimicking what it’s been like for many of us during lockdown. The song is calming and has chill vibes, so I knew I wanted the art style to reflect that, and the first thing that came to mind was an isometric room with a warm pastel colour palette.
Know Your Limitations
In my opinion, having and knowing your limitations really helps with creativity. We cannot be good at everything and that’s perfectly okay. Once you make peace with this you realise that the sky is NOT the limit, and therefore you won’t get as overwhelmed with creative possibilities. Instead you can focus on what is realistic for you, allowing you to figure out how to make the most of the skills that you have, instead of chasing unattainable goals that will only lead to frustration and disappointment. My limitations and workarounds when making this music video were:
- Limitation: I could not 3D model
Workaround: I worked with Concept Artist Braden May and used some assets from the UE marketplace.
- Limitation: The Concept Artist had limited time to work on the project and 3D modelling was not their primary strength.
Workaround: We worked together to limit the complexity and amount of 3D models needed for the project to make sure the work was manageable and attainable for them.
- Limitation: I have no 3D animation ability
Workaround: I had to find an alternative option for adding movement into the 3D scene through my other skills, namely tech design (more on that later).
- Limitation: I have a tendency to strive for ‘perfection’
Workaround: I gave myself a soft deadline of 2 months to get this music video completed, so that I had a goal to reach to hold me accountable.
When I knew my limitations and the consequent workarounds for them, I could put together a plan of what was possible for the music video. I decided to have the video focus on a simple bedroom, with the camera slowly panning around the room, but to also include elements of my real bedroom within it to make it feel more personal.
With a more solid plan in mind I could then make sense of what I needed for the music video:
- A low poly 3D square bedroom and individual assets from my Concept Artist which have one material per model to keep it simple for them.
- Walls and individual items must be separate models so that I could hide / show them independently during the video to allow free movement of the camera without worry of it clipping through objects.
- Low poly 3D outdoor assets from the Unreal Engine marketplace that I could use to create a city view outside of the bedroom window.
- Sprite animations for including movement into the video.
- Elements of my real bedroom as textures within the 3D room (e.g. journal entries, pictures, etc)
Document a Clear Vision (Especially if working with others)
I know, I know, documentation is the last thing you want to do at this stage when you just want to get the ball rolling but, from experience, I believe it’s vital to making a cohesive creation. A solid piece of documentation will help keep you accountable for sticking to the original vision and help you avoid feature creep. This is beneficial for both the cohesion of your project and your own mental wellbeing. If you are working with others it also helps to clarify your vision so that it can be shared amongst your team to avoid any confusion or time wastage. My documentation detailed:
- The music video narrative and intent
- The colour palette that I would reference when making materials
- The reference images for the 3D models that had inspired my vision (mainly cute isometric bedroom concept art and lofi art work)
- A detailed list of 3D assets that I needed from the Concept Artist
- Links for tutorials that would be useful for me during the development process (e.g. lighting tutorials, material tutorials, blueprint tutorials)
- A to-do list for the development process that I would tick off as I worked on it
Step 2: Preparing Art for a Digital Environment
The 3D assets I received had a very low poly count and most had no texture map. This made the artist’s job much easier and faster; however, it did mean that I could not bake lights onto these models, because their UV information wasn’t set up for lightmaps and baking would have led to odd shadows. This was fine for what I wanted though, because I planned to work with dynamic lighting anyway: with such a small, low poly scene I didn’t have to worry about performance.
Dynamic Lighting is when everything is lit in realtime whereas Baked Lighting is when your lights are turned into textures, hence why the latter is better for performance. More on lighting during my explanation of the development phase…
Once I put these into my level within Unreal Engine I could then reposition each individual asset to where I liked within the room, as well as create a new material for each one so that I could change the colours to match the colour palette from my documentation.
Including the Musician
A common trope of a music video is to include the musician themselves, either playing the music or being a part of the song’s narrative. I knew I wanted to insert myself within the video to add more personality to it and make it feel more personal; however, I was unsure of the method to do so. I do not have the technology or skill to model myself or 3D animate (yet), so I ended up recycling a technique that I had used in a short film I created within UE4 in the past: sprite animation.
My idea was to include sprite animations within the 3D bedroom to make it appear as if ghostly versions of myself were lingering within the room, acting out mundane movements such as resting on my bed, drinking coffee, sitting at my desk, etc. This technique helped to strengthen the overall narrative, as it hints at how I spent a lot of time within my room and shows that the music video spans a large length of time.
To create this effect I set up the cameras in the 3D space where I wanted each animation to be so that I had a reference for the camera’s angle and zoom and then recreated these as best I could in real life. By taking videos of myself doing these actions in a real life version of the mocked up digital scene I could then use these as references to trace over in Photoshop.
By importing the video to Photoshop as a video layer I could scrub through the Timeline to find moments of the video that I would like to trace over, then create a New Layer to trace onto. By repeating this process of tracing onto new layers at different points of the video I eventually saw my animation come to life in the Timeline! This process, called Rotoscope Animation, takes quite some time and patience but the outcome is worth it. Once I was happy with the animation I selected the layers I had traced onto and exported them as a sprite sheet, using a Photoshop Script (developed by JesseFreeman), to set up as a Sprite Animation within my music video.
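The sprite sheet itself is just the traced frames packed into a grid, with the engine flipping through sub-rectangles of that one image. As a rough illustration of the layout maths such a script automates (this is a sketch of mine, not JesseFreeman’s script), here’s how the sheet size and each frame’s pixel offset fall out of the frame count and frame size:

```python
import math

def sprite_sheet_layout(frame_count, frame_w, frame_h):
    """Pack frames into a near-square grid: returns the sheet size in
    pixels plus the (x, y) offset of each frame, left-to-right then
    top-to-bottom (the order the traced layers are exported in)."""
    cols = math.ceil(math.sqrt(frame_count))
    rows = math.ceil(frame_count / cols)
    offsets = [((i % cols) * frame_w, (i // cols) * frame_h)
               for i in range(frame_count)]
    return (cols * frame_w, rows * frame_h), offsets

# e.g. 12 traced frames, each 256x256 pixels, become a 4x3 sheet
sheet_size, offsets = sprite_sheet_layout(12, 256, 256)
```

Whatever tool you use, keeping every frame the same size is what lets the flipbook step through the sheet cleanly.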
To see more in detail about how to set this up within UE4 please check out ‘7. Using Sprite Animation’ within the article: What I Learned from Creating an Unreal Engine Cinematic.
Making it Personal
A music video, or any creative piece in my opinion, is stale without injecting a bit of personality into it that others can relate to.
For this project I was heavily influenced by the game Life is Strange, its ‘moments of calm’ in particular. These were moments in the game where the player could sit and reflect on the current state of the character while the camera panned around their surroundings, highlighting items of interest which might inform the player more about the character, or the situation they had found themselves in, throughout the game’s narrative.
I attempted to recreate this by adding aspects of my own bedroom to this 3D space. By inserting polaroids and photographs of me and my friends, as well as real journal entries from my life, I was able to make the space feel more personal and lived in, which would hopefully create some intrigue for the viewer. To do this I took well-lit photographs of these elements of my room from a top-down view and removed the background in Photoshop, making them PNGs that could then be used as a texture on some 3D models, such as an open book, or inserted directly onto a plane 3D object (a flat square) in Unreal Engine to be used as a hanging polaroid picture.
Make Use Of The Unreal Marketplace
As I explained earlier, we can’t be good at everything and that’s okay. Especially when you’re working as a solo creator, or even within a small team, there’s so much work that goes into creating a digital video, from modelling and texturing to post process, animation, setting up cameras, and so much more. With this in mind I would implore you to look through the Unreal Marketplace and see if there’s anything that could make your workflow easier. You never know when something you thought might take you a long time to create or learn from scratch is already readily available to you, saving you time to focus on the parts of creation that lie within your expertise.
For this project I used Stylized Nature Pack, Modular Building Set and Interactive Stylized Lowpoly Grass to create the outside environment of my scene. Because the focus would be on the inside of the bedroom, with the outside distorted by frosted glass, I knew that I could get away with using models from different sets without worrying about them seeming out of place, as the viewers wouldn’t see them in detail. It was a quick and easy way to create the outdoors. I also used Good Sky to create a more stylised sky, as well as Chameleon Post Process to handle the colour grading and visual effects of the music video. This post processing package offers many visual effects, allowing me to manipulate the visual style of the environment throughout the video sequence. I used two separate post process effects: one for the whole environment, and one which only affected the window of the bedroom, by placing its collision box around only that part. The latter created the rain effect at the beginning of the music video while the camera was inside this collision box; I could then turn this effect off during the video sequence so that when the video ended back at the window it was no longer raining.
Step 3: Developing the Video in Unreal Engine 4
The Game Dev Part
This is where it starts to get more techy. Unreal Engine 4 offers a whole treasure chest of goodies to make your cinematics, but it does have quite a learning curve before you can make the best of it. For a more detailed explanation of how to create cinematics within Unreal Engine, as well as an overview of the benefits of doing so, please check out my previous article: What I Learned from Creating an Unreal Engine Cinematic.
A quick recap on some terminology I’ll be using:
- Actor: An actor is any object within the UE4 editor.
- Material: The asset that defines an Actor’s surface appearance (its colour and texture).
- Mesh: The 3D modelled object.
- Blueprint: A visual scripted asset that allows for bespoke gameplay functionality.
- Variable: Properties that hold a value or reference an Object or Actor in the world and are used within Blueprints.
- Sequencer: The cinematic editor within UE4 that gives the ability to create cinematics with a multi-track editor, allowing the user to use keyframes to determine the content for the scene, controlling things like transformations (moving things around), media, audio and more.
With every cinematic created I learn something new, and this project was no different, so I’ll share what I learnt throughout this production:
This video was set in a very small space so lighting the environment correctly was important. To do this I used a few new features within UE4 that would elevate the lighting to allow me to have more control over it:
1. Real-Time Ray Tracing: Now that I have a beasty PC I could enable ray tracing without worrying about whether my graphics card could handle it. If you are looking for an easy way to get realistic lighting results with soft shadows then you’ll want to turn this setting on within the Rendering tab in the editor’s Project Settings. However, I would recommend saving or duplicating your project first if you’re unsure whether your PC can handle it.
“Ray tracing effects look more natural, producing soft shadowing for lights, accurate ambient occlusion (AO), interactive global illumination, reflections and more.” — Unreal Engine Documentation
2. Light IES Textures: My scene had lots of small light sources, including fairy lights and a candle, so I used IES Textures to give a more realistic lighting effect to display upon walls or other nearby objects. IES stands for ‘Illuminating Engineering Society’, whose profiles are a lighting industry standard method of describing the brightness and falloff of light as it exits a particular real world light fixture. I used IES Light Profile Pack, which granted me an array of textures to choose from that would match the shape of the object that my light source was within, so I could mimic the light falloff of a real object. Because the light profile is just a texture it renders very fast and is a more performant option for creating realistic lighting.
3. Light Functions: The IES Light Profile Pack also came with Light Functions, which are materials that can be applied to a light’s intensity to simulate situations like windows, search lights or laser grids. I used these to create moving sections of light, adding movement to my mostly static scene. For example, I gave the laptop a digital glow, and gave the bedroom wall by my window light streaks that moved from bottom to top, as if a car was driving by.
However, I did not like how uniform the timing of the light movement was and wanted to create the illusion of cars driving by at random intervals, so I manipulated the material to pan the texture based on an artificial time which would be fed into the material via a Blueprint at runtime.
In the blueprint’s Construction Script, which is used to set up an actor on creation of the blueprint object within the editor, I made a Dynamic Instance of the Light Function so that I could store it as a variable to use at runtime:
I used this reference within the Event Graph (which contains events and function calls). To create the Artificial Time value I used a Timeline which would output an ever-changing Speed float value (a float is a decimal formatted number). The value of Speed was controlled by the Timeline’s curve, which would play on an Autoloop. Speed was then multiplied by Delta Time from the On Tick function (the time difference between the previous frame that was drawn and the current frame). The output of this multiplication would update the current value of Artificial Time for that frame.
I then fed this Artificial Time value into the Light Function material so that it moved less uniformly. I’ll leave my Timeline and Blueprint setup below if you’d like to copy it for yourself…
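If Blueprints aren’t your thing, the logic of that graph is easy to mirror in code. Here is a minimal Python sketch of the same idea (the names, and the example curve, are mine, not Unreal’s): sample a looping speed curve every tick, multiply by delta time, and accumulate the result into Artificial Time.

```python
def make_artificial_clock(speed_curve, curve_length):
    """Mirror of the Blueprint graph: every tick, sample a looping
    speed curve and accumulate speed * delta_time into Artificial Time."""
    state = {"curve_time": 0.0, "artificial_time": 0.0}

    def tick(delta_time):
        # Autoloop: wrap the Timeline playhead around the curve length
        state["curve_time"] = (state["curve_time"] + delta_time) % curve_length
        speed = speed_curve(state["curve_time"])
        # Artificial Time advances non-uniformly, so a texture panned by
        # it (like the passing-car light streaks) moves in irregular bursts
        state["artificial_time"] += speed * delta_time
        return state["artificial_time"]

    return tick

# Example curve: light rushes past for the first second, then pauses
curve = lambda t: 3.0 if t < 1.0 else 0.0
tick = make_artificial_clock(curve, curve_length=2.0)
```

Because the texture is panned by Artificial Time rather than real time, stretches where the curve outputs zero leave the light frozen, and bursts of high speed whip it across the wall.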
4. Finally, I also learnt how to manually control the falloff of my individual lights so that I could control how much light came from which source. This added to the realism within the bedroom, especially because it was such a small scene lit up with multiple sources of weak light. By default, Unreal Engine’s lights are set up within their Details panel to Use Inverse Squared Falloff. By unticking this option you can use Unitless intensity units and then manipulate the Light Falloff Exponent to control exactly where the light should fall off in distance around the light source.
“Inverse Squared Falloff is physically based distance falloff, where the Attenuation Radius only clamps the light’s contribution; it is supported by all three light unit types. When disabled, only Unitless is available. Lights with Inverse Squared Falloff disabled are useful when placing low-intensity fill lights where you don’t want a bright spot near the light.” — Unreal Engine Documentation
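To see why that option matters for small fill lights, it helps to compare the two behaviours numerically. This is only a sketch: the exponent formula below is a simple approximation in the spirit of the Light Falloff Exponent, not the engine’s exact curve.

```python
def inverse_square(intensity, distance):
    """Physically based: brightness falls off with the square of distance,
    so there is always a bright spot very close to the source."""
    return intensity / max(distance, 1e-4) ** 2

def exponent_falloff(intensity, distance, radius, exponent):
    """Non-physical falloff in the spirit of the Light Falloff Exponent:
    the exponent shapes how quickly the light dies out inside its radius.
    (An approximation for illustration, not the engine's exact curve.)"""
    x = max(0.0, 1.0 - distance / radius)
    return intensity * x ** exponent

# A weak fill light: a high exponent keeps the glow close to the source
near = exponent_falloff(10.0, 0.5, radius=4.0, exponent=8)
far = exponent_falloff(10.0, 3.0, radius=4.0, exponent=8)
```

With inverse square the brightness explodes as distance approaches zero, which is exactly the hotspot you avoid by disabling it for fill lights in a tiny room.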
Widgets and Animations
Widgets are used to create 3D UI elements to use within your environment. During this production I created a YouTube page as a widget, which was output from a laptop, and a text message display, which was output through a phone.
The YouTube page was relatively simple, incorporating different sections of a real webpage such as the tab and side bars, with a video of the acoustic version of the song (read on for details on how I imported the video). The side bar incorporated images from my Discord server and names of my Patrons, without whose support I would be unable to create such things. I then created a Blueprint which stored the laptop Static Mesh as well as the Widget Component layered upon the screen.
The phone widget was a little more complicated, as I had to create a Blueprint that would update the most recent text message with a String (a series of characters) that I sent from the Sequencer. When this happened I also set up Widget Animations to create a ‘pop up’ effect to mimic that of a text message. I did the same for the three dots which represent a person typing a message.
I created these Animations using the Widget Blueprint’s Timeline by pressing +Animation within the Animation frame and then keyframing which elements I wanted to change and where along the allocated animation timeframe. These animations are stored as Variables within the Widget Blueprint’s Graph, so I could set up functions to trigger the correct animation at the right time, called via the Sequencer using Sequencer Events (more on that later).
Using a Video Image Sequence for the In-Engine Video
I wanted to have a video being played as a part of my laptop to represent my original upload of the song, We Feel It Too. I recorded it acoustically when I first wrote it and uploaded it to YouTube so I thought it would be a nice personal touch and an easter egg for those who have listened to both versions.
In the past I have struggled with rendering videos played through Unreal Engine as a Video Material, because the video playback frame rate does not stay consistent with the frame rate of the overall video render. This makes it impossible to know which video frame will appear when throughout the overall render, and displays the video at a much slower speed than it should. This time I decided to render a Video Image Sequence instead, controlled via the Sequencer, which helped this problem immensely! I can’t imagine it would be very performant if it were run as a game, but it works a treat for cinematics!
To do this I created a .jpeg sequence from my video via VLC Media Player; click here for details on how to do this. These images will be stored within a new folder. Within Unreal Engine I then created an Img Media Source and made this folder of images the Sequence Path, overriding the frame rate of its output to be 24fps to match my cinematic render.
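Before importing, it’s worth sanity-checking that the folder holds the number of stills you’d expect for the clip’s length at your target frame rate. A small sketch (the `frame_` naming below is just an example convention of mine; the Img Media Source simply reads the folder’s images in order):

```python
def image_sequence_plan(duration_s, fps=24, prefix="frame_"):
    """How many stills a clip of the given length needs at the target
    frame rate, plus zero-padded example filenames for the folder."""
    total = round(duration_s * fps)
    pad = max(4, len(str(total)))
    names = [f"{prefix}{i:0{pad}d}.jpg" for i in range(total)]
    return total, names

# e.g. a 10 second clip at the cinematic's 24 fps needs 240 stills
total, names = image_sequence_plan(10, fps=24)
```

If the count in the folder is noticeably off, the dumped sequence was probably extracted at the video’s native frame rate rather than the one your render expects.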
This Img Media Source could then be used as a Media track within the Sequencer so I could control when it was playing and what frames would play at which point.
I set the track to output within the Laptop Widget, setting the Image to be a Video Material that got its reference from a Media Texture, which output whatever was set to play within a Media Player; it was here that I set the Img Media Source to play. Yes, that was confusing; setting up videos in Unreal always is. Just remember:
- The Media Player dictates what should play (which in my case was the Img Media Source).
- The Media Player is referenced by the Media Texture.
- The Media Texture is referenced by the Media Material.
- The Media Material is referenced by whatever object is playing it within the scene.
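To make that chain of references more concrete, here’s a toy Python sketch. The class names mirror Unreal’s asset types but these are plain Python objects, purely for illustration of which asset points at which:

```python
class MediaPlayer:
    """Dictates what should play (here, the Img Media Source)."""
    def __init__(self, source):
        self.source = source

class MediaTexture:
    """References the Media Player and exposes its current frame."""
    def __init__(self, player):
        self.player = player

class MediaMaterial:
    """References the Media Texture; the in-scene object (the laptop
    widget) then references this material."""
    def __init__(self, texture):
        self.texture = texture

# The wiring runs in one direction only, source-first:
player = MediaPlayer("ImgMediaSource (folder of .jpeg frames)")
material = MediaMaterial(MediaTexture(player))
```

The useful takeaway is that each link only knows about the one below it, so if the laptop screen shows nothing, you debug the chain from the Media Player upwards.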
Now I had full control of which parts of the image sequence would be played within the laptop widget, and when, via the Sequencer, which made this much easier to control than my previous attempts to use videos in Unreal Engine.
Sequencer Events
Sequencer Events allow you to put triggers within the Sequencer to fire off scripted functionality within other Blueprints, for example playing the widget animations that I spoke about previously. They aren’t needed if you only want to change things which can be done within the editor, such as movement or hiding / showing of actors, but they really help when you want more bespoke events such as playing animations, spawning an actor, or affecting actors in some way.
How to set these up has been covered extensively in my original post about creating a cinematic in UE4, so I won’t repeat myself here; however, there is one important part I need to explain. To test in-game functionality set off by Sequencer Events you’ll need to review the sequence while the game is running. To do this I go into the Level Blueprint and set up a reference to my Sequencer to then play on BeginPlay.
However, the game will run when you render your Sequence, so you DO NOT need to have this hooked up when you are ready to make your final render. Trust me, this will save you a world of time and frustration due to crashes, as technically you’d be forcing the render to happen twice simultaneously.
Rendering the Final Video
Upon rendering the final sequence I had to take many different renders, because every outcome will be different, especially when using animations or particle effects. To speed up the export I left the Audio Output Format at its default ‘No Audio’. This makes sense to me, as the audio would be exported as a separate file to the video anyway, so either way you’d have to merge these together afterwards in different software.
To export your sequence you can do test renders first by ticking ‘Use Compression’ to speed up the process. Adjust the compression quality to your needs, with 1 being the most compressed and 100 being the best quality. You can also account for a warm up frame count, and a delay before that warm up, to start running particle effects so that your scene looks its best by the time the sequence starts to render.
Step 4: Final Touches in Premiere Pro
Using Premiere Pro, or another video editing software, will nicely wrap up the video and include the audio track. This allowed me to choose the best .avi files (the output format of the UE4 renders) for the song, as well as clean up any transitions so that the video cut nicely to the beat of the music.
When creating the Premiere Pro project I had to ensure that it was set up to match the 24 fps output of my .avi files, which you can do either in the project settings or by right clicking on an already imported video file, choosing Modify > Interpret Footage and changing ‘Assume this Frame Rate’ to 24fps.
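The reason the rates must match is simple arithmetic: the renders contain a fixed number of frames, and the rate the editor assumes decides how long those frames take to play, so a mismatch drifts the picture against the audio track. A quick sketch:

```python
def playback_duration(frame_count, assumed_fps):
    """A clip's frames are fixed; the frame rate the editor assumes
    decides how many seconds those frames take to play back."""
    return frame_count / assumed_fps

# 30 seconds rendered at 24 fps is 720 frames; misread as 30 fps,
# the same frames finish 6 seconds early and fall out of sync
correct = playback_duration(720, 24)
misread = playback_duration(720, 30)
```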
Then export and BOOM! MUSIC VIDEO READY FOR RELEASE! 🎉
There you have it! A music video created entirely from home within a Game Engine. I know I have bombarded you with a lot of information so if you’ve gotten this far then I commend you for sticking with me. Ultimately, the more you use any software or game engine, the more you will learn how best to use it and what works for you. All I can do is try my best to share with you my own findings, which are constantly expanding every time I create a new cinematic within Unreal Engine.
I hope this was somewhat insightful and hopefully educational for you and I wish you the very best upon creating your own music videos ❤
If you’d like to check out more of my work or what I’m up to then you can follow me on YouTube, Twitch, Twitter or Instagram @YagmanX . I also have a free horror game ‘Perfection’ on itch.io and a music EP ‘Sweater’ available on all major music streaming platforms :)