Fixing Hair For Metahumans in VR



https://docs.metahuman.unrealengine.com/en-US/MetahumansUnrealEngine/MetaHumansLODs/

https://forums.unrealengine.com/t/can-someone-please-fix-the-grooms-issue-in-4-26-1-2/225127

You can enable cards by setting the console variable r.HairStrands.Strands to 0.
That way you get the head hair. Using the sample MetaHuman files then seems to allow the eyebrows to be re-enabled, but I’m stuck trying to get the eyelashes to work in a passable way…
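If you want to force this from code rather than typing it into the console each time, a minimal sketch looks like the following (verify the cvar name in your engine version):

```cpp
#include "HAL/IConsoleManager.h"

// Force groom strands off so hair falls back to cards/meshes.
// Equivalent to entering "r.HairStrands.Strands 0" in the console.
void DisableHairStrandsRendering()
{
	if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(TEXT("r.HairStrands.Strands")))
	{
		CVar->Set(0);
	}
}
```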

Adjusted M_EyeLash_HigherLODs_Inst to fix eyelash opacity for the eyelash cards created in the construction script.

Better eyelashes

Tutorial – Retargeting UE4 mannequin animations onto a non-standard bipedal skeleton.

Often you may find yourself with a bipedal character, such as a monster, that does not have enough animation assets. On top of that, its skeleton does not match the default UE4 mannequin’s, so you can’t simply share animation assets between them. There is a solution though: you can retarget the animations you have for the default mannequin onto your character.

This is a step-by-step walkthrough showing how you can retarget animations that are rigged for the UE4 default mannequin onto a different bipedal skeleton.

1 – Preparing the skeleton assets for retargeting

The first thing to do is check that both of your skeleton assets are ready for retargeting.
For that you need to:

  • Open both skeleton assets.
  • Go to the Retarget Manager.
  • Set up their rigs: choose Humanoid.
  • On the non-default skeleton you may need to set up the root-target correspondence: choose the appropriate bone (if one exists) for each of the targets provided.
  • Important: make sure both of them have matching poses, be it A-Pose or T-Pose. You can change them at “Manage Retarget Base Pose”, below “Set up Rig”.

The rig that was just set up bridges the source skeleton’s bones to the target skeleton’s.

Figure 1 shows how the end result should look, with both skeletons doing an A-Pose.

Figure 1 – Setting up the rig.

2 – Retargeting the animation asset

After this you can go to an animation asset of your choice; it can be anything, be it an aim offset, blend space, blueprint, montage, or sequence. Right-click it, select “Retarget Anim Assets” and then “Duplicate Anim Assets and Retarget”. You’ll be presented with a window similar to the one in Figure 2.

In this window you can select your target skeleton, and after selecting it you should notice that both meshes are doing the same pose. There’s a red warning to remind you of this. You can also add extra tasks, such as adding prefixes or suffixes, replacing parts of the asset name, and even selecting the folder the new asset will be saved in.

Figure 2 – Animation retarget window, note that both meshes are doing the same pose.

After the process is done you can open the newly created asset and check the result. It is likely to look something like the video below, where the mesh seems horribly distorted. What gives?

In the video above, the non-retargeted skeleton animation (in yellow) is superimposed on the retargeted animation (in white).

3 – Retargeting options

There is another step, omitted above to show what can go wrong during retargeting and how to fix it: setting up the retargeting options. While the rig bridges one skeleton’s bones to the other’s, the retargeting options tell the engine how to handle the animation data that was just retargeted.

On the skeleton asset click “Options” and then “Show Retargeting Options”, see Figure 3.

There are multiple options available, but following Epic Games’ guidelines, as a rule of thumb we use the following setup:

  • The root bone (the first bone in the chain), weapon and cloth bones (if available), IK bones and other markers are meant to use the “Animation” option.
  • The pelvis bone (or another that has the same purpose) will be set as “Animation Scaled”.
  • The remaining bones use the “Skeleton” option.

For a fast setup do it like this:

  • Right click the root bone and select “Recursively Set Translation Retargeting Skeleton”.
  • Then find the pelvis bone or equivalent and set it to “Animation Scaled”.
  • Set any other bones, such as the root bone, weapon or cloth bones, IK bones and markers, to “Animation”.
Figure 3 – Retargeting options
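If you prefer to script this fast setup rather than clicking through the skeleton tree, a rough editor-only sketch is below. It assumes the skeleton uses bones literally named “root” and “pelvis”; substitute your skeleton’s equivalents.

```cpp
#include "Animation/Skeleton.h"

// Apply the rule-of-thumb retargeting options to a loaded USkeleton.
void ApplyRetargetingOptions(USkeleton* Skeleton)
{
	const FReferenceSkeleton& RefSkel = Skeleton->GetReferenceSkeleton();
	const int32 RootIndex = RefSkel.FindBoneIndex(TEXT("root"));
	const int32 PelvisIndex = RefSkel.FindBoneIndex(TEXT("pelvis"));
	if (RootIndex == INDEX_NONE || PelvisIndex == INDEX_NONE)
	{
		return; // Bone names differ on this skeleton; adjust the names above.
	}

	// "Recursively Set Translation Retargeting Skeleton" from the root bone.
	Skeleton->SetBoneTranslationRetargetingMode(RootIndex, EBoneTranslationRetargetingMode::Skeleton, /*bChildrenToo=*/true);

	// The root itself (plus IK/marker/weapon bones, handled the same way) uses "Animation".
	Skeleton->SetBoneTranslationRetargetingMode(RootIndex, EBoneTranslationRetargetingMode::Animation);

	// The pelvis (or equivalent) uses "Animation Scaled".
	Skeleton->SetBoneTranslationRetargetingMode(PelvisIndex, EBoneTranslationRetargetingMode::AnimationScaled);
}
```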

With everything now set up, check the retargeted animation asset again and see how those changes affected the animation.

Comparing with the previous example, in this video the mesh is much less distorted, and there is a striking difference between the non-retargeted and the retargeted animation when they are superimposed.

4 – Remaining issues

There are still two details that on this example require fixing:

  • The mesh is placed below ground level: this may be because the root bone of this specific skeleton is offset to sit at the same height as the pelvis. When retargeted, the target root bone is placed at the source root bone’s position, which is on the ground.
  • The mouth and tongue still present a bit of an issue: this may be due to a lack of information on how to translate those bones in the source animation.

Both of these issues can be tackled on the animation blueprint.

For the root bone height issue, one possible fix is as follows: whenever the retargeted animation plays, apply an offset to the root bone (see Figure 4):

  • Add a “Transform (Modify) Bone” node to the animation blueprint.
  • On the translation pin apply a vector with an offset on the Z-axis. To settle on a value you may need some trial and error; in this example, 100 Unreal units worked well.
  • In this case we switch the offset on and off depending on whether the retargeted animation is playing.
Figure 4 – Root bone height offset fix.
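In C++, the same switch could live in a custom anim instance. This is only a sketch (the class name and the bPlayingRetargetedAnim flag are hypothetical), with RootBoneOffset wired into the Translation pin of the Transform (Modify) Bone node:

```cpp
#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "RetargetFixAnimInstance.generated.h"

UCLASS()
class URetargetFixAnimInstance : public UAnimInstance
{
	GENERATED_BODY()

public:
	// Hypothetical flag, set by gameplay code while a retargeted animation plays.
	UPROPERTY(BlueprintReadWrite, Category = "Retargeting")
	bool bPlayingRetargetedAnim = false;

	// Wire this into the Translation pin of the "Transform (Modify) Bone" node.
	UPROPERTY(BlueprintReadOnly, Category = "Retargeting")
	FVector RootBoneOffset = FVector::ZeroVector;

	virtual void NativeUpdateAnimation(float DeltaSeconds) override
	{
		Super::NativeUpdateAnimation(DeltaSeconds);

		// 100 Unreal units was found by trial and error for this mesh.
		RootBoneOffset = bPlayingRetargetedAnim ? FVector(0.f, 0.f, 100.f) : FVector::ZeroVector;
	}
};
```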

To solve the tongue and mouth issue: in this example there were already some walking animation assets made for this specific mesh, in which the character walks with its mouth closed.

In the animation blueprint we saved this animation as a cached pose (from the animation asset pin, type “New saved cache pose” and select it; you can then rename the new node). With this pose saved, it was then just a matter of blending it onto whatever other pose the character might be playing (see Figure 5).

Figure 5 – Mouth and tongue bones fix.

As you can see in the picture above, there is a “Layered blend per bone” node with two cached poses fed in. The one below (second pin) is layered on top of the one above (first pin), which is the saved cached pose with the mouth shut. In the details panel you can check the values that were set. In the branch filter, the “root” bone of the tongue bone chain is set with a value of -1. What does that mean? It means the animation data being layered on top is discarded for that bone chain. In other words, the chain that drives the mouth and tongue is driven only by the pose with the mouth shut.

5 – Conclusion and documentation

This concludes the tutorial. While it does not cover every issue that may arise, it gives you the basis to tackle them and to try different solutions as soon as they appear.

For further reference, see the following links to Epic Games’ documentation on the subject:

https://docs.unrealengine.com/4.26/en-US/AnimatingObjects/SkeletalMeshAnimation/AnimationRetargeting/

https://docs.unrealengine.com/4.26/en-US/AnimatingObjects/SkeletalMeshAnimation/AnimHowTo/Retargeting/

Creating challenges for the pallet stacking project

The challenges are set up so that the player is given a 10-minute time frame to complete as many challenges as they can. During this period they can select easy, medium or hard challenges, which score different amounts accordingly. They start a challenge by pulling the lever after pressing the difficulty button of the challenge they would like to do. They will then get a random challenge of that difficulty.

How to start a challenge.

During the challenge they then have to stack as many boxes on the pallet as they can and then pull the lever to deliver the pallet. They will incur penalties for any unstacked boxes or any boxes that have been crushed. Crushed boxes also incur a time penalty.

Challenge Design

The challenges are split into three difficulties. The easy challenges only use the basic types of block – heavy, standard and light – plus the variations in size of those blocks. They also make minimal use of the waves system, delivering only a few new blocks at most. The medium challenges introduce the fragile heavy blocks and start to make much more use of the wave system. Finally, the hard challenges introduce blocks starting on the pallet. Each step up in difficulty is designed to make the player think more about the placement of the blocks by constraining the number of possible solutions. It is hard to quantify the exact number of solutions for any given challenge, as there are a large number of configurations of the blocks on the pallet; however, by introducing these constraints we remove options, reducing the available solution space.

Each challenge is intended to take approximately 2-3 minutes to complete, allowing a player to complete 3-5 challenges in the allotted 10 minutes. The player can choose the difficulty of each challenge as they play, so they can adjust their play to maximize points or take on easier challenges to learn and improve.

Challenge Technical Setup

When setting up a challenge you can select which boxes you want to spawn from the drop-down (you can select any of the size and weight variations that have a blueprint created for them), and you can set them up to spawn in waves. A new wave is spawned when the number of boxes stacked on the pallet reaches the number specified in the wave.

You can also have boxes spawn already on the pallet by adding them to the array ‘ObjectsStaringOnPallet’ and setting a starting offset corresponding to the position on the pallet where you would like them to spawn; these boxes can’t be moved by the player. The final thing that can be specified is the size of the pallet, by entering 2×2 or 3×3 in the PalletType field. To add different pallets, a new pallet blueprint child would need to be created.

The challenges are all set up in a data table, so it’s straightforward to add and adjust challenges.

Example challenge setup in the data table.
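As a rough illustration, the row struct backing such a data table might look like the sketch below. Only ObjectsStaringOnPallet and PalletType are named in the project; the other field names and types are guesses for illustration.

```cpp
#include "CoreMinimal.h"
#include "Engine/DataTable.h"
#include "ChallengeRow.generated.h"

USTRUCT(BlueprintType)
struct FChallengeRow : public FTableRowBase
{
	GENERATED_BODY()

	// 0 = easy, 1 = medium, 2 = hard (an enum in a real project).
	UPROPERTY(EditAnywhere, BlueprintReadOnly)
	uint8 Difficulty = 0;

	// Soft references to the box blueprints to spawn.
	UPROPERTY(EditAnywhere, BlueprintReadOnly)
	TArray<TSoftClassPtr<AActor>> BoxesToSpawn;

	// A new wave spawns when the stacked-box count reaches this number.
	UPROPERTY(EditAnywhere, BlueprintReadOnly)
	TArray<int32> WaveTriggerCounts;

	// Boxes that start fixed on the pallet (name as used in the project).
	UPROPERTY(EditAnywhere, BlueprintReadOnly)
	TArray<TSoftClassPtr<AActor>> ObjectsStaringOnPallet;

	// Starting offsets on the pallet for the fixed boxes above.
	UPROPERTY(EditAnywhere, BlueprintReadOnly)
	TArray<FVector> StartingOffsets;

	// "2x2" or "3x3" (name as used in the project).
	UPROPERTY(EditAnywhere, BlueprintReadOnly)
	FString PalletType;
};
```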

The code for starting and stopping the challenges, as well as tracking the score etc., can be found in the PalletStackerGameMode.

When a challenge is started I un-pause the timer and then get a random challenge of the selected difficulty from the data table. I then create the pallet and spawn in the boxes defined by the challenge. When stopping a challenge I do the opposite: clear the pallet and cash in the boxes, updating the player score. The code here is very straightforward and just steps through the process as you would expect.

Starting and stopping the challenges.
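A hedged sketch of the “get a random challenge of the selected difficulty” step, reusing the hypothetical FChallengeRow from above:

```cpp
#include "Engine/DataTable.h"

// Pick a random challenge row matching the requested difficulty.
FChallengeRow* GetRandomChallenge(UDataTable* ChallengeTable, uint8 Difficulty)
{
	TArray<FChallengeRow*> AllRows;
	ChallengeTable->GetAllRows<FChallengeRow>(TEXT("GetRandomChallenge"), AllRows);

	// Filter to the requested difficulty.
	TArray<FChallengeRow*> Matching;
	for (FChallengeRow* Row : AllRows)
	{
		if (Row && Row->Difficulty == Difficulty)
		{
			Matching.Add(Row);
		}
	}
	return Matching.Num() > 0 ? Matching[FMath::RandRange(0, Matching.Num() - 1)] : nullptr;
}
```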

When spawning the boxes, I retrieve the soft class references from the data table and then compare them to a list of box types stored in the game mode to spawn the appropriate box from that class.

Select class to spawn box from.
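In C++ terms, the spawn step might look roughly like this (a sketch; the actual project does this in blueprint):

```cpp
#include "Engine/World.h"
#include "GameFramework/Actor.h"

// Resolve a soft class reference from the data table and spawn the box.
AActor* SpawnBoxFromClassRef(UWorld* World, const TSoftClassPtr<AActor>& BoxClassRef, const FTransform& SpawnTransform)
{
	// Synchronous load is acceptable here; box blueprints are small.
	UClass* BoxClass = BoxClassRef.LoadSynchronous();
	if (!BoxClass)
	{
		return nullptr;
	}

	FActorSpawnParameters Params;
	Params.SpawnCollisionHandlingOverride = ESpawnActorCollisionHandlingMethod::AdjustIfPossibleButAlwaysSpawn;
	return World->SpawnActor<AActor>(BoxClass, SpawnTransform, Params);
}
```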

Setting Up AWS Account For Unreal Servers Using Blueprint Plugin

Introduction

As a start to this process, I’ve been following tutorials and understanding some key principles.

  1. The UE4 client has access to Amazon IAM credentials for logging in users via Cognito. On the Amazon side, these IAM credentials must be restricted to Cognito logins only, so that someone who obtains them cannot access other services.
  2. For higher security of data, the Unreal server must be the only one to send updates; these should be sent through GraphQL/AppSync and filtered to whitelisted IP addresses only.

Video

The video I am following for this setup is this video from Lion of the AWS plugin team.


First Step: Setting Up An AWS IAM User Configured to Access Cognito Only.

The first step is setting up a new Cognito-only Amazon AWS user. This allows that user to log in but not access other AWS services.

In the policy for this user, we create a new policy that restricts access to only certain actions within the Cognito User Pools service.

This user is restricted to:

SignUp, ConfirmSignUp, InitiateAuth, ForgotPassword, ConfirmForgotPassword, ChangePassword, ListUsers, and GetUser

Before finalizing this user, we must create a user pool.

User Pool Creation

The video walks through some standard steps for creating a user pool, covering specifications for the user password and email authorization.

The last step gets into some specifics of which app the user pool connects to and options for auth flows configuration.

In the video, the advice is that you can generate a client secret here, but don’t forget to create a secret hash in the AWS Cognito IDP blueprint nodes.

That likely refers to this node here on the cognito only example, we will have to check back on this later.

Adding User Pool Specifics to AWS IAM User Configuration For Cognito Only

The next step involved finishing the IAM user policy to restrict access to only the specific user pool we created. The video shows clicking “Add ARN to Restrict Access” and then copying the region and the specific user pool ID from the user pool you have created.

Configuring Federated Identity Pool

There are some steps to follow to create a federated identity pool. This takes in the User Pool ID and App Client ID.

Adding Cognito Identity Service To Cognito Only AWS IAM User

Add to the policy for the IAM cognito only user access to the Cognito Identity service.

Add access only to GetId and GetCredentialsForIdentity.

Now we have enough to finish creating the policy, reviewing the access and naming it cognito_user.

Create an IAM user with this policy

The next step is creating a new IAM user, attaching only this policy. These credentials can now be used inside the ue4 build.

Creating an Admin IAM User.

The video shows steps to create an IAM admin user for use with aws cli.

The video then shows configuring the aws cli to use these credentials, but this part is not completely clear.

Next it looks like it is referencing a role on AWS IAM of Cognito_awstutorialAuthRole. I do not remember seeing this setup earlier in the video.

Reviewing what was set up previously, I believe I set this up to be tied to this particular role Cognito_IDPoolFranckStagingAuth_Role, that must have been set up earlier when setting up the user, the policy, and linking the user to the userPool.

I can see when looking at the trust relationships tab it fits with what I had configured.

Adding Additional Access Permissions to the Authenticated Role

This next step shows adding additional permissions to the authenticated player.

First there is GameLift, with all GameLift actions allowed.

Then there is Lambda, with all Lambda actions/resources (this seems potentially dangerous; we may well have to revisit it).

Then the video briefly showed AppSync but did not add it here.

Setting Up Matchmaking Logic

The tutorial video shows assigning a matchmaking rule set as part of the gamelift setup. We currently do not have that configured.

I also found that the eu-west-2 region does not support matchmaking, so, similar to the example in this video, we will need to assign a different region for matchmaking than our fleet’s region.

On the description of this video, it mentions:

prerequisites: I have setup the whole matchmaking system in my previous videos, if you want to know how to set up matchmaking systems, please watch my previous videos.

So it seems like I will need to learn more about matchmaking before proceeding further on this tutorial.

This video seems to cover matchmaking from the plugin perspective.

And this starts to cover matchmaking from the official gamelift side.

https://docs.aws.amazon.com/gamelift/latest/flexmatchguide/match-intro.html

Shifting to a new Tutorial to do the original tutorial!

Before diving into another 50 minute long silent tutorial, I felt like it might be helpful to learn from some official aws documentation first.

https://explore.skillbuilder.aws/learn/course/421/play/1224/amazon-gamelift-primer

The primer first describes matchmaking as a pre-requisite, then gets into game session placement.

After a match is found, the game session needs to be started somewhere. Finding the best location available for the game session is called game session placement. The game hosting service should be able to examine the infrastructure resources and select a location that has the lowest average latency for all the players. Game session placement is another game hosting requirement.

The four layers of Gamelift:

Matchmaking relates players that want to play a game with a game session where it can be played.

Game Session Placement figures out where to host game sessions.

Session Management starts games and facilitates players to join games.

And Infrastructure management provides elastic control over game servers.

Notes from team conversation :

Many different types of authenticators/authentications, can be applied to each specific lambda functions.

Can have several entry points that can trigger the same function.

Fetching user profile can be available for web front end or available for a different policy

Settings are very flexible

yml function can regenerate entire aws setup

Different authorizers can be set up per lambda function

get token from launcher, attach token to every api request

Requirements For Setting Up Gamelift Local And Testing A Server Build

  1. Download Gamelift Managed Servers SDK

https://aws.amazon.com/gamelift/getting-started/

2. Download Java Development Kit 8 (scroll down to get it)

https://www.oracle.com/java/technologies/downloads/#java8-windows

or use this direct link

jdk-8u301-windows-x64.exe

3. Check whether Java is installed on your path by opening a command prompt (type cmd after hitting the Windows button) and typing java -version.

4. Unzip the GameLift SDK you downloaded in step 1 and copy the unzipped folder to a different drive than where you have Unreal installed (so if you installed Unreal on C, copy the GameLift SDK to D).

5. Navigate to that folder in Windows, select the navigation path, type cmd to replace it, and press Enter. This will open a command prompt pointed at that directory.

6. Type java -jar GameLiftLocal.jar in the command prompt. If it works you should see text like this.

7. Next step is to add the plugins for the project to your engine folder.

Tutorial – Using the Live Link Face app for iPhone 10+ to record facial animation assets for MetaHumans in Unreal Engine

This tutorial will walk you through the steps of bringing your MetaHuman to life with facial mocap straight from your iPhone. You will be guided through the process of setting up a new project ready for animation, importing your MetaHuman and connecting it to Live Link, before finally recording your animation and saving it as a separate asset that you can reuse on any other MetaHuman later.

The Live Link Face app only works on the iPhone X or above, so you will need access to one of these before beginning this tutorial. The screenshots in this tutorial are from UE4.26.2, but the same steps will also work in UE5, although the UI will look different. The main difference is that in UE5 Quixel Bridge is incorporated directly into the engine, whereas on Unreal Engine 4 you will need to download Quixel Bridge from here: www.quixel.com/bridge

1 – Starting a new project

  • Open Unreal Engine and start a new blank project

2 – Downloading and importing the MetaHuman into Unreal Engine

  • Open Quixel Bridge and sign in with your Epic Games account
  • Navigate to the MetaHumans panel and download one of the premade MetaHumans or create your own custom MetaHuman at www.metahuman.unrealengine.com. For this example, we will use Danielle.
  • Once the download is complete, press “export” while your unreal engine project is open in another window. Your MetaHuman will be imported into your UE4 Project.

3 – Enabling necessary plugins

  • After your MetaHuman is imported, you will see several warning messages asking you to enable missing plugins and project settings. Press “Enable Missing…” for all of these
Enable Missing for all Plugins and Project Settings Warnings
  • You now need to make sure the Live Link, Apple ARKit and Apple ARKit Face Support plugins are enabled. To do this, go to “Settings > Plugins”, then search for each of these in the search bar and press ‘Enable’ on each. You will need to restart Unreal Engine for this to take effect
Enable Live Link, Apple ARKit and Apple ARKit Face Support Plugins

4 – Placing your MetaHuman into the scene

  • With the necessary plugins enabled, you may now drag your MetaHuman into your scene. You can find the MetaHuman in your content browser at “Content > MetaHumans > Danielle > BP_Danielle “. Drag the BP_Danielle file into your viewport.
Drag the MetaHuman Blueprint into your scene

5 – Connect your MetaHuman to the Live Link Face app on your iPhone X (or above)

  • First Download the Live Link Face App off the App store
  • Open the app and go to the settings icon in the top left corner
  • In Settings go to Live Link and then “Add Target”
  • Here you will need to input the IP address of your computer. You can find this on your computer by pressing the windows key and typing “cmd” to open the Command Prompt. Here type in “ipconfig”.
  • Look for the line reading “IPv4 Address…”. The number at the end of this is what you need to enter into the Live Link Face App on your phone.
  • Now back in Unreal, go to the viewport and select your MetaHuman. In the Details Panel, under “Default” there is an option called “LLink Face Subj” with a dropdown menu. Click this and choose your iPhone from the list.
  • You are connected! If you additionally want to stream head rotation, you must enable the option below “LLink Face Subj” called “LLink Face Head”, and enable “stream head rotation” in the settings of the Live Link Face app on your phone.
  • To test the connection hit the play button at the top and move the camera so you can see your MetaHuman’s face. If you’ve followed the above steps correctly, the MetaHuman should be copying your facial movements.

6 – Recording a performance with the take recorder

  • First open the take recorder by going to “Window > Cinematics > Take Recorder”
  • You need to add your iPhone as a source. Click the green “+ source” button and then “From LiveLink > iPhone”
  • Select the iPhone source to bring up more details and uncheck “use source timecode”
  • If you also want to record sound (for example, if you are recording a speaking part), add “microphone audio” as another source
  • Before recording your take, click the play button at the top of the screen. (To keep the camera from jumping to the centre of the world, change the spawn location from “Default Player Start” to “Current Camera Location”.) This lets you see your MetaHuman moving during your recording
  • To record your take, press the circular red record button. There will be a three second count down before your take goes live. Press the square stop button when you are done.
  • Your take will be saved as a level sequence and can be found through the content browser at “Content > Cinematics > Takes > ‘current-date’ “

7 – Saving the take as an animation for later use in sequencer

  • To open your take in Sequencer, navigate to “Content > Cinematics > Takes > ‘current-date’ “ and find your take as a level sequence. Open it and then double click on the track “iPhone_Scene_…”. If you scroll through the keyframes and cannot see the animation play, this is likely because you have exited ‘Play’ mode.
  • You won’t be able to edit anything on this yet because, by default, takes are saved locked. To edit it, we need to click the lock icon in the top right corner of the sequence.
  • We are trying to save the take as an animation asset that we can use later. To do this, we need to add BP_Danielle to our timeline and bake the animation sequence. Drag BP_Danielle from the world outliner to the space in Sequencer underneath our iPhone Data.
  • We don’t need the Body, MetaHuman_ControlRig or Face_ControlBoard_CtrlRig tracks so delete those, leaving only the face track.
  • Next, right-click on the face track and click “bake animation sequence”. Name and choose where to save your animation and press “ok” and then “export to animation sequence”
  • You can then navigate to wherever you saved your animation asset (“Content > MetaHumans > Common > Face”) and open it to see the animation on its own.

8 – Re-using your animation asset in sequencer

  • Now that we have our MetaHuman facial animation asset, we can apply it to any MetaHuman in a separate sequence, with other animations (for the body, for example) applied separately.
  • Let’s create a new level sequence and add a camera and our MetaHuman. Do this by clicking “Cinematics > New Level Sequence”. Press the camera icon to create a new camera and then drag in BP_Danielle (or whichever MetaHuman you’d like to animate) as before.
  • Click the tiny camera icon next to “Cine Camera Actor” to lock your viewport to the camera. You can now move around and position your camera as you’d like. You can adjust the camera settings in the details panel
  • Delete the “Face_ControlBoard_CtrlRig” track and instead press the little plus sign next to “Face”. From here you should select your previously saved animation from the Animation dropdown menu.
  • Now your animation has been added to your MetaHuman in Sequence. Combine these and other animations / sequenced events to make short films and videos!

Building a VR pallet stacking system with procedural meshes in UE4

The goal here is to make a pallet stacking system that gives users a feeling of weight and makes them think about how boxes should be stacked.

The snap system

The snap system (Modular Snap System Runtime) is straightforward to set up. First, go to the static mesh that you would like the snap system to work with and add some sockets to the mesh. Optionally, if you only want snapping between specific objects, you can name the sockets so they match what you want to snap to. For example, you could have two sockets ‘box_01’ and ‘cylinder_01’ and then specify in the snap settings to pay attention to the socket names; these two would then no longer snap together.

Sockets placed on each face to allow snapping on all sides.

Next you need to set up your VR pawn so that the snaps are triggered when you drop an object that you would like to snap. To do this with the Vive_PawnCharacter, we follow on from the TryDropSingle_Client nodes, take the gripped actor, and use the snap node.

Snap actor node (things to note: ‘search only’ is ticked, and the snap settings can be seen on the right)

This is then followed by the smooth move node, which does as it says: it takes the snap info and smoothly transitions the object to its new location.

Smooth Move

In with the snapping code on the Vive_PawnCharacter are also some functions that are called on the dropped object and its new stack parents. These functions update the procedural meshes with the new weights above them on the stack, which in turn updates the deformation, and also update them to save a reference to their parent. This is also done when objects are grabbed, after the TrytoGrabObject nodes.

There is one problem with the snapping system for this use case: a procedural mesh can’t have sockets. I have solved this by using an underlying static mesh that handles the snapping instead of the procedural mesh. Another problem is that you can’t update sockets dynamically, which mattered because I didn’t want the objects to stick to the sides of each other. To solve this I have a function called ‘AdjustSnapPoints’ which swaps the underlying mesh as the object rotates, making sure there are only ever sockets on the top and bottom.

The Procedural mesh

To get the procedural mesh set up, first take the static mesh you want to use as the base for the procedural mesh, get the section from the static mesh, and use the outputted values to create a new mesh section for the procedural mesh. Then add collision using the ‘Add Collision Convex Mesh’ node.

The procedural mesh set up in the construction script.
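Done in code instead of the construction script, the setup would look roughly like this (requires the ProceduralMeshComponent plugin; note that reading mesh sections in packaged builds needs “Allow CPUAccess” enabled on the static mesh):

```cpp
#include "ProceduralMeshComponent.h"
#include "KismetProceduralMeshLibrary.h"

// Copy LOD 0 / section 0 of a static mesh into a procedural mesh component
// and give it simple convex collision.
void BuildProcMeshFromStaticMesh(UStaticMesh* SourceMesh, UProceduralMeshComponent* ProcMesh)
{
	TArray<FVector> Vertices;
	TArray<int32> Triangles;
	TArray<FVector> Normals;
	TArray<FVector2D> UVs;
	TArray<FProcMeshTangent> Tangents;

	// Read the raw section data out of the static mesh.
	UKismetProceduralMeshLibrary::GetSectionFromStaticMesh(SourceMesh, /*LODIndex=*/0, /*SectionIndex=*/0, Vertices, Triangles, Normals, UVs, Tangents);

	// Create the equivalent procedural mesh section (no per-vertex colors).
	ProcMesh->CreateMeshSection(0, Vertices, Triangles, Normals, UVs, TArray<FColor>(), Tangents, /*bCreateCollision=*/false);

	// Simple convex collision built from the same vertices.
	ProcMesh->AddCollisionConvexMesh(Vertices);
}
```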

To deform the procedural mesh you need to do three things: calculate the adjustment for each vertex, move the vertices, and update the procedural mesh. To calculate a vertex adjustment, first rotate the vertex to account for any changes in world rotation, then get the distance of the vertex from the bottom of the mesh, and finally multiply that by a factor of the current mass load on the object. This gives us the base deformation; we then do another step to add the buckling effect.

On the left, rotate the vertex; on the right, the function below followed by the multiplication.
Small function to work out the distance to the bottom of the mesh.

First I do a check to see if the vertex is on the top or bottom face, as we only want the buckling effect to apply to the middle vertices. If it is, we do a simple adjustment based on what we have already calculated; if not, we also do the buckling calculations.

Uses the component bounds to work out the z size.

To work out the buckling, get the distance of the vertex from the centre point of the mesh using ‘component bounds’, then map that into a range between 0 and the ‘OutwardsBuckleFactor’, which can be adjusted per object type to create different crushing profiles. Then multiply that by the direction of the vertex from the centre of the mesh to get the vertex change.

Move the vertex and save it into a new array. (Below this is the non-buckling version, which is the same but without the buckling part.)
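Pulled together as one function, the per-vertex calculation described above might look like this sketch. MassLoad, DeformFactor and OutwardsBuckleFactor stand in for the project’s actual variables, and the world-rotation correction is omitted by working in local space:

```cpp
#include "CoreMinimal.h"

// Squash a vertex based on its height above the mesh bottom and the mass
// above it, and buckle middle vertices outwards from the centre.
FVector DeformVertex(const FVector& Vertex, const FBoxSphereBounds& LocalBounds,
                     float MassLoad, float DeformFactor, float OutwardsBuckleFactor)
{
	// Base deformation: distance from the bottom of the mesh times the load.
	const float BottomZ = LocalBounds.Origin.Z - LocalBounds.BoxExtent.Z;
	const float Squash = (Vertex.Z - BottomZ) * MassLoad * DeformFactor;
	FVector Result = Vertex - FVector(0.f, 0.f, Squash);

	// Only middle vertices buckle; skip the top and bottom faces.
	const bool bOnTopOrBottom = FMath::IsNearlyEqual(FMath::Abs(Vertex.Z - LocalBounds.Origin.Z), LocalBounds.BoxExtent.Z, 1.f);
	if (!bOnTopOrBottom)
	{
		FVector FromCentre = Vertex - LocalBounds.Origin;
		FromCentre.Z = 0.f;

		// Map the distance from the centre into [0, OutwardsBuckleFactor]...
		const float Buckle = FMath::GetMappedRangeValueClamped(
			FVector2D(0.f, LocalBounds.BoxExtent.Size2D()),
			FVector2D(0.f, OutwardsBuckleFactor),
			FromCentre.Size());

		// ...and push the vertex outwards in that direction.
		Result += FromCentre.GetSafeNormal() * Buckle;
	}
	return Result;
}
```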

Finally, update the procedural mesh. In the picture below you can see that I have used a timeline to smooth this process out by lerping between the old vertex positions and the new ones. To do the updating, use the ‘update mesh section’ node, then clear the old collision and rebuild it. When this is done, check whether the mesh has been crushed, and if it has, highlight it.

Update the mesh smoothly so it looks like it is being crushed.
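The final update step, in the same sketch form (NewVertices being the output of the timeline lerp):

```cpp
// Push the lerped vertex positions into the mesh and rebuild collision.
ProcMesh->UpdateMeshSection(0, NewVertices, Normals, UVs, TArray<FColor>(), Tangents);
ProcMesh->ClearCollisionConvexMeshes();
ProcMesh->AddCollisionConvexMesh(NewVertices);
```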

The result of this is that the mesh deforms after a certain amount of weight has been put on top of it, as you can see in the video below. There are still some improvements to be made, such as updating the sockets so that they more closely fit the procedural mesh (to stop the meshes from floating).

Video showing off all of the above.

Tutorial For Installing Amazon Multiplayer Plugin

  1. Download Unreal Engine 4.25+ from GitHub.
  2. Follow the instructions from the Epic Games GitHub (download prerequisites, Setup.bat, GenerateProjectFiles.bat).
  3. Build the Development Editor by opening the .sln file in Visual Studio. This should take about 1-2 hours.
  4. Unzip the marketplace.rar file and move the marketplace folder into the new Unreal Engine install location.
  5. Build the Development Editor again (should just take 5 minutes this time).

New Project Steps

  1. Make a new project
  2. Create a server target by duplicating PnameEditor.Target.cs in the project Source folder, renaming the copy to PnameServer.Target.cs, and renaming the class inside the file to PnameServerTarget to match.

There are two starter tutorials to consider:

Implementing Gamelift Into A New Project

The first of these two tutorials gives an overview on implementing gamelift into a new project completely from scratch.

Consider that in project settings, the server default map needs to be set and this is the map the server will enter into. The client enters into the map called EntryMap, a different entry point than the server.

I suggest watching this tutorial to understand the different parts of what is going on, but starting from the gamelift only starter project instead of recreating it yourself.

Starting From The Gamelift Only Starter Project:

https://mega.nz/file/2OhGnIzL#qCJ93Rt3102cAYBkJ_4kK4mVwAxCA6TnWUn150UHpBM

It is also important to make sure maps are included in the packaged project – the Gamelift Only sample project shows these as included by default.

After building, dedicated servers appear in the Saved > StagedBuilds directory.

The next step involves uploading a build to aws gamelift which requires the aws CLI tool.

For our purposes we also want to create builds in linux, which requires installing some tools to create ue4 dedicated server builds in linux.

For this, we will investigate this tutorial: https://medium.com/swlh/building-and-hosting-an-unreal-engine-dedicated-server-with-aws-and-docker-75317780c567

This tutorial mentions first installing a clang prerequisite; for our version, Unreal 4.25, this is available to download and install here.

https://docs.unrealengine.com/4.26/en-US/SharingAndReleasing/Linux/GettingStarted/

With the clang pre-requisite installed, now is the time to try and see if we can build to a specific running linux instance from within the project launcher window.

By default it does not appear here.

To get a Linux platform to appear here, I was advised to run a Linux shell by installing Debian from the Windows Store. First opening it returns this error – checking into it.

Here is the site they are referring to:

https://docs.microsoft.com/en-us/windows/wsl/install-win10

I decided to try a simplified install.

It appears to have worked :).

After restarting, it appears an error popped up in Ubuntu–this will require investigation.

Perhaps this installation failed because I didn’t join the Windows Insider program and install a test Windows build. For now I will put this on the backlog and see if I can progress by making dedicated Linux server builds for uploading to AWS, while packaging Windows builds for testing with my local GameLift.

After speaking with Atanas–we think it’s likely an easier fix to enable this in my bios settings and I won’t need to join the windows insider program.

If I need to return to creating a linux environment for building the project and doing local gamelift testing, I can return to this video tutorial here https://www.youtube.com/watch?v=ZRl_P8Q5_fI

On the next step here, I am packaging and building the Gamelift Only sample project for Windows from the Project Launcher.

Now after packaging, it is time to upload this build with the amazon command line tools.

First step after downloading the command line tools is to configure my aws environment with access key, secret key, and default region (eu-west-2) and default format (json).

Next is doing the actual upload in the command line.

aws gamelift upload-build --name <name> --build-version <version> --build-root <path> --operating-system WINDOWS_2012 --region eu-west-2

We modify this to be:

aws gamelift upload-build --name "GLTestBuild" --build-version 1.0 --build-root "E:\UnrealProjects\GameliftOnly\GameliftOnly\Saved\StagedBuilds\WindowsServer" --operating-system WINDOWS_2012 --region eu-west-2

Now after some patience, it is uploaded. Next step is to create a fleet from this build!

To specify the build path, from a full string like this (where GameLiftTutorial is the project name):

E:\UnrealProjects\GameliftOnly\GameliftOnly\Saved\StagedBuilds\WindowsServer\GameLiftTutorial\Binaries\Win64

You specify just

GameLiftTutorial\Binaries\Win64\GameLiftTutorialServer.exe and set concurrent processes to 20 for testing (which means 20 players).

Configure the ec2 port settings as per this image.

Now it’s time to go back to that sample project and update our configurable variables for connecting to Gamelift.

Opening the GameEntry map and opening the level blueprint, we can see two well organized sections for aws local and aws online respectively.

Here we edit the access key, secret key, and region.

The next step is to edit the fleet alias name variable which connects into the Create Game Session event with our fleet name from the aws gamelift console.

Now I attempt to play on this new fleet by hitting play in the editor, but we can see from the print string a friendly error message showing that the fleet is not yet active but is still activating.

This ended up producing an error that prevented the fleet from starting. After testing locally it did not seem to connect new players either, though the GameLift SDK running locally did detect the server exe being opened and reported healthy processes. It’s possible I missed something there.

For running windows builds, the developers of the gamelift blueprint plugin did mention it was necessary to include specific windows files.

https://github.com/multiplayscape/Multiplayer-With-TDM/tree/main/Linux%26WindowsFiles/windowsrequiredfiles

But for now, I decided to march on with Linux instead as recommended by Atanas.

I found this very useful article for enabling virtualization on my machine.

How to Enable Virtualization on Windows 10

After doing this, I successfully got debian to install.

Now my next steps are packaging a linux build and running aws locally in linux to test it.

I was recommended to follow this tutorial by the developers of the plugin:

Following this silent tutorial I reached a point of no return where installing openssh-server did not seem to proceed.

This led me to find this other tutorial which I will now be considering.

Atanas sent me some commands to try out for installing ssh server, so I will test those out first before going through this tutorial.

  • sudo apt update
  • sudo apt upgrade
  • sudo apt-get install openssh-server
  • sudo service ssh start (to run the ssh server)
  • sudo service ssh stop
  • sudo service ssh status

Using these commands, I was able to start an SSH server. The How to Build Linux Server with WSL tutorial then shows I need to run the command ip a to find the IP address of this server.

127.0.0.1/8

The next step in the tutorial shows I need to add an unlisted linux platform as an option from the device manager, but I am not seeing this available on the drop down.

Although I have installed the Linux clang, this is potentially not appearing because we are missing some additional steps for configuring environment variables.

It seems like the environment variable was set automatically, but watching this tutorial video showed me I need to build the engine again after installing the linux clang.

After rebuilding the engine, now I see the options for adding linux from the device browser.

I have entered the ip address of my linux ssh server and for the username and password I put the username and password I had entered for debian and the ssh server.

Now it’s time to try building the project from Project Launcher, making sure to select by the book for the data build option.

The build failed on this step, so from the Windows terminal I now connect to the SSH server running in Debian.

I ran into errors building the project when setting the ip for 127.0.0.1/8 and also for 127.0.0.1.

This error seems more promising to fix.

This post seems to suggest I need to re-run setup.bat

https://answers.unrealengine.com/questions/542062/linux-cross-compile-visual-studio-error.html

Searching further on the error, I found suggestions on the plugin’s Discord to update the version of PuTTY that is included with Unreal Engine.

I have attempted to update the putty files here and will try re-running the linux build process.

putty files

This appears to have made some progress… building a Linux server build now!

The process appears to fail at the end, but these error messages can be ignored.

The next step is to run the GameLift Local server from the Windows command line with the command java -jar GameLiftLocal.jar. This opens GameLift Local, which starts looking for servers.

Then we need to go to this directory to find our new Linux server build:

E:\UnrealProjects\GameliftOnly\GameliftOnly\Saved\StagedBuilds\LinuxServer\GameLiftTutorial\Binaries\Linux

Along the way, as shown in the image below, I learned some things about navigating to this path from Linux. The Linux server can access the Windows directories via the /mnt directory at its file root.

Unfortunately, here you can see that running the command ./GameLiftTutorialServer does not open the server and instead produces an error.

After searching, the fix for this error suggested:

“make sure you have put your project and unreal engine source in different partition and the plugin in unreal source engine. because of a bug of windows cpp linker, windows will just link a relative path to the binary in unreal engine to executable if they are in the same partition” – which I have interpreted as: put my project files on the D drive while keeping my engine source on the E drive.

After this, it still did not fix the bug, so I am attempting to rebuild the project from the D drive.

Re-building the project from the d-drive worked, but then launching the server gave this error.

Discussing with Lion from the plugin team (very helpful), he suggested this might be blocked by the Windows firewall. Disabling Windows Defender is not enough; I will need to modify the registry with regedit.

“you need to modify the regedit table on windows to terminate windows firewall. turning off windows defender will just allow all income from network. not included dbus”

He also gave some helpful advice: it can be useful to have a separate Linux PC for this kind of testing, or to develop on a Mac, to avoid needing to run a virtual machine. Food for thought!

He also gave some advice that it is possible to retrieve logs from the gamelift game session consoles, but only for terminated game sessions.

Next Steps From Here

So I have almost arrived at the end of my journey of running a UE4 Linux server build on a local Linux virtual machine, monitored with AWS GameLift Local.

The next steps for me will be disabling the firewall as suggested with regedit and then attempting this again.

As a recap for myself now that I almost have a built thing ready to go, after restarting my computer I will need to:

A. Open debian and run these commands

sudo service ssh start (to run the ssh server)

sudo service ssh stop

sudo service ssh status

B. Connect to debian from windows command prompt with ssh brian@127.0.0.1

C. In the debian command prompt, navigate to the path where you installed the gamelift local sdk with cd /mnt/e/Gamelift/GameLiftLocal-1.0.5 and start Gamelift Local with this command: java -jar GameLiftLocal.jar

D. In a new Debian window, navigate to the Linux Server Build Path

cd /mnt/d/GameliftOnly/GameliftOnly/Saved/StagedBuilds/LinuxServer/GameLiftTutorial/Binaries/Linux/

E. Start the Linux server with the command ./GameLiftTutorialServer

F. Monitor to see if this time it successfully connects to GameLiftLocal.

A successful connection should look like this:

Resuming Progress!

It turns out we did not need to edit firewall settings; our error turned out to be something a bit simpler.

I was opening the GameLift Local jar file in a Windows command prompt, so the Linux server running in the Debian environment could not see GameLift Local running.

Now the next step is to see if we can connect to the locally running server from either a client build or when playing from the editor.

First step now is to make sure I have changed the game entry map for the client to point its begin play event to the aws local side of things.

There was a small compile error here I had to fix; I’ve renamed the fleet variable name to fleet arm 2.

Running this, I ran into the errors below when trying to connect. It seems I will need to modify the blueprint code of the Gamelift Only project, and I’ll be reviewing the tutorials again to see if there is anything I missed.

Create Player Session Error

When connecting the client again, it says there is no available process. Perhaps the AWS local server configuration needs to allow more simultaneous processes.

Wednesday

To make sure there are no problems with my environment, I am now packaging, as a Linux server build, a build we know connects.

Thursday/Friday/Sunday

Packaging a build we knew 100% worked on my colleague’s computer, we tried lots of different IP addresses from ipconfig, but the Windows client playing from the editor still could not connect to the GameLift server running in Debian.

The next steps now are to:

  1. Try a tutorial to set up a VirtualBox environment for Linux instead of Debian on WSL, as there may be better control over network parameters
  2. If this fails, try running the linux build on another computer on the network and connect to it that way.

I am going through this tutorial now on setting up the Virtual Box.

Lion from the discord support team for the AWS plugin also recommended watching this video, which I’ll watch soon and summarize the learnings from there.

First, I watched Lion’s tutorial video. It was a great overview of the overall Gamelift architecture. When I re-edit this into a more organized tutorial, I think this is the kind of video to definitely put at the beginning.

I learned some great points about configuring autoscaling for your AWS fleet, and about making sure the server is responsible for sending critical data to the AWS backend; this is vetted by checking that the sender’s IP is one of the IPs in the AWS fleet of Unreal dedicated server processes, not just any client.

Second, I did succeed in getting a virtual box environment set up, but when I go to open the linux server I am running into an issue.

I will investigate this error some more first before abandoning the virtual box set up in favor of trying another pc for hosting the aws local server.

Investigating on discord did not give many leads, so I am going to explore the option of building a dedicated server on windows for my local aws testing, and then later building a linux build to upload to aws.

A while back, Lion messaged me to ask if I had installed the required files for building a windows server build.

https://github.com/multiplayscape/Multiplayer-With-TDM/tree/main/Linux%26WindowsFiles/windowsrequiredfiles

I did get the server and the GameLift jar working inside the VirtualBox environment, but in the end I ran into the same error we had with Debian, where my Windows client would not connect to the GameLift server.

Monday

I’ve decided to try packaging a windows server to see if I can connect to that.

I ran into some errors trying to package the gamelift SDK, so I have started from part 1 of this tutorial series and installed visual studio 2019.

As part of part 2 of this tutorial series, I have installed cmake and set the system path variables for CMAKE and MSBuild and then also set a new system variable for the visual studio 2019 common tools.

Running this command in a command prompt to build the GameLift SDK:

E:\Gamelift\GameLift-SDK-Release-4.0.2\GameLift-Cpp-ServerSDK-3.4.2\out>cmake -G "Visual Studio 16 2019" -A x64 -DBUILD_FOR_UNREAL=1 ..

Next step:

msbuild ALL_BUILD.vcxproj /p:Configuration=Release

These steps were for building the SDK, but I realized that with the GameLift plugin we already have a built GameLift SDK in our engine folder, so I fast-forwarded past this part and stopped the process.

Beyond this, I resolved an error I had with GameLiftLocal.jar no longer opening on Windows by reinstalling the Java JDK.

I then got the Windows server build to open, now that I had populated it with the prerequisite installer and install.bat.

To open the Windows server with logging enabled, I launch it from a cmd prompt with these parameters added: Convrs_Quest2Server.exe -log -port=7779

This makes me wonder if I could add a port as a parameter to the linux server to resolve the problems we were getting.

Building a responsive cinematic animation system in Unreal Engine 4

Singularity University is building a diversity training exercise which requires animating full scenes with multiple human characters, including the potential for these human characters to interact with the player or the environment.

To support this development, we are using the following pieces of unreal engine.

  • Animation Blueprint
  • Level Sequences and Template Sequences
  • Blueprint Character
  • Animation Slots within Animation Blueprint
  • Fabrik IK
  • Power IK Plugin

Videos Of Implementation

Level Sequence, Template Sequence, and Blueprint Triggering

A template sequence can be bound to a particular kind of actor, in this case BP_NPCCharacter.

This particular template can be re-used in different level sequences, so an animator can plug in a sequence of static animations.

Here is an example of the process of inserting a template sequence for the track of a particular chosen actor of the level sequence (in this case BP_NPCCharacter3)

These sequences can be triggered in blueprint, such as this trigger for the specified sequence to play in BP_SequenceManagerActor.

If we were strictly to rely upon animation sequences without an animation blueprint though, we might find ourselves lacking some of the programmatic interaction we would like to add to achieve a believable scene.

Let’s say for example we wanted to create a board room scene with 4 characters seated, looking at a speaking character.

To achieve this purely using sequencer, an animator would need to create custom sequences for each individual character, each with their own different rotations on the head for looking at the speaking character.

Enter the Animation Blueprint & IK Animation

The goal in setting up the animation blueprint for the NPC character is to create an animation blueprint with the flexibility to play triggered animations from the sequence manager, but also to respond intuitively to the position and actions of the player and the environment.

To achieve this goal, the setup needs to:

1) Track the player head position in VR and animate the eyes and head of the NPC character to follow the player at times when it is appropriate to do so.

2) Allow the NPC to track other objects or NPCs when appropriate to do so (ie, a cue to look at a presentation screen in the environment or another speaking character)

3) Allow the NPC to interact with specified objects when they come within a certain proximity of the NPC. These objects will have an interface defined which returns an Interaction Grip Type, to specify what kind of interaction the NPC should have with the object (should they reach out to grab it with an open palm, grasp it, grab it like a door handle?)

4) Allow the NPC to transition between moving, opening a door, and sitting at specified locations.

5) Allow a slot in the animation blueprint to play specified animations from level/template sequences.

Anim BP Part 1: Locomotion State Machine

The first part of this setup in the animation blueprint is a core locomotion state machine.

This allows for flexible behavior on the character’s base animations, changing between base poses for moving, sitting, idle, and interacting with objects.

The result of the locomotion state machine is saved into a cached pose to be used later in the anim graph.

Sitting in particular has a unique animation setup. Allowing the character to sit but still play other animations with its upper body uses an animation node called Layered Blend Per Bone. Setting the base pose to an animation where the character is sitting, and then layering on top of it from spine_01 up, allows additional animations to be added which affect only the upper body.

Anim BP Part 2: Enabling Template Sequences to Override The Base Animation

https://docs.unrealengine.com/4.26/en-US/AnimatingObjects/Sequencer/HowTo/BlendingAnimBPs/

Following the tutorial link above shows the process for connecting the character blueprint, animation blueprint, and template sequences, allowing animations triggered in Sequencer to be combined with the additional functionality of animation blueprints.

In our case, this node is available here and saves its cached pose for use in the later stages of the animation blueprint. It will need to be updated to apply the same principles for separating sitting and upper-body animations, and to pass in a variable if the animator wishes to set the character as sitting during the animation they configure in the sequence.

Anim BP Part 3: Adding Head & Neck Rotation

This node uses Fabrik IK and passes in a rotation calculated in the character blueprint. The Fabrik IK node in this case is set to copy target rotation, the tip bone is set to the head and the root bone is set to the neck.

Fabrik IK node
This will need to be updated to set limits on maximum head rotation.
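A hedged sketch of the character-blueprint side in C++, including the rotation limits mentioned above (the function, limit values and variable names are placeholders):

```cpp
#include "Kismet/KismetMathLibrary.h"

// Compute the rotation fed to the Fabrik node, clamped so the head
// cannot turn unnaturally far from the actor's forward direction.
FRotator ComputeHeadLookAt(const FVector& HeadLocation, const FVector& TargetLocation, const FRotator& ActorRotation)
{
	const FRotator LookAt = UKismetMathLibrary::FindLookAtRotation(HeadLocation, TargetLocation);

	// Work relative to the actor so the clamp is symmetric around "forward".
	FRotator Delta = (LookAt - ActorRotation).GetNormalized();
	Delta.Yaw = FMath::ClampAngle(Delta.Yaw, -70.f, 70.f);
	Delta.Pitch = FMath::ClampAngle(Delta.Pitch, -35.f, 35.f);
	Delta.Roll = 0.f;

	return (ActorRotation + Delta).GetNormalized();
}
```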

The next step here is a node called Blend Poses by Bool. This allows an animation feature such as this one to be turned on and off: if the programmer or animator turns off the “Follow Head Position” variable in the character blueprint, the cached pose from the previous step is used instead, without any modification to the head/neck turn.

Anim BP Part 4: Adding Object Interaction

This part is still in development. Its setup is similar to the neck-turning setup, but relies instead on the position of an object, matching the hand, arm, and shoulder position to that object. It can be turned on and off with a boolean, and the character can be set up to only trigger this boolean if an object with a matching interface begins an overlap with the character.

The next step for this is investigating the Power IK plugin to see if we can get better results.

Power IK

The investigation into Power IK was a little underwhelming: it feels like a bad trade-off of high complexity for low payoff. Its full-body IK tends to twist more parts of the human mannequin than we would really want, and in my opinion it does not offer as much easy-to-use flexibility as the Fabrik IK nodes in terms of specifying how far up the bone hierarchy the IK should have influence.

Exploring it, however, introduced me to Control Rig, which I think has high potential for helping our animation systems.

Control Rig

Building Additive IK with Control Rig Nodes

One of the really attractive things about working with control rig is the ability to pass in parameters and consolidate multiple animation adjustments in one cleaner node.

modifying an animation by applying IK with control rig

For example, with this older approach of using multiple IK nodes and then adjusting wrist rotation, all of these nodes would need to be copied to other parts of the animation blueprint if we wanted to re-use the same effect (say we wanted to make an anim BP sequence for shaking hands and for grabbing an object). With Control Rig we could build one rig for a general purpose (moving the arm towards an object, or a hand to shake) and then apply more specific nodes afterwards to the hands and fingers for the difference between hand shaking and object grasping, trimming down the complexity of the overall animation blueprint, as these nodes are shared.

previous example, multiple IK’s chained together for opening a door

Control Rig Controls, Forward Solve, Backward Solve

After learning about Power IK and its control rig, I later found that Epic Games has a much more fleshed-out humanoid control rig for both the male and female mannequins, available as a downloadable project.

The control rig example provided by Epic Games

The control rig graph is separated into two sections, forward and backward solve.

green forward solve, blue backwards solve

The forward solve section means that if you place instructions in that section to modify things (such as moving the hand control to a specified location, then specifying that the hand bone in the rig hierarchy moves to the location of that hand control with an IK node), then the control rig, when applied in an animation blueprint, adds those instructions on top of whatever pose is fed into it.

How the control rig interacts with the poses fed into it (whether from the animation blueprint or from directly specifying an animation) depends on the backwards solver.

The backwards solver in action, taking third person run and applying it to the control rig.

Control Rig And Interaction With Sequencer

Control Rig Forwards Solve vs Backwards Solve

Control Rig Component (for use with Sequencer)

Control Rig vs Re-targeting

Logic for Forts

Forts are set up as small dense challenge areas that force the player to fight a set of enemies in order to either unlock progression on the map (opening the ice door on this map) or to unlock a fixed reward (coins or a permanent extra heart when encountered on the open world).

They are setup as the object GeneralFortBoundingBox and include the following parameters:

  • Number of Enemies – spawns a fixed number of enemies at random locations inside the bounding box, right at the start of the fort encounter.
  • Spawn Portal Points – an array of point objects that specify where spawn portals should appear. If empty, no spawn portals will spawn as part of the fort’s challenge.
  • Fort Number – a parameter used to track which fort is activating, where relevant for keeping track of progression on the map.
  • Crossbow Rat Spawn Points – the array of points where crossbow rats should spawn.
  • Siege Bows Array – the array of the actual bows, already placed on the map. The player can operate these bows after killing the rat, so they serve as a kind of turret.
  • Is IceFort – a boolean value deciding whether progression logic should apply.
  • Fort Finish Point – a single point actor indicating where to spawn the sphere that shows the player has conquered the fort.
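A hypothetical sketch of those parameters as actor properties (the real GeneralFortBoundingBox blueprint may declare them differently):

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "GeneralFortBoundingBox.generated.h"

UCLASS()
class AGeneralFortBoundingBox : public AActor
{
	GENERATED_BODY()

public:
	// Fixed number of enemies spawned at random points inside the box.
	UPROPERTY(EditAnywhere, Category = "Fort")
	int32 NumberOfEnemies = 0;

	// Where spawn portals appear; empty means no portals for this fort.
	UPROPERTY(EditAnywhere, Category = "Fort")
	TArray<AActor*> SpawnPortalPoints;

	// Tracks which fort is activating, for map progression.
	UPROPERTY(EditAnywhere, Category = "Fort")
	int32 FortNumber = 0;

	// Spawn points for the crossbow rats.
	UPROPERTY(EditAnywhere, Category = "Fort")
	TArray<AActor*> CrossbowRatSpawnPoints;

	// The pre-placed siege bows the player can operate after killing the rat.
	UPROPERTY(EditAnywhere, Category = "Fort")
	TArray<AActor*> SiegeBows;

	// Whether map progression logic (e.g. the ice door) applies.
	UPROPERTY(EditAnywhere, Category = "Fort")
	bool bIsIceFort = false;

	// Where the "fort conquered" sphere spawns.
	UPROPERTY(EditAnywhere, Category = "Fort")
	AActor* FortFinishPoint = nullptr;
};
```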

When the player enters the area of a fort, fort-specific combat music starts, the fixed number of enemies spawn, the spawn portals are created, and the siege bow rats are spawned.

Any open world enemies still alive are deleted and the door of the fort closes behind the player.

If the player teleports out of the fort, all enemies are destroyed and the fort-specific combat music stops. Some logic should take place at this point to spawn enemies back out in the open world again.

If the player dies in the process of battling inside a fort, they should respawn at a point directly outside of the fort.

To win the fort, the player must close all the spawn portals and kill all the crossbow rats – as soon as they meet these criteria, all remaining enemies are destroyed and some victory music should play.

In the future, some forts will feature specific mini-bosses and additional parameters will be added to specify different sets of enemies.