Tutorial – Integrating a CC3 character into Advanced Locomotion System v4.

           This is a quick and dirty way to do the integration while keeping the main project clean:

           Go to Epic Marketplace and add “Advanced Locomotion System v4” to your library. Then create a project with it.

           After the initial setup, migrate or import your CC3 character into the project. The editor will likely generate a skeleton for the CC3 character.

1 –    Retargeting

           Prepare both the ALS and CC3 skeletons for retargeting: open the Retarget Manager on each and set both to the Humanoid rig.

           On the CC3 character: the root bone retarget option is “Animation”, the pelvis is “Animation Scaled”, and all IK bones are also set to “Animation”.

Figure 1 – Retargeting options

           Go to the ALS mannequin skeleton, right click it, and select “Retarget to another skeleton”.

Figure 2 – Retargeting the skeleton

           This will retarget ALL the animation assets that are referencing that skeleton. Things to note:

  • It keeps the file structure as it was, since it is NOT duplicating the assets before retargeting.
  • A major CON is that it won’t rename the assets, so that needs to be done afterwards.
  • A major PLUS is that it retains any references to the current animation assets.

2 – Adding Virtual Bones to the skeleton.

           Back in the CC3 skeleton we need to add the Virtual Bones that ALS uses for IK.
To add them, go to the skeleton asset, right click on a bone (this will be the SOURCE) and choose another bone (this is the TARGET). Afterwards you can rename the bone. NOTE: the VB prefix can’t be removed.

Figure 3 – Add virtual bones.

           The following table gives all the needed virtual bones together with the correct sources and targets.


           Also copy the head sockets that are placed on the ALS skeleton and paste them on the CC3 skeleton. They are attached to the “head” bone.

3 – Setting up the Ragdoll/Physical Asset

           The physical asset auto generated by the editor may not fit the character well, so a quick way to get a better one is to take advantage of the ALS one. We can always adjust or replace it at a later date. Rename the physical asset.

           Copy the ALS physical asset and move the copy to the CC3 folder. Open the CC3 character skeletal mesh and, under the Physics and Lighting entries, assign the copied physical asset.

           If you don’t want to copy the physical asset but use another you already have, you need to add one extra physics body (if it doesn’t already exist) for it to work with the ALS ragdoll system.

           On the physical asset go to the skeleton tree, click “Options”, and select “Show All Bones”. Right click on the root bone and select “Add shape”.

Figure 4 – Adding root bone body

           After that, make the created body kinematic and disable its collision: set “Physics Type” to “Kinematic” and “Collision Response” to “Disabled”.

Figure 5 – Body setup

4 – Animation Blueprint adjustments

         If you attempt to play you may notice that the face appears to be melted when the head does sweeping motions. The feet might also be floating a bit. To fix this we need to apply some adjustments in the animation blueprint. First, the feet.

Go to the “Foot IK” animation layer. At the bottom there’s a box named “Apply IK to Feet”. You need to adjust the “Effector Location”; both adjustments are on the X value. The left side is likely a negative offset while the right side should be positive. For this character in particular, +3 and -3 worked best.

Figure 6 – Feet IK adjustment

            Then it’s just a matter of “fixing” the jawbone. My take on the issue is that CC3 places weights on the facial bones (to enable animation on them), but since our animations do not use them, some animation curves might be applying influence on them. The jaw problem is due to the “cc_base_facialbone” lagging behind the rest of the head when it rotates. One fix could use a Control Rig, but there’s a less complex solution. The drawback? It disables one bit of the animation blueprint.

           Search for “AimOffsetBehaviors”, open the nested state machine, and disable the “Look Towards Input” state. You can duplicate the “Look Towards Camera” state and connect the entry node to this new state.

Figure 7 – Facial bone “fix”.

           Now it should just be a matter of renaming the animation assets to keep everything tidy. Renaming the assets in bulk while connected to P4V is likely a lengthy process, but it ensures that no reference gets broken.

           If you rename the assets in bulk without being connected to P4V, I recommend doing it in small batches. Close and reopen the project, check that nothing failed to load, and continue like that until finished.

           If anything fails to load: rename the culprit asset (or assets) back to the previous name and save. Close and reopen; the loading error should be cleared. Rename the assets again and test once more just to be sure, then continue.

5 – ALS and DCS merge

          The initial merge following the tutorials leaves the blueprint with a lot of bugs that have been squashed in the meantime.

          In my opinion the best course of action to apply a new character is: add the new skeletal mesh. The engine will likely assign a new skeleton to it, or even pick a mannequin skeleton and add the new CC3 bones to it.

          I would advise restoring the Mannequin skeleton if the editor alters it, just as a precaution.
Then pick the skeleton that the already existing characters use and assign it to the new skeletal mesh.

          Adjust the retargeting options (rule of thumb: root as “Animation”, pelvis as “Animation Scaled”, the rest as “Skeleton”, and IK bones, weapon bones and the like as “Animation” as well).

          The engine will then provide runtime retargeting of the current pose to any skeletal mesh that shares the skeleton.
Issues: if the skeletal meshes have different proportions, corrections will be required; IK will likely be necessary.

          NOTE: if both projects have folders with the same naming scheme, you may end up overwriting assets that share a name.

Figure 8 – Same file name on different projects.

          As per the image above, if I migrate the assets from the left project to the right-side project I will inevitably overwrite the existing files.

Tutorial – Creating Control Rigs for spine and hands orientation

           This tutorial will walk you through creating control rig assets that let you adjust your character’s spine using mouse input, working as an aim offset, and adjust your character’s hand positioning to create sweeping and stabbing motions with spears or staffs. It also gives you the tools to repurpose the approach for other types of weaponry and motions.

           First it shows how to create a Control Rig asset, then provides a step by step for the spine-influencing asset and, building on that, the hand-influencing Control Rig. This tutorial follows the usual UE4 skeleton naming scheme and hierarchy.


Prerequisites:

  • Control Rig plugin.
  • Full Body IK plugin.
  • Engine version 4.26 or above.

           After this tutorial you will have two control rigs that let you apply runtime adjustments to your character’s spine or hand positioning, as shown in the next two videos.

Video 1 – Sweep and stab motion.

Video 2 – Spine Aim Offset.

1 – Creating the Control Rig asset

           To create your control rig asset, right click in the content browser, go to “Animation”, and select “Control Rig” (see Figure 1). The editor will then prompt you to select the parent rig. There should only be a single option named “ControlRig”. Select it.

Figure 1 – Creating the Control Rig.

            Afterwards, opening your newly created asset you will see a window like in Figure 2. To start building your Control Rig you first need to import a skeleton hierarchy. To do that, click the green “Import Hierarchy” button in the middle of the left side of the window, then select the desired skeletal mesh.

           Note: The Control Rig imports neither the skeleton nor the skeletal mesh itself. It creates a copy of the skeleton hierarchy, taking into account the existing bones’ names and corresponding transforms. The control rig asset won’t reference the skeleton at all, as can be verified with the Reference Viewer.

Figure 2 – Blank Control Rig asset.

           After that you will have your skeleton hierarchy imported and the selected skeletal mesh will be on the viewport as in Figure 3.

Figure 3 – Imported skeleton hierarchy

2 – Adding controls and spaces to Control Rig

           To better understand the work ahead, here is a brief explanation of what Controls and Spaces are:

  • Controls: points in the rig’s global space that you can use to aid or directly adjust bones’ transforms.
  • Spaces: a secondary frame of reference for any child control (or space) they parent. Meaning: in a control rig the global-space origin (0,0,0) is located at the root bone. When you create a space, you create a new reference frame for any child of that space, so the child’s global transform is measured from the rig’s global origin whilst its local transform is measured from the parent space.
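The parent/child relationship above can be sketched outside the engine. This is a minimal Python illustration (not the Control Rig API), using translation-only transforms and hypothetical coordinates:

```python
# Minimal sketch (not the Control Rig API): how a child's global
# position follows from its parent space, translation-only.

def to_global(parent_global, local_offset):
    """Global position = parent space origin + local offset."""
    return tuple(p + l for p, l in zip(parent_global, local_offset))

# A space placed at the spine_02 bone (hypothetical coordinates):
spine_02_space = (0.0, 0.0, 100.0)

# A child control at local (0, 0, 0) coincides with the space itself:
print(to_global(spine_02_space, (0.0, 0.0, 0.0)))  # -> (0.0, 0.0, 100.0)

# Moving the space moves every child along with it:
moved_space = (10.0, 0.0, 100.0)
print(to_global(moved_space, (0.0, 0.0, 0.0)))     # -> (10.0, 0.0, 100.0)
```

Moving a space therefore moves every control parented to it, which is exactly what the spine and hand setups rely on.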

           To create either a space or a control, right click on a bone and select “New”, then “New Control” or “New Space”. It can also be done in a blank area at the bottom of the “Rig Hierarchy”, below the root bone chain. Let’s create a control that way.

           A red sphere should appear at the root bone position. Rename this new control to “root_ctrl”. The naming scheme follows the UE4 mannequin scheme; if your skeleton has a different naming convention just adjust the names, appending the following suffixes: “_ctrl” for controls and “_space” for spaces.

           The red sphere is a gizmo, just a visual representation of the control that was created. With it selected, in the Gizmo section on the right side you can adjust its transform, colour, and type. Change it to a hexagon (see Figure 4 for reference).

Note: spaces do not have gizmos.        

Figure 4 – Gizmo setup.

           Following that, click on “root_ctrl” and create two more controls named “foot_r_ctrl” and “foot_l_ctrl”. Change the gizmos to boxes and colour the left one blue (left-side controls will be coloured blue while right-side ones are red). Similarly create a space named “spine_02_space” and, parented to this space, a control named “spine_02_ctrl”. Change the “spine_02_ctrl” gizmo to a yellow circle.

           All gizmos should be appearing on top of each other. To reposition them, right click in the “Rig Graph”, type “Setup Event”, and select it. This is the event that runs before the other events; think of it as a construction script.

           Then, place a “Children” node in the graph (right click and search the name). This node recursively obtains the entire chain of items of the type you want (including the parent). If set to search for controls and there’s a space in the chain, it will skip it. So in our case it will retrieve the root and feet controls as a single collection. A collection is just a container for bones, controls, and spaces.

           We want the controls’ initial positions to be the respective bones’ initial positions. For that we can iterate through the created collection with a “For Each Item” loop node. Expanding the “Item” pin you can see that there is a “Name” pin. From it create a “Chop” node; this node removes a substring of the specified length from the end of a name. Given our naming convention, we can obtain the bones’ names by chopping the item name by a length of 5 (removing the “_ctrl” suffix). From the remainder create a “Get Transform” node, set it to bone type, and retrieve the initial transform in global space. From this transform set up a “Set Transform” node with “Initial” and “Propagate to Children” set to true. Also link the “Item” pin from the loop node to this node. This finishes positioning the controls in the collection. It should look like Figure 5.

Figure 5 – Collection loop setup.
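The naming trick inside the loop can be sketched in plain Python (not engine code): chopping 5 characters maps each control name back to its bone name.

```python
# Sketch of the naming convention: removing the "_ctrl" suffix
# (length 5) from a control name yields the bone name.

def chop(name, length):
    """Remove `length` characters from the end, like the Chop node."""
    return name[:-length] if length > 0 else name

controls = ["root_ctrl", "foot_r_ctrl", "foot_l_ctrl"]
bones = [chop(c, 5) for c in controls]
print(bones)  # -> ['root', 'foot_r', 'foot_l']
```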

           Then we just need to set up “spine_02_space” and its child control. Since we want both the space and control initial locations to coincide with the “spine_02” bone, we only need to set up the space: the child control sits at (0,0,0) in local space, which is where “spine_02_space” will be. So we just need to retrieve the “spine_02” bone’s initial location. Place a “Get Transform” node, set it to retrieve the initial bone transform, expand the pin, and take just the “Translation” pin. From it place a “Transform from SRT” node, which creates a transform from the fed-in values: the initial location goes in, and the rotation becomes (0, 0, 0). Since we will later adjust the rotation at runtime, it is best to have it start at 0. From this transform place a “Set Transform” node and set it to set the initial transform of “spine_02_space”, as per Figure 6.

Figure 6 – Space setup.

            Having finished the setup event you can check that all the controls should be placed on the correct bones as in Figure 7.

Figure 7 – Finished control setup look.

3 – Setting up the Spine influence.

            Now that the rig is properly done, let us move on to the spine setup. For it to work you’ll need to place a “Forwards Solve” node. This node is meant to be read as: you set up the controls’ positions and then move the bones along, adjusting if necessary.

            One important note: the sequence in which you apply or adjust the controls’ positioning matters. For example, in a parent/child relationship between a space and a control: if you first set the control transform and then change the parent space transform, you will inevitably alter the control’s position. Be aware of that.

            Since there are quite a few steps, first place a “Sequence” node. Its first two execution pins will just mimic the “Setup Event”, except we won’t be setting initial transforms. On the first execution pin paste the setup for the space transform, same as before barring setting “Initial” to false. On the second pin paste the loop we made before. But now we also want “spine_02_ctrl” to follow the “spine_02” bone. For that, in “Rig Hierarchy” click on the control and drag it into “Rig Graph”; when prompted select the last option, “Create Collection”. Now you will have two collections, one with the “root_ctrl” chain and one with just “spine_02_ctrl”. From one of them drag the “Collection” pin and select “Union”, which merges two collections into one. Then drag the resulting pin to the “For Each Item” loop node. Remember to toggle all “Initial” options to false.

            These two parts are just there to ensure that your rig controls follow the bones when they get animated.

            Now we can inject the logic behind the spine influence. The idea behind it is simple: we will adjust the “spine_02_space” transform, and “spine_02_ctrl” will follow it by being its child.

            From the “Sequence” node’s third execution pin place a “Set Transform” node, set it to be applied on “spine_02_space”, set “Space” to “Local Space” and “Initial” to false. Now you will want to pass along a “Rotation” value; this value will come from mouse input. To the right of “Rig Hierarchy” you will notice a “My Blueprint” tab, where you can create variables as in every blueprint. Create two float variables meant to store the yaw and pitch input values, and when creating them click the eye icon next to them; this lets you use the values as inputs when setting up the Control Rig in an animation blueprint.
Now drag those variables into the “Rig Graph”.

            Dragging the pins from those variables, create “Remap” nodes. These are just so you can map the input values to the values you might want to rotate the space by. You will have to adjust these values by trial and error. From the “Remap” nodes create one “From Rotator” node, place the values on the Z and X inputs, and feed the resulting rotator into the “Set Transform” node. The location is taken from the “Get Transform” of the “spine_02” bone. You will end up with something similar to Figure 8 (the “Multiply” node shown there is due to how the Yaw value is calculated in another blueprint).

Figure 8 – “spine_02_space” adjustment logic.
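The Remap step can be sketched as a plain function (not engine code); the input and output ranges here are hypothetical and would be tuned by trial and error as described above.

```python
# Sketch of a Remap node: linearly map a value from one range to
# another, clamped to the input range.

def remap(value, in_min, in_max, out_min, out_max):
    """Map `value` from [in_min, in_max] onto [out_min, out_max]."""
    value = max(in_min, min(in_max, value))
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

# Map a -90..90 degree mouse pitch onto a gentler -30..30 degree
# spine lean (hypothetical ranges):
print(remap(45.0, -90.0, 90.0, -30.0, 30.0))  # -> 15.0
```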

            Lastly, you just need a “Full Body IK” node. This node takes the controls’ transforms as constraints and solves the skeleton positioning, adjusting the pose as needed. For a more thorough explanation it might be best to search for “Inverse Kinematics”.

            On the “Full Body IK” node there will be a “Root” pin; set it to the root bone of the required chain, in this case the “pelvis” bone. There will also be an “Effectors” pin with an add symbol beside it; click it to add effector entries. Create three of them and set them to the “spine_02”, “foot_r” and “foot_l” bones. Now we need all the needed controls’ transforms. In the “Rig Hierarchy”, while holding the control key, click on “spine_02_ctrl” and both feet controls, drag them into the “Rig Graph” and select “Get Transforms”. Expand the transform pins and drag the “Translation” and “Rotation” pins to the correct entries on the IK node. When you drag a pin the valid entries light up. Match the control transforms to the correct bones. The result should look like Figure 9.

            If the “spine_02_ctrl” gizmo is rotated, probably upwards, adjust its rotation on the corresponding gizmo menu, something like Y = 90 should suffice.

Figure 9 – Full Body IK node.

4 – Setting up the Hands influence

           The setup for this Control Rig is quite similar to the previous one. The only differences are that we will create a space to control the hands that must always sit between both hands, create controls for both hands, and there won’t be a need for spine controls.

            So, create a “root_ctrl”, “foot_r_ctrl” and “foot_l_ctrl” as before, with the same gizmo setup. Then create a space called “hand_center_space” and two controls named “hand_r_ctrl” and “hand_l_ctrl”. For these controls’ gizmos use “Circle_Thick” and remember to colour the left one blue.

            The “Setup Event” will also be similar: the same loop logic for the root control chain, just merged with a collection of both hands’ controls.
Now, “hand_center_space” has the requirement of being at the midpoint of both hands. For that, get the transform nodes of both hand bones and, from their “Translation” pins, add an “Interpolate” node with a “T” value of 0.5.
This ensures the resulting vector is at the midpoint between both hands. Feed this vector into a “Transform from SRT” node as in the previous rig. The result should look similar to Figure 10.

Figure 10 – Hand control rig setup event.
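The Interpolate node with T = 0.5 is just a linear interpolation evaluated halfway, which lands on the midpoint of the two hand positions. A quick sketch (not engine code, hypothetical positions):

```python
# Sketch of the Interpolate node: component-wise linear
# interpolation between two position vectors.

def lerp(a, b, t):
    """Blend from a (t=0) to b (t=1)."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

hand_l = (20.0, 10.0, 140.0)   # hypothetical hand positions
hand_r = (-20.0, 14.0, 138.0)

# T = 0.5 gives the point exactly between the hands:
print(lerp(hand_l, hand_r, 0.5))  # -> (0.0, 12.0, 139.0)
```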

           Now, the “Forwards Solve” is the same as before. On the first two execution pins of a “Sequence” node place both parts of the “Setup Event” code, now with the “Initial” boolean set to false.
On the third execution pin we will set up our hand influence. Just as before, we will adjust the space transform, which in turn adjusts the child controls. In this setup we will create a sideways sweep and a stabbing motion. The sweep motion is just a matter of adjusting the rotation value of the space by the Yaw value we pass to the Control Rig, while the stabbing motion takes the Pitch value as input.

          As before, create two float variables for these values. Then from the Yaw value place a multiply node before the “Remap” node. This multiply node will aid in adjusting the speed of the motion. Pass the remapped value to a “From Rotator” node and the result to the “Rotation” pin of the “Set Transform” node.       

          For the “Translation” value we still take the midpoint between both hands, but then add a value to the Y position; in our setup this offsets the position forwards and backwards. So, from the same interpolate setup mentioned before, split the “Translation” pin from the “Transform from SRT” node and pass the X and Z values directly to the “Set Transform” node. The Y value goes to an “Add” node.

          The second “Add” pin is fed with our Pitch value. Pass the Pitch variable through a “Multiply” node (again working as a speed multiplier, so adjust the value to your taste), then through a “Remap” node, from there to the “Add” node, and the result to the Y input of the “Translation” on the “Set Transform” node. The result should be similar to Figure 11.

Figure 11 – Hand Space adjustment setup.
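The translation chain above can be sketched in plain Python (not engine code). X and Z come straight from the hands' midpoint, while Y gets the speed-scaled, remapped pitch added on top; the ranges and the speed factor are hypothetical tuning values.

```python
# Sketch of the Multiply -> Remap -> Add chain feeding the
# "Set Transform" node's Translation input.

def remap(value, in_min, in_max, out_min, out_max):
    """Linearly map `value` from one range to another, clamped."""
    value = max(in_min, min(in_max, value))
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

def hand_space_translation(midpoint, pitch, speed=2.0):
    """X and Z pass through; Y is offset by the remapped pitch."""
    offset_y = remap(pitch * speed, -90.0, 90.0, -40.0, 40.0)
    x, y, z = midpoint
    return (x, y + offset_y, z)

print(hand_space_translation((0.0, 12.0, 139.0), pitch=22.5))
# -> (0.0, 32.0, 139.0)
```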

            Now it’s just a matter of adding the “Full Body IK” node, still with the pelvis as the root bone, but now the effectors are both the feet and hand bones. Get the corresponding control transforms and set them up accordingly. It should look as in Figure 12.

Figure 12 – Hand Control Rig Full Body IK setup.

5 – Placing your Control Rig on an animation blueprint.

            To use your Control Rig on an animation blueprint just place a “Control Rig” node on the blueprint, then on “Control Rig Class” choose your newly created rig and to enable your variables just toggle the “Use pin” checkboxes. See Figure 13.

            For mouse input you can pass the usual variables that are also used to drive standard aim offsets or blendspaces.      

Figure 13 – Placing a Control Rig on an animation blueprint.

6 – Final considerations

           This concludes the tutorial. It should provide some basic notions of how to create your own control rigs for whatever end results or mechanics you desire.

           One thing not previously explained is why you also need to set up feet controls. The reason is that otherwise, when applying the IK node, you would be adjusting the feet positioning. By using controls that follow the feet as constraints, you ensure the final pose still follows the feet animation, avoiding extreme rotations on the rest of the body that could be hard to blend or layer out in the animation blueprint.

          The other point is that, to let your character drive the Control Rig as in the sample videos shown at the beginning, you have to do two things: on the spring arm component set “Use Pawn Control Rotation” to false, and on your movement component set the “Rotation Rate” to (0, 0, 0). It likely only has a value on the Z axis, so just set that one to 0.

            For further references see the following links to Epic Games’ documentation and video on the subject:

Control Rig Epic’s documentation.

Fixing Hair For Metahumans in VR



You can enable cards by using r.hairstrands.strands 0
That way you get the head hair. Using the sample MetaHuman files seems to allow the eyebrows to be re-enabled, but I’m stuck trying to get the eyelashes to work in a passable way…

Adjusted M_EyeLash_HigherLODs_Inst to fix eyelash opacity for eyelash cards made on construction script.

better eyelashes

Tutorial – Retargeting UE4 mannequin animations onto a non-standard bipedal skeleton.

Often you may find yourself with a bipedal character, such as a monster, without enough animation assets for it. On top of that, its skeleton does not coincide with the default UE4 mannequin’s, so you can’t just share the animation assets between them. There is a solution though: you can attempt to retarget the animations you have for the default mannequin onto your character.

This is a step-by-step walkthrough showing how to retarget animations rigged for the UE4 default mannequin onto a different bipedal skeleton.

1 – Preparing the skeleton assets for retargeting

First thing you need to do is to check if both your skeleton assets are ready for retargeting.
For that you need to:

  • Open both skeleton assets.
  • Go to retarget manager.
  • Set up their rigs: choose humanoid.
  • On the non-default skeleton you may need to set up the root-target correspondence: choose the appropriate bone (if one exists) for each of the targets provided.
  • Important: make sure that both of them have matching poses be it A-Pose or T-Pose. You can change them at “Manage Retarget Base Pose” below “Set up Rig”.

The rig that was just set up bridges the source skeleton bones and the target skeleton ones.

Figure 1 shows how the end result should look, with both skeletons doing an A-Pose.

Figure 1 – Setting up the rig.

2 – Retargeting the animation asset

After this you may go to an animation asset of your choice; it can be anything: aim offset, blend space, blueprint, montage, or sequence. Right click it, select “Retarget Anim Assets” and then “Duplicate Anim Assets and Retarget”. You’ll be presented with a window similar to the one in Figure 2.

In this window you can select your target skeleton; after selecting it you may notice that both meshes should be doing the same pose. There’s a red warning to remind you of it. You can also add extra tasks such as adding prefixes or suffixes, replacing parts of the asset name, and even selecting the folder the new assets will be saved in.

Figure 2 – Animation retarget window, note that both meshes are doing the same pose.

After the process is done you can open the new asset that was just created and check the result. It is likely to look something like the video below, where the mesh seems horribly distorted. What gives?

In the above video we see the non-retargeted skeleton animation (in yellow) superimposed on the retargeted animation (in white).

3 – Retargeting options

There is another step that was omitted previously, to show what can go wrong during retargeting and how to try and fix it: setting up the retargeting options. If the rig is meant to bridge one skeleton’s bones to the other’s, the retargeting options are meant to tell the engine how to deal with the animation information that was just retargeted.

On the skeleton asset click “Options” and then “Show Retargeting Options”, see Figure 3.

There are multiple options available but following Epic Games’ guidelines as a rule of thumb we do the following setup:

  • The root bone (the first bone in the chain), weapon and cloth bones (if available), IK bones and other markers are meant to use the “Animation” option.
  • The pelvis bone (or another that has the same purpose) will be set as “Animation Scaled”.
  • The remaining bones use the “Skeleton” option.

For a fast setup do it like this:

  • Right click the root bone and select “Recursively Set Translation Retargeting Skeleton”.
  • Then find the pelvis bone or equivalent and set it to “Animation Scaled”.
  • Any other bones such as the root bone, weapon or cloth bones, IK bones, and markers are set to “Animation”.

Figure 3 – Retargeting options

Now having all set up you may check again the animation asset that was retargeted and see how those changes affected the animation.

As you can compare with the previous example, on this video the mesh is much less distorted and there is a striking difference between the non-retargeted animation with the retargeted one when they are superimposed.

4 – Remaining issues

There are still two details that on this example require fixing:

  • The mesh is placed below ground level: this might be because the root bone on this specific skeleton has an offset, sitting at the same level as the pelvis. When retargeted, the target root bone gets placed at the source root bone position, which is at ground level.
  • The mouth and tongue still present a bit of an issue: this might be due to a lack of information on how to translate those bones in the source animation.

Both of these issues can be tackled on the animation blueprint.

For the root bone height issue, one possible fix is as follows: whenever you want to play the retargeted animation, apply an offset to the root bone (see Figure 4):

  • Add a “Transform (modify) Bone” node to the animation blueprint.
  • On the translation pin apply a vector with an offset on the Z-axis. To pin down a value you may need some trial and error; in this example 100 unreal units worked well.
  • In this case we switch the offset depending on whether the retargeted animation is playing.

Figure 4 – Root bone height offset fix.

To solve the tongue and mouth issue: in this example there were already some walking animation assets made for this specific mesh, in which the character walks with its mouth closed.

In the animation blueprint we saved this animation as a cached pose (from the animation asset pin type “New Saved Cached Pose” and select it; you can then rename the new node). Having this saved pose, it was then just a matter of blending it onto whatever other pose the character might be playing (see Figure 5).

Figure 5 – Mouth and tongue bones fix.

As you can check in the above picture, there is a “Layered blend per bone” node with two cached poses fed in. The one below, on the second pin, is layered on top of the one above, on the first pin, which is the saved cached pose with the mouth shut. On the details panel you can check the values that were set. In the branch filter, the “root” bone of the tongue bone chain is set with a blend depth of -1. What does that mean? The animation data being layered on top gets discarded for that bone chain. So that chain, the one that moves the mouth and tongue, is driven only by the pose with the mouth shut.
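The branch filter behaviour can be sketched conceptually in Python (this is not the engine's implementation; bone names and pose labels are hypothetical). A blend depth of -1 on a chain root means the layered pose is discarded for that whole chain:

```python
# Conceptual sketch of a "Layered blend per bone" branch filter
# with blend depth -1: bones in the filtered chain keep the base
# pose and ignore the layered pose.

def layered_blend(base_pose, layer_pose, filtered_chains):
    """Per-bone pick: the layer wins unless the bone belongs to a
    chain whose branch-filter blend depth is -1."""
    result = {}
    for bone, base_value in base_pose.items():
        blocked = any(bone == root or bone.startswith(root + "_")
                      for root, depth in filtered_chains if depth == -1)
        result[bone] = base_value if blocked else layer_pose[bone]
    return result

base = {"head": "mouth_shut", "tongue": "mouth_shut", "tongue_01": "mouth_shut"}
layer = {"head": "retargeted", "tongue": "retargeted", "tongue_01": "retargeted"}

# The tongue chain keeps the mouth-shut pose; everything else takes the layer:
print(layered_blend(base, layer, [("tongue", -1)]))
# -> {'head': 'retargeted', 'tongue': 'mouth_shut', 'tongue_01': 'mouth_shut'}
```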

5 – Conclusion and documentation

This concludes the tutorial. Albeit not covering all the issues that may arise, it gives you a basis to tackle them and try different solutions as they appear.

            For further references see the following links to Epic Games’ documentation on the subject:



Creating challenges for the pallet stacking project

The challenges are set up so that the player is given a 10 minute time frame to complete as many challenges as they can. During this period they can select easy, medium or hard challenges, which score different amounts accordingly. They start a challenge by pulling the lever after pressing the difficulty button of the challenge they would like to do. They will then get a random challenge of that difficulty.

How to start a challenge.

During the challenge they then have to stack as many boxes on the pallet as they can and then pull the lever to deliver the pallet. They will incur penalties for any unstacked boxes or any boxes that have been crushed. Crushed boxes also incur a time penalty.

Challenge Design

The challenges are split into three difficulties. Easy challenges use only the basic block types (heavy, standard and light), plus the size variations of those blocks, and make minimal use of the wave system, delivering only a few new blocks at most. Medium challenges introduce the fragile heavy blocks and make much more use of the wave system. Finally, hard challenges introduce blocks that start on the pallet. Each step up in difficulty is designed to make the player think more about block placement by constraining the number of possible solutions. It is hard to quantify the exact number of solutions for any given challenge, as there are a large number of configurations of blocks on the pallet, but by introducing these constraints we remove options and shrink the available solution space.

Each challenge is intended to take approximately 2-3 minutes to complete, allowing a player to complete 3-5 challenges in the allotted 10 minutes. The player chooses the difficulty of each challenge as they play, so they can push for maximum points or take on easier challenges to learn and improve.

Challenge Technical Setup

When setting up a challenge you select which boxes to spawn from the drop-down (any of the size and weight variations that have a blueprint created for them), and you can set them up to spawn in waves. A new wave is spawned when the number of boxes stacked on the pallet reaches the number specified for that wave.
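The wave trigger can be sketched like this; the function and variable names are illustrative stand-ins, since the project implements this in Blueprint:

```python
# Sketch: each wave stores the stacked-box count that releases it.
def waves_to_spawn(wave_thresholds, boxes_stacked, waves_spawned):
    """Return how many new waves should spawn given the current stack count.

    wave_thresholds: stacked-box counts at which each wave becomes due.
    waves_spawned:   how many waves have already been released.
    """
    due = sum(1 for t in wave_thresholds if boxes_stacked >= t)
    return max(0, due - waves_spawned)

# e.g. waves configured to release at 3 and 6 stacked boxes:
assert waves_to_spawn([3, 6], 2, 0) == 0   # nothing due yet
assert waves_to_spawn([3, 6], 3, 0) == 1   # first wave is due
assert waves_to_spawn([3, 6], 6, 1) == 1   # second wave is due
```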

You can also choose to have boxes spawn already on the pallet by adding them to the array ‘ObjectsStaringOnPallet’ and setting a starting offset for the position on the pallet where you would like them to spawn; these boxes can’t be moved by the player. The final thing that can be specified is the size of the pallet, by entering 2×2 or 3×3 in the PalletType field. To add different pallets, a new pallet blueprint child would need to be created.

The challenges are all set up in a data table, so it’s straightforward to add and adjust challenges.

Example challenge setup in the data table.

The code for starting and stopping the challenges as well as tracking the score etc can be found in the PalletStackerGameMode.

When a challenge is started I un-pause the timer, get a random challenge of the selected difficulty from the data table, then create the pallet and spawn in the boxes defined by the challenge. When stopping a challenge I do the opposite: clear the pallet and cash in the boxes, updating the player score. The code here is very straightforward and just steps through the process as you would expect.

Starting and stopping the challenges.

When spawning the boxes I retrieve the soft class references from the data table and then compare them to a list of box types stored in the game mode to spawn the appropriate box from that class.

Select class to spawn box from.
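The lookup described above can be sketched like this; the paths and registry below are hypothetical stand-ins for the game mode's list of box classes (the real version works with Unreal soft class references in Blueprint):

```python
# Sketch: map a soft class reference from the data table row to a spawner.
BOX_CLASSES = {  # soft class path -> spawn callable (stand-in for a UClass)
    "/Game/Boxes/BP_HeavyBox": lambda: "HeavyBox",
    "/Game/Boxes/BP_LightBox": lambda: "LightBox",
}

def spawn_box(soft_class_path):
    """Spawn the box type whose registered reference matches the row's."""
    spawner = BOX_CLASSES.get(soft_class_path)
    if spawner is None:
        raise KeyError(f"No box type registered for {soft_class_path}")
    return spawner()

assert spawn_box("/Game/Boxes/BP_HeavyBox") == "HeavyBox"
```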

Setting Up AWS Account For Unreal Servers Using Blueprint Plugin


As a start to this process, I’ve been following tutorials and understanding some key principles.

  1. The UE4 client has access to Amazon IAM credentials for logging in users via Cognito. On the Amazon side, these IAM credentials must be restricted to Cognito logins only, so that someone who gains access to them cannot use them for anything else.
  2. For higher security of data, the Unreal server must be the only party sending updates; these should be sent through GraphQL/AppSync and filtered to whitelisted IP addresses only.


The video I am following for this setup is this video from Lion of the AWS plugin team.

First Step: Setting Up An AWS IAM User Configured to Access Cognito Only.

The first step is setting up a new Cognito-only AWS IAM user. This allows that user to log in but not access other AWS services.

On the policy for this user, we are creating a new policy that restricts access to only certain functions within the CognitoUserPools Service.

This user is restricted to:

SignUp, ConfirmSignUp, InitiateAuth, ForgotPassword, ConfirmForgotPassword, ChangePassword, ListUsers, and GetUser

Before finalizing this user, we must create a user pool.

User Pool Creation

The video walks through some standard steps for creating a user pool, covering specifications for the user password and email authorization.

The last step gets into some specifics of which app the user pool connects to and options for auth flows configuration.

In the video, it is advised that you can generate a client secret here, but don’t forget to compute the secret hash in the AWS Cognito IDP blueprint nodes.

That likely refers to this node here on the cognito only example, we will have to check back on this later.

Adding User Pool Specifics to AWS IAM User Configuration For Cognito Only

The next step involves finishing the IAM user policy to restrict access to only the specific user pool we created. The video shows clicking “Add ARN to Restrict Access”, then copying in the region and the specific user pool ID from the user pool you have created.

Configuring Federated Identity Pool

There are some steps to follow to create a federated identity pool. This takes in the User Pool ID and App Client ID.

Adding Cognito Identity Service To Cognito Only AWS IAM User

Add to the policy for the IAM cognito only user access to the Cognito Identity service.

Add access only to GetId and GetCredentialsForIdentity.

Now we have enough to finish creating the policy, review the access, and name it cognito_user.
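For orientation, a policy with this shape would match the restrictions described above. This is only a sketch: the REGION, ACCOUNT_ID and USER_POOL_ID placeholders must be filled with your own values, and you may want to scope the Cognito Identity statement more tightly than the wildcard shown here.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "cognito-idp:SignUp",
        "cognito-idp:ConfirmSignUp",
        "cognito-idp:InitiateAuth",
        "cognito-idp:ForgotPassword",
        "cognito-idp:ConfirmForgotPassword",
        "cognito-idp:ChangePassword",
        "cognito-idp:ListUsers",
        "cognito-idp:GetUser"
      ],
      "Resource": "arn:aws:cognito-idp:REGION:ACCOUNT_ID:userpool/USER_POOL_ID"
    },
    {
      "Effect": "Allow",
      "Action": [
        "cognito-identity:GetId",
        "cognito-identity:GetCredentialsForIdentity"
      ],
      "Resource": "*"
    }
  ]
}
```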

Create an IAM user with this policy

The next step is creating a new IAM user and attaching only this policy. These credentials can now be used inside the UE4 build.

Creating an Admin IAM User.

The video shows steps to create an IAM admin user for use with the AWS CLI.

The video then shows configuring the AWS CLI to use these credentials, but this part is not completely clear.

Next it looks like it is referencing a role on AWS IAM of Cognito_awstutorialAuthRole. I do not remember seeing this setup earlier in the video.

Reviewing what was set up previously, I believe this is tied to the role Cognito_IDPoolFranckStagingAuth_Role, which must have been set up earlier when creating the user, the policy, and linking the user to the user pool.

I can see when looking at the trust relationships tab it fits with what I had configured.

Adding Additional Access Permissions to the Authenticated Role

This next step shows adding additional permissions to the authenticated player.

First there is Gamelift & Allow All Gamelift Actions

Then there is Lambda and all Lambda Actions/Resources (this seems potentially dangerous; we will likely have to revisit it).

Then the video briefly showed AppSync but did not add it here.

Setting Up Matchmaking Logic

The tutorial video shows assigning a matchmaking rule set as part of the gamelift setup. We currently do not have that configured.

I also found that the eu-west-2 region does not support matchmaking, so similar to the example in this video, we will need to assign a different region for matchmaking than our fleet’s region.

On the description of this video, it mentions:

prerequisites: I have setup the whole matchmaking system in my previous videos, if you want to know how to set up matchmaking systems, please watch my previous videos.

So it seems like I will need to learn more about matchmaking before proceeding further on this tutorial.

This video seems to cover matchmaking from the plugin perspective.

And this starts to cover matchmaking from the official gamelift side.


Shifting to a new Tutorial to do the original tutorial!

Before diving into another 50 minute long silent tutorial, I felt like it might be helpful to learn from some official aws documentation first.


The primer first describes matchmaking as a pre-requisite, then gets into game session placement.

After a match is found, the game session needs to be started somewhere. Finding the best location available for the game session is called game session placement. The game hosting service should be able to examine the infrastructure resources and select a location that has the lowest average latency for all the players. Game session placement is another game hosting requirement.

The four layers of Gamelift:

Matchmaking pairs players that want to play a game with a game session where it can be played.

Game Session Placement figures out where to host game sessions.

Session Management starts games and helps players join them.

And Infrastructure management provides elastic control over game servers.

Notes from team conversation :

Many different types of authenticators/authentications can be applied to each specific Lambda function.

Can have several entry points that can trigger the same function.

Fetching user profile can be available for web front end or available for a different policy

Settings are very flexible

yml function can regenerate entire aws setup

Different authorizers can be set up per lambda function

get token from launcher, attach token to every api request

Requirements For Setting Up Gamelift Local And Testing A Server Build

  1. Download Gamelift Managed Servers SDK


2. Download Java Development Kit 8 (scroll down to get it)


or use this direct link


3. Check to see if Java is installed on your path by opening a command prompt (type cmd after hitting the Windows button) and typing java -version.

4. Unzip the GameLift SDK you downloaded in step 1 and copy the unzipped folder to a different drive than where you have Unreal installed (so if you installed Unreal on C, copy the GameLift SDK to D).

5. Navigate to that folder in windows, select the navigation path, then type cmd to replace the directory and press enter. This will open a command prompt direct to that directory.

6. Type java -jar GameLiftLocal.jar in the command prompt. If it works you should see text like this.

7. Next step is to add the plugins for the project to your engine folder.

Tutorial – Using the Live Link Face app for iPhone 10+ to record facial animation assets for MetaHumans in Unreal Engine

This tutorial will walk you through the steps of bringing your MetaHuman to life with facial mocap straight from your iPhone. You will be guided through the process of setting up a new project ready for animation, importing your MetaHuman and connecting it to Live Link, before finally recording your animation and saving it as a separate asset that you can reuse on any other MetaHuman later.

The Live Link Face app only works for the iPhone X or above, so you will need access to one of these before beginning this tutorial. The screenshots in this tutorial are from UE4.26.2 but the same steps will also work in UE5, although the UI will look different. The main difference is that in UE5, Quixel Bridge is incorporated directly into the engine, whereas if you are on Unreal Engine 4, you will need to download Quixel Bridge from here: www.quixel.com/bridge

1 – Starting a new project

  • Open Unreal Engine and start a new blank project

2- Downloading and importing the MetaHuman into Unreal Engine

  • Open Quixel Bridge and sign in with your Epic Games account
  • Navigate to the MetaHumans panel and download one of the premade MetaHumans or create your own custom MetaHuman at www.metahuman.unrealengine.com. For this example, we will use Danielle.
  • Once the download is complete, press “export” while your unreal engine project is open in another window. Your MetaHuman will be imported into your UE4 Project.

3 – Enabling necessary plugins

  • After your MetaHuman is imported, you will see several warning messages asking you to enable missing plugins and project settings. Press “Enable Missing…” for all of these
Enable Missing for all Plugins and Project Settings Warnings
  • You now need to make sure the Live Link, Apple ARKit and Apple ARKit Face Support Plugins are enabled. To do this, go to “settings > plugins” and then search for each of these in the search bar. Press ‘enable’ on each. You will need to restart Unreal Engine for this to take effect
Enable Live Link, Apple ARKit and Apple ARKit Face Support Plugins

4 – Placing your MetaHuman into the scene

  • With the necessary plugins enabled, you may now drag your MetaHuman into your scene. You can find the MetaHuman in your content browser at “Content > MetaHumans > Danielle > BP_Danielle “. Drag the BP_Danielle file into your viewport.
Drag the MetaHuman Blueprint into your scene

5 – Connect your MetaHuman to the Live Link Face App on your iPhone X (or above)

  • First Download the Live Link Face App off the App store
  • Open the app and go to the settings icon in the top left corner
  • In Settings go to Live Link and then “Add Target”
  • Here you will need to input the IP address of your computer. You can find this on your computer by pressing the windows key and typing “cmd” to open the Command Prompt. Here type in “ipconfig”.
  • Look for the line reading “IPv4 Address…”. The number at the end of this is what you need to enter into the Live Link Face App on your phone.
  • Now back in Unreal, go to the viewport and select your MetaHuman. In the Details Panel, under “Default” there is an option called “LLink Face Subj” with a dropdown menu. Click this and choose your iPhone from the list.
  • You are connected! If you want to additionally stream head rotation, enable the option below “LLink Face Subj” called “LLink Face Head”, and enable “Stream Head Rotation” in the settings of the Live Link Face app on your phone.
  • To test the connection hit the play button at the top and move the camera so you can see your MetaHuman’s face. If you’ve followed the above steps correctly, the MetaHuman should be copying your facial movements.

6 – Recording a performance with the take recorder

  • First open the take recorder by going to “Window > Cinematics > Take Recorder”
  • You need to add your iPhone as a source. Click the green “+ source” button and then “From LiveLink > iPhone”
  • Select the iPhone source to bring up more details and uncheck “use source timecode”
  • If you also want to record sound (for example, if you are recording a speaking part), add “Microphone Audio” as another source.
  • Before recording your take, click the play button at the top of the screen (to keep the camera from jumping to the centre of the world, change the spawn location from “Default Player Start” to “Current Camera Location”). This lets you see your MetaHuman moving during your recording.
  • To record your take, press the circular red record button. There will be a three second count down before your take goes live. Press the square stop button when you are done.
  • Your take will be saved as a level sequence and can be found through the content browser at “Content > Cinematics > Takes > ‘current-date’ “

7 – Saving the take as an animation for later use in sequencer

  • To open your take in Sequencer, navigate to “Content > Cinematics > Takes > ‘current-date’ “ and find your take as a level sequence. Open it and then double click on the track “iPhone_Scene_…”. If you scroll through the keyframes and cannot see the animation play, this is likely because you have exited ‘Play’ mode.
  • You won’t be able to edit anything on this yet, as by default, takes are saved as locked. To edit it, we will need to click the lock icon in the top right corner of the sequence.
  • We are trying to save the take as an animation asset that we can use later. To do this, we need to add BP_Danielle to our timeline and bake the animation sequence. Drag BP_Danielle from the world outliner to the space in Sequencer underneath our iPhone Data.
  • We don’t need the Body, MetaHuman_ControlRig or Face_ControlBoard_CtrlRig tracks so delete those, leaving only the face track.
  • Next, right-click on the face track and click “bake animation sequence”. Name and choose where to save your animation and press “ok” and then “export to animation sequence”
  • You can then navigate to wherever you saved your animation asset (“Content > MetaHumans > Common > Face”) and open it to see the animation on its own.

8 – Re-using your animation asset in sequencer

  • Now that we have our MetaHuman facial animation asset, we can apply it to any MetaHuman in a separate sequence, with other animations (for the body, for example) also applied separately.
  • Let’s create a new level sequence and add a camera and our MetaHuman. Do this by clicking “Cinematics > New Level Sequence”. Press the camera icon to create a new camera and then drag in BP_Danielle (or whichever MetaHuman you’d like to animate) as before.
  • Click the tiny camera icon next to “Cine Camera Actor” to lock your viewport to the camera. You can now move around and position your camera as you’d like. You can adjust the camera settings in the details panel
  • Delete the “Face_ControlBoard_CtrlRig” track and instead press the little plus sign next to “Face”. From here you should select your previously saved animation from the Animation dropdown menu.
  • Now your animation has been added to your MetaHuman in Sequence. Combine these and other animations / sequenced events to make short films and videos!

Building a VR pallet stacking system with procedural meshes in UE4

The goal here is to make a pallet stacking system that gives users a feeling of weight and makes them think about how boxes should be stacked.

The snap system

The snap system (Modular Snap System Runtime) is straightforward to set up. First go to the static mesh you would like the snap system to work for and add some sockets to the mesh. Optionally, if you only want snapping between specific objects, you can name the sockets so they match what you want to snap to. For example, you could have two sockets, ‘box_01’ and ‘cylinder_01’, then specify in the snap settings to pay attention to socket names, and these two would no longer snap together.

Sockets placed on each face to allow snapping on all sides.
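The socket-name matching idea can be reduced to a tiny sketch (this is an illustration of the rule described above, not the Modular Snap System's actual implementation):

```python
# Sketch: with name matching on, only identically named sockets snap together.
def sockets_can_snap(name_a, name_b, match_names=True):
    if not match_names:
        return True          # any socket pair may snap
    return name_a == name_b  # names must match when the setting is enabled

assert sockets_can_snap("box_01", "box_01")
assert not sockets_can_snap("box_01", "cylinder_01")
assert sockets_can_snap("box_01", "cylinder_01", match_names=False)
```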

Next you need to set up your VR pawn so that snaps are triggered when you drop an object you would like to snap. To do this with the Vive_PawnCharacter, we follow on from the TryDropSingle_Client nodes, take the gripped actor and use the snap node.

Snap actor node (things to note: ‘search only’ is ticked, and the snap settings can be seen on the right)

This is then followed by the Smooth Move node, which does as it says: it takes the snap info and smoothly transitions the object to its new location.

Smooth Move

Alongside the snapping code on the Vive_PawnCharacter are some functions that are called on the dropped object and its new stack parents. These functions update the procedural meshes with the new weights above them on the stack, which in turn updates the deformation, and also save a reference to their parent. The same is done when objects are grabbed, after the TrytoGrabObject nodes.

There is one problem with the snapping system for this use case: a procedural mesh can’t have sockets. I solved this by using an underlying static mesh that handles the snapping instead of the procedural mesh. Another problem is that you can’t update sockets dynamically, which was an issue because I didn’t want the objects to stick to the sides of each other. To solve this I have a function called ‘AdjustSnapPoints’, which swaps the underlying mesh as the object rotates to make sure there are only ever sockets on the top and bottom.

The Procedural mesh

To set up the procedural mesh, first take the static mesh you want as the base, get the section from the static mesh, and use the outputted values to create a new mesh section on the procedural mesh. Then add some collision using the ‘Add Collision Convex Mesh’ node.

The procedural mesh set up in the construction script.

To deform the procedural mesh you need to do three things: calculate the adjustment for each vertex, move the vertices, and update the procedural mesh. To calculate a vertex adjustment, first rotate the vertex to account for any changes in world rotation, then get the distance of the vertex from the bottom of the mesh, and finally multiply that by a factor of the current mass load on the object. This gives us the base deformation; we then do another step to add the buckling effect.

On the left, rotate the vertex; on the right, the function below followed by the multiplication.
Small function to work out the distance to the bottom of the mesh.

First I check whether the vertex is on the top or bottom face, as we only want the buckling effect to apply to the middle vertices. If it is on a top or bottom face, we do a simple adjustment based on what we have already calculated; if not, we also do the buckling calculations.

Uses the component bounds to work out the z size.

To work out the buckling, get the distance of the vertex from the centre point of the mesh using ‘component bounds’, then map that into a range between 0 and the ‘OutwardsBuckleFactor’, which can be adjusted on each type of object to create different crushing profiles. Then multiply that by the direction of the vertex from the centre of the mesh to get the vertex change.

Move the vertex and save it in a new array. (Below this is the non-buckling version, which is the same but without the buckling part.)
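The two-step deformation described above can be sketched in Python. This is a hedged approximation, not the project's Blueprint math: the name OutwardsBuckleFactor follows the text, but mass_factor, the cap test and the radial mapping are simplified stand-ins.

```python
# Sketch of the per-vertex crush deformation: squash plus outward buckle.
import math

def deform_vertex(v, mesh_min_z, mesh_center, half_z, mass_factor,
                  outwards_buckle_factor):
    """Return the offset to apply to vertex v = (x, y, z)."""
    x, y, z = v
    # Base deformation: squash proportional to height above the mesh
    # bottom and to the current mass load on the object.
    dist_from_bottom = z - mesh_min_z
    offset = [0.0, 0.0, -dist_from_bottom * mass_factor]

    # Buckling: only middle vertices bulge outwards, not the top/bottom faces.
    on_cap = abs(z - mesh_center[2]) >= half_z
    if not on_cap:
        dx, dy = x - mesh_center[0], y - mesh_center[1]
        dist = math.hypot(dx, dy)
        # Map the radial distance into [0, OutwardsBuckleFactor], then push
        # the vertex outwards along its direction from the centre.
        bulge = min(dist, 1.0) * outwards_buckle_factor * mass_factor
        if dist > 0:
            offset[0] += dx / dist * bulge
            offset[1] += dy / dist * bulge
    return tuple(offset)
```

A middle vertex both squashes and bulges outwards, while a vertex on a cap face only squashes, matching the top/bottom check described in the text.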

Finally, update the procedural mesh. In the picture below you can see that I have used a timeline to smooth out this process by lerping between the old vertex positions and the new ones. To do the updating, use the ‘Update Mesh Section’ node, as well as clearing the old collision and rebuilding it. When this is done, check to see if the mesh has been crushed and, if it has, highlight it.

Update the mesh smoothly so it looks like it is being crushed.

The result is that the mesh will deform after a certain amount of weight has been put on top of it, as you can see in the video below. There are still some improvements to be made, such as updating the sockets so that they more closely fit the procedural mesh (to stop the meshes from floating).

Video showing off all of the above.

Tutorial For Installing Amazon Multiplayer Plugin

  1. Download Unreal Engine 4.25+ from github.
  2. Follow instructions from Epic Games github (download pre-requisites, setup.bat, generateprojectfiles.bat).
  3. Build Development Editor by opening the sln file in visual studio. Should take about 1-2 hours.
  4. Unzip marketplace.rar file and move marketplace folder into new unreal engine install location

5. Build development editor again (should just take 5 mins this time).

New Project Steps

  1. Make a new project
  2. Create a server target by duplicating PnameEditorTarget.cs in the project source folder, renaming it to PnameServerTarget.cs, and updating the class names in its contents to match.

There are two starter tutorials to consider:

Implementing Gamelift Into A New Project

The first of these two tutorials gives an overview of implementing GameLift into a new project completely from scratch.

Note that in Project Settings the server default map needs to be set; this is the map the server will enter. The client enters the map called EntryMap, a different entry point from the server’s.

I suggest watching this tutorial to understand the different parts of what is going on, but starting from the gamelift only starter project instead of recreating it yourself.

Starting From The Gamelift Only Starter Project:


It is also important to make sure maps are included in the packaged project; the Gamelift Only Sample Project shows these as included by default.

After building, dedicated servers appear in the Saved > StagedBuilds directory.

The next step involves uploading a build to AWS GameLift, which requires the AWS CLI tool.

For our purposes we also want to create builds for Linux, which requires installing some tools to create UE4 dedicated server builds for Linux.

For this, we will investigate this tutorial: https://medium.com/swlh/building-and-hosting-an-unreal-engine-dedicated-server-with-aws-and-docker-75317780c567

This tutorial mentions first installing a clang prerequisite; for our version, Unreal 4.25, this is available to download and install here.


With the clang pre-requisite installed, now is the time to try and see if we can build to a specific running linux instance from within the project launcher window.

By default it does not appear here.

For doing this and getting a Linux platform to appear here, I was advised to run a Linux shell by installing Debian from the Windows Store. Opening it for the first time returns this error; I’m checking into it.

Here is the site they are referring to:


I decided to try a simplified install.

It appears to have worked :).

After restarting, it appears an error popped up in Ubuntu–this will require investigation.

Perhaps this installation failed because I didn’t join the windows insider program and install a test windows build. For now I will put this on the backlog and see if I can progress making dedicated server linux builds for uploading to AWS but package windows builds for testing with my local gamelift.

After speaking with Atanas, we think it’s likely an easier fix to enable this in my BIOS settings, so I won’t need to join the Windows Insider program.

If I need to return to creating a linux environment for building the project and doing local gamelift testing, I can return to this video tutorial here https://www.youtube.com/watch?v=ZRl_P8Q5_fI

On the next step, I am packaging and building the Gamelift-only sample project for Windows from the Project Launcher.

Now after packaging, it is time to upload this build with the amazon command line tools.

First step after downloading the command line tools is to configure my aws environment with access key, secret key, and default region (eu-west-2) and default format (json).

Next is doing the actual upload in the command line.

aws gamelift upload-build --name <build-name> --build-version <version> --build-root <build-path> --operating-system WINDOWS_2012 --region eu-west-2

We modify this to be:

aws gamelift upload-build --name "GLTestBuild" --build-version 1.0 --build-root "E:\UnrealProjects\GameliftOnly\GameliftOnly\Saved\StagedBuilds\WindowsServer" --operating-system WINDOWS_2012 --region eu-west-2

Now after some patience, it is uploaded. Next step is to create a fleet from this build!

For specifying the build path, from a full string of this where GameLiftTutorial is the project name


You specify just

GameLiftTutorial\Binaries\Win64\GameLiftTutorialServer.exe and set concurrent processes to 20 for testing (which means 20 players).

Configure the ec2 port settings as per this image.

Now it’s time to go back to that sample project and update our configurable variables for connecting to Gamelift.

Opening the GameEntry map and opening the level blueprint, we can see two well organized sections for aws local and aws online respectively.

Here we edit the access key, secret key, and region.

The next step is to edit the fleet alias name variable which connects into the Create Game Session event with our fleet name from the aws gamelift console.

Now I attempt to play on this new fleet by hitting play in the editor, but we can see from the print string a friendly error message showing that the fleet is not yet active but is still activating.

This ended up having an error that prevented the fleet from starting. Testing locally, it did not seem to connect new players either, but the GameLift SDK running locally did detect the server exe being opened and reported healthy processes. It’s possible I missed something there.

For running windows builds, the developers of the gamelift blueprint plugin did mention it was necessary to include specific windows files.


But for now, I decided to march on with Linux instead as recommended by Atanas.

I found this very useful article for enabling virtualization on my machine.

How to Enable Virtualization on Windows 10

After doing this, I successfully got debian to install.

Now my next steps are packaging a Linux build and running GameLift Local in Linux to test it.

I was recommended to follow this tutorial by the developers of the plugin:

Following this silent tutorial I reached a point of no return where installing openssh-server did not seem to proceed.

This led me to find this other tutorial which I will now be considering.

Atanas sent me some commands to try out for installing ssh server, so I will test those out first before going through this tutorial.

  • sudo apt update
  • sudo apt upgrade
  • sudo apt-get install openssh-server
  • sudo service ssh start (to run the ssh server)
  • sudo service ssh stop
  • sudo service ssh status

Using these commands, I was able to start an SSH server. The How to Build Linux Server with WSL tutorial then shows I need to run the command ip a to find the IP address of this server.

The next step in the tutorial shows I need to add an unlisted linux platform as an option from the device manager, but I am not seeing this available on the drop down.

Although I have installed the linux clang, this potentially is not appearing because we are missing some additional steps of configuring environment variables.

It seems like the environment variable was set automatically, but watching this tutorial video showed me I need to build the engine again after installing the linux clang.

After rebuilding the engine, now I see the options for adding linux from the device browser.

I have entered the ip address of my linux ssh server and for the username and password I put the username and password I had entered for debian and the ssh server.

Now it’s time to try building the project from Project Launcher, making sure to select by the book for the data build option.

The build failed on this step, so from the windows terminal I connect now to the ssh that is running in debian.

I ran into errors building the project when setting the ip for and also for

This error seems more promising to fix.

This post seems to suggest I need to re-run setup.bat


Further searching on the error I found that some suggestions on the discord for the plugin were to update the version of putty that is included on unreal engine.

I have attempted to update the putty files here and will try re-running the linux build process.

putty files

This appears to have made some progress... building a Linux server build now!

The process appears to fail at the end, but these error messages can be ignored.

The next step is to run the GameLift Local server from the Windows command line with the command java -jar GameLiftLocal.jar. This starts GameLift Local looking for servers.

Then we need to go this directory for finding our new linux server build:


Along the way, in the image below, I learned some things about navigating to this path in Linux. The Linux server can access the Windows directories via the /mnt directory at its file root.

Unfortunately here you can see I have an error running the command ./GameLiftTutorialServer that it is not opening the server and running into an error.

After searching, the fix for this error suggested:

“make sure you have put your project and unreal engine source in different partition and the plugin in unreal source engine. because of a bug of windows cpp linker, windows will just link a relative path to the binary in unreal engine to executable if they are in the same partition”, which I have interpreted as putting my project files on the D drive while keeping my engine source on the E drive.

This did not fix the bug, so I am attempting to rebuild the project from the D drive.

Re-building the project from the d-drive worked, but then launching the server gave this error.

Discussing this with Lion from the plugin team (very helpful), he suggested the connection might be blocked by the Windows firewall. Disabling Windows Defender is not enough; I will need to modify the registry via regedit.

“you need to modify the regedit table on windows to terminate windows firewall. turning off windows defender will just allow all income from network. not included dbus”

He also gave some helpful advice: it can be useful to have a separate Linux PC for this kind of testing, or to develop on a Mac, to avoid the need to run a virtual machine. Food for thought!

He also mentioned that it is possible to retrieve logs from the GameLift game session console, but only for terminated game sessions.

Next Steps From Here

So I have almost reached the end of my journey: running a UE4 Linux server build in a local Linux virtual machine, monitored by AWS GameLift Local.

The next steps for me are disabling the firewall via regedit as suggested, then trying again.

As a recap for myself, now that I almost have a working build, after restarting my computer I will need to:

A. Open Debian and run this command to start the SSH server:

sudo service ssh start

(sudo service ssh stop and sudo service ssh status stop the server and check its status, respectively.)

B. Connect to Debian from a Windows command prompt with ssh brian@

C. In the Debian command prompt, navigate to the path where you installed the GameLift Local SDK with cd /mnt/e/Gamelift/GameLiftLocal-1.0.5 and start GameLift Local with this command: java -jar GameLiftLocal.jar

D. In a new Debian window, navigate to the Linux Server Build Path


E. Open the Linux server with the command ./GameLiftTutorialServer (note the forward slash; this is a Linux shell, not cmd)

F. Monitor to see if this time it successfully connects to GameLiftLocal.
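Steps A, C, and E above can be sketched as a single startup script run from a Debian shell. Everything here is specific to my setup (paths, jar version, server binary name), and the server launch still needs its own window:

```shell
#!/bin/bash
# Sketch of the restart checklist above; assumes WSL Debian with the
# GameLift Local SDK at the path from my install.

sudo service ssh start                      # A: start the SSH server

cd /mnt/e/Gamelift/GameLiftLocal-1.0.5      # C: GameLift Local SDK path
java -jar GameLiftLocal.jar &               # run GameLift Local in the background

# D/E: in a second Debian window, cd to the Linux server build
# directory and run:
#   ./GameLiftTutorialServer
```

The SSH connection from Windows (step B) is made separately from a Windows command prompt.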

A successful connection should look like this:

Resuming Progress!

It turns out we did not need to edit firewall settings; the error was something a bit simpler.

I was opening the GameLift Local jar in a Windows command prompt, so the Linux server running in the Debian environment could not see GameLift Local running.

Now the next step is to see if we can connect to the locally running server from either a client build or when playing from the editor.

The first step now is to make sure the client’s game entry map points its BeginPlay event at the AWS local side of things.

There was a small compile error here I had to fix; I renamed the fleet variable to fleet arm 2.

Running this, I hit the errors below when trying to connect. It seems I will need to modify the Blueprint code of the GameLift-only project, and I’ll be reviewing the tutorials again to see if there is anything I missed.

Create Player Session Error

When connecting the client again, it says there is no available process. Perhaps the AWS local server configuration needs to allow more simultaneous processes.


To rule out problems with my environment, I am now packaging a Linux server build that we know connects.


Packaging a build that we knew worked 100% on my colleague’s computer, we tried lots of different IP addresses from ipconfig, but the Windows client (playing from the editor) still could not connect to the GameLift server running in Debian.

The next steps now are to:

  1. Try a tutorial to set up a VirtualBox environment instead of Debian for Linux, as it may give better control over network parameters
  2. If this fails, try running the Linux build on another computer on the network and connect to it that way.

I am now going through this tutorial on setting up VirtualBox.

Lion from the Discord support team for the AWS plugin also recommended this video, which I’ll watch soon and summarize the learnings from.

First, I watched Lion’s tutorial video. It was a great overview of the overall GameLift architecture. When I re-edit this into a more organized tutorial, I think this is the kind of video to put right at the beginning.

I learned some good points about configuring autoscaling for your AWS fleet, and about making the server (not any client) responsible for sending critical data to the AWS backend; this is vetted by checking that the sender’s IP is one of the IPs in the AWS fleet of Unreal dedicated server processes.

Second, I did succeed in setting up a VirtualBox environment, but when I go to open the Linux server I run into an issue.

I will investigate this error some more before abandoning the VirtualBox setup in favor of another PC for hosting the AWS local server.

Investigating on Discord did not give many leads, so I am going to explore building a dedicated server on Windows for my local AWS testing, and later building a Linux build to upload to AWS.

A while back, Lion messaged me to ask if I had installed the required files for building a Windows server build.


I did get the server and the GameLift jar working inside the VirtualBox environment, but in the end I ran into the same error we had with Debian: my Windows client was not connecting to the GameLift server.


I’ve decided to try packaging a Windows server to see if I can connect to that.

I ran into some errors trying to package the GameLift SDK, so I started from part 1 of this tutorial series and installed Visual Studio 2019.

In part 2 of the tutorial series, I installed CMake, set the system path variables for CMake and MSBuild, and added a new system variable for the Visual Studio 2019 common tools.

I then ran this command in the command prompt to build the GameLift SDK (note that the quotes around the generator name must be plain ASCII quotes):

E:\Gamelift\GameLift-SDK-Release-4.0.2\GameLift-Cpp-ServerSDK-3.4.2\out>cmake -G "Visual Studio 16 2019" -A x64 -DBUILD_FOR_UNREAL=1 ..

Next step:

msbuild ALL_BUILD.vcxproj /p:Configuration=Release

The steps above build the SDK from source, but I realized that with the GameLift plugin we already have a prebuilt GameLift SDK in our engine folder, so I fast-forwarded past this part and stopped the process.

Beyond this, I resolved an error where GameLiftLocal.jar would no longer open on Windows by reinstalling the Java JDK.

I then got the Windows server build to open, now that I had populated it with the prerequisite installer and install.bat.

To open the Windows server with logging enabled, I launch it from a cmd prompt with these parameters added: Convrs_Quest2Server.exe -log -port=7779

This makes me wonder whether I could pass a port parameter to the Linux server to resolve the problems we were having.
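A sketch of what that would look like, assuming the Linux server accepts the same standard Unreal switches (-log, -port) as the Windows build. The binary name is from my earlier Debian test, and I have not verified this yet:

```shell
# Hypothetical: launch the Linux server with an explicit port, mirroring
# the Windows invocation above. Guarded so it only runs the server if the
# binary is actually present in the current directory.
SERVER=./GameLiftTutorialServer
if [ -x "$SERVER" ]; then
  "$SERVER" -log -port=7779
else
  echo "GameLiftTutorialServer not found; run this from the LinuxServer build directory"
fi
```

If this works, each server process could be pinned to its own port, which may also help when running more than one process for GameLift Local.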