The weekly summary
At the end of every week I try to summarize what I've been working on during the week.
Both to show my progress, and to share and showcase cool things I've learned and created that might not fall into a specific category of game, level or technical design.
I hope to both inspire and teach a thing or two. And I gladly take any feedback you have to improve the project.
Thank you for reading!
Week one
Testing out procedural animations
and prototyping movement functionality
The majority of the animations for the player movement are procedural. Variables controlling different alphas are set in the Player blueprint and drive Transform (Modify) Bone nodes in the animation blueprint.
Each of the default movement features is in place and a first pass of tuning has been done. I've set up a Metrics Gym to test the movement and tweak it from there.
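As an illustration, here's a minimal sketch of how one of those alphas could be driven from C++ instead of the Player blueprint, assuming a UAnimInstance subclass; the class and variable names (UPlayerAnimInstance, LeanAlpha) are hypothetical and not the project's actual setup.

```cpp
#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "PlayerAnimInstance.generated.h"

UCLASS()
class UPlayerAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Read by the AnimGraph as the Alpha pin of a Transform (Modify) Bone node.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Procedural")
    float LeanAlpha = 0.f;

    virtual void NativeUpdateAnimation(float DeltaSeconds) override
    {
        Super::NativeUpdateAnimation(DeltaSeconds);

        // Example: derive the alpha from the owning pawn's lateral velocity.
        if (const APawn* Pawn = TryGetPawnOwner())
        {
            const float RightSpeed = FVector::DotProduct(Pawn->GetVelocity(), Pawn->GetActorRightVector());
            LeanAlpha = FMath::Clamp(RightSpeed / 600.f, -1.f, 1.f);
        }
    }
};
```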
Setting up character inheritance
and adding the Gameplay Ability System
I created a Character C++ class and added the Gameplay Ability System framework to it so that all child characters inherit the component.
The Player Character derives from the project's C++ Character and inherits GAS from it. I added all relevant Inputs, the Mapping Context and the movement functionality to BP_Player.
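A minimal sketch of what such a shared base class could look like, assuming the standard GAS setup with the GameplayAbilities module enabled; the class name ABaseCharacter is illustrative.

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "AbilitySystemInterface.h"
#include "AbilitySystemComponent.h"
#include "BaseCharacter.generated.h"

UCLASS()
class ABaseCharacter : public ACharacter, public IAbilitySystemInterface
{
    GENERATED_BODY()

public:
    ABaseCharacter()
    {
        // Every child character (player, enemies) inherits this component.
        AbilitySystemComponent = CreateDefaultSubobject<UAbilitySystemComponent>(TEXT("AbilitySystemComponent"));
    }

    virtual UAbilitySystemComponent* GetAbilitySystemComponent() const override
    {
        return AbilitySystemComponent;
    }

protected:
    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Abilities")
    UAbilitySystemComponent* AbilitySystemComponent;

    virtual void PossessedBy(AController* NewController) override
    {
        Super::PossessedBy(NewController);
        // Re-initialize actor info so abilities know their owner and avatar.
        AbilitySystemComponent->InitAbilityActorInfo(this, this);
    }
};
```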
Camera design and object specific FOV
To prevent the weapons and first-person arms from clipping into geometry, I created a material function that adjusts the World Position Offset so the object is effectively rendered on top of existing geometry. It basically scales the object's vertices towards the camera without the player noticing from her point of view, and it always renders the same no matter the chosen FOV.
This works very well as a solution to the weapon clipping issue. One of the most common alternatives is to have a separate camera render the arms and weapons on top of the viewport, but that is quite complicated to achieve in Unreal Engine with a decent result.
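For reference, the underlying math the material function performs could be expressed like this; the real logic lives in material nodes, and the function name and scale value here are just for illustration.

```cpp
#include "CoreMinimal.h"

// Pulling a vertex a fraction of the way toward the camera shrinks the mesh in
// world space while leaving its on-screen projection unchanged, so the weapon
// no longer reaches far enough forward to intersect walls.
// ScaleFactor is illustrative; in the material it would be a scalar parameter.
FVector ComputeWeaponVertexOffset(const FVector& VertexWorldPos,
                                  const FVector& CameraWorldPos,
                                  float ScaleFactor /* e.g. 0.4 */)
{
    // New position: Camera + (Vertex - Camera) * Scale
    // => World Position Offset = (Camera - Vertex) * (1 - Scale)
    return (CameraWorldPos - VertexWorldPos) * (1.f - ScaleFactor);
}
```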
Week two
Creating a Health Component and testing different shield mechanics
After trying to piece together weapon firing and damage through GAS, I decided against using it for the non-ability mechanics. It felt slightly too complex for just a few weapons.
I made a Health Component in C++ instead and added a regenerating shield through blueprints. After some testing I decided that once the shield is fully depleted, it does not start regenerating again. This is subject to change after further playtesting, but it works great for the time being.
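A rough sketch of the idea, assuming a plain UActorComponent; the actual component splits this between C++ and Blueprint, and all names and numbers here are placeholders.

```cpp
#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "HealthComponent.generated.h"

UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class UHealthComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    UHealthComponent()
    {
        PrimaryComponentTick.bCanEverTick = true;
    }

    UFUNCTION(BlueprintCallable, Category = "Health")
    void ApplyDamage(float Amount)
    {
        // The shield absorbs damage before health does.
        const float Absorbed = FMath::Min(Shield, Amount);
        Shield -= Absorbed;
        Health = FMath::Max(Health - (Amount - Absorbed), 0.f);

        // Design rule from testing: a fully depleted shield never regenerates again.
        if (Shield <= 0.f)
        {
            bShieldBroken = true;
        }
    }

    virtual void TickComponent(float DeltaTime, ELevelTick TickType,
                               FActorComponentTickFunction* ThisTickFunction) override
    {
        Super::TickComponent(DeltaTime, TickType, ThisTickFunction);

        if (!bShieldBroken && Shield < MaxShield)
        {
            Shield = FMath::Min(Shield + ShieldRegenRate * DeltaTime, MaxShield);
        }
    }

protected:
    UPROPERTY(EditAnywhere, Category = "Health") float Health = 100.f;
    UPROPERTY(EditAnywhere, Category = "Health") float Shield = 50.f;
    UPROPERTY(EditAnywhere, Category = "Health") float MaxShield = 50.f;
    UPROPERTY(EditAnywhere, Category = "Health") float ShieldRegenRate = 10.f;

    bool bShieldBroken = false;
};
```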
Establishing the upgrade loop
One of the main goals for this project is to limit the amount of 2D menus. I want to keep the game as diegetic as possible.
Upgrading the suit is done by attaching acquired modules to it by hand. For now, while working on the procedural animations, I've split the player logic into two characters: one used in non-combat zones and the other (wearing the suit) in combat/missions.
Prototyping the upgrade modules and suit workbench
It took a few days of research and testing to see if I could utilize GAS by having actors hold the ability classes. When attached to the upgrade bench, they grant the correct ability to the combat player character, while the given abilities are stored in the GameInstance to be accessed across all levels. When a module is removed, so is the ability. I think this could be a really cool and modular way of customizing your suit to cater to each player's playstyle.
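A sketch of the grant/remove flow this could map to, assuming each module actor stores a TSubclassOf of its gameplay ability and keeps the returned handle; the function names are illustrative.

```cpp
#include "AbilitySystemComponent.h"
#include "Abilities/GameplayAbility.h"

void GrantModuleAbility(UAbilitySystemComponent* ASC,
                        TSubclassOf<UGameplayAbility> AbilityClass,
                        FGameplayAbilitySpecHandle& OutHandle)
{
    if (ASC && AbilityClass)
    {
        // Give the ability and keep the handle so the module can take it back later.
        OutHandle = ASC->GiveAbility(FGameplayAbilitySpec(AbilityClass, /*Level=*/1));
    }
}

void RemoveModuleAbility(UAbilitySystemComponent* ASC, FGameplayAbilitySpecHandle Handle)
{
    if (ASC && Handle.IsValid())
    {
        // Removing the module clears the ability from the combat character.
        ASC->ClearAbility(Handle);
    }
}
```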
Week three
Cleaning up the movement
When setting up a couple more traversal examples in the metrics gym I realized that the function handling the capsule collision's half height didn't translate as well as I first thought from crouching to sliding.
The issue at hand became quite complicated to pinpoint.
I could track the float at runtime, but there is no out-of-the-box feature in Unreal to change the viewport at runtime while still maintaining control over your character. So I made a debug camera tool which lets me switch between viewpoints at runtime and automatically executes console commands to show the necessary collisions. With the help of the tool it became much easier to time the capsule half-height changes.
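The core of such a tool boils down to something like the sketch below: swap the view target without unpossessing the pawn and run the relevant show command. The debug camera actor and the exact console command are assumptions on my part.

```cpp
#include "Kismet/KismetSystemLibrary.h"
#include "GameFramework/PlayerController.h"

void SwitchToDebugView(APlayerController* PC, AActor* DebugCameraActor)
{
    if (!PC || !DebugCameraActor)
    {
        return;
    }

    // The player pawn stays possessed, so movement input still works while the
    // scene is rendered from the debug camera.
    PC->SetViewTargetWithBlend(DebugCameraActor, /*BlendTime=*/0.25f);

    // Show the collision shapes needed to time the capsule half-height changes.
    UKismetSystemLibrary::ExecuteConsoleCommand(PC, TEXT("show Collision"), PC);
}
```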
Documenting design guidelines
I had to take a small step back from prototyping and focus on the GDD this week, to remind myself of a few useful guidelines and avoid becoming overwhelmed by the long list of tasks.
With the help of notes from my education, a bunch of GDC talks and other useful sources, I wrote down the important things to keep in mind during production. I focused on analyzing design, vision, evoking emotions, approaches to fixing issues and control mastery.
Feel free to have a read, either by clicking the selected pages to the right or by going through the GDD above.
Collision filtering
At the end of this week I focused on setting up a prototype for interactions. I planned an easily scalable system that could be applied to all actors, including enemies (which ruled out using a parent blueprint for interactables).
I created a component used for tracing the interactables and a custom shape actor with customizable volumes which can be attached to any object or character. To prevent inconsistencies from using custom collisions, I added a new collision preset dedicated to interactables, responding only to the trace channel used by the InteractionComponent. The preset is set to overlap all other traces, such as those used by physics objects and weapons.
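A simplified sketch of what the InteractionComponent's trace could look like, with ECC_GameTraceChannel1 standing in for whichever custom channel the project maps to interaction; the range value is a placeholder.

```cpp
#include "Engine/World.h"
#include "GameFramework/Actor.h"

AActor* TraceForInteractable(const UWorld* World, const FVector& ViewLocation,
                             const FVector& ViewDirection, AActor* IgnoredActor)
{
    FHitResult Hit;
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(IgnoredActor);

    const FVector End = ViewLocation + ViewDirection * 300.f; // interaction range (placeholder)

    // Only the interactable preset responds to this channel, so physics objects
    // and weapons never interfere with the custom interaction volumes.
    if (World->LineTraceSingleByChannel(Hit, ViewLocation, End, ECC_GameTraceChannel1, Params))
    {
        return Hit.GetActor();
    }
    return nullptr;
}
```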
Week four
Dynamic widgets for interactables
I'm trying to keep the upgrade system as diegetic as possible, but a little bit of information about each upgrade (module) needs to be presented to the player to explain what it does. When the player is in her ship, a tooltip widget is drawn when looking at interactables, such as the modules.
A Name and a String variable can be set in each instance of the module and are passed through the Interaction interface to the tooltip widget. The widget can then be recycled, so we don't need a unique one for every single item.
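A sketch of what that interface call might look like in C++; the project's version is Blueprint-based, and the interface and function names here are hypothetical.

```cpp
#include "CoreMinimal.h"
#include "UObject/Interface.h"
#include "InteractableInterface.generated.h"

UINTERFACE(MinimalAPI, Blueprintable)
class UInteractableInterface : public UInterface
{
    GENERATED_BODY()
};

class IInteractableInterface
{
    GENERATED_BODY()

public:
    // Each module instance returns its own display name and description,
    // so one recycled tooltip widget can present every item.
    UFUNCTION(BlueprintCallable, BlueprintImplementableEvent, Category = "Interaction")
    void GetTooltipInfo(FName& OutName, FString& OutDescription);
};
```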
Saving transforms between levels
Modules and other decorations within the ship need to have their individual transforms saved to maintain their locations when leaving and returning to the ship after missions.
To manage this I set up an overarching manager that handles the Cargo Hold when acquiring new objects and tracks their transforms through various arrays and maps. The necessary information is saved and stored with the GameInstance and SaveDataObject through solid blueprint communication using interfaces. By having a ShipManager blueprint I can keep the Cargo Hold modular and easily changeable, both in terms of customization and future iterations.
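As a sketch of the persistence step, assuming a USaveGame object keyed by a stable identifier per object; the UShipSaveData name and slot name are illustrative, and the real flow goes through the Blueprint SaveDataObject.

```cpp
#include "CoreMinimal.h"
#include "GameFramework/SaveGame.h"
#include "Kismet/GameplayStatics.h"
#include "ShipSaveData.generated.h"

UCLASS()
class UShipSaveData : public USaveGame
{
    GENERATED_BODY()

public:
    // One entry per tracked object in the Cargo Hold, keyed by a stable identifier.
    UPROPERTY()
    TMap<FName, FTransform> SavedTransforms;
};

void SaveShipState(const TMap<FName, FTransform>& CurrentTransforms)
{
    UShipSaveData* SaveData = Cast<UShipSaveData>(
        UGameplayStatics::CreateSaveGameObject(UShipSaveData::StaticClass()));
    SaveData->SavedTransforms = CurrentTransforms;
    UGameplayStatics::SaveGameToSlot(SaveData, TEXT("ShipState"), /*UserIndex=*/0);
}
```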
Loading attached modules
Every object in the ship is handled by the ShipManager except the attached modules. Since they have to spawn attached to the correct location, I chose to put the functionality in the respective sockets. This way I don't have to change a ton of blueprints when adding additional sockets to the upgrade station. By using a Map variable I can find the exact value (module) for each key (a vector per socket) and respawn each module with the correct physics settings.
(At the moment they are being spawned back in using a Debug Key. This will be changed to calling a dispatcher when the pod returns from missions.)
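A rough sketch of that respawn step, simplified to key on socket names rather than the per-socket vectors described above; all names are illustrative.

```cpp
#include "Engine/World.h"
#include "Components/StaticMeshComponent.h"

void RespawnModuleAtSocket(UWorld* World, UStaticMeshComponent* BenchMesh,
                           FName SocketName, TSubclassOf<AActor> ModuleClass)
{
    if (!World || !BenchMesh || !ModuleClass)
    {
        return;
    }

    const FTransform SocketTransform = BenchMesh->GetSocketTransform(SocketName);
    AActor* Module = World->SpawnActor<AActor>(ModuleClass, SocketTransform);
    if (Module)
    {
        // Snap to the socket and keep physics off while attached to the bench.
        Module->AttachToComponent(BenchMesh,
            FAttachmentTransformRules::SnapToTargetNotIncludingScale, SocketName);
    }
}
```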
Week five
Designing the player ship
With MVPs of all the core gameplay features in place, it's time to start designing levels. The player ship is a natural place to start since it will be a very central part of the project, connecting many elements together and bringing structure to the game. It's on board the ship that systems like the armory, the upgrade station and mission select/level loading will live.
Week six
Armory
The MVP for the Armory is now in place. It works as a conditional for the boarding pod: you can't venture out without adding two weapons to your loadout, in whichever order you prefer.
Upon returning to the ship, the weapons are returned to the Armory. Planned additions are customization and upgrades for the weapons.
The individual assets use the InteractiveCollision to trigger the widget.
Loot crates
An important part of the upgrade loop. The crates inherit from the interactive volume to differentiate them from the actors using the InteractionCollision. I looked a lot at Destiny and how it handles its world loot. There's a hold-down interaction that adds a little bit of risk/reward to deciding when to open each crate.
I'm quite proud of the entire loot system. It doesn't depend on any hard references and provides a solid RNG system to build upon.
Feel free to read more about it in the link below:
Navigation table for level selects
The second conditional for heading out on missions with the boarding pod. It uses the InteractiveAssetVolume as parent and the InteractiveCollision (to maintain the immersive feeling) for each body, providing information about each individual destination. When a destination is chosen, the level information is passed to the boarding pod.
I'm using a data table for the destinations. It holds attributes like name, description, mesh and conditions, and can easily be scaled up with additional attributes such as environment and number of hostiles.
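An illustrative version of such a row struct, mirroring the attributes listed above; the exact names and types are assumptions.

```cpp
#include "CoreMinimal.h"
#include "Engine/DataTable.h"
#include "DestinationRow.generated.h"

class UStaticMesh;
class UWorld;

USTRUCT(BlueprintType)
struct FDestinationRow : public FTableRowBase
{
    GENERATED_BODY()

    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    FText Name;

    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    FText Description;

    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    TSoftObjectPtr<UStaticMesh> Mesh;

    // Conditions that must be met before the destination unlocks.
    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    TArray<FName> Conditions;

    // The level passed on to the boarding pod when this destination is chosen.
    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    TSoftObjectPtr<UWorld> Level;
};
```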
Week seven
Rough lighting pass and hallway blockouts
I have an early blockout and 2D layout in place, but it's being held back by the current lighting, the lack of space elements and corridors that don't quite convey the space fantasy. So I created a dynamic space skybox prototype using Spacescape and worked on getting a rough lighting pass into the gym.
A lot of time this week also went into gathering references on hallway metrics and blocking out different versions to help establish the feeling of being on a space station.
Week eight
Rigging pipeline
I want to start replacing the Unreal stock meshes with my own low-poly styled weapons and arms to start getting a bit of soul into the game. I've read up a lot on the subject lately and practiced a ton to be able to set up a good non-destructive workflow. I've imported the new viewmodels and started implementing the procedural animations on them, but there are a few tweaks left before switching over to the new meshes and skeleton.
Testing different approaches for lighting
I've never been sold on Lumen, so I wanted to see how I could illuminate the maps without it. Seeing as the game is set in various space stations and ships, having it enabled seems like an unnecessary alternative. There is also something about it feeling almost "too perfect" for a game like this. So I tried to recreate a more stylized way of lighting the levels using cubemaps, and did a few tests to see if I could tint the shadows for a more "arcadey" feel.
Additional weapon functionality
I improved the hitscan functionality for the parent weapon class by firing a trace from the player camera. The end of that trace also marks the end of the trace coming from each weapon's muzzle, to make sure the shots always go towards the center of the screen/reticle. I believe this is a better approach than firing the traces only from the player's head or only from the weapon, since it eliminates the majority of the issues those two options have, for example when firing from behind cover.
To add a little sense of randomness I use a Random Unit Vector in Cone in Degrees. This also comes in handy for the shotgun, providing functionality for pellet spread.
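Putting the two traces and the cone spread together, a hedged sketch could look like this; the trace channel, ranges and spread values are placeholders.

```cpp
#include "Engine/World.h"

FHitResult FireHitscan(UWorld* World, const FVector& CameraLocation, const FVector& CameraForward,
                       const FVector& MuzzleLocation, float SpreadDegrees, AActor* IgnoredActor)
{
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(IgnoredActor);

    // 1) Camera trace decides where the reticle is actually pointing.
    FHitResult CameraHit;
    const FVector CameraEnd = CameraLocation + CameraForward * 10000.f;
    World->LineTraceSingleByChannel(CameraHit, CameraLocation, CameraEnd, ECC_Visibility, Params);
    const FVector AimPoint = CameraHit.bBlockingHit ? CameraHit.ImpactPoint : CameraEnd;

    // 2) Muzzle trace aims at that point, jittered inside a cone for spread
    //    (the same call handles per-pellet spread for the shotgun).
    const FVector ToAim = (AimPoint - MuzzleLocation).GetSafeNormal();
    const FVector SpreadDir = FMath::VRandCone(ToAim, FMath::DegreesToRadians(SpreadDegrees));

    FHitResult WeaponHit;
    World->LineTraceSingleByChannel(WeaponHit, MuzzleLocation,
                                    MuzzleLocation + SpreadDir * 10000.f, ECC_Visibility, Params);
    return WeaponHit;
}
```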
Week nine
Loot sorting and UI trackers
I added ammo bricks with the same automatic pickup functionality as the currency. By using a macro that separates the loot by tags I can sort the pickups to their respective functions in the player blueprint upon overlap.
If the pickup's ammo value doesn't fit within the max capacity, the ammo bricks won't get picked up by the player and will stay where they are until the player needs them.
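A sketch of the sorting and capacity rule, reading "max capacity" as not overflowing the player's ammo pool; the tag names and values are made up, and the real version is a Blueprint macro.

```cpp
#include "GameFramework/Actor.h"

void HandleLootOverlap(AActor* Pickup, int32& CurrentAmmo, int32 MaxAmmo, int32& Currency)
{
    if (!Pickup)
    {
        return;
    }

    if (Pickup->ActorHasTag(TEXT("Ammo")))
    {
        const int32 PickupValue = 30; // would be read from the pickup actor

        // Leave the brick in the world if it doesn't fit within max capacity;
        // it stays there until the player actually needs it.
        if (CurrentAmmo + PickupValue > MaxAmmo)
        {
            return;
        }
        CurrentAmmo += PickupValue;
        Pickup->Destroy();
    }
    else if (Pickup->ActorHasTag(TEXT("Currency"))) // the Glarium pickups
    {
        Currency += 1;
        Pickup->Destroy();
    }
}
```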
For the accumulated Glarium widget I use a dispatcher on pickup to communicate with the widget blueprint. The counter resets after no currency has been picked up for a set duration, and if no additional currency is overlapped during that span the widget gets hidden.
Base Enemy behavior and using SubTrees
I modelled and rigged a placeholder enemy drone with an attack animation that telegraphs its attack. I set up a base Behavior Tree with subtrees for healing, hiding after taking damage, strafing and attacking a target. Hiding, strafing and attacking all use EQS to find the optimal location for each purpose. In addition to this, each enemy can be assigned a Patrol Region which it can investigate and hold during searching and combat.
I've found subtrees to be a really flexible, modular approach when creating different enemies, since I can reuse behavior when needed.
Different approaches for Hit Reactions
I prototyped two different versions of applying Hit Reactions to the enemies. First I tried physics animations, but since the root bone is centered in the hovering enemy, the impulse caused the skeletal mesh to detach from the capsule component while the Behavior Tree kept running, causing ghost-like behavior.
Combining the two felt a bit too finicky, so I decided to use montages instead. I separated the capsule component into a grid, and by using an inverse transform I converted the absolute hit location to the equivalent row/column on the grid. Depending on which area the enemy gets hit in, a different montage plays. If the relative hit is negative, the montage gets flipped to match the velocity of the impact.
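A sketch of the inverse-transform step, assuming the grid is laid out across the capsule's height and width; the dimensions and names are illustrative.

```cpp
#include "Components/CapsuleComponent.h"

FIntPoint HitLocationToGridCell(const UCapsuleComponent* Capsule, const FVector& WorldHitLocation,
                                int32 Rows, int32 Columns)
{
    // Convert the absolute hit into the capsule's local space.
    const FVector LocalHit = Capsule->GetComponentTransform().InverseTransformPosition(WorldHitLocation);

    const float HalfHeight = Capsule->GetUnscaledCapsuleHalfHeight();
    const float Radius = Capsule->GetUnscaledCapsuleRadius();

    // Normalize to 0..1 across the capsule's height and width, then bucket into the grid.
    const float V = FMath::Clamp((LocalHit.Z + HalfHeight) / (2.f * HalfHeight), 0.f, 1.f);
    const float U = FMath::Clamp((LocalHit.Y + Radius) / (2.f * Radius), 0.f, 1.f);

    const int32 Row = FMath::Min(FMath::FloorToInt(V * Rows), Rows - 1);
    const int32 Column = FMath::Min(FMath::FloorToInt(U * Columns), Columns - 1);
    return FIntPoint(Column, Row);
}
```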
Week ten
Debug hud for posing
Since I'm using a material function to achieve an FOV of 75 on the viewmodel, I had some issues finding a good workflow between posing and later implementing the animations.
I resolved this by setting up a HUD option displaying the combat corridor and reticle framing together with drawn lines, to more quickly get a good sense of where the viewmodel should be positioned on the screen. This helped immensely with posing and with quickly judging whether I'm on the right track in keeping consistency across all the weapons' individual animations.
Prototyping the dash ability
I explored three different versions of the dash ability. I continued with the version that had the best collision detection while providing the more arcadey, Doom-like feeling, since it behaved the most consistently.
The collision detection is controlled with a trace, activated on input, in the direction of the Last Movement Input Vector to get a valid target location.
One challenge is that it relies on a timeline, something that can't be run inside the Gameplay Ability System. The solution is to pass the activation of the ability to the player blueprint through an interface, run the timeline there, and then pass the alpha back through the same interface.
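A sketch of how the target location could be found with a sweep along the last movement input; the capsule size, channel and dash distance are placeholder values.

```cpp
#include "GameFramework/Character.h"
#include "GameFramework/CharacterMovementComponent.h"

FVector FindDashTarget(ACharacter* Character, float DashDistance)
{
    const FVector InputDir = Character->GetCharacterMovement()->GetLastInputVector().GetSafeNormal();
    const FVector Direction = InputDir.IsNearlyZero() ? Character->GetActorForwardVector() : InputDir;

    const FVector Start = Character->GetActorLocation();
    const FVector End = Start + Direction * DashDistance;

    // Sweep the character's own capsule so the dash never ends inside geometry.
    FHitResult Hit;
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(Character);
    const bool bHit = Character->GetWorld()->SweepSingleByChannel(
        Hit, Start, End, FQuat::Identity, ECC_Pawn,
        FCollisionShape::MakeCapsule(34.f, 88.f), Params);

    // Dash to the blocked point if the sweep hit something, otherwise the full distance.
    return bHit ? Hit.Location : End;
}
```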
Replacing placeholder meshes
For the past few weeks I've been trying to improve my modelling in Blender to replace some of the placeholder meshes in the prototype. I'm aiming for an art direction that I can contribute to myself, at least to some extent.
This SMG is based on the mood and the near-futuristic technology references selected for the game. Slowly but steadily you'll see assets in the game replaced to better reflect the project's goal.