Unity Space Game

About This Section

This section is a development log of my second attempt at creating my own 3D game using the Unity Engine and C#. The game uses elements of my first Unity project as a foundation, amplified by new and revised features and a deeper understanding of Unity's capabilities.

- - - - -

Unit Behaviour & Docking Paths

Controlling units with docking routines and targeting hostiles.

26/03/2025

Unit behaviour
The base unit behaviour has been carried over from my previous Unity project. The player will be able to add enemies to unit attack queues, and units will pursue any enemies in range by default.
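
As a rough illustration, the default pursuit logic can be sketched as below. This is a minimal sketch only; the 'attackQueue', 'detectionRange', and 'Enemy' tag names are my own illustrative assumptions, not the project's actual fields.

    using System.Collections.Generic;
    using UnityEngine;

    public class UnitBehaviour : MonoBehaviour
    {
        public float detectionRange = 30f;                 // default pursuit radius
        private readonly Queue<Transform> attackQueue = new Queue<Transform>();

        // Player-issued attack orders take priority over passive detection.
        public void QueueTarget(Transform enemy) => attackQueue.Enqueue(enemy);

        public Transform CurrentTarget()
        {
            // Discard queued targets that have since been destroyed.
            while (attackQueue.Count > 0 && attackQueue.Peek() == null)
                attackQueue.Dequeue();

            if (attackQueue.Count > 0)
                return attackQueue.Peek();

            // No orders queued: fall back to the nearest enemy in range.
            Transform nearest = null;
            float best = detectionRange;
            foreach (var enemy in GameObject.FindGameObjectsWithTag("Enemy"))
            {
                float dist = Vector3.Distance(transform.position, enemy.transform.position);
                if (dist < best) { best = dist; nearest = enemy.transform; }
            }
            return nearest;
        }
    }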

Enemy behaviour
I've also added enemies to the scene. These enemies follow the same development principles as the controllable units in terms of unit data. The only component so far that varies from the existing scripts is the movement of hostile units (since their behaviour is distinctly separate in terms of player interaction and target detection).

Projectiles
Test projectiles have been added to the game, making the unit behaviour script components flexible and configurable based on how many projectile origins there are and whether the projectiles are guided. I've made it so that there are currently primary weapons for both playable and hostile units, with the option to expand to secondary functionalities.
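
A minimal sketch of how such a weapon component might look, assuming hypothetical 'projectileOrigins' and 'guided' fields and a simple homing projectile; none of these names come from the actual project.

    using UnityEngine;

    public class Projectile : MonoBehaviour
    {
        public float speed = 20f;
        private Transform target;

        public void SetTarget(Transform t) => target = t;

        private void Update()
        {
            if (target != null)
                transform.LookAt(target);                  // guided: steer toward the target
            transform.position += transform.forward * speed * Time.deltaTime;
        }
    }

    public class UnitWeapon : MonoBehaviour
    {
        public Transform[] projectileOrigins;              // any number of muzzle points
        public Projectile projectilePrefab;
        public bool guided;                                // homing vs. straight-line shots
        public float fireInterval = 0.5f;

        private float cooldown;

        private void Update() => cooldown -= Time.deltaTime;

        public void Fire(Transform target)
        {
            if (cooldown > 0f) return;
            foreach (var origin in projectileOrigins)
            {
                var shot = Instantiate(projectilePrefab, origin.position, origin.rotation);
                if (guided) shot.SetTarget(target);
            }
            cooldown = fireInterval;
        }
    }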

Docking path routines
I've refined the functionality for a unit to depart from and be recalled back to its parent dock. One of the aspects I've considered is the unit's position and rotation 'lerping' from one navigation node to another. For the recall action in particular, the unit will rotate towards the first return route node of the parent dock. Once within range, it will orient itself in the direction of the parent dock. From there, it will lerp through the remaining docking nodes until it rests at the final node, where the return process is completed. I've also had to configure how the unit's other behaviours, such as enemy detection and target caching, are disabled, suspended or cleared once the recall has been put into action.

I'll need to think of a way to abort the return process if the unit hasn't yet reached the first return node, because the player might react to nearby events that take priority over the returning unit. The plan is to intercept the logic in the existing 'if' condition that navigates the unit to the first return node (see the sketch below). This means that if the unit has not yet started the docking procedure, the player can divert its course.
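
To make the flow concrete, here is a hedged sketch of the recall routine with the planned abort point; the node and state names are my own placeholders rather than the project's real fields.

    using UnityEngine;

    public class DockingUnit : MonoBehaviour
    {
        public Transform[] returnNodes;                    // the parent dock's return route
        public float speed = 10f;
        public float turnSpeed = 2f;

        private int nodeIndex = -1;                        // -1 = not recalling
        private bool docking;                              // true once the first node is reached

        public void Recall() { nodeIndex = 0; docking = false; }

        // The player can divert the unit only before docking has started.
        public bool TryAbortRecall()
        {
            if (docking) return false;
            nodeIndex = -1;
            return true;
        }

        private void Update()
        {
            if (nodeIndex < 0) return;
            Transform node = returnNodes[nodeIndex];

            // Lerp rotation and position towards the current node.
            Vector3 dir = node.position - transform.position;
            if (dir.sqrMagnitude > 0.001f)
            {
                Quaternion look = Quaternion.LookRotation(dir);
                transform.rotation = Quaternion.Slerp(transform.rotation, look, turnSpeed * Time.deltaTime);
            }
            transform.position = Vector3.MoveTowards(transform.position, node.position, speed * Time.deltaTime);

            if (Vector3.Distance(transform.position, node.position) < 0.1f)
            {
                docking = true;                            // past the first node: recall is committed
                nodeIndex++;
                if (nodeIndex >= returnNodes.Length) nodeIndex = -1; // resting at the final node
            }
        }
    }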

Unit selection
Selection outlines on buildings and units have been added. When toggling the outline component, I had to add an additional check to ensure the dissolve effect had completed before allowing the outline to be toggled, as triggering the outline while the dissolve effect was still in progress would cause the outline to remain rendered even when the entity was no longer selected. This is because the outline would still reference the initial dissolve shader materials.
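
A small sketch of that guard, assuming a hypothetical 'IsComplete' flag exposed by the dissolve script:

    using UnityEngine;

    public class DissolveEffect : MonoBehaviour
    {
        // Set to true by the dissolve coroutine when the effect finishes.
        public bool IsComplete { get; set; }
    }

    public class SelectableOutline : MonoBehaviour
    {
        public Behaviour outline;                          // the outline renderer component
        public DissolveEffect dissolve;

        public void SetSelected(bool selected)
        {
            // Toggling mid-dissolve would capture the dissolve materials,
            // leaving the outline rendered after deselection.
            if (dissolve != null && !dissolve.IsComplete) return;
            outline.enabled = selected;
        }
    }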

When adding visual effects such as engine propulsion trails to the units, the outline component was throwing warnings to the console because the VFX renderer component is not compatible with the outline logic. As a result, I modified the outline component to sanitise the initial renderer array by converting it to a list, filtering out any VFX renderers, and replacing the array with the filtered list. This stopped the warnings from showing and gives me more confidence in applying VFX to other selectable objects in the future.
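
The sanitising step might look roughly like this, assuming the VFX Graph package's VisualEffect component is what identifies a VFX renderer:

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.VFX;

    public static class OutlineRendererFilter
    {
        public static Renderer[] WithoutVfxRenderers(Renderer[] renderers)
        {
            var filtered = new List<Renderer>(renderers.Length);
            foreach (var r in renderers)
            {
                // VFX renderers expose no material list the outline can work with.
                if (r.GetComponent<VisualEffect>() == null)
                    filtered.Add(r);
            }
            return filtered.ToArray();                     // replaces the original array
        }
    }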

Unit allocation
There is a 1:1 relationship between a production building and a unit. The buildings now reference the allocated unit, in a similar way to how units currently reference their parent buildings for their recall functionality. This means that the player is no longer able to build another unit in a given production building if its allocation slot is currently occupied.
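
A minimal sketch of the allocation check, with placeholder names:

    using UnityEngine;

    public class ProductionBuilding : MonoBehaviour
    {
        public GameObject unitPrefab;
        private GameObject allocatedUnit;                  // the single unit this building owns

        public bool TryBuildUnit(Vector3 spawnPoint)
        {
            if (allocatedUnit != null) return false;       // allocation slot occupied
            allocatedUnit = Instantiate(unitPrefab, spawnPoint, Quaternion.identity);
            return true;
        }
    }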

Sources Referenced:

- - - - -

Blueprints & Building Anchors

Placement of blueprints.

19/03/2025

Blueprints
I created blueprints which are bound to the movement of the player's cursor. To do this, a build layer is used as the target of the cursor's raycast, tracking the blueprint's position in relation to the player's cursor. Colliders were added to the main building components to validate building placement against previously placed buildings on the same physics layer. Provided the blueprint is anchored to a viable location, the user is able to click and instantiate the proper building in its place. The functionality of the blueprints is governed by a single script component used across all blueprints. The script has been set up to accommodate all types of buildings, sub-meshes, and composite buildings for building placement.
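
As a rough sketch, the cursor projection could look like the following, assuming a physics layer named 'Build':

    using UnityEngine;

    public class Blueprint : MonoBehaviour
    {
        private int buildMask;

        private void Awake() => buildMask = LayerMask.GetMask("Build");

        private void Update()
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out RaycastHit hit, 500f, buildMask))
                transform.position = hit.point;            // track the cursor on the build layer
        }
    }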

Placement validation
Blueprints have three states of validity: default, invalid, and anchored. In its default state, the blueprint is rendered in a blue colour and follows the cursor's movements within the scene, as long as the projection is within the bounds of the build layer. In its invalid state, the collider of the main building blueprint is in contact with an existing object on the same physics layer, e.g. another building. The blueprint is rendered red and the user is unable to place the object, even if it's snapped to an existing building anchor. The anchored state occurs when the blueprint is snapped to a viable position, and the blueprint is rendered white. This is the state in which the user is able to place the intended object defined by the blueprint.
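
A sketch of the three states driving the blueprint's tint; the colours match the description above, while the renderer wiring is illustrative:

    using UnityEngine;

    public enum BlueprintState { Default, Invalid, Anchored }

    public class BlueprintTint : MonoBehaviour
    {
        public Renderer blueprintRenderer;

        public void Apply(BlueprintState state)
        {
            blueprintRenderer.material.color = state switch
            {
                BlueprintState.Invalid  => Color.red,      // overlapping: cannot place
                BlueprintState.Anchored => Color.white,    // snapped: placement allowed
                _                       => Color.blue,     // default: following the cursor
            };
        }
    }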

Placement anchors
Anchors are their own prefab components which the blueprinting script detects when within a viable snapping range. Placement indicators are shown to the player while in 'Build mode'.
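
Anchor detection might be sketched with a simple overlap query, assuming the anchors sit on a hypothetical 'Anchor' physics layer:

    using UnityEngine;

    public class AnchorSnapper : MonoBehaviour
    {
        public float snapRange = 2f;

        public Transform FindNearbyAnchor(Vector3 cursorPoint)
        {
            int anchorMask = LayerMask.GetMask("Anchor");
            Collider[] hits = Physics.OverlapSphere(cursorPoint, snapRange, anchorMask);
            return hits.Length > 0 ? hits[0].transform : null; // a near-enough anchor, if any
        }
    }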

Sources Referenced:

- - - - -

Camera Controls & UI Toggle

Camera controls including rotation, field of view, and movement.

15/03/2025

Camera controls
I've followed a tutorial (referenced below) to allow the user to move the camera through the scene. They can either (1) move by dragging the right mouse button across the screen, or (2) keep the cursor at the edge of the screen to edge scroll.

For the mouse dragging functionality, the existing implementation only allowed the camera to move across the distance determined by the mouse drag. This meant that if the user kept the mouse held down, the camera would stop moving after a short period of time (once the original dragging distance had been traversed). The original functionality also followed a 'grab and move' approach, i.e. the drag direction was the opposite of the direction of camera movement. Instead, I wanted the camera movement to be sustained in the direction of the mouse drag for as long as the right mouse button was held down.

To match the intended functionality, I inverted the vector direction of the mouse drag camera movement to better reflect the sustained direction of movement. This was done by simply subtracting the direction vector from the position of the camera's focal point. To make the movement persist while the player holds down the right mouse button, I had to (1) store and update the position vector of the start of the drag in relation to the focal point of the camera, so that (2) the end point of the mouse drag can compute the direction vector of the camera movement against the relative position from step (1). As the map is traversed, step (1) is updated for step (2) to use as a reference when recalculating the movement, keeping the magnitude of the direction (and in turn the velocity) constant.
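
A simplified sketch of the sustained drag, assuming a parent 'focalPoint' transform; the actual implementation recalculates the drag origin as described above, which this sketch condenses:

    using UnityEngine;

    public class CameraDrag : MonoBehaviour
    {
        public Transform focalPoint;                       // parent focal point the camera orbits
        public float dragSpeed = 10f;

        private Vector3 dragStart;                         // screen position at the start of the drag

        private void Update()
        {
            if (Input.GetMouseButtonDown(1))
                dragStart = Input.mousePosition;

            if (Input.GetMouseButton(1))
            {
                Vector3 delta = Input.mousePosition - dragStart;
                if (delta.sqrMagnitude < 1f) return;

                // Normalising keeps the velocity constant regardless of how
                // far the cursor has travelled from the drag origin.
                Vector3 move = new Vector3(delta.x, 0f, delta.y).normalized;

                // Move in the drag direction (not 'grab and move').
                focalPoint.position += move * dragSpeed * Time.deltaTime;
            }
        }
    }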

I've revised how the camera controls operate compared to my previous project, to improve the user experience. The camera rotation is now limited to pivoting around the Y axis only. I've also made it so that the parent camera focal point is the object that pivots, rather than the camera's own rotation and position changing relative to the focal point.

I've also added some hotkeys to allow the user to reset their camera position and rotation. If the user presses [Ctrl] + [Tab] once, the camera returns to its original rotational angle and focal length. If the user presses [Ctrl] + [Tab] again, the camera moves towards the origin of the scene. Both of these transitions have a fixed duration of one second. During this implementation, I learned that 'lerping' with Time.deltaTime multiplied by a scale factor (usually characterised as transition speed) never actually arrives at its final target point; instead, the interpolated value approaches its final value asymptotically. After reading up on some articles, I found an alternative approach that 'lerps' linearly to a target state over a fixed time duration.
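
The difference between the two interpolation patterns, in sketch form; the fixed-duration version is conceptually what the one-second reset transitions use:

    using System.Collections;
    using UnityEngine;

    public class CameraReset : MonoBehaviour
    {
        // Asymptotic: each frame covers a fraction of the *remaining*
        // distance, so the value approaches but never reaches the target.
        private float AsymptoticStep(float value, float target, float speed)
        {
            return Mathf.Lerp(value, target, speed * Time.deltaTime);
        }

        // Fixed duration: t = elapsed / duration runs from 0 to 1, so the
        // rotation arrives exactly after 'duration' seconds (here, one).
        private IEnumerator LerpRotation(Quaternion from, Quaternion to, float duration)
        {
            for (float elapsed = 0f; elapsed < duration; elapsed += Time.deltaTime)
            {
                transform.rotation = Quaternion.Slerp(from, to, elapsed / duration);
                yield return null;
            }
            transform.rotation = to;                       // snap exactly to the target
        }
    }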

HUD fading transitions
I laid the foundations for transitioning the visibility of UI displays using canvas group alpha channels upon certain triggers. This will become useful once certain UI elements become dependent on play modes and triggered events.
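
A minimal sketch of such a fade using a CanvasGroup; the trigger wiring is left out:

    using System.Collections;
    using UnityEngine;

    public class HudFader : MonoBehaviour
    {
        public CanvasGroup group;

        public IEnumerator Fade(float targetAlpha, float duration)
        {
            float start = group.alpha;
            for (float t = 0f; t < duration; t += Time.deltaTime)
            {
                group.alpha = Mathf.Lerp(start, targetAlpha, t / duration);
                yield return null;
            }
            group.alpha = targetAlpha;

            // A fully faded HUD should not intercept clicks.
            group.interactable = group.blocksRaycasts = targetAlpha > 0f;
        }
    }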

Sources Referenced:

- - - - -

Importing Assets

UV mapping of spaceship model.

01/03/2025

Exporting From Blender
Before exporting the models from Blender, I had to make sure the model properties were standardised and compatible for the purposes of development. This included:

  • The object-level scale should be applied so the model is to scale, meaning the underlying local-level mesh is scaled correctly relative to the object. This makes it easier to manipulate the mesh and reference its properties once imported into Unity.
  • To prevent the loss of mesh data, I converted all faces into triangles to avoid intersecting faces, and ensured all face normals were outward-facing.
  • For the shaders to display correctly on the model meshes, the UV map must be recalculated to accommodate graphical transitions over adjacent faces. This was done by using the 'Smart UV project' option.

Once the above was complete, the export process to FBX format is as follows:

  • Set transform scaling to FBX Units Scale.
  • Set transform directions to Z forward & Y up.
  • Check Apply Transform.

To import into Unity:

  • For model import settings, check 'bake axis conversion' & 'read/write enabled'.
  • For animation import settings, remove duplicated animations and check 'loop time'.
  • For material import settings, overwrite with the materials created in Unity.

Dissolve shader on game objects.

Dissolve shader
To begin the main part of development, I created a dissolve shader to animate the creation of in-game objects. I followed an existing tutorial for the base shader graph, and controlled the dissolve rate with a script component. To replace the dissolve shaders with the main static materials of game objects, I created a function that takes a list of static materials from the inspector, pattern matches the substrings with the assigned shader materials of the mesh renderer, and replaces them once the dissolve coroutine has completed. As a result, we now have a script that can receive any number of replacement static materials, as long as each currently used mesh material has a corresponding replacement. Multiple current mesh materials can be replaced by the same replacement static material. The flexibility of this script means that it can be applied to any mesh that requires this dissolve effect.
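
A hedged sketch of that replacement function; the naming convention (e.g. a dissolve material named 'Hull_Dissolve' pairing with a static material named 'Hull') is my own illustration:

    using UnityEngine;

    public class DissolveMaterialSwap : MonoBehaviour
    {
        public Material[] staticMaterials;                 // assigned in the inspector
        public MeshRenderer meshRenderer;

        // Called once the dissolve coroutine has completed.
        public void ReplaceDissolveMaterials()
        {
            Material[] current = meshRenderer.materials;
            for (int i = 0; i < current.Length; i++)
            {
                foreach (var replacement in staticMaterials)
                {
                    // Pair each dissolve material with its static counterpart
                    // by substring matching on the material names.
                    if (current[i].name.Contains(replacement.name))
                    {
                        current[i] = replacement;
                        break;
                    }
                }
            }
            meshRenderer.materials = current;              // reassign the modified array
        }
    }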

Sources Referenced:

- - - - -

A Fresh New Start

27/02/2025

Setting up a new Unity project
After a period of more than 2 years, I've decided to get back into the world of Unity by starting a brand new project. The aim of this project is to develop an enhanced version of one of my previous projects, by leveraging the knowledge I've gained these past two years and paying more attention to the performance of the runtime logic. I will be using a fair share of game assets and scripts from my previous project to accelerate development, as well as new models and improved scripting. Much of the code and editor configurations will be refactored and optimised to improve maintainability, scalability, performance, and readability.

Setting the scene
To first set up the development scene, I gave myself a refresher on how to establish the Universal Render Pipeline (URP). The main purpose of this is to support the shaders and visual effects that will feature heavily throughout the game's development. To do this, I set up a URP Renderer Asset in the project files, mapped the reference in the project's graphics configuration, and added a global volume object to the game's primary scene where development will take place.

* * * * *