
2019

Alchem
AN EXPLORATION IN GAME DESIGN USING MAYA AND UNITY
EDUARDO NODARSE
Table of Contents

Introduction

Modeling, Rigging, and Animating in Maya
    Modeling
    Rigging
    Animating

Development in Unity
    Scripting
        Basics
        Finite State Machines
        Some Game Design Patterns

Introduction

This document serves to provide a comprehensive overview of making an RPG, specifically the game Alchem. Through a lot, and I mean A LOT, of trial and error, I learned a great deal about:

- Music Theory and Creation in FL Studio
- Modeling, Rigging, and Animating in Maya
- Game Development in Unity

Setting out on this independent study is definitely daunting, looking at everything that is required to create a single level. But if one likes to face adversity and overcome it, then creating a game is a perfect project.

Developing this game forced me to look for unique solutions to different problems. One thing to always remember is to stick with your idea. What you want to get done is possible; you just need to come up with interesting solutions and workarounds sometimes. Sure, it may be a bit janky at times, but as a beginning game developer, the most important thing to do is to keep going. A game is not going to develop itself, and if you don't do it, someone else will.

To begin with, a game needs to start from somewhere. It was important to me to create something that I was excited to play. I love the show Avatar: The Last Airbender and have admired how the creators used different styles of Kung Fu to allow their characters to manipulate (or "Bend") the elements. I wanted to create something that involved the same idea of bending elements. I also wanted the player to choose an element, with a different playstyle for each. The fact that the player was going to be able to make choices, and that the game would have different outcomes based on those decisions, leads one to create a Role-Playing Game (RPG).

Next, I created some concept art that would help steer the art direction and allow me to build the world of the game. I began by sketching out a couple of ideas for some characters and felt the need to create a whole new world. The art style was also meant to simplify modeling: the models are simple humanoids, and their nearly featureless faces would save me time worrying about topology and likeness to human features.

While creating the concept art, I kept in mind one core moment that would be the defining moment of the game. At some point, the main character is going to have to face off against enemies in the town square, which will allow the player to use the abilities they've unlocked. At that point I knew what I would need.

Figure 1 – Concept art for the Main Character, weapons, and elemental sages. Eduardo Nodarse. Drawing. 2018.

I decided to use Maya for modeling and animating; however, for rigging the humanoid character I uploaded an FBX of the modeled character to Mixamo. Mixamo is a website that will rig a humanoid character with a basic skeleton. Mixamo also has a comprehensive list of animations that it will bake into the skeleton's joints, after which your character is ready for importing into a game engine. I then grabbed the rigged character and chucked him back into Maya so I could animate him myself.

Don't get me wrong, the animations at Mixamo are amazing; they even have fully modeled characters ready to be dropped into a game engine. However, I felt like that would be cheating, so I animated my character myself. On top of that, animating it myself allowed me to give my animation a bit more character.

Figure 2 – Core game moment where player faces enemies. Eduardo Nodarse. Drawing. 2018.

Now, the character only has a skeleton, with no IK handles or controls, so I had to go into Maya and make them myself. I was guided by Mr. Eduardo Vieira, who has amazing tutorials on YouTube that go through each step of character modeling, texturing, and rigging for any purpose. He explains every step of the way and is really easy to follow along with.

Once the animating was done, importing the model and the animations to Unity was the next

hurdle. Luckily for me, there is another person on YouTube who has a great tutorial on different

methods of getting the animations into Unity. Although he has not posted his name on his channel, Mr.

AnimatorGameDev’s tutorial definitely enlightened me on how to import the animations.

Finally, development in Unity seemed to be one of the hardest things to deal with, due to the

unfamiliar environment and the plethora of material on the game engine. It was overwhelming, but

once I was knee deep in the thick of it, I was okay. Each stage of the process has had many issues, but in

the end, the issues get solved, all thanks to Google. Most of this process is getting something

done, testing it by itself, finding bugs, trying to fix it, googling the problems, screaming, and then peace

when you’ve found the solution.

Modeling, Rigging, and Animating in Maya


Modeling and animating in Maya tend to be both glorious and incredibly frustrating. Glorious when it works the way you believe it should, and frustrating when things happen that you never intended. Glorious because your hard work paid off; frustrating because your hours of hard work just crashed and you didn't save. Glorious because your character has life and moves around as if it is alive; frustrating because when his right arm moves, his left heel moves with it. There are many strange things that happen in Maya, but the internet has answers to all the questions Maya poses.

Modeling
The modeling process begins with the concept image of the character. My character is very simple in his shape, so I did not use image planes to model against. And since the character is symmetrical, I was able to model one side and then mirror the geometry. The first step is to insert a cube into the scene. I then split the cube in half and deleted one half's faces. Next, I selected the cube and used the Duplicate Special tool so I could work on both sides of the model simultaneously; I made sure the tool was set to "Instance" and that the X Scale was set to -1. I then hit the number 3 at the top of the keyboard, which shows what the shape will look like once it is smoothed.

Once I saw that the box had become somewhat of a sphere, I put edge loops around the edges.

By placing the edge loops around the edges, the box begins to retain its shape. This also gives a bit more

flexibility when animating later. I proceeded to extrude faces and add edge loops to create the torso.

Since the top of the torso flares, I double clicked the edge loop to select the entire loop and scaled it out

until I was happy with the shape. I then extruded some faces off the side of the torso to create the arms.

About midway down to the hand, I extruded again, and that edge would serve as the location of the

elbow.

Next, an edge loop was added near the center of the body to create a separation between the middle of the body and the legs. Then the faces excluding the center of the body were extracted to create the upper leg. Around where I believed it to be appropriate, I extruded the faces again; that edge loop then signified the knee. I then pulled the leg down to a length I was happy with.

Figure 3 – 3D character model, version 1.

Next was creating the clothes, which was easy enough. In Maya you can select faces and duplicate them by holding Shift and then holding down the right mouse button (RMB) to open an alternate menu. Towards the bottom, "Duplicate Faces" is an option; this copies the topology of the selected faces, which can then be scaled away from the body. Figure 3 demonstrates this with the shirt and the shorts. After the clothes were created, I applied Wrap deformers to the shirt and the shorts, which allow the mesh underneath (the body) to drive the movement of the clothes.

Figure 3's skeleton rig was done using the Auto Rig system Maya offers. The Auto Rig generates a humanoid skeleton based on the mesh you select, and it only works with humanoid characters. It also provides controls and IK handles that can be used for animation. I proceeded to animate using these controls, unaware of the problems importing the character would later bring. Along the way, I found there were faces that would break through the shirt and shorts, so I began to use the "Paint Skin Weights" tool.

To use this tool, I had to select the mesh and then double-click on the tool to open its properties. In the properties you will see the skeleton hierarchy and how each joint influences each of the vertices. In Figure 4, the option to display the color ramp is selected. The color ramp is used to determine how much influence a joint has over a vertex: black indicates that the joint has no influence over the region, blue indicates slight influence, and the warmer the color, the more influence the joint has. If a region is white, the movements of that bone fully affect that region. Although default weights are assigned when the mesh is skinned, it can still deform improperly, which is why painting skin weights is an important part of modeling.

Figure 4 – Demonstrates the use of the Paint Skin Weights tool.

Once I was happy with the way the character’s polygons were deforming, I proceeded to

animate. After animation I encountered my first problem. Upon exporting the character, I would receive

a warning that the FBX could not resolve the deformer meshes. After trying many different things to remedy this, nothing I did would change the fact that there was a deformer mesh, and the FBX converter really did not like that. Therefore, I had to redo the clothing.

I just deleted the clothes and re-duplicated the faces. The next problem came with the Auto Rigged skeleton. Although it creates a good representation of a humanoid skeleton, I found it very difficult to import the animations based on Mr. AnimatorGameDev's tutorial. In his tutorial, he states that the only things in the scene should be the skeleton (which should be at the top of the hierarchy) and the mesh. With the Auto Rig, there were a lot of references to items in the skeleton, and I had no idea what belonged to what. Furthermore, if I had just followed the tutorial, the result would have been my character's arms and legs not affecting the clothing [4].

It was also about this point when I realized my topology could be improved. I began to delete unnecessary edge loops to reduce the polygon count (poly count) and to move loops around to get more consistent shapes. The topology of a character refers to the layout of its mesh; good topology has very consistently sized polygons. One can achieve good topology by moving edge loops around, scaling them to be even, and moving vertices to gain a better overall look for the character. Good topology makes the character mesh's deformations better and cleaner and reduces the number of artifacts during animation.

Upon attempting to understand how to work weapons into the game, I discovered Mixamo [1][3]. Mixamo is a website that allows you to upload an FBX of a humanoid mesh; it will then prompt you to place locators on parts of the mesh. After this, Mixamo will rig a skeleton and bind the skin to it. I found this to be a great asset, and it saves so much time on skeleton creation. Mixamo also has an extensive library of animations that may be downloaded with your FBX. The animation will be baked into the bones and is editable when the FBX is placed back into Maya [1]. I went ahead and deleted the animations so I could make my own.

One thing to note about Mixamo: when making your model, the legs of your model should be spread out a little, like the image in Figure 5. This will ensure that the polygons in the legs are skinned properly and will not have unwanted deformations.

Figure 5 – Character model (v.2), retopologized and ready for rigging.

Rigging
After the model is reopened in Maya, and the animations are cleared out, it was time to add IK

Handles and Controls to the character to aid in the animating process. Once the IK Handles were placed

on the arms and legs of the character, I added controls to them.

To create a control, begin by creating a NURBS circle. Activate Snap to Point at the top of the Maya screen and place the circle at the base of the IK handle. Next, freeze the transformations: this gives the control a reset position. It is also possible to alter the appearance of the NURBS circle by holding RMB and selecting "Control Vertex"; this displays vertices around the circle that can be moved to alter its shape. Then, once you've shaped the circle, select it, hold Ctrl, and select the IK handle. In the Rigging menu, open Constrain and select Point. The controller will now move the leg or arm. The controller still does not have the power to rotate the leg, so select the controller again, then select the joint at the end of the IK handle (i.e., the ankle or wrist bone), then go to Constrain and head to Orient; open the tool options to ensure that Maintain Offset is ticked. If it is not, the joint will be mangled [2].

To deal with spinal movements, the model needs an IK spline for the spine. Select Skeleton and click Create IK Spline Handle. From there, select the hip joint and then select the joint below the neck. Once the spline is created, Maya also creates a NURBS curve. Holding down D, move the pivot of the new curve to the hip joint's position [2].

Next, create a controller for the hip by placing a NURBS circle at the point, freezing the transformations, and creating a point constraint as before. Similarly, begin the orient constraint, but only constrain the X and Z axes (done by entering the tool settings again and checking only the X and Z boxes). Next, go to Windows -> General Editors -> Connection Editor. A window with two sides should pop up. Select the hip controller in the Outliner and click Reload Left in the Connection Editor. Then select the spine's IK spline and click Reload Right. On the left side, scroll down to "rotate" and select "rotate Y"; on the right side, find "roll" and select it. Connecting these attributes allows for tilting and bending at the hips [2].

The chest controller is a bit different from the other controllers. Place the chest controller in the

correct area and freeze the transformations. Open the connection editor again and load the chest

controller on the left side, and the IK spline on the spine again on the right. Again, find “rotate Y” on the

left, and then find “twist” on the right [2].

To control the neck/head, create a control around the head like all the other controllers, freeze the transformations, and apply an orient constraint on the neck joint, making sure that all the axes are constrained [2]. If you want the head to translate as well, you can also add a point constraint, but that is up to you.

Figure 6 – Character with full control rig

Next, create the controllers for the knees and elbows. These controls will also help with proper

rotation of the arms and legs. For the following example, I will be describing the process for the knee;

however, this process applies to both the knees and the elbows. Place the NURB circle at the knee,

move it forward, and freeze its transformations. Then select the leg IK Handle and the knee controller

and go to the Constrain menu and find Pole Vector. Once that has been applied, the knee will follow

that point, enabling a better rotation for the leg [2].

Finally, the Master Controller is the last control that is needed. Simply place a NURBS circle where you want the main controller to be, freeze its transformations, and then parent all the other controllers to it. This controller controls the overall movement of the character, as seen in Figure 6 [2]. Since the controllers are independent of one another, they would otherwise all need to be moved individually, and they would not work with each other. The best thing to do at this point is some further parenting, to ensure more natural movement of the character for animation. The hierarchy should look something like Figure 7 [2].

Figure 7 – Example of how the Controller Hierarchy should look.

Animating
Something to always consider when animating is the 12 principles of animation, which were described by Disney animators Frank Thomas and Ollie Johnston. These men worked on the Disney animated movies from Snow White and the Seven Dwarfs through The Fox and the Hound. The 12 principles set a standard for making animations feel more lifelike. Mr. Alan Becker has a video on YouTube that describes each of the principles in great detail [5].

Animating the character for the game was a lot easier than the rigging process. The first thing I do is save the T-pose as the first pose; that way I always have a blank slate that I can copy and paste later down the line for any poses that need a total reset. Next, I created a relaxed standing pose and began to animate an idle animation, which plays when the player is just standing around. I began thinking about what he should do, and at first I had no idea what I wanted him to do.

To help get ideas, I stood up, and paid attention to myself as I just stood in one spot. It’s

interesting to note that we, as humans, move a lot even when we stand still. We will shift our weight

from one leg to the other, look around, scratch ourselves, and a myriad of other things. I then figured

out what I wanted my character to do when idling and set to animating it.

One thing I would do is start the animation with a one-second interval between the poses and see how that felt. After watching it, I would adjust the timing by selecting the animated keys and sliding them forward or backward, depending on whether I wanted the motion to be quicker or slower.

If I already knew how long I wanted the animation to be, I would set up the first frame, pose the character and key its pose, jump to the last frame and pose the character there, and then animate the in-between poses at equal time intervals. Similarly, if the animation felt too long or too short, I would grab

the keyframes and move them around. Sometimes, the animation would feel too choppy, so one thing I

would do is delete keyframes in between poses. This allowed for more fluid animations, as well as more

interesting animations.

If during an animation I noticed a problem with the transition from one pose to the next (i.e., the rotation of the arm goes haywire), I would open the Graph Editor and check the

rotations and translations of the controller. If there was too much rotation, I would reduce it by

eliminating a key frame. If there were excessive movements caused by the way the curves are

connecting to the keys, then I would select the affected keys, and flatten the curve out. If neither of

these things would help, then I would begin to add keyframes in the in-betweens to force the animation

to behave the way I wanted it to. I would then see if I could eliminate some key frames to get a more

fluid motion.

Although adding some key frames may help with the animation, there is the possibility that the

animation will begin to look choppy, at which point I would try another technique. I would zero out the

rotations on the handle at the end of that animation, and then re-rotate the handle until I get the

behavior I want and then save the key frame. Although this initially deforms the character more, it

allows for a smoother interpolation between the points.

Another thing that should be done is to exaggerate each of the animations. If one tries to keep it

too realistic, the animation falls flat. By adding some extra bending, twisting, and making animations a

bit more dynamic, the animation then truly seems to become alive.

When it comes to exporting the animations to Unity, Mr. AnimatorGameDev’s tutorial really is

the one to watch. In the tutorial, he explains two different methods of exporting the animations to

Unity. One of the methods requires the following steps. I had to export everything to an FBX except for the animation. This FBX then gets opened in a new scene, in which all the IK handles and controls get deleted, leaving only the mesh and the skeleton; the skeleton must also be at the top of the hierarchy. Next, Export All again and overwrite the FBX file. For this example, I will refer to this FBX as FileName.fbx.

Next, I had to go back to the scene that had the animations and select all the joints. Once all the

joints were selected, I had to Bake the simulation into the joints [4]. Baking the Simulation is as simple as

clicking Edit -> Keys -> Bake Simulation. This goes through every frame and saves each selected bone's position as a keyframe. The animations are saved into the bones, and this renders the IK handles and their controls useless, so it is very important not to save the project in this state.

Next, select all of the joints and export selected. This time export the animation with the skeleton [4].

To make things a little simpler in the transition to Unity, it is helpful to save the names of the animation files as "FileName@Animation.fbx", where Animation is the name of the animation that is being saved. I created a copy of this FBX for each of the animations and renamed them according to their proper animations; for clarity, refer to Figure 8. By following this example, when the models and animations are imported into Unity, the Unity project will map the animations to the appropriate Game Object [4].

Figure 8 – An example of how to name the animations [4].

Development in Unity
Unity is a game engine, which, as defined by Unity, "is the software that provides game creators with the necessary set of features to build games…" [6]. Unity provides a plethora of features that allow one to create almost any kind of game. The imagination of the creator is practically the only limit of this game engine. At first it is intimidating, due to all the bells and whistles introduced to the creator; however, it becomes easier after exploring the engine for a while. Unity allows you to create your own game assets, import them, add scripts, and save a game object as a "prefab".

Unity also has a defined flow of execution for code derived from the base class MonoBehaviour. Deriving from MonoBehaviour is what allows a script to be used as a component in the engine, and it dictates the order in which the functions that auto-populate a C# script are called. It is also good to note that every object in Unity is accessible via script.

Not only can animations be imported into Unity, animations can also be created in Unity. When creating animations in Unity, one can even tie events to animations, so game moments that require specific timing can be handled with what is known as an "animation event". Unity allows the creator to animate the player, as well as any game object that has an animation controller.

On top of creating animations, Unity also has a particle system that allows one to create special effects such as fog, fire, water, and much more. The particle system can even emit text and images to convey information to players efficiently. It may add flourish to your character or let things disappear more easily in a cloud of dust. The possibilities are endless.

Scripting
Basics
Any time you create a script in Unity, it derives from the MonoBehaviour class, which gives components the ability to be dragged and dropped onto game objects. With this capability, it is even possible to access different components on the same game object, or on its children, with simple function calls. Each function in the base MonoBehaviour class gets called at a specific time during execution. Figure 9 depicts this flow with a flow chart.

Figure 9 – Flow of execution in the Unity Game Engine [7].

The main functions I used in this project were Awake(), Start(), Update(), FixedUpdate(), and OnTriggerEnter(). I also used the OnDisable() function for delegates, which will be covered later [7]. Each of these functions serves a purpose. For instance, Awake() is the very first function called on a game object [7].

I used Awake() quite a lot to do preliminary initialization of certain game objects. More specifically, I tended to use Awake() to create a singleton of a game object.

A singleton is a game object that can only have one instance running at a time [8]. For example, I have a prefab store that holds all the prefabs I need to instantiate a certain game object. The game objects are stored in a class called PrefabStore as public game objects.

This allows me to grab the prefab directly from my project and drag and drop it into position in the Inspector. To ensure that this prefab store only has one instance running at a time, a public static PrefabStore variable is declared and initialized to null.

In the Awake() function, I check whether the instance is null; if it is, the static instance is set to the current instance of the prefab store. If the instance is not null, then there are two instances currently running, and the script deletes the latest instance of the prefab store. By doing this in Awake(), it is guaranteed to be among the first things done at

runtime.
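Put together, a minimal sketch of this singleton pattern might look like the following. The class name PrefabStore and static instance field follow the description above, but the exact bodies, and the example prefab field name, are assumptions rather than the project's actual code:

```csharp
using UnityEngine;

// Sketch of the PrefabStore singleton described above (details assumed).
public class PrefabStore : MonoBehaviour
{
    // The single shared instance; null until the first Awake() runs.
    public static PrefabStore instance = null;

    // Prefabs dragged in through the Inspector (name is illustrative).
    public GameObject fireballPrefab;

    void Awake()
    {
        if (instance == null)
        {
            // First copy in the scene: claim the singleton slot.
            instance = this;
        }
        else if (instance != this)
        {
            // A second copy exists; destroy the newcomer.
            Destroy(gameObject);
        }
    }
}
```

Any other script can then reach the store as PrefabStore.instance without holding its own reference.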

Start() is called after Awake(), and it is only called once per component [7]. As another initialization step, it is used to set up objects that must be initialized after Awake() has been called. For instance, in this game, the main character has two colliders at the ends of his hands with a Melee.cs component. This component needs a reference to the main character's stats, which is held in a script called PlayerStats.cs attached to the root game object. Unity has a couple of different ways to ensure an object has been instantiated; but considering the flow of script execution order, it is simplest to initialize everything in Awake() in PlayerStats.cs, then link the instance reference in Melee.cs's Start() function.
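A sketch of that ordering follows. The class names PlayerStats and Melee come from the text; the field names and bodies are assumptions. PlayerStats claims its instance in Awake(), and Melee grabs the reference in Start(), which Unity guarantees runs after every Awake():

```csharp
using UnityEngine;

// Attached to the root player object; initializes itself first, in Awake().
public class PlayerStats : MonoBehaviour
{
    public static PlayerStats instance;
    public int attackDamage = 5;   // illustrative stat

    void Awake() { instance = this; }
}

// Attached to the hand colliders; links up later, in Start().
public class Melee : MonoBehaviour
{
    PlayerStats stats;

    void Start()
    {
        // Safe here: every Awake() has already run by this point.
        stats = PlayerStats.instance;
    }
}
```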

Although the link between PlayerStats.cs and Melee.cs is solved simply by rearranging the code into different functions, there are cases where it is necessary to ensure that one script runs before another. Unity allows us to control the execution order of each script as well. To order the scripts, click Edit -> Project Settings -> Script Execution Order, then click the small "+" on the bottom right-hand side of the Inspector tab. A window will pop up with a comprehensive list of the scripts in the project; select the script you wish to add to the execution order. You can then rearrange the scripts' execution by dragging them to their respective locations.

After Start(), FixedUpdate() is called. This function takes care of the in-game physics. It is in these steps that player movement should take place, since this allows for proper collision and trigger detection when objects collide with one another. This function can also be called multiple times in one frame, depending on the physics time step dictated in the project settings [7]. I used this function for the movement of the player: the player can move only if there is nothing preventing him from moving (indicated by a "canMove" Boolean). In this manner, the player can walk wherever, and the in-game physics engine keeps track of the collisions.

Figure 10 – Snippet of code from PlayerControler.cs

Similar to FixedUpdate(), Update() is another callback function that runs continuously. Update() is called once per frame and can also be used to cause movement; however, in this project it was mainly used for timers and finite state machines, which will be discussed later on. Similar to FixedUpdate() and Update(),

LateUpdate() is another function that is called every frame, but only after Update() has finished. Camera movement is recommended to go in this function, which allows the final physics and movement calculations to complete before the camera follows or moves to its position, resulting in smoother motion.
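A hedged sketch of how these callbacks can divide the work: the canMove flag and the PlayerControler script name come from the text above, while the speed value, Rigidbody-based movement, and the FollowCamera class are assumptions about the implementation:

```csharp
using UnityEngine;

public class PlayerControler : MonoBehaviour
{
    public bool canMove = true;   // cleared while something locks movement
    public float speed = 4f;      // illustrative value
    Rigidbody rb;

    void Awake() { rb = GetComponent<Rigidbody>(); }

    // Physics-rate movement, so collisions and triggers register correctly.
    void FixedUpdate()
    {
        if (!canMove) return;
        Vector3 input = new Vector3(Input.GetAxis("Horizontal"), 0f,
                                    Input.GetAxis("Vertical"));
        rb.MovePosition(rb.position + input * speed * Time.fixedDeltaTime);
    }
}

// Camera follow runs in LateUpdate(), after all movement has settled.
public class FollowCamera : MonoBehaviour
{
    public Transform target;                         // the player
    public Vector3 offset = new Vector3(0f, 8f, -6f);

    void LateUpdate() { transform.position = target.position + offset; }
}
```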

The last of these to cover here, though certainly not the final function in the class, is the OnDisable() function. This function is called upon deactivation of a game object, and it is usually where one puts code that must run before the game object is deactivated or destroyed. As mentioned previously, this is where one would unsubscribe from delegates [7].
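A small sketch of that delegate cleanup, where the event name OnPlayerDied and the HealthBar class are assumptions introduced for illustration:

```csharp
using System;
using UnityEngine;

public class HealthBar : MonoBehaviour
{
    // Illustrative delegate some other system would raise.
    public static event Action OnPlayerDied;

    void OnEnable()  { OnPlayerDied += HandleDeath; }

    // Unsubscribe here so a disabled or destroyed object never
    // receives the callback (which would throw at runtime).
    void OnDisable() { OnPlayerDied -= HandleDeath; }

    void HandleDeath() { /* update the UI */ }
}
```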

These basic functions are available to any script that derives from the MonoBehaviour class, which is necessary for a script to be attached to an object. They allow us as game developers to begin constructing the game using different techniques and game design patterns. Overall, these functions are the building blocks of the game engine and game loop. It is also important to note that these are not the only functions MonoBehaviour provides; some of the others will be covered in later sections of this report.

Finite State Machines


One of the first things to understand in Unity is the power of Update(). Since this function is called once every frame, it is a place of constant change, and therefore one thing that can be done with it is to create a Finite State Machine (FSM). An FSM is a "computation model that…can be used to simulate sequential logic", according to Moore and Gupta [9]. These systems are a collection of discrete behaviors that accept some form of input and, based on that input, change behavior.

To implement this in Unity, the FSM needs an enum to hold the states, a variable that keeps track of the current state, a switch statement, and a function for each of the states. I am going to describe the process of an FSM using the fireball ability in Alchem for a bit of clarity.

Figure 11 – Demonstrates the enum for the fireball ability.

Figure 11 shows the states for the FSM; each of the states corresponds to a function that is called in Update(), as illustrated in Figure 12. The logic behind this is that the state variable is initialized to Ready. Upon entering Update(), state is evaluated, and the switch statement allows the correct function to be executed. For instance, in the READY function, the machine waits for the proper key to be pressed before proceeding to the Targeting state.

To change the state, the state variable is updated when the input is received, using an if

statement as below:

if (Input.GetKeyDown(Trigger)) {
    // do stuff
    state = States.Targeting;
}
The preceding code segment executes a

function that returns true on the one frame that

the Trigger key has been pressed by the user

and therefore allowing the state change by just

changing the state variable to the next state.

Once the state changes, the Update() function is called again on the next frame and evaluates the state variable again, jumping to the TARGETING function, and so on and so forth until it reaches the Cooldown state, in which the ability cannot be used again for a set number of seconds. After the cooldown time has passed, the FSM returns to the Ready state; the code that handles this is described below.

Figure 12 – FSM in Update

The great strength of the FSM is that it allows a program to behave in a specified way based on inputs that we as game developers dictate. There are downsides, however. When there is a bug, it may be hard to determine where it is coming from, and therefore it is imperative to test at all points during the development of the FSM.

For example, at one point during the creation of the fireball ability, the FSM would enter the Cooldown state and then immediately jump back to the Ready state, which was not the desired behavior. The idea was to have the FSM stay in the Cooldown state until the cooldown timer reached the time specified in the ability's code. The solution was to use a helper function that changes the state, together with another of the MonoBehaviour functions, Invoke, whose signature is below:

public void Invoke(string methodName, float time); [7]

This function calls the method named methodName after a delay of time seconds.

Although this solved the issue of changing the state to Ready at the appropriate time, another issue arose: the Invoke call was being made every frame while the FSM was in the Cooldown state. In other words, if the game ran at 32 frames per second, then the Update function would be called 32 times in that one second, and by extension, COOLDOWN() would also be called 32 times. The state would therefore be set to Ready 32 times after the initial change at the cooldown time.

To combat this newly arisen problem, a latch was implemented. A latch is a simple concept that acts as a one-way lock. For instance, a bool that is accessible in multiple states is initialized to true and left alone until it is time to lock it, in this case after the first time Invoke is called. If we then place the Invoke call inside an if statement that checks whether the latch is true (locking the latch inside that branch), we have a COOLDOWN function that invokes the helper function to change the state only once.
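A rough sketch of the cooldown state with the latch follows. The cooldownTime field and the helper name BackToReady are my own choices for illustration, not necessarily the names used in Alchem:

```csharp
private bool latch = true;          // one-way lock, re-armed when the state resets
public float cooldownTime = 3.0f;   // seconds the ability stays unusable

void COOLDOWN()
{
    // Update() calls this every frame while in the Cooldown state,
    // but the latch guarantees Invoke is scheduled exactly once.
    if (latch)
    {
        latch = false;
        Invoke("BackToReady", cooldownTime);
    }
}

// Helper function that Invoke calls after cooldownTime seconds.
void BackToReady()
{
    state = States.Ready;
    latch = true;   // re-arm the latch for the next use of the ability
}
```

Without the `if (latch)` guard, Invoke would be scheduled once per frame for the entire cooldown, producing the stream of redundant state changes described above.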

A general tip is to draw out the FSM with circles representing each state and arrows pointing to other states as transitions. Plan out what the transitions will be, and keep in mind that the function representing a state will be called once per frame. Another tip is to limit the number of function calls in each of the states. For small games this is not a major issue; however, every function call made in an Update function must be called and returned from within a single frame. The more calls that are made, the longer it takes to finish processing that one frame, which may lead to a jumpy, laggy game.

Some Game Design Patterns


During the course of a game's development, one will run into problems that require some ingenuity to solve. Although there are plenty of unique challenges a developer may face, many problems in games are generally well understood and have established solutions that can be implemented. The Singleton and Observer patterns are two design patterns that help with some of these prevalent issues while working on a game.

The Singleton design pattern, as mentioned previously, allows only one instance of an object to exist at any given time [8]. This can reduce the time it takes to find an object when only one instance of it should be present in the game. In the example of Alchem, I already mentioned the singleton that contains the prefab store used for the instantiation of various objects (see page 16). However, I will discuss another singleton used in the game: the player.

In Alchem, there should only be one player in the scene at any time. Since many things in the game focus on the player, it is best to have a way to get the player that is faster than searching the hierarchy. By creating a public static instance of the class Player, I was able to access the player at any time from any class. Although this may not be recommended for all games, it was a good fit for the problems unique to Alchem.

The Singleton pattern is created by having a static instance of a class be initialized to null; then, in either Awake or Start (depending on how things are initialized in your game), the instance is checked. Since the instance variable is static, all instances of the class share the same variable. If the instance variable is null, no other instance of the class exists, and that instance becomes the singleton. However, if the instance variable is not null, an instance already exists, and the game object that holds the new instance is destroyed, ensuring that the old instance is the only one running.

Figure 13 – Example of singleton pattern in PlayerStats.cs from Alchem
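The check described above (and shown in Figure 13) looks roughly like this; the class and field names are illustrative and not necessarily identical to those in PlayerStats.cs:

```csharp
using UnityEngine;

public class Player : MonoBehaviour
{
    // Shared by every instance of the class because it is static.
    public static Player instance = null;

    void Awake()
    {
        if (instance == null)
        {
            // No singleton exists yet, so this instance becomes it.
            instance = this;
        }
        else if (instance != this)
        {
            // A singleton already exists; destroy this duplicate's
            // game object so the original remains the only one.
            Destroy(gameObject);
        }
    }
}
```

Awake is used here because it runs before Start, so the singleton is claimed before other scripts try to read it.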

The singleton pattern's strength really comes from its simplicity of access. For instance, in Alchem, some of the ability targeters can only be moved in relation to the transform of the player. Since these targeters' positions must constantly be updated, their positioning is set in an Update function. If one were to get the player component by any other means, the Unity engine would have to search each component on the game object until it found the correct one, every frame, adding strain on the system. The singleton allows the player component to be accessed by anything that needs to read information from it, without needing a cached reference to the game object.
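For example, a targeter's Update might read the singleton directly; the Player.instance name and the offset field here are assumptions for illustration:

```csharp
using UnityEngine;

public class AbilityTargeter : MonoBehaviour
{
    // Where the targeter sits relative to the player (illustrative value).
    public Vector3 offset = new Vector3(0f, 0f, 2f);

    void Update()
    {
        // One static field access per frame instead of searching the
        // hierarchy or the game object's component list every frame.
        transform.position = Player.instance.transform.position + offset;
    }
}
```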

As the singleton pattern's strength comes from its simplicity, the observer pattern's strength comes from its ability to reduce polling. Polling is repeatedly checking a value, often once per frame, to see whether something has changed; the observer pattern instead notifies interested objects only at the moment an event actually occurs.
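A minimal sketch of the observer pattern using C# events follows; the class and event names are my own, chosen for illustration rather than taken from Alchem:

```csharp
using System;

public class Health
{
    // Observers subscribe to this event instead of polling hp each frame.
    public event Action OnDeath;

    private int hp = 100;

    public void TakeDamage(int amount)
    {
        hp -= amount;
        if (hp <= 0)
        {
            // Notify every subscriber once, at the moment it happens.
            OnDeath?.Invoke();
        }
    }
}

// An observer registers once and then does no per-frame checking:
//     health.OnDeath += () => gameOverScreen.Show();
```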

Table of Figures
Figure Page Number
Figure 1 3
Figure 2 4
Figure 3 6
Figure 4 7
Figure 5 8
Figure 6 11
Figure 7 11
Figure 8 14

Resources
[1] "Mixamo", Mixamo.com, 2019. [Online]. Available: https://www.mixamo.com/#/.

[Accessed: 26- Jan- 2019].

[2] E. Vieira, Autodesk Maya 2018 - Simple Character Rigging Part 1 of 3. YouTube, 2019. [Online]. Available: https://www.youtube.com/watch?v=cOokoFED7QE&list=PLSUtzVs5X8URhplyNJ24ZztO3tmEPSJfG.

[3] B. Chang, Unity and Maya animation tutorial: item and weapon attachments. Vimeo, 2017. [Online]. Available: https://vimeo.com/213514630.

[4] AnimatorGameDev, Maya to Unity Tutorial: Exporting Rigs and Animation as FBX. YouTube, 2017. [Online]. Available: https://www.youtube.com/watch?v=4WsqAlMDpTQ&t=624s.

[5] A. Becker, 12 Principles of Animation (Official Full Series). YouTube, 2017. [Online]. Available: https://www.youtube.com/watch?v=uDqjIdI4bF4.

[6] "Game engines - how do they work? - Unity", Unity, 2019. [Online]. Available:

https://unity3d.com/what-is-a-game-engine. [Accessed: 21- Feb- 2019].

[7] U. Technologies, “Unity User Manual (2018.3),” Unity, 13-Mar-2019. [Online]. Available:

https://docs.unity3d.com/. [Accessed: 05-Apr-2019].

[8] R. Nystrom, "Singleton · Design Patterns Revisited · Game Programming Patterns," Gameprogrammingpatterns.com, 2019. [Online]. Available: http://gameprogrammingpatterns.com/singleton.html. [Accessed: 15-May-2019].

[9] K. Moore and D. Gupta, "Finite State Machines | Brilliant Math & Science Wiki," Brilliant.org, 2019. [Online]. Available: https://brilliant.org/wiki/finite-state-machines/. [Accessed: 23-Jul-2019].
