AI Component - Boid Movement

In 1987 a researcher named Craig Reynolds published a conference paper (Reynolds, 1987) that described the movement of flocks using a set of three simple rules. 

  1. Each member of a flock tends towards the flock's center
  2. Each member aligns its rotation with its neighbours
  3. Each member tries not to overcrowd the others

These were referred to in the paper as Cohesion, Alignment, and Separation respectively. The goal was to create a realistic-looking emergent movement system for birds/fish/cows where each agent, called a boid, makes its own decisions. 
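A minimal sketch of how those three rules can combine in code (hypothetical Python, not Reynolds' own formulation; the blend weights and the `crowd_limit` threshold are my own invented numbers):

```python
# Each boid is a dict with 2D "position" and "velocity" tuples (assumed shape).

def average(vectors):
    # Component-wise mean of a list of 2D vectors.
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(2))

def steer(boid, neighbours, crowd_limit=5):
    if not neighbours:
        return boid["velocity"]  # nothing nearby: hold course
    centre = average([b["position"] for b in neighbours])   # Cohesion target
    heading = average([b["velocity"] for b in neighbours])  # Alignment target
    px, py = boid["position"]
    to_centre = (centre[0] - px, centre[1] - py)
    if len(neighbours) > crowd_limit:                       # Separation: too crowded,
        to_centre = (-to_centre[0], -to_centre[1])          # steer away from the centre
    # Blend: keep most of the current velocity, nudge by the two rule vectors.
    vx, vy = boid["velocity"]
    return (0.8 * vx + 0.1 * to_centre[0] + 0.1 * heading[0],
            0.8 * vy + 0.1 * to_centre[1] + 0.1 * heading[1])
```

Each tick every boid recomputes this independently from purely local information, which is where the emergent flock behaviour comes from.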

So you are a developer and want a flock system in your game! Maybe a realistic group of birds? Fish in a koi pond? DO NOT USE THIS SYSTEM. You can get 90% of the way there with a single actor containing multiple meshes. It will be easier to test, as the movement is not emergent, and it will cost far less in performance.  

Use Reynolds' system if - 

  • you have multiple flocks that you want to interact with each other
  • you want parts of the flock to react to something, and then have unaware members of the flock react to that reaction
  • you want to decentralize the spawning of flock members

Here is how we implemented it and the levers we gave the level/game designers to use.  

We started with a few principles: we wanted the performance to be scalable, and we wanted everything to be as event-driven as possible.
Querying currently overlapping actors is an alluring siren when doing work like this, but that route leads to performance issues. 
We also wanted the majority of the attributes to be editable at runtime.

Behavior Tree


Each of the boids runs this tree.

Of these, the most expensive is easily the FindNeighbours node. For lower-performance devices, or situations where the flock is going to get particularly big, we can add a cooldown to the root node to decrease the tick rate.

FindNeighbours - Queries the list of currently relevant Boids and gets the average rotation and location to be used in later nodes.
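A sketch of what FindNeighbours computes (hypothetical Python, not our actual Behavior Tree node; the `location` tuples and `yaw`-in-degrees fields are assumptions). One subtlety worth showing: headings must be averaged via their vector components, otherwise 350° and 10° "average" to 180° instead of 0°.

```python
import math

def find_neighbours(boid, flock, radius):
    # Keep every other boid within the detection radius.
    px, py = boid["location"]
    return [b for b in flock
            if b is not boid
            and math.dist((px, py), b["location"]) <= radius]

def average_location(neighbours):
    xs = [b["location"][0] for b in neighbours]
    ys = [b["location"][1] for b in neighbours]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def average_yaw(neighbours):
    # Average angles via unit-vector components so wraparound works.
    x = sum(math.cos(math.radians(b["yaw"])) for b in neighbours)
    y = sum(math.sin(math.radians(b["yaw"])) for b in neighbours)
    return math.degrees(math.atan2(y, x))
```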

CollisionAvoidance - Uses an "ahead" method: we check whether there is an object coming up that we will overlap with if we keep our current course. If there is, the node checks to the left and right of the Boid to see if anything stops it from turning.
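The ahead probe can be sketched like this (hypothetical Python, with circular obstacles standing in for the engine's collision traces; the 45° probe angles and return values are my own assumptions):

```python
import math

def blocked(point, obstacles):
    # Obstacles are (x, y, radius) circles; a point inside any circle is blocked.
    return any(math.dist(point, (ox, oy)) < r for ox, oy, r in obstacles)

def avoidance_turn(pos, yaw_deg, obstacles, look_ahead=50.0):
    # Project a probe point look_ahead units along a heading.
    def probe(angle_deg):
        a = math.radians(angle_deg)
        return (pos[0] + look_ahead * math.cos(a),
                pos[1] + look_ahead * math.sin(a))
    if not blocked(probe(yaw_deg), obstacles):
        return 0.0        # current course is clear, no turn needed
    if not blocked(probe(yaw_deg - 45), obstacles):
        return -45.0      # left is clear: turn left
    if not blocked(probe(yaw_deg + 45), obstacles):
        return 45.0       # right is clear: turn right
    return 180.0          # boxed in: hard turn around
```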

Cohesion and Separation - Using the central location of the Boids the current Boid is aware of, its rotation is altered to either angle the movement towards the center or, if the Boid is suffocating, away from it.

Alignment of Rotation - This simply sets the rotation to the average rotation of the other Boids in the area.

Alignment - This node simply updates the Actor with the calculated rotation.

Design Levers

[Image: Levers For Use]

Most of these are self-explanatory; speed & rotation speed are the easiest. Speed variance, when enabled, gives the actor a random speed in the range of the set speed. Noise introduces an intermittent wobble to the movement of the actor. Suffocation decides how many other actors the Boid can be near before it starts to move away from the center of the group. 
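The speed variance and noise levers might look something like this (a hypothetical sketch; the 0.5-1.5x variance range and the wobble chance are invented defaults, not our shipped values):

```python
import random

def spawn_speed(base_speed, variance_enabled, rng=random):
    # Speed variance: pick a per-boid speed once, in a range around the set speed.
    if not variance_enabled:
        return base_speed
    return rng.uniform(0.5 * base_speed, 1.5 * base_speed)

def noisy_yaw(yaw, noise_strength, wobble_chance=0.1, rng=random):
    # Noise: an intermittent wobble, applied on a fraction of ticks.
    if rng.random() < wobble_chance:
        return yaw + rng.uniform(-noise_strength, noise_strength)
    return yaw
```

Rolling the variance once at spawn (rather than every tick) keeps a flock looking varied without individual boids jittering between speeds.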

Hard turn rotation is the rotation speed used when turning to avoid a collision. It should be increased if the level is particularly angular. 

The Rotation Detection Distance decides the area a given Boid uses to find the other Boids it aligns its rotation to. The Separation Detection Distance is used when deciding on the center point to move towards. 

The reason these are separate collisions is that there are swarms that will want to stay together but not necessarily point in the same direction; think cows eating grass in a field. Other swarms will care about pointing in the same direction, but not about being all that near each other; think a fleet moving through space. 
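As an illustration, the two radii let the same system express both styles (hypothetical presets; every number here is invented):

```python
cow_herd = {
    "rotation_detection_distance": 100.0,    # barely align headings
    "separation_detection_distance": 800.0,  # strongly aware of the group's center
}
space_fleet = {
    "rotation_detection_distance": 2000.0,   # align headings across the whole fleet
    "separation_detection_distance": 150.0,  # only avoid immediate neighbours
}
```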


This system gets a bit weird once you have areas with sharp corners or start thinking in 3D space, but it is a great place to start. 

Reynolds, C. W. (1987, August). Flocks, herds and schools: A distributed behavioral model. In ACM SIGGRAPH computer graphics (Vol. 21, No. 4, pp. 25-34). ACM.

Psychology Paper Analysis - 2

Heuristic evaluation for games: usability principles for video game design.

- David Pinelle, Nelson Wong, & Tadeusz Stach.
In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1453-1462). 

AKA Your game needs to do more than compile

In this paper the researchers went through a few hundred game reviews and identified all the issues that the reviewers highlighted. They then coded the issues and created a set of heuristics that would solve them.
Quick side note here: coding in psychology is essentially categorisation, but with a little more rigor. 

Not to bury the lede, here are the heuristics! 

  1. Provide consistent responses to the user’s actions. 
  2. Allow users to customize video and audio settings, difficulty and game speed.
  3. Provide predictable and reasonable behavior for computer controlled units.
  4. Provide unobstructed views that are appropriate for the user’s current actions.
  5. Allow users to skip non-playable and frequently repeated content.
  6. Provide intuitive and customizable input mappings.
  7. Provide controls that are easy to manage, and that have an appropriate level of sensitivity and responsiveness.
  8. Provide users with information on game status. Users make decisions based on their knowledge of the current status of the game.
  9. Provide instructions, training, and help. 

                (Pinelle, D., Wong, N., & Stach, T., 2008, p. 1458). 

How to use this information -
If you are stuck in an endless bug prioritization meeting, the heuristics provided can be a tool to argue that your pet bug should be the one that gets fixed. 


Clipping issues - easy to identify, impossible to fix

Emotional response patterns and sense of presence during video games: Potential criterion variables for game design.

- Ravaja, N., Salminen, M., Holopainen, J., Saari, T., Laarni, J., & Järvinen, A.
In Proceedings of the third Nordic conference on Human-computer interaction (pp. 339-347). ACM.

AKA What can we actually detect in our users

This is a statistical grab-bag approach to research: the team involved pretty much grabbed every psychological measurement scale and questionnaire they had nearby and asked people to fill them out after a gaming session. 

This is an old paper, done in 2004, but it predicts the rise of the Souls style of game. They asked participants to play one of four different types of games and fill out a ton of questionnaires afterwards. 

They tested for - Violence, Arousal, Joy, Depression, Pleasant Relaxation, Fear, Anger, Spatial Presence, Engagement, Ecological orientation, Negative effects, Impulsiveness, Sensation seeking, Self forgetfulness. All showed different levels after playing different games. 

How to use this information -
When you are doing user testing, instead of asking your own random questions, use one of the standardised scales that are out there. There are peer-reviewed scales for Anime Genre Fandom, Game Fan Entitlement, and Fantasy Immersion. It's all out there, use it. 


Mass Effect scanning is slightly less tedious than reviewing psychology scales

Influence of temperament and character on online gamer loyalty: Perspectives from personality and flow theories. 

- Huang, H. C., Huang, L. S., Chou, Y. J., & Teng, C. I. (2017). 
Published in Computers in Human Behavior, 70, 398-406.

AKA These hoes ARE loyal

This was a larger regression study that had the goal of analyzing a player's traits or temperament to see if they will be "loyal". Loyalty in this regard is defined as intent to return to that game; in industry parlance, sticky. This study focused on online, service-oriented games.

The main outcome was that ***SELF REPORTED*** levels of skill and challenge predict how loyal the player is. If the player is highly skilled and they feel challenged, they will return. The self-reported aspect of this is highlighted because measures that directly ask a participant something tend to be unreliable: people tend to under-report, over-report, or give wildly incorrect assessments on self-reported scales. 

Flow, as it was defined in this research, did not reach the level of statistical significance. This makes sense as they used telepresence (the feeling of inhabiting a character or place) as a contributing factor and that is hard to maintain over 1000+ hours of play.

How to use this information -
If you are building a PvE game as a service, the challenge needs to keep scaling if you want to avoid losing players.


I like to think my incredible skill is why I have 3000 hours in TF2


Huang, H. C., Huang, L. S., Chou, Y. J., & Teng, C. I. (2017). Influence of temperament and character on online gamer loyalty: Perspectives from personality and flow theories. Computers in Human Behavior, 70, 398-406.

Pinelle, D., Wong, N., & Stach, T. (2008, April). Heuristic evaluation for games: usability principles for video game design. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1453-1462). ACM.

Ravaja, N., Salminen, M., Holopainen, J., Saari, T., Laarni, J., & Järvinen, A. (2004, October). Emotional response patterns and sense of presence during video games: Potential criterion variables for game design. In Proceedings of the third Nordic conference on Human-computer interaction (pp. 339-347). ACM.

Psychology Paper Analysis - 1

The relationships between the structural video game characteristics, video game engagement and happiness among individuals who play video games

- Derek A. Laffan, John Greaney, Hannah Barton, & Linda K. Kaye
Published in Computers in Human Behavior, 65 (2016), 544-549

AKA What to publish when all your responses come from the Dark Souls community

To abridge the outcomes of this paper: the core predictor of whether a person enters a state of flow (the "I've forgotten to eat, wait, is it 2AM on a weekday?" style of flow) is whether the player likes punishing gameplay and good presentation, and whether those aspects are present in the game they are playing.

So this is a weird one. The paper has a large sample, but it acknowledges that the participants were recruited via snowballing (i.e. "Once you are done please share this with your friends"; it's a cheap way to get participants but means they tend to come from similar demographics). There are also some issues with how the questions were asked. Participants were asked to think about their favorite game and answer based on that, but the researchers didn't control for time spent gaming (yer mum's favorite game may be her only game). 

The researchers also talked extensively about Mihaly Csikszentmihalyi's definition of flow (Csikszentmihalyi, 2014) as a justification for the research, but used a scale that was not based on his definitions. Not a big deal, but it bugs me when researchers do that. 

The researchers also discuss that social and reward-based aspects of a game are important to flow, but only when the player values those aspects. So maybe people get into a state of flow when playing a game that has things they like in it? 

On a side note they also correlated a lower score on the happiness questionnaire with a higher score on the flow scale. 

What a Game designer can learn - 
Make games that the people who like what's in your game will like? Market to sad people? (Don't actually do that) Dark Souls players get way more absorbed than other gamers? 


Excuse me sir, could you please fill out this 176 item happiness questionnaire? 

What makes continued mobile gaming enjoyable?

- J. Merikivi, V. Tuunainen, & D. Nguyen
Published in Computers in Human Behavior, 68 (2017), 411-421

AKA People like fun??

This paper makes a fundamental and mind-blowing correlation; you may need to take a seat.....

People continue to play mobile games if they enjoy them.

I know, shocker! As part of the research in this paper, the team surveyed 207 people regarding their favorite mobile game to see if notable trends could be discovered.

This is some basic stuff right here, so let's just look at the highest- and lowest-rated trends in the 33-item survey. 

The lowest was - "The game I most often play is: Surprising" (Merikivi et al, 2017, p. 419). This is (quasi-ironically) unsurprising, knowing how tight the core gameplay loop of the average mobile game is. The second lowest measurement was - "Playing the stretches my capabilities to the limits" (Merikivi et al, 2017, p. 419). Another non-statement. Very few are looking for such an active experience from their mobile phone game, and if they were, they would buy a Switch. This measure also had the largest standard deviation (Google it, this ain't a stats post) of any in the survey. There are some people out there who really aren't being stretched to their limits. 

The highest was - "Learning to use the game is easy to me" (Merikivi et al, 2017, p. 419), with one of the lowest standard deviations. Everyone finds their favorite game easy to use. There is probably a lot of bias on display here. I can just imagine people saying "Of course the game I play every day is easy to play!!!"

On a side note, this paper refers to games as "Hedonistic Information Technology" (Merikivi et al, 2017, p. 412), which I think we as an industry should adopt.

What a Game designer can learn - 
Ease of use, or player-perceived ease of use, is something that is important in every sticky game. Build your tutorial good. 


A stripped green???? I'm very surprised

When newbies and veterans play together: The effect of video game content, context and experience on cooperation

- Y. Jin, & J. Li
Published in Computers in Human Behavior, 68 (2017), 556-563

AKA Co-op is hard and sometimes researchers don't know what they are doing. 

This is a weird one: the researchers published three surveys/experiments, all related to co-op behavior, as part of the same paper. We'll break 'em down one by one, but quick spoiler alert: two are uninteresting and the third is just bad science ¯\_(ツ)_/¯

The first analysis looks at the correlation between co-op games and pro-social behavior (defined as behavior that has a positive impact on those around you, in comparison to antisocial behavior) via a survey. They found that those who play more co-op games tended to have more pro-social interactions. "But Authors!", I hear you cry out. "That does not meet scientific rigor! The causal relationship cannot be established, as there is no manipulated independent variable!" Congrats, you have pointed out the same issue the researchers noted. So -

Experiment two was a test to see how people behaved in a version of the prisoner's dilemma, where generosity is more financially beneficial for everyone involved, after they played different types of co-op games. The result was that those who played violent co-op games were more generous than those who played non-violent co-op or single-player games. The games chosen were important here; for co-op, the games were Portal 2 and COD. Frustration was not controlled for, and Portal 2 (in my opinion) is more likely to create a sense of animosity due to its more interdependent co-op nature. Can't carry a puzzle game alone.

This is where the paper falls off the stupid cliff.

Experiment three was meant to measure pro-social behavior in those that play with people of different skill levels. It does not succeed at all.  

They wanted to use League of Legends as the test game but made three mistakes in the methodology planning.

  1. They defined high-skill players based on the amount of time spent gaming per week. That's right! Your granny, who grinds Candy Crush, would be considered a high-level LOL player for this study.
  2. They ended the experiment after 20 minutes....... Quitting a LOL game early is simply not how the game was designed.
  3. They did not verify that the two participants ever interacted in the game. Some LOL roles barely see each other till the laning phase is over.

I'm not even going to share the results; you can read them for yourself, but they are scientifically unsound (to put it politely). 

What a Game designer can learn - 
Don't trust all research on games.


Going AFK, for science! 

Disclaimer - I make mistakes all the time. I like to make 4-5 good mistakes before breakfast. If I've made a mistake in any part of this post, please let me know! Also, all science is good, even bad science. If you assume good faith, then even questionable papers can be stepping stones to better answers. 


Csikszentmihalyi, Mihaly. "Toward a psychology of optimal experience." Flow and the foundations of positive psychology. Springer Netherlands, 2014. 209-226.

Jin, Y., & Li, J. (2017). When newbies and veterans play together: The effect of video game content, context and experience on cooperation. Computers in Human Behavior, 68, 556-563.

Laffan, Derek A., et al. "The relationships between the structural video game characteristics, video game engagement and happiness among individuals who play video games." Computers in Human Behavior 65 (2016): 544-549.

Merikivi, J., Tuunainen, V., & Nguyen, D. (2017). What makes continued mobile gaming enjoyable?. Computers in Human Behavior, 68, 411-421.

Hello World

As part of my Grad studies in Cyberpsychology I read a lot of game, gamification, and game culture related research papers. Some are great! Some are absolute crap. Most however sit somewhere in the middle.

The great ones tend to have a few things in common. They have an understanding of games and the psychology principles that apply, they have modern experiments (a 2016 paper that talks about Gameboy usage ain't good), awareness of their own flaws, and a conclusion that academics can build upon or that people in the industry can use.