
Demox AI #2 – Sensors

I made some relatively large updates to AI behaviours and sensors before the summer arrived, something that might sound familiar to those of you who follow me on Twitter, since I’ve already shown some of these things there. I haven’t posted any development logs on the subject though, primarily since I tend to do other things than sit in front of the nerd-station once the snow finally melts… anyways, let’s get on with it!

Decision Point Behaviour System

I’ve talked about the DPBS before (if you haven’t read about it you can find the post HERE): the AI in Demox continuously evaluate their current situation and perform actions relevant to the conditions of that situation.
A new behaviour has been added to this system, letting the AI equip items found near them and throw the equipment when it’s beneficial for them to do so.

Clip showcasing both the new AI hearing sensor and the AI item usage behaviour

It’s a quite simple behaviour, but it opens up more combat challenges, since AI who can collect weapons will increase their damage output. If the AI can throw the weapon they will also do so when the player tries to run away from them, giving them a ranged attack opportunity.


Situational Awareness

Furthermore, I wanted my AI to be a bit more aware of their surroundings. Previously the player could run and jump around right behind an AI and it wouldn’t notice a thing, so two new sensors have been implemented, namely the “Smell Sensor” and the “Sound Sensor”, two components that simulate exactly what they sound like.

Smell Sensor

The smell sensor is not used by all AI. At first I implemented it only for the Maradeur monster, which has no eyes and therefore needed another primary way of detecting hostiles, so the smell sensor felt appropriate. Recently I’ve also used it for other creatures to some extent.

The way it works is that characters in the game feed the game master logic with some information at regular intervals. First, their own smell: some AI can determine what creature they’re facing by its smell. Second, a value that determines how far away the character’s smell can be detected. Information about the area the character is currently in is sent as well, along with the world position of the character when the smell data was registered.
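
To give a rough idea of what gets registered, the data container could look something like the sketch below. The names (SmellData, GameMaster.RegisterSmell and the character fields) are just made up for this example, not the actual classes:

//Rough sketch of the data a character could feed the game master with (names are made up)
public struct SmellData {
    public string creatureType;    //What creature the smell belongs to, some AI can identify hostiles by it
    public float scentRange;       //How far away the smell can be detected, in meters
    public int areaId;             //The area volume the character was in when the smell was registered
    public Vector3 worldPosition;  //Where the character was when the smell was registered
    public float timeRegistered;   //When the smell was registered, so old smell data can shrink and fade
}

//Called from the character code at regular intervals
void RegisterSmell() {
    SmellData data = new SmellData {
        creatureType   = myCreatureType,      //Hypothetical character fields
        scentRange     = myScentRange,
        areaId         = currentAreaId,
        worldPosition  = transform.position,
        timeRegistered = Time.time
    };
    GameMaster.Current.RegisterSmell( data );    //Hypothetical registration call on the game master
}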

Area and Character smell volumes
The Maradeur

The idea is that some areas with strong smells should cloak a character’s smell, not completely, but by a pre-defined percentage of the character’s scent range.

For example, in an area with very strong smells a character with a scent range of 8 meters might receive a 75% smell-cloak benefit, resulting in a reduced scent range of 2 meters, making them a lot harder for other AI to detect by smell. This can be seen in action in the attached Twitter post to the left: the red circles represent the player character’s scent range, and when entering different area volumes (the colored boxes) the size of those red circles is reduced.
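
The reduction itself is nothing more than a multiplication; roughly like this, with made-up property names:

//75% smell-cloak in this area leaves only 25% of the scent range
float cloak = currentArea.SmellCloakPercent;           //e.g. 0.75f for a strong-smelling area (hypothetical property)
float effectiveRange = scentRange * ( 1f - cloak );    //8m * 0.25 = 2m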

As can be seen in the video, the trail of smell information left behind the player shrinks over time.

For the AI smell sensor to make use of this information it must first be collected from the Game Master; all registered smell information is filtered to find only the data relevant to the AI, i.e. smell data with a location reference within x meters of the AI’s world position.

The “x meters” is the smell sensor’s detection range. While the detection range is not drawn in the video shown here, it can still be noticed, since the AI does not have to be inside the red circles to start investigating a smell. This is because only the sensor’s detection range volume has to intersect with the smell information’s volume, not the character itself.
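
As a sketch, that filtering boils down to a distance check between the sensor and each registered smell, where both the sensor’s detection range and the smell’s scent range count (again, the names are made up for the example):

//Hypothetical filter: a smell is relevant if the sensor's detection volume
//and the smell's scent volume intersect
bool CanDetect( SmellData smell ) {
    float sqrDist = ( smell.worldPosition - transform.position ).sqrMagnitude;
    float combined = detectionRange + smell.scentRange;    //detectionRange is the sensor's own range
    return sqrDist <= combined * combined;
}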


Sound Sensor

The sound sensor was implemented since I found it weird that, as the player, you could jump and run around right behind an enemy and they’d be completely clueless about what was going on behind their back. All AI characters in the game now use sound sensors. Essentially it works the same way as my smell sensor: characters and actions happening in the game feed the game master with data containers holding information about the sound: noise radius (volume), origin (friendly, wildlife, hostile, combat, environment, unknown) and world position.
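
The sound containers could look something along these lines; the enum values mirror the origins listed above, but the exact types are just an assumption for the example:

public enum SoundOrigin { Friendly, Wildlife, Hostile, Combat, Environment, Unknown }

public struct SoundData {
    public float noiseRadius;      //Volume of the sound, i.e. how far away it can be heard
    public SoundOrigin origin;     //What kind of source produced the sound
    public Vector3 worldPosition;  //Where the sound was produced
}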

The AI regularly check the noises registered within their hearing range, and depending on how the AI character is set up it will interact with the noise.

Combat sounds will make all combatant characters investigate, while non-combatants will get scared, try to hide or run away from the sound. Hostile sounds (primarily monster voices and footsteps) will agitate friendly AI and scare most wildlife AI, and vice versa for friendly sounds. Wildlife sounds (animal voices and footsteps) are ignored by friendly AI and agitate some of the hostile AI characters. Environment and unknown-flagged sounds depend a lot on what type of sound is registered, but these two sound types are either ignored or investigated.
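
Roughly, the per-character reaction setup can be thought of as a switch on the sound’s origin. A simplified sketch for a friendly non-combatant might look like this, with the reaction methods being placeholders:

//Hypothetical reaction setup for a friendly, non-combatant character
void ReactToSound( SoundData sound ) {
    switch( sound.origin ) {
        case SoundOrigin.Combat:
            Flee( sound.worldPosition );            //Non-combatants get scared, hide or run away
            break;
        case SoundOrigin.Hostile:
            BecomeAgitated( sound.worldPosition );  //Monster voices & footsteps agitate friendly AI
            break;
        case SoundOrigin.Wildlife:
            break;                                  //Animal sounds are ignored by friendly AI
        default:
            Investigate( sound.worldPosition );     //Environment / unknown sounds are ignored or investigated
            break;
    }
}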

Water Interaction

My previous projects have contained water, however, I’ve never put any effort into making it behave like water would. In LBKR for instance the water was purely decoration; it would not react to objects colliding with its surface, nor would said objects be manipulated by the water in any way. This made it feel very artificial and, in lack of a better word, un-watery… ( Please excuse my word-slaughtering, keep in mind that English is not my native language ;P )

Water Interaction

For Demox I aim to make the water less of an illusion and have it actually react to collisions and affect objects that enter the “water volume“. Code-wise this involves multiple object behaviours, and a whole lot of different particle effects among other things.

So what did I actually want to happen when something collides with the water? That depends on what type of object hit the water, of course, but let’s have a look at the goals I had in mind.

Water impact effect, played when a body hits the water volume.

Starting with the water volume itself, there were some important properties to keep in mind. For starters, the depth of the volume, which is required to determine what ripple effect to use; small water puddles would for obvious reasons create smaller splashes than an object falling into a large pond or river. I simply set this property to be the height of the water volume’s bounds, and then adjust the volume size in the editor to fit the terrain or water container.
The next property required is the world position along the y-axis of the volume’s surface, which is needed when calculating the depth of the volume at a specific location. (That, in turn, is done to determine the current speed penalty for avatars moving in the volume.)

float vDepth;    //Depth of the current water volume
float vSurfacePoint; //World position along y-axis of the current water volume's surface
float speedPenalty;  //The velocity reduction of this instance
Transform vSurface;  //Current water volume

//Water volumes are identified through the OnTriggerEnter method
public void OnTriggerEnter( Collider other ) {
    if( target.IsDead || _update || !other.CompareTag( Tags.SurfaceWater )) {
        return;
    }

    vSurface = other.transform;   //Assign colliding volume as current water volume
    target.InWater = true;        //Flag target avatar as inside a water volume

    //Determine if the current water surface is deep enough to swim/drown in
    vDepth = MapMaster.Current.FindWaterVolume( vSurface ).VolumeDepth;
    if( vDepth >= SharedGlobalProperties.SwimDepth ) {
        ControlledFx splash = GameObjectPool.Current.PullCommonFx( CommonFxNames.W_Impact_Avatar );    //Pull a water-splash effect from the ObjectPool

        if( splash != null ) {    //Play the water splash effect if found
            splash.FireFxCheap( transform.position );
        }
    }

    StartManagedUpdate();         //Starts a custom Update-method
    target.RefreshMoveSpeed();    //Refresh move speed of the avatar
}

//Only viewing a fraction of the OnManagedUpdate method since a lot of its content is irrelevant for this post
public void OnManagedUpdate() {
    ....
    if( Time.time - _lastDepth >= _DepthThreshold ) {
        CalculateWaterDrag();    //Update the depth drag applied to the avatar at regular intervals ( ie. move speed penalty )
        _lastDepth = Time.time;
    }

    if( vDepth >= SharedGlobalProperties.SwimDepth && !target.CanSwim ) {
        if( _curFx == null ) {
            RetrieveFx();    //Pull an appropriate FX from the Object Pool
            return;
        }

        //Play the FX at the avatar's position, but lock the y-axis position to the surface point
        Vector3 pos = new Vector3( transform.position.x, vSurfacePoint, transform.position.z );
        _curFx.transform.SetPositionAndRotation( pos, transform.rotation );

        if( !_drowning ) {
            _drowning = true;
            _lastDrowned = Time.time;
        }

        target.OnDrown();
        return;
    }
    ....
}

public void CalculateWaterDrag() {
    //Determine how far beneath the surface the avatar currently is located
    Vector3 xzSurf = new Vector3( transform.position.x, vSurfacePoint, transform.position.z );
    float avDist = Vector3.Distance( transform.position, xzSurf );
    
    //Assign current drag / speed penalty
    speedPenalty = ( avDist / vDepth ) * _DepthDragMultiplier;
}

As you can see, I simply determine the penalty by calculating how far along the way to the bottom of the volume the avatar has reached. Since this returns a value in the range 0 to 1, I also multiply it by a constant value, “DepthDragMultiplier“, to get a value suitable for actual speed reduction.

As can be seen in the code, the depth will also determine what type of water ripple effect to play while the avatar is moving around. This also ensures that the correct effect is played if the terrain under the water is sloped and the character is moving from deep to shallow water or vice versa.

///<summary>
/// Retrieves a water-ripple FX from the object pool and starts playback
///</summary>
private void RetrieveFx(){
    ControlledFx fx = null;

    //Select appropriate particle system
    if( vDepth < _SmallRipplesDepth ){
        //Play a smaller style of ripple effect
        fx = GameObjectPool.Current.PullCommonFx( CommonFxNames.W_PuddleRipples );
    }
    else if( vDepth < SharedGlobalProperties.SwimDepth ){
        //Play common water ripples effect
        fx = GameObjectPool.Current.PullCommonFx( CommonFxNames.W_Ripples );
    }
    else{
        if( target.CanSwim ){
            //Play ripples effect of type "SwimRipples"
            fx = GameObjectPool.Current.PullCommonFx( CommonFxNames.W_SwimRipples );
        }
        else{
            //Play ripple effect of type "DrownRipples"
            fx = GameObjectPool.Current.PullCommonFx( CommonFxNames.W_DrownRipples );
        }
    }

    if( fx == null ){
        Debug.LogWarning("WaterTracker could not retrieve a VFX");
        StopManagedUpdate();
        return;
    }

    _curFx = fx;            //Cache the located effect
    _curFx.FireFxCheap();   //Play the located effect
}
Water ripple effects in action

For ripple and water trail effects there are constant depth values acting as limits for when to select each type of effect, and each effect has three base variations. There are more variations as well, but for other purposes, such as larger objects or projectiles hitting the surface.

Which impact/splash effect to use when a collision with the surface occurs is determined within the water volume’s code, unlike the ripple effects described above. It’s a really simple setup of checking the tag of the object hitting the surface, which works nicely for now since most objects with the same tag are of similar sizes (i.e. characters, items, etc). I already know that some monsters will be a lot larger than average though, so in the near future I’ll need to adjust this to scale the splash effect so it looks good when the large monsters fall into the water as well.
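
Inside the water volume it’s basically just a tag check; something along these lines, where the small-impact effect name is a placeholder:

//Sketch of the splash selection inside the water volume, based on the tag of the object hitting the surface
private void PlaySplashFor( Collider other ) {
    ControlledFx splash = null;

    if( other.CompareTag( Tags.Item )) {
        splash = GameObjectPool.Current.PullCommonFx( CommonFxNames.W_Impact_Small );    //Placeholder effect name
    }
    else if( other.CompareTag( Tags.Corpse )) {
        splash = GameObjectPool.Current.PullCommonFx( CommonFxNames.W_Impact_Avatar );
    }

    if( splash != null ) {
        splash.FireFxCheap( other.transform.position );
    }
}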


Buoyancy

One of the most important things, in my opinion, is that bodies and some objects should float in water, as they do in reality. I don’t think I’ll go into detail about exactly how this is done, since I’m quite certain there is a whole lot of information to gather on the subject out there in Google-land, but to cover some basics: everything that floats requires multiple properties to determine its behaviour in water.

First, the buoyancy of the object: how well does it float in water?
Each object also requires values to limit its movement in the water volume. My first iteration of the floating behaviour had the objects jumping out of the water over and over again, as if displaying their acrobatic grace. To address this issue I added two limiters/thresholds: “Float Pivot“ and “Surface Range“. The float pivot is a Vector3 that describes a point, relative to the object’s position, that marks its most buoyant part, e.g. objects that contain air compartments tend to be most buoyant where the air is located.

Dead body and limb floating in water

The surface range is used to compare how far beneath or above the surface the object is, and to adapt the applied force accordingly.

I also added a “water drag“ property. It is modified by the object’s current submergence, and the end result is used to scale the force or gravity applied to the object’s Rigidbody; objects in water usually gain a lot of velocity while floating up from the bottom, but at the surface the velocity is no longer as aggressive, and that is the behaviour I intended to simulate with this property.

Additionally, a reference to the water volume affecting the object was also required, since the depth at the object’s position is compared a lot.
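
To give a rough idea of how those properties could come together, here is a minimal sketch of a FixedUpdate doing the floating; this is the general idea rather than my exact implementation:

//Minimal buoyancy sketch, not the exact implementation
public float buoyancy = 10f;        //How well the object floats
public Vector3 floatPivot;          //Most buoyant point, relative to the object's position
public float surfaceRange = .25f;   //Range around the surface point where the upward force fades out
public float waterDrag = 1f;        //Scales down velocity while the object is submerged
private Rigidbody _body;
private float _surfacePoint;        //Y-position of the current water volume's surface (assumed set when entering the volume)

void Awake() {
    _body = GetComponent<Rigidbody>();
}

void FixedUpdate() {
    Vector3 pivot = transform.TransformPoint( floatPivot );
    float depth = _surfacePoint - pivot.y;    //How far beneath the surface the float pivot is

    if( depth > -surfaceRange ) {
        //The force fades out as the pivot approaches the surface, so the object settles instead of jumping
        float factor = Mathf.Clamp01(( depth + surfaceRange ) / ( surfaceRange * 2f ));
        _body.AddForceAtPosition( Vector3.up * buoyancy * factor, pivot, ForceMode.Force );
        _body.velocity *= 1f - ( waterDrag * Time.fixedDeltaTime );    //Crude water drag
    }
}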


Moving Water

While creating effects for the water system, one of the assets I created was a waterfall. It looked really silly, though, to have a waterfall where the source volume the water came from showed no movement, so for obvious reasons I decided to create some assets for moving water. I guess this could be achieved with a shader, but rivers and streams can often bend and turn a lot, and I don’t know how to write a really adaptable shader for that, so I decided to try using particle systems for the moving water instead. I’ve only created a prototype effect so far, but I like the result; with some more experimenting I believe it could turn out quite nice.

Bodies and objects being swept away by a river, using ‘Constant Force‘-triggers

The particles are rendered as simple 3D meshes instead of 2D billboards, and currently use Unity’s refractive water material. As I liked the style of this effect I modified all other water effects to use a similar technique.

But back on track: the moving water presented an obstacle I hadn’t thought of. Objects floating in the water volume would not follow the water flow, but instead just float near the same position. This was easily fixed, though, by creating triggers that apply a constant force to objects within their bounds if those objects are in a floating state, and then positioning those triggers along the water’s path.

An example of the ConstantForce script can be seen below:

public Vector3 forceDir;    //Direction of the force, relative to this object's rotation
public float force;         //Power of the force applied
private List< Rigidbody > _bodies = new List< Rigidbody >();    //List holding all rigidbodies to update

public void OnTriggerExit( Collider other ) {
    if( (other.CompareTag( Tags.Corpse ) || other.CompareTag( Tags.Item )) && _bodies.Contains( other.attachedRigidbody )) {
        //Remove the target from the list of updated bodies
        _bodies.Remove( other.attachedRigidbody );
    }
}

public void OnTriggerEnter( Collider other ) {
    if( (other.CompareTag( Tags.Corpse ) || other.CompareTag( Tags.Item )) && !_bodies.Contains( other.attachedRigidbody )) {
        //Add the target to the list of updated bodies if it has the correct object tag
        _bodies.Add( other.attachedRigidbody );
    }
}

public void FixedUpdate() {
    if( _bodies.Count == 0 ) {
        return;
    }
    for( int i = 0; i < _bodies.Count; i++ ) {
        if( _bodies[ i ] != null ) {
            _bodies[ i ].AddForce( transform.TransformDirection( forceDir ) * force, ForceMode.Force );
        }
    }
}

( The pictures above show the final result of the water volumes after a custom shader was written for the water surfaces )


Swimming

Among the final touches for the water interactions I made it possible for characters to swim in deep water. I was able to reuse a lot of the code I had already written for the other water interaction behaviours; only some minor adjustments and tweaks were needed to implement it for the character classes. I did consider some possible obstacles before putting it all together though.

The first obstacle (actually, all of the obstacles turned out to be easy to solve…) was sorted out with a technique similar to the FX retrieval shown in the code snippet earlier in this post.

float submergence = vSurfacePoint - transform.position.y;  //Determine how deep beneath the surface the avatar is
if( submergence >= SharedGlobalProperties.SwimDepth ) {}   //Do some stuff

If the avatar is a defined distance beneath the surface I can thereby either flag the avatar as swimming or, if it’s not able to swim, have the avatar drown from within the if() block.

To stop the player from swimming into infinity, or reaching far-away areas they weren’t intended to reach, I added a stamina vital to the player character’s statistics. While swimming the stamina drains, and when out of stamina the avatar will drown. This obviously helps when limiting the player’s swim range.
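
The drain itself is trivial; roughly like this, with made-up field names:

//Sketch: drain stamina while swimming and drown when it runs out
if( target.IsSwimming ) {
    curStamina -= staminaDrainPerSecond * Time.deltaTime;

    if( curStamina <= 0f ) {
        target.OnDrown();    //Same drowning path as a non-swimmer in deep water
    }
}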

Since the avatar is not allowed to jump while swimming, it was critical to find a way to leave water volumes where there are no ramps or shallow areas that would let the avatar walk out along the bottom of the volume.
I’m reusing the code for mounting obstacles while not in water: by casting a ray in front of the player I can detect whether there is any climbable geometry, and if so allow the player to mount said geometry and leave the volume by climbing.
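
The check is essentially the same raycast as the one used for mounting obstacles on land. A simplified version could look like this, where the layer mask, distances and climb method are placeholders:

//Sketch: cast a ray ahead of the swimming avatar to look for climbable geometry
RaycastHit hit;
Vector3 origin = transform.position + Vector3.up * .5f;    //Start slightly above the water line

if( Physics.Raycast( origin, transform.forward, out hit, 1.5f, climbableMask )) {
    //Climbable geometry found in front of the avatar, start the mount/climb routine
    StartClimb( hit.point, hit.normal );
}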

Swimming behaviour in-game

An obstacle I had not anticipated was the floating platforms: as the player jumped or climbed on them they would not move at all, which made them look like static objects just positioned above a water volume. I wasn’t satisfied with this.

I made a quite simple workaround for this issue by attaching a trigger volume to the floating platform objects; when the trigger is activated by player motion, a force impulse is sent to the rigidbody component of the floating platform, making it move.
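
The trigger simply relays the player’s motion to the platform’s Rigidbody, roughly like this (the player tag and field names are assumptions):

//Sketch of the floating platform trigger, nudging the platform when the player steps on it
public Rigidbody platformBody;    //Rigidbody of the floating platform
public float impulsePower = 2f;

public void OnTriggerEnter( Collider other ) {
    if( other.CompareTag( Tags.Player )) {    //Tags.Player is an assumption, not shown elsewhere in this post
        //Push the platform down at the point where the player landed, so it bobs in the water
        platformBody.AddForceAtPosition( Vector3.down * impulsePower, other.transform.position, ForceMode.Impulse );
    }
}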

Demox AI #1

.Factions

So to start things off, let’s go through my primary goals for the AI in Demox.


.DPM

DPM, or “Decisions per Minute”, is a simple AI behaviour system I put together for Loot Burn Kill Repeat, which I have continued to build upon for Demox as well! The name comes from a property field I’ve assigned to the AI character code that determines how many times per minute the AI avatar will calculate a new decision. I guess a less confusing name for it would be “Intelligence”… but… err, yeah!

Worth noting is that even though the AI calculates a decision at regular intervals, the process won’t always end with the AI actually deciding on or changing any current actions, since the decision is controlled by multiple aspects of the AI’s current situation. I’ve improved the AI’s situational awareness and the DPM system extensively for Demox compared to what I had in LBKR. Among other things, the AI will check its own status.

A check is also done for the AI’s party members, to determine how many are wounded, have escaped combat or have been killed, and how scattered the party members are across the battlefield. These are the primary things impacting the decision; furthermore, each decision behaviour has tailor-made awareness checks.

AI triggering an alarm, decided with the AI decision behaviour system.

To describe what these “decision behaviours” are: they are basically instructions for the AI on how to determine whether the behaviour should be used, as well as how it is to be executed by the AI. The instructions include simple things like escaping combat, changing target or changing position during combat, as well as more complicated tasks such as triggering traps and alarms, or aiding party members.


.DPBS

When improving the AI awareness there were, as with everything else, some issues to solve. First and foremost: how would I make the AI select the most appropriate action? This morphed my old DPM system into what I now call DPBS, or Decision Point Behaviour System. And yes, again, why not just call it “Intelligence“? … I’m a hopeless case…
When the AI is calculating a decision, all decision behaviours are iterated and compared with each other. They are individually scored depending on the current situation, and the best-scoring behaviour is selected at the end of the process.

Below is an example from one of the implemented decision behaviours; multiple aspects of the current situation are checked and affect the score of the decision before said score is sent back to the member calculating the AI’s decision.

///<summary>
/// Gets the current score of this decision
///</summary>
///<returns>The score.</returns>
///<param name="agent">Agent Reference.</param>
public override int GetScore( CombatAgent agent ){
    int score = 0;

    if( agent.main.CurHealth > ( agent.main.stats.health * .5f ) || Time.time - agent.alarmAgent.LastEscape < agent.alarmAgent.minEscapeDelay ){
        return ImpossibleScore;   //If the AI is comfortable with its current health value, or recently escaped, return a score that prohibits the AI from choosing this decision
    }

    if( !agent.main.canAttack ){
        score += 5;    //If AI is unable to engage in combat, increase score
    }

    if( agent.main.CurHealth < ( agent.main.stats.health * .5f )){
        score++;    //If AI health is less than half of max health value, increase score

        if( agent.main.party != null ){
            for( int i = 0; i < agent.main.party.Length; i++ ){
                if( agent.main.party[i].IsDead ){
                    //For each member of the AI's party who has died, either increase the score or decrease it (enemies should be able to be enraged by their friends dying)
                    score += Random.Range( -1, 2 );    //Int Random.Range is max-exclusive, so this gives -1, 0 or +1
                }
            }
        }
    }

    if( agent.main.targetDistance < agent.safeDistance.Sqr() ){
        if( agent.main.CurHealth < ( agent.main.stats.health * .35f )){
            score += Random.Range( 0, 2 );    //If at an unsafe distance from the target, possibly increase score
        }
        }

        if( agent.main.CurHealth < ( agent.main.stats.health * .2f )){
            score += Random.Range( 0, 2 );    //If health value is critically low, increase score
        }
    }

    if( Time.time - agent.main.LastDmgReceived <= agent.DpmThreshold && agent.main.CurHealth < ( agent.main.stats.health * .4f )){
        score += Random.Range( 0, 2 );   //If at relatively low health and recently received damage, possibly increase score
    }

    return score;    //Return calculated score.
}
The decision behaviours available can be customized for each AI character from the editor, since not all characters will have the same options or priorities.
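
The selection itself is then just a matter of iterating the behaviours and keeping the best score. Simplified, it looks something like the sketch below; DecisionBehaviour is my guess at a base class name, not the actual one:

//Sketch of the selection pass, run each time the AI calculates a decision
DecisionBehaviour PickDecision( CombatAgent agent, DecisionBehaviour[] behaviours ) {
    DecisionBehaviour best = null;
    int bestScore = 0;

    for( int i = 0; i < behaviours.Length; i++ ) {
        int score = behaviours[ i ].GetScore( agent );    //Each behaviour scores itself against the current situation
        if( score > bestScore ) {
            bestScore = score;
            best = behaviours[ i ];
        }
    }

    return best;    //May be null, in which case the AI keeps doing what it was already doing
}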

This improved handling of AI awareness also helped me improve the AI attack system, which has been built in a very similar fashion. To give an example, in LBKR enemies would always attack the player with the most powerful attack that was not on cooldown; they took no regard of the distance to the player and only checked their maximum attack range to determine whether they could possibly hit the player at all.

This created a weird behaviour, though, where AI with ranged attacks would often fire at the player from point-blank range. With DPBS I’ve implemented a ‘Reposition‘ behaviour that determines whether the AI is at the most appropriate distance from the player, taking into consideration a whole lot of properties, ranging from the current situation as described above to presets for which ranges the AI is most and least comfortable with, as well as a check for a clear line of sight to the target. Hence, the AI will adapt their position continuously.

This alone didn’t solve the point-blank ranged attack issue though, since the AI only makes use of the decision behaviours at certain intervals. But as I structured the attack system in a similar way I, among other things, added attack range limits and preferences for each AI attack to prevent them from being used in certain situations. Also, all AI with ranged attacks have at least one melee attack, to allow them to fight in close quarters if repositioning is not possible.
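
In practice the range limit is just an extra filter when scoring an attack; something along these lines, where AIAttack and its range fields are assumptions:

//Sketch: an attack is only a candidate if the target is within its allowed range band
bool InRange( CombatAgent agent, AIAttack attack ) {
    //targetDistance appears to be squared (see the safeDistance check above), so compare against squared limits
    return agent.main.targetDistance >= attack.minRange.Sqr()
        && agent.main.targetDistance <= attack.maxRange.Sqr();
}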


.Party interaction

All AI in the game belong to a party. The sizes differ from party to party; some contain only two characters, while another party can contain 10 – 20 members. The AI party is used primarily for the AI to be able to interact with each other. As described in the DPBS topic above, the party status is checked when making decisions; how dependent a decision behaviour is on the AI’s party differs for each behaviour, and how an action from a decision behaviour is performed can also be modified by the status of the AI’s party.

Take the “Escape” behaviour for example: if the AI for some reason becomes scared and decides to break contact, its action can be either to run away and try to hide, or, if the AI knows about nearby party members who are not in combat, to escape to those party members and rally them, then return to the battle with its fetched friends.

The combat role assigned to the AI also determines how much it will interact with its party. The roles I’ve currently implemented are classed as “Assault“, “Ranger“, “Support“ and “Berserk“. Assault characters will most commonly only use the party to find AI to help them when they’re in trouble, while Rangers will be bolder when Assaulters are nearby. Support characters are generally the most party-fixated; they will keep a close eye on their party members to know if they require attention, e.g. resurrection or healing of a party member, or summoning new minions. Berserkers pay the least attention to their party; they are focused entirely on destroying their enemy.

Combat roles in action: a Support-classed Bone Wizard (the staff-carrying guy) focuses on resurrecting and healing his party, while the Rangers and Assaulters engage the player
Example of a hybrid class: the Corpse Warden is a mix of Berserker and Support, aggressively attacking its opponents, but if needed it will prioritize resurrecting and aiding its party members

Dev. Report #11 – 2.26

I’ve been spending quite some time updating my custom tools and editor extensions to keep them compatible with the changes I’ve made to some game mechanics and game logic managers, which have been modified rather heavily for 2.26. Therefore I haven’t had very much to post about… well, except the mentioned editor updates, but I can’t quite imagine that those would be very interesting for the common player.

But in any case, 2.26 is closing in. I’m currently bug- and play-testing the game thoroughly and fixing old and new issues as I go. New game mechanics are up and running, and old ones have been and are still being updated! I’ve also spent nearly two weeks doing pretty much nothing but optimizing world collision data and the code for the game mechanics that consume the most memory. For the world collision optimizations I will probably need another few days to have fully optimized all world chunks/tiles.


Another thing I’ve fiddled with for 2.26 is game balancing, primarily regarding the game difficulty setting. Without going into detail I can say that it’s better adapted to the length of the game than it was before. The item generator also now calculates item tiers based on the level of the monster that was killed or the loot that was opened, instead of the player level. This new way of calculating item tiers ensures that the player can’t go back to the first map, where monsters are level 1 – 6, and still loot items of high-rated tiers.
A minor setback with this approach did appear though: as the level of monsters in one map may vary individually, I noticed that there are usually three item tiers that may drop within the same map. This turned into a headache when looting consumables that stack, as I soon discovered that my inventory was full of health vials of various item tiers.

ui tiers
To get around this obstacle I added a small text on each item icon that presents the item tier in Roman numerals. I chose Roman numerals in the hope that the number wouldn’t be confused with the item’s stack size.
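
The conversion itself is a tiny helper; since item tiers only span a handful of values, something as simple as the sketch below is enough (assuming tiers stay in the single digits):

//Sketch: convert an item tier to a Roman numeral for the icon label, assuming tiers 1-9
private static readonly string[] _RomanTiers = { "", "I", "II", "III", "IV", "V", "VI", "VII", "VIII", "IX" };

public static string TierLabel( int tier ) {
    if( tier < 1 || tier >= _RomanTiers.Length ) {
        return tier.ToString();    //Fall back to plain numbers for unexpected tiers
    }
    return _RomanTiers[ tier ];
}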

Some new visual updates have been made as well. As I spoke of earlier, character splatmaps are applied to monsters as they get killed, splashing some blood on their bodies; this is available if the “Enable Gore” setting is toggled on.

gfx splatmap01 gfx splatmap02

gfx splatmap03

Also, pay attention to the strong light sources in the images, as the bloom image effect now generates anamorphic lens flares.

gfx anamorphic

There are also a lot of world events and boss encounters that I have planned to modify for v2.27, but I’ve actually already begun to change a few of them. Just to spill some examples, the Necromancing mission’s boss encounter mentioned in my previous article has been redone, and so has the Machinist boss encounter. As I felt that the Machinist model and animations were of too poor quality, I went ahead and created a new Machinist boss. The Machinist now also has the ability to jump into vent shafts to regain some health.

gen machinist gen necroject

-Achievements-

There are currently 12 new achievements implemented, and I still have a few to add before the release. Do note that for 2.26 achievements may now hold a lot more restrictions than before; previous versions of the game forced each achievement to have 6 restrictions or fewer, while one can currently hold up to 20 restrictions. (Restrictions are achievement goals/stages/phases.)

upd achievmentLength

As mentioned in earlier posts regarding the 2.26 release, lots of new game content awaits. I still need a few weeks to finish all planned optimizations, to update and implement the remaining content of this game version and to ensure that everything works properly (as well as make sure that old character profiles remain compatible), but I dare say that 2.26 is not far from done!

Another note regarding the UI is that the game now gives a much clearer hint of a character level-up, as can be seen below.

ui levelUp

I think I spoke about the update log in an earlier post, and the fact that I was about to start posting it on LBKR’s website with more or less live updates, but I can’t recall ever telling anyone that it has been done. In any case, I have begun to post the updates on the website on a weekly-ish basis, so if you’re interested in checking out the log with all updates and bug fixes so far, go on and check the LBKR Update Log. (2.26 current progress: 401 Updates/Changes, 127 Bug Fixes)