Hytale NPC and AI System: Roles, Sensors, and Actions

How Hytale's NPC behavior system works — roles define archetypes, sensors detect conditions, actions execute behavior, and blackboards provide shared memory.

Hytale uses a role-based AI architecture for NPC behavior. Instead of attaching individual goals to entities (like Minecraft does), Hytale defines Roles that bundle together perception (Sensors), behavior (Actions), and memory (Blackboards) into a complete NPC archetype. This reference covers the architecture as found in Hytale server version 2026.03.26.

Architecture Overview

Every NPC in Hytale follows this layered design:

Entity
 └── Role (defines the NPC archetype)
      ├── Sensors (perceive the world, write to Blackboard)
      ├── Actions (read from Blackboard, execute behavior)
      ├── Blackboard (key-value memory shared between Sensors and Actions)
      └── Motion (steering, collision, movement physics)

A Role is the top-level container. It defines what an NPC can see, what it can do, and how it moves. Sensors continuously evaluate world conditions and post results to a Blackboard. Actions check the Blackboard and decide whether to execute.

Note: Minecraft's mob AI uses a flat list of prioritized Goals (e.g., MeleeAttackGoal, WanderAroundGoal) that compete for execution each tick. Hytale's system is more like a state machine — Sensors trigger transitions, Actions define states, and the Blackboard carries context between them. If you've worked with behavior trees or GOAP systems, Hytale's architecture will feel familiar.

Roles

A Role is built using BuilderRole and defines everything about an NPC's behavior profile:

// Conceptual example — Roles are typically defined via asset data
BuilderRole guardRole = new BuilderRole()
    .setInitialMaxHealth(40)
    .setRootInstruction(patrolInstruction)
    .setInteractionInstruction(dialogueInstruction)
    .setDeathInstruction(deathAnimInstruction)
    .setDropListId("guard_drops")
    .setNameTranslationKey("npc.guard.name");

Key properties you configure on a Role:

  • Health — initialMaxHealth sets starting HP
  • Instructions — root (main behavior loop), interaction (when a player interacts), death (death sequence)
  • Steering — separate Steering configs for body and head movement
  • Combat — CombatSupport for damage dealing, targeting, and aggro
  • Flocking — alignment, separation, and cohesion weights for group behavior
  • Drops — item drop list on death
  • Environment constraints — breathing requirements (air, water), allowed environments
  • Movement physics — collision probes, inertia, knockback response

Roles also support variants through BuilderRoleVariant, letting you create variations of an NPC (e.g., an armored guard vs. a scout) without duplicating the entire definition.
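
The variant idea can be sketched as an override layer: the variant stores only the properties it changes and falls back to the base role for everything else. This is a hypothetical illustration of the pattern — RoleVariant and its string keys are not Hytale's actual API:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of variant layering: a variant holds only overridden properties
// and defers to the base role definition for the rest.
class RoleVariant {
    private final Map<String, Object> base;
    private final Map<String, Object> overrides = new HashMap<>();

    RoleVariant(Map<String, Object> base) { this.base = base; }

    RoleVariant override(String key, Object value) {
        overrides.put(key, value);
        return this;
    }

    // Variant values win; anything not overridden comes from the base role.
    Object get(String key) {
        return overrides.getOrDefault(key, base.get(key));
    }
}
```

A scout variant built this way only needs to declare the properties where it differs from the base guard — the drop list, name key, and steering configs carry over untouched.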

Sensors

Sensors are the perception layer. They observe the world and determine whether conditions are met. Every Sensor implements the Sensor interface:

public interface Sensor {
    boolean matches(InfoProvider info);
    SensorInfo getSensorInfo();
}

The matches() method returns true when the sensor's condition is satisfied. The server calls this each evaluation cycle.

Built-in Sensor Types

  • SensorPlayer — player presence within a detection range
  • SensorSelf — the NPC's own state (health threshold, status effects)
  • SensorDamage — when the NPC takes damage
  • SensorAnimation — animation state changes (attack finished, idle started)
  • SensorEvent — arbitrary events from the event messaging system

SensorEvent Search Modes

SensorEvent is particularly flexible. It supports different search modes that control how it finds relevant events:

  • PlayerFirst — prioritize events from players, fall back to NPC events
  • PlayerOnly — only respond to player-originated events
  • NpcFirst — prioritize events from other NPCs, fall back to player events
  • NpcOnly — only respond to NPC-originated events

This is how NPCs can react differently to players versus other NPCs. A guard might use PlayerFirst to prioritize responding to player threats, while a villager NPC might use NpcFirst to prioritize communication from other villagers.
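
The selection logic behind these modes can be sketched as a filter over pending events. This is a guess at the behavior, not Hytale's implementation — the EventSearch class, the isFromPlayer flag, and the event list are all illustrative:

```java
import java.util.List;
import java.util.Optional;

// Sketch of search-mode resolution: "Only" modes filter strictly,
// "First" modes prefer one origin but fall back to the other.
class EventSearch {
    enum SearchMode { PLAYER_FIRST, PLAYER_ONLY, NPC_FIRST, NPC_ONLY }

    record Event(String name, boolean isFromPlayer) {}

    static Optional<Event> find(List<Event> events, SearchMode mode) {
        Optional<Event> player = events.stream().filter(Event::isFromPlayer).findFirst();
        Optional<Event> npc = events.stream().filter(e -> !e.isFromPlayer()).findFirst();
        switch (mode) {
            case PLAYER_ONLY:  return player;
            case NPC_ONLY:     return npc;
            case PLAYER_FIRST: return player.isPresent() ? player : npc;
            default:           return npc.isPresent() ? npc : player; // NPC_FIRST
        }
    }
}
```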

Example: Health Threshold Sensor

// Sensor that triggers when NPC health drops below 25%
public class LowHealthSensor implements Sensor {

    @Override
    public boolean matches(InfoProvider info) {
        float currentHealth = info.getCurrentHealth();
        float maxHealth = info.getMaxHealth();
        return currentHealth < (maxHealth * 0.25f);
    }

    @Override
    public SensorInfo getSensorInfo() {
        return new SensorInfo("low_health", "Health below 25%");
    }
}

Actions

Actions are the behavior layer. They execute when their conditions are met. Every Action implements:

public interface Action {
    boolean canExecute(InfoProvider info);
    void execute(InfoProvider info);
    void activate(InfoProvider info);
    void deactivate(InfoProvider info);
}

The lifecycle is:

  1. canExecute() — checked each cycle. Returns true if the action should run.
  2. activate() — called once when the action starts.
  3. execute() — called each tick while the action is active.
  4. deactivate() — called once when the action ends.
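
The loop that drives this lifecycle can be sketched as follows — an action activates on the tick canExecute() first returns true, executes every tick while it stays true, and deactivates when it stops. The ActionRunner class is a hypothetical stand-in for whatever scheduling the server actually does:

```java
// Sketch of a per-action lifecycle driver, called once per evaluation cycle.
class ActionRunner {
    interface Action {
        boolean canExecute();
        void activate();
        void execute();
        void deactivate();
    }

    private final Action action;
    private boolean active = false;

    ActionRunner(Action action) { this.action = action; }

    void tick() {
        boolean shouldRun = action.canExecute();
        if (shouldRun && !active) {
            action.activate();   // called once, on the first eligible tick
            active = true;
        }
        if (active) {
            if (shouldRun) {
                action.execute();    // called each tick while eligible
            } else {
                action.deactivate(); // called once, when eligibility ends
                active = false;
            }
        }
    }
}
```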

Built-in Actions

  • PlayAnimation — triggers an NPC animation
  • PlaySound — plays a sound at the NPC's position
  • SpawnParticles — emits a particle effect
  • ApplyEntityEffect — applies a status effect to the NPC or a target
  • Teleport — moves the NPC to a new position
  • DisplayName — shows or hides the NPC's name tag
  • ModelAttachment — attaches or detaches model parts (weapons, armor, accessories)

Composing Actions

Actions can be combined using ActionList for sequential execution or InstructionRandomized for random selection:

// Execute a sequence: play sound, then spawn particles, then apply effect
ActionList alertSequence = new ActionList(
    new PlaySound("alert_horn"),
    new SpawnParticles("exclamation_mark"),
    new ApplyEntityEffect("speed_boost", Duration.ofSeconds(10))
);
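
Sequential composition can be sketched as a list that runs one child at a time and advances when the current child reports it is finished. The isDone hook here is an assumption — the real ActionList may track completion differently (e.g. via animation or timer events):

```java
import java.util.List;

// Sketch of an ActionList-style sequential composite: each tick runs the
// current step, advancing to the next once the step says it is done.
class SequentialActions {
    interface Step {
        void run();
        boolean isDone();
    }

    private final List<Step> steps;
    private int index = 0;

    SequentialActions(List<Step> steps) { this.steps = steps; }

    boolean finished() { return index >= steps.size(); }

    void tick() {
        if (finished()) return;
        Step current = steps.get(index);
        current.run();
        if (current.isDone()) index++; // move on once the step completes
    }
}
```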

Example: Flee Action

public class FleeAction implements Action {

    private static final float FLEE_DISTANCE = 20.0f;

    @Override
    public boolean canExecute(InfoProvider info) {
        // Only flee when health is low (checked via Blackboard)
        return info.getBlackboard().has("low_health_triggered");
    }

    @Override
    public void activate(InfoProvider info) {
        info.getMotion().setSpeedMultiplier(1.5f);
    }

    @Override
    public void execute(InfoProvider info) {
        // Move away from the nearest threat
        Entity threat = info.getBlackboard().get("current_threat");
        if (threat != null) {
            Vector3d fleeDirection = info.getPosition()
                .subtract(threat.getPosition())
                .normalize()
                .multiply(FLEE_DISTANCE);
            info.getMotion().moveTo(info.getPosition().add(fleeDirection));
        }
    }

    @Override
    public void deactivate(InfoProvider info) {
        info.getMotion().setSpeedMultiplier(1.0f);
        info.getBlackboard().remove("low_health_triggered");
    }
}

Blackboard System

The Blackboard is a typed key-value store that acts as shared memory between Sensors and Actions. Sensors write observations, and Actions read them to make decisions.

// Sensor writes a detection result
blackboard.set("nearest_player", detectedPlayer);
blackboard.set("threat_distance", distance);

// Action reads the result
Entity target = blackboard.get("nearest_player");
float distance = blackboard.get("threat_distance");

The system uses EventTypeRegistration for dynamic event keys and EventView for type-safe reads:

// Register a typed blackboard key
EventTypeRegistration<Entity> threatKey =
    EventTypeRegistration.create("current_threat", Entity.class);

// Write with type safety
blackboard.set(threatKey, targetEntity);

// Read with type safety — no casting needed
Entity threat = blackboard.get(threatKey);

Tip: Always define Blackboard keys as EventTypeRegistration constants rather than raw strings. This gives you compile-time type checking and avoids runtime ClassCastException errors when Sensors and Actions disagree on value types.
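
The typed-key pattern the tip describes can be sketched with a key object that carries its value type, so reads cast safely and callers never cast at all. TypedBlackboard and Key are illustrative, not Hytale's EventTypeRegistration/EventView classes:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of a typed key-value store: the key pairs a name with a Class
// token, so get() can cast safely without the caller casting.
class TypedBlackboard {
    static final class Key<T> {
        final String name;
        final Class<T> type;
        Key(String name, Class<T> type) { this.name = name; this.type = type; }
    }

    private final Map<Key<?>, Object> values = new HashMap<>();

    <T> void set(Key<T> key, T value) { values.put(key, value); }

    // Safe because set() only accepts values matching the key's type.
    <T> T get(Key<T> key) { return key.type.cast(values.get(key)); }

    <T> void remove(Key<T> key) { values.remove(key); }
}
```

Declaring the keys once as shared constants is what prevents a Sensor writing a Float where an Action expects an Entity — the mismatch becomes a compile error instead of a runtime ClassCastException.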

Event Messaging Between NPCs

The Blackboard also supports cross-NPC communication through the event messaging system:

  • PlayerBlockEventSupport / NPCBlockEventSupport — share block interaction events
  • PlayerEntityEventSupport / NPCEntityEventSupport — share entity interaction events
  • AllNPCsLoadedEvent — fires when all NPCs in a region finish loading

This lets NPCs coordinate. A scout NPC that spots a player can post a threat event that nearby guard NPCs pick up through their SensorEvent listeners.

Flocking

Roles can enable flocking behavior for group movement. Three parameters control the simulation:

  • Alignment — how strongly NPCs match the heading of nearby group members
  • Separation — how strongly NPCs avoid crowding each other
  • Cohesion — how strongly NPCs move toward the center of the group

BuilderRole herdAnimal = new BuilderRole()
    .setFlockingAlignment(0.8f)
    .setFlockingSeparation(1.2f)
    .setFlockingCohesion(0.6f);

Note: High separation with low cohesion creates loose, spread-out groups (like grazing animals). High cohesion with low separation creates tight swarms (like insects or fish). Start with balanced values around 1.0 and adjust from there.
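
These three weights suggest the classic boids blend, which can be sketched in 2D as follows. The Vec, Boid, and steer names are illustrative, not Hytale's API — this shows only how the weighted terms combine:

```java
// Sketch of boids-style steering: each neighbour contributes to alignment
// (match group heading), cohesion (move toward the group centre), and
// separation (back away from crowding), blended by the three weights.
class Flocking {
    record Vec(double x, double y) {
        Vec add(Vec o) { return new Vec(x + o.x, y + o.y); }
        Vec sub(Vec o) { return new Vec(x - o.x, y - o.y); }
        Vec scale(double s) { return new Vec(x * s, y * s); }
    }

    record Boid(Vec pos, Vec vel) {}

    static Vec steer(Boid self, java.util.List<Boid> neighbours,
                     double alignW, double sepW, double cohW) {
        if (neighbours.isEmpty()) return new Vec(0, 0);
        Vec avgVel = new Vec(0, 0), centre = new Vec(0, 0), away = new Vec(0, 0);
        for (Boid n : neighbours) {
            avgVel = avgVel.add(n.vel());
            centre = centre.add(n.pos());
            away = away.add(self.pos().sub(n.pos()));
        }
        double inv = 1.0 / neighbours.size();
        Vec alignment  = avgVel.scale(inv).sub(self.vel()); // match average heading
        Vec cohesion   = centre.scale(inv).sub(self.pos()); // head for group centre
        Vec separation = away.scale(inv);                   // push off neighbours
        return alignment.scale(alignW)
                .add(separation.scale(sepW))
                .add(cohesion.scale(cohW));
    }
}
```

Raising sepW relative to cohW pushes the separation term to dominate the sum, which is exactly the loose, spread-out grouping the note describes.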

Motion and Steering

Each Role configures separate Steering objects for body and head movement. The body steering controls locomotion while the head steering controls where the NPC looks:

  • Collision probes — collisionProbeDistance determines how far ahead the NPC checks for obstacles
  • Inertia — controls how quickly the NPC accelerates and decelerates
  • Knockback — defines the NPC's response to being hit
  • Entity avoidance — isAvoidingEntities controls whether the NPC pathfinds around other entities

Example: Patrol Guard NPC

Here's how the pieces fit together for a guard NPC that patrols waypoints, detects players, and calls for backup:

Sensors:

  • SensorPlayer with a 15-block detection range to spot approaching players
  • SensorDamage to react when attacked
  • SensorEvent in NpcOnly mode to receive alerts from other guards

Actions:

  • PatrolAction — walks a waypoint loop (root instruction)
  • InvestigateAction — moves toward detected player
  • CombatAction — engages hostile targets with CombatSupport
  • AlertAction — posts a threat to the Blackboard so nearby guards react via SensorEvent
  • PlaySound("guard_alert") + DisplayName("!") — visual and audio feedback

Blackboard keys:

  • detected_player (Entity) — written by SensorPlayer, read by InvestigateAction
  • under_attack (Boolean) — written by SensorDamage, read by CombatAction
  • alert_broadcast (Entity) — written by AlertAction, read by other guards via SensorEvent

Role configuration:

  • 40 max health, moderate speed, entity avoidance enabled
  • Flocking: low alignment (0.3), high separation (1.5), medium cohesion (0.8) — guards spread out but stay in the same area
  • Death instruction plays a collapse animation and drops guard_drops loot table

Custom NPC checklist

  • Define a BuilderRole with health, steering, and combat settings
  • Implement Sensors for each condition the NPC should detect
  • Implement Actions for each behavior the NPC should perform
  • Define typed Blackboard keys for Sensor-to-Action communication
  • Compose instructions using ActionList or InstructionRandomized
  • Configure flocking weights if the NPC should move in groups
  • Set up death instructions and drop lists
  • Test pathfinding, combat engagement, and edge cases like chunk boundaries