Beyond Scripted AI
Most game NPCs follow scripted behaviors or state machines: “If enemy seen, attack. If health low, flee.” While predictable and easy to implement, these approaches struggle to adapt to changing circumstances. What if your NPCs could plan their own actions based on goals?
Goal-Oriented Action Planning (GOAP) lets NPCs dynamically build plans to achieve their goals. Popularized by F.E.A.R. and adopted by many games since, GOAP produces emergent, intelligent behavior that feels surprisingly alive.
Let me show you how to implement GOAP in Go.
How GOAP Works
GOAP uses A* pathfinding, but instead of pathfinding in physical space, it searches through action space:
- World State: Boolean facts about the world (hasWeapon=true, enemyAlive=true)
- Actions: Things the NPC can do, each with preconditions and effects
- Goals: Desired world states the NPC wants to achieve
- Planner: Uses A* to find the cheapest sequence of actions to reach the goal
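To make that concrete: suppose an unarmed NPC's goal is enemyDefeated=true. Given the actions we define below, the planner discovers a chain like this on its own; nothing scripts the order:
// World state : hasWeapon=false, hasAmmo=false, enemyVisible=false
// Goal        : enemyDefeated=true
// Plan found  : GetWeapon -> GetAmmo -> FindEnemy -> AttackEnemy (total cost 11.5)
// Each step is chosen because its preconditions hold in the state reached so far
// and its effects move the world closer to the goal at the lowest total cost.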
Building the GOAP System
Let’s start with the world state representation:
package main
import (
	"container/heap"
	"fmt"
)
// WorldState represents the state of the world as key-value pairs
type WorldState map[string]bool
func (ws WorldState) Copy() WorldState {
	clone := make(WorldState, len(ws))
	for k, v := range ws {
		clone[k] = v
	}
	return clone
}
func (ws WorldState) Matches(goal WorldState) bool {
for key, value := range goal {
if ws[key] != value {
return false
}
}
return true
}
func (ws WorldState) Distance(goal WorldState) int {
distance := 0
for key, value := range goal {
if ws[key] != value {
distance++
}
}
return distance
}
// String prints keys in sorted order (fmt sorts map keys), so the output
// doubles as a stable lookup key for the planner's visited-state set.
func (ws WorldState) String() string {
	return fmt.Sprintf("%v", map[string]bool(ws))
}
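A quick illustration of how Matches and Distance behave; the helper function is purely illustrative and not part of the core system:
func exampleWorldState() {
	current := WorldState{"hasWeapon": true, "hasAmmo": false}
	goal := WorldState{"hasAmmo": true, "enemyDefeated": true}
	fmt.Println(current.Matches(goal))  // false: hasAmmo and enemyDefeated differ
	fmt.Println(current.Distance(goal)) // 2: the heuristic counts mismatched goal keys
}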
Defining Actions
Actions are the building blocks of GOAP plans:
// Action represents something an NPC can do
type Action struct {
Name string
Cost float64
Preconditions WorldState
Effects WorldState
}
func (a *Action) IsValid(state WorldState) bool {
for key, value := range a.Preconditions {
if state[key] != value {
return false
}
}
return true
}
func (a *Action) Apply(state WorldState) WorldState {
newState := state.Copy()
for key, value := range a.Effects {
newState[key] = value
}
return newState
}
func (a *Action) String() string {
return fmt.Sprintf("%s (cost: %.1f)", a.Name, a.Cost)
}
// Example actions for a combat NPC
var (
GetWeaponAction = &Action{
Name: "GetWeapon",
Cost: 2.0,
Preconditions: WorldState{
"hasWeapon": false,
},
Effects: WorldState{
"hasWeapon": true,
},
}
GetAmmoAction = &Action{
Name: "GetAmmo",
Cost: 1.5,
Preconditions: WorldState{
"hasWeapon": true,
"hasAmmo": false,
},
Effects: WorldState{
"hasAmmo": true,
},
}
FindEnemyAction = &Action{
Name: "FindEnemy",
Cost: 3.0,
Preconditions: WorldState{},
Effects: WorldState{
"enemyVisible": true,
},
}
AttackEnemyAction = &Action{
Name: "AttackEnemy",
Cost: 5.0,
Preconditions: WorldState{
"hasWeapon": true,
"hasAmmo": true,
"enemyVisible": true,
},
Effects: WorldState{
"enemyDefeated": true,
"hasAmmo": false,
},
}
TakeCoverAction = &Action{
Name: "TakeCover",
Cost: 1.0,
Preconditions: WorldState{
"enemyVisible": true,
},
Effects: WorldState{
"inCover": true,
},
}
HealAction = &Action{
Name: "Heal",
Cost: 4.0,
Preconditions: WorldState{
"hasHealthPack": true,
"injured": true,
},
Effects: WorldState{
"injured": false,
"hasHealthPack": false,
},
}
GetHealthPackAction = &Action{
Name: "GetHealthPack",
Cost: 2.5,
Preconditions: WorldState{
"hasHealthPack": false,
},
Effects: WorldState{
"hasHealthPack": true,
},
}
)
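A small sanity check showing how IsValid and Apply compose, using the actions above (again, the helper function is only for illustration):
func exampleActions() {
	state := WorldState{"hasWeapon": false, "hasAmmo": false}
	fmt.Println(GetAmmoAction.IsValid(state)) // false: GetAmmo needs hasWeapon=true
	state = GetWeaponAction.Apply(state)      // effects set hasWeapon=true
	fmt.Println(GetAmmoAction.IsValid(state)) // true
}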
The A* Planner
Now let’s implement the GOAP planner using A*:
// Node represents a state in the search space
type Node struct {
state WorldState
parent *Node
action *Action
gCost float64 // Cost from start
hCost float64 // Heuristic cost to goal
fCost float64 // Total cost
index int // For heap operations
}
// PriorityQueue implements heap.Interface for A* search
type PriorityQueue []*Node
func (pq PriorityQueue) Len() int { return len(pq) }
func (pq PriorityQueue) Less(i, j int) bool {
return pq[i].fCost < pq[j].fCost
}
func (pq PriorityQueue) Swap(i, j int) {
pq[i], pq[j] = pq[j], pq[i]
pq[i].index = i
pq[j].index = j
}
func (pq *PriorityQueue) Push(x interface{}) {
n := len(*pq)
node := x.(*Node)
node.index = n
*pq = append(*pq, node)
}
func (pq *PriorityQueue) Pop() interface{} {
old := *pq
n := len(old)
node := old[n-1]
old[n-1] = nil
node.index = -1
*pq = old[0 : n-1]
return node
}
// Planner finds a plan to achieve a goal
type Planner struct {
actions []*Action
}
func NewPlanner(actions []*Action) *Planner {
return &Planner{actions: actions}
}
func (p *Planner) Plan(start WorldState, goal WorldState) ([]*Action, bool) {
startNode := &Node{
state: start,
gCost: 0,
hCost: float64(start.Distance(goal)),
}
startNode.fCost = startNode.gCost + startNode.hCost
openSet := &PriorityQueue{startNode}
heap.Init(openSet)
closedSet := make(map[string]bool)
iterations := 0
maxIterations := 1000
for openSet.Len() > 0 && iterations < maxIterations {
iterations++
current := heap.Pop(openSet).(*Node)
// Check if we've reached the goal
if current.state.Matches(goal) {
return p.reconstructPlan(current), true
}
closedSet[current.state.String()] = true
// Explore neighbors (actions)
for _, action := range p.actions {
if !action.IsValid(current.state) {
continue
}
newState := action.Apply(current.state)
stateKey := newState.String()
if closedSet[stateKey] {
continue
}
gCost := current.gCost + action.Cost
hCost := float64(newState.Distance(goal))
fCost := gCost + hCost
neighbor := &Node{
state: newState,
parent: current,
action: action,
gCost: gCost,
hCost: hCost,
fCost: fCost,
}
heap.Push(openSet, neighbor)
}
}
return nil, false
}
func (p *Planner) reconstructPlan(node *Node) []*Action {
plan := make([]*Action, 0)
for node.parent != nil {
plan = append([]*Action{node.action}, plan...)
node = node.parent
}
return plan
}
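Before wrapping the planner in an NPC, you can call it directly. A minimal sketch reusing the actions defined earlier (the helper name is mine, not part of the system):
func examplePlanner() {
	planner := NewPlanner([]*Action{
		GetWeaponAction, GetAmmoAction, FindEnemyAction, AttackEnemyAction,
	})
	start := WorldState{"hasWeapon": false, "hasAmmo": false, "enemyVisible": false}
	goal := WorldState{"enemyDefeated": true}
	if plan, found := planner.Plan(start, goal); found {
		for _, action := range plan {
			fmt.Println(action) // prints each step, e.g. "GetWeapon (cost: 2.0)"
		}
	}
}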
Building an NPC with GOAP
Let’s create an NPC that uses GOAP to make decisions:
// NPC represents a game character using GOAP
type NPC struct {
Name string
State WorldState
Goals []Goal
Planner *Planner
Plan []*Action
PlanIndex int
}
type Goal struct {
Name string
State WorldState
Priority int
}
func NewNPC(name string, initialState WorldState, actions []*Action) *NPC {
return &NPC{
Name: name,
State: initialState,
Planner: NewPlanner(actions),
Goals: make([]Goal, 0),
}
}
func (npc *NPC) AddGoal(name string, state WorldState, priority int) {
npc.Goals = append(npc.Goals, Goal{
Name: name,
State: state,
Priority: priority,
})
}
func (npc *NPC) SelectGoal() *Goal {
	if len(npc.Goals) == 0 {
		return nil
	}
	// Pick the highest-priority goal that is not already satisfied;
	// whether it is actually achievable is discovered when planning.
	var bestGoal *Goal
	bestPriority := -1
	for i := range npc.Goals {
		if npc.State.Matches(npc.Goals[i].State) {
			continue // already satisfied, fall through to lower-priority goals
		}
		if npc.Goals[i].Priority > bestPriority {
			bestGoal = &npc.Goals[i]
			bestPriority = npc.Goals[i].Priority
		}
	}
	return bestGoal
}
func (npc *NPC) Update() {
// If no plan or plan is complete, create a new one
if npc.Plan == nil || npc.PlanIndex >= len(npc.Plan) {
goal := npc.SelectGoal()
if goal == nil {
fmt.Printf("[%s] No goals available\n", npc.Name)
return
}
fmt.Printf("\n[%s] Planning for goal: %s\n", npc.Name, goal.Name)
fmt.Printf("[%s] Current state: %s\n", npc.Name, npc.State)
fmt.Printf("[%s] Goal state: %s\n", npc.Name, goal.State)
plan, found := npc.Planner.Plan(npc.State, goal.State)
if !found {
fmt.Printf("[%s] No plan found for goal: %s\n", npc.Name, goal.Name)
npc.Plan = nil
return
}
npc.Plan = plan
npc.PlanIndex = 0
fmt.Printf("[%s] Plan created (%d steps):\n", npc.Name, len(plan))
for i, action := range plan {
fmt.Printf(" %d. %s\n", i+1, action.Name)
}
fmt.Println()
}
// Execute current action in plan
if npc.PlanIndex < len(npc.Plan) {
action := npc.Plan[npc.PlanIndex]
if !action.IsValid(npc.State) {
fmt.Printf("[%s] Action '%s' no longer valid, replanning...\n",
npc.Name, action.Name)
npc.Plan = nil
npc.PlanIndex = 0
return
}
fmt.Printf("[%s] Executing: %s\n", npc.Name, action.Name)
npc.State = action.Apply(npc.State)
fmt.Printf("[%s] New state: %s\n", npc.Name, npc.State)
npc.PlanIndex++
if npc.PlanIndex >= len(npc.Plan) {
fmt.Printf("[%s] Plan complete!\n\n", npc.Name)
npc.Plan = nil
}
}
}
func main() {
fmt.Println("=== GOAP AI Demo ===\n")
// Define available actions
actions := []*Action{
GetWeaponAction,
GetAmmoAction,
FindEnemyAction,
AttackEnemyAction,
TakeCoverAction,
HealAction,
GetHealthPackAction,
}
// Scenario 1: Combat NPC
fmt.Println("--- Scenario 1: Combat Mission ---")
combat := NewNPC("Soldier", WorldState{
"hasWeapon": false,
"hasAmmo": false,
"enemyVisible": false,
"enemyDefeated": false,
"inCover": false,
"injured": false,
"hasHealthPack": false,
}, actions)
combat.AddGoal("Defeat Enemy", WorldState{
"enemyDefeated": true,
}, 10)
// Run simulation
for i := 0; i < 5; i++ {
combat.Update()
}
// Scenario 2: Injured NPC needs to heal first
fmt.Println("\n--- Scenario 2: Survival Mode ---")
survivor := NewNPC("Survivor", WorldState{
"hasWeapon": true,
"hasAmmo": true,
"enemyVisible": false,
"enemyDefeated": false,
"inCover": false,
"injured": true,
"hasHealthPack": false,
}, actions)
survivor.AddGoal("Heal", WorldState{
"injured": false,
}, 15) // Higher priority
survivor.AddGoal("Defeat Enemy", WorldState{
"enemyDefeated": true,
}, 10)
for i := 0; i < 8; i++ {
survivor.Update()
}
fmt.Println("=== Demo Complete ===")
}
Advanced: Dynamic Action Costs
Actions can have dynamic costs based on context:
type DynamicAction struct {
*Action
CostFunc func(WorldState) float64
}
func (da *DynamicAction) GetCost(state WorldState) float64 {
if da.CostFunc != nil {
return da.CostFunc(state)
}
return da.Cost
}
// Example: Attack costs more when injured
var SmartAttackAction = &DynamicAction{
Action: AttackEnemyAction,
CostFunc: func(state WorldState) float64 {
if state["injured"] {
return 10.0 // More expensive when injured
}
return 5.0
},
}
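Note that the planner above reads Action.Cost directly, so a cost function only takes effect if the planner is taught to ask for it. One way to wire that in, sketched here as an assumption rather than a finished design, is a cost lookup the planner consults instead of action.Cost:
// Hypothetical registry of dynamic cost functions, keyed by action name.
var costFuncs = map[string]func(WorldState) float64{
	"AttackEnemy": func(state WorldState) float64 {
		if state["injured"] {
			return 10.0 // attacking while injured is discouraged
		}
		return 5.0
	},
}
// actionCost resolves an action's cost against the current state,
// falling back to the static Cost when no function is registered.
func actionCost(a *Action, state WorldState) float64 {
	if f, ok := costFuncs[a.Name]; ok {
		return f(state)
	}
	return a.Cost
}
// Inside Planner.Plan, the neighbor expansion would then use:
//   gCost := current.gCost + actionCost(action, current.state)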
Advanced: Sensor System
Real NPCs need to perceive their environment:
type Sensor interface {
UpdateWorldState(state WorldState) WorldState
}
type VisionSensor struct {
Range float64
}
func (vs *VisionSensor) UpdateWorldState(state WorldState) WorldState {
	// Simulate vision checks.
	// In a real game: raycasts, distance checks, field-of-view tests, etc.
	newState := state.Copy()
	enemyInRange := true // placeholder for a real perception query
	newState["enemyVisible"] = enemyInRange
	return newState
}
type HealthSensor struct{}
func (hs *HealthSensor) UpdateWorldState(state WorldState) WorldState {
newState := state.Copy()
// Update injury status based on health
health := 50.0 // Get from game state
newState["injured"] = health < 30.0
return newState
}
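To close the loop, run sensors before planning so the NPC reasons about fresh facts. A minimal sketch, assuming a small helper outside the NPC struct:
// RunSensors feeds every sensor's observations into the NPC's world state.
func RunSensors(npc *NPC, sensors []Sensor) {
	for _, s := range sensors {
		npc.State = s.UpdateWorldState(npc.State)
	}
}
// Per game tick (using the combat NPC from the demo):
//   RunSensors(combat, []Sensor{&VisionSensor{Range: 20}, &HealthSensor{}})
//   combat.Update()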
Benefits of GOAP
- Emergent Behavior: NPCs create unexpected but logical plans
- Flexibility: Easy to add new actions without changing AI code
- Debuggability: Plans are explicit sequences you can inspect
- Modularity: Actions are independent, reusable components
- Adaptability: Automatically replans when conditions change
When to Use GOAP
GOAP excels when:
- You want intelligent, adaptive NPCs
- Actions have complex preconditions and effects
- You need NPCs that can handle unexpected situations
- Building simulation games (The Sims-style)
- Creating tactical combat AI
Comparison: Behavior Trees vs GOAP
| Aspect | Behavior Trees | GOAP |
|---|---|---|
| Planning | Manual | Automatic |
| Flexibility | Scripted | Emergent |
| Complexity | Simple | Moderate |
| Performance | Fast | Slower (pathfinding) |
| Best for | Reactive AI | Goal-driven AI |
Performance Tips
- Cache Plans: Reuse plans when the world state hasn’t changed significantly (see the sketch after this list)
- Limit Actions: Fewer actions = faster planning
- Early Exit: Set max iteration limits for A*
- Pool Nodes: Reuse node objects to reduce allocations
- Incremental Replanning: Only replan when necessary
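For the first tip, a minimal plan cache might look like this; it is an assumed helper (not part of the planner above), keyed by the start and goal state strings, which works because WorldState.String prints keys in sorted order:
// PlanCache memoizes plans by start/goal state. Invalidate entries whenever
// the relevant parts of the world change.
type PlanCache struct {
	plans map[string][]*Action
}
func NewPlanCache() *PlanCache {
	return &PlanCache{plans: make(map[string][]*Action)}
}
func (c *PlanCache) key(start, goal WorldState) string {
	return start.String() + "|" + goal.String()
}
func (c *PlanCache) Get(start, goal WorldState) ([]*Action, bool) {
	plan, ok := c.plans[c.key(start, goal)]
	return plan, ok
}
func (c *PlanCache) Put(start, goal WorldState, plan []*Action) {
	c.plans[c.key(start, goal)] = plan
}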
Thank you
GOAP brings goal-driven intelligence to your NPCs. By letting them plan their own actions, you create dynamic, believable characters that adapt to changing game states. Combined with Go’s performance and simplicity, GOAP becomes a powerful tool in your game AI arsenal.
Please drop an email at [email protected] if you would like to share any feedback or suggestions. Peace!