Perception

The perception block transforms world state and agent state into brain sensor values. It runs at step 2 of the tick loop, after world machines and before the brain fires. Every sensor declared in the body block must have a corresponding sensor X = expr assignment in the perception block (except directional sensors on grid worlds, which are filled automatically).

Perception is where you control what the brain can “see.” Raw simulation state (positions in km, speeds in m/s, entity distances) is normalized and filtered into the 0-1 range that neural networks work with. You can also introduce noise, attention degradation, or information delay to model realistic sensing limitations.

perception OperatorSenses {
    let max_speed_ms = world.max_speed / 3.6

    -- Normalise speed to 0..1 range
    sensor current_speed = agent.speed / max_speed_ms

    -- Distance to next waypoint
    let chk = nearest_ahead(waypoint, agent.position)
    sensor waypoint_near = 1.0 - min(chk.distance / 2.0, 1.0)

    -- Speed limit in current zone
    let zone = speed_zone_at(agent.position)
    sensor speed_limit = zone / world.max_speed

    -- Warning level
    sensor warning_level = agent.warning_active ? 1.0 : 0.0

    -- Human factors (set by dynamics, forwarded to brain)
    sensor fatigue = agent.fatigue
    sensor stress = agent.stress
    sensor boredom = agent.boredom
    sensor cognitive_load = agent.cognitive_load

    -- Distance to next warning marker
    let wm = nearest_ahead(warning_marker, agent.position)
    sensor distance_ahead = 1.0 - min(wm.distance / 5.0, 1.0)
}

Each sensor X = expr line writes a value to the brain input node named X. The name must match a sensor X declaration in the body block.

-- Normalized computation (Human Factors)
sensor current_speed = agent.speed / max_speed_ms
-- Direct passthrough from agent state (Network Security)
sensor rate = agent.packet_rate
-- Ternary conditional (Human Factors)
sensor warning_level = agent.warning_active ? 1.0 : 0.0
-- Inverse distance mapping (Human Factors)
sensor waypoint_near = 1.0 - min(chk.distance / 2.0, 1.0)

Sensor values are typically normalized to the 0-1 range, matching the internal(0..1) sensor type. The perception block is responsible for this normalization; the body only declares the sensor's existence and type.
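For reference, the matching declarations on the body side might look like this. This is a sketch: the body block name and the exact declaration syntax are assumptions, using only sensor names and the internal(0..1) type mentioned on this page.

```
body Operator {
    -- Declarations only: no values are computed here.
    -- Each sensor assigned in perception needs one of these.
    sensor current_speed internal(0..1)
    sensor waypoint_near internal(0..1)
    sensor warning_level internal(0..1)
}
```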

For directional sensors (directional(range: N, directions: 4)) on grid worlds, the engine fills them automatically from entity positions. This is why the Survival perception block does not assign food_nearby or water_nearby explicitly:

perception ForagerSenses {
    sensor hunger = agent.hunger
    sensor thirst = agent.thirst
    sensor energy = agent.energy
    sensor health = agent.health
    sensor nausea = agent.nausea
    -- Directional sensing fills food_nearby and water_nearby
    -- automatically from entity positions on the grid
}

For explicit control over directional sensor values, use a for direction loop:

for direction in sensor.food_nearby.directions {
    sensor food_nearby[direction] = nearest(food, agent.position, direction).distance
}

The engine expands this at compile time into 4 (or 8) separate sensor assignments. For route topology, directional sensors are not typically used; there is only one direction (ahead). Use internal() sensors with scalar computations instead.
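On a 4-direction grid, the loop above is equivalent to writing out the assignments by hand. Conceptually, the compile-time expansion produces the following (a sketch, assuming directions are indexed 0-3):

```
-- Compile-time expansion of the for direction loop (4 directions)
sensor food_nearby[0] = nearest(food, agent.position, 0).distance
sensor food_nearby[1] = nearest(food, agent.position, 1).distance
sensor food_nearby[2] = nearest(food, agent.position, 2).distance
sensor food_nearby[3] = nearest(food, agent.position, 3).distance
```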


Local Variables

Local variables reduce repetition and improve readability. They are stack-allocated temporaries scoped to the perception block. See the let keyword reference at the end of this page for full documentation.

Bind a scalar expression to a name for reuse.

-- Human Factors: convert world max speed from km/h to m/s
let max_speed_ms = world.max_speed / 3.6

Bind a spatial query result. Entity properties are promoted to top-level fields on the result.

-- Human Factors: query the nearest waypoint ahead
let chk = nearest_ahead(waypoint, agent.position)
-- chk.distance - distance to the entity (built-in)
-- chk.index - ordinal index (built-in)
-- chk.position - promoted from waypoint properties
-- Human Factors: query the nearest warning marker
let wm = nearest_ahead(warning_marker, agent.position)
-- wm.distance - distance to the marker
-- wm.severity - promoted from warning_marker properties
-- Human Factors: query the current speed zone
let zone = speed_zone_at(agent.position)
-- zone is the speed limit value (scalar result)

Let bindings are local to their enclosing block. A let in perception is not visible in action, and vice versa. Each block has its own independent scope.


Match Expressions

A match expression is a piecewise function that computes a value based on conditions:

let approach_limit = match {
    when stn.distance > 0.16: zone_limit
    when stn.distance > 0.08: 60.0
    when stn.distance > 0.03: 45.0
    else: 15.0
}

Match evaluates conditions top-to-bottom and returns the value of the first matching branch. The else branch is the fallback.
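A match result can also feed a sensor directly, for example as a graded proximity signal. This is a sketch: sensor threat_level and the distance thresholds are illustrative, reusing the warning-marker query from the example above.

```
let wm = nearest_ahead(warning_marker, agent.position)
-- Map distance bands to a coarse 0 / 0.5 / 1.0 threat signal
sensor threat_level = match {
    when wm.distance < 1.0: 1.0
    when wm.distance < 3.0: 0.5
    else: 0.0
}
```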


Side Effects

Perception can modify agent state as a side effect, though this should be limited to bookkeeping tasks. Side effects use when guards.

All agent.* writes must match state declarations in the body block.
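A when-guarded bookkeeping write might look like this. This is a sketch: the speed threshold is illustrative, and agent.idle_ticks is borrowed from the Survival metric shown later on this page, so it would need a matching state declaration in the body block.

```
-- Bookkeeping: count ticks spent (nearly) stationary
when agent.speed < 0.1 {
    agent.idle_ticks += 1
}
```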


The perception block has access to the full expression language. See the expression reference at the end of this page.


Validation

  • Every sensor X = expr must match a sensor X declaration in the body block
  • Spatial queries (nearest_ahead, speed_zone_at) must match query declarations in the world block
  • All agent.* references must match state declarations in the body block
  • World state references (world.X) are read-only in perception

The let Keyword

The let keyword creates a local variable binding. It is available in perception, action, machine bodies, and entity handlers.

let dt = world.tick
let max_speed_ms = world.max_speed / 3.6

Binds the right-hand expression to the name on the left. The binding is evaluated once when the line executes.

When a spatial query is bound to a let, the result is a struct with built-in fields plus promoted entity properties.

let chk = nearest_ahead(waypoint, agent.position)
-- chk.distance - distance to entity (built-in)
-- chk.index - ordinal index in entity list (built-in)
-- chk.position - promoted from entity properties
-- chk.scheduled_time - promoted from entity properties

Let bindings are local to their enclosing block:

  • A let in a perception block is not visible in action
  • A let in an on_cross handler is not visible outside it
  • A let at machine level is visible to all states within that machine
| Context | Example |
| --- | --- |
| Perception block | `let max_speed_ms = world.max_speed / 3.6` |
| Action block | `let dt = world.tick` |
| Machine body | `let sig = nearest_ahead(signal, agent.position)` |
| Entity handler | `let limit_ms = zone / 3.6` |

Expression Reference

The full expression language available in perception, action, machine, and entity handler blocks.

Arithmetic:

| Operator | Description | Example |
| --- | --- | --- |
| `+` | Addition | `agent.speed + 1.0` |
| `-` | Subtraction | `1.0 - min(chk.distance / 2.0, 1.0)` |
| `*` | Multiplication | `agent.speed * dt / 1000.0` |
| `/` | Division | `agent.speed / max_speed_ms` |
Comparison:

| Operator | Description | Example |
| --- | --- | --- |
| `>` | Greater than | `actuator.block > 0.5` |
| `<` | Less than | `agent.speed < zone_limit_ms` |
| `>=` | Greater than or equal | `agent.position >= world.length` |
| `<=` | Less than or equal | `actuator.block <= 0.5` |
| `==` | Equal | `dir == none` |
| `!=` | Not equal | `dir != none` |
Logical:

| Operator | Description | Example |
| --- | --- | --- |
| `and` | Logical AND | `actuator.block > 0.5 and elapsed_in_state > 1.0` |
| `or` | Logical OR | `actuator.move_n > 0.5 or actuator.move_e > 0.5` |
| `not` | Logical NOT | `not agent.warning_acknowledged` |

Ternary conditional examples:

-- Human Factors
sensor warning_level = agent.warning_active ? 1.0 : 0.0
-- Network Security
correct: malicious ? 1.0 : 0.0
-- Survival
metric idle_rate = agent.ticks_alive > 0 ?
    agent.idle_ticks / agent.ticks_alive : 1.0
Value references:

| Pattern | Description | Example |
| --- | --- | --- |
| `agent.X` | Agent state | `agent.speed`, `agent.fatigue` |
| `world.X` | World constants and state | `world.max_speed`, `world.tick`, `world.length` |
| `actuator.X` | Brain output values | `actuator.brake`, `actuator.block` |
| `sensor.X` | Sensor metadata (directions) | `sensor.food_nearby.directions` |
| `result.X` | Query result fields | `chk.distance`, `chk.index` |
Unary:

| Operator | Description | Example |
| --- | --- | --- |
| `-x` | Negation | `-5.0`, `-agent.accel` |
| `!x` | Logical NOT (symbol) | `!agent.alive` |
| `not x` | Logical NOT (keyword) | `not agent.warning_acknowledged` |

Right-associative conditional expression.

sensor warning_level = agent.warning_active ? 1.0 : 0.0

Nesting is supported: a ? b ? 1 : 2 : 3 parses as a ? (b ? 1 : 2) : 3.

Condition-based (returns the value of the first matching branch):

match {
    when stn.distance > 0.16: zone_limit
    when stn.distance > 0.08: 60.0
    else: 15.0
}

Value-based (compares a target against patterns; _ is the wildcard default):

match direction {
    0 -> "north"
    1 -> "east"
    _ -> "unknown"
}
| Function | Signature | Description | Example |
| --- | --- | --- | --- |
| `min(a, b)` | `(float, float) -> float` | Minimum of two values | `min(chk.distance / 2.0, 1.0)` |
| `max(a, b)` | `(float, float) -> float` | Maximum of two values | `max(0, agent.speed + agent.accel * dt)` |
| `abs(x)` | `(float) -> float` | Absolute value | `abs(agent.accel)` |
| `clamp(val, lo, hi)` | `(float, float, float) -> float` | Clamp to range | `clamp((value - 0.5) / 1.5, 0, 1)` |
| `sqrt(x)` | `(float) -> float` | Square root | `sqrt(agent.distance)` |
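These functions compose naturally in perception normalizations. For example (a sketch: the obstacle query, sensor obstacle_near, and the 30-unit range are all illustrative):

```
-- Map distance 0..30 to a 1..0 proximity signal, clamped into range
let obs = nearest_ahead(obstacle, agent.position)
sensor obstacle_near = clamp(1.0 - obs.distance / 30.0, 0, 1)
```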

Used in statements (not inside expressions). Available in perception (for bookkeeping side effects), action, machine, and entity handler contexts.

| Operator | Description | Example |
| --- | --- | --- |
| `=` | Direct assignment | `agent.accel = -5.0` |
| `+=` | Add and assign | `agent.warnings_acknowledged += 1` |
| `-=` | Subtract and assign | `agent.hunger -= 0.3` |
| `*=` | Multiply and assign | `agent.confidence *= 0.99` |
| `/=` | Divide and assign | `agent.score /= 2.0` |

Listed from lowest to highest. Higher-precedence operators bind tighter.

| Level | Category | Operators | Associativity |
| --- | --- | --- | --- |
| 1 | Ternary | `? :` | Right |
| 2 | Logical OR | `or` | Left |
| 3 | Logical AND | `and` | Left |
| 4 | Comparison | `> < >= <= == !=` | Non-associative |
| 5 | Addition | `+ -` | Left |
| 6 | Multiplication | `* /` | Left |
| 7 | Unary | `-x !x not x` | Right (prefix) |
| 8 | Primary | Literals, identifiers, `.`, `()`, `[]`, `match` | Left (postfix) |
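Reading a compound condition against this table (the expression itself is illustrative):

```
-- Parses as:
--   ((agent.speed > zone_limit_ms) and (not agent.warning_active))
--     or (agent.fatigue > 0.8)
-- because not binds tighter than and, and and binds tighter than or
agent.speed > zone_limit_ms and not agent.warning_active or agent.fatigue > 0.8
```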

All statement types available in perception, action, machine, and entity handler blocks.

| Statement | Syntax | Description |
| --- | --- | --- |
| `let` | `let x = expr` | Local variable binding |
| `sensor` (perception only) | `sensor x = expr` | Write to brain input node |
| `when` (block) | `when cond { stmts }` | Independent conditional guard |
| `when` (single-line) | `when cond: stmt` | Single-line conditional |
| `when`/`else when`/`else` | `when cond { } else when cond { } else { }` | Mutually exclusive branches |
| Assignment | `target op expr` | Modify state (`=`, `+=`, `-=`, `*=`, `/=`) |
| `record` | `record type { fields }` | Emit a typed event record |
| `consume` | `consume()` | Remove current entity (in entity handlers) |
| Transition | `-> state_name` | State transition (in machines) |
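Several of these statements typically appear together in a machine state body. A sketch only: the warning state, the overspeed record type and its field syntax, and agent.violations are all hypothetical.

```
-- Hypothetical state body combining let, when, assignment, record, and a transition
let limit_ms = speed_zone_at(agent.position) / 3.6
when agent.speed > limit_ms {
    agent.violations += 1
    record overspeed { speed: agent.speed, limit: limit_ms }
    -> warning
}
```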