
Mutation Operators

The mutation block inside evolve configures eight mutation operators. Each operator has a probability (0.0-1.0) of firing during a single Mutate() call. All eight operators are evaluated independently per genome per generation, so multiple operators can fire on the same genome in the same generation.

Mutations are how evolution modifies brain topology between generations. Some mutations fine-tune what already exists (adjusting weights), others add new structure (inserting neurons or connections), and others trim away what is not working. Together they form the building blocks of structural innovation - the mechanism by which an initially simple brain can grow into a complex one over thousands of generations.

mutation {
  weight_shift: 0.8       // perturb connection weights
  bias_shift: 0.1         // perturb node biases
  add_connection: 0.05    // add a new connection
  remove_connection: 0.03 // disable a connection
  add_node: 0.03          // split a connection, inserting a hidden node
  remove_node: 0.01       // remove a hidden node
  rewire: 0.02            // move a connection endpoint
  change_activation: 0.01 // switch a hidden node's activation function
}
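The independent per-operator probability check can be sketched as follows. This is a minimal Python sketch, assuming a rates dict keyed by operator name; the mutate() signature and dispatch are illustrative stand-ins, not the library's actual API:

```python
import random

def mutate(genome, rates, rng=None):
    """Evaluate each operator's probability independently; several
    operators can fire on the same genome in a single call.
    `genome` and the operator dispatch are hypothetical stand-ins."""
    rng = rng or random.Random()
    fired = []
    for name, prob in rates.items():
        if rng.random() < prob:
            fired.append(name)  # the real engine would run the operator here
    return fired

# Rates mirroring the mutation block above.
rates = {
    "weight_shift": 0.8, "bias_shift": 0.1, "add_connection": 0.05,
    "remove_connection": 0.03, "add_node": 0.03, "remove_node": 0.01,
    "rewire": 0.02, "change_activation": 0.01,
}
```

Because each check is an independent coin flip, a genome can, for example, receive both a weight shift and a new connection in the same generation.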

weight_shift

Perturbs every connection weight in the genome. Each weight has a 90% chance of Gaussian perturbation (stddev controlled by WeightShiftStddev, default 0.1) and a 10% chance of full randomization within [-2, 2]. All weights are clamped to [-2, 2] after mutation.

This is the most common mutation - fine-tuning the strengths of existing connections is the primary mechanism for behavioral refinement.
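A sketch of the 90/10 perturb-vs-randomize rule with clamping. The function name and list-of-floats representation are illustrative, not the library's internal types:

```python
import random

WEIGHT_CLAMP = 2.0  # weights are kept in [-2, 2]

def weight_shift(weights, stddev=0.1, rng=None):
    """Per weight: 90% chance of a Gaussian nudge, 10% chance of full
    re-randomization. Every result is clamped to the valid range."""
    rng = rng or random.Random()
    out = []
    for w in weights:
        if rng.random() < 0.9:
            w += rng.gauss(0.0, stddev)                       # fine-tune
        else:
            w = rng.uniform(-WEIGHT_CLAMP, WEIGHT_CLAMP)      # reset
        out.append(max(-WEIGHT_CLAMP, min(WEIGHT_CLAMP, w)))  # clamp
    return out
```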

bias_shift

Perturbs the bias of every non-input node by a Gaussian sample (stddev controlled by BiasShiftStddev, default 0.05). Biases are clamped to [-2, 2] after mutation.

add_connection

Adds a new enabled connection between two unconnected nodes. The target must not be an input node, and the connection must not create a cycle (maintaining a feed-forward graph). The initial weight is drawn uniformly from [-1, 1].

When regions are defined, 80% of attempts preferentially select both endpoints from the same region. The remaining 20% use unrestricted selection.
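The feed-forward constraint reduces to a reachability test: adding source->target creates a cycle exactly when the target already reaches the source. A minimal sketch, assuming connections are (source, target) pairs (the real representation is not shown in this document):

```python
def creates_cycle(connections, source, target):
    """Return True if adding source->target would create a cycle,
    i.e. target already reaches source via existing connections."""
    stack, seen = [target], set()
    while stack:
        node = stack.pop()
        if node == source:
            return True
        if node in seen:
            continue
        seen.add(node)
        stack.extend(t for s, t in connections if s == node)
    return False
```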

remove_connection

Disables a randomly selected enabled connection, biased toward connections with smaller absolute weights (inverse-weight probability). A connection is never removed if it is the last enabled connection in the genome.
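The inverse-weight bias can be sketched as weighted sampling over the enabled connections. The dict-based connection representation is illustrative:

```python
import random

def pick_connection_to_disable(connections, rng=None):
    """Choose an enabled connection with probability inversely
    proportional to |weight|, so weak links are pruned first.
    Returns None rather than remove the last enabled connection."""
    rng = rng or random.Random()
    enabled = [c for c in connections if c["enabled"]]
    if len(enabled) <= 1:
        return None
    scores = [1.0 / (abs(c["weight"]) + 1e-6) for c in enabled]
    return rng.choices(enabled, weights=scores, k=1)[0]
```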

add_node

Splits a randomly chosen enabled connection A->B into two: A->New and New->B. The original connection is disabled. The incoming weight is 1.0 and the outgoing weight preserves the original, maintaining the prior function of the network.

When regions are defined, the new node inherits a region assignment:

  1. If the target node is hidden, inherit its region
  2. Else if the source node is hidden, inherit its region
  3. Otherwise, assign a random region

The new node’s activation function matches its assigned region’s default.
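The split itself is mechanical. A sketch assuming a dict-based connection (field names are illustrative):

```python
def split_connection(conn, new_node_id):
    """Disable A->B and replace it with A->New (weight 1.0) and
    New->B (original weight), so the network's output is unchanged."""
    conn["enabled"] = False
    incoming = {"source": conn["source"], "target": new_node_id,
                "weight": 1.0, "enabled": True}
    outgoing = {"source": new_node_id, "target": conn["target"],
                "weight": conn["weight"], "enabled": True}
    return incoming, outgoing
```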

remove_node

Removes a randomly chosen hidden node. All connections to and from the node are disabled. For each pair of an enabled incoming connection I and an enabled outgoing connection O, a bypass connection from I.source to O.target is created with weight I.weight * O.weight, approximating the removed node’s contribution.
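The bypass weights follow from composing the two linear steps around the removed node (its bias and activation function are dropped, which is why this is only an approximation). A sketch with an illustrative dict representation:

```python
def bypass_connections(incoming, outgoing):
    """For each enabled (I, O) pair through a removed hidden node,
    create I.source -> O.target with weight I.weight * O.weight."""
    return [
        {"source": i["source"], "target": o["target"],
         "weight": i["weight"] * o["weight"], "enabled": True}
        for i in incoming if i["enabled"]
        for o in outgoing if o["enabled"]
    ]
```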

rewire

Moves one endpoint of a randomly chosen enabled connection to a different node. Either the source or target is replaced (50/50 chance). The rewired connection must not duplicate an existing connection, must not target an input node, and must not create a cycle.

change_activation

Replaces the activation function of a randomly chosen hidden node with a different function, sampled uniformly from the other seven of the eight available: sigmoid, tanh, relu, leaky_relu, step, gaussian, linear, softplus.
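A sketch of the swap, excluding the node's current function so the result is always different (names illustrative):

```python
import random

ACTIVATIONS = ["sigmoid", "tanh", "relu", "leaky_relu",
               "step", "gaussian", "linear", "softplus"]

def change_activation(node, rng=None):
    """Swap a hidden node's activation for a uniformly chosen
    different one from the available set."""
    rng = rng or random.Random()
    choices = [a for a in ACTIVATIONS if a != node["activation"]]
    node["activation"] = rng.choice(choices)
    return node
```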


Category      Operators                        Typical Range   Purpose
Parametric    weight_shift, bias_shift         0.1 - 0.8       Fine-tune existing topology
Constructive  add_connection, add_node         0.01 - 0.1      Grow network complexity
Destructive   remove_connection, remove_node   0.01 - 0.05     Prune unused structure
Topological   rewire, change_activation        0.01 - 0.05     Explore alternative topologies

Higher constructive rates produce more complex networks faster but risk bloat. Higher destructive rates keep networks lean but can destroy useful structure. The defaults balance growth and pruning for most experiments.