Overview

4 Evolutionary algorithms

Evolutionary algorithms borrow their core idea from biological evolution: populations of varied individuals reproduce, inherit traits, and undergo random mutations, with selection pressure favoring fitter candidates. The chapter frames this intuition as a practical strategy for tackling large optimization spaces where many valid answers exist but some are better than others. It emphasizes that these methods are stochastic, typically aiming for high-quality, “good enough” solutions rather than guaranteed global optima, and uses the Knapsack Problem to anchor the discussion of how evolutionary search trades exhaustive enumeration for guided exploration.

The genetic algorithm’s workflow is presented as a recurring cycle: encode candidate solutions as chromosomes (for the knapsack, a binary string indicating item inclusion), initialize a diverse, constraint-aware population, evaluate fitness (e.g., maximize value without exceeding weight), select parents probabilistically (such as roulette-wheel selection), and generate offspring via crossover and mutation. Multiple crossover schemes—single-point, two-point, and uniform—mix parental genes to create variety, while mutation (e.g., bit flips) injects additional diversity to avoid premature convergence. Next-generation formation can follow steady-state or generational models, with continual re-evaluation of fitness and stopping conditions based on a fixed generation count, reaching a target fitness, or detecting stagnation. A central theme is balancing exploration (diversity early on) with exploitation (convergence toward strong regions) to reduce the risk of getting stuck in local optima.
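This cycle can be sketched in a few dozen lines of Python. The item values, weights, and parameter choices below are illustrative, not taken from the chapter; the sketch uses roulette-wheel selection, single-point crossover, bit-flip mutation, a generational replacement model, and a fixed-generation stopping condition.

```python
import random

# Toy knapsack instance (hypothetical values/weights/capacity).
VALUES   = [60, 100, 120, 30, 50]
WEIGHTS  = [10, 20, 30, 5, 15]
CAPACITY = 40

def fitness(chromosome):
    """Total packed value if within capacity, else 0 (invalid solutions score terribly)."""
    weight = sum(w for w, bit in zip(WEIGHTS, chromosome) if bit)
    value  = sum(v for v, bit in zip(VALUES, chromosome) if bit)
    return value if weight <= CAPACITY else 0

def roulette_select(population):
    """Pick one parent with probability proportional to fitness."""
    total = sum(fitness(c) for c in population)
    if total == 0:                      # all invalid: fall back to a uniform pick
        return random.choice(population)
    pick, running = random.uniform(0, total), 0
    for c in population:
        running += fitness(c)
        if running >= pick:
            return c
    return population[-1]

def crossover(a, b):
    """Single-point crossover producing one child."""
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:]

def mutate(chromosome, rate=0.05):
    """Bit-string mutation: flip each gene with a small probability."""
    return [bit ^ 1 if random.random() < rate else bit for bit in chromosome]

def genetic_algorithm(pop_size=20, generations=50):
    population = [[random.randint(0, 1) for _ in VALUES] for _ in range(pop_size)]
    for _ in range(generations):        # fixed-generation stopping condition
        population = [
            mutate(crossover(roulette_select(population), roulette_select(population)))
            for _ in range(pop_size)
        ]                               # generational replacement model
    return max(population, key=fitness)

best = genetic_algorithm()
print(best, fitness(best))
```

A production version would add survivor selection and elitism so the best individual is never lost between generations; this sketch keeps the loop minimal to mirror the five steps above.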

Effective application hinges on careful configuration: choosing an encoding that matches the problem, setting population size and offspring counts, selecting parent and survivor strategies, picking crossover and mutation methods and rates, and defining sensible stopping criteria. The chapter underlines trade-offs between computational efficiency and solution quality, showing how tuned genetic algorithms can dramatically outperform brute force on large instances while remaining adaptable to varied domains. Beyond knapsack-style packing and routing, examples include feature selection in machine learning, modeling investor behavior, and code breaking, with the option to combine evolutionary search with other techniques for even more powerful hybrid solutions.

The idea of linear human evolution vs. actual human evolution
The evolution of the peppered moth
A simple example of reproduction and mutation
A simple Knapsack Problem example
The optimal solution for the simple Knapsack Problem example
Performance analytics of brute-forcing the Knapsack Problem
Local best vs. global best
Diversity to convergence
Genetic algorithm life cycle
Encode the solution.
Binary-encoding the Knapsack Problem
Binary-encoding the larger dataset for the Knapsack Problem
Create an initial population.
An example of a population of solutions
Measure the fitness of individuals.
Measuring the fitness of individuals
Select parents.
Determining the probability of selection for each individual
Reproduce offspring.
Single-point crossover
Two-point crossover
Uniform crossover
Bit-string mutation
Flip-bit mutation
Populate the next generation.
Measure the fitness of individuals.
Brute-force performance vs. genetic algorithm performance

Summary of evolutionary algorithms

FAQ

What are evolutionary algorithms, and when should I use them?
They are optimization methods inspired by natural evolution. They search large, complex spaces by evolving a population of candidate solutions. Use them when the solution space is huge, constraints make brute force impractical, or multiple near-optimal answers exist—for example, knapsack packing, routing, scheduling, feature selection, or layout optimization.
How does a genetic algorithm work at a high level?
It iterates through these steps: create an initial random population; evaluate each individual with a fitness function; select parents probabilistically based on fitness; create offspring via crossover and mutation; select survivors to form the next generation; repeat until a stopping condition is met.
What’s the difference between a local best and a global best, and how do GAs avoid getting stuck?
A global best is the overall optimum; a local best is a suboptimal peak. GAs reduce the risk of local traps by maintaining diversity (random initialization, adequate population size, appropriate mutation and crossover), and by using selection that preserves some weaker individuals to keep exploring.
How should I encode solutions for the Knapsack Problem?
Use binary encoding: a chromosome is a bit string of length equal to the number of items; 1 means include the item, 0 means exclude it. Ensure constraints are respected—either prevent invalid individuals or assign them very poor fitness so they are unlikely to survive.
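A minimal sketch of this encoding, with hypothetical item names and weights (the `is_valid` helper is illustrative, not from the chapter):

```python
# Binary encoding for a 5-item knapsack (items here are hypothetical).
# Bit i of the chromosome answers: "is item i packed?"
items   = ["laptop", "book", "camera", "snacks", "jacket"]
weights = [3, 1, 2, 1, 2]             # hypothetical weights
chromosome = [1, 0, 1, 0, 1]          # pack laptop, camera, and jacket

packed = [name for name, bit in zip(items, chromosome) if bit]
print(packed)                         # ['laptop', 'camera', 'jacket']

def is_valid(chromosome, weights, capacity):
    """Constraint check: reject individuals over the weight limit."""
    return sum(w for w, bit in zip(weights, chromosome) if bit) <= capacity

print(is_valid(chromosome, weights, capacity=8))   # True: 3 + 2 + 2 = 7 <= 8
```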
What makes a good fitness function, and how is fitness computed for Knapsack?
A good fitness function aligns with the problem’s objective and enforces constraints. For Knapsack, fitness can be the total value of included items, provided the total weight does not exceed capacity; otherwise, assign a terrible score (e.g., 0). You can also frame it as minimizing unused capacity—choose the definition that best fits your goal.
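One way this fitness definition might look (the function name and the example values are illustrative):

```python
def knapsack_fitness(chromosome, values, weights, capacity):
    """Sum of packed values; weight overruns get a terrible score (0)."""
    weight = sum(w for w, bit in zip(weights, chromosome) if bit)
    if weight > capacity:
        return 0                      # invalid individuals are unlikely to survive
    return sum(v for v, bit in zip(values, chromosome) if bit)

# Items 0 and 2 packed: weight 10 + 30 = 40 <= 50, value 60 + 120 = 180.
print(knapsack_fitness([1, 0, 1], values=[60, 100, 120],
                       weights=[10, 20, 30], capacity=50))   # 180
```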
How are parents selected, and what is roulette-wheel selection?
Parents are chosen probabilistically based on fitness. In roulette-wheel selection, each individual gets a “slice” proportional to its fitness; sampling the wheel picks parents. Fitter individuals are more likely to be chosen, but less fit ones still have a chance, which helps maintain diversity.
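A small sketch of the wheel (the function name and toy fitness values are illustrative, not from the chapter):

```python
import random

def roulette_wheel(population, fitnesses):
    """Each individual's slice of the wheel is proportional to its fitness."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]             # guard against floating-point rounding

random.seed(0)                        # for a repeatable demonstration
pop  = ["A", "B", "C"]
fits = [10, 30, 60]                   # C owns 60% of the wheel
picks = [roulette_wheel(pop, fits) for _ in range(10_000)]
print(round(picks.count("C") / len(picks), 2))   # close to 0.6
```

Note that "A" is still picked roughly 10% of the time — weaker individuals survive occasionally, which is exactly the diversity-preserving property the answer above describes.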
What crossover strategies are commonly used?
- Single-point: split at one index; swap tails/heads between two parents. Simple and fast.
- Two-point: choose two split points; swap the segment between them. Increases mixing.
- Uniform: use a random mask to pick each gene from either parent. Maximizes diversity.
Choose a method compatible with your encoding and constraints.
What is mutation, and how do I choose a mutation rate?
Mutation makes small random changes to offspring to sustain diversity. In binary encodings:
- Bit-string mutation flips one (or a few) random bits.
- Flip-bit mutation inverts all bits (use sparingly).
Set higher mutation early for exploration and reduce it later for exploitation; too high a rate can destroy good solutions, too low can cause stagnation.
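Both operators in a minimal sketch (function names are illustrative):

```python
import random

def bit_string_mutation(chromosome, rate=0.05):
    """Flip each gene independently with a small probability."""
    return [g ^ 1 if random.random() < rate else g for g in chromosome]

def flip_bit_mutation(chromosome):
    """Invert every gene -- a drastic operator, use sparingly."""
    return [g ^ 1 for g in chromosome]

print(flip_bit_mutation([1, 0, 1, 1]))   # [0, 1, 0, 0]
```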
What are steady-state vs. generational population models?
Steady-state replaces only a portion of the population each generation (e.g., remove the weakest and insert new offspring), preserving continuity. Generational creates enough offspring to replace the entire population each cycle. Both can use selection and (optionally) some elitism; steady-state tends to maintain more ongoing diversity.
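One possible sketch of the two replacement steps, assuming offspring have already been produced (function names and the one-individual elitism in the generational step are illustrative choices, not the chapter's prescription):

```python
def steady_state_step(population, offspring, fitness):
    """Replace only the weakest individuals with the new offspring."""
    keep = len(population) - len(offspring)
    survivors = sorted(population, key=fitness, reverse=True)[:keep]
    return survivors + offspring

def generational_step(population, offspring, fitness):
    """Replace the whole population each cycle (here with elitism of one)."""
    elite = max(population, key=fitness)
    return [elite] + offspring[: len(population) - 1]

fitness = sum   # toy fitness: more 1-bits is better
pop  = [[1, 1, 0], [1, 0, 0], [0, 0, 0]]
kids = [[1, 1, 1]]
print(steady_state_step(pop, kids, fitness))   # weakest [0, 0, 0] is replaced
```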
When should the algorithm stop?
Common stopping conditions include: a fixed number of generations; achieving a target fitness; or detecting stagnation (fitness improves only marginally over several generations). Choose a criterion that balances solution quality with available compute time.
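The three criteria can be combined in one predicate; this is a sketch with illustrative parameter names (`patience` and `min_delta` are my choices for expressing "improves only marginally over several generations"):

```python
def should_stop(history, generation, max_generations=100,
                target=None, patience=10, min_delta=1e-6):
    """history is the best fitness per generation, newest last."""
    if generation >= max_generations:                  # fixed generation budget
        return True
    if target is not None and history and history[-1] >= target:   # target reached
        return True
    if len(history) > patience:                        # stagnation check
        recent = history[-patience:]
        if max(recent) - min(recent) < min_delta:
            return True
    return False

print(should_stop([5, 5, 5, 5], generation=4, patience=3))   # True (stagnated)
```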
