Summary
The primary concepts behind genetic algorithms are biological evolution and representation:
- Genetic algorithms are based on natural selection, in which only the fittest individuals survive.
- Solutions are encoded as sequences of genes, known as genotypes. Arbitrary data structures such as trees or graphs can also be evolved.
- An evaluation function is used to estimate the fitness of each solution. The fitness value can be postprocessed to maintain evolutionary pressure.
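As a rough illustration of these ideas, the sketch below encodes a genotype as a fixed-length sequence of floating-point genes, wraps a placeholder evaluation function, and applies linear scaling as one common way to postprocess raw fitness values so that selective pressure is maintained. The types, names, and scaling constants are assumptions for illustration, not the book's actual code.

```cpp
#include <vector>
#include <algorithm>
#include <numeric>

// A genotype is simply a fixed-length sequence of genes.
using Genotype = std::vector<float>;

// Placeholder evaluation function: maps a candidate solution to a raw
// fitness score (here just the sum of its genes, for illustration only).
float evaluate(const Genotype& g)
{
    return std::accumulate(g.begin(), g.end(), 0.0f);
}

// Linear scaling is one way to postprocess fitness values so the best
// individuals keep a selective advantage even late in the run.
std::vector<float> scaleFitness(const std::vector<float>& raw, float pressure = 2.0f)
{
    float minF = *std::min_element(raw.begin(), raw.end());
    float maxF = *std::max_element(raw.begin(), raw.end());
    float range = std::max(maxF - minF, 1e-6f);

    std::vector<float> scaled(raw.size());
    for (std::size_t i = 0; i < raw.size(); ++i)
        scaled[i] = 1.0f + pressure * (raw[i] - minF) / range;
    return scaled;
}
```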
The genetic algorithm itself consists of multiple stages:
- During initialization, a population of random individuals is created. Seeding enables the engineer to guide the evolution.
- The selection process picks two parents from the population, generally with an elitist policy.
- Crossover combines the genetic code of the two parents, and mutation randomly alters individual genes.
- The replacement policy decides whether and how the resulting offspring are reinserted into the population.
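Putting these stages together, a minimal generational loop might look like the following sketch. It assumes the Genotype type from the previous snippet, and uses binary tournament selection, one-point crossover, per-gene mutation, and best-individual elitism as illustrative choices rather than the book's implementation.

```cpp
#include <vector>
#include <random>
#include <algorithm>

std::mt19937 rng{std::random_device{}()};

// Initialization: create a population of random individuals.
std::vector<Genotype> initialize(std::size_t popSize, std::size_t numGenes)
{
    std::uniform_real_distribution<float> gene(0.0f, 1.0f);
    std::vector<Genotype> pop(popSize, Genotype(numGenes));
    for (auto& g : pop)
        for (auto& x : g) x = gene(rng);
    return pop;
}

// Selection: pick the fitter of two random individuals (binary tournament).
const Genotype& select(const std::vector<Genotype>& pop,
                       const std::vector<float>& fitness)
{
    std::uniform_int_distribution<std::size_t> pick(0, pop.size() - 1);
    std::size_t a = pick(rng), b = pick(rng);
    return fitness[a] > fitness[b] ? pop[a] : pop[b];
}

// Crossover: combine the genetic code of two parents at a random cut point.
Genotype crossover(const Genotype& mum, const Genotype& dad)
{
    std::uniform_int_distribution<std::size_t> cut(1, mum.size() - 1);
    std::size_t point = cut(rng);
    Genotype child(mum.begin(), mum.begin() + point);
    child.insert(child.end(), dad.begin() + point, dad.end());
    return child;
}

// Mutation: randomly replace individual genes with a small probability.
void mutate(Genotype& g, float rate = 0.01f)
{
    std::uniform_real_distribution<float> chance(0.0f, 1.0f);
    std::uniform_real_distribution<float> gene(0.0f, 1.0f);
    for (auto& x : g)
        if (chance(rng) < rate) x = gene(rng);
}

// One generation: breed a new population and replace the old one,
// carrying the single best individual over unchanged (elitism).
std::vector<Genotype> step(const std::vector<Genotype>& pop,
                           const std::vector<float>& fitness)
{
    std::vector<Genotype> next;
    next.reserve(pop.size());

    std::size_t best =
        std::max_element(fitness.begin(), fitness.end()) - fitness.begin();
    next.push_back(pop[best]);

    while (next.size() < pop.size())
    {
        Genotype child = crossover(select(pop, fitness), select(pop, fitness));
        mutate(child);
        next.push_back(child);
    }
    return next;
}
```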
Watch out for a few details in practice, such as premature convergence and slow convergence. These issues can usually be mitigated by tuning parameters such as the population size, mutation rate, and selection pressure. Despite not being particularly efficient, genetic algorithms are an extremely robust and flexible optimization strategy.
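The parameters involved in that tuning can be grouped into a small configuration structure. The names and default values below are illustrative assumptions, with comments noting which convergence problem each setting mainly affects.

```cpp
#include <cstddef>

// Illustrative GA parameters (names and defaults are assumptions, not the
// book's values). Comments note the convergence problem each one affects.
struct GAParameters
{
    std::size_t populationSize = 100;   // too small  -> premature convergence
    float       mutationRate   = 0.01f; // too low    -> premature convergence,
                                        // too high   -> slow convergence
    float       crossoverRate  = 0.7f;  // controls how aggressively solutions mix
    std::size_t eliteCount     = 1;     // heavy elitism speeds convergence but
                                        // reduces diversity
};
```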
Chapter 34, "Adaptive Defensive Strategies with Genetic Algorithms," applies genetic algorithms to learning simple behaviors such as rocket jumping and dodging rockets. The next chapter discusses a popular application of genetic algorithms in which they are combined with a rule-like representation.
Evolvee is a simple animat that uses a genetic algorithm to learn its behaviors. It relies on existing representations, such as neural networks, which are evolved according to a high-level metric. The realism of Evolvee is relatively poor, although simple tasks (such as attacking an enemy) do not pose a problem.