This section discusses Genetic Algorithms, an important optimization technique in AI, in detail.
Genetic Algorithms (GAs) are search-based algorithms founded on the concepts of natural selection and genetics. GAs are a subset of a much larger branch of computation known as Evolutionary Computation.
GAs were developed by John Holland and his students and colleagues at the University of Michigan, most notably David E. Goldberg. They have since been tried on various optimization problems with a high degree of success.
In GAs, we have a pool of possible solutions to the given problem. These solutions then undergo recombination and mutation (as in natural genetics), producing new children, and the process is repeated over many generations. Each individual (or candidate solution) is assigned a fitness value (based on its objective function value), and fitter individuals are given a higher chance to mate and yield fitter individuals. This is in line with the Darwinian theory of Survival of the Fittest.
Thus, the algorithm keeps evolving better individuals or solutions over generations until it reaches a stopping criterion.
Genetic Algorithms are substantially randomized in nature, but they perform much better than random local search (where we just try random solutions, keeping track of the best so far), because they exploit historical information as well.
Optimization is the action of making a design, situation, resource, or system as effective as possible.
The following is the sequence of steps in the GA mechanism when used for the optimization of problems.
Step 1 − Generate the initial population randomly.
Step 2 − Select the solutions with the best fitness values.
Step 3 − Recombine the selected solutions using crossover and mutation operators.
Step 4 − Insert the offspring into the population.
Step 5 − If the stop condition is met, return the solution with the best fitness value; otherwise, go back to Step 2 (these steps are put together in the sketch below).
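Putting these steps together, the whole cycle can be sketched in plain Python. The following is only an illustrative sketch, with an assumed toy fitness function (the number of ones in a bit string) and assumed parameter values; the DEAP example later in this section implements the same cycle with library operators −

import random

def fitness(individual):
    # Toy objective (an assumption for this sketch): count the ones
    return sum(individual)

def genetic_algorithm(num_bits=10, pop_size=20, num_generations=30):
    # Step 1 - generate the initial population randomly
    population = [[random.randint(0, 1) for _ in range(num_bits)]
                  for _ in range(pop_size)]
    # Step 5 - here the stop condition is simply a generation limit
    for _ in range(num_generations):
        # Step 2 - select the fitter solutions (here: the top half)
        population.sort(key=fitness, reverse=True)
        parents = population[:pop_size // 2]
        # Step 3 - recombine (one-point crossover) and mutate
        offspring = []
        while len(offspring) < pop_size // 2:
            p1, p2 = random.sample(parents, 2)
            point = random.randint(1, num_bits - 1)
            child = p1[:point] + p2[point:]
            if random.random() < 0.1:
                i = random.randrange(num_bits)
                child[i] = 1 - child[i]
            offspring.append(child)
        # Step 4 - insert the offspring into the population
        population = parents + offspring
    return max(population, key=fitness)

print(genetic_algorithm())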
To solve problems using Genetic Algorithms in Python, we are going to use a powerful GA package called DEAP. It is a novel evolutionary computation framework for rapid prototyping and testing of ideas. We can install this package with the following command at the command prompt −
pip install deap
If you are using an Anaconda environment, the following command can be used to install DEAP −
conda install -c conda-forge deap
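To verify the installation, you can check that the package imports correctly, for example −

# Quick check that DEAP is installed and importable
import deap
print(deap.__version__)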
This part explains the implementation of solutions using Genetic Algorithms.
The following example shows how to generate a bit string that contains 15 ones, based on the One Max problem.
Import the necessary packages as shown −
import random
from deap import base, creator, tools
Define the evaluation function. It is the first step in creating a genetic algorithm.
def eval_func(individual):
    # Reward individuals whose number of ones is close to the target of 15
    target_sum = 15
    return len(individual) - abs(sum(individual) - target_sum),
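Note the trailing comma in the return statement: DEAP expects every fitness to be a tuple, even for a single objective. As a quick sanity check (a hypothetical input, not part of the program flow), a 45-bit individual with exactly 15 ones receives the maximum score −

print(eval_func([1] * 15 + [0] * 30))   # prints (45,)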
Now, create the toolbox with the correct parameters −
def create_toolbox(num_bits):
    creator.create("FitnessMax", base.Fitness, weights=(1.0,))
    creator.create("Individual", list, fitness=creator.FitnessMax)
Initialize the toolbox −
    toolbox = base.Toolbox()
    toolbox.register("attr_bool", random.randint, 0, 1)
    toolbox.register("individual", tools.initRepeat, creator.Individual,
        toolbox.attr_bool, num_bits)
    toolbox.register("population", tools.initRepeat, list, toolbox.individual)
Register the evaluation operator −
toolbox.register("evaluate", eval_func)
Now, register the crossover operator −
toolbox.register("mate", tools.cxTwoPoint)
Register a mutation operator; indpb is the independent probability of each bit being flipped −
toolbox.register("mutate", tools.mutFlipBit, indpb = 0.05)
Register the selection operator, which selects the individuals for breeding −
toolbox.register("select", tools.selTournament, tournsize = 3) return toolbox if __name__ == "__main__": num_bits = 45 toolbox = create_toolbox(num_bits) random.seed(7) population = toolbox.population(n = 500) probab_crossing, probab_mutating = 0.5, 0.2 num_generations = 10 print('\nEvolution process starts')
Evaluate the entire population −
    fitnesses = list(map(toolbox.evaluate, population))
    for ind, fit in zip(population, fitnesses):
        ind.fitness.values = fit

    print('\nEvaluated', len(population), 'individuals')
Create and iterate through generations −
    for g in range(num_generations):
        print("\n- Generation", g)
Select the next-generation individuals −
        offspring = toolbox.select(population, len(population))
Now, clone the selected individuals −
        offspring = list(map(toolbox.clone, offspring))
Apply crossover and mutation on the offspring −
        for child1, child2 in zip(offspring[::2], offspring[1::2]):
            if random.random() < probab_crossing:
                toolbox.mate(child1, child2)
Delete the fitness values of the children, since they must be re-evaluated after crossover −
                del child1.fitness.values
                del child2.fitness.values
Now, apply mutation −
        for mutant in offspring:
            if random.random() < probab_mutating:
                toolbox.mutate(mutant)
                del mutant.fitness.values
Evaluate the individuals with an invalid fitness −
        invalid_ind = [ind for ind in offspring if not ind.fitness.valid]
        fitnesses = map(toolbox.evaluate, invalid_ind)
        for ind, fit in zip(invalid_ind, fitnesses):
            ind.fitness.values = fit
        print('Evaluated', len(invalid_ind), 'individuals')
Now, replace the population with the next-generation individuals −
        population[:] = offspring
Print the statistics for the current generation −
        fits = [ind.fitness.values[0] for ind in population]

        length = len(population)
        mean = sum(fits) / length
        sum2 = sum(x * x for x in fits)
        std = abs(sum2 / length - mean ** 2) ** 0.5

        print('Min =', min(fits), ', Max =', max(fits))
        print('Average =', round(mean, 2), ', Standard deviation =', round(std, 2))

    print("\n- Evolution ends")
Print the final output −
    best_ind = tools.selBest(population, 1)[0]
    print('\nBest individual:\n', best_ind)
    print('\nNumber of ones:', sum(best_ind))

The following would be the output −

Evolution process starts
Evaluated 500 individuals

- Generation 0
Evaluated 295 individuals
Min = 32.0 , Max = 45.0
Average = 40.29 , Standard deviation = 2.61

- Generation 1
Evaluated 292 individuals
Min = 34.0 , Max = 45.0
Average = 42.35 , Standard deviation = 1.91

- Generation 2
Evaluated 277 individuals
Min = 37.0 , Max = 45.0
Average = 43.39 , Standard deviation = 1.46

… … … …

- Generation 9
Evaluated 299 individuals
Min = 40.0 , Max = 45.0
Average = 44.12 , Standard deviation = 1.11

- Evolution ends

Best individual:
[0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 1, 0, 0, 1, 0, 1]

Number of ones: 15
Symbolic regression is one of the best-known problems in genetic programming. All symbolic regression problems use an arbitrary data distribution and try to fit the data as accurately as possible with a symbolic formula. Usually, a measure like the RMSE (Root Mean Square Error) is used to measure an individual's fitness. It is a classic regression problem, and here we are using the equation 5x³ − 6x² + 8x = 1. We need to follow all the steps from the previous example, but the main part is to create the primitive sets, because they are the building blocks for the individuals, so the evaluation can begin. Here we will be using the classic set of primitives.
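To make the idea of primitives concrete, here is a small standalone illustration (not part of the program below): a primitive set with just add and mul can represent the expression x·x + x as a tree, which DEAP compiles into an ordinary Python function −

import operator
from deap import gp

# A tiny primitive set with one input argument, renamed to 'x'
pset = gp.PrimitiveSet("MAIN", 1)
pset.addPrimitive(operator.add, 2)
pset.addPrimitive(operator.mul, 2)
pset.renameArguments(ARG0='x')

# Parse a tree for x*x + x and compile it into a callable
expr = gp.PrimitiveTree.from_string("add(mul(x, x), x)", pset)
func = gp.compile(expr, pset)
print(func(3))   # 3*3 + 3 = 12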
The following Python code explains this in detail −
import operator
import math
import random

import numpy as np
from deap import algorithms, base, creator, tools, gp

def division_operator(numerator, denominator):
    # Protected division: return 1 instead of failing on division by zero
    if denominator == 0:
        return 1
    return numerator / denominator

def eval_func(individual, points):
    # Transform the tree expression into a callable function
    func = toolbox.compile(expr=individual)
    # Mean squared error between the evolved function and the target
    # polynomial (assumed here: f(x) = 5x^3 - 6x^2 + 8x, from the equation above)
    mse = ((func(x) - (5 * x**3 - 6 * x**2 + 8 * x))**2 for x in points)
    return math.fsum(mse) / len(points),

def create_toolbox():
    pset = gp.PrimitiveSet("MAIN", 1)
    pset.addPrimitive(operator.add, 2)
    pset.addPrimitive(operator.sub, 2)
    pset.addPrimitive(operator.mul, 2)
    pset.addPrimitive(division_operator, 2)
    pset.addPrimitive(operator.neg, 1)
    pset.addPrimitive(math.cos, 1)
    pset.addPrimitive(math.sin, 1)
    pset.addEphemeralConstant("rand101", lambda: random.randint(-1, 1))
    pset.renameArguments(ARG0='x')

    creator.create("FitnessMin", base.Fitness, weights=(-1.0,))
    creator.create("Individual", gp.PrimitiveTree, fitness=creator.FitnessMin)

    toolbox = base.Toolbox()
    toolbox.register("expr", gp.genHalfAndHalf, pset=pset, min_=1, max_=2)
    toolbox.register("individual", tools.initIterate, creator.Individual,
        toolbox.expr)
    toolbox.register("population", tools.initRepeat, list, toolbox.individual)
    toolbox.register("compile", gp.compile, pset=pset)
    toolbox.register("evaluate", eval_func,
        points=[x / 10. for x in range(-10, 10)])
    toolbox.register("select", tools.selTournament, tournsize=3)
    toolbox.register("mate", gp.cxOnePoint)
    toolbox.register("expr_mut", gp.genFull, min_=0, max_=2)
    toolbox.register("mutate", gp.mutUniform, expr=toolbox.expr_mut, pset=pset)

    # Limit the tree height to keep the evolved expressions from bloating
    toolbox.decorate("mate",
        gp.staticLimit(key=operator.attrgetter("height"), max_value=17))
    toolbox.decorate("mutate",
        gp.staticLimit(key=operator.attrgetter("height"), max_value=17))

    return toolbox

if __name__ == "__main__":
    random.seed(7)
    toolbox = create_toolbox()
    population = toolbox.population(n=450)
    hall_of_fame = tools.HallOfFame(1)

    stats_fit = tools.Statistics(lambda x: x.fitness.values)
    stats_size = tools.Statistics(len)
    mstats = tools.MultiStatistics(fitness=stats_fit, size=stats_size)
    mstats.register("avg", np.mean)
    mstats.register("std", np.std)
    mstats.register("min", np.min)
    mstats.register("max", np.max)

    probab_crossover = 0.4
    probab_mutate = 0.2
    number_gen = 10

    population, log = algorithms.eaSimple(population, toolbox,
        probab_crossover, probab_mutate, number_gen,
        stats=mstats, halloffame=hall_of_fame, verbose=True)
Note that all the basic steps are the same as those used while generating bit patterns. This program will give us the output as min, max, and std (standard deviation) after 10 generations.
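Since the best individual is stored in the hall of fame, you can, for example, append the following line to the end of the main block to inspect the best expression found −

    # Print the best expression discovered (a gp.PrimitiveTree)
    print('\nBest expression:', hall_of_fame[0])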