Analysis of Optimization Algorithms
An iterative procedure, a self-organization system, two conflicting components, and three evolutionary operators
Algorithm as an Iterative Process  An algorithm A is an iterative process, that aims to generate a new and better solution xt+1 to a given problem from the current solution xt at iteration or time t
Algorithm as an Iterative Process  Newton-Raphson method to find the optimal value of f (x) is equivalent to finding the critical points or roots of f `(x) = 0 in a d-dimensional space Here x∗ is the optimal solution, or a fixed point of the iterative formula. Improve the convergence Convergence rate may become very slow near the optimal point
Algorithm as an Iterative Process  Optimal convergence of Newton-Raphson’s method leads to an optimal parameter setting p, which depends on the iterative formula and the optimality x∗ of the objective f (x) to be optimized.  Generally, the preceding iteration equation is rewritten which is valid for a deterministic method.
Algorithm as an Iterative Process  In modern metaheuristic algorithms, randomization is often used in an algorithm, and in many cases, randomization appears in the form of a set of m random variables ε = (ε1, . . . , εm) in an algorithm a vector of parameters p = (p1, . . . , pk ). where A is a nonlinear mapping from a given solution (a d-dimensional vector xt) to a new solution vector xt+1. for a trajectory-based, single-agent system
Algorithm as an Iterative Process  For population-based algorithms with a swarm of n solutions, the preceding iterative formula is extended to the following where p1, . . . , pk are k algorithm-dependent parameters and 1, . . . , m are m random variables
An Ideal Algorithm?  The number of iterations t needed to find an optimal solution for a given accuracy largely determines the overall computational efforts and the performance of an algorithm.  A better algorithm should use less computation and fewer iterations.
Self-Organization System  A complex system may be self-organizing under the right conditions:  when the size of the system is sufficiently large with a sufficiently high number of degrees of freedom or possible states S.  System must be allowed to evolve over a long time away from noise and far from equilibrium states.  Selection mechanism must be in place to ensure that self-organization is possible.  Main conditions for self-organization in a complex system  The system size is large with a sufficient number of degrees of freedom or states.  There is enough diversity in the system, such as perturbations, noise, or edge of chaos, or it is far from the equilibrium.  The system is allowed to evolve over a long time.  A selection mechanism or an unchanging law acts in the system.
Self-Organization System
Exploration and Exploitation  Exploration means to generate diverse solutions so as to explore the search space on a global scale  Algorithm searching for new solutions in new regions,  Exploitation means to focus on the search in a local region by exploiting the information that a current good solution is found in this region  Use already exist solutions and make refinement to it so it's fitness will improve
Exploration and Exploitation  Exploitation uses any information obtained from the problem of interest to help generate new solutions that are better than existing solutions. However, this process is typically local, and information (such as gradient) is also local.  Therefore, it is for local search.  Exploration makes it possible to explore the search space more efficiently, and it can generate solutions with enough diversity and far from the current solutions.  Therefore, the search is typically on a global scale
Exploration and Exploitation  Final balance is required so that an algorithm can achieve good performance.  Too much exploitation and too little exploration  System may converge more quickly, but the probability of finding the true global optimality may be low.  Too little exploitation and too much exploration  Cause the search path to wander around with very slow convergence  The optimal balance should mean the right amount of exploration and exploitation, which may lead to the optimal performance of an algorithm. Therefore, balance is crucially important.
Evolutionary Operators  Genetic algorithms (GA)  Gradient-free  Highly explorative  Parallelism  No gradient/derivative information is needed in GA, and thus GA can deal with complex, discontinuous problems  Stochastic nature of crossover and mutation make GA explore the search space more effectively and the global optimality is more likely to be reached.  genetic algorithms are population-based with multiple chromosomes, and thus it is possible to implement them in a parallel manner
3 key evolutionary operators  Crossover  Recombination of two parent chromosomes (solutions) by exchanging part of one chromosome with a corresponding part of another so as to produce offsprings (new solutions).  Mutation  Change of part of a chromosome (a bit or several bits) to generate new genetic characteristics. In binary encoding, mutation can be achieved simply by flipping between 0 and 1. Mutation can occur at a single site or multiple sites simultaneously  Selection  Survival of the fittest, which means the highest quality chromosomes and/characteristics will stay within the population. This often takes some form of elitism, and the simplest form is to let the best genes pass on to the next generations in the population.
THANK YOU