SIMULTANEOUS OPTIMIZATION OF NEURAL NETWORK WEIGHTS AND ACTIVE NODES USING METAHEURISTICS
Paper link: http://ieeexplore.ieee.org/document/7086207/
Authors: V. K. Ojha, A. Abraham, V. Snášel. IT4Innovations, VŠB Technical University of Ostrava, Ostrava, Czech Republic.
Citation: V. K. Ojha, A. Abraham and V. Snášel, "Simultaneous optimization of neural network weights and active nodes using metaheuristics," 2014 14th International Conference on Hybrid Intelligent Systems (HIS), Kuwait, 2014, pp. 248-253. doi: 10.1109/HIS.2014.7086207
CONTENT
• Introduction
• Neural Network
• Metaheuristic Algorithms
• Metaheuristic Framework for Transfer Function Optimization
• Results
• Conclusion
INTRODUCTION
• The Neural Network (NN) is a widely used computational tool for solving nonlinear and complex problems in optimization, pattern recognition, function approximation, classification, etc.
• Metaheuristic (MH) algorithms are stochastic search algorithms that efficiently explore and exploit the search space to offer near-optimal solutions to optimization problems.
• NN-MH: Metaheuristics can be used to optimize Neural Network parameters.
NEURAL NETWORK COMPONENTS
• Synaptic Weights
• Network Architecture
• Activation Nodes: Sigmoid, Gaussian, Tangent hyperbolic, Beta function
• Learning Rules
NEURAL NETWORK – OPTIMIZATION
• Components to optimize: synaptic weights, architecture, nodes, learning rules.
• Conventional algorithm: the Backpropagation Algorithm (Rumelhart and McClelland, 1986) is used for the optimization of synaptic weights.
• Evolutionary Artificial Neural Network (EANN): metaheuristic algorithms are used to optimize the other components of a Neural Network using a genotype representation of the phenotype. A survey on EANN is given by Yao (1999).
ACTIVE NODES: EXAMPLES
• Sigmoid
• Tangent Hyperbolic
[Figure: tangent hyperbolic with varying θ and fixed λ; tangent hyperbolic with varying λ and fixed θ]
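The figure panels referenced above did not survive extraction. As a minimal sketch of what such two-parameter nodes look like, assuming λ acts as a slope and θ as a center shift (the paper's exact parameterization may differ):

```python
import numpy as np

def sigmoid(x, lam=1.0, theta=0.0):
    # Two-parameter sigmoid: lam controls the slope, theta shifts the center.
    return 1.0 / (1.0 + np.exp(-lam * (x - theta)))

def tanh_node(x, lam=1.0, theta=0.0):
    # Two-parameter tangent hyperbolic with the same role for lam and theta.
    return np.tanh(lam * (x - theta))
```

At λ = 1 and θ = 0 these reduce to the fixed SigFix and TanhFix nodes used as baselines later in the deck.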
GENOTYPE REPRESENTATION OF NN
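The genotype figure itself was lost in extraction. As a hypothetical illustration, assuming a single hidden layer and one (λ, θ) pair per hidden node (the layout, names, and architecture here are assumptions, not the paper's exact encoding), the genotype is one flat real-valued vector [weights | λ1, θ1, ..., λH, θH] that a metaheuristic searches directly:

```python
import numpy as np

def decode_and_predict(g, X, n_in, n_hidden, n_out):
    # Hypothetical split of a flat genotype into weights and node parameters.
    n_w1 = (n_in + 1) * n_hidden                 # input -> hidden (+ bias)
    n_w2 = (n_hidden + 1) * n_out                # hidden -> output (+ bias)
    w1 = g[:n_w1].reshape(n_in + 1, n_hidden)
    w2 = g[n_w1:n_w1 + n_w2].reshape(n_hidden + 1, n_out)
    lam, theta = g[n_w1 + n_w2:].reshape(n_hidden, 2).T
    # Forward pass with parametric tanh nodes in the hidden layer.
    Xb = np.hstack([X, np.ones((len(X), 1))])
    h = np.tanh(lam * (Xb @ w1 - theta))
    hb = np.hstack([h, np.ones((len(h), 1))])
    return (hb @ w2).argmax(axis=1)              # predicted class labels

def fitness(g, X, y, n_in, n_hidden, n_out):
    # Objective for the metaheuristic: fraction of examples misclassified.
    return np.mean(decode_and_predict(g, X, n_in, n_hidden, n_out) != y)
```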
METAHEURISTICS
• A metaheuristic finds a solution to a problem using certain rules or mechanisms (Glover, 1986).
• The operators of metaheuristics:
• Transition: searching for solutions (exploration and exploitation).
• Evaluation: evaluating the objective function.
• Determination: deciding the search directions.
• Verifying Goal: checking convergence.
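These four operators fit one generic search loop. A minimal sketch, where init_population and transition are illustrative placeholders for algorithm-specific logic (not names from the paper):

```python
import numpy as np

def metaheuristic(objective, init_population, transition,
                  max_iters=500, target=1e-6):
    pop = init_population()                          # initial candidates
    fit = np.array([objective(s) for s in pop])      # Evaluation
    for _ in range(max_iters):
        # Transition + Determination: current fitness steers where the
        # next candidates are generated (exploration vs. exploitation).
        pop = transition(pop, fit)
        fit = np.array([objective(s) for s in pop])  # Evaluation
        if fit.min() <= target:                      # Verifying Goal
            break
    best = fit.argmin()
    return pop[best], fit[best]
```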
METAHEURISTICS
• Artificial Bee Colony (ABC) (Karaboga, 2005) is a metaheuristic algorithm inspired by the foraging behavior of a honey bee swarm; food-source positions are updated by the artificial bees in an iterative fashion.
• Particle Swarm Optimization (PSO) (Eberhart and Kennedy, 1995) is a population-based metaheuristic algorithm that imitates the foraging behavior of swarms; it depends on the velocity and position updates of the particles in the swarm.
• Differential Evolution (DE) (Storn and Price, 1995) is an Evolutionary-Algorithm-based optimization algorithm [operators: mutation, crossover, and selection].
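To make the velocity and position updates concrete, here is a standard global-best PSO in sketch form; the inertia and acceleration constants (w, c1, c2) and the search bounds are illustrative defaults, not the parameter settings used in the paper:

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, lo=-1.5, hi=1.5):
    x = np.random.uniform(lo, hi, (n_particles, dim))  # positions
    v = np.zeros((n_particles, dim))                   # velocities
    pbest = x.copy()                                   # personal bests
    pbest_f = np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()             # global best
    for _ in range(iters):
        r1 = np.random.rand(n_particles, dim)
        r2 = np.random.rand(n_particles, dim)
        # Velocity: inertia + pull toward personal and global bests.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                     # position update
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()
```

With the genotype sketch above, objective could be the fitness function and dim the genotype length, so the swarm optimizes the weights and node parameters simultaneously.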
CLASSIFICATION DATASETS
            Iris    Cancer (WDBC)    Wine
Inputs         4               30      13
Classes        3                2       3
Examples     150              569     178

PARAMETER SETTING
[Parameter-setting table not recovered from the slide.]
RESULTS – CLASSICAL ALGORITHM FOR WEIGHTS OPTIMIZATION
• SigFix and TanhFix denote an activation function with fixed parameters and optimization of the synaptic weights only.
• Default values: λ = 1 and θ = 0.
• Weight initialization: [-1.5, 1.5].
RESULTS – IRIS (SIGFIX: 91.6%, TANHFIX: 95.6%)
Conclusion: With a fixed activation function, the metaheuristics do not perform well. However, the simultaneous optimization of synaptic weights and an adaptive transfer function excels in performance. ABC performs better than PSO and DE at the present parameter setup.
RESULTS – CANCER (SIGFIX: 93.3%, TANHFIX: 94.5%)
Conclusion: Metaheuristic-based optimization excels over the performance of the classical algorithm. ABC performs better than PSO and DE at the present parameter setup.
RESULTS – WINE (SIGFIX: 91.4%, TANHFIX: 95.5%)
Conclusion: Metaheuristic-based optimization (chiefly ABC) excels over the performance of the classical algorithm. ABC performs better than PSO and DE at the present parameter setup.
CONCLUSION
• An approach for the simultaneous optimization of Neural Network synaptic weights and transfer functions was presented.
• The proposed optimization model was tested on three benchmark datasets.
• The classical backpropagation algorithm was used to train an NN with a fixed activation function in order to obtain benchmark results for validating the metaheuristic-based simultaneous optimization of NN synaptic weights and transfer functions.
• The metaheuristic-based simultaneous optimization of the transfer function and synaptic weights was found to excel on all three benchmark datasets.
• At the present parameter setting, the ABC algorithm was found to perform better than PSO and DE.
• Initialization of the synaptic weights and the parameters of the transfer function influences the performance.
THANK YOU
