Outline

  • Abstract
  • Keywords
  • 1. Introduction
  • 2. Particle Swarm Optimization
  • 3. Gravitational Search Algorithm
  • 4. The Hybrid PSOGSA Algorithm
  • 5. PSO, GSA, and PSOGSA for Training FNNs
  • 5.1. Fitness Function
  • 5.2. Encoding Strategy
  • 6. Results and Discussion
  • 6.1. The N-Bit Parity (XOR) Problem
  • 6.2. Function Approximation Problem
  • 6.3. Iris Classification Problem
  • 7. Conclusion
  • References


Abstract

The Gravitational Search Algorithm (GSA) is a novel heuristic optimization method based on the law of gravity and mass interactions. This algorithm has been proven to have a good ability to search for the global optimum, but it suffers from slow search speed in the final iterations. This work proposes a hybrid of Particle Swarm Optimization (PSO) and GSA to resolve the aforementioned problem. In this paper, GSA and PSOGSA are employed as new training methods for Feedforward Neural Networks (FNNs) in order to investigate the efficiency of these algorithms in reducing the problems of trapping in local minima and the slow convergence rate of current evolutionary learning algorithms. The results are compared with a standard PSO-based learning algorithm for FNNs. The resulting accuracy of FNNs trained with PSO, GSA, and PSOGSA is also investigated. The experimental results show that PSOGSA outperforms both PSO and GSA for training FNNs in terms of convergence speed and avoidance of local minima. It is also shown that an FNN trained with PSOGSA has better accuracy than one trained with GSA.
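The abstract only summarizes the hybrid. As a hedged sketch of the commonly published PSOGSA formulation (in which GSA's gravitational acceleration plays the role of PSO's cognitive term while the social attraction toward the global best is retained), a single particle's velocity update might look like the following; the function name, parameter defaults, and coefficient names are illustrative assumptions, not taken from this paper:

```python
import numpy as np

def psogsa_velocity(v, x, accel, gbest, w=0.6, c1=2.0, c2=2.0, rng=None):
    """Sketch of one PSOGSA velocity update (illustrative names).

    v     -- current velocity vector of the particle
    x     -- current position vector of the particle
    accel -- gravitational acceleration computed by the GSA component
    gbest -- best position found by the swarm so far
    """
    rng = np.random.default_rng() if rng is None else rng
    r1 = rng.random(len(v))  # random weights for the GSA (cognitive) term
    r2 = rng.random(len(v))  # random weights for the PSO (social) term
    # inertia + gravitational acceleration + attraction toward gbest
    return w * v + c1 * r1 * accel + c2 * r2 * (gbest - x)
```

The position update would then be `x = x + psogsa_velocity(...)`, applied to every particle per iteration; the GSA part supplies exploration early on, while the pull toward `gbest` speeds up convergence in later iterations.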

Keywords: - - - - - - -

Conclusions

In this paper, two new training algorithms, called FNNGSA and FNNPSOGSA, are introduced and investigated utilizing GSA and PSOGSA. Three benchmark problems (3-bit XOR, function approximation, and Iris classification) are employed to evaluate the efficiency of these new learning algorithms, and the results are compared with FNNPSO. For all benchmark problems, FNNPSOGSA shows better performance in terms of convergence rate and avoidance of local minima. It is observed that FNNPSO gives the highest accuracy while FNNGSA shows the worst. Therefore, it can be concluded that the proposed FNNPSOGSA mitigates the problem of trapping in local minima while maintaining very good convergence speed. The results for FNNGSA also show that GSA alone is not well suited for training FNNs because of its slow search speed. In summary, the results demonstrate that FNNPSOGSA alleviates the problem of trapping in local minima and improves convergence speed compared to the existing learning algorithms for FNNs.
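The conclusions refer to encoding an FNN's weights as a candidate vector and using mean squared error (MSE) as the fitness function (Sections 5.1 and 5.2). As a hedged sketch of that idea, assuming a one-hidden-layer sigmoid network whose layer sizes and vector layout are illustrative choices of ours rather than details from the paper, a fitness evaluation could look like:

```python
import numpy as np

def fnn_mse_fitness(weights, inputs, targets, n_hidden=2):
    """Sketch of an MSE fitness for a flat weight vector (assumed layout).

    The vector packs, in order: input-to-hidden weights, hidden biases,
    hidden-to-output weights, and the single output bias.
    """
    n_in = inputs.shape[1]
    i = 0
    w1 = weights[i:i + n_in * n_hidden].reshape(n_in, n_hidden)
    i += n_in * n_hidden
    b1 = weights[i:i + n_hidden]
    i += n_hidden
    w2 = weights[i:i + n_hidden]
    i += n_hidden
    b2 = weights[i]

    sig = lambda z: 1.0 / (1.0 + np.exp(-z))  # sigmoid activation
    hidden = sig(inputs @ w1 + b1)            # hidden-layer outputs
    out = sig(hidden @ w2 + b2)               # network output per sample
    return float(np.mean((out - targets) ** 2))
```

A swarm trainer would then minimize this function over the flat vector, e.g. on the XOR truth table with `n_in=2`, `n_hidden=2` the vector has 2*2 + 2 + 2 + 1 = 9 entries; the particle (or mass) positions are these weight vectors.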
