A Review of Metaheuristic Algorithms in Optimization

Authors

Hossein Sharifzadeh and Nima Amjady, Semnan University

Abstract

As optimization problems have grown more complex and traditional analytical methods have failed to perform satisfactorily, the need for more powerful tools to solve these problems has become apparent. Beyond difficulties such as the need for guarantees of differentiability and continuity and the possibility of converging to a local optimum, the solution time of these methods grows exponentially for many problems. Metaheuristic algorithms emerged in response to this need. These methods require no derivative information about the problem, are able to escape local optima and discover the global optimum through their specific operators, and their required computation time grows only linearly or polynomially as the problem dimension increases. Nevertheless, because these methods are scattered across different studies and have not been fully organized, researchers lack an adequate understanding of the wide range of these algorithms, their mechanisms, and their characteristics. This paper attempts to introduce a number of the most important and most widely used of these algorithms (40 different metaheuristic algorithms) and to describe their main characteristics, such as the mechanism by which each searches the solution space of the optimization problem, its basic operators, and its source of inspiration. In addition, some distinguishing aspects of these algorithms, such as local and global search capability, the use of memory, and parameter tuning, are briefly discussed.

Keywords

  • Optimization
  • Analytical methods
  • Metaheuristic algorithms
Article Title [English]

A Review of Metaheuristic Algorithms in Optimization

Authors [English]

  • Hossein Sharifzadeh
  • Nima Amjady
Abstract [English]

With the continuously increasing complexity of optimization problems and the poor performance of conventional analytical methods, more powerful tools are required to cope with these problems. In addition to difficulties such as the requirement of differentiable and continuous models and the possibility of converging to a local optimum, the computational time of these methods grows exponentially for many problems. Metaheuristic algorithms have been introduced to overcome such challenges. These methods do not require derivative information, can escape local optima and discover the global optimum by means of their specific operators, and their computational time grows only linearly or polynomially with problem size. However, because these methods are scattered across different publications and are not well organized, researchers often do not know their characteristics and search mechanisms well. This paper aims to introduce some of the most important of these algorithms (40 different algorithms) and to describe their main characteristics, such as the way they search the solution space, their main operators, and their sources of inspiration. Moreover, some distinguishing characteristics of these algorithms, such as local and global search capability, the use of memory, and parameter-tuning methods, are discussed.
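As a minimal illustration of the behaviour described above (this sketch is not taken from the paper itself), the Python code below implements simulated annealing, one of the classic metaheuristics of the kind surveyed here. It relies only on objective-function evaluations, with no derivatives, and its acceptance rule occasionally accepts worse solutions, which is what allows the search to escape local optima. The test function and all parameter values (step, t0, cooling, iters) are illustrative assumptions.

```python
import math
import random


def simulated_annealing(objective, x0, step=0.5, t0=1.0, cooling=0.995, iters=5000):
    """Minimize `objective` over a real vector using no derivative information.

    Illustrative sketch: a Gaussian perturbation acts as the search operator,
    and the Metropolis acceptance rule occasionally takes uphill moves, which
    lets the search escape local optima.
    """
    x, fx = list(x0), objective(x0)
    best_x, best_fx = x, fx
    t = t0
    for _ in range(iters):
        # Candidate solution: a random neighbour of the current point.
        candidate = [xi + random.gauss(0.0, step) for xi in x]
        f_cand = objective(candidate)
        delta = f_cand - fx
        # Always accept improvements; accept worse moves with probability
        # exp(-delta / T), which shrinks as the temperature T is lowered.
        if delta < 0 or random.random() < math.exp(-delta / max(t, 1e-12)):
            x, fx = candidate, f_cand
            if fx < best_fx:
                best_x, best_fx = x, fx
        t *= cooling  # geometric cooling schedule
    return best_x, best_fx


if __name__ == "__main__":
    # Rastrigin function: highly multimodal, global minimum 0 at the origin.
    def rastrigin(v):
        return 10 * len(v) + sum(vi * vi - 10 * math.cos(2 * math.pi * vi) for vi in v)

    x_best, f_best = simulated_annealing(rastrigin, [3.0, -2.5])
    print(f"best point = {x_best}, objective = {f_best:.4f}")
```

The acceptance rule is the key design choice: while the temperature is high, the search can leave the basin of a local minimum, something a pure descent method cannot do without restarts; as the temperature drops, the behaviour gradually turns into a local search around the best region found.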

Keywords [English]

  • Optimization
  • Analytical methods
  • Metaheuristic algorithms
 