
There are aspects of the proposal that can be further investigated and improved in the future. These are listed below:

• PSO grammar: Although the proposed grammar covers several PSO components, new PSO components proposed in the literature that could be useful in composing the design may be investigated.

• Meta-features: In this work, five metrics were used to describe the fitness landscape of optimization problems. In the future, the application of other metrics can be investigated, or new metrics included, to enable a better characterization of the problem.
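As an illustration of this kind of landscape-description metric, the sketch below estimates fitness-distance correlation from a random sample of the search space. This metric is only an example of the family discussed here; it is not claimed to be one of the five metrics actually used in this work, and all names and parameter values are illustrative:

```python
import math
import random

def fitness_distance_correlation(f, dim, n=200, lo=-5.0, hi=5.0, seed=0):
    # Sample n random points in [lo, hi]^dim, then compute the Pearson
    # correlation between each sample's fitness and its Euclidean
    # distance to the best (lowest-fitness) sampled point.
    rng = random.Random(seed)
    pts = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    fits = [f(p) for p in pts]
    best = pts[fits.index(min(fits))]  # assumes minimization
    dists = [math.dist(p, best) for p in pts]
    mf, md = sum(fits) / n, sum(dists) / n
    cov = sum((x - mf) * (y - md) for x, y in zip(fits, dists))
    sf = math.sqrt(sum((x - mf) ** 2 for x in fits))
    sd = math.sqrt(sum((y - md) ** 2 for y in dists))
    return cov / (sf * sd)
```

A value near 1 suggests an easy, globally convex landscape; values near 0 or below suggest a more deceptive problem.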

• The system uses Euclidean distance to compute the similarity between meta-examples. Other means of performing this task can be investigated.
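The similarity computation described above can be sketched as follows; the data layout (a list of meta-feature vector / algorithm design pairs) and the function names are assumptions for illustration only:

```python
import math

def recommend(query, meta_base):
    # query: meta-feature vector describing the input problem.
    # meta_base: list of (meta_feature_vector, algorithm_design) pairs.
    # Returns the design stored for the known problem whose meta-features
    # are closest to the query under Euclidean distance.
    return min(meta_base, key=lambda ex: math.dist(query, ex[0]))[1]
```

Swapping `math.dist` for another measure (e.g., Manhattan or cosine distance) is exactly the kind of change this future-work item suggests investigating.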

• Incorporate a feedback mechanism and investigate its contribution to the architecture of the proposed approach.

• Recommendation: The proposed system recommends the best algorithm of the problem most similar to the input problem. Although this approach yielded good results, alternative strategies exist. One idea would be, instead of considering only the algorithm of the most similar problem, to consider the k most similar problems and build a resulting (ensemble) algorithm from the algorithms of those k problems (k is defined by the expert).
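A minimal sketch of this alternative k-nearest strategy, assuming meta-examples stored as (meta-feature vector, algorithm design) pairs, could look like:

```python
import math

def k_most_similar(query, meta_base, k):
    # Rank known problems by Euclidean distance between their
    # meta-feature vectors and the query's, then keep the designs of
    # the k closest ones as candidates for an ensemble (k is a value
    # the expert must choose).
    ranked = sorted(meta_base, key=lambda ex: math.dist(query, ex[0]))
    return [design for _, design in ranked[:k]]
```

How the k returned designs are then combined into a single ensemble algorithm is left open, as in the text above.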

• The proposed approach optimizes algorithms considering only a single objective: the fitness value achieved by the algorithm on a given problem. In the future, other objectives that may be relevant for building algorithms can be investigated, such as execution time and memory usage, among others.

