Low price
CHF190.40
Print on demand - the copy will be sourced for you.
This book delivers the state of the art in deep learning (DL) methods hybridized with evolutionary computation (EC). Over the last decade, DL has dramatically transformed many domains: computer vision, speech recognition, healthcare, and automatic game playing, to mention only a few. All DL models, whatever their architectures and algorithms, use multiple processing layers to extract a hierarchy of abstractions from data. Their remarkable successes notwithstanding, these powerful models face many challenges, and this book presents the collaborative efforts of EC researchers to solve some of the problems in DL. EC comprises optimization techniques that are useful when a problem is complex or poorly understood, or when insufficient information about the problem domain is available. This family of algorithms has proven effective on problems with challenging characteristics such as non-convexity, non-linearity, noise, and irregularity, which hamper most classic optimization schemes. Furthermore, EC has been extensively and successfully applied in artificial neural network (ANN) research, from parameter estimation to structure optimization. Consequently, EC researchers are keen to apply their arsenal to the design and optimization of deep neural networks (DNN). This book brings together recent progress in DL research, focusing on three sub-domains that integrate EC with DL: (1) EC for hyper-parameter optimization in DNN; (2) EC for DNN architecture design; and (3) deep neuroevolution. The book also presents interesting applications of DL with EC to real-world problems, e.g., malware classification and object detection, and covers recent applications of EC in DL, e.g., generative adversarial network (GAN) training and adversarial attacks. The book aims to prompt and facilitate research in DL with EC both in theory and in practice.
Presents a compilation of state-of-the-art research in deep learning using evolutionary computation
Features hyper-parameter optimization, deep neural network architecture design, and deep neuroevolution
Facilitates research both in theory and in practice
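To give a concrete flavour of the first sub-domain listed above, EC for hyper-parameter optimization, the following is a minimal illustrative sketch (not taken from the book) of a (mu + lambda)-style evolutionary search over two common DNN hyper-parameters. The search space, mutation operator, and fitness function are assumptions chosen purely for illustration; in practice the fitness of an individual would be the validation accuracy obtained by actually training a network with those hyper-parameters.

# Minimal sketch of evolutionary hyper-parameter search (illustrative only,
# not code from the book). Fitness here is a synthetic stand-in; replace it
# with "train a DNN and return validation accuracy" in a real setting.
import math
import random

SEARCH_SPACE = {
    "learning_rate": (1e-4, 1e-1),   # sampled log-uniformly
    "hidden_units": (16, 256),
}

def random_individual():
    lo, hi = SEARCH_SPACE["learning_rate"]
    lr = 10 ** random.uniform(math.log10(lo), math.log10(hi))
    units = random.randint(*SEARCH_SPACE["hidden_units"])
    return {"learning_rate": lr, "hidden_units": units}

def mutate(ind):
    child = dict(ind)
    child["learning_rate"] *= 10 ** random.gauss(0, 0.2)  # jitter on log scale
    child["hidden_units"] = max(16, min(256, child["hidden_units"] + random.randint(-32, 32)))
    return child

def fitness(ind):
    # Synthetic surface: rewards a learning rate near 1e-2 and more hidden units.
    lr_penalty = (math.log10(ind["learning_rate"]) + 2) ** 2
    return ind["hidden_units"] / 256 - lr_penalty

def evolve(generations=20, pop_size=10, offspring=10):
    # (mu + lambda) selection: parents and children compete, best pop_size survive.
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        children = [mutate(random.choice(population)) for _ in range(offspring)]
        population = sorted(population + children, key=fitness, reverse=True)[:pop_size]
    return population[0]

if __name__ == "__main__":
    print("best hyperparameters:", evolve())

Selection here is simple truncation; the chapters of the book explore richer evolutionary machinery, such as particle swarm optimisation and Cartesian genetic programming, for hyper-parameter tuning and architecture search.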
Authors
Hitoshi Iba received his Ph.D. degree from The University of Tokyo, Japan, in 1990. From 1990 to 1998, he was with the Electrotechnical Laboratory in Ibaraki, Japan. Since 1998, he has been with The University of Tokyo, where he is currently a professor in the Graduate School of Information Science and Technology. His research interests include evolutionary computation, artificial life, artificial intelligence, and robotics. He is an associate editor of the journal Genetic Programming and Evolvable Machines (GPEM). Dr. Iba is also an underwater naturalist and an experienced Professional Association of Diving Instructors (PADI) divemaster, having completed more than a thousand dives.
Nasimul Noman received his Ph.D. degree from The University of Tokyo, Japan, in 2007. He was a faculty member in the Department of Computer Science and Engineering, University of Dhaka, Bangladesh, from 2002 to 2012. In 2013, he joined the School of Electrical Engineering and Computing at The University of Newcastle, Australia, where he is currently a senior lecturer. His research interests include evolutionary computation, computational biology, bioinformatics, and machine learning.
Contents
Chapter 1: Evolutionary Computation and meta-heuristics
Chapter 2: A Shallow Introduction to Deep Neural Networks
Chapter 3: On the Assessment of Nature-Inspired Meta-Heuristic Optimization Techniques to Fine-Tune Deep Belief Networks
Chapter 4: Automated development of DNN based spoken language systems using evolutionary algorithms
Chapter 5: Search heuristics for the optimization of DBN for Time Series Forecasting
Chapter 6: Particle Swarm Optimisation for Evolving Deep Convolutional Neural Networks for Image Classification: Single- and Multi-objective Approaches
Chapter 7: Designing Convolutional Neural Network Architectures Using Cartesian Genetic Programming
Chapter 8: Fast Evolution of CNN Architecture for Image Classification
Chapter 9: Discovering Gated Recurrent Neural Network Architectures
Chapter 10: Investigating Deep Recurrent Connections and Recurrent Memory Cells Using Neuro-Evolution
Chapter 11: Neuroevolution of Generative Adversarial Networks
Chapter 12: Evolving deep neural networks for X-ray based detection of dangerous objects
Chapter 13: Evolving the architecture and hyperparameters of DNNs for malware detection
Chapter 14: Data Dieting in GAN Training
Chapter 15: One-Pixel Attack: Understanding and Improving Deep Neural Networks with Evolutionary Computation