Hands-On Generative Adversarial Networks with Keras
Your guide to implementing next-generation generative adversarial networks
About the Book
Develop generative models for a variety of real-world use cases and deploy them to production
Key Features
- Discover various GAN architectures using Python and the Keras library
- Understand how GAN models function with the help of theoretical and practical examples
- Apply what you learn to become an active contributor to open-source GAN applications
Book Description
Generative Adversarial Networks (GANs) have revolutionized the fields of machine learning and deep learning. This book will be your first step towards understanding GAN architectures and tackling the challenges involved in training them.
The book opens with an introduction to deep learning and generative models, and their applications in artificial intelligence (AI). You will then learn how to build, evaluate, and improve your first GAN with the help of easy-to-follow examples. The next few chapters guide you through training a GAN model to produce and improve high-resolution images. You will also learn how to implement conditional GANs, which let you control the characteristics of GAN outputs. You will then build on this knowledge by exploring a new training methodology for progressive growing of GANs. Moving on, you'll gain insights into state-of-the-art models in image synthesis, speech enhancement, and natural language generation using GANs. Finally, you'll learn to identify GAN samples with TequilaGAN.
By the end of this book, you will be well versed in the latest advances in the GAN framework, having worked through a variety of examples and datasets, and you will have the skills you need to implement GAN architectures for tasks across several domains, including computer vision, natural language processing (NLP), and audio processing.
Foreword by Ting-Chun Wang, Senior Research Scientist, NVIDIA
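To give a feel for the hands-on approach described above, here is a minimal sketch of the adversarial setup a first GAN typically uses: a generator that maps noise vectors to images and a discriminator that scores them. This is an illustrative assumption, not code from the book; it uses TensorFlow's bundled Keras API (tf.keras), and the layer sizes and hyperparameters are placeholders.

```python
# Minimal GAN sketch (illustrative, not the book's code), assuming tf.keras.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models, optimizers

latent_dim = 100  # size of the noise vector fed to the generator

# Generator: noise vector -> 28x28 grayscale image in [-1, 1]
generator = models.Sequential([
    layers.Dense(128, activation="relu", input_shape=(latent_dim,)),
    layers.Dense(28 * 28, activation="tanh"),
    layers.Reshape((28, 28, 1)),
])

# Discriminator: image -> probability that the image is real
discriminator = models.Sequential([
    layers.Flatten(input_shape=(28, 28, 1)),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
discriminator.compile(optimizer=optimizers.Adam(1e-4),
                      loss="binary_crossentropy")

# Combined model: freeze the discriminator so only the generator trains here
discriminator.trainable = False
gan = models.Sequential([generator, discriminator])
gan.compile(optimizer=optimizers.Adam(1e-4), loss="binary_crossentropy")

def train_step(real_images, batch_size=64):
    """One adversarial step on a batch of real images scaled to [-1, 1]."""
    noise = np.random.normal(size=(batch_size, latent_dim))
    fake_images = generator.predict(noise, verbose=0)
    # Train the discriminator on real (label 1) and fake (label 0) batches
    d_loss_real = discriminator.train_on_batch(real_images,
                                               np.ones((batch_size, 1)))
    d_loss_fake = discriminator.train_on_batch(fake_images,
                                               np.zeros((batch_size, 1)))
    # Train the generator to push the discriminator's output toward "real"
    g_loss = gan.train_on_batch(noise, np.ones((batch_size, 1)))
    return d_loss_real + d_loss_fake, g_loss
```

The two-model pattern above (a compiled discriminator plus a combined model with the discriminator frozen) is the standard way to alternate the two training objectives in Keras.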
What you will learn
- Learn how GANs work and the advantages and challenges of working with them
- Control the output of GANs with conditional GANs, using embeddings and latent-space manipulation (see the sketch after this list)
- Apply GANs to computer vision, NLP, and audio processing
- Understand how to implement progressive growing of GANs
- Use GANs for image synthesis and speech enhancement
- Explore the future of GANs in visual and sonic arts
- Implement pix2pixHD to turn semantic label maps into photorealistic images
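As a sketch of the conditional-GAN idea referenced in the list above, the snippet below shows one common way a generator can condition on a class label through an Embedding layer. It is an assumption for illustration, not the book's implementation, and again uses tf.keras.

```python
# Conditional generator sketch (illustrative assumption), assuming tf.keras.
from tensorflow.keras import layers, models

latent_dim, num_classes = 100, 10

noise_in = layers.Input(shape=(latent_dim,))
label_in = layers.Input(shape=(1,), dtype="int32")

# Map the integer label to a dense vector and merge it with the noise
label_vec = layers.Flatten()(layers.Embedding(num_classes, 50)(label_in))
merged = layers.Concatenate()([noise_in, label_vec])

x = layers.Dense(128, activation="relu")(merged)
x = layers.Dense(28 * 28, activation="tanh")(x)
image_out = layers.Reshape((28, 28, 1))(x)

cond_generator = models.Model([noise_in, label_in], image_out)
```

At sampling time, fixing the label input selects which class the generator draws, which is the kind of output control the conditional-GAN bullet refers to.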
Who this book is for
This book is for machine learning practitioners, deep learning researchers, and AI enthusiasts who are looking for a perfect mix of theory and hands-on content in order to implement GANs using Keras. Working knowledge of Python is expected.

Author Details

Rafael Valle (https://market.tutorialspoint.com/author/rafael_valle)
Packt is an established, trusted, and innovative global technical learning publisher, founded in Birmingham, UK, with over eighteen years' experience delivering rich premium content from ground-breaking authors and lecturers on a wide range of emerging and established technologies for professional development.
Packt's purpose is to help technology professionals advance their knowledge and support the growth of new technologies by publishing vital, user-focused, knowledge-based content faster than any other tech publisher. With a growing library of over 9,000 titles in book, e-book, audio, and video formats, its multimedia content is valued as a vital learning tool and offers exceptional support for the development of technology knowledge.
We publish on topics that are at the very cutting edge of technology, helping IT professionals learn about the newest tools and frameworks in a way that suits them.