Deep Generative Modeling
Author: Jakub M. Tomczak · Language: English · Paperback – 12 Sep 2025
Deep Generative Modeling is designed to appeal to curious students, engineers, and researchers with a modest mathematical background: undergraduate calculus, linear algebra, probability theory, and the basics of machine learning, deep learning, and programming in Python and PyTorch (or other deep learning libraries). It will appeal to students and researchers from a variety of backgrounds, including computer science, engineering, data science, physics, and bioinformatics, who wish to become familiar with deep generative modeling. To engage the reader, the book introduces fundamental concepts with specific examples and code snippets. The full code accompanying the book is available on GitHub.
The ultimate aim of the book is to outline the most important techniques in deep generative modeling and, eventually, enable readers to formulate new models and implement them.
| All formats and editions | Price | Express |
|---|---|---|
| Paperback (2) | 316.88 lei, 6-8 wks. | +20.95 lei, 6-12 days |
| Springer Nature Switzerland AG – 20 Feb 2023 | 316.88 lei, 6-8 wks. | +20.95 lei, 6-12 days |
| Springer – 12 Sep 2025 | 397.82 lei, 39-44 days | |
| Hardback (2) | 383.47 lei, 6-8 wks. | |
| Springer – 11 Sep 2024 | 383.47 lei, 6-8 wks. | |
| Springer International Publishing – 19 Feb 2022 | 437.28 lei, 3-5 wks. | +23.88 lei, 6-12 days |
Price: 397.82 lei
Old price: 497.28 lei
-20%
Express points: 597
Estimated price in other currencies:
70.33€ • 83.60$ • 61.02£
Printed on demand
Economy delivery: 09-14 March
Specifications
ISBN-13: 9783031640896
ISBN-10: 3031640896
Pages: 340
Dimensions: 155 x 235 x 19 mm
Weight: 0.52 kg
Edition: Second Edition 2024
Publisher: Springer
Contents
Chapter 1: Why Deep Generative Modeling?
Chapter 2: Probabilistic Modeling: From Mixture Models to Probabilistic Circuits
Chapter 3: Autoregressive Models
Chapter 4: Flow-based Models
Chapter 5: Latent Variable Models
Chapter 6: Hybrid Modeling
Chapter 7: Energy-based Models
Chapter 8: Generative Adversarial Networks
Chapter 9: Score-based Generative Models
Chapter 10: Deep Generative Modeling for Neural Compression
Chapter 11: From Large Language Models to Generative AI
Biographical note
Jakub M. Tomczak is an associate professor and the head of the Generative AI group at the Eindhoven University of Technology (TU/e). Before joining TU/e, he was an assistant professor at Vrije Universiteit Amsterdam, a deep learning researcher (Engineer, Staff) at Qualcomm AI Research in Amsterdam, a Marie Skłodowska-Curie individual fellow in Prof. Max Welling's group at the University of Amsterdam, and an assistant professor and a postdoc at the Wroclaw University of Technology. His main research interests include machine learning, deep learning, deep generative modeling (GenAI), and Bayesian inference, with applications to image and text processing, the life sciences, molecular sciences, and quantitative finance. He serves as an action editor of "Transactions on Machine Learning Research" and an area chair of major AI conferences (e.g., NeurIPS, ICML, AISTATS), and was a program chair of NeurIPS 2024. He is the author of "Deep Generative Modeling", the first comprehensive book on Generative AI, and the founder of Amsterdam AI Solutions.
Back cover text
This textbook tackles the problem of formulating AI systems by combining probabilistic modeling and deep learning. Moreover, it goes beyond typical predictive modeling and brings together supervised learning and unsupervised learning. The resulting paradigm, called deep generative modeling, adopts a generative perspective on the surrounding world. It assumes that each phenomenon is driven by an underlying generative process that defines a joint distribution over random variables and their stochastic interactions, i.e., how events occur and in what order. The adjective "deep" comes from the fact that the distribution is parameterized using deep neural networks. There are two distinct traits of deep generative modeling. First, the application of deep neural networks allows rich and flexible parameterization of distributions. Second, the principled manner of modeling stochastic dependencies using probability theory ensures rigorous formulation and prevents potential flaws in reasoning. Moreover, probability theory provides a unified framework where the likelihood function plays a crucial role in quantifying uncertainty and defining objective functions.
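The blurb's central point, that the likelihood function defines the training objective, can be made concrete with a small hypothetical sketch (not code from the book). It fits a factorized Bernoulli model to binary data by maximum likelihood; in a deep generative model, the parameters `theta` would instead be produced by a neural network.

```python
import math

# Minimal illustration of likelihood-based generative modeling:
# a factorized Bernoulli model p(x) = prod_d theta_d^{x_d} (1 - theta_d)^{1 - x_d}
# over binary vectors, fitted by maximum likelihood.

def fit_mle(data):
    """Maximum-likelihood estimate of the Bernoulli parameters:
    the per-dimension empirical mean, clipped away from 0 and 1."""
    n, d = len(data), len(data[0])
    return [min(max(sum(x[j] for x in data) / n, 1e-6), 1 - 1e-6)
            for j in range(d)]

def neg_log_likelihood(theta, data):
    """Average negative log-likelihood: the objective minimized during training."""
    total = 0.0
    for x in data:
        for xj, tj in zip(x, theta):
            total -= xj * math.log(tj) + (1 - xj) * math.log(1 - tj)
    return total / len(data)

data = [[1, 0, 1], [1, 1, 1], [0, 0, 1], [1, 0, 1]]
theta = fit_mle(data)                      # per-dimension means: 0.75, 0.25, ~1.0
nll = neg_log_likelihood(theta, data)      # lower than for any other parameters
```

The maximum-likelihood fit assigns a lower negative log-likelihood to the data than any other parameter choice, which is exactly the sense in which the likelihood "defines the objective function".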
Features
- Combines probability theory with deep learning to obtain powerful AI systems
- Outlines the most important techniques in deep generative modeling, enabling readers to formulate new models
- All chapters include code snippets to help readers understand how the presented methods can be implemented