fchollet / deep-learning-with-python-notebooks
Jupyter notebooks for the code samples of the book "Deep Learning with Python"
Companion notebooks for Deep Learning with Python
This repository contains Jupyter notebooks implementing the code samples found in the book Deep Learning with Python, third edition (2025) by Francois Chollet and Matthew Watson. You will also find the legacy notebooks for the second edition (2021) and the first edition (2017).
For readability, these notebooks only contain runnable code blocks and section titles, and omit everything else in the book: text paragraphs, figures, and pseudocode. If you want to be able to follow what's going on, I recommend reading the notebooks side by side with your copy of the book.
Running the code
We recommend running these notebooks on Colab, which provides a hosted runtime with all the dependencies you will need. You can also run these notebooks locally, either by setting up your own Jupyter environment or by following Colab's instructions for running locally.
By default, all notebooks will run on Colab's free tier GPU runtime, which is sufficient to run all code in this book. Chapters 8-18 will benefit from a faster GPU if you have a Colab Pro subscription. You can change your runtime type using Runtime -> Change runtime type in Colab's dropdown menus.
Choosing a backend
The code for the third edition is written using Keras 3. As such, it can be run with JAX, TensorFlow, or PyTorch as a backend. To set the backend, update the backend name in the cell at the top of each notebook that looks like this:
import os
os.environ["KERAS_BACKEND"] = "jax"
This must be done once per session, before importing Keras. If you are in the middle of running a notebook, you will need to restart the notebook session and rerun all relevant notebook cells. This can be done using Runtime -> Restart session in Colab's dropdown menus.
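As a minimal sketch of the backend-selection step (assuming a runtime with Keras 3 installed), the entire switch amounts to setting one environment variable before the first Keras import:

```python
import os

# Pick one of the three supported backends: "jax", "tensorflow", or "torch".
os.environ["KERAS_BACKEND"] = "jax"

# Keras reads KERAS_BACKEND exactly once, when `import keras` runs, so the
# assignment above must come first. After importing, you can confirm the
# active backend with keras.backend.backend(); switching again requires
# restarting the Python runtime.
```

After a restart, rerun this cell before any other cell that imports Keras, or the notebook will fall back to the default backend.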
Using Kaggle data
This book uses datasets and model weights provided by Kaggle, an online Machine Learning community and platform. You will need to create a Kaggle login to run Kaggle code in this book; instructions are given in Chapter 8.
For chapters that need Kaggle data, you can log in to Kaggle once per session when you hit the notebook cell with `kagglehub.login()`. Alternatively, you can set up your Kaggle login information once as Colab secrets:
- Go to https://www.kaggle.com/ and sign in.
- Go to https://www.kaggle.com/settings and generate a Kaggle API key.
- Open the secrets tab in Colab by clicking the key icon on the left.
- Add two secrets, `KAGGLE_USERNAME` and `KAGGLE_KEY`, with the username and key you just created.
Following this approach you will only need to copy your Kaggle secret key once, though you will need to allow each notebook to access your secrets when running the relevant Kaggle code.
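As a sketch of what this credential setup achieves: `kagglehub` can also pick up credentials from environment variables named after the two secrets above, so a non-interactive equivalent to the `kagglehub.login()` prompt looks roughly like this (the values are placeholders; in Colab you would read them from the secrets tab rather than hard-coding them in a notebook):

```python
import os

# Placeholder values; substitute your actual Kaggle username and API key.
# Never commit real credentials to a notebook you intend to share.
os.environ["KAGGLE_USERNAME"] = "your_username"
os.environ["KAGGLE_KEY"] = "your_api_key"

# With these set, kagglehub downloads can authenticate without an
# interactive login prompt.
```

This is the same mechanism the Colab secrets approach relies on, which is why each notebook must be granted access to the secrets before the relevant Kaggle cells will run.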
Table of contents
- Chapter 2: The mathematical building blocks of neural networks
- Chapter 3: Introduction to TensorFlow, PyTorch, JAX, and Keras
- Chapter 4: Classification and regression
- Chapter 5: Fundamentals of machine learning
- Chapter 7: A deep dive on Keras
- Chapter 8: Image Classification
- Chapter 9: Convnet architecture patterns
- Chapter 10: Interpreting what ConvNets learn
- Chapter 11: Image Segmentation
- Chapter 12: Object Detection
- Chapter 13: Timeseries Forecasting
- Chapter 14: Text Classification
- Chapter 15: Language Models and the Transformer
- Chapter 16: Text Generation
- Chapter 17: Image Generation
- Chapter 18: Best practices for the real world