The lecture has been developed by Christa Cuchiero and Josef Teichmann. It has been held since spring 2019 at ETH Zurich as a regular lecture for master's students, and as a

The lecture introduces several fundamental concepts from machine learning with a view towards important financial applications. Lecture notes are provided as iPython notebooks, as slides, and as classical notes. Most of the following code runs safely under Python 3.7, TensorFlow 1.14.0 and Keras 2.2.5; see the first notebook for checking the versions. You can download each .ipynb via the following links or open it directly with Google Colab (don't forget to add the data, where applicable, to the corresponding folder).
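Since the notebooks are version-sensitive, it helps to verify the environment before running them. A minimal sketch of such a check (the TensorFlow/Keras lines mirror what the first notebook does and are left commented out in case those packages are not installed):

```python
import sys

# The notes state the code runs under Python 3.7,
# TensorFlow 1.14.0 and Keras 2.2.5.
print(sys.version_info[:3])

# Uncomment if TensorFlow/Keras are installed:
# import tensorflow as tf
# import keras
# print(tf.__version__, keras.__version__)
```

If the versions differ substantially, Google Colab offers a quick way to obtain a compatible environment.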

- Lecture 1 (Introduction, Universal Approximation by shallow networks, one string of arguments for depth): Lecture 1 as iPython notebook and some training data (should be unpacked; then store the files in a folder the notebook has access to). Please find some slides on universal approximation theorems in a general context at UAT Slides.
- Lecture 2 (neural ordinary differential equations, backpropagation, expressiveness by randomness): Lecture 2 as iPython notebook and some data-file (should be unpacked; then store the files in a folder the notebook has access to).
- Lecture 3 (training): Lecture 3 as iPython notebook.
- Lecture 4 (Deep Hedging without transaction costs): Lecture 4 as iPython notebook, where a short Keras implementation of Deep Hedging for the Black Scholes model with analysis of the hedging strategies can be found. Additionally an instance of indifference pricing is shown. An implementation illustrating the use of Keras in a very simple case can be found here.
- Lecture 5 (Deep Portfolio Optimization without transaction costs): Lecture 5 as iPython notebook, where Deep Portfolio Optimization is presented. An implementation illustrating portfolio optimization with transaction costs can be found here.
- Lecture 6 (Deep Calibration): a deep calibration implementation for the Heston model and deep calibration for local (stochastic) volatility models. We distinguish three sorts of deep calibration: learning directly the map from market data to model parameters, learning the map from model parameters to market data and inverting it by inverse-problem methodology, and parametrizing (infinite-dimensional) parameters by neural networks. Notice that the Heston calibration code runs safely under Python 3.6, TensorFlow 1.8.0 and Keras 2.0.8, or in Google Colab.
- Lecture 7 (Deep Reinforcement Learning): a short theoretical introduction to concepts of reinforcement learning as iPython notebook.
- Lecture 8 (Deep Simulation): Lecture 8 as iPython notebook. Several additional formulas and extensions towards semimartingales are provided in Aspects of Signatures.
- Lecture 9 (Deep Hedging and Bayesian Optimization): Lecture 9 as iPython notebook.
- A typical exam for this lecture.
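The Deep Hedging lectures train a network to minimize a hedging error against an option payoff. As a library-free illustration of the quantity being minimized, the following NumPy sketch computes the profit and loss of a classical discrete Black-Scholes delta hedge for a European call; all parameters are illustrative and not taken from the notebooks:

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

np.random.seed(0)
S0, K, sigma, T = 100.0, 100.0, 0.2, 1.0   # spot, strike, volatility, maturity
n_steps, n_paths = 50, 20000
dt = T / n_steps

# simulate Black-Scholes (geometric Brownian motion) paths, zero interest rate
Z = np.random.randn(n_paths, n_steps)
increments = (-0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * Z
S = S0 * np.exp(np.cumsum(increments, axis=1))
S = np.hstack([np.full((n_paths, 1), S0), S])   # prepend initial value

def bs_delta(s, t):
    # Black-Scholes delta of the call at time t < T
    tau = T - t
    d1 = (np.log(s / K) + 0.5 * sigma**2 * tau) / (sigma * np.sqrt(tau))
    return np.vectorize(norm_cdf)(d1)

# accumulate gains of the self-financing delta hedge
pnl = np.zeros(n_paths)
for i in range(n_steps):
    delta = bs_delta(S[:, i], i * dt)
    pnl += delta * (S[:, i + 1] - S[:, i])

payoff = np.maximum(S[:, -1] - K, 0.0)
price = payoff.mean()                    # Monte Carlo proxy for the call price
hedge_error = price + pnl - payoff       # residual of the discrete hedge
print(float(np.mean(hedge_error**2)))    # small, shrinks as n_steps grows
```

In Lecture 4, the model delta `bs_delta` is replaced by a neural network strategy depending on time and the current price, and its weights are trained to minimize, e.g., the mean squared hedging error over simulated paths; this is what makes the approach model-free and extendable to transaction costs.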

- Helmut Bölcskei, Philipp Grohs, Gitta Kutyniok, Philipp Petersen: Optimal Approximation with Sparsely Connected Deep Neural Networks, arXiv:1705.01714.
- Hans Bühler, Lukas Gonon, Josef Teichmann, Ben Wood: Deep Hedging, arXiv:1802.03042.
- Ricky T. Q. Chen, Yulia Rubanova, Jesse Bettencourt, David Duvenaud: Neural ordinary differential equations, arXiv:1806.07366.
- Ilya Chevyrev, Andrey Kormilitzin: A primer on the signature method in machine learning, arXiv:1603.03788.
- Christa Cuchiero, Martin Larsson, Josef Teichmann: Deep neural networks, generic universal interpolation, and controlled ODEs, arXiv:1908.07838.
- Christa Cuchiero, Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega, Josef Teichmann: Discrete-time signatures and randomness in reservoir computing, arXiv:2010.14615, preprint, submitted, 2020.
- Weinan E, Jiequn Han, Arnulf Jentzen: Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations, arXiv:1706.04702.
- Matteo Gambara, Josef Teichmann: Consistent Recalibration Models and Deep Calibration, arXiv:2006.09455, preprint, submitted, 2020.
- Jakob Heiss, Josef Teichmann, Hanna Wutte: How implicit regularization of Neural Networks affects the learned function -- Part I, arXiv:1911.02903.
- Catherine F. Higham, Desmond J. Higham: Deep Learning: An Introduction for Applied Mathematicians, arXiv:1801.05894.
- Thomas Krabichler, Josef Teichmann: Deep Replication of a Runoff Portfolio, arXiv:2009.05034, preprint, submitted, 2020.
- Na Lei, Kehua Su, Li Cui, Shing-Tung Yau, David Xianfeng Gu: A Geometric View of Optimal Transportation and Generative Model, arXiv:1710.05488.
- Terry Lyons: Rough paths, Signatures and the modelling of functions on streams, arXiv:1405.4537.
- Terry Lyons, Harald Oberhauser: Sketching the order of events, arXiv:1708.09708.
- Uri Shaham, Alexander Cloninger, Ronald R. Coifman: Provable approximation properties of deep neural networks, arXiv:1509.07385.
- Josef Teichmann: A recent talk in Konstanz on randomness in training algorithms, 2019.
- Josef Teichmann: A recent talk in Oslo on stationary versions of discrete signature in the spirit of reservoir computing, 2019.
- Josef Teichmann: A recent talk at the Thalesians Seminar Series in New York on generative models in finance, 2020.