The lecture has been developed by Christa Cuchiero and Josef Teichmann. It was held by Josef Teichmann in spring 2019 at ETH Zurich as a regular lecture for Master students, and as a Risk Center lecture in autumn 2019 (jointly with Sebastian Becker, Patrick Cheridito, Olga Fink and Stefan Feuerriegel). At WU Wien, Christa Cuchiero has held this lecture for Master and PhD students. Several further editions and ramifications are planned. More material will follow soon.
References and several talks on new developments can be found at the end of this webpage.
Basic material and some exercises for the courses in Zurich in spring and autumn 2019
The lecture introduces several fundamental concepts from machine learning with a view towards important financial applications. Lecture notes are provided as iPython notebooks, as slides, and as classical notes. Most of the following code runs safely under Python 3.6, Tensorflow 1.8.0 and Keras 2.0.8; see the first notebook for a version check (a minimal sketch of such a check follows below). You can get a downloadable .ipynb file by clicking on 'download' in the upper left corner of the jupyter notebook viewer.
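The following is a minimal sketch of such a version check, not the actual notebook cell; the expected values are the ones quoted above and in the spring 2020 section below.

```python
# Minimal version check, assuming tensorflow and keras are installed.
import sys
import tensorflow as tf
import keras

print(sys.version)        # expected: 3.6.x here, 3.7.x for the spring 2020 course
print(tf.__version__)     # expected: 1.8.0 here, 1.14.0 for spring 2020
print(keras.__version__)  # expected: 2.0.8 here, 2.2.5 for spring 2020
```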
Basic material and exercises for the courses in Vienna in autumn 2019
The course in Vienna, held by Christa Cuchiero, splits into partially distinct parts for Master and PhD students; the structure is similar, but more exercises and some slides are included:
Basic material and some exercises for the courses in Zurich in spring 2020
The lecture introduces several fundamental concepts from machine learning with a view towards important financial applications. Lecture notes are provided as iPython notebooks, as slides, and as classical notes. Most of the following code runs safely under Python 3.7, Tensorflow 1.14.0 and Keras 2.2.5; see the first notebook for a version check. You can get a downloadable .ipynb file by clicking on 'download' in the upper left corner of the jupyter notebook viewer.
- Lecture 1 (Introduction, Universal Approximation by shallow networks, one string of arguments for depth): Lecture 1 as iPython notebook and some training data (unpack the archive and store the files in the folder of the notebook). See also the sketch after this list.
- Lecture 2 (neural ordinary differential equations, backpropagation, expressiveness by randomness): Lecture 2 as iPython notebook and some data-file. For the exercises, write a two-hidden-layer network with backpropagation by hand, following the code in the notebook; see also one possible solution and the sketch after this list. Exercises can be found at Exercise 1 and Exercise 2.
- Lecture 3 (training): Lecture 3 as iPython notebook; see also the sketch after this list. Updates will follow soon.
- Lecture 4 (Deep Hedging without transaction costs): Lecture 4 as iPython notebook, where a short Keras implementation of Deep Hedging for the BS model with analysis of the hedging strategies can be found (see also the sketch after this list). An implementation illustrating the use of Keras in a very simple case can be found here.
- Lecture 5 (Deep Portfolio Optimization without or with transaction costs): Lecture 5 as iPython notebook, where a short Keras implementation of the Merton problem for the BS model with analysis of the trading strategies can be found (see also the sketch after this list). Code with transaction costs can be found here.
- Lecture 6 (Deep Calibration): deep calibration for the Heston model and deep calibration for local stochastic volatility models. We distinguish three sorts of deep calibration: learning directly the map from market data to model parameters; learning the map from model parameters to market data and inverting it by inverse problem methodology (see the sketch after this list); and parametrizing (infinite-dimensional) parameters by neural networks. Notice that the Heston calibration code runs safely under Python 3.6, Tensorflow 1.8.0 and Keras 2.0.8.
- Lecture 7 (Deep Reinforcement Learning): a short theoretical introduction to concepts of reinforcement learning as iPython notebook; see also the sketch after this list.
- Lecture 8 (Deep Simulation): Lecture 8 as iPython notebook. These notes include an introduction to iterated integrals of controls and to the Johnson-Lindenstrauss lemma, as well as code on 'learning' an unknown SDE and stock market dynamics; see also the sketch after this list.
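The sketches below are simplified illustrations of the techniques listed above; all architectures, model parameters and hyperparameters are our own choices, not those of the notebooks. For Lecture 1, a single hidden layer fitting x -> sin(x), in the spirit of universal approximation by shallow networks:

```python
# One hidden layer approximating sin on [-3, 3]; all choices illustrative.
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

x = np.linspace(-3, 3, 1000).reshape(-1, 1)
y = np.sin(x)

model = Sequential([
    Dense(50, activation='tanh', input_shape=(1,)),  # one hidden layer
    Dense(1)                                         # linear read-out
])
model.compile(optimizer='adam', loss='mse')
model.fit(x, y, epochs=200, batch_size=32, verbose=0)
print(model.evaluate(x, y, verbose=0))  # mean squared error after training
```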
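For Lecture 2, a two-hidden-layer network with backpropagation written out by hand in NumPy, in the spirit of the exercise; sizes, target and learning rate are illustrative:

```python
# 1 -> 16 -> 16 -> 1 network trained by hand-coded backpropagation.
import numpy as np

rng = np.random.RandomState(0)
x = np.linspace(-1, 1, 200).reshape(-1, 1)
y = x ** 2  # toy regression target

W1, b1 = rng.randn(1, 16) * 0.5, np.zeros(16)
W2, b2 = rng.randn(16, 16) * 0.5, np.zeros(16)
W3, b3 = rng.randn(16, 1) * 0.5, np.zeros(1)
lr = 0.1

for step in range(2000):
    # forward pass with tanh activations
    h1 = np.tanh(x @ W1 + b1)
    h2 = np.tanh(h1 @ W2 + b2)
    out = h2 @ W3 + b3
    # mean squared error: gradient w.r.t. the output
    grad_out = 2 * (out - y) / len(x)
    # backward pass, layer by layer (chain rule; tanh' = 1 - tanh^2)
    gW3 = h2.T @ grad_out
    gb3 = grad_out.sum(0)
    gh2 = grad_out @ W3.T * (1 - h2 ** 2)
    gW2 = h1.T @ gh2
    gb2 = gh2.sum(0)
    gh1 = gh2 @ W2.T * (1 - h1 ** 2)
    gW1 = x.T @ gh1
    gb1 = gh1.sum(0)
    # plain gradient descent update
    for p, g in ((W1, gW1), (b1, gb1), (W2, gW2), (b2, gb2), (W3, gW3), (b3, gb3)):
        p -= lr * g

print(np.mean((out - y) ** 2))  # final training error
```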
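For Lecture 3, plain mini-batch stochastic gradient descent on a least-squares problem, as a stand-in for the training algorithms discussed there; the problem and hyperparameters are illustrative:

```python
# Mini-batch SGD for least squares: recover w_true from noisy data.
import numpy as np

rng = np.random.RandomState(0)
A = rng.randn(500, 5)
w_true = rng.randn(5)
y = A @ w_true + 0.01 * rng.randn(500)

w = np.zeros(5)
lr, batch = 0.05, 32
for epoch in range(50):
    idx = rng.permutation(len(y))      # reshuffle the data every epoch
    for start in range(0, len(y), batch):
        i = idx[start:start + batch]
        grad = 2 * A[i].T @ (A[i] @ w - y[i]) / len(i)  # mini-batch gradient
        w -= lr * grad

print(np.linalg.norm(w - w_true))  # should be small
```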
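For Lecture 4, a compressed deep-hedging sketch: a single network maps (time, spot) to a hedge ratio, and the mean squared hedging error of a call under Black-Scholes dynamics is minimised. Note that this sketch is written for TensorFlow 2.x, not the TF 1.x versions quoted above, and all model parameters are illustrative:

```python
# Deep hedging of a call under dS = sigma * S dW (r = 0), TF 2.x style.
import numpy as np
import tensorflow as tf

T, n_steps, n_paths = 1.0, 30, 10000
sigma, S0, K_strike = 0.2, 1.0, 1.0
dt = T / n_steps

rng = np.random.RandomState(0)
dW = rng.randn(n_paths, n_steps).astype('float32') * np.sqrt(dt)
S = np.empty((n_paths, n_steps + 1), dtype='float32')
S[:, 0] = S0
for t in range(n_steps):  # Euler scheme for the price paths
    S[:, t + 1] = S[:, t] * (1 + sigma * dW[:, t])
S = tf.constant(S)
payoff = tf.maximum(S[:, -1] - K_strike, 0.0)

net = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation='relu', input_shape=(2,)),
    tf.keras.layers.Dense(32, activation='relu'),
    tf.keras.layers.Dense(1)])
premium = tf.Variable(0.1)  # learned initial capital
opt = tf.keras.optimizers.Adam(0.001)

for step in range(500):
    with tf.GradientTape() as tape:
        wealth = premium * tf.ones(n_paths)
        for t in range(n_steps):
            inp = tf.stack([tf.fill([n_paths], t * dt), S[:, t]], axis=1)
            delta = net(inp)[:, 0]                     # hedge ratio at time t
            wealth += delta * (S[:, t + 1] - S[:, t])  # gains from trading
        loss = tf.reduce_mean((wealth - payoff) ** 2)  # hedging error
    vars_ = net.trainable_variables + [premium]
    opt.apply_gradients(zip(tape.gradient(loss, vars_), vars_))

print(float(premium))  # should approach the Black-Scholes price (about 0.0797)
```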
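For Lecture 5, a one-parameter version of the Merton problem: learn the constant proportion of wealth invested in the risky asset that maximises expected log utility. With the illustrative parameters below, the known optimum is mu / sigma^2 = 1.25:

```python
# Gradient ascent on expected log utility of a constantly rebalanced portfolio.
import numpy as np

mu, sigma, dt, n_steps, n_paths = 0.05, 0.2, 1 / 252, 252, 20000
rng = np.random.RandomState(0)
r = mu * dt + sigma * np.sqrt(dt) * rng.randn(n_paths, n_steps)  # asset returns

pi, lr = 0.0, 5.0
for step in range(200):
    # d/dpi of E[sum_t log(1 + pi * r_t)], the log wealth of the portfolio
    grad = np.mean(np.sum(r / (1 + pi * r), axis=1))
    pi += lr * grad  # gradient ascent

print(pi)  # should be close to mu / sigma**2 = 1.25
```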
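For Lecture 6, the second sort of deep calibration (learn the pricing map from model parameters to prices, then invert it): as a toy model we use Black-Scholes with volatility as the only parameter, in place of the five-parameter Heston model of the notebook:

```python
# Deep calibration toy example: learn sigma -> price, then invert.
import numpy as np
from scipy.stats import norm  # assumes scipy is available
from keras.models import Sequential
from keras.layers import Dense

def bs_call(sigma, S0=1.0, K=1.0, T=1.0):
    # Black-Scholes call price with r = 0
    d1 = (np.log(S0 / K) + 0.5 * sigma**2 * T) / (sigma * np.sqrt(T))
    return S0 * norm.cdf(d1) - K * norm.cdf(d1 - sigma * np.sqrt(T))

# step 1: learn the pricing map on synthetic training data
sigmas = np.random.uniform(0.05, 0.5, 20000).reshape(-1, 1)
prices = bs_call(sigmas)
net = Sequential([Dense(32, activation='relu', input_shape=(1,)),
                  Dense(32, activation='relu'), Dense(1)])
net.compile(optimizer='adam', loss='mse')
net.fit(sigmas, prices, epochs=20, batch_size=256, verbose=0)

# step 2: invert the learned map on a "market" price by grid search
market_price = bs_call(0.2)
grid = np.linspace(0.05, 0.5, 1000).reshape(-1, 1)
sigma_hat = grid[np.argmin(np.abs(net.predict(grid)[:, 0] - market_price))]
print(sigma_hat)  # should be close to the true volatility 0.2
```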
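For Lecture 7, tabular Q-learning on a toy chain MDP, matching the level of the short theoretical introduction; environment and hyperparameters are illustrative:

```python
# Tabular Q-learning on a chain: states 0..4, reward 1 on reaching state 4.
import numpy as np

n_states, n_actions = 5, 2           # actions: 0 = left, 1 = right
gamma, alpha, eps = 0.9, 0.1, 0.5    # high exploration keeps early episodes short
rng = np.random.RandomState(0)
Q = np.zeros((n_states, n_actions))

for episode in range(2000):
    s = 0
    for _ in range(100):             # cap the episode length
        a = rng.randint(n_actions) if rng.rand() < eps else int(Q[s].argmax())
        s_next = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
        r = 1.0 if s_next == n_states - 1 else 0.0
        # Q-learning update: move Q(s, a) towards r + gamma * max_a' Q(s', a')
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next
        if s == n_states - 1:        # goal reached, episode ends
            break

print(Q.argmax(axis=1))  # greedy policy: 'right' (1) in states 0 to 3
```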
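For Lecture 8, 'learning' an unknown SDE from observed increments, with plain least-squares regression standing in for a neural network; the Ornstein-Uhlenbeck example is illustrative:

```python
# Recover the drift of dX = theta*(mu - X) dt + sigma dW from one long path.
import numpy as np

theta, mu, sigma, dt, n = 1.0, 0.5, 0.3, 0.01, 200000
rng = np.random.RandomState(0)
X = np.empty(n)
X[0] = 0.0
for t in range(n - 1):  # Euler-Maruyama simulation of the 'unknown' SDE
    X[t + 1] = X[t] + theta * (mu - X[t]) * dt + sigma * np.sqrt(dt) * rng.randn()

dX = np.diff(X)
# the drift is affine in X, so least squares on [1, X_t] recovers it
A = np.vstack([np.ones(n - 1), X[:-1]]).T
(b0, b1), *_ = np.linalg.lstsq(A, dX / dt, rcond=None)
print(b0, b1)  # should be close to theta*mu = 0.5 and -theta = -1.0
print(np.sqrt(np.sum(dX**2) / (n * dt)))  # quadratic-variation estimate of sigma = 0.3
```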
References
- Helmut Bölcskei, Philipp Grohs, Gitta Kutyniok, Philipp Petersen: Optimal Approximation with Sparsely Connected Deep Neural Networks, arXiv:1705.01714.
- Hans Bühler, Lukas Gonon, Josef Teichmann, Ben Wood: Deep Hedging, arXiv:1802.03042.
- Ricky T. Q. Chen, Yulia Rubanova, Jesse Bettencourt, David Duvenaud: Neural ordinary differential equations, arXiv:1806.07366.
- Ilya Chevyrev, Andrey Kormilitzin: A primer on the signature method in machine learning, arXiv:1603.03788.
- Uri Shaham, Alexander Cloninger, Ronald R. Coifman: Provable approximation properties of deep neural networks, arXiv:1509.07385.
- Christa Cuchiero, Martin Larsson, Josef Teichmann: Deep neural networks, generic universal interpolation, and controlled ODEs, arXiv:1908.07838.
- Weinan E, Jiequn Han, Arnulf Jentzen: Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations, arXiv:1706.04702.
- Jakob Heiss, Josef Teichmann, Hanna Wutte: How implicit regularization of Neural Networks affects the learned function -- Part I, arXiv:1911.02903.
- Catherine F. Higham, Desmond J. Higham: Deep Learning: An Introduction for Applied Mathematicians, arXiv:1801.05894.
- Na Lei, Kehua Su, Li Cui, Shing-Tung Yau, David Xianfeng Gu: A Geometric View of Optimal Transportation and Generative Model, arXiv:1710.05488.
- Terry Lyons: Rough paths, Signatures and the modelling of functions on streams, arXiv:1405.4537.
- Terry Lyons, Harald Oberhauser: Sketching the order of events, arXiv:1708.09708.
- Josef Teichmann: A recent talk in Konstanz on randomness in training algorithms, 2019.
- Josef Teichmann: A recent talk in Oslo on stationary versions of discrete signature in the spirit of reservoir computing, 2019.
- Josef Teichmann: A recent talk at the Thalesian Seminar Series in New York on generative models in finance, 2020.