Machine Learning in Finance (joint lecture project with Christa Cuchiero, supported by Matteo Gambara, Florian Krach and Hanna Wutte)
The lecture has been developed by Christa Cuchiero and Josef Teichmann. It has been held since spring 2019 at ETH Zurich as a regular lecture for master students, and as a Risk Center lecture since autumn 2019. See also previous materials.
Basic material and some exercises for the courses in Zurich in spring 2022
The lecture introduces several fundamental concepts from machine learning with a view towards important financial applications. Lecture notes are provided as iPython notebooks, as slides, or as classical notes. Most of the following code runs safely under Python 3.7, TensorFlow 1.14.0 and Keras 2.2.5; see the first notebook for checking the versions. You can get to a downloadable .ipynb via the following links or open it directly with Google Colab (don't forget to add the data, if needed, to the corresponding folder).
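A quick check along these lines (a minimal sketch, assuming standalone Keras and TensorFlow are installed in your environment) prints the installed versions before you run the notebooks:

```python
# Print the installed versions to compare with the ones mentioned above
# (Python 3.7, TensorFlow 1.14.0, Keras 2.2.5).
import sys
import tensorflow as tf
import keras

print("Python     :", sys.version.split()[0])
print("TensorFlow :", tf.__version__)
print("Keras      :", keras.__version__)
```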
- Lecture 1 (Introduction, Universal Approximation by shallow networks, one string of arguments for depth): Lecture 1 as iPython notebook and some training data (should be unpacked; then store the files in a folder to which the notebook has access). Please find some slides on universal approximation theorems in a general context at SW-UAT Slides. A minimal shallow-network approximation sketch is included after the lecture list below.
- Lecture 2 (neural ordinary differential equations, backpropagation, expressiveness by randomness): Lecture 2 as iPython notebook and some data file (should be unpacked; then store the files in a folder to which the notebook has access).
- Lecture 3 (training): Lecture 3 as iPython notebook.
- Lecture 4 (Deep Hedging without transaction costs): Lecture 4 as iPython notebook, where a short Keras implementation of Deep Hedging for the Black-Scholes model, together with an analysis of the hedging strategies, can be found. Additionally, an instance of indifference pricing is shown. An implementation illustrating the use of Keras in a very simple case can be found here. An excellent implementation of Deep Hedging for all sorts of models and market environments is available in PFHEDGE, a PyTorch framework for Deep Hedging. A simplified deep-hedging sketch is also included after the lecture list below.
- Lecture 5 (Deep Portfolio Optimization without transaction costs): Lecture 5 as iPython notebook, where Deep Portfolio Optimization is presented. An implementation illustrating portfolio optimization with transaction costs can be found here.
- Lecture 8 (Deep Simulation): Lecture 8 as iPython notebook. Several additional formulas and extensions towards semimartingales are provided in a recent talk in Verona.
- Lecture 6 (Deep Calibration and GANs): a deep calibration implementation for the Heston model and deep calibration for local (stochastic) volatility models. We distinguish three sorts of deep calibration: learning directly the map from market data to model parameters; learning the map from model parameters to market data and inverting it by inverse-problem methodology; and parametrizing (infinite-dimensional) parameters by neural networks. Notice that the Heston calibration code runs safely under Python 3.6, TensorFlow 1.8.0 and Keras 2.0.8, or in Google Colab. A sketch of the second calibration approach is included after the lecture list below.
Deep Calibration is related to generative adversarial structures, which in turn are related to robust optimization problems; see Florian Krach's notebook on Robust Utility Optimization and GANs.
- Lecture 7 (Deep Reinforcement Learning): a short theoretical introduction to concepts of reinforcement learning as iPython notebook. An implementation of a simple partially observed Markov decision process is given here.
- Lecture 9 (Deep Hedging and Bayesian Optimization): Lecture 9 as iPython notebook.
- Lecture 10 (Large language models and time series generation): we explain the mathematical background of LLMs, their relationship to time series modelling and several applications to finance. See Lecture 10 as iPython notebook.
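For Lecture 1, the following is a minimal sketch (not taken from the lecture notebook) of universal approximation in one dimension: a single-hidden-layer network fitted to sin(x) on [-π, π]. It assumes TensorFlow 2.x with the built-in Keras API; network width and training settings are illustrative only.

```python
# A minimal sketch of universal approximation in one dimension: a single
# hidden layer network fitted to sin(x) on [-pi, pi].  Network width and
# training settings are illustrative; assumes TensorFlow 2.x with tf.keras.
import numpy as np
import tensorflow as tf

x = np.random.uniform(-np.pi, np.pi, size=(2000, 1)).astype("float32")
y = np.sin(x)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="tanh", input_shape=(1,)),  # one hidden layer
    tf.keras.layers.Dense(1)])                                       # linear readout
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=200, batch_size=64, verbose=0)

x_test = np.linspace(-np.pi, np.pi, 7, dtype="float32").reshape(-1, 1)
print(np.c_[np.sin(x_test), model.predict(x_test, verbose=0)])  # true values vs. network output
```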
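For Lecture 4, here is a simplified deep-hedging sketch under a quadratic hedging criterion in a Black-Scholes model. It is independent of the course notebook (which targets TF 1.x/Keras) and assumes TensorFlow 2.x eager execution; one small network per hedging date maps the current price to a hedge ratio, and the initial capital p0 is learned jointly. All model and training parameters are illustrative.

```python
# A simplified deep-hedging sketch (quadratic hedging criterion) in a
# Black-Scholes model; an independent illustration with purely illustrative
# model and training parameters, assuming TensorFlow 2.x eager mode.
import numpy as np
import tensorflow as tf

T, N, n_paths = 1.0, 30, 5000            # maturity, number of hedging dates, paths
S0, sigma, K = 1.0, 0.2, 1.0             # spot, volatility, strike
dt = T / N

# Simulate Black-Scholes paths with zero interest rate and zero drift.
dW = np.random.normal(scale=np.sqrt(dt), size=(n_paths, N)).astype("float32")
log_increments = sigma * dW - 0.5 * sigma**2 * dt
S = S0 * np.exp(np.concatenate(
    [np.zeros((n_paths, 1), dtype="float32"),
     np.cumsum(log_increments, axis=1)], axis=1))           # shape (n_paths, N+1)

# One small network per hedging date: current price -> hedge ratio.
nets = [tf.keras.Sequential([
            tf.keras.layers.Dense(16, activation="relu", input_shape=(1,)),
            tf.keras.layers.Dense(16, activation="relu"),
            tf.keras.layers.Dense(1)]) for _ in range(N)]
p0 = tf.Variable(0.0)                                        # learned initial capital (price)
variables = [p0] + [v for net in nets for v in net.trainable_variables]
optimizer = tf.keras.optimizers.Adam(1e-3)

def terminal_error(S):
    """Initial capital plus trading gains minus the call payoff at maturity."""
    wealth = p0 * tf.ones(S.shape[0])
    for k in range(N):
        delta = tf.squeeze(nets[k](S[:, k:k + 1]), axis=1)   # hedge ratio at t_k
        wealth += delta * (S[:, k + 1] - S[:, k])            # self-financing trading gains
    return wealth - tf.maximum(S[:, -1] - K, 0.0)

for step in range(1000):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(terminal_error(S)))  # quadratic hedging error
    optimizer.apply_gradients(zip(tape.gradient(loss, variables), variables))

# p0 should approach the Black-Scholes price (about 0.08) as training proceeds.
print("learned price p0:", float(p0.numpy()))
print("mean squared hedging error:", float(loss.numpy()))
```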
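For Lecture 6, this sketches the second calibration approach (learn the map from model parameters to prices, then invert the learned map). To stay self-contained it replaces Heston by the one-parameter Black-Scholes model, so it is not the course's Heston code; it assumes TensorFlow 2.x and SciPy, and all settings are illustrative.

```python
# A sketch of the second calibration approach: learn the map from model
# parameters to a vector of option prices, then invert the learned map.
# Black-Scholes (one parameter) stands in for Heston to keep the script
# self-contained; everything here is illustrative.
import numpy as np
from scipy.stats import norm
import tensorflow as tf

S0, T = 1.0, 1.0
strikes = np.array([0.8, 0.9, 1.0, 1.1, 1.2])

def bs_call(sigma, K):
    """Black-Scholes call prices (zero rate), standing in for an expensive pricer."""
    d1 = (np.log(S0 / K) + 0.5 * sigma**2 * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S0 * norm.cdf(d1) - K * norm.cdf(d2)

# Step 1: learn the parameter -> price-vector map from synthetic data.
sigmas = np.random.uniform(0.05, 0.6, size=(20000, 1))
prices = bs_call(sigmas, strikes)                      # shape (20000, 5) by broadcasting

net = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(1,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(len(strikes))])
net.compile(optimizer="adam", loss="mse")
net.fit(sigmas.astype("float32"), prices.astype("float32"),
        epochs=30, batch_size=256, verbose=0)

# Step 2: calibrate by gradient descent on the parameter, using the learned map.
market = bs_call(0.25, strikes).reshape(1, -1).astype("float32")   # 'observed' prices
sigma_hat = tf.Variable([[0.4]])                                    # initial guess
opt = tf.keras.optimizers.Adam(0.01)
for _ in range(500):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(net(sigma_hat) - market))
    opt.apply_gradients([(tape.gradient(loss, sigma_hat), sigma_hat)])
print("calibrated sigma:", sigma_hat.numpy()[0, 0])  # should end up close to 0.25
```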
References
- Helmut Bölcskei, Philipp Grohs, Gitta Kutyniok, Philipp Petersen: Optimal Approximation with Sparsely Connected Deep Neural Networks, arXiv:1705.01714.
- Hans Bühler, Lukas Gonon, Josef Teichmann, Ben Wood: Deep Hedging, arXiv:1802.03042.
- Ricky T. Q. Chen, Yulia Rubanova, Jesse Bettencourt, David Duvenaud: Neural ordinary differential equations, arXiv:1806.07366.
- Ilya Chevyrev, Andrey Kormilitzin: A primer on the signature method in machine learning, arXiv:1603.03788.
- Uri Shaham, Alexander Cloninger, Ronald R. Coifman: Provable approximation properties of deep neural networks, arXiv:1509.07385.
- Christa Cuchiero, Martin Larsson, Josef Teichmann: Deep neural networks, generic universal interpolation, and controlled ODEs, arXiv:1908.07838.
- Christa Cuchiero, Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega, Josef Teichmann: Discrete-time signatures and randomness in reservoir computing, arXiv:2010.14615, preprint, submitted, 2020.
- Weinan E, Jiequn Han, Arnulf Jentzen: Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations, arXiv:1706.04702.
- Matteo Gambara, Josef Teichmann: Consistent Recalibration Models and Deep Calibration, arXiv:2006.09455, preprint, submitted, 2020.
- Jakob Heiss, Josef Teichmann, Hanna Wutte: How implicit regularization of Neural Networks affects the learned function -- Part I, arXiv:1911.02903.
- Catherine F. Higham, Desmond J. Higham: Deep Learning: An Introduction for Applied Mathematicians, arXiv:1801.05894.
- Thomas Krabichler, Josef Teichmann: Deep Replication of a Runoff Portfolio, arXiv:2009.05034, preprint, submitted, 2020.
- Na Lei, Kehua Su, Li Cui, Shing-Tung Yau, David Xianfeng Gu: A Geometric View of Optimal Transportation and Generative Model, arXiv:1710.05488.
- Terry Lyons: Rough paths, Signatures and the modelling of functions on streams, arXiv:1405.4537.
- Terry Lyons, Harald Oberhauser: Sketching the order of events, arXiv:1708.09708.
- Josef Teichmann, A recent talk in Konstanz on randomness in training algorithms, 2019.
- Josef Teichmann, A recent talk in Oslo on stationary versions of discrete signature in the spirit of reservoir computing, 2019.
- Josef Teichmann, A recent talk at the Thalesian Seminar Series in New York on generative models in finance, 2020.