The lecture has been developed by Christa Cuchiero and Josef Teichmann. It has been held at ETH Zurich since spring 2019 as a regular lecture for master students, and since autumn 2019 also as a Risk Center lecture. See also previous materials.
Basic material and some exercises for the courses in Zurich in spring 2021
The lecture introduces several fundamental concepts from machine learning with a view towards important financial applications. Lecture notes are provided as iPython notebooks, as slides, or as classical notes. Most of the following code runs safely under Python 3.7, TensorFlow 1.14.0 and Keras 2.2.5; see the first notebook for checking the versions. You can get to a downloadable .ipynb via the following links or open it directly with Google Colab (don't forget to add the data, if any, to the corresponding folder).
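A quick environment check along these lines (standard library only; the actual check in the first notebook may look different) shows how to verify the installed versions before running the notebooks:

```python
import sys

# Report the interpreter version, then the versions of the two
# course dependencies, if they are installed at all.
print("Python", sys.version.split()[0])

for pkg in ("tensorflow", "keras"):
    try:
        mod = __import__(pkg)
        print(pkg, mod.__version__)
    except ImportError:
        print(pkg, "not installed")
```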
Lecture 1 (Introduction, Universal Approximation by shallow networks, one string of arguments for depth): Lecture 1 as iPython notebook and some training data (should be unpacked; then store the files in a folder where the notebook has access). Please find some slides on universal approximation theorems in a general context at UAT Slides.
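The flavor of universal approximation by shallow networks can be seen in a few lines of numpy (a minimal sketch, not the notebook's code): a single hidden layer with randomly drawn weights and a tanh activation, where only the linear readout is fitted, already approximates a smooth target well.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target function to approximate on [-pi, pi].
x = np.linspace(-np.pi, np.pi, 200)[:, None]
y = np.sin(x).ravel()

# Shallow network: one hidden tanh layer with frozen random weights.
width = 100
W = rng.normal(size=(1, width))
b = rng.normal(size=width)
H = np.tanh(x @ W + b)            # hidden features, shape (200, width)

# Fit only the linear readout layer by least squares.
coef, *_ = np.linalg.lstsq(H, y, rcond=None)
err = np.max(np.abs(H @ coef - y))
print("max abs error:", err)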
Lecture 2 (neural ordinary differential equations, backpropagation, expressiveness by randomness): Lecture 2 as iPython notebook and some data-file (should be unpacked; then store the files in a folder where the notebook has access).
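The link between residual networks and neural ODEs can be sketched as follows (an illustrative toy, with a hypothetical vector field, not the notebook's implementation): an explicit Euler discretization of dh/dt = f(h) turns each time step into one residual layer h ← h + dt·f(h).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical vector field f(h) = tanh(A h + b) of a neural ODE dh/dt = f(h).
dim = 4
A = rng.normal(scale=0.5, size=(dim, dim))
b = rng.normal(scale=0.1, size=dim)

def f(h):
    return np.tanh(h @ A.T + b)

def forward(h0, steps=50, T=1.0):
    # Explicit Euler scheme: each step h <- h + dt * f(h) is one
    # residual layer, so the ResNet is a discretized neural ODE.
    dt = T / steps
    h = h0
    for _ in range(steps):
        h = h + dt * f(h)
    return h

h0 = rng.normal(size=dim)
print(forward(h0))
```

Refining the discretization changes the output only at order dt, consistent with the continuous-time picture.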
Lecture 4 (Deep Hedging without transaction costs): Lecture 4 as iPython notebook, where a short Keras implementation of Deep Hedging for the Black Scholes model with analysis of the hedging strategies can be found. Additionally, an instance of indifference pricing is shown. An implementation illustrating the use of Keras in a very simple case can be found here.
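Not the Keras implementation itself, but a minimal numpy sketch of the benchmark against which the learned strategies are analyzed: the discrete Black-Scholes delta hedge (zero interest rate, parameters chosen for illustration). Its terminal P&L has mean close to zero, and its standard deviation shrinks as rebalancing becomes more frequent, which is what a deep hedging network should approximately reproduce.

```python
import numpy as np
from math import erf, sqrt, log

_erf = np.vectorize(erf)

def norm_cdf(x):
    return 0.5 * (1.0 + _erf(np.asarray(x) / sqrt(2.0)))

def bs_call(S, K, T, sigma):
    # Black-Scholes call price with zero interest rate.
    d1 = (log(S / K) + 0.5 * sigma**2 * T) / (sigma * sqrt(T))
    return float(S * norm_cdf(d1) - K * norm_cdf(d1 - sigma * sqrt(T)))

def hedge_pnl(n_steps, n_paths=2000, S0=1.0, K=1.0, T=1.0, sigma=0.2, seed=2):
    # Terminal P&L of selling the call, collecting the premium, and
    # delta-hedging at n_steps discrete rebalancing dates.
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    S = np.full(n_paths, S0)
    cash = np.full(n_paths, bs_call(S0, K, T, sigma))
    pos = np.zeros(n_paths)
    for k in range(n_steps):
        tau = T - k * dt           # time to maturity
        d1 = (np.log(S / K) + 0.5 * sigma**2 * tau) / (sigma * np.sqrt(tau))
        delta = norm_cdf(d1)
        cash -= (delta - pos) * S  # self-financing rebalancing
        pos = delta
        S = S * np.exp(-0.5 * sigma**2 * dt
                       + sigma * np.sqrt(dt) * rng.standard_normal(n_paths))
    payoff = np.maximum(S - K, 0.0)
    return cash + pos * S - payoff

print("hedging error std:", hedge_pnl(10).std(), hedge_pnl(100).std())
```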
Lecture 5 (Deep Portfolio Optimization without transaction costs): Lecture 5 as iPython notebook, where Deep Portfolio Optimization is presented. An implementation illustrating portfolio optimization with transaction costs can be found here.
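A toy stand-in for the network-based optimization (a sketch under illustrative parameters, restricted to constant strategies rather than a neural network): in a Black-Scholes market with mu = 0.05, sigma = 0.2 and zero interest rate, maximizing expected log utility over constant investment fractions by Monte Carlo should approximately recover the Merton fraction mu/sigma^2 = 1.25.

```python
import numpy as np

mu, sigma, T, n_steps, n_paths = 0.05, 0.2, 1.0, 50, 20000
rng = np.random.default_rng(3)
dt = T / n_steps
dW = np.sqrt(dt) * rng.standard_normal((n_paths, n_steps))

def expected_log_utility(frac):
    # For a constant fraction f, log-wealth has the closed form
    # log X_T = (f*mu - f^2*sigma^2/2) T + f*sigma*W_T.
    drift = frac * mu - 0.5 * frac**2 * sigma**2
    logX = drift * T + frac * sigma * dW.sum(axis=1)
    return logX.mean()

# Grid search over constant fractions in place of training a network.
fracs = np.linspace(0.0, 2.5, 26)
best = fracs[np.argmax([expected_log_utility(f) for f in fracs])]
print("best constant fraction:", best)
```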
Lecture 6 (Deep Calibration): a deep calibration implementation for the Heston model and deep calibration for local (stochastic) volatility models. We distinguish three sorts of deep calibration: learning the map from market data to model parameters directly; learning the map from model parameters to market data and inverting it by inverse problem methodology; and parametrizing (infinite-dimensional) parameters by neural networks. Notice that the Heston calibration code runs safely under Python 3.6, TensorFlow 1.8.0 and Keras 2.0.8, or in Google Colab.
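The second sort, learning the map from model parameters to market data and then inverting it, can be illustrated in a toy setting (Black-Scholes instead of Heston, a one-parameter map, and a polynomial fit standing in for the neural network):

```python
import numpy as np
from math import erf, log, sqrt

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(sigma, S=1.0, K=1.0, T=1.0):
    # Black-Scholes call price (zero interest rate), here playing the
    # role of the parameters-to-market-data map.
    d1 = (log(S / K) + 0.5 * sigma**2 * T) / (sigma * sqrt(T))
    return S * norm_cdf(d1) - K * norm_cdf(d1 - sigma * sqrt(T))

# Step 1: learn the map parameter -> price on a training grid
# (a cubic polynomial stands in for the neural network here).
sigmas = np.linspace(0.05, 0.6, 100)
prices = np.array([bs_call(s) for s in sigmas])
approx = np.poly1d(np.polyfit(sigmas, prices, 3))

# Step 2: calibrate by inverting the learned map -- given a market
# price, search for the parameter whose predicted price matches it.
market_price = bs_call(0.25)
grid = np.linspace(0.05, 0.6, 5000)
sigma_hat = grid[np.argmin(np.abs(approx(grid) - market_price))]
print("recovered sigma:", sigma_hat)
```

The point of the two-step approach is that the expensive pricing map is learned offline once, so the inversion at calibration time only queries the cheap surrogate.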
Lecture 7 (Deep Reinforcement Learning): a short theoretical introduction to concepts of reinforcement learning as an iPython notebook.
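Before the deep variant, the core update is easiest to see in tabular form. A minimal sketch on a made-up 5-state chain (not from the notebook): Q-learning is off-policy, so even under a uniformly random behavior policy the greedy policy extracted from Q converges to moving towards the rewarding state.

```python
import numpy as np

# Toy 5-state chain: action 1 moves right and pays reward 1 on reaching
# the last state (then restarts); action 0 resets to the start, reward 0.
n_states, n_actions = 5, 2
rng = np.random.default_rng(4)
Q = np.zeros((n_states, n_actions))
alpha, gamma = 0.1, 0.9

def step(s, a):
    if a == 1:
        if s + 1 == n_states - 1:
            return 0, 1.0          # reached the goal: reward 1, restart
        return s + 1, 0.0
    return 0, 0.0                  # reset action

s = 0
for _ in range(20000):
    a = int(rng.integers(n_actions))   # uniformly random behavior policy
    s2, r = step(s, a)
    # Q-learning update: bootstrap with the greedy value of the next state.
    Q[s, a] += alpha * (r + gamma * np.max(Q[s2]) - Q[s, a])
    s = s2

print("greedy policy:", np.argmax(Q, axis=1))
```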