Paper Title

Mathematical Foundations of Regression Methods for the Approximation of the Forward Initial Margin

Authors

Lucia Cipolina Kun, Simone Caenazzo, Ksenia Ponomareva

Abstract

Abundant literature has been published on approximation methods for the forward initial margin, the most popular being the family of regression methods. This paper describes the mathematical foundations on which these regression approximation methods rest. We introduce mathematical rigor to show that, in essence, all of the methods propose variations of an approximation of the conditional expectation function, which is interpreted as an orthogonal projection on a Hilbert space. We show that each method simply chooses a different functional form to numerically estimate the conditional expectation. In particular, we cover the most popular methods in the literature so far: polynomial approximation, kernel regression, and neural networks.
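For concreteness, the projection property referred to in the abstract can be stated as a standard L² fact (the phrasing here is ours, not the paper's): among all square-integrable functions of the risk factors X, the conditional expectation is the one closest to V in mean square,

\[
\operatorname{E}[V \mid X] \;=\; \arg\min_{g \in L^2(\sigma(X))} \operatorname{E}\big[(V - g(X))^2\big],
\]

and each regression method replaces the full space of measurable functions by a tractable hypothesis class \(\mathcal{G}\) (polynomials, kernel expansions, neural networks), solving the empirical counterpart

\[
\hat{g} \;=\; \arg\min_{g \in \mathcal{G}} \frac{1}{N}\sum_{i=1}^{N}\big(V_i - g(X_i)\big)^2 .
\]

The sketch below illustrates that point numerically: three generic estimators, one per family named in the abstract, are fitted to the same simulated data to approximate E[Y | X]. This is a minimal illustrative example using standard scikit-learn estimators, not the paper's implementation; the data-generating process and all tuning parameters (polynomial degree, kernel bandwidth, network size) are arbitrary assumptions chosen for illustration only.

```python
# Minimal sketch: three functional forms fitted by least squares to the same
# simulated data, each producing an approximation of E[Y | X].
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.kernel_ridge import KernelRidge
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(500, 1))
true_cond_exp = np.sin(2.0 * X[:, 0])            # E[Y | X] used to generate the data
Y = true_cond_exp + 0.3 * rng.standard_normal(500)

models = {
    "polynomial": make_pipeline(PolynomialFeatures(degree=5), LinearRegression()),
    "kernel":     KernelRidge(kernel="rbf", alpha=0.1, gamma=1.0),
    "neural net": MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
}

for name, model in models.items():
    model.fit(X, Y)                              # least-squares fit = empirical L2 projection
    mse = np.mean((model.predict(X) - true_cond_exp) ** 2)
    print(f"{name:10s}  MSE vs. true conditional expectation: {mse:.4f}")
```

Each fit minimizes an empirical squared loss, i.e. a sample version of the orthogonal projection above; the three methods differ only in the hypothesis class over which that loss is minimized.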
