Table of contents
1 Introduction
1.1 Estimation of the conditional mean: linear regression and related methods
1.1.1 Least-squares estimators and penalization
1.1.2 Adaptivity to $\sigma$ using two square-root estimators
1.1.3 Robustness to outliers using the Median-of-Means approach
1.2 Copulas and conditional dependence modeling
1.2.1 Distributions with given margins
1.2.2 Inference of copulas models
1.2.3 Conditional copulas and the simplifying assumption
1.2.4 Kendall’s tau: a measure of dependence, and its conditional version
1.2.5 Estimation of the conditional Kendall’s tau
1.3 Other topics in inference
1.3.1 Estimation of a regular conditional functional by conditional U-statistic regression
1.3.2 About confidence intervals for ratios of means
Publications List
I Linear regression
2 Improved bounds for Square-root Lasso and Square-root Slope
2.1 Introduction
2.2 The framework
2.3 Optimal rates for the Square-Root Lasso
2.4 Adaptation to sparsity by a Lepski-type procedure
2.5 Algorithms for computing the Square-root Slope
2.6 Optimal rates for the Square-Root Slope
2.7 Proofs
2.7.1 Preliminary lemmas
2.7.2 Proof of Theorem 2.1
2.7.3 Proofs of the adaptive procedure
2.7.3.1 Proof of Theorem 2.3
2.7.3.2 Proof of Lemma 2.4
2.7.3.3 Proof of Lemma 2.5
2.7.4 Proof of Theorem 2.8
3 Robust-to-outliers simultaneous inference and noise level estimation using a MOM approach
3.1 Introduction
3.2 Results in the high-dimensional linear regression framework
3.3 A general framework
3.4 Technical lemmas
3.5 Control of the supremum of $T_{K,\lambda}(g, \sigma; f, \sigma^*)$ on each $\mathcal{F}(\kappa)$
3.5.1 Preliminaries
3.5.2 Proof of the first assertion of Lemma 3.12
3.5.3 Proof of the second assertion of Lemma 3.12
3.6 Proof of Lemma 3.11
3.6.1 Bound on $\mathcal{F}(\kappa)_1$
3.6.2 Bound on $\mathcal{F}(\kappa)_2$
3.6.3 Bound on $\mathcal{F}(\kappa)_3$
3.6.3.1 Case $\|f - f^*\|_{L_2(P)} \leq r(K)$
3.6.3.2 Case $\|f - f^*\|_{L_2(P)} > r(K)$
3.6.4 Bound on $\mathcal{F}(\kappa)_4$
3.6.5 Bound on $\mathcal{F}(\kappa)_5$
3.6.6 Bound on $\mathcal{F}(\kappa)_6$
3.6.6.1 Case $\|f - f^*\|_{L_2(P)} \leq r(K)$
3.6.6.2 Case $\|f - f^*\|_{L_2(P)} > r(K)$
3.6.7 Bound on $\mathcal{F}(\kappa)_7$
3.6.8 Bound on $\mathcal{F}(\kappa)_8$
3.6.9 Bound on $\mathcal{F}(\kappa)_9$
3.6.9.1 Case $\|f - f^*\|_{L_2(P)} \leq r(K)$
3.6.9.2 Case $\|f - f^*\|_{L_2(P)} > r(K)$
3.7 Proofs of main results
3.7.1 Proof of Theorem 3.4
3.7.2 Proof of Theorem 3.1
II Conditional copula estimation
4 About tests of the “simplifying” assumption for conditional copulas
4.1 Introduction
4.2 Tests of the simplifying assumption
4.2.1 “Brute-force” tests of the simplifying assumption
4.2.2 Tests based on the independence property
4.2.3 Parametric tests of the simplifying assumption
4.2.4 Bootstrap techniques for tests of $H_0$
4.2.4.1 Some resampling schemes
4.2.4.2 Bootstrapped test statistics
4.3 Tests with “boxes”
4.3.1 The link with the simplifying assumption
4.3.2 Non-parametric tests with “boxes”
4.3.3 Parametric test statistics with “boxes”
4.3.4 Bootstrap techniques for tests with boxes
4.4 Numerical applications
4.5 Conclusion
4.6 Notation
4.7 Proof of Theorem 4.14
4.7.1 Preliminaries
4.7.2 Proof of Theorem 4.14
4.7.3 Proof of Proposition 4.16
5 About kernel-based estimation of conditional Kendall’s tau: finite-distance bounds and asymptotic behavior
5.1 Introduction
5.2 Definition of several kernel-based estimators of $\tau_{1,2|z}$
5.3 Theoretical results
5.3.1 Finite-distance bounds
5.3.2 Asymptotic behavior
5.4 Simulation study
5.5 Proofs
5.5.1 Proof of Proposition 5.1
5.5.2 Proof of Proposition 5.2
5.5.3 Proof of Proposition 5.3
5.5.4 Proof of Proposition 5.4
5.5.5 Proof of Proposition 5.6
5.5.6 Proof of Proposition 5.7
5.5.7 Proof of Proposition 5.8
5.5.8 Proof of Proposition 5.9
5.5.9 Proof of Lemma 5.17
6 About Kendall’s regression
6.1 Introduction
6.2 Finite-distance bounds on $\hat\beta$
6.3 Asymptotic behavior of $\hat\beta$
6.3.1 Asymptotic properties of $\hat\beta$ when $n \to \infty$ and for fixed $n'$
6.3.2 Oracle property and a related adaptive procedure
6.3.3 Asymptotic properties of $\hat\beta$ when $n$ and $n'$ jointly tend to $+\infty$
6.4 Simulations
6.4.1 Numerical complexity
6.4.2 Choice of tuning parameters and estimation of the components of $\beta$
6.4.3 Comparison between parametric and nonparametric estimators of the conditional Kendall’s tau
6.4.4 Comparison with the tests of the simplifying assumption
6.4.5 Dimension 2 and choice of $\psi$
6.5 Real data application
6.6 Proofs of finite-distance results for $\hat\beta$
6.6.1 Technical lemmas
6.6.2 Proof of Theorem 6.5
6.7 Proofs of asymptotic results for $\hat\beta_{n,n'}$
6.7.1 Proof of Lemma 6.7
6.7.2 Proof of Theorem 6.10
6.7.3 Proof of Proposition 6.11
6.7.4 Proof of Theorem 6.12
6.7.5 Proof of Theorem 6.13
6.8 Proof of Theorem 6.14
6.8.1 Proof of Lemma 6.18: convergence of $T_1$
6.8.2 Proof of the asymptotic normality of $T_4$
6.8.3 Convergence of $T_6$ to 0
6.8.4 Convergence of $T_7$ to 0
6.8.5 Convergence of $T_3$ to 0
6.9 Technical results concerning the first-step estimator
6.10 Estimation results for a particular sample
7 A classification point-of-view on conditional Kendall’s tau
7.1 Introduction
7.2 Regression-type approach
7.3 Classification algorithms and conditional Kendall’s tau
7.3.1 The case of probit and logit classifiers
7.3.2 Decision trees and random forests
7.3.3 Nearest neighbors
7.3.4 Neural networks
7.3.5 Lack of independence and its influence on the proposed algorithms
7.4 Simulation study
7.4.1 Choice of the functions $\{\psi_i\}$, $i = 1, \ldots, p'$
7.4.2 Comparing different copulas families
7.4.3 Comparing different conditional margins
7.4.4 Comparing different forms for the conditional Kendall’s tau
7.4.5 Higher dimensional settings
7.4.6 Choice of the number of neurons in the one-dimensional reference setting
7.4.7 Influence of the sample size n
7.4.8 Influence of the lack of independence
7.5 Applications to financial data
7.5.1 Conditional dependence with respect to the Eurostoxx’s volatility proxy
7.5.2 Conditional dependence with respect to the variations $\Delta I$ of the Eurostoxx's implied volatility index
7.6 Conclusion
7.7 Some basic definitions about copulas
7.8 Proof of Theorem 7.3
7.9 Proof of Theorem 7.4
III Other topics in inference
8 Estimation of a regular conditional functional by conditional U-statistic regression
8.1 Introduction
8.2 Theoretical properties of the nonparametric estimator $\hat\theta(\cdot)$
8.2.1 Non-asymptotic bounds for $N_k$
8.2.2 Non-asymptotic bounds in probability for $\hat\theta$
8.2.3 Asymptotic results for $\hat\theta$
8.3 Theoretical properties of the estimator $\hat\beta$
8.3.1 Non-asymptotic bounds on $\hat\beta$
8.3.2 Asymptotic properties of $\hat\beta$ when $n \to \infty$ and for fixed $n'$
8.3.3 Asymptotic properties of $\hat\beta$ jointly in $(n, n')$
8.4 Applications and examples
8.5 Notations
8.6 Finite-distance proofs for $\hat\theta$ and $\hat\beta$
8.6.1 Proof of Lemma 8.3
8.6.2 Proof of Proposition 8.5
8.6.3 Proof of Theorem 8.8
8.7 Proof of Theorem 8.14
8.7.1 Proof of Lemma 8.20
8.7.2 Proof of the asymptotic normality of $T_4$
8.7.3 Convergence of $T_6$ to 0
8.7.4 Convergence of $T_7$ to 0
8.7.5 Convergence of $T_3$ to 0
9 Confidence intervals for ratios of means: limitations of the delta method and honest confidence intervals
9.1 Introduction
9.2 Our framework
9.3 Limitations of the delta method
9.3.1 Asymptotic approximation takes time to hold
9.3.2 Asymptotic results may not hold for sequences of models
9.3.3 Extension of the delta method for ratios of expectations in the sequence-of-models framework
9.4 Construction of nonasymptotic confidence intervals
9.4.1 An easy case: the support of $Y$ is well-separated from 0
9.4.2 Nonasymptotic confidence intervals with no assumption on the support of $P_Y$
9.5 Nonasymptotic CIs: impossibility results and practical guidelines
9.5.1 An upper bound on testable confidence levels
9.5.2 Practical methods and plug-in estimators
9.5.3 A lower bound on the length of nonasymptotic confidence intervals
9.6 Numerical applications
9.6.1 Simulations
9.6.2 Application to real data
9.7 Conclusion
9.8 Proofs of the results in Sections 9.3, 9.4 and 9.5
9.8.1 Proof of Theorem 9.1
9.8.2 Proof of Theorem 9.2
9.8.3 Proof of Theorem 9.3
9.8.4 Proof of Theorem 9.6
9.8.5 Proof of Theorem 9.5
9.9 Adapted results for Hoeffding framework
9.9.1 Concentration inequality in the easy case
9.9.2 Concentration inequality in the general case
9.9.3 An upper bound on testable confidence levels
9.9.4 Proof of Theorems 9.11 and 9.12
9.9.5 Proof of Theorem 9.13
9.10 Additional simulations
9.10.1 Gaussian distributions
9.10.2 Rule of thumb using n
9.10.3 Student distributions
9.10.4 Exponential distributions
9.10.5 Pareto distributions
9.10.6 Bernoulli distributions
9.10.7 Poisson distributions
Acknowledgements
Bibliography