14.05.2025 12:15 Luciana Dalla Valle (University of Torino, IT): Approximate Bayesian conditional copulas
According to Sklar's theorem, any absolutely continuous multivariate distribution function can be uniquely represented by a copula, which captures the dependence structure among the vector components. In real-data applications, interest often lies in specific functionals of the dependence, which summarise aspects of it in a few numerical values. A broad literature exists on such functionals; however, extensions to include covariates are still limited. This is mainly due to the lack of unbiased estimators of the conditional copula, especially when one does not have enough information to select a copula model. Several Bayesian methods to approximate the posterior distribution of functionals of the dependence varying with covariates are presented and compared; the main advantage of the investigated methods is that they use nonparametric models, avoiding the selection of the copula, which is usually a delicate aspect of copula modelling. The methods are compared in simulation studies and in two realistic applications, from civil engineering and astrophysics.
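A minimal sketch of the kind of object the abstract describes: a covariate-dependent functional of the dependence. Here Kendall's tau at a covariate value x0 is estimated model-free by kernel-weighting pairs of pseudo-observations, avoiding any copula selection. All names and the kernel choice are illustrative assumptions, not from the talk.

```python
import numpy as np

def conditional_kendalls_tau(u, v, x, x0, bandwidth=0.3):
    """Kernel-weighted Kendall's tau at covariate value x0 (illustrative)."""
    w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)  # Gaussian kernel weights
    num = den = 0.0
    n = len(u)
    for i in range(n):
        for j in range(i + 1, n):
            wij = w[i] * w[j]
            num += wij * np.sign((u[i] - u[j]) * (v[i] - v[j]))
            den += wij
    return num / den

# toy data: the strength of dependence between u and v grows with x
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 300)
z = rng.normal(size=300)
u = z + rng.normal(size=300)
v = x * z + rng.normal(size=300)
tau_low = conditional_kendalls_tau(u, v, x, x0=0.1)
tau_high = conditional_kendalls_tau(u, v, x, x0=0.9)
```

The estimate at x0 = 0.9 exceeds the one at x0 = 0.1, reflecting the covariate-driven dependence without ever fitting a parametric copula.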
14.05.2025 16:15 Rajen Shah (University of Cambridge, UK): Robustness in Semiparametric Statistics
Given that all models are wrong, it is important to understand the performance of methods when the settings for which they have been designed are not met, and to modify them where possible so they are robust to these sorts of departures from the ideal. We present two examples with this broad goal in mind.
We first look at a classical case of model misspecification in (linear) mixed-effect models for grouped data. Existing approaches estimate linear model parameters through weighted least squares, with optimal weights (given by the inverse covariance of the response, conditional on the covariates) typically estimated by maximizing a (restricted) likelihood from random effects modelling or by using generalized estimating equations. We introduce a new ‘sandwich loss’ whose population minimizer coincides with the weights of these approaches when the parametric forms for the conditional covariance are well-specified, but can yield arbitrarily large improvements when they are not.
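A small numerical illustration of the weighted-least-squares setup the first vignette starts from (not the sandwich loss itself): for grouped data with a random intercept, weighting each group by the inverse of its response covariance recovers the slope, while OLS corresponds to identity weights. The covariance is assumed known here purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_groups, m = 50, 5                 # 50 groups of size 5
beta_true = 2.0
X, Y = [], []
for _ in range(n_groups):
    x = rng.normal(size=m)
    b = 2.0 * rng.normal()          # random intercept -> within-group correlation
    X.append(x)
    Y.append(beta_true * x + b + rng.normal(size=m))

# GLS weights: inverse of Sigma = sigma_b^2 * 11' + sigma_e^2 * I (known here)
Sigma = 4.0 * np.ones((m, m)) + np.eye(m)
W = np.linalg.inv(Sigma)
num = den = 0.0
for x, y in zip(X, Y):
    num += x @ W @ y
    den += x @ W @ x
beta_gls = num / den

# OLS: same estimating equation with identity weights
xs, ys = np.concatenate(X), np.concatenate(Y)
beta_ols = (xs @ ys) / (xs @ xs)
```

When the parametric covariance form is misspecified, the inverse-covariance weights lose their optimality, which is the gap the proposed sandwich loss targets.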
The starting point of our second vignette is the recognition that semiparametric efficient estimation can be hard to achieve in practice: estimators that are in theory efficient may require unattainable levels of accuracy in the estimation of complex nuisance functions. As a consequence, estimators deployed on real datasets are often chosen in a somewhat ad hoc fashion and may suffer high variance. We study this gap between theory and practice in the context of a broad collection of semiparametric regression models that includes the generalized partially linear model. We advocate using estimators that are robust in the sense that they are root-n consistent uniformly over a sufficiently rich class of distributions, characterized by certain conditional expectations being estimable by user-chosen machine learning methods. We show that even asking for locally uniform estimation within such a class narrows down possible estimators to those parametrized by certain weight functions, and we develop a new random-forest-based estimation scheme to estimate the optimal weights. We demonstrate the effectiveness of the resulting estimator in a variety of semiparametric settings on simulated and real-world data.
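A sketch of the partially linear setting the vignette builds on: Y = theta*D + g(X) + noise, with the conditional expectations E[D|X] and E[Y|X] estimated by a user-chosen learner and the target theta recovered by residual-on-residual regression with cross-fitting. The polynomial "learner" is a stand-in assumption; any machine learning method could take its place.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
x = rng.uniform(-2, 2, n)
d = np.sin(x) + rng.normal(scale=0.5, size=n)   # treatment depends on x
theta_true = 1.5
y = theta_true * d + x**2 + rng.normal(size=n)  # g(x) = x^2, unknown to the method

def fit_predict(x_tr, t_tr, x_te, deg=5):
    """Stand-in nuisance learner: polynomial regression (illustrative only)."""
    return np.polyval(np.polyfit(x_tr, t_tr, deg), x_te)

# cross-fitting: fit nuisances on one half, residualize the other half
idx = rng.permutation(n)
halves = (idx[: n // 2], idx[n // 2 :])
num = den = 0.0
for a, b in (halves, halves[::-1]):
    d_res = d[b] - fit_predict(x[a], d[a], x[b])  # D - E[D|X]
    y_res = y[b] - fit_predict(x[a], y[a], x[b])  # Y - E[Y|X]
    num += d_res @ y_res
    den += d_res @ d_res
theta_hat = num / den
```

The talk's contribution concerns which weightings of such residuals retain root-n consistency uniformly over a rich distribution class; this unweighted version is only the baseline construction.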
19.05.2025 14:15 Teemu Pennanen (King's College London): Optimal Operation and Valuation of Electricity Storages
We apply computational techniques of convex stochastic optimization to the optimal operation and valuation of electricity storages in the face of uncertain electricity prices. Our approach is based on quadrature approximations of Markov processes and on the Stochastic Dual Dynamic Programming (SDDP) algorithm, which is widely applied across the energy industry. The approach is applicable to various storage specifications and allows, for example, hard constraints on storage capacity and charging speed. Our valuations are based on the indifference pricing principle, which builds on optimal trading strategies and calibrates to the user's initial position, market views and risk preferences. We illustrate the effects of storage capacity and charging speed by numerically computing the valuations using stochastic dual dynamic programming. If time permits, we provide theoretical justification of the employed computational techniques.
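A deterministic toy version of the control problem underneath the abstract (SDDP handles the stochastic-price version): backward dynamic programming on a discretized storage level, with hard bounds on capacity and charging speed. All parameter names are illustrative.

```python
import numpy as np

def storage_value(prices, capacity=10.0, rate=2.0, levels=51):
    """Backward DP for storage operation against a known price path (sketch)."""
    grid = np.linspace(0.0, capacity, levels)
    value = np.zeros(levels)                # terminal: leftover energy worth 0
    for p in reversed(prices):
        new_value = np.empty(levels)
        for i, s in enumerate(grid):
            best = -np.inf
            for s_next in grid:
                q = s_next - s              # q > 0: buy/charge, q < 0: sell/discharge
                if abs(q) <= rate:          # charging-speed constraint
                    best = max(best, -p * q + np.interp(s_next, grid, value))
            new_value[i] = best
        value = new_value
    return value[0]                         # value starting from an empty storage

# two periods, price spread 10 -> 30: charge 2 units at 10, discharge at 30
v = storage_value([10.0, 30.0])
```

With a charging rate of 2, the optimal policy buys 2 units at price 10 and sells them at 30, so the storage is worth 2 * (30 - 10) = 40; tightening `rate` or `capacity` shrinks this value, which is the effect the talk quantifies (with indifference pricing on top).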
19.05.2025 15:00 Kevin Hu: An H-theorem for the Markov local-field equation
In this talk, I will discuss recent results which characterize the long-time behavior of a conditional McKean–Vlasov equation related to interacting diffusions on regular trees. This is joint work with Kavita Ramanan.
20.05.2025 16:00 Leon Jendraszewski: Assigning new employees to positions
We study a problem arising in the management of staff positions in institutions with a cameralistic budgeting system, for example in public universities in Germany. When a new employee is hired, she needs to be assigned to one or (partially) to several of the available positions. These positions may already be (partially) assigned to other staff members during certain time periods. Some positions are better suited for the new hire due to, e.g., their associated pay grades or other administrative reasons. One seeks a solution with assignments to suitable open positions and wants few changes of those assignments over time. This yields a multi-objective optimization problem with a lexicographic objective function, which can be seen as a scheduling problem with non-availability periods for the machines.
We derive structural insights into this problem and present several MIP formulations for it. Their solutions are optimal w.r.t. the three most important objectives and optimal or near-optimal w.r.t. the least important objective, respectively. In particular, we are able to solve our problem faster than with a straightforward approach if one is willing to potentially sacrifice a bit of accuracy in the least important objective. In addition, we present very fast combinatorial algorithms for important special cases of the problem. Overall, we can solve most practically relevant instances within a few seconds. Our optimization tool was developed in collaboration with the administration of the School of Computation, Information, and Technology (CIT) at the Technical University of Munich, where it is now used on a regular basis.
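A toy combinatorial sketch of a single-period special case (the general problem, with time periods and the full lexicographic objective, is the MIP): split a new hire's full-time fraction across positions, preferring more suitable positions and, within a suitability class, larger free slices so that fewer assignment pieces result. The data format and ordering rule are illustrative assumptions, not the paper's formulation.

```python
def assign_employee(demand, positions):
    """Greedy split of a hire's FTE fraction `demand` over positions.

    `positions` is a list of (name, free_fraction, suitability);
    higher suitability is better. Illustrative special case only.
    """
    order = sorted(positions, key=lambda p: (-p[2], -p[1]))
    plan, remaining = [], demand
    for name, free, _suit in order:
        if remaining <= 1e-9:
            break
        take = min(free, remaining)     # respect the position's free capacity
        if take > 0:
            plan.append((name, take))
            remaining -= take
    return plan, remaining

# a 100% hire, three partially occupied positions of decreasing suitability
plan, unmet = assign_employee(
    1.0,
    [("P1", 0.5, 3), ("P2", 0.75, 2), ("P3", 1.0, 1)],
)
```

Here the hire is split 50/50 over the two most suitable positions; the MIP formulations in the talk additionally handle time-varying availability and trade off the number of such splits against reassignments over time.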