We consider power utility maximization in a multivariate Black-Scholes model that is enhanced by credit risk via the Marshall-Olkin exponential distribution. On the practical side, the model is analytically tractable, easy to interpret, and thus simple to implement. On the theoretical side, it constitutes a well-justified and intuitive mathematical framework for studying the effect of extreme and higher-order dependence on optimal portfolios. In particular, we show that it is rich enough to model both situations in which diversification is beneficial and situations in which it is not.
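The common-shock construction behind the bivariate Marshall-Olkin exponential law can be sketched in a few lines; the function and parameter names (`lam1`, `lam2`, `lam12`) and the restriction to two assets are illustrative assumptions, not the paper's notation:

```python
import numpy as np

def marshall_olkin_bivariate(lam1, lam2, lam12, n, seed=None):
    """Draw n samples (tau1, tau2) from the bivariate Marshall-Olkin
    exponential law via its common-shock representation: each component
    dies at the first arrival of its idiosyncratic shock or the common shock."""
    rng = np.random.default_rng(seed)
    e1 = rng.exponential(1.0 / lam1, n)    # idiosyncratic shock, asset 1
    e2 = rng.exponential(1.0 / lam2, n)    # idiosyncratic shock, asset 2
    e12 = rng.exponential(1.0 / lam12, n)  # common shock hitting both assets
    return np.minimum(e1, e12), np.minimum(e2, e12)
```

Because the common shock can be the minimum for both components simultaneously, the law assigns positive probability to joint defaults, which is the source of the extreme dependence referred to above.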
We explore how the joint modeling of financial assets can utilize methodologies from geostatistical modeling. The considered approach is essentially based on modeling data as realizations of a (Gaussian) random field, which allows for a parsimonious representation of the dependence structure by means of a covariance function taken to be a function of the distance between observations. A key benefit of this ansatz is the possibility of including new data points, i.e. of considering new companies in financial applications. Consequently, geostatistical modeling has appealing benefits in the contexts of covariance matrix estimation and missing data imputation. We thoroughly discuss the adjustments necessary when applying geostatistical methods to the high-dimensional setting entailed by the modeling of financial data, as opposed to the 2D/3D coordinate space encountered in the original applications of the method. We illustrate the two use cases of covariance matrix estimation and missing data imputation on a data set of CDS spreads of constituents of the iTraxx universe.
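The basic geostatistical idea can be illustrated as follows: a covariance matrix is built from a covariogram applied to pairwise distances, and a new data point is handled via kriging weights. The exponential covariogram, the 2D coordinates, and all names are assumptions chosen for illustration, not the estimation procedure of the article (where the coordinates would be abstract, high-dimensional company features):

```python
import numpy as np

def exp_covariance(coords, sill=1.0, corr_len=1.0, nugget=0.0):
    """Covariance matrix from the exponential covariogram
    C(h) = sill * exp(-h / corr_len), with h the distance between points."""
    diff = coords[:, None, :] - coords[None, :, :]
    h = np.sqrt((diff ** 2).sum(axis=-1))       # pairwise distance matrix
    return sill * np.exp(-h / corr_len) + nugget * np.eye(len(coords))

def simple_kriging_weights(coords, new_point, sill=1.0, corr_len=1.0):
    """Weights for predicting the field at `new_point` from the observed
    locations (simple kriging with known zero mean): w = C^{-1} c0."""
    c0 = sill * np.exp(-np.linalg.norm(coords - new_point, axis=1) / corr_len)
    return np.linalg.solve(exp_covariance(coords, sill, corr_len), c0)
```

Predicting at an already observed location reproduces that observation exactly (weight one on the matching point), which is the interpolation property that makes the approach attractive for missing data imputation.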
The majority of observed correlation matrices in financial applications exhibit the Perron-Frobenius property, namely a dominant eigenvector with only positive entries. We present a simulation algorithm for random correlation matrices satisfying this property, which can be augmented to take into account a realistic eigenvalue structure. From the construction principle applied in our algorithm, and the fact that it is able to generate all such correlation matrices, we can further compute explicitly the proportion of Perron-Frobenius correlation matrices within the set of all correlation matrices of a fixed dimension.
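The Perron-Frobenius property itself is straightforward to test numerically. The sketch below checks it for a given correlation matrix and estimates the proportion under a naive Wishart-based sampler; note that this sampler is only an illustrative stand-in and, unlike the algorithm of the article, is neither uniform on the set of correlation matrices nor able to generate all Perron-Frobenius matrices:

```python
import numpy as np

def has_perron_frobenius_property(corr, tol=1e-12):
    """True if the eigenvector of the largest eigenvalue of `corr`
    can be chosen with strictly positive entries."""
    _, vecs = np.linalg.eigh(corr)            # eigenvalues in ascending order
    v = vecs[:, -1]                           # dominant eigenvector
    v = v * np.sign(v[np.argmax(np.abs(v))])  # fix the sign convention
    return bool(np.all(v > tol))

def random_corr(d, rng):
    """Naive Wishart-based random correlation matrix (illustrative only)."""
    A = rng.standard_normal((d, d))
    S = A @ A.T
    s = np.sqrt(np.diag(S))
    return S / np.outer(s, s)
```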
A current market practice to incorporate multivariate defaults in global risk-factor simulations is the iteration of (multiplicative) i.i.d. survival indicator increments along a given time grid, where the indicator distribution is based on a copula ansatz. The underlying assumption is that the resulting iterated default distribution behaves similarly to the one-shot distribution. We show that in most cases this assumption is not fulfilled, and we present numerical analysis exhibiting sizable differences in the probabilities assigned to both “survival-of-all” and “mixed default/survival” events. We furthermore present a survey of those copula families for which the aforementioned methodology does work.
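The discrepancy between the iterated and one-shot distributions can be reproduced in a short Monte Carlo sketch. A bivariate Gaussian copula is used purely for illustration; the copula family, grid size, and all parameter values are assumptions, and the one-shot case corresponds to `n_steps=1`:

```python
import numpy as np
from statistics import NormalDist

def joint_survival_iterated(rho, p_total, n_steps, n_sim=200_000, seed=0):
    """MC estimate of P(both names survive the horizon) when i.i.d.
    Gaussian-copula survival indicator increments are iterated along an
    n_steps grid, with per-step marginals chosen so that each name's
    total survival probability is p_total."""
    rng = np.random.default_rng(seed)
    p_step = p_total ** (1.0 / n_steps)        # per-step marginal survival
    thresh = NormalDist().inv_cdf(p_step)
    L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
    alive = np.ones((n_sim, 2), dtype=bool)
    for _ in range(n_steps):
        z = rng.standard_normal((n_sim, 2)) @ L.T  # correlated Gaussian step
        alive &= (z < thresh)                      # multiplicative indicator
    return alive.all(axis=1).mean()
```

In this sketch, for positively correlated names the iterated scheme assigns visibly less probability to the “survival-of-all” event than the one-shot distribution, since the Gaussian copula's dependence washes out along the grid.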
Some empirical studies suggest that the computation of certain graph structures from a (large) historical correlation matrix can be helpful in portfolio selection. In particular, a repeated finding is that information about the portfolio weights of the minimum variance portfolio (MVP) from classical Markowitz theory can be inferred from centrality measurements in such graph structures. The present article compares the two concepts from a purely algebraic perspective. We demonstrate that the heuristic relationship between graph centrality and the MVP has no significant purely mathematical foundation; empirically found relations between the two concepts therefore depend critically on the underlying historical data. Repeated empirical evidence for a strong relationship is hence shown to constitute a stylized fact of financial return time series, rather than the expected outcome of a heuristic similarity between the two approaches.