Assuming the absence of arbitrage in a single-name credit risk model, it is shown how to replicate the risk-free bank account until a credit event by a static portfolio of a bond and infinitely many credit default swaps (CDS). From the viewpoint of classical arbitrage pricing theory, this static portfolio can be viewed as the solution of a credit risk hedging problem whose dual problem is to price the bond consistently with the CDS. This duality is maintained when the risk-free rate is shifted in parallel. In practice, there is a unique parallel shift that is consistent with observed market prices for the bond and the credit default swaps. In the case of a positive shift, the resulting risk-free trading strategy earns more than the risk-free rate and is referred to in the market as negative basis arbitrage; the parallel shift defined in this way is therefore a scientifically well-justified definition of what the market calls the negative basis. In economic terms, it is a premium for taking the un-modeled residual risks of a bond investment after interest rate risk and credit risk have been hedged away, predominantly liquidity risk and legal risk.
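The "unique parallel shift" can be illustrated numerically. The sketch below is a deliberately simplified stand-in for the paper's construction: a flat risk-free rate, a flat hazard rate obtained from the CDS spread via the credit triangle, zero bond recovery, and bisection for the shift that reconciles the market bond price with the CDS-implied model price. All numbers are illustrative, not market data.

```python
import math

def cds_hazard(spread, recovery):
    """Flat hazard rate from the CDS 'credit triangle': lambda = s / (1 - R)."""
    return spread / (1.0 - recovery)

def model_price(cashflows, r, shift, lam):
    """Simplified bond price: promised cashflows discounted at the shifted
    risk-free rate and weighted by the survival probability exp(-lam * t).
    Zero bond recovery is assumed to keep the sketch short."""
    return sum(cf * math.exp(-(r + shift) * t) * math.exp(-lam * t)
               for t, cf in cashflows)

def negative_basis(cashflows, market_price, r, lam, lo=-0.10, hi=0.10):
    """Bisection for the parallel shift of the risk-free rate that makes
    the model price match the observed market price."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if model_price(cashflows, r, mid, lam) > market_price:
            lo = mid   # model price too high -> a larger shift is needed
        else:
            hi = mid
    return 0.5 * (lo + hi)

# hypothetical 5y bond with a 4% annual coupon, quoted at 95.0
cashflows = [(t, 4.0) for t in range(1, 5)] + [(5, 104.0)]
r = 0.02
lam = cds_hazard(0.015, 0.4)   # 150 bp CDS spread, 40% recovery
z = negative_basis(cashflows, 95.0, r, lam)
```

A positive `z` corresponds to the negative-basis situation described above: the bond is cheap relative to its CDS-implied value, and the hedged position earns the risk-free rate plus `z`.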
The majority of correlation matrices observed in financial applications exhibit the Perron-Frobenius property, namely a dominant eigenvector with only positive entries. We present a simulation algorithm for random correlation matrices satisfying this property, which can be augmented to account for a realistic eigenvalue structure. From the construction principle applied in our algorithm, and the fact that it can generate all such correlation matrices, we are further able to compute explicitly the proportion of Perron-Frobenius correlation matrices within the set of all correlation matrices of a fixed dimension.
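The Perron-Frobenius property is easy to test numerically. The sketch below uses a generic random-factor-loading recipe to sample correlation matrices (not the paper's algorithm, which by construction covers all Perron-Frobenius matrices) and a Monte Carlo estimate of the proportion satisfying the property in a fixed dimension:

```python
import numpy as np

def random_correlation(d, rng):
    """Sample a random correlation matrix by normalizing a Wishart-type
    matrix W W' -- a common generic recipe, not the paper's construction."""
    W = rng.standard_normal((d, d))
    S = W @ W.T
    inv_sd = 1.0 / np.sqrt(np.diag(S))
    return inv_sd[:, None] * S * inv_sd[None, :]

def is_perron_frobenius(C, tol=1e-12):
    """True if the eigenvector of the largest eigenvalue can be chosen
    with strictly positive entries."""
    vals, vecs = np.linalg.eigh(C)       # eigenvalues in ascending order
    v = vecs[:, -1]                      # dominant eigenvector
    v = v * np.sign(v.sum())             # fix the sign convention
    return bool(np.all(v > tol))

rng = np.random.default_rng(0)
d = 5
samples = [random_correlation(d, rng) for _ in range(2000)]
share = float(np.mean([is_perron_frobenius(C) for C in samples]))
```

Under this particular sampling scheme `share` estimates the proportion of Perron-Frobenius matrices with respect to the induced distribution; the paper computes the proportion with respect to the uniform distribution on the set of all correlation matrices.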
A current market practice for incorporating multivariate defaults into global risk-factor simulations is the iteration of (multiplicative) i.i.d. survival indicator increments along a given time grid, where the indicator distribution is based on a copula ansatz. The underlying assumption is that the resulting iterated default distribution behaves similarly to the one-shot distribution. It is shown that in most cases this assumption is not fulfilled, and numerical analysis is presented that shows sizable differences in the probabilities assigned to both “survival-of-all” and “mixed default/survival” events. We furthermore present a survey of those copula families for which the aforementioned methodology does work.
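The discrepancy between the iterated and the one-shot distribution can be seen already with two names and closed-form bivariate copulas. The sketch below is illustrative and not taken from the paper: it compares the "survival-of-all" probability under a Clayton copula, where iteration visibly changes the distribution, with a Gumbel copula, whose max-stability makes the iteration exact.

```python
import math

def clayton(u, v, theta):
    """Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta)."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def gumbel(u, v, theta):
    """Gumbel copula, an extreme-value (max-stable) copula."""
    s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-s ** (1.0 / theta))

def iterated(copula, p, theta, n):
    """n-fold iteration of i.i.d. survival indicator increments with
    per-step marginal survival probability p**(1/n)."""
    q = p ** (1.0 / n)
    return copula(q, q, theta) ** n

p, theta, n = 0.90, 2.0, 12   # one-period survival prob, copula parameter, grid steps

clayton_one_shot = clayton(p, p, theta)
clayton_iterated = iterated(clayton, p, theta, n)

gumbel_one_shot = gumbel(p, p, theta)
gumbel_iterated = iterated(gumbel, p, theta, n)
```

For the Gumbel copula the identity holds exactly, since the iteration merely rescales `-log(u)` by `1/n` inside a homogeneous expression; for the Clayton copula the iterated joint survival probability is visibly smaller than the one-shot value.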
We discuss some critical aspects of the valuation of convertible bonds whose underlying equity trades in a currency different from the bond currency.
Some empirical studies suggest that the computation of certain graph structures from a (large) historical correlation matrix can be helpful in portfolio selection. In particular, a repeated finding is that information about the portfolio weights in the minimum variance portfolio (MVP) from classical Markowitz theory can be inferred from measurements of centrality in such graph structures. The present article compares the two concepts from a purely algebraic perspective. It is demonstrated that this heuristic relationship between graph centrality and the MVP is not an intrinsic mathematical one, or at least not a significantly strong one. This means that empirically found relations between the two concepts depend critically on the underlying historical data. Repeated empirical evidence for a strong relationship is hence shown to constitute a stylized fact of financial return time series, rather than the expected outcome of a heuristic similarity between the two approaches.
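The two objects being compared are both simple linear-algebra quantities. A minimal sketch, with an illustrative covariance matrix that is not from the article and eigenvector centrality as a stand-in for the various graph-based measures used empirically:

```python
import numpy as np

def mvp_weights(Sigma):
    """Classical Markowitz MVP weights: w = Sigma^{-1} 1 / (1' Sigma^{-1} 1)."""
    ones = np.ones(Sigma.shape[0])
    w = np.linalg.solve(Sigma, ones)
    return w / w.sum()

def eigenvector_centrality(C):
    """Centrality from the dominant eigenvector of the correlation matrix,
    normalized to sum to one -- a simple proxy for graph centrality."""
    vals, vecs = np.linalg.eigh(C)
    v = vecs[:, -1]
    v = v * np.sign(v.sum())
    return v / v.sum()

# illustrative 3-asset example (hypothetical numbers)
C = np.array([[1.0, 0.6, 0.3],
              [0.6, 1.0, 0.2],
              [0.3, 0.2, 1.0]])
vols = np.array([0.20, 0.25, 0.30])
Sigma = np.outer(vols, vols) * C       # covariance from vols and correlations

w = mvp_weights(Sigma)                 # low weight on central (highly correlated) assets
cent = eigenvector_centrality(C)       # high value on central assets
```

The empirical heuristic amounts to the claim that `w` is roughly a decreasing function of `cent`; the article's point is that no strong such relation follows from the algebra alone.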