A current market practice for incorporating multivariate defaults into global risk-factor simulations is to iterate (multiplicative) i.i.d. survival-indicator increments along a given time grid, where the indicator distribution is based on a copula ansatz. The underlying assumption is that the resulting iterated default distribution behaves similarly to the one-shot distribution. We show that in most cases this assumption is not fulfilled, and we present numerical analysis exhibiting sizable differences in the probabilities assigned to both “survival-of-all” and “mixed default/survival” events. In addition, we survey those copula families for which the aforementioned methodology does work.
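The discrepancy described above can be illustrated with a minimal numerical sketch. The following is not taken from the paper; it assumes a bivariate setting with hypothetical parameters (12 time steps, per-step marginal survival probability 0.99) and uses two standard copulas as stand-ins: Clayton, for which iterating per-step indicators and the one-shot distribution disagree, and Gumbel, an extreme-value copula for which the two coincide.

```python
# Illustrative sketch (assumptions, not the paper's model): compare the
# joint "survival-of-all" probability under (a) iterated i.i.d.
# survival-indicator increments and (b) the one-shot distribution.
import math


def clayton(u, v, theta=2.0):
    """Bivariate Clayton copula C(u, v) with hypothetical theta."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)


def gumbel(u, v, theta=2.0):
    """Bivariate Gumbel (extreme-value) copula C(u, v)."""
    s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-(s ** (1.0 / theta)))


n_steps = 12                  # hypothetical time grid
s_step = 0.99                 # hypothetical per-step marginal survival probability
s_total = s_step ** n_steps   # marginal survival over the full horizon

for name, C in [("Clayton", clayton), ("Gumbel", gumbel)]:
    # (a) iterate: both names survive every step, steps are i.i.d.
    p_iterated = C(s_step, s_step) ** n_steps
    # (b) one-shot: joint survival over the full horizon in a single draw
    p_one_shot = C(s_total, s_total)
    print(f"{name}: iterated={p_iterated:.4f}  one-shot={p_one_shot:.4f}")
```

Under these assumptions, the two Clayton probabilities differ visibly, while the Gumbel copula satisfies C(s, s)^n = C(s^n, s^n) exactly, a stability property characteristic of the extreme-value families for which the iteration scheme is consistent.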