We investigate under which conditions a single simulation of joint default times at a final time horizon can be decomposed into a set of simulations of joint defaults on subsequent adjacent sub-periods leading to that final horizon. Besides its theoretical interest, this is also a practical problem, as part of the industry has been working under the misleading assumption that the two approaches are equivalent for practical purposes. As a reasonable trade-off between realistic stylized facts, practical demands, and mathematical tractability, we propose models leading to a Markovian multi-variate survival-indicator process, and we investigate two instances of static models for the vector of default times from the statistical literature that fall into this class. On the one hand, the "looping default" case is known to possess this property, and we point out that it coincides with the classical "Freund distribution" in the bivariate case. On the other hand, if all sub-vectors of the survival-indicator process are Markovian, this constitutes a new characterization of the Marshall-Olkin distribution, and hence of multi-variate lack of memory. A paramount property of the resulting model is stability of the type of multi-variate distribution with respect to the elimination of a marginal component or the insertion of a new one with marginal distribution from the same family. The practical implications of this "nested margining" property are significant. To implement this distribution, we present an efficient and unbiased simulation algorithm based on the Lévy-frailty construction. We highlight different pitfalls in the simulation of dependent default times and examine, within a numerical case study, the effects of inadequate simulation practices.
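To make the Marshall-Olkin law mentioned above concrete, the following is a minimal sketch (not the paper's Lévy-frailty algorithm) of its classical bivariate shock construction: two default times are each taken as the minimum of an idiosyncratic exponential shock and a common exponential shock, which gives the joint survival function exp(-lam1*x - lam2*y - lam12*max(x, y)) and, in particular, a positive probability of simultaneous default. The function name and parameters are illustrative, not from the paper.

```python
import random


def simulate_mo_bivariate(lam1, lam2, lam12, rng=random):
    """Sample one pair of default times from the bivariate
    Marshall-Olkin distribution via its exponential shock
    construction: tau_i = min(E_i, E_12), where E_1, E_2 and
    E_12 are independent exponentials with rates lam1, lam2
    and lam12 (the common shock)."""
    e1 = rng.expovariate(lam1)    # idiosyncratic shock for component 1
    e2 = rng.expovariate(lam2)    # idiosyncratic shock for component 2
    e12 = rng.expovariate(lam12)  # common shock hitting both components
    return min(e1, e12), min(e2, e12)
```

With lam1 = lam2 = lam12 = 1, the common shock arrives first with probability lam12/(lam1 + lam2 + lam12) = 1/3, so roughly a third of the simulated pairs default simultaneously; this singular component is exactly what distinguishes the Marshall-Olkin law from absolutely continuous alternatives.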