Updating probabilities with information from only one hypothesis, thereby ignoring alternative hypotheses, is not only biased but also leads to progressively imprecise conclusions. In psychology this phenomenon has been studied experimentally with the "pseudodiagnosticity task". In probability logic the phenomenon that additional premises increase the imprecision of a conclusion is known as "degradation". The present contribution investigates degradation in the context of second order probability distributions. It uses beta distributions as marginals, together with copulas and C-vines, to represent dependence structures. It demonstrates that in Bayes' theorem the posterior distributions of the lower and upper probabilities approach 0 and 1, respectively, as more and more likelihoods belonging to only one hypothesis are included in the analysis.
Keywords: probability logic, degradation, Bayes' theorem, pseudodiagnosticity task, second order probability distributions

MSC: 03B48, 49N30, 62F15, 91E10