Probability logic studies the properties that result from the probabilistic interpretation of logical argument forms. Typical examples are probabilistic Modus Ponens and Modus Tollens. Argument forms with two premises usually lead from precise probabilities of the premises to imprecise or interval probabilities of the conclusion. In this contribution, we study generalized inference forms having three or more premises. Recently, Gilio has shown that these generalized forms ``degrade'': more premises lead to more imprecise conclusions, i.e., to wider intervals. We distinguish different forms of degradation. We analyse Predictive Inference, Modus Ponens, Bayes' Theorem, and Modus Tollens. Special attention is devoted to the case where the conditioning events have zero probabilities. Finally, we discuss the relation of degradation to monotonicity.
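As a minimal illustration of the two-premise case, the standard coherence bounds for probabilistic Modus Ponens (a textbook result; the generalized many-premise forms and their degradation are the subject of the abstract above) can be stated as:

```latex
% Probabilistic Modus Ponens (two premises): precise premises,
% interval-valued conclusion.
\[
  P(B \mid A) = x, \quad P(A) = y
  \;\Longrightarrow\;
  P(B) \in [\, xy,\; xy + 1 - y \,].
\]
% For y = 1 the interval collapses to the point x; as y decreases,
% the interval widens.
```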
An important field of probability logic is the investigation of inference rules that propagate point probabilities or, more generally, interval probabilities from premises to conclusions. Conditional probability logic (CPL) interprets common-sense expressions of the form ``if \dots, then \dots'' by conditional probabilities, not by the probability of the material implication. An inference rule is \emph{probabilistically informative} if the coherent probability interval of its conclusion is not necessarily equal to the unit interval [0,1]. Not all logically valid inference rules are probabilistically informative, and vice versa. The relationship between logically valid and probabilistically informative inference rules is discussed and illustrated by examples such as {\sc modus ponens} and {\sc affirming the consequent}. We propose a method to evaluate the strength of CPL inference rules. Finally, an example of a proof is given that is based purely on CPL inference rules.
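The contrast between logical validity and probabilistic informativeness can be sketched numerically. The following helper functions (the names are ours, and the bounds are the standard coherence bounds, not the evaluation method proposed in the abstract) compute the coherent interval for the conclusion of {\sc modus ponens} and of {\sc affirming the consequent}; the latter is logically invalid yet probabilistically informative, since its upper bound can fall below 1.

```python
def modus_ponens_interval(x, y):
    """Coherent interval for P(B), given P(B|A) = x and P(A) = y.

    Lower bound: P(B) >= P(A and B) = x*y.
    Upper bound: P(B) <= P(A and B) + P(not A) = x*y + 1 - y.
    """
    return (x * y, x * y + 1.0 - y)


def affirming_consequent_interval(x, y):
    """Coherent interval for P(A), given P(B|A) = x and P(B) = y.

    The lower bound is 0 (P(A) = 0 is always coherent here); the
    upper bound follows from P(A and B) = x*P(A) <= y and
    P(A and not B) = (1-x)*P(A) <= 1-y.
    """
    upper = 1.0
    if x > 0:
        upper = min(upper, y / x)
    if x < 1:
        upper = min(upper, (1.0 - y) / (1.0 - x))
    return (0.0, upper)
```

For example, `modus_ponens_interval(0.9, 0.8)` yields roughly (0.72, 0.92): a tight, informative interval. `affirming_consequent_interval(0.9, 0.45)` yields roughly (0, 0.5), i.e., an informative conclusion from a logically invalid rule, while for other premise values the interval degenerates to [0, 1].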
Updating probabilities with information from only one hypothesis, thereby ignoring alternative hypotheses, is not only biased but also leads to progressively imprecise conclusions. In psychology this phenomenon has been studied in experiments with the ``pseudodiagnosticity task''. In probability logic, the phenomenon that additional premises increase the imprecision of a conclusion is known as ``degradation''. The present contribution investigates degradation in the context of second-order probability distributions. It uses beta distributions as marginals, and copulae together with C-vines to represent dependence structures. It demonstrates that in Bayes' theorem the posterior distributions of the lower and upper probabilities approach 0 and 1, respectively, as more and more likelihoods belonging to only one hypothesis are included in the analysis.
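The degradation effect in Bayes' theorem can be sketched in a deliberately simplified first-order setting (the abstract's actual construction, with second-order beta marginals and C-vine copulae, is not reproduced here). If only the likelihoods $P(D_i \mid H)$ are supplied and the likelihoods under the alternative hypothesis are left completely unconstrained in $(0,1]$, the coherent bounds on the posterior drift apart as more one-sided likelihoods are added:

```python
def posterior(prior, likes_h, likes_alt):
    """P(H | D_1..D_n) by Bayes' theorem, for two exhaustive hypotheses."""
    num = prior
    for a in likes_h:
        num *= a
    den_alt = 1.0 - prior
    for b in likes_alt:
        den_alt *= b
    return num / (num + den_alt)


def posterior_bounds(prior, likes_h):
    """Bounds on P(H | data) when the likelihoods under the alternative
    hypothesis are ignored, i.e., each is free to lie anywhere in (0, 1]."""
    n = len(likes_h)
    lower = posterior(prior, likes_h, [1.0] * n)  # adversarial: b_i -> 1
    upper = posterior(prior, likes_h, [0.0] * n)  # b_i -> 0 pushes it to 1
    return lower, upper
```

With a uniform prior and five likelihoods of 0.8 for the single hypothesis considered, the lower bound falls from about 0.44 (one likelihood) to about 0.25 (five likelihoods) while the upper bound stays at 1: the interval widens monotonically, mirroring the degradation described above.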