Default reasoning about probabilities is the assignment of subjective probabilities to an uncertain event on the basis of partial information about that event and statistical knowledge about the domain of discourse. In this thesis a logic is developed for representing both statistical and subjective probabilities. The question of how exactly statistical knowledge should influence the formation of degrees of belief is explored, and strong arguments are presented that justify cross-entropy minimization as the appropriate rule of inference. The minimum cross-entropy principle is implemented in a preferred-model semantics for the representation language. When probabilities are allowed to take values in real closed fields equipped with a logarithmic function, a complete proof system for the resulting logic is obtained.
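To give a concrete feel for cross-entropy minimization as a rule of inference, the following is a minimal sketch (not the formalism developed in the thesis): a prior belief over a finite set of outcomes is updated to the distribution that satisfies a statistical constraint on the expected value while minimizing KL divergence to the prior. The function name `min_cross_entropy` and the example numbers are purely illustrative; the solution's exponential form and the bisection on the Lagrange multiplier are standard.

```python
import math

def min_cross_entropy(prior, values, target, lo=-50.0, hi=50.0, tol=1e-12):
    """Among distributions q satisfying sum_i q[i]*values[i] == target,
    return the one minimizing KL divergence to `prior`.

    The minimizer has the exponential form q_i proportional to
    prior_i * exp(l * values[i]); we find l by bisection, since the
    induced expectation is monotonically increasing in l."""
    def expectation(l):
        w = [p * math.exp(l * v) for p, v in zip(prior, values)]
        z = sum(w)
        return sum(wi * v for wi, v in zip(w, values)) / z

    while hi - lo > tol:
        mid = (lo + hi) / 2
        if expectation(mid) < target:
            lo = mid
        else:
            hi = mid
    l = (lo + hi) / 2
    w = [p * math.exp(l * v) for p, v in zip(prior, values)]
    z = sum(w)
    return [wi / z for wi in w]

# Illustrative example: prior is uniform over outcomes {1, 2, 3}
# (mean 2.0), but statistical knowledge says the mean is 2.5.
# The update shifts mass toward the larger outcomes.
q = min_cross_entropy([1/3, 1/3, 1/3], [1.0, 2.0, 3.0], 2.5)
```

The updated distribution `q` remains as close as possible (in the KL sense) to the uniform prior while matching the observed mean, which is the qualitative behavior the thesis argues degrees of belief should exhibit.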