On Tue, Oct 11, 2011 at 18:44, Henry Baker <hbaker1@pipeline.com> wrote:
Bayes' Theorem takes advantage of whatever a priori information you might have to refine that probability based on a new observation.
Suppose that you don't have very good information about the shape of the a priori probability curve, but may know some simple facts -- e.g., the mean, standard deviation, etc. You then do an experiment & refine the shape, but still don't have a very good idea about the precise shape.
I think that someone suggested a particular family of parameterized curves for probability distributions that is particularly well suited for Bayesian analysis/computations, in that the experiments modify the parameters of the curve, but keep the subsequent curve in the same family.
Does anyone know which family of curves might be useful in this regard and/or a link to some info ?
These distributions are called conjugate priors (when the prior and posterior distributions are in the same family). See http://en.wikipedia.org/wiki/Conjugate_prior

The Beta, Gamma, and Normal distributions all fit, depending on what parameters you are trying to model (Beta is limited to parameters in [0,1] and Gamma to [0,+\infty)).

Cheers,
Seb
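To make the "experiments modify the parameters but stay in the same family" idea concrete, here is a minimal sketch (my own illustration, not from the thread) of the simplest conjugate pair: a Beta prior with a Bernoulli/coin-flip likelihood. Observing k successes in n trials turns a Beta(a, b) prior into a Beta(a + k, b + n - k) posterior, so the update is just parameter arithmetic:

```python
# Conjugate-prior update sketch: Beta prior, Bernoulli likelihood.
# Beta(a, b) prior + (k successes, n - k failures) -> Beta(a + k, b + n - k).

def beta_bernoulli_update(a, b, observations):
    """Update a Beta(a, b) prior with a list of 0/1 observations."""
    k = sum(observations)          # number of successes
    n = len(observations)          # total trials
    return a + k, b + (n - k)      # posterior is again a Beta distribution

# Start from the uniform prior Beta(1, 1) and observe 7 heads, 3 tails:
a, b = beta_bernoulli_update(1, 1, [1] * 7 + [0] * 3)
print(a, b)                        # -> 8 4
# Posterior mean a / (a + b) = 8/12 = 2/3, nudged from 1/2 toward the data.
```

The Gamma prior plays the same role for a Poisson rate, and the Normal prior for a Normal mean; in each case the "experiment" only shifts the hyperparameters.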