
Bayesian Information Criterion (BIC)

\ ˈbeɪziən \ ˌɪnfərˈmeɪʃən \ kraɪˈtɪriən \ (bɪk) \

A model selection criterion used to compare the goodness of fit of statistical models, particularly in contexts like longevity modelling. BIC balances model fit against complexity by combining the log-likelihood of the model (which measures how well the model explains the observed data) with a penalty term that increases with the number of parameters. This penalty discourages over-fitting by favouring simpler models where possible.
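
In its usual form (stated here for reference) the criterion is

BIC = k ln(n) − 2 ln(L̂)

where k is the number of fitted parameters, n is the number of observations and L̂ is the maximised value of the likelihood. Lower values of BIC indicate a preferable trade-off between fit and complexity.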

The BIC is similar to the Akaike Information Criterion (AIC), which also captures goodness of fit via the log-likelihood (i.e. a measure of how likely the observed data are under the fitted model), but the two apply different penalties for the number of parameters used (i.e. to avoid “over-fitting” to the data): the BIC penalty grows with the sample size as well as the parameter count, so it tends to favour simpler models than the AIC.
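
As an illustration, the minimal Python sketch below compares the two criteria for two candidate models fitted to the same data. The Gaussian example and the helper functions are purely illustrative assumptions, not part of any particular longevity model or library.

```python
import numpy as np

def bic(log_likelihood: float, k: int, n: int) -> float:
    """BIC = k*ln(n) - 2*ln(L); lower values indicate a better fit/complexity trade-off."""
    return k * np.log(n) - 2.0 * log_likelihood

def aic(log_likelihood: float, k: int) -> float:
    """AIC = 2*k - 2*ln(L); penalises extra parameters less heavily than BIC for large n."""
    return 2.0 * k - 2.0 * log_likelihood

# Simulated data standing in for observed values (e.g. ages at death).
rng = np.random.default_rng(0)
x = rng.normal(loc=70.0, scale=5.0, size=200)
n = x.size

# Model 1: normal with fixed variance 1, one fitted parameter (the mean).
mu1 = x.mean()
ll1 = np.sum(-0.5 * np.log(2 * np.pi) - 0.5 * (x - mu1) ** 2)

# Model 2: normal with mean and variance both fitted (two parameters).
mu2, var2 = x.mean(), x.var()
ll2 = np.sum(-0.5 * np.log(2 * np.pi * var2) - 0.5 * (x - mu2) ** 2 / var2)

print("Model 1: AIC =", aic(ll1, 1), " BIC =", bic(ll1, 1, n))
print("Model 2: AIC =", aic(ll2, 2), " BIC =", bic(ll2, 2, n))
```

In this toy example both criteria prefer Model 2, because the extra variance parameter improves the likelihood by far more than the penalty it incurs; when the improvement in fit is marginal, the criteria (BIC in particular) will instead favour the simpler model.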
