Harold Jeffreys, Bertha Swirles Jeffreys, Methods of Mathematical Physics (3rd ed.), Cambridge University Press, 1956 [accessed 2019-04-04]. No page numbers in the book.
Sumio Watanabe, Asymptotic Equivalence of Bayes Cross Validation and Widely Applicable Information Criterion in Singular Learning Theory, „arXiv:1004.2316 [cs]”, 14 April 2010, arXiv:1004.2316 [accessed 2019-04-04].
Arthur Hobson, Bin-Kang Cheng, A comparison of the Shannon and Kullback information measures, „Journal of Statistical Physics”, 7 (4), 1973, pp. 301–310, DOI: 10.1007/BF01014906, ISSN 0022-4715 [accessed 2019-04-04] (in English).
Nicolas Veyrat-Charvillon, François-Xavier Standaert, Mutual Information Analysis: How, When and Why?, Christophe Clavier, Kris Gaj (eds.), vol. 5747, Berlin, Heidelberg: Springer Berlin Heidelberg, 2009, pp. 429–443, DOI: 10.1007/978-3-642-04138-9_30, ISBN 978-3-642-04137-2 [accessed 2019-04-04].
Hamparsum Bozdogan, Model selection and Akaike’s Information Criterion (AIC): The general theory and its analytical extensions, „Psychometrika”, 52 (3), 1987, pp. 345–370, DOI: 10.1007/BF02294361, ISSN 0033-3123 [accessed 2019-04-04] (in English).
Hirotogu Akaike, Information Theory and an Extension of the Maximum Likelihood Principle, Emanuel Parzen, Kunio Tanabe, Genshiro Kitagawa (eds.), New York, NY: Springer New York, 1998, pp. 199–213, DOI: 10.1007/978-1-4612-1694-0_15, ISBN 978-1-4612-7248-9 [accessed 2019-04-04].
Scott I. Vrieze, Model selection and psychological theory: A discussion of the differences between the Akaike information criterion (AIC) and the Bayesian information criterion (BIC), „Psychological Methods”, 17 (2), 2012, pp. 228–243, DOI: 10.1037/a0027127, ISSN 1939-1463, PMID: 22309957, PMCID: PMC3366160 [accessed 2019-04-04] (in English).
Imre Csiszar, Why Least Squares and Maximum Entropy? An Axiomatic Approach to Inference for Linear Inverse Problems, „The Annals of Statistics”, 19 (4), 1991, pp. 2032–2066, ISSN 0090-5364, JSTOR: 2241918 [accessed 2019-04-04].
Christopher Michael Bishop, Pattern recognition and machine learning, New York: Springer, 2006, p. 55, ISBN 0-387-31073-8, OCLC 71008143 [accessed 2019-04-04].
Solomon Kullback, Information theory and statistics, Gloucester, Mass.: Peter Smith, 1959, pp. 6, 22, ISBN 0-8446-5625-9, OCLC 4140346 [accessed 2019-04-04].