Dywergencja Kullbacka-Leiblera (Polish Wikipedia)

Analysis of the information sources cited in the references of the Polish-language Wikipedia article "Dywergencja Kullbacka-Leiblera" (Kullback–Leibler divergence).

Website             Global rank      Polish rank
archive.org         5th place        2nd place
arxiv.org           2nd place        6th place
doi.org             low place        3,879th place
elsevier.com        low place        low place
icm.edu.pl          26th place       172nd place
inference.org.uk    3,707th place    4,048th place
jstor.org           6th place        33rd place
nih.gov             2,656th place    65th place
projecteuclid.org   610th place      682nd place
ptm.org.pl          4th place        7th place
worldcat.org        69th place       224th place

archive.org

  • Harold Jeffreys, Bertha Swirles Jeffreys, Methods of Mathematical Physics (3rd ed.), Cambridge University Press, 1956 [accessed 2019-04-04].

arxiv.org

  • Sumio Watanabe, Asymptotic Equivalence of Bayes Cross Validation and Widely Applicable Information Criterion in Singular Learning Theory, „arXiv:1004.2316 [cs]”, 14 April 2010, arXiv:1004.2316 [accessed 2019-04-04].

doi.org

dx.doi.org

elsevier.com

linkinghub.elsevier.com

icm.edu.pl

yadda.icm.edu.pl

inference.org.uk

jstor.org

  • Imre Csiszar, Why Least Squares and Maximum Entropy? An Axiomatic Approach to Inference for Linear Inverse Problems, „The Annals of Statistics”, 19 (4), 1991, pp. 2032–2066, ISSN 0090-5364, JSTOR 2241918 [accessed 2019-04-04].

nih.gov

ncbi.nlm.nih.gov

  • Scott I. Vrieze, Model selection and psychological theory: A discussion of the differences between the Akaike information criterion (AIC) and the Bayesian information criterion (BIC), „Psychological Methods”, 17 (2), 2012, pp. 228–243, DOI: 10.1037/a0027127, ISSN 1939-1463, PMID 22309957, PMCID PMC3366160 [accessed 2019-04-04] (in English).

projecteuclid.org

ptm.org.pl

wydawnictwa.ptm.org.pl

worldcat.org

  • David John Cameron MacKay, Information theory, inference, and learning algorithms, Cambridge, UK: Cambridge University Press, 2003, p. 34, ISBN 0-521-64298-1, OCLC 52377690 [accessed 2019-04-04].
  • Christopher Michael Bishop, Pattern recognition and machine learning, New York: Springer, 2006, p. 55, ISBN 0-387-31073-8, OCLC 71008143 [accessed 2019-04-04].
  • Arthur Hobson, Bin-Kang Cheng, A comparison of the Shannon and Kullback information measures, „Journal of Statistical Physics”, 7 (4), 1973, pp. 301–310, DOI: 10.1007/BF01014906, ISSN 0022-4715 [accessed 2019-04-04] (in English).
  • Imre Csiszar, Why Least Squares and Maximum Entropy? An Axiomatic Approach to Inference for Linear Inverse Problems, „The Annals of Statistics”, 19 (4), 1991, pp. 2032–2066, ISSN 0090-5364, JSTOR 2241918 [accessed 2019-04-04].
  • S. Kullback, R.A. Leibler, On Information and Sufficiency, „The Annals of Mathematical Statistics”, 22 (1), 1951, pp. 79–86, DOI: 10.1214/aoms/1177729694, ISSN 0003-4851 [accessed 2019-04-04] (in English).
  • Solomon Kullback, Information theory and statistics, Gloucester, Mass.: Peter Smith, 1959, pp. 6, 22, ISBN 0-8446-5625-9, OCLC 4140346 [accessed 2019-04-04].
  • Letters to the Editor, „The American Statistician”, 41 (4), 1987, pp. 338–341, DOI: 10.1080/00031305.1987.10475510, ISSN 0003-1305 [accessed 2019-04-04] (in English).
  • A. Cichocki, S. Amari, Information geometry of divergence functions, „Bulletin of the Polish Academy of Sciences. Technical Sciences”, 58 (no. 1), 2010, pp. 183–195, ISSN 0239-7528 [accessed 2019-04-04] (in English).
  • Hamparsum Bozdogan, Model selection and Akaike’s Information Criterion (AIC): The general theory and its analytical extensions, „Psychometrika”, 52 (3), 1987, pp. 345–370, DOI: 10.1007/BF02294361, ISSN 0033-3123 [accessed 2019-04-04] (in English).
  • Scott I. Vrieze, Model selection and psychological theory: A discussion of the differences between the Akaike information criterion (AIC) and the Bayesian information criterion (BIC), „Psychological Methods”, 17 (2), 2012, pp. 228–243, DOI: 10.1037/a0027127, ISSN 1939-1463, PMID 22309957, PMCID PMC3366160 [accessed 2019-04-04] (in English).