An alternative is the information entropy definition introduced in 1948 by Claude Shannon.[1] Archived 31 January 1998 at the Wayback Machine. It was intended for use in communication theory, but is applicable in all areas. It reduces to Boltzmann's expression when all the probabilities are equal, but can, of course, be used when they are not. Its virtue is that it yields immediate results without resorting to factorials or Stirling's approximation. Similar formulas are found, however, as far back as the work of Boltzmann, and explicitly in Gibbs (see reference).
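The relationship described above can be illustrated with a short sketch (an illustrative example, not part of the original source): Shannon's entropy H = −Σ pᵢ ln pᵢ computed directly over a probability distribution reduces to ln W when all W outcomes are equally likely, matching Boltzmann's S = k ln W up to the constant k, and it handles unequal probabilities with no factorials or Stirling's approximation.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * ln p), in nats (natural logarithm)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# For W equally likely microstates, H reduces to ln W,
# i.e. Boltzmann's expression S = k ln W, up to the factor k.
W = 8
uniform = [1.0 / W] * W
print(math.isclose(shannon_entropy(uniform), math.log(W)))  # True

# Unequal probabilities are handled directly -- no factorials
# or Stirling's approximation required.
biased = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(biased))  # less than ln 4, the uniform maximum
```

The uniform case always maximizes the entropy for a given number of outcomes, which is why the biased distribution yields a value strictly below ln 4.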
"Paul Ehrenfest (1880–1933) along with Nernst[,] Arrhenius, and Meitner must be considered among Boltzmann's most outstanding students."—Jäger, Gustav; Nabl, Josef; Meyer, Stephan (April 1999). "Three Assistants on Boltzmann". Synthese. 119 (1–2): 69–84. doi:10.1023/A:1005239104047.