Gibbs algorithm
In statistical mechanics, the Gibbs algorithm, first introduced by J. Willard Gibbs in 1878, is the injunction to choose a statistical ensemble (probability distribution) for the unknown microscopic state of a thermodynamic system by minimising the average log probability, subject to the probability distribution satisfying a set of constraints (usually expectation values) corresponding to the known macroscopic quantities. Physicists call the result of applying the Gibbs algorithm the Gibbs distribution for the given constraints, most notably Gibbs's grand canonical ensemble for open systems, where the average energy and the average number of particles are given. (See also partition function.)

In the light of Claude Shannon's information theory, E. T. Jaynes re-interpreted the Gibbs algorithm in 1957 as a much more general, more widely applicable inference technique, leading to the principle of maximum entropy and the MaxEnt view of thermodynamics. The general result of applying the Gibbs algorithm is a maximum entropy probability distribution. Statisticians identify such distributions as belonging to exponential families.

Not to be confused with
The Gibbs sampler, an update algorithm used in Markov chain Monte Carlo iterations, a special case of the Metropolis–Hastings algorithm.

Categories: Statistical mechanics | Particle statistics | Entropy and information
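The constrained optimisation that the Gibbs algorithm prescribes can be carried out numerically. The sketch below (plain Python; the energy levels and target mean are hypothetical, chosen only for illustration) maximises the Shannon entropy −Σᵢ pᵢ ln pᵢ over discrete states subject to a fixed average energy. The Lagrange-multiplier solution is the familiar Gibbs/Boltzmann form pᵢ = e^(−βEᵢ)/Z, so the problem reduces to solving for the multiplier β, done here by bisection.

```python
import math

def maxent_distribution(energies, target_mean, tol=1e-10):
    """Maximum-entropy distribution over discrete energy levels with a
    fixed mean energy.  Returns (probabilities, beta), where the
    probabilities have the Gibbs form exp(-beta*E_i)/Z and beta is the
    Lagrange multiplier enforcing the mean-energy constraint."""
    def mean_energy(beta):
        weights = [math.exp(-beta * e) for e in energies]
        z = sum(weights)                      # partition function Z
        return sum(w * e for w, e in zip(weights, energies)) / z

    # Bracket for beta; assumed wide enough for the example at hand.
    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        # The mean energy decreases monotonically as beta increases,
        # so bisect: too-high mean means beta must grow.
        if mean_energy(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)
    return [w / z for w in weights], beta

# Hypothetical three-level system with mean energy constrained to 0.5.
probs, beta = maxent_distribution([0.0, 1.0, 2.0], target_mean=0.5)
```

With the mean constrained below the unconstrained average (which would be 1.0 at β = 0), the solver returns β > 0 and probabilities that decay with energy, i.e. a canonical-ensemble-like distribution. Adding further expectation-value constraints would add one multiplier per constraint, which is exactly how the grand canonical ensemble arises when the average particle number is also fixed.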
This article is licensed under the GNU Free Documentation License. It uses material from the Wikipedia article "Gibbs_algorithm". A list of authors is available in Wikipedia.