Translation and Dictionary
Words near each other
・ Entropy (album)
・ Entropy (anonymous data store)
・ Entropy (arrow of time)
・ Entropy (astrophysics)
・ Entropy (Buffy the Vampire Slayer)
・ Entropy (classical thermodynamics)
・ Entropy (comics)
・ Entropy (computing)
・ Entropy (disambiguation)
・ Entropy (energy dispersal)
・ Entropy (film)
・ Entropy (Hip Hop Reconstruction from the Ground Up)
・ Entropy (information theory)
・ Entropy (journal)
・ Entropy (order and disorder)
・ Entropy (statistical thermodynamics)
・ Entropy (video game)
・ Entropy / Send Them
・ Entropy and life
・ Entropy compression
・ Entropy encoding
・ Entropy estimation
・ Entropy exchange
・ Entropy in thermodynamics and information theory
・ Entropy maximization
・ Entropy monitoring
・ Entropy of activation
・ Entropy of entanglement
・ Entropy of fusion
・ Entropy of mixing



Entropy (statistical thermodynamics) : Wikipedia English edition
Entropy (statistical thermodynamics)

In classical statistical mechanics, the entropy function earlier introduced by Rudolf Clausius is interpreted as statistical entropy using probability theory. The statistical entropy perspective was introduced in 1870 by the Austrian physicist Ludwig Boltzmann.
== Gibbs entropy formula ==
The macroscopic state of the system is defined by a distribution on the microstates that are accessible to the system in the course of its thermal fluctuations. Thus the entropy is defined over two different levels of description of the given system. At one of these levels, the entropy is given by the Gibbs entropy formula, named after J. Willard Gibbs. For a classical system (i.e., a collection of classical particles) with a discrete set of microstates, if E_i is the energy of microstate ''i'' and p_i is the probability that it occurs during the system's fluctuations, then the entropy of the system is
: S = -k_\text{B}\,\sum_i p_i \ln p_i
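As a concrete illustration, the formula can be evaluated numerically. The following is a minimal Python sketch, assuming the microstate probabilities are supplied as an array; the function name gibbs_entropy and the example distribution are illustrative, not from the article:
<syntaxhighlight lang="python">
import numpy as np

k_B = 1.380649e-23   # Boltzmann constant, J/K

def gibbs_entropy(p):
    """Gibbs entropy S = -k_B * sum_i p_i ln p_i of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    assert np.isclose(p.sum(), 1.0), "probabilities must sum to 1"
    p = p[p > 0]                     # convention: 0 * ln(0) = 0
    return -k_B * np.sum(p * np.log(p))

# A uniform distribution over W microstates recovers Boltzmann's S = k_B ln W:
W = 4
print(gibbs_entropy(np.full(W, 1.0 / W)))   # ~1.914e-23 J/K
print(k_B * np.log(W))                      # same value
</syntaxhighlight>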

=== Entropy changes for systems in a canonical state ===
A system with a well-defined temperature, i.e., one in thermal equilibrium with a thermal reservoir, has a probability of being in a microstate ''i'' given by Boltzmann's distribution.
Changes in the entropy caused by changes in the external constraints are then given by:
: dS = -k_\text{B}\,\sum_i dp_i \ln p_i
: \,\,\, = -k_\text{B}\,\sum_i dp_i \left(-E_i/(k_\text{B}T) - \ln Z\right)
: \,\,\, = \sum_i E_i\, dp_i / T
: \,\,\, = \sum_i \left( d(E_i\, p_i) - (dE_i)\, p_i \right) / T
where we have twice used the conservation of probability, \sum_i dp_i = 0.
Now, \sum_i d(E_i p_i) is the expectation value of the change in the total energy of the system.
If the changes are sufficiently slow, so that the system remains in the same microscopic state, but the state slowly (and reversibly) changes, then \sum_i (dE_i)\, p_i is the expectation value of the work done on the system through this reversible process, ''dw''<sub>rev</sub>.
But from the first law of thermodynamics, dE = \delta w + \delta q. Therefore,
: dS = \frac{\delta q_\text{rev}}{T}
In the thermodynamic limit, the fluctuation of the macroscopic quantities from their average values becomes negligible; so this reproduces the definition of entropy from classical thermodynamics, given above.
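The chain of equalities above can be checked numerically. The following Python sketch works in units where k_\text{B} = 1 and uses hypothetical energy levels: it perturbs the levels slightly at fixed temperature and compares the resulting entropy change against \delta q_\text{rev}/T:
<syntaxhighlight lang="python">
import numpy as np

k_B = 1.0   # units chosen so that k_B = 1

def boltzmann(E, T):
    """Canonical probabilities p_i = exp(-E_i / k_B T) / Z."""
    w = np.exp(-np.asarray(E) / (k_B * T))
    return w / w.sum()

def S(p):
    return -k_B * np.sum(p * np.log(p))

T = 2.0
E = np.array([0.0, 1.0, 3.0])            # hypothetical energy levels
dE = 1e-6 * np.array([0.1, -0.2, 0.3])   # small reversible change in the levels

p0, p1 = boltzmann(E, T), boltzmann(E + dE, T)

dS = S(p1) - S(p0)            # entropy change
dq = np.sum(E * (p1 - p0))    # heat: sum_i E_i dp_i (work is sum_i p_i dE_i)

print(dS, dq / T)             # the two agree to first order in dE
</syntaxhighlight>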

The quantity k_\text{B} is a physical constant known as Boltzmann's constant, which, like the entropy, has units of heat capacity. The logarithm is dimensionless.
This definition remains meaningful even when the system is far from equilibrium. Other definitions assume that the system is in thermal equilibrium, either as an isolated system or as a system exchanging quantities with its surroundings. The set of microstates (with their probability distribution) over which the sum is taken is called a statistical ensemble. Each type of statistical ensemble (micro-canonical, canonical, grand-canonical, etc.) describes a different configuration of the system's exchanges with the outside, ranging from a completely isolated system to a system that can exchange one or more quantities with a reservoir, such as energy, volume or molecules. In every ensemble, the equilibrium configuration of the system is dictated by the maximization of the entropy of the union of the system and its reservoir, according to the second law of thermodynamics (see the statistical mechanics article).
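As an illustration, for the canonical ensemble this maximization can be carried out explicitly with Lagrange multipliers (a standard derivation sketch, with \lambda and \beta as the multipliers): maximize S subject to \sum_i p_i = 1 and \sum_i p_i E_i = U,
: \mathcal{L} = -k_\text{B}\sum_i p_i \ln p_i - \lambda\Big(\sum_i p_i - 1\Big) - k_\text{B}\,\beta\Big(\sum_i p_i E_i - U\Big)
: \frac{\partial \mathcal{L}}{\partial p_i} = -k_\text{B}\left(\ln p_i + 1\right) - \lambda - k_\text{B}\,\beta E_i = 0 \quad\Rightarrow\quad p_i \propto e^{-\beta E_i}
Identifying \beta = 1/(k_\text{B}T) and normalizing by Z recovers the Boltzmann distribution used above.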
Neglecting correlations (or, more generally, statistical dependencies) between the states of individual particles leads to an incorrect probability distribution on the microstates and thence to an overestimate of the entropy.〔E. T. Jaynes, "Gibbs vs Boltzmann Entropies", American Journal of Physics 33, 391 (1965)〕 Such correlations occur in any system with nontrivially interacting particles, that is, in all systems more complex than an ideal gas.
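This overestimate is the subadditivity of the entropy: the entropy computed from the product of the marginal distributions is never smaller than the entropy of the true joint distribution. A small Python sketch, using a hypothetical pair of correlated two-state particles (dimensionless entropy; multiply by k_\text{B} for physical units):
<syntaxhighlight lang="python">
import numpy as np

def shannon(p):
    """Dimensionless entropy -sum p ln p of a distribution."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Joint distribution of two correlated two-state particles.
p_joint = np.array([[0.4, 0.1],
                    [0.1, 0.4]])

p_a = p_joint.sum(axis=1)   # marginal distribution of particle A
p_b = p_joint.sum(axis=0)   # marginal distribution of particle B

S_joint = shannon(p_joint)              # true entropy
S_indep = shannon(p_a) + shannon(p_b)   # entropy with correlations neglected

print(S_joint, S_indep)   # 1.193... <= 1.386...: the neglect overestimates S
</syntaxhighlight>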
This ''S'' is almost universally called simply the ''entropy''. It can also be called the ''statistical entropy'' or the ''thermodynamic entropy'' without changing the meaning. Note that the above expression of the statistical entropy is a discretized version of the Shannon entropy. The von Neumann entropy formula is an extension of the Gibbs entropy formula to the quantum mechanical case.
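To make the quantum extension concrete, here is a brief Python sketch of the von Neumann formula S = -k_\text{B}\,\mathrm{Tr}(\rho \ln \rho), in units where k_\text{B} = 1 and with a hypothetical qubit density matrix:
<syntaxhighlight lang="python">
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho), evaluated via the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]      # drop numerically zero eigenvalues
    return -np.sum(evals * np.log(evals))

# A hypothetical qubit density matrix with coherence between basis states.
rho = np.array([[0.7, 0.2],
                [0.2, 0.3]])

print(von_neumann_entropy(rho))
# For a diagonal rho the eigenvalues are the diagonal probabilities,
# and the formula reduces to the Gibbs/Shannon expression above.
</syntaxhighlight>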
It has been shown that the Gibbs entropy is equal to the classical "heat engine" entropy characterized by dS = \frac{\delta Q}{T}.

Excerpt source: Wikipedia, the free encyclopedia
Read the full article "Entropy (statistical thermodynamics)" on Wikipedia




Copyright(C) kotoba.ne.jp 1997-2016. All Rights Reserved.