In classical statistical mechanics, the entropy function earlier introduced by Clausius is interpreted as statistical entropy using probability theory. The statistical entropy perspective was introduced in 1870 with the work of the Austrian physicist Ludwig Boltzmann.

== Gibbs Entropy Formula ==

The macroscopic state of the system is defined by a distribution on the microstates that are accessible to the system in the course of its thermal fluctuations, so an entropy is defined for two different levels of description of the given system. At one of these levels, the entropy is given by the Gibbs entropy formula, named after J. Willard Gibbs. For a classical system (i.e., a collection of classical particles) with a discrete set of microstates, if <math>E_i</math> is the energy of microstate ''i'' and <math>p_i</math> is the probability that it occurs during the system's fluctuations, then the entropy of the system is

: <math>S = -k_\text{B} \sum_i p_i \ln p_i .</math>

=== Entropy changes for systems in a canonical state ===

A system with a well-defined temperature, i.e., one in thermal equilibrium with a thermal reservoir, has a probability of being in a microstate ''i'' given by the Boltzmann distribution. Changes in the entropy caused by changes in the external constraints are then given by:

: <math>dS = -k_\text{B} \sum_i dp_i \ln p_i</math>
: <math>= -k_\text{B} \sum_i dp_i \left( -\frac{E_i}{k_\text{B} T} - \ln Z \right)</math>
: <math>= \frac{1}{T} \sum_i E_i \, dp_i</math>
: <math>= \frac{1}{T} \sum_i \left[ d(E_i p_i) - (dE_i)\, p_i \right]</math>

where we have twice used the conservation of probability, <math>\sum_i dp_i = 0</math>.

Now, <math>\sum_i d(E_i p_i)</math> is the expectation value of the change in the total energy of the system. If the changes are sufficiently slow, so that the system remains in the same microscopic state, but the state slowly (and reversibly) changes, then <math>\sum_i (dE_i)\, p_i</math> is the expectation value of the work done on the system through this reversible process, ''dw''<sub>rev</sub>. But from the first law of thermodynamics, <math>dE = \delta w + \delta q</math>. Therefore,

: <math>dS = \frac{\delta q_\text{rev}}{T} .</math>

In the thermodynamic limit, the fluctuation of the macroscopic quantities from their average values becomes negligible, so this reproduces the definition of entropy from classical thermodynamics, given above.

The quantity <math>k_\text{B}</math> is a physical constant known as the Boltzmann constant, which, like the entropy, has units of heat capacity. The logarithm is dimensionless.

This definition remains meaningful even when the system is far away from equilibrium. Other definitions assume that the system is in thermal equilibrium, either as an isolated system or as a system in exchange with its surroundings. The set of microstates (with probability distribution) over which the sum is taken is called a statistical ensemble. Each type of statistical ensemble (micro-canonical, canonical, grand-canonical, etc.) describes a different configuration of the system's exchanges with the outside, varying from a completely isolated system to a system that can exchange one or more quantities with a reservoir, such as energy, volume or molecules. In every ensemble, the equilibrium configuration of the system is dictated by the maximization of the entropy of the union of the system and its reservoir, according to the second law of thermodynamics (see the statistical mechanics article).

Neglecting correlations (or, more generally, statistical dependencies) between the states of individual particles will lead to an incorrect probability distribution on the microstates and thence to an overestimate of the entropy.〔E. T. Jaynes; Gibbs vs Boltzmann Entropies; American Journal of Physics, 33, 391 (1965)〕 Such correlations occur in any system with nontrivially interacting particles, that is, in all systems more complex than an ideal gas.
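As a concrete illustration of the Gibbs formula and of the Boltzmann distribution used above, the following minimal Python sketch builds <math>p_i = e^{-E_i/k_\text{B}T}/Z</math> for an assumed two-level system and evaluates <math>S = -k_\text{B}\sum_i p_i \ln p_i</math>; the energy gap and temperatures are purely illustrative values, not taken from the text.

<syntaxhighlight lang="python">
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_distribution(energies, temperature):
    """Canonical probabilities p_i = exp(-E_i / (k_B T)) / Z."""
    # Shift energies by their minimum for numerical stability; the shift cancels in Z.
    weights = np.exp(-(energies - energies.min()) / (k_B * temperature))
    return weights / weights.sum()

def gibbs_entropy(probabilities):
    """Gibbs entropy S = -k_B * sum_i p_i ln p_i (a term with p_i = 0 contributes nothing)."""
    p = probabilities[probabilities > 0]
    return -k_B * np.sum(p * np.log(p))

# Illustrative (assumed) two-level system with an energy gap of 1e-21 J.
energies = np.array([0.0, 1.0e-21])
for T in (10.0, 100.0, 1000.0):
    p = boltzmann_distribution(energies, T)
    print(f"T = {T:6.1f} K   p = {np.round(p, 4)}   S = {gibbs_entropy(p):.3e} J/K")
</syntaxhighlight>

At low temperature the system is almost certainly in the ground state and the entropy tends to zero; at high temperature both levels become equally likely and the entropy approaches <math>k_\text{B} \ln 2</math>.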
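The remark about correlations can also be checked directly: replacing a joint distribution over two subsystems with the product of its marginals (i.e., neglecting the statistical dependence) can only increase the Gibbs entropy. A minimal sketch, assuming an illustrative joint distribution for two correlated spins:

<syntaxhighlight lang="python">
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant in J/K

def gibbs_entropy(probabilities):
    """Gibbs entropy S = -k_B * sum_i p_i ln p_i over the nonzero probabilities."""
    p = probabilities[probabilities > 0]
    return -k_B * np.sum(p * np.log(p))

# Assumed joint distribution for two correlated spins (rows: spin A, columns: spin B).
p_joint = np.array([[0.4, 0.1],
                    [0.1, 0.4]])

# Marginal distributions obtained by summing out the other spin.
p_A = p_joint.sum(axis=1)
p_B = p_joint.sum(axis=0)

S_correlated = gibbs_entropy(p_joint.ravel())               # entropy of the true joint state
S_uncorrelated = gibbs_entropy(np.outer(p_A, p_B).ravel())  # correlations neglected

print(f"S with correlations    = {S_correlated:.3e} J/K")
print(f"S without correlations = {S_uncorrelated:.3e} J/K  (an overestimate)")
</syntaxhighlight>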
This ''S'' is almost universally called simply the ''entropy''. It can also be called the ''statistical entropy'' or the ''thermodynamic entropy'' without changing the meaning. Note that the above expression of the statistical entropy is a discretized version of the Shannon entropy. The von Neumann entropy formula is an extension of the Gibbs entropy formula to the quantum mechanical case.

It has been shown that the Gibbs entropy is equal to the classical "heat engine" entropy characterized by

: <math>dS = \frac{\delta Q}{T} .</math>
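As a small illustration of the quantum mechanical extension mentioned above, the following sketch evaluates the von Neumann entropy <math>S = -k_\text{B}\,\operatorname{Tr}(\rho \ln \rho)</math> from the eigenvalues of an assumed single-qubit density matrix; when <math>\rho</math> is diagonal in the energy eigenbasis this reduces to the Gibbs formula.

<syntaxhighlight lang="python">
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant in J/K

def von_neumann_entropy(rho):
    """S = -k_B Tr(rho ln rho), evaluated from the eigenvalues of the density matrix rho."""
    eigenvalues = np.linalg.eigvalsh(rho)             # rho is Hermitian, so eigenvalues are real
    eigenvalues = eigenvalues[eigenvalues > 1e-15]     # the limit of p ln p at p = 0 is 0
    return -k_B * np.sum(eigenvalues * np.log(eigenvalues))

# Assumed single-qubit density matrices, used only for illustration.
pure_state = np.array([[1.0, 0.0],
                       [0.0, 0.0]])   # a pure state has zero entropy
maximally_mixed = np.eye(2) / 2       # the maximally mixed state has entropy k_B ln 2

print(f"S(pure)  = {von_neumann_entropy(pure_state):.3e} J/K")
print(f"S(mixed) = {von_neumann_entropy(maximally_mixed):.3e} J/K")
</syntaxhighlight>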