Tuesday, March 10, 2015

Entropy


Tomasz Downarowicz (2007), Scholarpedia, 2(11):3901. doi:10.4249/scholarpedia.3901, revision #126991


Figure 1: In a naive analogy, energy in a physical system may be compared to water in lakes, rivers and the sea. Only the water that is above sea level can be used to do work (e.g. drive a turbine). Entropy represents the water contained in the sea.
  • In classical physics, the entropy of a physical system is proportional to the quantity of energy no longer available to do physical work. Entropy is central to the second law of thermodynamics, which states that in an isolated system any activity increases the entropy.
  • In probability theory, the entropy of a random variable measures the uncertainty about the value that might be assumed by the variable (see the sketch after this list).
  • In information theory, the compression entropy of a message (e.g. a computer file) quantifies the information content carried by the message in terms of the best lossless compression rate.
  • In sociology, entropy is the natural decay of structure (such as law, organization, and convention) in a social system.
  • In common usage, entropy means disorder or chaos.
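
To make the probability and information bullets above concrete, here is a minimal Python sketch of Shannon's entropy of a discrete distribution, \(H = -\sum_i p_i \log_2 p_i\). The function name is illustrative, not taken from any particular library.

    import math

    def shannon_entropy(probs, base=2):
        """Entropy -sum(p * log(p)) of a discrete distribution, in bits by default."""
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    # A fair coin is maximally uncertain (1 bit); a biased coin is less so.
    print(shannon_entropy([0.5, 0.5]))  # 1.0
    print(shannon_entropy([0.9, 0.1]))  # approximately 0.47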

History

The term entropy was coined in 1865 [Cl] by the German physicist Rudolf Clausius from Greek en- = in + trope = a turning (point). The word reveals an analogy to energy, and etymologists believe that it was designed to denote the form of energy into which any energy eventually and inevitably turns: useless heat. The idea was inspired by an earlier formulation by Sadi Carnot [Ca] of what is now known as the second law of thermodynamics.
The Austrian physicist Ludwig Boltzmann [B] and the American scientist Willard Gibbs [G] put entropy into the probabilistic setup of statistical mechanics (around 1875). This idea was later developed by Max Planck. Entropy was generalized to quantum mechanics in 1932 by John von Neumann [N]. Later this led to the invention of entropy as a term in probability theory by Claude Shannon [Sh] (1948), popularized in a joint book [SW] with Warren Weaver, that provided foundations for information theory.
The concept of entropy in dynamical systems was introduced by Andrei Kolmogorov [K] and made precise by Yakov Sinai [Si] in what is now known as the Kolmogorov-Sinai entropy.
The formulation of Maxwell's demon paradox by James C. Maxwell (around 1871) triggered a search for the physical meaning of information, which resulted in the finding by Rolf Landauer [L] (1961) of the heat equivalent of the erasure of one bit of information. This brought the notions of entropy in thermodynamics and information theory together.
The term entropy is now used in many other sciences (such as sociology), sometimes distant from physics or mathematics, where it no longer maintains its rigorous quantitative character. Usually, it roughly means disorder, chaos, decay of diversity or tendency toward uniform distribution of kinds.

Entropy in physics

 

Thermodynamical entropy - macroscopic approach

In thermodynamics, a physical system is a collection of objects (bodies) whose state is parametrized by several characteristics such as the distribution of density, pressure, temperature, velocity, chemical potential, etc. The change of entropy of a physical system when it passes from one state to another equals

\[ \Delta S = \int \frac{dQ}{T}, \]
where \(dQ\) denotes an element of heat absorbed (or emitted, in which case it carries a negative sign) by a body, \(T\) is the absolute temperature of that body at that moment, and the integration runs over all elements of heat active in the passage. The above formula allows one to compare the entropies of different states of a system, or to compute the entropy of each state up to a constant (which is satisfactory in most cases). The absolute value of entropy is established by the third law of thermodynamics.
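
As an illustration, when a body of constant heat capacity is heated, \(dQ = m\,c\,dT\) and the integral evaluates to \(\Delta S = m\,c\,\ln(T_2/T_1)\). The following Python sketch (with illustrative names and a textbook value of roughly 4186 J/(kg·K) for the specific heat of water) computes such an entropy change; it is a simplified model, not a general thermodynamics routine.

    import math

    def entropy_change(mass_kg, specific_heat, T_initial_K, T_final_K):
        """Delta S = integral of dQ/T with dQ = m*c*dT, i.e. m*c*ln(T_final/T_initial)."""
        return mass_kg * specific_heat * math.log(T_final_K / T_initial_K)

    # Heating 1 kg of water from 293 K to 353 K:
    print(entropy_change(1.0, 4186.0, 293.0, 353.0))  # approximately +780 J/K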
Notice that when an element \(dQ\) of heat is transmitted from a warmer body at temperature \(T_1\) to a cooler one at temperature \(T_2\), the entropy of the first body changes by \(-dQ/T_1\), while that of the other rises by \(dQ/T_2\). Since \(T_2 < T_1\), the absolute value of the latter fraction is larger, and jointly the entropy of the two-body system increases (while the global energy remains the same).
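A minimal numeric sketch of this two-body balance follows (Python, with illustrative names); it treats \(dQ\) as a small finite packet of heat, so that both temperatures can be taken as constant during the transfer.

    def two_body_entropy_change(dQ_J, T_hot_K, T_cold_K):
        """Net entropy change when heat dQ flows from a body at T_hot to one at T_cold."""
        dS_hot = -dQ_J / T_hot_K   # the warmer body loses entropy
        dS_cold = dQ_J / T_cold_K  # the cooler body gains more entropy than the warmer one lost
        return dS_hot + dS_cold    # positive whenever T_cold < T_hot

    # 100 J flowing from a body at 400 K to one at 300 K:
    print(two_body_entropy_change(100.0, 400.0, 300.0))  # approximately +0.083 J/K

The positive result illustrates the second law: the spontaneous flow of heat from hot to cold raises the total entropy of the pair.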
A system is isolated if it does not interact with its surroundings (i.e., is not influenced in any way). In particular, an isolated system does not exchange energy or matter (or even information) with its surroundings. In virtue of the first law of thermodynamics (the conservation of energy principle), an isolated system can pass only between states of the same global energy. The second law of thermodynamics introduces irreversibility of the evolution: an isolated system cannot pass from a state of higher entropy to a state of lower entropy. Equivalently, the second law says that it is impossible to perform a process whose only final effect is the transmission of heat from a cooler medium to a warmer one. Any such transmission must involve outside work; the elements participating in the work will also change their states and the overall entropy will rise.
The first and second laws of thermodynamics together imply that an isolated system will tend to the state of maximal entropy among all states of the same energy. This state is called the equilibrium state and reaching it is interpreted as the thermodynamical death of the system. The energy distributed in this state is incapable of any further activity.
See Example of calculating entropy and finding the equilibrium state.


To read more of this article, follow the link below:

http://www.scholarpedia.org/article/Entropy
