Definitions
- Entropy is defined as a thermodynamic parameter representing
the state of disorder of a system at the atomic, ionic, or
molecular level.
- Entropy is a thermodynamic property which serves as a measure
of how close a system is to equilibrium.
- Entropy is a measure of disorder in a system; the higher the
entropy, the greater the disorder. In the context of entropy,
"perfect internal disorder" is synonymous with "equilibrium".
- Entropy is a measure of the unavailability of a system’s energy
to do work; thus, thermodynamic entropy is a measure of the
amount of energy in a physical system that cannot be used to do
work.
- Entropy is a measure of the dispersal of energy: how much
energy is spread out in a process, or how widely spread out it
becomes, at a specific temperature.
- Entropy is the capacity factor for thermal energy that is
hidden with respect to temperature.
- Entropy is a measure of disorder in the universe.
- Entropy is the tendency of a system, left to itself, to
descend into chaos.
According to the second law of thermodynamics the entropy of an
isolated system never decreases. An isolated system will
spontaneously evolve toward thermodynamic equilibrium, the
configuration with maximum entropy.
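The link between "maximum entropy" and equilibrium can be sketched with a toy model (the particle count and two-compartment setup here are assumptions for illustration): counting the microstates W of gas particles split between two halves of a box, with Boltzmann's formula S = k ln W turning the count into an entropy. The evenly spread configuration has the most microstates, hence the highest entropy.

```python
import math

# Toy model: N distinguishable gas particles split between two halves
# of a box. W(n) = C(N, n) counts the microstates with n particles on
# the left; Boltzmann's formula S = k ln W converts this to an entropy.
k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(N, n):
    """Entropy (J/K) of the macrostate with n of N particles on the left."""
    return k_B * math.log(math.comb(N, n))

N = 100
S = [entropy(N, n) for n in range(N + 1)]

# The even 50/50 split has the most microstates, hence maximum entropy:
assert max(range(N + 1), key=lambda n: S[n]) == N // 2
```

Equilibrium, in this picture, is simply the macrostate realized by the overwhelming majority of microstates.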
Systems that are not isolated may decrease in entropy, provided
they increase the entropy of their environment by at least that
same amount.
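A minimal numerical sketch of this bookkeeping, with assumed masses and temperatures: water freezing in a cold room loses entropy, but the latent heat it releases raises the room's entropy by more, so the total still increases.

```python
# Sketch with assumed values: water freezing in a -10 C room. The
# water's entropy drops, but the heat released raises the room's
# entropy by at least that amount.
L_fus = 334e3      # latent heat of fusion of water, J/kg (standard value)
m = 1.0            # kg of water freezing
T_water = 273.15   # K, freezing point: the water gives up heat here
T_room = 263.15    # K (-10 C), the colder surroundings absorb it

q = L_fus * m                 # heat released by the water
dS_water = -q / T_water       # entropy decrease of the system
dS_room = +q / T_room         # entropy increase of the environment
dS_total = dS_water + dS_room

assert dS_water < 0 < dS_total   # local decrease, global increase
```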
Since entropy is a state function, the change in the entropy of a
system is the same for any process that goes from a given initial
state to a given final state, whether the process is reversible
or irreversible.
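The state-function property can be checked numerically (the gas, amounts, and temperatures below are assumptions for illustration): taking an ideal gas between the same two states by two different reversible routes, summing the entropy change of each step, gives the same total either way.

```python
import math

# Entropy is a state function: two different reversible paths between
# the same ideal-gas states give the same total entropy change.
R = 8.314               # gas constant, J/(mol K)
n, Cv = 1.0, 1.5 * R    # one mole of a monatomic ideal gas (assumed)
T1, T2 = 300.0, 600.0   # K, initial and final temperatures
V1, V2 = 0.010, 0.020   # m^3, initial and final volumes

# Path A: heat at constant volume, then expand isothermally at T2.
dS_A = n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)
# Path B: expand isothermally at T1, then heat at constant volume.
dS_B = n * R * math.log(V2 / V1) + n * Cv * math.log(T2 / T1)

assert math.isclose(dS_A, dS_B)   # same endpoints, same entropy change
```

The same total would hold even for an irreversible route between these states; the change is computed along any convenient reversible path.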
Irreversibility
The idea of "irreversibility" is central to the understanding of
entropy. Most people have an intuitive understanding of
irreversibility (a dissipative process): if one watches a movie
of everyday life running forward and in reverse, it is easy to
distinguish between the two. The movie running in reverse shows
impossible things happening: water jumping out of a glass into a
pitcher above it, smoke going down a chimney, water "unmelting"
to form ice in a warm room, crashed cars reassembling themselves,
and so on.
The intuitive meaning of expressions such as "you can't
unscramble an egg", "don't cry over spilled milk" or "you can't
take the cream out of the coffee" is that these are irreversible
processes. There is a direction in time by which spilled milk
does not go back into the glass (see: The arrow of
time).
In thermodynamics, one says that the "forward" processes –
pouring water from a pitcher, smoke going up a chimney, etc. – are
"irreversible": they cannot happen in reverse, even though, on a
microscopic level, no laws of physics would be violated if they
did. This reflects the time-asymmetry of entropy.
All real physical processes involving systems in everyday life,
with many atoms or molecules, are irreversible. For an
irreversible process in an isolated system, the thermodynamic
state variable known as entropy is always
increasing.
The reason the movie in reverse is so easily recognized is that
it shows processes for which entropy is decreasing, which is
physically impossible.
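This can be made quantitative with assumed temperatures: heat flowing from a hot body to a cold one raises total entropy, while the "movie in reverse" (heat flowing from cold to hot unaided) would lower it, which the second law forbids.

```python
# Sketch with assumed reservoir temperatures: the entropy change of
# heat flow, run forward and in reverse.
T_hot, T_cold = 370.0, 290.0   # K
q = 1000.0                     # J transferred

dS_forward = -q / T_hot + q / T_cold   # hot body loses q, cold gains q
dS_reverse = +q / T_hot - q / T_cold   # the reversed movie

assert dS_forward > 0 > dS_reverse   # only the forward direction is allowed
```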
Entropy as energy dispersal
Entropy can also be described in terms of "energy dispersal" and
the "spreading of energy", while avoiding all mention of
"disorder", "randomness" and "chaos". In this approach, the
second law of thermodynamics is introduced as: "Energy
spontaneously disperses from being localized to becoming spread
out if it is not hindered from doing so."
This explanation can be used in the context of common experiences
such as a rock falling, a hot frying pan cooling down, iron
rusting, air leaving a punctured tyre and ice melting in a warm
room. Entropy is then depicted as a sophisticated kind of "before
and after" yardstick: measuring how much energy is spread out
over time as a result of a process such as heating a system, or
how widely spread out the energy is after something happens in
comparison with its previous state, in a process such as gas
expansion or fluids mixing (at a constant temperature).
The equations are explored with reference to the common
experiences, with emphasis that in chemistry the energy that
entropy measures as dispersing is the internal energy of
molecules.
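One such equation, applied here to the constant-temperature mixing case with assumed amounts: the entropy of mixing two ideal gases, dS = -nR * sum(x_i ln x_i), which is positive for any mixture.

```python
import math

# Dispersal at constant temperature: entropy of mixing two ideal
# gases, dS = -n R (x_A ln x_A + x_B ln x_B). Amounts are assumed.
R = 8.314                    # gas constant, J/(mol K)
n_A, n_B = 1.0, 1.0          # moles of each gas
n = n_A + n_B
x_A, x_B = n_A / n, n_B / n  # mole fractions

dS_mix = -n * R * (x_A * math.log(x_A) + x_B * math.log(x_B))

assert dS_mix > 0                                 # mixing spreads energy out
assert math.isclose(dS_mix, 2 * R * math.log(2))  # 50/50 case: 2 R ln 2
```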
Chemical reactions
The second law of thermodynamics states that the entropy of an
isolated system may increase or remain constant. Chemical
reactions cause changes in entropy and entropy plays an important
role in determining in which direction a chemical reaction
spontaneously proceeds.
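Concretely, at constant temperature and pressure a process is spontaneous when the Gibbs free energy change dG = dH - T dS is negative, so entropy helps set the direction. A small sketch using approximate standard values for ice melting:

```python
# Spontaneity via Gibbs free energy: dG = dH - T*dS < 0 means the
# process proceeds. Approximate standard values for ice -> water:
dH = 6010.0   # J/mol, enthalpy of fusion
dS = 22.0     # J/(mol K), entropy of fusion

def dG(T):
    """Gibbs free energy change of melting at temperature T (K)."""
    return dH - T * dS

assert dG(263.15) > 0   # below 0 C: melting is not spontaneous
assert dG(283.15) < 0   # above 0 C: melting is spontaneous
```

The crossover temperature dH/dS comes out near 273 K, the melting point, as it should.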
Nowadays, many biologists use the term 'entropy of an organism',
or its antonym 'negentropy', as a measure of the structural order
within an organism.
Historical frame
The term entropy was coined in 1865 by the German physicist
Rudolf Clausius, who stated: “The entropy of the universe
tends to a maximum.”
Calculation
Unlike many other functions of state, entropy cannot be directly
observed but must be calculated. For a substance, it can be
calculated as the standard molar entropy, integrated from
absolute zero (also known as absolute entropy).
Entropy has the dimension of energy divided by temperature, which
has a unit of joules per kelvin (J/K) in the International System
of Units.
While these are the same units as heat capacity, the two concepts
are distinct. Entropy is not a conserved quantity: for example,
in an isolated system with non-uniform temperature, heat might
irreversibly flow and the temperature become more uniform such
that entropy increases.
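The equalizing-temperatures example can be made quantitative with assumed numbers: two identical blocks at different temperatures equilibrate; energy is conserved, yet the total entropy, in J/K, increases.

```python
import math

# Assumed setup: two identical blocks of water at different
# temperatures reach a common temperature. Per block,
# dS = integral of (m c / T) dT = m c ln(T_final / T_initial).
m, c = 1.0, 4186.0       # kg and specific heat of water, J/(kg K)
T1, T2 = 350.0, 290.0    # K, initial temperatures
Tf = (T1 + T2) / 2       # final common temperature (equal masses)

dS = m * c * (math.log(Tf / T1) + math.log(Tf / T2))

assert dS > 0   # equalizing temperatures always raises total entropy
```

The hot block's entropy loss is outweighed by the cold block's gain, because the same heat counts for more at the lower temperature.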
The arrow of time
Entropy is the only quantity in the physical sciences that seems
to imply a particular direction of progress, sometimes called an
arrow of time. As time progresses, the second law of
thermodynamics states that the entropy of an isolated system
never decreases. Hence, from this perspective, entropy
measurement is thought of as a kind of clock: an isolated system
has low entropy in the past and high entropy in the future.
The second law of thermodynamics allows the entropy to remain
the same regardless of the direction of time. If the entropy is
constant in either direction of time, there would be no preferred
direction. However, the entropy can only be a constant if the
system is in the highest possible state of disorder, such as a
gas that always was (and always will be) uniformly spread out in
its container.
The existence of a thermodynamic arrow of time implies that the
system is highly ordered (i.e. low entropy) in
one time direction only, which would by definition be the "past".
Thus this law is about the boundary conditions rather than the
equations of motion of our world.
Linguistic derivation
The term entropy is derived from the Ancient Greek word
entropía (ἐντροπία) meaning “a turning
towards”.
This is a combination of the prefix en- (ἐν) meaning
"in", and the word tropḗ (τροπή) meaning "a turning", in
analogy with energy.