entropy



entropy

 [en´trŏ-pe]
1. in thermodynamics, a measure of the part of the internal energy of a system that is unavailable to do work. In any spontaneous process, such as the flow of heat from a hot region to a cold region, entropy always increases.
2. the tendency of a system to move toward randomness.
3. in information theory, the negative of information; a measure of the disorder or randomness in a physical system. The theory of statistical mechanics shows that this concept is equivalent to entropy as defined in thermodynamics (see the computational sketch after these definitions).
4. diminished capacity for spontaneous change, as occurs in the psyche in aging.
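To make sense 3 concrete, here is a minimal Python sketch of Shannon entropy; the function name and the example distributions are illustrative assumptions, not part of any dictionary entry above.

    import math

    def shannon_entropy(probs):
        """Shannon entropy H(p) = -sum(p_i * log2(p_i)), in bits.
        Terms with p_i == 0 contribute nothing (p * log p -> 0)."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin is maximally uncertain: H = 1 bit.
    print(shannon_entropy([0.5, 0.5]))   # 1.0
    # A biased coin is more predictable, so its entropy is lower.
    print(shannon_entropy([0.9, 0.1]))   # ~0.469
    # A certain outcome carries no information: H = 0.
    print(shannon_entropy([1.0]))        # 0.0

Maximum entropy corresponds to maximum uncertainty (disorder), which is why greater entropy means less information.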

en·tro·py (S),

(en'trŏ-pē),
That fraction of heat (energy) content not available for the performance of work, usually because (in a chemical reaction) it has been used to increase the random motion of the atoms or molecules in the system; thus, entropy is a measure of randomness or disorder. Entropy occurs in the Gibbs free energy (G) equation: ΔG = ΔH − TΔS (ΔH, change in enthalpy or heat content; T, absolute temperature; ΔS, change in entropy; ΔG, change in Gibbs free energy). A worked numeric example follows this entry.
See also: second law of thermodynamics.
[G. entropia, a turning toward]
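As a worked illustration of the ΔG = ΔH − TΔS relation above, this sketch evaluates the sign of ΔG for a hypothetical endothermic, entropy-driven process; the numbers are invented for illustration.

    def gibbs_free_energy_change(delta_h_kj, delta_s_kj_per_k, temp_k):
        """Return dG = dH - T*dS in kJ (dH in kJ, dS in kJ/K, T in K)."""
        return delta_h_kj - temp_k * delta_s_kj_per_k

    # Hypothetical process: dH = +20 kJ (endothermic), dS = +0.080 kJ/K.
    for temp_k in (200.0, 298.15, 400.0):
        dg = gibbs_free_energy_change(20.0, 0.080, temp_k)
        # A negative dG marks a spontaneous process.
        print(f"T = {temp_k:6.1f} K: dG = {dg:+7.2f} kJ, spontaneous = {dg < 0}")

    # Above T = dH/dS = 250 K, the T*dS term dominates and dG turns
    # negative: raising the temperature makes this process spontaneous.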

en·tro·py

(S) (en'trŏ-pē)
That fraction of heat (energy) content not available for the performance of work, usually because (in a chemical reaction) it has been used to increase the random motion of the atoms or molecules in the system; thus, a measure of randomness or disorder.
[G. entropia, a turning toward]

entropy

the amount of disorder or the degree of randomness of a system. For example, when a protein is denatured by heat (see DENATURATION), the molecule (which has a definite shape) uncoils and takes up a random shape, producing a large change in entropy.
References in periodicals archive
Theorem 2: Let Dⱼ, j = 1, …, 6, be the above-mentioned six distance measures, equations (1)–(6), between IvIFSs; then Eⱼ(A) = 1 − 3Dⱼ(A, ⟨[1/3, 1/3], [1/3, 1/3], [1/3, 1/3]⟩), j = 1, …, 6, is a measure of entropy of IvIFSs for any A ∈ IvIFSs(Ω).
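A minimal sketch of the pattern in Theorem 2: entropy as one minus three times the distance to the maximally fuzzy reference element. The six distances of equations (1)–(6) are not quoted here, so the stand-in distance below, and the tuple encoding of an IvIFS element, are assumptions chosen only so that the result stays in [0, 1].

    # Each IvIFS element is encoded as its six interval endpoints
    # (mu-, mu+, nu-, nu+, pi-, pi+); REF is the maximally fuzzy element.
    REF = (1/3,) * 6

    def d_star(a, b):
        """Stand-in distance: half the largest endpoint difference.
        Against REF it is bounded by 1/3, so 1 - 3*D stays in [0, 1]."""
        return max(abs(x - y) for x, y in zip(a, b)) / 2

    def entropy(A):
        """E(A) = 1 - 3*D(A, <[1/3,1/3],[1/3,1/3],[1/3,1/3]>), taking the
        set-level distance as the mean of the element-level distances."""
        return 1 - 3 * sum(d_star(el, REF) for el in A) / len(A)

    crisp = [(1.0, 1.0, 0.0, 0.0, 0.0, 0.0)]  # fully certain element
    print(entropy(crisp))  # ~0.0: a crisp set has no fuzziness
    print(entropy([REF]))  # 1.0: the reference itself is maximally fuzzy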
From Theorem 2 and the various distance formulas mentioned (equations (1) to (6)), we get the corresponding entropy formulas as follows:
For the studied ranges of moisture content, the differential enthalpies of adsorption and desorption varied from 1,153.029 to 97.207 kJ kg⁻¹ and from 1,998.435 to 149.079 kJ kg⁻¹, respectively; whereas the differential entropy varied from 2.804 to 0.276 kJ kg⁻¹ K⁻¹ for adsorption and from 5.643 to 0.447 kJ kg⁻¹ K⁻¹ for desorption.
Relations between the energy and entropy of solution and their significance.
Cabal-Yepez, "FPGA-based online detection of multiple combined faults in induction motors through information entropy and fuzzy inference", IEEE Trans.
He gave the name entropy to this measure at the suggestion of the mathematician John von Neumann, who remarked on its similarity to the formula for entropy in thermodynamics.
According to Saxe, study participants' entropy scores were strongly tied to IQ.
Weighted entropy is a generalization of Shannon's entropy and is the measure of information supplied by a probabilistic experiment whose elementary events are characterized both by their objective probabilities and by some qualitative (objective or subjective) weights [29].
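One common form of weighted entropy is the Belis–Guiaşu form, H_w(p) = −Σ wᵢ pᵢ log pᵢ; whether reference [29] uses exactly this form is not shown in the snippet, so treat this sketch as an assumption.

    import math

    def weighted_entropy(probs, weights):
        """Belis-Guiasu weighted entropy: -sum(w_i * p_i * log2(p_i)).
        Reduces to Shannon entropy when every weight equals 1."""
        return -sum(w * p * math.log2(p)
                    for p, w in zip(probs, weights) if p > 0)

    probs = [0.5, 0.3, 0.2]
    print(weighted_entropy(probs, [1, 1, 1]))    # plain Shannon entropy, ~1.485
    print(weighted_entropy(probs, [2, 1, 0.5]))  # events weighted by importance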
The entropy measure of an IVIFS on X can be defined as a function E : IVIF(X) → [0, 1] which fulfills the following properties:
In order to give a clear picture of the validity of the GSLT for this entropy on the Hubble horizon, we plot Ṡ_tot against cosmic time (t) by fixing the constant parameters as α = 0.2, β = 0.001, and n = 4, as shown in Figure 1.
In this paper, we prove a generalization of Shannon's inequality for the case of entropy of order ξ with the help of Hölder's inequality.
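For context, the classical (order-1) Shannon inequality states that H(p) ≤ −Σ pᵢ log qᵢ for any two probability distributions p and q, with equality iff q = p; the order-ξ generalization itself is not reproduced in the snippet, so this numeric check covers only the classical case.

    import math

    def cross_entropy(p, q):
        """-sum(p_i * log2(q_i)); minimized over q exactly at q = p."""
        return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

    p = [0.6, 0.3, 0.1]
    q = [0.2, 0.5, 0.3]
    h_p = cross_entropy(p, p)   # Shannon entropy H(p), ~1.295
    h_pq = cross_entropy(p, q)  # cross-entropy, ~1.867
    # Shannon's inequality: H(p) <= -sum(p_i * log2(q_i)).
    print(h_p <= h_pq)  # True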