Applications of entropy

Statistical entropy

Entropy is inherently a statistical phenomenon. We have seen that entropy describes the relationship between heat and temperature; more precisely, it is defined by the ratio of heat to temperature. In a physical sense this means that entropy depends on the number of energy states that are accessible to the system at a given temperature. The greater the number of accessible states, the greater the entropy. However, the dependence is not linear, but rather it is logarithmic. This fact may be hard to grasp intuitively. Yet in general chemistry we teach that the free energy is related logarithmically to the equilibrium constant, and that does not appear to raise any eyebrows. We will first present the statistical definition of entropy and then show how this definition is a foundation for the other definitions.
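The logarithmic dependence described above is captured by the Boltzmann relation S = k_B ln W, where W is the number of accessible microstates. A minimal sketch (the state counts are illustrative, not measured values) shows that doubling W does not double S; it only adds a fixed increment k_B ln 2:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K


def boltzmann_entropy(n_states: int) -> float:
    """Statistical entropy S = k_B * ln(W) for W accessible microstates."""
    return K_B * math.log(n_states)


# Doubling the number of accessible states adds only the fixed
# increment k_B * ln(2); the dependence is logarithmic, not linear.
s1 = boltzmann_entropy(1000)
s2 = boltzmann_entropy(2000)
increment = s2 - s1  # equals K_B * ln(2), regardless of the starting W
```

The same logarithm appears in the free-energy relation mentioned above, ΔG° = −RT ln K, which is why a tenfold change in an equilibrium constant corresponds to a fixed free-energy increment rather than a tenfold one.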

Conformational entropy

We can think of conformational entropy as a special case of statistical entropy, in which the combinatorics of the possible conformations of a polymer gives rise to an entropy contribution.
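The combinatorics can be made concrete with a simple rotamer count. For a chain with n independent rotatable bonds and m conformations per bond, W = m^n, so the molar conformational entropy is S = R ln W = n R ln m. A sketch, with a hypothetical chain length and rotamer count chosen only for illustration:

```python
import math

R = 8.314  # gas constant, J/(mol K)


def conformational_entropy(n_bonds: int, rotamers_per_bond: int) -> float:
    """Molar conformational entropy S = R * ln(W) for a chain whose
    W = rotamers_per_bond ** n_bonds conformations are equally likely."""
    return R * n_bonds * math.log(rotamers_per_bond)


# Hypothetical example: 10 rotatable bonds, 3 staggered rotamers each.
s_conf = conformational_entropy(10, 3)  # about 91.3 J/(mol K)
```

Note that the entropy grows linearly with chain length even though it is logarithmic in W, because W itself grows exponentially with the number of bonds.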

The hydrophobic effect

The hydrophobic effect arises from the fact that water molecules are constrained to fewer degrees of freedom when they surround a hydrophobic solute. It is said that water has an ice-like structure around hydrophobic molecules. Normally water has the freedom to hydrogen bond to two or more other water molecules, and all water molecules are in rapid flux, switching between various hydrogen-bonding partners. A hydrophobic solute severely restricts that free motion, thus lowering the entropy. For this reason hydrophobic solutes tend to aggregate, or fold up if they are in a polymer, in such a way as to minimize their surface area. The water molecules liberated in this process experience an increase in entropy.
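The entropic cost of ordering shell water can be sketched with the same statistical relation: if a water molecule's accessible hydrogen-bonding arrangements drop from W_bulk in the liquid to W_shell in the ice-like shell, the molar entropy change is ΔS = R ln(W_shell / W_bulk), which is negative. The state counts below are purely illustrative assumptions, not measured values:

```python
import math

R = 8.314  # gas constant, J/(mol K)


def ordering_entropy_change(w_bulk: int, w_shell: int) -> float:
    """Molar entropy change when a water molecule's accessible H-bond
    arrangements drop from w_bulk (bulk liquid) to w_shell (hydration shell)."""
    return R * math.log(w_shell / w_bulk)


# Hypothetical counts: 6 arrangements in bulk water, 2 in an ice-like shell.
delta_s_order = ordering_entropy_change(6, 2)   # negative: ordering costs entropy
delta_s_release = ordering_entropy_change(2, 6)  # positive: liberation gains it back
```

Aggregation of hydrophobic solutes reduces the number of shell waters, so the positive release term dominates; this is why the hydrophobic effect is driven by the entropy of the water, not by any attraction between the solutes themselves.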

Protein folding

Entropy of mixing

Entropy of phase transition