In 1824, at the age of 28, Nicolas Léonard Sadi Carnot ( [link] ) published the results of an extensive study regarding the efficiency of steam heat engines. In a later review of Carnot’s findings, Rudolf Clausius introduced a new thermodynamic property that relates the spontaneous heat flow accompanying a process to the temperature at which the process takes place. This new property was expressed as the ratio of the reversible heat ( q rev ) to the kelvin temperature ( T ). The term reversible process refers to a process that takes place at such a slow rate that it is always at equilibrium and its direction can be changed (it can be “reversed”) by an infinitesimally small change in some condition. Note that the idea of a reversible process is a formalism required to support the development of various thermodynamic concepts; no real processes are truly reversible, rather they are classified as irreversible .
Similar to other thermodynamic properties, this new quantity is a state function, and so its change depends only upon the initial and final states of a system. In 1865, Clausius named this property entropy ( S ) and defined its change for any process as the following:
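In symbols, with q rev the reversible heat and T the kelvin temperature, this definition reads:

```latex
\Delta S = \frac{q_\text{rev}}{T}
```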
The entropy change for a real, irreversible process is then equal to that for the theoretical reversible process that involves the same initial and final states.
Following the work of Carnot and Clausius, Ludwig Boltzmann developed a molecular-scale statistical model that related the entropy of a system to the number of microstates possible for the system. A microstate is a specific configuration of all the locations and energies of the atoms or molecules that comprise a system. The relation between a system’s entropy and the number of possible microstates ( W ) is the following:
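Written out, with k the Boltzmann constant:

```latex
S = k \ln W
```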
Here k is the Boltzmann constant and has a value of 1.38 × 10⁻²³ J/K.
As for other state functions, the change in entropy for a process is the difference between its final ( S f ) and initial ( S i ) values:
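Substituting the Boltzmann expression for each state gives the change in terms of the microstate counts W f and W i :

```latex
\Delta S = S_\text{f} - S_\text{i}
         = k \ln W_\text{f} - k \ln W_\text{i}
         = k \ln \frac{W_\text{f}}{W_\text{i}}
```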
For processes involving an increase in the number of microstates, W f > W i , the entropy of the system increases, Δ S > 0. Conversely, processes that reduce the number of microstates, W f < W i , yield a decrease in system entropy, Δ S < 0. This molecular-scale interpretation of entropy provides a link to the probability that a process will occur as illustrated in the next paragraphs.
Consider the general case of a system composed of N particles distributed among n boxes. The number of microstates possible for such a system is n^N. For example, distributing four particles among two boxes will result in 2^4 = 16 different microstates as illustrated in [link] . Microstates with equivalent particle arrangements (not considering individual particle identities) are grouped together and are called distributions . The probability that a system will exist with its components in a given distribution is proportional to the number of microstates within the distribution. Since entropy increases logarithmically with the number of microstates, the most probable distribution is therefore the one of greatest entropy .
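The counting in this example can be verified with a short script. This is an illustrative sketch (the variable names and box labels are not from the text): it enumerates all 2^4 microstates of four distinguishable particles in two boxes and groups them into distributions by occupancy.

```python
from itertools import product
from collections import Counter

N, n = 4, 2  # four particles, two boxes

# Each microstate assigns every particle to one of the n boxes,
# so there are n**N microstates in total.
microstates = list(product(range(n), repeat=N))

# A distribution ignores particle identities and keeps only the
# occupancy of each box; Counter tallies microstates per distribution.
distributions = Counter(
    tuple(state.count(box) for box in range(n)) for state in microstates
)

print(len(microstates))  # 16 microstates
for occupancy, count in sorted(distributions.items()):
    print(f"box A: {occupancy[0]}, box B: {occupancy[1]} -> {count} microstate(s)")
```

Running this reproduces the binomial pattern 1, 4, 6, 4, 1: the even split (two particles in each box) contains the most microstates, six, and is therefore the most probable distribution.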