

Subsection 10.6.12 Entropy and Probability

Entropy is intimately connected with probability. If you toss four pennies, there is very little chance that all four will land heads up: only one of the \(2^{4} = 16\) equally likely outcomes gives four heads, while six outcomes give two heads and two tails, so the latter is six times more likely. The two heads - two tails state is the most likely, shows the most disorder, and has the highest entropy. Four heads is less likely, has the most order, and has the lowest entropy. If you tossed more coins, it would be even less likely that they’d all land heads up, and even more likely that you’d end up with close to the same number of heads as tails.
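These counts are easy to verify by brute force. The short Python sketch below tallies the number of ways each head count can occur using binomial coefficients (the helper name macrostate_counts is just an illustrative choice, not from the text).

```python
from math import comb

# Number of ways to obtain each head count when tossing n coins.
def macrostate_counts(n):
    return {k: comb(n, k) for k in range(n + 1)}

n = 4
counts = macrostate_counts(n)
total = 2 ** n  # 16 equally likely outcomes for four coins

for heads, ways in counts.items():
    print(f"{heads} heads: {ways} ways, probability {ways}/{total}")

# Two heads - two tails (6 ways) is six times more likely
# than four heads (1 way), as stated above.
print(counts[2] / counts[4])  # -> 6.0
```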
Let \(S\) be the entropy and \(\omega\) the thermodynamic probability (the number of microstates) of the system in a definite state; then we can write
\begin{equation*} S=f(\omega) \end{equation*}
Now if \(S_{1}\) and \(S_{2}\) are the entropies of two systems, then the total entropy of the combined system is
\begin{equation*} S =S_{1}+S_{2} \end{equation*}
since entropy is an additive quantity. The probability of a composite system, on the other hand, is equal to the product of the probabilities of the individual systems. That is,
\begin{equation*} \omega = \omega_{1}\omega_{2} \end{equation*}
Therefore, since \(S = f(\omega) = f(\omega_{1}\omega_{2})\text{,}\) \(S_{1} = f(\omega_{1})\text{,}\) and \(S_{2} = f(\omega_{2})\text{,}\) the additivity condition
\begin{equation} S =S_{1}+S_{2} \tag{10.6.19} \end{equation}
becomes the functional equation
\begin{equation} f(\omega_{1}\omega_{2}) = f(\omega_{1})+f(\omega_{2}) \tag{10.6.20} \end{equation}
Differentiating eqn. (10.6.20) with respect to \(\omega_{1}\text{,}\) keeping \(\omega_{2}\) constant, we get
\begin{equation} \omega_{2}f'(\omega_{1}\omega_{2}) = f'(\omega_{1}) \tag{10.6.21} \end{equation}
Again differentiating eqn. (10.6.20), this time with respect to \(\omega_{2}\text{,}\) keeping \(\omega_{1}\) constant, we get
\begin{equation} \omega_{1}f'(\omega_{1}\omega_{2}) = f'(\omega_{2}) \tag{10.6.22} \end{equation}
Dividing eqn. (10.6.21) by eqn. (10.6.22), we get
\begin{equation} \frac{\omega_{2}}{\omega_{1}} = \frac{f'(\omega_{1})}{f'(\omega_{2})}\tag{10.6.23} \end{equation}
Cross-multiplying gives \(\omega_{1} f'(\omega_{1}) = \omega_{2} f'(\omega_{2})\text{.}\) Since the left side depends only on \(\omega_{1}\) and the right side only on \(\omega_{2}\text{,}\) each must equal the same constant, say \(k\text{:}\)
\begin{equation*} \omega_{1} f'(\omega_{1}) =\omega_{2} f'(\omega_{2}) = k,\quad \text{a constant} \end{equation*}
So that
\begin{equation*} f'(\omega_{1}) = \frac{k}{\omega_{1}}\qquad \text{and}\quad f'(\omega_{2}) = \frac{k}{\omega_{2}} \end{equation*}
Integrating these expressions, we get
\begin{equation*} \int f'(\omega_{1})\,d\omega_{1} = \int \frac{k}{\omega_{1}}\,d\omega_{1} \end{equation*}
\begin{equation} \therefore\quad f(\omega_{1}) = k\ln\omega_{1}+C \tag{10.6.24} \end{equation}
also,
\begin{equation*} \int f'(\omega_{2})\,d\omega_{2} = \int \frac{k}{\omega_{2}}\,d\omega_{2} \end{equation*}
\begin{equation} \therefore\quad f(\omega_{2}) = k\ln\omega_{2}+C \tag{10.6.25} \end{equation}
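The same integration can be reproduced symbolically. A minimal sketch using sympy (symbol names are illustrative) solves the separated equation \(f'(\omega) = k/\omega\) directly:

```python
import sympy as sp

w, k = sp.symbols('omega k', positive=True)
f = sp.Function('f')

# Solve f'(omega) = k/omega, the separated equation obtained above.
sol = sp.dsolve(sp.Eq(f(w).diff(w), k / w), f(w))
print(sol)  # -> Eq(f(omega), C1 + k*log(omega))
```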
In general,
\begin{equation*} f(\omega) = k\ln\omega+C \end{equation*}
\begin{equation*} \text{or,}\quad S = k\ln\omega+C \end{equation*}
where \(C\) is a constant of integration, fixed by the condition that a state of perfect order, with \(\omega=1\text{,}\) has \(S=0\text{;}\) hence \(C =0\text{.}\)
\begin{equation*} \therefore \quad S =k\ln \omega \end{equation*}
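As a consistency check, the sketch below verifies symbolically that \(f(\omega) = k\ln\omega\) satisfies the functional equation (10.6.20), and that a nonzero constant \(C\) would spoil it, in agreement with \(C = 0\) above (symbol names are again illustrative):

```python
import sympy as sp

k = sp.symbols('k', positive=True)
C = sp.symbols('C')
w1, w2 = sp.symbols('omega_1 omega_2', positive=True)

f = lambda x: k * sp.log(x)        # solution with C = 0
g = lambda x: k * sp.log(x) + C    # solution with a nonzero constant

# f satisfies (10.6.20): the residual simplifies to zero.
print(sp.simplify(sp.expand_log(f(w1 * w2) - f(w1) - f(w2))))  # -> 0

# g does not: the residual is -C, so additivity forces C = 0.
print(sp.simplify(sp.expand_log(g(w1 * w2) - g(w1) - g(w2))))  # -> -C
```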
Hence the microscopic definition of entropy is given by
\begin{equation} S = N k_{B} \ln \omega \tag{10.6.26} \end{equation}
where \(k=Nk_{B}\) for a gas of \(N\) molecules and \(k_{B}\) is the Boltzmann constant.
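To tie this back to the opening example, the sketch below evaluates the Boltzmann form \(S = k_{B}\ln\omega\) for each macrostate of four tossed coins, taking \(\omega\) to be the number of microstates in that macrostate; treating coin tosses as a thermodynamic system is, of course, only an illustration.

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant in J/K

n = 4  # four pennies, as in the opening example
for heads in range(n + 1):
    omega = comb(n, heads)  # microstates in this macrostate
    S = k_B * log(omega)    # Boltzmann entropy S = k_B ln(omega)
    print(f"{heads} heads: omega = {omega}, S = {S:.2e} J/K")

# The 2 heads - 2 tails macrostate (omega = 6) has the highest entropy;
# the all-heads macrostate (omega = 1) has S = k_B ln 1 = 0.
```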