Entropy
Entropy is a measure of how much the energy of atoms and molecules becomes more spread out in a process. It can be defined in terms of the statistical probabilities of a system or in terms of other thermodynamic quantities. The most familiar case is the entropy of an ideal gas.
The entropy S is the natural logarithm of the number of microstates Ω, multiplied by the Boltzmann constant kB:
Entropy = (Boltzmann constant) × ln(number of possible microstates of the system)
S = kB ln Ω
Let the number of microstates be Ω = x; then S = kB ln x.
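The formula above can be sketched in code. This is a minimal illustration, assuming the CODATA value of the Boltzmann constant in joules per kelvin; the function name `entropy` is chosen here for clarity and is not from the original text.

```python
import math

# Boltzmann constant in J/K (CODATA 2018 exact value)
K_B = 1.380649e-23

def entropy(omega: int) -> float:
    """Return S = k_B * ln(Omega) for a system with Omega microstates."""
    if omega < 1:
        raise ValueError("number of microstates must be at least 1")
    return K_B * math.log(omega)

# A system with exactly one accessible microstate has zero entropy.
print(entropy(1))  # 0.0

# Doubling the number of microstates raises S by k_B * ln(2),
# regardless of the starting value of Omega.
print(entropy(2) - entropy(1))
```

Note that doubling Ω always adds the same fixed amount kB ln 2 to S, which is why the logarithm makes entropy additive for independent systems: Ω multiplies, S adds.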