Graham's law of diffusion states that the rate of diffusion \( r \) of a gas is inversely proportional to the square root of its density \( d \). Mathematically, this is expressed as:
\[
r \propto \frac{1}{\sqrt{d}}
\]
Thus, at a given temperature and pressure, the rate of diffusion decreases as the density of the gas increases.
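Comparing two gases at the same temperature and pressure eliminates the proportionality constant and yields a ratio form. As a brief worked illustration (assuming ideal-gas behavior, so that density is proportional to molar mass \( M \)):
\[
\frac{r_1}{r_2} = \sqrt{\frac{d_2}{d_1}} = \sqrt{\frac{M_2}{M_1}}
\]
For example, comparing hydrogen (\( M \approx 2 \, \text{g/mol} \)) with oxygen (\( M \approx 32 \, \text{g/mol} \)) gives \( r_{\mathrm{H_2}}/r_{\mathrm{O_2}} = \sqrt{32/2} = 4 \), so hydrogen diffuses four times as fast as oxygen under identical conditions.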