Step 1: Recall Bragg's Law for X-ray diffraction.
Bragg's Law is given by the formula:
\[
n\lambda = 2d\sin\theta
\]
where \(n\) is the order of the diffraction maximum, \(\lambda\) is the wavelength of the X-rays, \(d\) is the spacing between adjacent atomic planes, and \(\theta\) is the glancing angle between the incident beam and the atomic planes (the Bragg angle).
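For convenience, here is a minimal Python sketch of Bragg's Law solved for the plane spacing \(d\); the function name `bragg_plane_spacing` and its parameter names are illustrative, not part of the original problem.
```python
import math

def bragg_plane_spacing(n, wavelength, theta_deg):
    """Solve Bragg's Law, n*lambda = 2*d*sin(theta), for the plane spacing d.

    n          -- order of the diffraction maximum (dimensionless)
    wavelength -- X-ray wavelength (any length unit; d is returned in the same unit)
    theta_deg  -- Bragg angle in degrees
    """
    return n * wavelength / (2 * math.sin(math.radians(theta_deg)))
```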
Step 2: Identify the given values from the problem.
- Order of the maximum, \(n = 1\) (the problem asks about the first maximum).
- Wavelength, \(\lambda = 0.32 \, \text{nm}\).
- Bragg angle, \(\theta = 30^{\circ}\).
Step 3: Substitute the values into Bragg's Law and solve for \(d\).
\[
(1)(0.32 \, \text{nm}) = 2 \cdot d \cdot \sin(30^{\circ})
\]
We know that \(\sin(30^{\circ}) = 0.5\).
\[
0.32 \, \text{nm} = 2 \cdot d \cdot (0.5)
\]
Since \(2 \times 0.5 = 1\), this simplifies to
\[
d = 0.32 \, \text{nm}
\]
Thus, the spacing between the atomic planes is \(d = 0.32 \, \text{nm}\).
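As a quick check of the arithmetic, the sketch above reproduces this result for the given values:
```python
d = bragg_plane_spacing(n=1, wavelength=0.32, theta_deg=30)
print(f"d = {d:.2f} nm")  # d = 0.32 nm, since sin(30°) = 0.5
```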