The angular width is given by: \[ \theta = \frac{\lambda}{d} \] Substituting values: \[ \theta = \frac{600 \times 10^{-9}}{1.0 \times 10^{-4}} \] \[ \theta = 6.0 \times 10^{-3} \text{ rad} \] Converting to degrees: \[ \theta = 6.0 \times 10^{-3} \times \frac{180}{\pi} \] \[ \theta \approx 0.34^\circ \] Thus, the angular width is approximately 0.34°.
Two slits 0.1 mm apart are arranged 1.20 m from a screen. Light of wavelength 600 nm from a distant source is incident on the slits. How far apart will adjacent bright interference fringes be on the screen?
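A worked sketch, assuming the standard small-angle double-slit result that adjacent bright fringes on the screen are separated by \( \Delta y = \lambda L / d \), where \( L \) is the slit-to-screen distance and \( d \) the slit separation:
\[ \Delta y = \frac{\lambda L}{d} = \frac{(600 \times 10^{-9})(1.20)}{1.0 \times 10^{-4}} = 7.2 \times 10^{-3} \text{ m} \approx 7.2 \text{ mm} \]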
If vectors \( \mathbf{a} = 3 \hat{i} + 2 \hat{j} - \hat{k} \) and \( \mathbf{b} = \hat{i} - \hat{j} + \hat{k} \), then which of the following is correct?
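The answer choices are not reproduced here; as a sketch of the kind of check such a question typically calls for, the dot product shows the two vectors are perpendicular:
\[ \mathbf{a} \cdot \mathbf{b} = (3)(1) + (2)(-1) + (-1)(1) = 3 - 2 - 1 = 0, \quad \text{so } \mathbf{a} \perp \mathbf{b}. \]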