We are given that the angle between the lines joining the foci of an ellipse to one particular extremity of the minor axis is \( 90^\circ \). This condition is used to determine the eccentricity of the ellipse.
Step 1: Recall the geometry of an ellipse.
For an ellipse with semi-major axis \( a \) and semi-minor axis \( b \) (\( a > b \)), the foci lie on the major axis at a distance \( c \) from the centre, where \( c^2 = a^2 - b^2 \), so the distance between the foci is \( 2c \). The eccentricity is given by:
\[ e = \frac{c}{a}. \]
The angle between the lines joining the foci to an extremity of the minor axis depends only on this eccentricity.
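For concreteness, take the ellipse in standard position with the major axis along the \( x \)-axis (a coordinate choice assumed here for the derivation; it does not affect the result):
\[ \frac{x^2}{a^2} + \frac{y^2}{b^2} = 1, \qquad F_1 = (c, 0), \quad F_2 = (-c, 0), \quad B = (0, b), \qquad c = \sqrt{a^2 - b^2}. \]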
Step 2: Use the condition on the angle.
The lines \( BF_1 \) and \( BF_2 \) joining the extremity of the minor axis to the foci have slopes \( -\frac{b}{c} \) and \( \frac{b}{c} \), so the angle \( \theta \) between them satisfies:
\[ \tan \theta = \left| \frac{2bc}{c^2 - b^2} \right|. \]
We are given that \( \theta = 90^\circ \), and \( \tan 90^\circ \) is undefined, so the denominator must vanish:
\[ c^2 = b^2. \]
Substituting \( c^2 = a^2 - b^2 \) gives \( a^2 - b^2 = b^2 \), i.e. \( a^2 = 2b^2 \). Hence
\[ e^2 = \frac{c^2}{a^2} = \frac{a^2 - b^2}{a^2} = \frac{1}{2}, \]
so the eccentricity is \( e = \frac{1}{\sqrt{2}} \). Thus, the correct answer is \( \frac{1}{\sqrt{2}} \).
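As a quick numerical sanity check (a minimal sketch, not part of the solution itself; it assumes the standard-position coordinates above and takes \( a = 1 \) for convenience), the angle at \( B \) can be computed directly for \( e = 1/\sqrt{2} \):

```python
import numpy as np

# Sketch of a numerical check: assumes a = 1 and the standard orientation
# used above; with e = 1/sqrt(2) the chords drawn from the foci to the top
# of the minor axis should meet at 90 degrees.
a = 1.0
e = 1 / np.sqrt(2)           # eccentricity derived above
c = a * e                    # distance from the centre to each focus
b = np.sqrt(a**2 - c**2)     # semi-minor axis

B = np.array([0.0, b])       # extremity of the minor axis
F1 = np.array([c, 0.0])      # first focus
F2 = np.array([-c, 0.0])     # second focus

v1, v2 = F1 - B, F2 - B      # vectors from B to the two foci
cos_theta = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
print(np.degrees(np.arccos(cos_theta)))   # 90.0 (up to floating-point error)
```

Replacing \( e = 1/\sqrt{2} \) with any other value makes the printed angle deviate from \( 90^\circ \).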