Step 1: Understand the geometry of the problem.
Let the radius of the hemisphere be \( r \), the height of the cone be \( h \), and the radius of the cone's base be \( a \). The cone is inscribed in the hemisphere with its vertex at the center of the hemisphere's flat face and its base parallel to that face, so that the circumference of the base lies on the curved surface of the hemisphere.
Step 2: Apply Pythagoras’ theorem.
The radius of the base of the cone is \( a \), and the height of the cone is \( h \). The slant height \( l \) of the cone is the distance from the vertex to a point on the circumference of the base; it forms a right triangle with the base radius and the height of the cone. By the Pythagorean theorem:
\[
l^2 = a^2 + h^2
\]
Since the vertex sits at the center of the flat face and the rim of the base lies on the curved surface of the hemisphere, the slant height \( l \) is itself a radius of the hemisphere, so:
\[
l = r
\]
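As a sanity check, here is a minimal SymPy sketch of this constraint (the symbol names `a`, `h`, `r` are my own, mirroring the quantities above); it encodes \( l^2 = a^2 + h^2 \) with \( l = r \) and solves for the base radius of the cone:

```python
import sympy as sp

# a: base radius of the cone, h: height of the cone, r: radius of the hemisphere
a, h, r = sp.symbols('a h r', positive=True)

# Slant height equals the hemisphere radius, so l^2 = a^2 + h^2 becomes r^2 = a^2 + h^2
constraint = sp.Eq(a**2 + h**2, r**2)

# Base radius of the cone in terms of r and h
base_radius = sp.solve(constraint, a)[0]
print(base_radius)  # sqrt(r**2 - h**2), possibly printed in an equivalent factored form
```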
Step 3: Maximize the volume of the cone and solve for the ratio \( \frac{h}{r} \).
Substitute \( l = r \) into the Pythagorean theorem:
\[
r^2 = a^2 + h^2 \implies a^2 = r^2 - h^2
\]
The volume of the cone is therefore
\[
V = \frac{1}{3}\pi a^2 h = \frac{\pi}{3}\left(r^2 - h^2\right)h.
\]
Setting \( \frac{dV}{dh} = 0 \) to maximize the volume:
\[
\frac{dV}{dh} = \frac{\pi}{3}\left(r^2 - 3h^2\right) = 0 \implies h^2 = \frac{r^2}{3} \implies h = \frac{r}{\sqrt{3}}
\]
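The critical point can be checked symbolically. Below is a minimal SymPy sketch, assuming the goal is the inscribed cone of largest volume (the condition consistent with \( h^2 = r^2/3 \) above); it differentiates \( V(h) = \frac{\pi}{3}(r^2 - h^2)h \) and solves \( dV/dh = 0 \):

```python
import sympy as sp

h, r = sp.symbols('h r', positive=True)

# Cone volume with the base radius a = sqrt(r^2 - h^2) already substituted
V = sp.Rational(1, 3) * sp.pi * (r**2 - h**2) * h

# Critical point of V(h)
critical = sp.solve(sp.diff(V, h), h)
print(critical)  # [sqrt(3)*r/3], i.e. h = r/sqrt(3)

# Second derivative is negative there, so the critical point is a maximum
print(sp.diff(V, h, 2).subs(h, r / sp.sqrt(3)))  # -2*sqrt(3)*pi*r/3
```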
Thus, the ratio of the height of the cone to the radius of the hemisphere is:
\[
\frac{h}{r} = \frac{1}{\sqrt{3}} \implies \boxed{h : r = 1 : \sqrt{3}}
\]
Equivalently, \( r : h = \sqrt{3} : 1 \).
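A quick numerical spot check (plain Python, with an arbitrary test radius r = 1 of my choosing) confirms the ratio and shows that nearby heights give a smaller cone volume:

```python
import math

r = 1.0  # hemisphere radius (arbitrary test value)


def cone_volume(h):
    """Volume of the inscribed cone of height h and base radius sqrt(r^2 - h^2)."""
    return math.pi / 3 * (r**2 - h**2) * h


h_opt = r / math.sqrt(3)
print(h_opt / r)  # 0.5773... = 1/sqrt(3)

# The volume at h_opt exceeds the volume at slightly smaller and larger heights
for h in (0.9 * h_opt, h_opt, 1.1 * h_opt):
    print(round(cone_volume(h), 6))  # the middle value is the largest
```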