Step 1: Understand the function of a voltmeter.
A voltmeter is an instrument used to measure the potential difference (voltage) between two points in an electrical circuit. To measure voltage, a voltmeter must be connected in parallel across the component whose voltage is to be measured.
Step 2: Consider how a voltmeter should ideally interact with a circuit.
For a voltmeter to accurately measure the voltage across a component without altering the current distribution or voltage drop in the rest of the circuit, it should draw negligible current from the circuit.
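To make this concrete, consider (purely for illustration, with all symbols assumed here rather than given in the problem) a source of EMF $\varepsilon$ in series with a resistor $r$, and a voltmeter of internal resistance $R_V$ connected across the resistor $R$ whose voltage we want. The source then sees $R$ in parallel with $R_V$, so the meter reads
$$V_{\text{meas}} = \varepsilon\,\frac{R_\parallel}{r + R_\parallel}, \qquad R_\parallel = \frac{R R_V}{R + R_V},$$
which equals the undisturbed value $\varepsilon\,\frac{R}{r + R}$ only in the limit $R_V \to \infty$; any finite $R_V$ pulls the reading below the true voltage.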
Step 3: Relate current drawn to resistance (Ohm's Law).
According to Ohm's Law, $V = IR$, where $V$ is voltage, $I$ is current, and $R$ is resistance. If the voltmeter draws negligible current ($I \approx 0$) for a given voltage ($V$), its resistance ($R = V/I$) must be very high.
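As a quick illustrative calculation (values assumed, not taken from the problem): for a voltmeter reading $V = 10\ \text{V}$ to draw no more than $I = 1\ \mu\text{A}$, its resistance must be at least $R = V/I = 10/10^{-6} = 10^{7}\ \Omega = 10\ \text{M}\Omega$. Pushing the allowed current toward zero pushes the required resistance toward infinity.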
Step 4: Define "ideal" voltmeter resistance.
In an ideal scenario, a voltmeter draws no current at all to ensure perfect measurement accuracy. For $I$ to be zero while $V$ is non-zero, the resistance ($R$) must be infinitely large.
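In circuit terms, an ideal voltmeter behaves as an open circuit. In the illustrative divider above, $\lim_{R_V \to \infty} V_{\text{meas}} = \varepsilon\,\frac{R}{r + R}$, which is exactly the voltage that would exist across $R$ with no meter attached.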
Step 5: Evaluate the given options in the context of an ideal voltmeter.
(1) $100 \, \Omega$: This is a finite, relatively low resistance; the voltmeter would draw appreciable current and disturb the circuit being measured (the numerical sketch after this list quantifies the effect for an assumed circuit).
(2) Infinite: An infinite resistance ensures that no current flows through the voltmeter, thus not disturbing the circuit it is measuring. This is the characteristic of an ideal voltmeter.
(3) Low: A low resistance would cause the voltmeter to act like a short circuit, diverting current and significantly altering the voltage being measured.
(4) Zero: Zero resistance would make the voltmeter a perfect short circuit, drawing the maximum possible current and effectively short-circuiting the component whose voltage it is meant to measure.
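The following minimal Python sketch puts numbers on the option analysis. It assumes a hypothetical circuit (a 10 V source, series resistor $r = 1\,\text{k}\Omega$, measured resistor $R = 1\,\text{k}\Omega$, so the true voltage is 5 V); the names EMF, r, R, and measured_voltage are illustrative and not part of the original problem.

```python
# Illustrative loading-effect calculation for an assumed divider circuit:
# a 10 V source in series with r = 1 kOhm, voltmeter placed across R = 1 kOhm.
# Component values are assumptions chosen only to make the comparison concrete.

EMF = 10.0     # source voltage (V)
r = 1_000.0    # series resistor (ohms)
R = 1_000.0    # resistor whose voltage is being measured (ohms)

def measured_voltage(R_V: float) -> float:
    """Voltage indicated by a voltmeter of internal resistance R_V across R."""
    if R_V == 0.0:                    # option (4): a dead short reads 0 V
        return 0.0
    if R_V == float("inf"):           # option (2): ideal meter, no loading at all
        R_parallel = R
    else:
        R_parallel = R * R_V / (R + R_V)   # R in parallel with the meter
    return EMF * R_parallel / (r + R_parallel)

# Compare a low meter resistance, a moderate one, a very high one, and infinity.
for R_V in (100.0, 1_000.0, 1_000_000.0, float("inf")):
    print(f"R_V = {R_V:>11} ohm -> reads {measured_voltage(R_V):.3f} V  (true value: 5 V)")
```

With these assumed values the readings come out to roughly 0.83 V, 3.33 V, 4.998 V, and exactly 5 V respectively: only the infinite-resistance meter reproduces the undisturbed voltage.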
Step 6: Conclude the correct resistance for an ideal voltmeter.
The resistance of an ideal voltmeter is infinite.
(2) Infinite