Question:

A voltmeter of resistance 1000 \( \Omega \) can measure up to 25 V. How will you convert it so that it can read up to 250 V?

Hint

To extend the voltage range of a voltmeter, a series resistor is used. The value of the series resistor is found from the voltage-division rule, as shown below. Make sure to use a resistor with an appropriate power rating to avoid overheating.
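A compact way to state this rule (equivalent to the voltage-division calculation in the solution, with \( n \) denoting the factor by which the range is multiplied): \[ R_s = R_{\text{meter}} \, (n - 1), \qquad n = \frac{V_{\text{new}}}{V_{\text{old}}}. \]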

Solution and Explanation

To convert the voltmeter so that it can measure up to 250 V, we connect a resistor of resistance \( R_s \) in series with it. When the full 250 V is applied across the series combination, the voltmeter itself must carry no more than its original full-scale voltage of 25 V. By the voltage-division rule: \[ V_{\text{meter}} = V_{\text{new}} \times \frac{R_{\text{meter}}}{R_{\text{meter}} + R_s} \] Where:
- \( V_{\text{meter}} \) is the maximum voltage across the meter (25 V),
- \( V_{\text{new}} \) is the new voltage range (250 V),
- \( R_{\text{meter}} \) is the resistance of the voltmeter (1000 \( \Omega \)). Substituting the values: \[ \frac{25}{250} = \frac{1000}{1000 + R_s} \] Simplifying: \[ \frac{1}{10} = \frac{1000}{1000 + R_s} \] Now solve for \( R_s \): \[ 1000 + R_s = 10 \times 1000 = 10000 \] \[ R_s = 10000 - 1000 = 9000 \, \Omega \] Thus, the required series resistance is \( 9000 \, \Omega \), connected in series with the voltmeter.
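As a quick cross-check, the same result follows from the meter's full-scale current, written here as \( I_g \): \[ I_g = \frac{25\ \text{V}}{1000\ \Omega} = 0.025\ \text{A}, \qquad R_{\text{meter}} + R_s = \frac{250\ \text{V}}{0.025\ \text{A}} = 10000\ \Omega \;\Rightarrow\; R_s = 9000\ \Omega. \] At full scale the series resistor dissipates \( I_g^2 R_s = (0.025)^2 \times 9000 \approx 5.6\ \text{W} \), which is why the hint stresses choosing a resistor with an adequate power rating.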