Question:

If the voltage across a bulb rated 220 V, 100 W drops by 2.5% of its rated value, the percentage of the rated value by which the power decreases is

Updated On: Apr 1, 2025
  • 5%
  • 20%
  • 10%
  • 2.5%
Verified By Collegedunia

The Correct Option is A

Approach Solution - 1

A light bulb is rated at 220 V and 100 W. If the voltage supplied to the bulb drops by 2.5% of its rated value, we want to determine the percentage decrease in power output relative to the rated power. We can assume the resistance of the bulb remains constant.

Here's how we can approach this problem: 

  • Rated voltage: \(V_R = 220 \text{ V}\)
  • Rated power: \(P_R = 100 \text{ W}\)
  • Voltage drop: 2.5% of \(V_R\)

First, let's calculate the new voltage after the drop:

Voltage drop amount: \(0.025 \times 220 \text{ V} = 5.5 \text{ V}\)

New voltage: \(V_N = 220 \text{ V} - 5.5 \text{ V} = 214.5 \text{ V}\)

We know the power is related to voltage and resistance by \(P = \frac{V^2}{R}\). Since the resistance \(R\) is constant, we can write a ratio:

\(\frac{P_N}{P_R} = \frac{V_N^2}{V_R^2}\)

Where:

  • \(P_N\) is the new power
  • \(P_R\) is the rated power
  • \(V_N\) is the new voltage
  • \(V_R\) is the rated voltage

Now, let's solve for the new power \(P_N\):

\(P_N = P_R \times \frac{V_N^2}{V_R^2} = 100 \text{ W} \times \frac{(214.5 \text{ V})^2}{(220 \text{ V})^2}\)

\(P_N = 100 \text{ W} \times (\frac{214.5}{220})^2 \approx 100 \text{ W} \times (0.975)^2\)

\(P_N \approx 100 \text{ W} \times 0.950625 \approx 95.06 \text{ W}\)

The decrease in power is \(100 \text{ W} - 95.06 \text{ W} = 4.94 \text{ W}\).

Finally, let's calculate the percentage decrease in power:

Percentage decrease = \(\frac{4.94 \text{ W}}{100 \text{ W}} \times 100\% \approx 4.94\%\)
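The arithmetic above can be checked with a short script (a minimal sketch assuming constant resistance; the variable names are illustrative):

```python
# Rated values for the bulb
V_rated = 220.0   # volts
P_rated = 100.0   # watts

# Voltage after a 2.5% drop
V_new = V_rated * (1 - 0.025)  # 214.5 V

# With R constant, P is proportional to V^2, so P_new = P_rated * (V_new / V_rated)^2
P_new = P_rated * (V_new / V_rated) ** 2

percent_decrease = (P_rated - P_new) / P_rated * 100

print(f"New power: {P_new:.2f} W")                       # 95.06 W
print(f"Percentage decrease: {percent_decrease:.2f}%")   # 4.94%, i.e. about 5%
```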

Therefore, the percentage decrease in power is approximately:

5%


Approach Solution - 2

The power consumed by a bulb can be calculated using the formula: 

\( P = \frac{V^2}{R}\)

Given that the rated voltage (V) is 220 V and the rated power (P) is 100 W, we can calculate the resistance (R) using the above formula:

\( 100 = \frac{220^2}{R}\)

\( R = \frac{220^2}{100}\)

Thus, \( R = 484 \, \Omega \).

Now, if the voltage across the bulb drops by 2.5% of the rated value, the new voltage \( V' \) can be calculated as:

\( V' = V - \frac{2.5}{100} \times V\)

\( V' = 220 - \frac{2.5}{100} \times 220\)

\( V' = 220 - 5.5\)

\( V' = 214.5 \, \text{V}\)

To find the new power \( P' \), we can use the formula:

\( P' = \frac{V'^2}{R}\)

\( P' = \frac{214.5^2}{484}\)

\( P' = \frac{46010.25}{484}\)

\( P' \approx 95.06 \, \text{W}\)

Now, let's calculate the percentage decrease in power compared to the rated value:

\(\text{Percentage decrease in power} = \left( \frac{P - P'}{P} \right) \times 100\)

\(\text{Percentage decrease in power} = \left( \frac{100 - 95.06}{100} \right) \times 100\)

Percentage decrease in power \( \approx 4.94\% \approx 5\% \)

Therefore, the correct answer is: (A) 5%.
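As a quick check, the standard small-change approximation gives the same answer without computing the new power exactly: differentiating \(P = \frac{V^2}{R}\) at constant \(R\) gives

\(\frac{\Delta P}{P} \approx 2\,\frac{\Delta V}{V} = 2 \times 2.5\% = 5\%\)

which matches the exact result of about 4.94%, rounding to 5%.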
