Question:

A 20 mV DC signal has been superimposed with a 10 mV RMS band-limited Gaussian noise with a flat spectrum up to 5 kHz. If an integrating voltmeter is used to measure this DC signal, what is the minimum averaging time (in seconds) required to yield a 99% accurate result with 95% certainty?

Hint:

For an averaging voltmeter, output noise RMS scales as \(\sigma_{\text{avg}}=\sigma\sqrt{\dfrac{1}{2TB}}\). Combine this with the required confidence interval (\(\pm1.96\sigma\) for 95%) to size the averaging time.
  • (A) 0.1
  • (B) 1.0
  • (C) 5.0
  • (D) 10.0

The correct option is (B): 1.0 s.

Solution and Explanation

Step 1: For an integrating (averaging) voltmeter with averaging time \(T\), the equivalent noise bandwidth is \(B_{\mathrm{eq}}=\dfrac{1}{2T}\). With input white noise of one-sided bandwidth \(B=5\ \text{kHz}\) and RMS \(\sigma=10\ \text{mV}\), the output noise standard deviation is
\[
\sigma_{\text{avg}}=\sigma\sqrt{\frac{B_{\mathrm{eq}}}{B}}
=\sigma\sqrt{\frac{1}{2TB}}
=0.01\ \text{V}\sqrt{\frac{1}{2T\cdot 5000}}
=\frac{0.1\ \text{mV}}{\sqrt{T}}.
\]

Step 2: “99% accurate” for the 20 mV DC signal means the allowed error band is \(\pm 1\%\times 20\ \text{mV}=\pm 0.2\ \text{mV}\). For 95% certainty (\(\approx 1.96\sigma\) for Gaussian noise),
\[
1.96\,\sigma_{\text{avg}} \le 0.2\ \text{mV}
\ \Rightarrow\ \sigma_{\text{avg}}\le 0.102\ \text{mV}.
\]

Step 3: Using \(\sigma_{\text{avg}}=\dfrac{0.1}{\sqrt{T}}\ \text{mV}\):
\[
\frac{0.1}{\sqrt{T}} \le 0.102
\ \Rightarrow\ \sqrt{T}\ge \frac{0.1}{0.102}\approx 0.98
\ \Rightarrow\ T \gtrsim 0.96\ \text{s}\approx 1.0\ \text{s}.
\]

Thus the minimum averaging time is \(\boxed{1.0\ \text{s}}\).
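As a quick sanity check, the arithmetic above can be reproduced in a few lines of Python. This is only an illustrative sketch; the variable names (`V_dc`, `sigma_in`, `T_min`, etc.) are chosen for this example and are not part of the original solution.

```python
# Minimal numeric check of the averaging-time calculation (illustrative sketch).

V_dc = 20e-3        # DC signal level, V
sigma_in = 10e-3    # input noise RMS, V
B = 5e3             # one-sided noise bandwidth, Hz
z = 1.96            # confidence factor for 95% certainty (Gaussian noise)
accuracy = 0.01     # "99% accurate" -> +/- 1% error band

err_max = accuracy * V_dc   # allowed error band: 0.2 mV

# Require z * sigma_avg <= err_max, with sigma_avg = sigma_in / sqrt(2*T*B),
# and solve for the minimum averaging time T.
T_min = (z * sigma_in / err_max) ** 2 / (2 * B)

print(f"allowed error     : {err_max * 1e3:.3f} mV")   # 0.200 mV
print(f"minimum avg. time : {T_min:.3f} s")            # about 0.96 s
```

Running this gives roughly 0.96 s, consistent with picking 1.0 s from the given options.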