Step 1: Understanding the effect of time scaling.
When a signal \( x(t) \) is compressed in time, i.e., replaced by \( x(at + b) \) with \( |a| > 1 \), its spectrum is stretched in frequency by the factor \( |a| \): a frequency component at \( f_0 \) moves to \( |a| f_0 \), while the shift \( b \) contributes only a linear phase and does not change the occupied band. Here \( y(t) = x(2t - 5) \) compresses the signal in time by a factor of 2, which doubles every frequency in its spectrum.
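Writing \( X(f) \) for the Fourier transform of \( x(t) \), the scaling-and-shifting property makes this explicit:
\[
y(t) = x(2t - 5) \;\Longrightarrow\; Y(f) = \frac{1}{2}\, e^{-j5\pi f}\, X\!\left(\frac{f}{2}\right),
\]
so \( Y(f) \) is nonzero only where \( 100 \le |f/2| \le 200 \), i.e., for \( 200 \le |f| \le 400 \) Hz. The exponential from the time shift affects only the phase, not the support of the spectrum.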
Step 2: Analyzing the options.
- (A) Incorrect: time compression by a factor of 2 increases the occupied frequency band; it does not decrease it.
- (B) Incorrect: the frequency range does not remain between 100 Hz and 200 Hz, because the compression rescales the frequency axis.
- (C) Correct: compression by a factor of 2 maps the band from 100 Hz to 200 Hz onto 200 Hz to 400 Hz.
- (D) Incorrect: \( y(t) \) is still band-limited, just over a different frequency range.
Step 3: Conclusion.
The correct answer is (C): \( y(t) \) is band-limited between 200 Hz and 400 Hz.
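As an optional numerical sanity check, here is a minimal numpy sketch: it builds an assumed test signal with tones inside the 100 Hz to 200 Hz band, forms \( y(t) = x(2t - 5) \), and compares where the FFT energy of each signal sits. The tone frequencies, sample rate, and observation window are illustrative choices, not part of the problem statement.

```python
import numpy as np

fs = 4000.0                      # assumed sample rate (Hz), high enough for content up to 400 Hz
t = np.arange(0, 2.0, 1.0 / fs)  # assumed 2-second observation window

def x(tt):
    """Assumed band-limited test signal: tones at 120 Hz and 180 Hz (inside 100-200 Hz)."""
    return np.cos(2 * np.pi * 120 * tt) + np.cos(2 * np.pi * 180 * tt)

y = x(2 * t - 5)                 # y(t) = x(2t - 5)

# Magnitude spectra of x(t) and y(t) over the same window.
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
X = np.abs(np.fft.rfft(x(t)))
Y = np.abs(np.fft.rfft(y))

# Report where the spectral energy sits (bins above 1% of the peak).
for name, S in (("x(t)", X), ("y(t)", Y)):
    band = freqs[S > 0.01 * S.max()]
    print(f"{name} occupies {band.min():.0f} Hz to {band.max():.0f} Hz")
```

Running this prints a band of roughly 120 Hz to 180 Hz for \( x(t) \) and 240 Hz to 360 Hz for \( y(t) \): every frequency in the test signal has doubled, consistent with the 200 Hz to 400 Hz limits in option (C).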