We are given a random sample \( X_1, X_2, \dots, X_n \) from a normal distribution \( N(-\theta, \theta) \), where \( \theta > 0 \) is the unknown parameter. We need to determine which of the given statistics is minimal sufficient for \( \theta \).
Step 1: Understanding the Distribution
The random variables \( X_1, X_2, \dots, X_n \) are from the normal distribution \( N(-\theta, \theta) \), which means:
The mean of the distribution is \( -\theta \),
The variance of the distribution is \( \theta \).
Step 2: Factorization Theorem
To determine which statistic is minimal sufficient, we first apply the Factorization Theorem: a statistic \( T \) is sufficient for \( \theta \) if the likelihood can be written as \( L(\theta) = g(T(\mathbf{x}), \theta)\, h(\mathbf{x}) \), where \( h \) does not depend on \( \theta \). The likelihood of the sample is \[ L(\theta) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi \theta}} \exp \left( -\frac{(X_i + \theta)^2}{2\theta} \right) = \frac{1}{(2\pi \theta)^{n/2}} \exp \left( -\sum_{i=1}^{n} \frac{(X_i + \theta)^2}{2\theta} \right). \] Expanding \( (X_i + \theta)^2 = X_i^2 + 2\theta X_i + \theta^2 \) gives \[ L(\theta) = e^{-\sum_{i=1}^{n} X_i} \cdot \frac{1}{(2\pi \theta)^{n/2}} \exp \left( -\frac{\sum_{i=1}^{n} X_i^2}{2\theta} - \frac{n\theta}{2} \right). \] The first factor does not involve \( \theta \), and the second depends on the data only through \( \sum_{i=1}^{n} X_i^2 \). By the Factorization Theorem, \( \sum_{i=1}^{n} X_i^2 \) is sufficient for \( \theta \).
Step 3: Identifying the Minimal Sufficient Statistic
The statistic \( T = \sum_{i=1}^{n} X_i^2 \) is not only sufficient but minimal sufficient. By the standard likelihood-ratio criterion, \( T \) is minimal sufficient if the ratio \( L(\theta; \mathbf{x}) / L(\theta; \mathbf{y}) \) is free of \( \theta \) exactly when \( T(\mathbf{x}) = T(\mathbf{y}) \). From the factorization above, \[ \frac{L(\theta; \mathbf{x})}{L(\theta; \mathbf{y})} = e^{-\sum x_i + \sum y_i} \exp \left( -\frac{\sum x_i^2 - \sum y_i^2}{2\theta} \right), \] which is constant in \( \theta \) if and only if \( \sum x_i^2 = \sum y_i^2 \). Thus \( \sum_{i=1}^{n} X_i^2 \) is the minimal sufficient statistic.
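As a quick numerical sanity check (an illustrative sketch, not part of the formal argument), we can verify that two samples with the same sum of squares produce a likelihood ratio that is constant in \( \theta \):

```python
import math

def log_likelihood(x, theta):
    # log L(theta) for an i.i.d. sample from N(-theta, theta)
    n = len(x)
    return (-0.5 * n * math.log(2 * math.pi * theta)
            - sum((xi + theta) ** 2 for xi in x) / (2 * theta))

# Two samples with equal sum of squares (= 9) but different sums (5 vs. 3)
x = [1.0, 2.0, 2.0]
y = [3.0, 0.0, 0.0]

# The log-ratio equals -(sum(x) - sum(y)) = -2 for every theta
ratios = [log_likelihood(x, t) - log_likelihood(y, t) for t in (0.5, 1.0, 2.0, 5.0)]
print(ratios)
```

Because the sums of squares agree, the \( \theta \)-dependent factor cancels and the log-ratio is the same constant at every value of \( \theta \) tested.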
Step 4: Re-evaluating the Other Options
(A) \( \sum_{i=1}^{n} X_i \): This is not sufficient for \( \theta \). In the factorization above, \( \sum_{i=1}^{n} X_i \) appears only in the factor \( e^{-\sum X_i} \), which does not involve \( \theta \); knowing \( \sum X_i \) alone does not determine \( \sum X_i^2 \), so the likelihood cannot be factored through it.
(C) \( \left( \bar{X}, S^2 \right) \) with \( \bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i \) and \( S^2 = \frac{1}{n-1} \sum_{j=1}^{n} (X_j - \bar{X})^2 \): This pair is sufficient, since \( \sum_{i=1}^{n} X_i^2 = (n-1)S^2 + n\bar{X}^2 \) can be recovered from it, but it is not minimal: the one-dimensional statistic \( \sum_{i=1}^{n} X_i^2 \) is already sufficient, so the pair carries redundant information.
(D) \( -\frac{1}{n} \sum_{i=1}^{n} X_i \): Since \( E[X_i] = -\theta \), this is an unbiased estimator of \( \theta \), but, like \( \sum_{i=1}^{n} X_i \), it is not sufficient. Thus, the correct answer is (B) \( \sum_{i=1}^{n} X_i^2 \).
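A small simulation (a sketch; the choice \( \theta = 2 \) is arbitrary) confirms the moments used above: \( E[-\bar{X}] = \theta \), and \( E\left[\sum X_i^2\right] = n(\theta + \theta^2) \) since \( E[X_i^2] = \operatorname{Var}(X_i) + (E[X_i])^2 = \theta + \theta^2 \):

```python
import random

random.seed(0)
theta, n, reps = 2.0, 50, 20000

neg_mean_avg = 0.0   # Monte Carlo estimate of E[-X-bar]
sumsq_avg = 0.0      # Monte Carlo estimate of E[sum of X_i^2]
for _ in range(reps):
    # Draw a sample from N(-theta, theta): mean -theta, std sqrt(theta)
    x = [random.gauss(-theta, theta ** 0.5) for _ in range(n)]
    neg_mean_avg += -sum(x) / n
    sumsq_avg += sum(xi * xi for xi in x)
neg_mean_avg /= reps
sumsq_avg /= reps

print(neg_mean_avg)  # close to theta = 2
print(sumsq_avg)     # close to n * (theta + theta^2) = 50 * 6 = 300
```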
Three villages P, Q, and R are located in such a way that the distance PQ = 13 km, QR = 14 km, and RP = 15 km, as shown in the figure. A straight road joins Q and R. It is proposed to connect P to this road QR by constructing another road. What is the minimum possible length (in km) of this connecting road?
Note: The figure shown is representative.
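The shortest connecting road is the perpendicular from P to the line QR, i.e. the altitude of triangle PQR on side QR. Its length follows from Heron's formula, \( \text{Area} = \sqrt{s(s-a)(s-b)(s-c)} \), combined with \( \text{Area} = \frac{1}{2} \cdot \text{base} \cdot \text{height} \). A quick check of the arithmetic:

```python
import math

pq, qr, rp = 13.0, 14.0, 15.0
s = (pq + qr + rp) / 2                                 # semi-perimeter = 21
area = math.sqrt(s * (s - pq) * (s - qr) * (s - rp))   # Heron: sqrt(21*8*7*6) = 84
altitude = 2 * area / qr                               # area = (1/2) * QR * h  =>  h = 12
print(altitude)  # 12.0
```

So the minimum possible length of the connecting road is 12 km.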
For the clock shown in the figure, if
O = O Q S Z P R T, and
X = X Z P W Y O Q,
then which one among the given options is most appropriate for P?
“His life was divided between the books, his friends, and long walks. A solitary man, he worked at all hours without much method, and probably courted his fatal illness in this way. To his own name there is not much to show; but such was his liberality that he was continually helping others, and fruits of his erudition are widely scattered, and have gone to increase many a comparative stranger’s reputation.” (From E.V. Lucas’s “A Funeral”)
Based only on the information provided in the above passage, which one of the following statements is true?