Question:

Let \( (X, Y) \) be a random vector with joint moment generating function \[ M(t_1, t_2) = \frac{1}{(1 - (t_1 + t_2))(1 - t_2)}, \quad t_1 + t_2 < 1, \ t_2 < 1. \] Let \( Z = X + Y \). Then, \( \mathrm{Var}(Z) \) is equal to:

Hint

The MGF of the sum \( Z = X + Y \) is obtained by substituting \( t_1 = t_2 = t \) into the joint MGF. Then use gamma moment properties to find \( \mathrm{Var}(Z) \).
Updated On: Dec 6, 2025
  • 3
  • 4
  • 5
  • 6
Verified By Collegedunia

The Correct Option is C

Solution and Explanation

Step 1: Identify distribution type.
The given MGF factors as \[ M(t_1, t_2) = \frac{1}{1 - (t_1 + t_2)} \cdot \frac{1}{1 - t_2}. \] This is the joint MGF of \( (X, Y) \) with \( Y = X + W \), where \( X \) and \( W \) are independent \( \mathrm{Exp}(1) \) (i.e., \( \Gamma(1, 1) \)) random variables, since \[ E\!\left[e^{t_1 X + t_2 Y}\right] = E\!\left[e^{(t_1 + t_2) X}\right] E\!\left[e^{t_2 W}\right] = \frac{1}{1 - (t_1 + t_2)} \cdot \frac{1}{1 - t_2}. \] In particular, \( X \) and \( Y \) are dependent: \( M(t_1, t_2) \neq M_X(t_1) \, M_Y(t_2) \).
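As an added sanity check (not part of the printed solution): one joint distribution consistent with this MGF is \( X \sim \mathrm{Exp}(1) \), \( Y = X + W \) with \( W \sim \mathrm{Exp}(1) \) independent of \( X \). The sketch below compares the empirical MGF with the closed form at an arbitrary test point \( (t_1, t_2) = (0.1, 0.2) \) chosen inside the region of convergence:

```python
import math
import random

random.seed(0)
t1, t2 = 0.1, 0.2  # arbitrary test point with t1 + t2 < 1 and t2 < 1
n = 500_000

# Simulate the representation X ~ Exp(1), Y = X + W, W ~ Exp(1) independent,
# and average exp(t1*X + t2*Y) to estimate the joint MGF at (t1, t2).
acc = 0.0
for _ in range(n):
    x = random.expovariate(1.0)
    w = random.expovariate(1.0)
    acc += math.exp(t1 * x + t2 * (x + w))
empirical = acc / n

closed_form = 1.0 / ((1.0 - (t1 + t2)) * (1.0 - t2))  # = 1/0.56 ≈ 1.786
print(empirical, closed_form)  # the two values should agree closely
```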
Step 2: Derive the MGF of \( Z = X + Y \).
\[ M_Z(t) = M(t, t) = \frac{1}{(1 - 2t)(1 - t)}. \] With the representation above, \( Z = X + Y = 2X + W \): the factor \( \frac{1}{1 - 2t} \) is the MGF of \( 2X \sim \Gamma(1, 2) \) and \( \frac{1}{1 - t} \) is the MGF of \( W \sim \Gamma(1, 1) \), with \( 2X \) and \( W \) independent. Note that \( Z \) is not \( \Gamma(3, 1) \), since \( \frac{1}{(1 - 2t)(1 - t)} \neq \frac{1}{(1 - t)^3} \).
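The moments implied by \( M_Z \) can also be checked numerically; this short sketch (an addition for verification) approximates \( M_Z'(0) = E[Z] \) and \( M_Z''(0) = E[Z^2] \) by central finite differences:

```python
def m_z(t):
    # MGF of Z = X + Y obtained by setting t1 = t2 = t in the joint MGF
    return 1.0 / ((1.0 - 2.0 * t) * (1.0 - t))

h = 1e-4
m1 = (m_z(h) - m_z(-h)) / (2.0 * h)               # ≈ M_Z'(0)  = E[Z]
m2 = (m_z(h) - 2.0 * m_z(0.0) + m_z(-h)) / h**2   # ≈ M_Z''(0) = E[Z^2]
var = m2 - m1**2
print(m1, m2, var)  # close to 3, 14, 5
```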
Step 3: Compute variance.
For a gamma distribution \( \Gamma(k, \theta) \), the variance is \( k\theta^2 \). Hence \[ \mathrm{Var}(Z) = \mathrm{Var}(2X) + \mathrm{Var}(W) = 1 \cdot 2^2 + 1 \cdot 1^2 = 5. \] Equivalently, the partial-fraction form \( M_Z(t) = \frac{2}{1 - 2t} - \frac{1}{1 - t} \) gives \( M_Z'(0) = 3 \) and \( M_Z''(0) = 14 \), so \( \mathrm{Var}(Z) = 14 - 3^2 = 5 \). Final Answer: \[ \boxed{5} \]
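As a final check (added here, not part of the printed solution): simulating \( Z = 2X + W \) with \( X, W \) independent \( \mathrm{Exp}(1) \), a representation consistent with the joint MGF, should give a sample mean near 3 and a sample variance near 5:

```python
import random
import statistics

random.seed(42)
n = 400_000

# Z = X + Y = 2X + W, with X and W independent Exp(1) draws.
samples = [2.0 * random.expovariate(1.0) + random.expovariate(1.0)
           for _ in range(n)]

print(statistics.fmean(samples))     # ≈ E[Z] = 3
print(statistics.variance(samples))  # ≈ Var(Z) = 5
```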

Questions Asked in IIT JAM MS exam
