Question:

Let \( f \) and \( g \) be continuous functions on the interval \( [0, a] \). Given that \( f(a - x) = f(x) \) and \( g(x) + g(a - x) = a \), show that \( \int_{0}^{a} f(x) g(x)\, dx = \frac{a}{2} \int_{0}^{a} f(x)\, dx \).

Hint

The property \( \int_{0}^{a} h(x) dx = \int_{0}^{a} h(a - x) dx \) is very useful when dealing with integrals where the integrand has symmetry about the line \( x = a/2 \). Combining this property with the given conditions on the functions often leads to the solution.
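As a quick sanity check of this reflection property, here is a minimal numerical sketch. The sample function \( h(x) = x^2 \) on \( [0, 2] \) and the midpoint-rule `integrate` helper are assumptions chosen for illustration, not part of the problem:

```python
# Numerically verify the reflection property
#   ∫₀ᵃ h(x) dx = ∫₀ᵃ h(a - x) dx
# for a sample choice h(x) = x**2 with a = 2 (an assumed example).

a = 2.0

def h(x):
    return x ** 2

def integrate(func, lo, hi, n=100_000):
    # Composite midpoint rule; accurate enough for this check.
    dx = (hi - lo) / n
    return sum(func(lo + (k + 0.5) * dx) for k in range(n)) * dx

left = integrate(h, 0.0, a)                    # ∫₀² x² dx = 8/3
right = integrate(lambda x: h(a - x), 0.0, a)  # ∫₀² (2 - x)² dx = 8/3
print(left, right)
```

Both integrals agree (each is \( 8/3 \)), since reflecting the integrand about \( x = a/2 \) only relabels the area being measured.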

Solution and Explanation

Let the given integral be \( I \):
$$I = \int_{0}^{a} f(x) g(x)\, dx \quad \cdots (1)$$
Using the property of definite integrals \( \int_{0}^{a} h(x)\, dx = \int_{0}^{a} h(a - x)\, dx \), we can write:
$$I = \int_{0}^{a} f(a - x) g(a - x)\, dx$$
We are given that \( f(a - x) = f(x) \), so substituting this into the integral:
$$I = \int_{0}^{a} f(x) g(a - x)\, dx \quad \cdots (2)$$
We are also given that \( g(x) + g(a - x) = a \), which implies \( g(a - x) = a - g(x) \). Substituting this into equation (2):
$$I = \int_{0}^{a} f(x) \left( a - g(x) \right) dx = \int_{0}^{a} \left( a f(x) - f(x) g(x) \right) dx$$
Using the linearity of integrals:
$$I = a \int_{0}^{a} f(x)\, dx - \int_{0}^{a} f(x) g(x)\, dx = a \int_{0}^{a} f(x)\, dx - I \quad (\text{from equation (1)})$$
Now, we solve for \( I \):
$$2I = a \int_{0}^{a} f(x)\, dx \quad \Longrightarrow \quad I = \frac{a}{2} \int_{0}^{a} f(x)\, dx$$
Thus, we have shown that \( \int_{0}^{a} f(x) g(x)\, dx = \frac{a}{2} \int_{0}^{a} f(x)\, dx \).
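The result can also be checked numerically. The sketch below uses assumed example functions, \( f(x) = x(a - x) \) (which satisfies \( f(a - x) = f(x) \)) and \( g(x) = x \) (which satisfies \( g(x) + g(a - x) = a \)) with \( a = 2 \); the midpoint-rule `integrate` helper is likewise an illustrative choice:

```python
# Numerical check of the identity ∫₀ᵃ f(x)g(x) dx = (a/2) ∫₀ᵃ f(x) dx.
# Hypothetical example functions (not from the problem statement):
#   f(x) = x(a - x)  satisfies f(a - x) = f(x)
#   g(x) = x         satisfies g(x) + g(a - x) = a

a = 2.0

def f(x):
    return x * (a - x)

def g(x):
    return x

def integrate(func, lo, hi, n=100_000):
    # Composite midpoint rule approximation of the definite integral.
    dx = (hi - lo) / n
    return sum(func(lo + (k + 0.5) * dx) for k in range(n)) * dx

lhs = integrate(lambda x: f(x) * g(x), 0.0, a)
rhs = (a / 2) * integrate(f, 0.0, a)
print(lhs, rhs)  # both ≈ 4/3
```

For this choice both sides evaluate to \( 4/3 \) exactly: \( \int_{0}^{2} x^2(2 - x)\, dx = \frac{16}{3} - 4 = \frac{4}{3} \) and \( \frac{2}{2} \int_{0}^{2} x(2 - x)\, dx = 4 - \frac{8}{3} = \frac{4}{3} \).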