In an adiabatic expansion of a gas, the initial and final temperatures are T1 and T2 respectively. The change in internal energy of the gas is [R = gas constant, γ = adiabatic ratio]
R (T1 - T2)
\(\frac {R(T_1-T_2)}{γ-1}\)
\(\frac {R(T_2-T_1)}{γ-1}\)
In an adiabatic process no heat is exchanged (Q = 0), so by the first law of thermodynamics the change in internal energy equals the negative of the work done by the gas, ΔU = −W. For one mole of an ideal gas,
ΔU = Cv (T2 − T1) = \(\frac {R(T_2-T_1)}{γ-1}\)
since Cv = \(\frac {R}{γ-1}\), where γ is the adiabatic ratio (heat capacity ratio), R is the gas constant, T1 is the initial temperature, and T2 is the final temperature. In an expansion the gas does work and cools (T2 < T1), so ΔU is negative.
Therefore, the correct option is (C) \(\frac {R(T_2 - T_1)}{γ - 1}\).
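As a quick numeric check of the formula ΔU = R(T2 − T1)/(γ − 1), here is a minimal sketch for one mole of a monatomic ideal gas; the temperatures and the choice of γ = 5/3 are assumed illustrative values, not part of the original question:

```python
# Numeric sketch (assumed values): ΔU for one mole of a monatomic
# ideal gas expanding adiabatically.

R = 8.314              # molar gas constant, J/(mol·K)
gamma = 5 / 3          # adiabatic ratio for a monatomic ideal gas
T1, T2 = 300.0, 200.0  # initial and final temperatures in K (assumed)

delta_U = R * (T2 - T1) / (gamma - 1)  # ΔU = R(T2 - T1)/(γ - 1) per mole
print(round(delta_U, 1))  # negative: the expanding gas cools and loses internal energy
```

The negative result (about −1247 J) reflects that the gas spends internal energy doing work on its surroundings during the expansion.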
Thermodynamics is the branch of physics that deals with heat, work, and temperature, and their relation to energy, radiation, and the physical properties of matter.
The first law of thermodynamics, also known as the Law of Conservation of Energy, states that energy can neither be created nor destroyed; energy can only be transferred or changed from one form to another.
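The first law is often written as ΔU = Q − W, where Q is the heat added to the system and W is the work done by the system. A minimal sketch with assumed illustrative values:

```python
# Minimal sketch of the first law, ΔU = Q - W (assumed values).

Q = 500.0  # heat added to the gas, J (assumed)
W = 200.0  # work done by the gas on its surroundings, J (assumed)

delta_U = Q - W  # energy not spent as work remains as internal energy
print(delta_U)   # increase in internal energy, J
```

In the adiabatic case above, Q = 0, which is what reduces this relation to ΔU = −W.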
The second law of thermodynamics says that the entropy of any isolated system always increases. Isolated systems spontaneously evolve towards thermal equilibrium—the state of maximum entropy of the system. More simply put: the entropy of the universe (the ultimate isolated system) only increases and never decreases.
The third law of thermodynamics states that the entropy of a system approaches a constant value as the temperature approaches absolute zero. The entropy of a system at absolute zero is typically zero, and in all cases is determined only by the number of different ground states it has. Specifically, the entropy of a pure crystalline substance (perfect order) at absolute zero temperature is zero.