Hello

Okay, so I'm studying for a thermodynamics exam that is in two days. In the book (*Fundamentals of Engineering Thermodynamics*, 6th edition, by Moran and Shapiro, p. 156) they offer an example of how to use control volume analysis in transient states.

They provide the following question:

*Steam at a pressure of 15 bar and a temperature of 320 °C is contained in a large vessel. Connected to the vessel through a valve is a turbine followed by a small initially evacuated tank with a volume of 0.6 m³. When emergency power is required, the valve is opened and the tank fills with steam until the pressure is 15 bar. The temperature in the tank is then 400 °C. The filling process takes place adiabatically and kinetic and potential energy effects are negligible. Determine the amount of work developed by the turbine in kJ.*
I am kind of curious how they manage to raise the temperature from 320 °C to 400 °C while the final pressure is the same; where does the increase of internal energy come from?

Anyway, that's not my main question.

In the example they work out that ΔU = -W + hΔm.

I can understand that: the total increase of energy in the tank is the incoming energy, in terms of incoming enthalpy, minus the work done at the turbine. Then, in order to find the work, we have to determine ΔU, h and Δm.
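In case it helps anyone answer, here is how I understand the derivation; I'm assuming the book's general transient energy balance with a single inlet and no exit:

$$\frac{dU_{cv}}{dt} = \dot{Q}_{cv} - \dot{W}_{cv} + \dot{m}_i h_i$$

With $\dot{Q}_{cv} = 0$ (adiabatic) and $h_i$ constant (the large vessel keeps the inlet state fixed at 15 bar, 320 °C), integrating over the filling process gives

$$\Delta U = -W + h\,\Delta m.$$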

I can also understand how to find Δm given the volume, temperature and pressure, just as I can understand how to find h given the temperature and pressure. What I can't wrap my head around, however, is how to find ΔU.
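To check my numbers, I sketched the calculation in Python. The steam-table values below are what I read off my superheated-steam tables (v and u at 15 bar, 400 °C; h at 15 bar, 320 °C interpolated between the 300 °C and 350 °C entries), so please double-check them against your own tables:

```python
# Transient control-volume analysis: turbine filling an evacuated tank.
# Energy balance (adiabatic, tank initially empty): m2*u2 = -W + h_in*m2
# => W = m2 * (h_in - u2)

V = 0.6        # tank volume, m^3 (given in the problem)
v2 = 0.20302   # specific volume at 15 bar, 400 C, m^3/kg (my table value)
u2 = 2951.3    # specific internal energy at 15 bar, 400 C, kJ/kg
h_in = 3081.6  # specific enthalpy at 15 bar, 320 C, kJ/kg (interpolated)

m2 = V / v2              # final mass in the tank, kg
W = m2 * (h_in - u2)     # turbine work, kJ

print(f"mass in tank: {m2:.3f} kg")
print(f"turbine work: {W:.1f} kJ")
```

With these table values I get about 385 kJ, but treat the property values as my own lookups, not the book's.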

In the example they say that ΔU = m·u − 0, where u is the internal energy (the tank is initially evacuated, so its initial energy is zero).

Why is it u and not h?

Why is it that the total increase of energy (ΔU) is only caused by an increase of internal energy (m·**u**) in one case, but is also affected by enthalpy in another?

So, put shortly:

Why does ΔU = -W + hΔm involve an h, while ΔU = u·m does not?

I know that h = u + pv, though.