The water level drops in a river. The previously saturated gravel slowly takes on the temperature of the air. Assuming the gravel was originally at the same temperature as the water, and the surface is flat, how long does this take?

I believe these are the relevant formulas:

Q=mC∆T

Q/t = kA∆T/d

Where Q = energy (J), m = mass (kg), t = time (s), ∆T = temperature difference (K), d = thickness of the material conducting the energy (m), A = surface area of the interface between the bodies exchanging heat (m^2), C = specific heat capacity (J/(kg·K)), and k = thermal conductivity (W/(m·K)).

Given:

drop in water surface elevation = 1 m (for simplicity's sake),

water temp = 10°C,

air temp = 15°C,

specific heat capacity of gravel (C) = 1175 J/(kg·K),

thermal conductivity of gravel (k) = 3.5 W/(m·K),

density of gravel (relevant to m) = 1950 kg/m^3,

This is how I've gone about it so far:

Use Q = mC∆T to calculate the energy required to heat 1 cubic meter of gravel (the cube dimensions are based on the 1 m drop in elevation).

Then rearrange Q/t = kA∆T/d as

t = Qd/(kA∆T)

Solve for t, inputting values consistent with the cube dimensions: surface area A = 1 m^2 and thickness d = 1 m.
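For what it's worth, here is a sketch of that arithmetic with the given values plugged in (treating the whole 1 m^3 cube as heated uniformly, as the approach above assumes):

```python
# Given values from the problem statement
rho = 1950.0       # gravel density, kg/m^3
C = 1175.0         # specific heat capacity of gravel, J/(kg*K)
k = 3.5            # thermal conductivity of gravel, W/(m*K)
dT = 15.0 - 10.0   # air temp minus water temp, K
V = 1.0            # cube volume, m^3 (from the 1 m drop)
A = 1.0            # interface area, m^2
d = 1.0            # conduction thickness, m

m = rho * V               # mass of the gravel cube, kg
Q = m * C * dT            # energy to warm the gravel by dT (Q = mC*dT), J
t = Q * d / (k * A * dT)  # time, from rearranging Q/t = kA*dT/d, s

print(Q, t, t / 86400)    # energy (J), time (s), time (days)
```

Note that once Q = mC∆T is substituted into t = Qd/(kA∆T), the ∆T cancels, so with this model the answer does not actually depend on the size of the temperature difference.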

Is this the correct approach?

Thank you in advance.

EDIT *** Bonus question

Would the approach be the same if the air temperature were lower than the water temperature, i.e. if ∆T were negative?