The equation you give:
$$ Q = m C_p \Delta T $$
just tells us the total amount of heat transferred; it does not tell us anything about the rate at which the heat transfer occurs. To calculate heat flow we have to solve the heat equation. If you do a physics degree, this apparently innocent equation will cause you many hours of frustrated head-scratching, so in practice we tend to use simple approximations. In everyday life heat flow tends to be well described by Newton's equation:
$$ \frac{dQ}{dt} \propto \Delta T $$
so, as you suggest, the greater the temperature difference the faster the heat flow.
The experiment you describe isn't really a good way of showing this, because if you mix hot and cold liquid in practice the rate of temperature change will be controlled by how fast you do the mixing. Newton's equation would be more useful if you place the two liquids in contact but don't allow them to mix, e.g. by putting a metal (or some other high thermal conductivity) divider between them. In that case you're quite correct that the initial heat flow will be faster at 100°C than at 70°C. However, the 100°C system will take longer to cool because the amount of heat, $Q \propto \Delta T$, that needs to be transferred is greater with a 100°C difference. Combining $dQ/dt \propto \Delta T$ with $Q = m C_p \Delta T$ gives $d(\Delta T)/dt \propto -\Delta T$, so the temperature difference as a function of time will look like:
$$ \Delta T(t) = \Delta T_0 e^{-\alpha t} $$
where $\Delta T_0$ is the initial temperature difference and $\alpha$ is a constant related to the thermal conductivity (large $\alpha$ means high thermal conductivity). If I use this equation to graph the cooling for 70°C and 100°C initial temperature differences (choosing an arbitrary value of $\alpha$) I get:
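The original graph isn't reproduced here, but the following Python sketch generates the same two cooling curves. The value $\alpha = 0.05\ \mathrm{min^{-1}}$ is made up purely for illustration; any positive value gives the same qualitative behaviour.

```python
import numpy as np
import matplotlib.pyplot as plt

alpha = 0.05                    # arbitrary decay constant, 1/min (illustrative only)
t = np.linspace(0, 120, 500)    # time in minutes

# Plot dT(t) = dT0 * exp(-alpha * t) for the two initial differences
for dT0 in (70.0, 100.0):
    plt.plot(t, dT0 * np.exp(-alpha * t), label=f"$\\Delta T_0$ = {dT0:.0f} K")

# Time to reach a given difference: t = ln(dT0 / dT_target) / alpha,
# so the 100 K curve always lags the 70 K curve by ln(100/70) / alpha.
dT_target = 10.0
for dT0 in (70.0, 100.0):
    print(f"dT0 = {dT0:.0f} K reaches {dT_target:.0f} K at t = "
          f"{np.log(dT0 / dT_target) / alpha:.1f} min")

plt.xlabel("time (min)")
plt.ylabel("temperature difference (K)")
plt.legend()
plt.show()
```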
So even though the 100°C difference initially cools faster, the 70°C difference always reaches any specified temperature difference before the 100°C one does.
No, the temperature of the water is not that important for the performance of an evaporative cooler. This is basically because the energy needed to increase the temperature of liquid water (its specific heat capacity) is very small compared to the energy needed to evaporate the same amount of water (its enthalpy of evaporation).
At room temperature the specific heat of liquid water is 4.18 J/(g$\cdot$K) while the enthalpy of evaporation is 44.0 kJ/mol. Since the molar mass of water is roughly 18 g/mol, this means that approximately 585 times as much energy is needed to evaporate an amount of water as to increase the temperature of the same amount of water by 1 K. So even if the water starts at freezing temperature, is heated to 40 $^\circ$C (104 $^\circ$F) and then evaporates, less than 7% of the energy absorbed by it is used for increasing the temperature.
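As a quick sanity check of those numbers, here is a short Python calculation using only the values quoted above:

```python
# Rough check of the numbers above (values as quoted in the answer)
c_p = 4.18          # specific heat of liquid water, J/(g*K)
H_vap = 44.0e3      # enthalpy of evaporation, J/mol
M = 18.0            # molar mass of water, g/mol

# Energy to evaporate 1 g vs. energy to warm 1 g by 1 K
ratio = (H_vap / M) / c_p
print(f"evaporation / (1 K heating) ~ {ratio:.0f}")        # ~585

# Water warmed from 0 C to 40 C and then evaporated
dT = 40.0
frac_heating = (c_p * dT) / (c_p * dT + H_vap / M)
print(f"fraction of energy spent on heating ~ {frac_heating:.1%}")  # ~6.4%, i.e. < 7%
```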
The temperature of the water might affect the rate at which evaporation occurs, but I guess evaporative coolers are designed to achieve 100% relative humidity at the wet bulb regardless of the temperature of the water, so this probably doesn't affect the performance.
Best Answer
Short Answer:
The greater the temperature difference, the greater the rate at which heat transfers.
Longer, more detailed answer, based on the Wikipedia articles on Heat and Temperature:
Heat is transferred by one or more of three processes.
Conduction, an example of which is having the objects in physical contact.
Convection, in which the heat is transferred through a medium, such as air, so it's a slower, much less efficient process.
Radiation, which is how the heat of the Sun gets to us through the vacuum of space.
One complication to the heat transfer process is Newton's Law of Cooling, although it does not apply to all three methods of heat transfer outlined above:
Newton's Law of Cooling
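To get a concrete feel for the law, here is a minimal numerical sketch that integrates $dT/dt = -k\,(T - T_{\text{env}})$; the ambient temperature, cooling constant and time step are made-up values chosen only to illustrate the behaviour:

```python
# Minimal sketch of Newton's law of cooling, dT/dt = -k * (T - T_env),
# with made-up numbers; it shows that the rate of cooling is
# proportional to the remaining temperature difference.
T_env = 20.0    # ambient temperature, degrees C (assumed)
k = 0.1         # cooling constant, 1/min (arbitrary)
dt = 0.5        # time step, min

for T0 in (70.0, 100.0):
    T, t = T0, 0.0
    while T - T_env > 1.0:              # cool until within 1 degree of ambient
        T += -k * (T - T_env) * dt      # explicit Euler step
        t += dt
    print(f"start {T0:.0f} C -> within 1 C of ambient after {t:.0f} min")
```

The hotter object cools faster at first, but takes longer overall to come within a degree of the surroundings, consistent with the exponential curves discussed in the first answer above.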