In microwave ovens, what matters is how much energy the radiation carries and how that energy is absorbed by the food. Visible light and IR are rapidly absorbed by most foods, so they would heat only the outer layer of the food. You'd get food with the outside carbonised and the inside raw.
Microwaves are far less strongly absorbed by foods, so they penetrate deep into the food and can heat the interior. Even so, large objects won't be heated all the way through, which is one reason microwave cooking instructions frequently advise a multi-stage process: heating, letting the food stand, then heating a final time.
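As a rough sketch of why the standing time helps: inside a lossy dielectric like food, the microwave intensity falls off roughly exponentially with depth,

$$I(z) \approx I_0\, e^{-z/\delta},$$

where the penetration depth $\delta$ is of the order of a centimetre or two in water-rich food at 2.45 GHz (the exact value depends on composition and temperature). The centre of a thick item is therefore heated largely by conduction from the outer few centimetres, which is what the standing time allows to happen.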
Microwave ovens often include IR heating as well as microwave heating, so that the food ends up browned on the outside as well as heated throughout.
The answers to Why do microwave ovens use radiation with such long wavelength? give a nice discussion of why the exact wavelength used was chosen. The frequencies commonly used in microwave ovens are 2.45 GHz (12 cm) for home ovens and 915 MHz (33 cm) for industrial ovens. Much higher frequencies are not used due to the cost of the magnetron, while much lower frequencies would not work because the wavelength would be too big for even a half wavelength to fit in the oven.
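For reference, the quoted wavelengths follow directly from $\lambda = c/f$:

$$\lambda_{2.45\,\mathrm{GHz}} = \frac{3\times 10^{8}\ \mathrm{m/s}}{2.45\times 10^{9}\ \mathrm{Hz}} \approx 12.2\ \mathrm{cm}, \qquad \lambda_{915\,\mathrm{MHz}} = \frac{3\times 10^{8}\ \mathrm{m/s}}{9.15\times 10^{8}\ \mathrm{Hz}} \approx 33\ \mathrm{cm},$$

so a typical domestic oven cavity around 30 cm across (an illustrative figure) is several half-wavelengths wide at 2.45 GHz, while a frequency ten times lower would already struggle to fit a half wavelength inside it.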
Finally, you say:
Why do we use microwaves in microwave oven when infrared and visible light are much hotter and how do microwaves cook food when they are cooler than visible light and others.
But this is a slight misunderstanding. The wavelength of light emitted is indeed related to the temperature of the source, but light itself doesn't really have a temperature in the sense that matter does. Light transfers energy, and if this energy is absorbed it will heat the food. However, the amount of heating is just related to the intensity of the EM radiation and the absorption cross section. The wavelength makes a difference only insofar as it affects the absorption cross section.
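To put that statement in a rough formula (a sketch, not a rigorous treatment): the power deposited in a small absorber is approximately

$$P_{\text{absorbed}} \approx \sigma_{\text{abs}}(\lambda)\, I,$$

where $I$ is the intensity of the incident radiation and $\sigma_{\text{abs}}(\lambda)$ is the wavelength-dependent absorption cross section. The temperature of the source never enters: doubling the intensity at a fixed wavelength doubles the heating, whether the radiation came from a glowing filament or a room-temperature magnetron.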
Different molecules and different crystalline structures have frequency dependent absorption/reflection/transmission properties. In general, light in the human visible range can travel with little absorption through glass, but not through brick. UV can travel well through plastic, but not through silicate-based glass. Radio waves can travel through brick and glass, but not well through a metal box. Each of these differences has a slightly different answer, but each answer is based on molecular resonance or crystalline structure (or lack thereof) or electrical conductivity.
Bottom line: There isn't one general answer for why $\lambda_A$ goes through material X but $\lambda_B$ doesn't.
Best Answer
Your question contains a premise that is false: Microwaves do not have less energy than visible light per se. They only have less energy per photon, as per the Planck–Einstein relation, $E = hf$. In other words, you can raise the power of electromagnetic radiation to a dangerous level at any wavelength, if only you generate enough photons – as your microwave oven does.
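To put rough numbers on this (a back-of-the-envelope estimate assuming a typical 800 W oven, a figure chosen purely for illustration): a 2.45 GHz photon carries

$$E = hf \approx (6.63\times 10^{-34}\ \mathrm{J\,s})(2.45\times 10^{9}\ \mathrm{Hz}) \approx 1.6\times 10^{-24}\ \mathrm{J},$$

so delivering 800 W means emitting of order $5\times 10^{26}$ photons per second. A green photon at 550 nm carries about $3.6\times 10^{-19}$ J, some $2\times 10^{5}$ times more, but it is the total power absorbed, not the energy per photon, that heats the food.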
That very much includes visible light. You can easily verify this by waiting for a sunny day, getting out your magnifying glass, and using it to focus sunlight on a piece of paper. Watch it char and maybe even burn. (Make sure there's nothing around that piece of paper that can burn.) In conclusion, then, sunlight is dangerous!
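A rough estimate shows why this works (assuming, purely for illustration, a 5 cm lens focusing to a 1 mm spot): sunlight arrives at roughly $1\ \mathrm{kW/m^2}$, and the lens concentrates it by the ratio of the areas,

$$\frac{I_{\text{spot}}}{I_{\text{sun}}} \approx \left(\frac{50\ \mathrm{mm}}{1\ \mathrm{mm}}\right)^{2} = 2500, \qquad I_{\text{spot}} \approx 2.5\ \mathrm{MW/m^2},$$

easily enough to char paper, even though each individual visible photon is no different from the ones you are bathed in all day.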