Scientists say that the entropy of our universe has been increasing as it expands, and that the universe has been cooling gradually since its birth. If something is getting cooler and cooler, how can it become more random (increasing entropy) with the passage of time? According to the laws of thermodynamics, at absolute zero the entropy is zero. That suggests that as we go down the temperature scale, entropy must decrease, so why is that not happening in the case of our universe?
[Physics] Does entropy increase or decrease as our Universe is expanding
big-bang entropy space-expansion thermodynamics
Related Solutions
The third law does not say that "if the entropy of a system approaches a minimum, its temperature approaches absolute zero." It says that if the temperature approaches absolute zero, the entropy does. These are logical converses.
The second law of thermodynamics says that entropy can only increase, so if the early universe had been in a state of maximum entropy, then the cosmos would have experienced its heat death immediately after being born. This contradicts the observation that the present universe contains burning stars, heat engines, and life. These observations imply that the early universe was in a very low-entropy state, which shows that its initial conditions were extremely finely tuned. The reasons for this fine-tuning are not explained by general relativity or the standard model. Adding inflation to the model does not cure this fine-tuning problem.[Penrose 2005]
These ideas are strongly counterintuitive to most people, since we picture the early universe as an undifferentiated soup of hot gas, very much like what we might imagine a heat-dead universe to be like. In what way is the early universe not equilibrated?
We observe that the cosmic microwave background radiation's spectrum is a blackbody curve, which would normally be interpreted as evidence of thermal equilibrium. However, this observation only really tells us that the matter degrees of freedom were in thermal equilibrium. The gravitational degrees of freedom were not. In standard cosmological models, which are constructed to be as simple as possible, there are no gravitational waves. Although the real universe presumably does have gravitational waves in it, they are apparently very weak. In a maximum-entropy universe, the gravitational modes would be equilibrated with the matter degrees of freedom, and they would be very strong, as in models like Misner's mixmaster universe.[Misner 1969]
Even in Newtonian mechanics, gravitating systems violate most people's intuition about entropy. If we psssssht a bunch of helium atoms into a box through an inlet valve, they will quickly reach a maximum-entropy state in which their density is nearly constant everywhere. But in an imaginary Newtonian "box" full of gravitating particles, the maximum-entropy state is one in which the particles have all glommed onto each other in a single blob. This is because of the attractive nature of the gravitational force.
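As a toy illustration of the non-gravitating case (my own sketch, not part of the original answer), the entropy gained by an ideal gas spreading into a larger volume follows the standard result $\Delta S = N k_B \ln(V_2/V_1)$:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number

def free_expansion_entropy(N, V1, V2):
    """Entropy increase when N ideal-gas particles spread from volume V1 to V2:
    dS = N * k_B * ln(V2/V1)."""
    return N * k_B * math.log(V2 / V1)

# One mole of helium doubling its volume:
dS = free_expansion_entropy(N_A, 1.0, 2.0)
print(round(dS, 2))  # ≈ 5.76 J/K (= R ln 2)
```

For gravitating particles no such simple formula applies, which is exactly the point of the answer above.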
Charles W. Misner, "Mixmaster Universe", Physical Review Letters 22(1969)1071. http://astrophysics.fic.uni.lodz.pl/100yrs/pdf/07/036.pdf
Roger Penrose, 2005 talk at the Isaac Newton Institute, http://www.newton.ac.uk/webseminars/pg+ws/2005/gmr/gmrw04/1107/penrose/
I might see part of the problem here. There are processes in which energy is extracted via heating from a thermal reservoir, the system does positive work on the environment, and all of the energy coming in via heating gets transformed into work. There are many canonical examples in classical thermodynamics: the main one is an ideal gas undergoing an isothermal expansion.
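A quick numerical check of that canonical example (a sketch with made-up numbers, not part of the original answer): for a reversible isothermal expansion of an ideal gas, the internal energy is unchanged, so the work done equals the heat absorbed, $W = Q = nRT\ln(V_2/V_1)$.

```python
import math

R = 8.314462618  # molar gas constant, J/(mol K)

def isothermal_work(n, T, V1, V2):
    """Work done by n moles of ideal gas in a reversible isothermal expansion.
    Since the internal energy of an ideal gas depends only on T, Q = W here:
    every joule drawn from the reservoir becomes work."""
    return n * R * T * math.log(V2 / V1)

# 1 mol at 300 K doubling its volume:
W = isothermal_work(1.0, 300.0, 1.0, 2.0)
print(round(W, 1))  # ≈ 1728.9 J, all supplied by the reservoir
```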
So when you say
The hot reservoir provides heat energy to the system. Does it cause a decrease in entropy of the universe (system + hot reservoir)? How? In order to receive heat wouldn't the system have to be cooler than the reservoir? If so, then entropy increases as the heat energy gets expelled from the reservoir at a higher temperature than the temperature the system receives the heat energy. Is it so?
you are correct. This doesn't violate the Second Law at all, for the reasons you have expounded: either the system and the reservoir have the same temperature while they are exchanging energy via heat, in which case the net change in entropy is zero, or the system has a lower temperature, in which case it is straightforward to show that the system's entropy increases by more than the reservoir's entropy decreases.
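The bookkeeping in that second case can be sketched numerically (my own illustration with made-up numbers):

```python
def net_entropy_change(Q, T_hot, T_cold):
    """Heat Q leaves a reservoir at T_hot and enters a system at T_cold.
    The reservoir's entropy falls by Q/T_hot; the system's rises by Q/T_cold.
    The net change is positive whenever T_cold < T_hot."""
    return Q / T_cold - Q / T_hot

# 100 J flowing from a 400 K reservoir into a 300 K system:
dS = net_entropy_change(100.0, 400.0, 300.0)
print(round(dS, 4))  # 0.0833 J/K > 0, consistent with the Second Law
```

When the two temperatures are equal the function returns zero, which is the reversible limit mentioned above.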
So what is the actual statement of the Second Law here? It is this:
It is impossible to construct an engine which will work in a complete cycle, and produce no effect except the raising of a weight and cooling of a heat reservoir.
The operative word there is "cycle": if the system has to operate on a cycle, then the entropy increase of the system caused by heat flow from the hot thermal reservoir must be offset by an entropy decrease, as I explain in this answer. This means that the system must expel energy via heating to a cold thermal reservoir, and that is exactly the reason why a perpetual motion machine doesn't exist: some of the energy must be wasted.
This is what people talk about when they talk about perpetual motion machines of the second kind: in order to have "perpetual motion", the system must repeat its motion over and over and over again, forever. In the processes I discussed above where all of the heat is converted into work, the system doesn't reset (it doesn't operate on a cycle!), and so such a machine must eventually stop. On the other hand, if the system does reset (i.e. if it does operate on a cycle), then some of the available energy is wasted every cycle, and so, again, the machine must eventually run down and stop.
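To put a number on the "wasted every cycle" claim (a sketch using the reversible Carnot limit, not part of the original answer): even a best-case engine running between two reservoirs must reject $Q_\text{out} = Q_\text{in}\, T_\text{cold}/T_\text{hot}$ to the cold reservoir each cycle.

```python
def carnot_cycle(Q_in, T_hot, T_cold):
    """Best-case (reversible) split of the input heat over one full cycle:
    Q_out = Q_in * T_cold / T_hot must be rejected so the working substance
    returns to its initial state with no net entropy change in the universe."""
    Q_out = Q_in * T_cold / T_hot
    W = Q_in - Q_out
    return W, Q_out

W, Q_out = carnot_cycle(1000.0, 500.0, 300.0)
print(W, Q_out)  # 400.0 600.0 -> at most 40% of the heat becomes work
```

Any real (irreversible) engine wastes even more than this reversible minimum.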
Best Answer
The definition of entropy is $$S = k \log(\Omega),$$ where $\Omega$ is roughly the number of microstates (ways of arranging your particles) compatible with the macrostate (what you observe macroscopically), and $k$ is Boltzmann's constant.
Intuitively, you can say that, if you have particles inside a box, and you increase the size of the box, you can arrange them in more ways; therefore, the entropy increases. The third law tells you that you will never reach 0 K, so you can have more and more entropy without a paradox. The entropy density, on the other hand, could be decreasing.
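The "bigger box, more arrangements" intuition can be checked with a toy count (my own sketch, with $k$ set to 1): $N$ distinguishable particles distributed over $M$ equal cells give $\Omega = M^N$, so doubling the box adds exactly $N \ln 2$ to the entropy.

```python
import math

def boltzmann_entropy(omega):
    """S = k ln(Omega), with Boltzmann's constant k set to 1."""
    return math.log(omega)

# Toy model: N distinguishable particles over M equal cells -> Omega = M**N.
N, M = 10, 4
S_small = boltzmann_entropy(M ** N)
S_large = boltzmann_entropy((2 * M) ** N)  # doubling the box doubles the cells

# The gain is exactly N ln 2, purely from the extra available volume:
print(round(S_large - S_small, 3))  # ≈ 6.931
```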
But, beware! Thermodynamics is built on the assumption that there are no long-range interactions, yet the evolution of the universe is controlled by gravity, which has infinite range. Therefore, you cannot naively apply any thermodynamical theorem to the universe as a whole. You can argue physically for the correctness of many of the postulates, but you are on shaky ground there.