We know that an electromagnetic wave is produced by a periodically changing electric field (by an accelerating charged particle). We also know that the electric field of a point charge varies inversely with the square of the distance from the charge. Therefore the amplitude of an electromagnetic wave such as light should keep decreasing and practically vanish after covering a finite distance. But, as we know, this doesn't happen, and we see light waves coming from stars very far away. So what is happening there? Or am I wrong? Please explain.
[Physics] Amplitude of electromagnetic waves
electromagnetic-radiation electromagnetism
Related Solutions
Well, I would say the electromagnetic field is the medium.
Just as the water medium oscillates when a thrown stone produces a water wave, the electromagnetic field oscillates when excited by, say, an antenna. If nothing oscillates there are no waves, neither in water nor in the electromagnetic field.
The medium disappears only when one thinks of the electromagnetic field as being nothing, just a vacuum. But this liberal view of the vacuum is quite different from the view of the vacuum in QED, the accepted theory of electromagnetic fields. There the vacuum state doesn't possess an electromagnetic field; more precisely, its expectation value (i.e., what is observable about it) is identically zero.
Let's look at your question:
"Why in case of light, the Amplitude doesn't seem to decrease when it travels in vacuum(even though electric and magnetic fields from nearby sources exists)?"
Perhaps the confusion is caused by the concept of a plane wave. Indeed, a plane wave has an amplitude that remains constant throughout space. However, one never finds an exact plane wave in practice. Real optical beams always have a finite transverse extent. Think of the beam produced by a laser pointer: the spot of light it produces has a finite size. As a result, this beam gradually expands as it propagates further and further.
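This spreading can be made quantitative with the standard Gaussian-beam formula $w(z) = w_0\sqrt{1+(z/z_R)^2}$, where $z_R = \pi w_0^2/\lambda$ is the Rayleigh range. A small sketch (the waist and wavelength values below are just illustrative choices, not anything from the question):

```python
import math

def beam_radius(z, w0=1e-3, wavelength=500e-9):
    """Gaussian-beam radius w(z) = w0 * sqrt(1 + (z/zR)^2).

    w0 is the beam waist (radius at z = 0) and zR = pi*w0^2/lambda is the
    Rayleigh range. Values here are illustrative: a 1 mm waist, green light.
    """
    zR = math.pi * w0 ** 2 / wavelength  # Rayleigh range (~6.3 m here)
    return w0 * math.sqrt(1 + (z / zR) ** 2)

# The beam radius grows monotonically with distance; far beyond the Rayleigh
# range it grows almost linearly, so the on-axis amplitude must fall off.
for z in (0.0, 10.0, 100.0):
    print(f"z = {z:6.1f} m -> beam radius = {beam_radius(z)*1e3:.2f} mm")
```

Far from the waist the radius grows like $z$, which is exactly why even a well-collimated laser spot keeps getting bigger and dimmer with distance.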
In general, one can have cylindrical or spherical waves in addition to plane waves. Conservation of energy dictates that the total power through a closed surface perpendicular to the direction of propagation must be constant regardless of how far away that surface is (assuming, of course, no absorption of the optical power along the way). Power is the integral of the intensity over an area, and intensity is proportional to the square of the amplitude. To satisfy this requirement, the amplitude of a cylindrical wave must decrease as one over the square root of the radius of the cylindrical surface, while for a spherical wave the amplitude decreases as one over the radius of the spherical surface.
Next question:
"It is said that from Maxwell's wave equation, light is a transverse wave. So, when we draw a light wave, the changing electric field is drawn mutually perpendicular to the changing magnetic field. The amplitude is the highest value of the function, but physically, the value keeps on increasing and after reaching a certain point (the amplitude) decreases again down to its negative value. Where does the light get its energy to again oscillate from its negative amplitude to the positive amplitude?"
Sometimes the diagram itself can be misleading. The typical diagram showing the electric and magnetic fields represents the spatial shape of the fields frozen at an instant in time. However, if one were to turn on the evolution of this field in time, how would the diagram change? It would shift in the direction of propagation. This is the basic property of a wave. If the frozen diagram for the electric field, for instance, is represented by a function $\mathbf{E}(z)$, then the corresponding expression for the electric field as it evolves in time is $\mathbf{E}(z-ct)$. So the function shifts in the direction of propagation (the positive $z$-direction in this case) at the speed of light.
One can now use this evolution to see what happens if we look at just one point in space and follow the electric field as a function of time. Setting $z=0$, we get $\mathbf{E}(-ct)$: the same function, but now as a function of time and reversed. Thus the electric field oscillates at any particular point in space.
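The shifting-profile picture can be sketched in a few lines: take a sinusoidal profile $E(z)$, form the traveling wave $E(z-ct)$, and watch a single fixed point swing between the positive and negative crest. (The wavelength and the sinusoidal profile below are illustrative choices.)

```python
import math

C = 3.0e8  # speed of light in m/s (rounded)

def E(u, wavelength=500e-9):
    """The frozen spatial profile E(z); here just a sinusoid (illustrative)."""
    return math.sin(2 * math.pi * u / wavelength)

# The traveling wave is E(z - C*t): the same profile shifted at speed C.
wavelength = 500e-9
period = wavelength / C

# Watch the fixed point z = wavelength/4 as time advances:
z = wavelength / 4
peak = E(z - C * 0)             # starts at the positive crest (+1)
trough = E(z - C * period / 2)  # half a period later: the negative crest (-1)
back = E(z - C * period)        # a full period later: back at +1
```

No energy is gained or lost in going from the negative to the positive crest; the whole pattern just slides past the observation point.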
The energy in the field is carried along with it. One can calculate the energy passing through a surface by integrating the power (the flux of the Poynting vector) over time.
Hope all the issues have been addressed. Let me know if anything is still unclear.
Best Answer
The reason we can see stars from far away is that those stars are unimaginably bright up close. Stars do vary in apparent brightness depending on the star type and the distance, and there are certainly stars whose light we cannot see. Read about apparent magnitude and absolute magnitude.
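The link between the two magnitudes is the standard distance-modulus relation, $m = M + 5\log_{10}(d/10\,\mathrm{pc})$. A small sketch (the absolute magnitude 4.8 is roughly the Sun's; the distances are illustrative):

```python
import math

def apparent_magnitude(M, d_parsec):
    """Distance-modulus relation: m = M + 5*log10(d / 10 pc).

    M is the absolute magnitude (brightness at 10 parsecs); larger m
    means fainter as seen from Earth.
    """
    return M + 5 * math.log10(d_parsec / 10.0)

# A Sun-like star (absolute magnitude ~4.8) at increasing distances:
for d in (10, 100, 1000):
    print(f"d = {d:>5} pc -> apparent magnitude {apparent_magnitude(4.8, d):.1f}")
```

Since the naked-eye limit is around magnitude 6, a Sun-like star already becomes invisible at a few tens of parsecs; the stars we do see across hundreds of parsecs are intrinsically far more luminous.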