Your eye is a second optical system.
It re-focuses the diverging rays to produce a real image on the retina.
This is exactly what it does when looking at any nearby object (i.e. one not at effective infinity).
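This refocusing can be sketched with the thin-lens equation. A minimal example, assuming a nominal ~17 mm effective focal length for the relaxed eye (an illustrative number, not from the answer above):

```python
def image_distance(f, d_object):
    """Solve the thin-lens equation 1/f = 1/d_o + 1/d_i for the image distance d_i."""
    return 1.0 / (1.0 / f - 1.0 / d_object)

# Assumed relaxed-eye focal length: ~17 mm, so a distant object images on the retina.
f_relaxed = 0.017
d_i_far = image_distance(f_relaxed, 1e9)  # effectively at infinity
print(d_i_far)  # ~0.017 m: image lands on the retina

# For a nearby object at 0.25 m, the same retina distance (0.017 m) requires the
# lens to shorten its focal length (accommodation):
d_o, d_i = 0.25, 0.017
f_near = 1.0 / (1.0 / d_o + 1.0 / d_i)
print(f_near)  # ~0.0159 m: a slightly stronger lens
```

The same machinery handles both cases; only the required focal length changes.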
I believe the explanation is quite simple. When you take a wide-angle video, you are mapping roughly 170 degrees of azimuth onto the film. When you play that back, it appears on a screen that occupies only about 40-60 degrees of azimuth. The edges are compressed relative to the center when recording the image (via the fisheye mapping, e.g. $r=2f\sin(\theta/2)$), but not when playing back the image. Because larger angular spans are being recorded in a smaller film space at the edges, the velocity of objects there appears slower (and the speed at which objects depart the field of view is what most strongly influences your sense of motion). The apparent velocity in the middle of the image is recorded quite accurately, but we have difficulty judging the velocity of things moving toward or away from us; we judge velocity best when it is nearly tangential, which with a fisheye lens means at the edge of the frame.
This is not present in all types of lenses. Standard lenses map the environment to a focal plane using the perspective mapping $r=f\tan(\theta)$, where $r$ is the distance across the imaging plane and $\theta$ is the angular position in the scene. The angular derivative of this function is $dr/d\theta=f\sec^2(\theta)$, which means that the change in pixel position with respect to change in scene angle increases strongly as $\theta$ increases.
The GoPro lens is approximately an equisolid fisheye lens (there are many possible fisheye mappings, but this is among the most common). The mapping for this type of lens is $r=2f\sin(\theta/2)$, which has angular derivative $dr/d\theta=f\cos(\theta/2)$. This means that changes in scene angular position correspond to smaller and smaller changes across the imaging plane as $\theta$ increases.
Since playback of recorded images is typically not reprojected with the same mapping with which it was recorded, this results in distortions when watching video playback. For standard lenses, this over-expands the edges, while for fisheye lenses, the edges of the image are under-expanded.
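The two mappings and their derivatives can be compared numerically. A small sketch (the function names and the numerical-derivative helper are my own, not from the answer):

```python
import math

def rectilinear(theta, f=1.0):
    # Standard (perspective) lens: r = f * tan(theta)
    return f * math.tan(theta)

def equisolid(theta, f=1.0):
    # Equisolid-angle fisheye (approximately the GoPro): r = 2 f sin(theta / 2)
    return 2.0 * f * math.sin(theta / 2.0)

def dr_dtheta(mapping, theta, h=1e-6):
    # Central-difference estimate of dr/dtheta
    return (mapping(theta + h) - mapping(theta - h)) / (2.0 * h)

# The rectilinear derivative blows up toward the edge (f sec^2 theta),
# while the equisolid derivative shrinks (f cos(theta/2)):
for deg in (0, 30, 60, 80):
    th = math.radians(deg)
    print(deg, dr_dtheta(rectilinear, th), dr_dtheta(equisolid, th))
```

At $\theta=60^\circ$ the rectilinear derivative is already $f\sec^2(60^\circ)=4f$, while the equisolid derivative has fallen to $f\cos(30^\circ)\approx 0.87f$, which is the edge compression described above.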
In order for you to see an image of the Sun coming off the windshield, rays from the Sun must be hitting that windshield and then reflecting off in the right direction to reach your eyes.
And the fact that rays are hitting that windshield means that, since it is mostly transparent, some of those rays, in fact most, are being transmitted rather than reflected. That is, the Sun is shining into the inside of the vehicle, and hence will be near or in the driver's field of vision, likely interfering greatly with their ability to see thanks to the intense light.
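The "most rays are transmitted" claim can be checked with the Fresnel equations for a single air-glass interface. A sketch, assuming an ordinary glass index of 1.5 and ignoring the windshield's second surface and any coatings:

```python
import math

def fresnel_unpolarized(theta_i_deg, n1=1.0, n2=1.5):
    """Unpolarized reflectance at a single n1->n2 interface (Fresnel equations)."""
    ti = math.radians(theta_i_deg)
    tt = math.asin(n1 * math.sin(ti) / n2)  # Snell's law for the refracted angle
    # s- and p-polarized amplitude reflectances, squared to get power:
    rs = ((n1 * math.cos(ti) - n2 * math.cos(tt)) /
          (n1 * math.cos(ti) + n2 * math.cos(tt))) ** 2
    rp = ((n1 * math.cos(tt) - n2 * math.cos(ti)) /
          (n1 * math.cos(tt) + n2 * math.cos(ti))) ** 2
    return 0.5 * (rs + rp)

print(fresnel_unpolarized(0))   # ~0.04: only about 4% reflected at normal incidence
print(fresnel_unpolarized(60))  # still under ~10%; the rest is transmitted
```

Reflectance only becomes large at grazing incidence, so over most geometries the bulk of the sunlight does pass through into the cabin, as the answer argues.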
Of course, depending on the exact angles involved, it might still be out of their field of vision, or they may have a visor down to block it. Nonetheless, the point is to apply the precautionary principle: since the glare indicates a sizeable chance that something dangerous is happening, treat that chance as 100% and act accordingly (i.e. don't just dart out into the road expecting they'll see you and stop; wait for them to pass and for a large enough opening in traffic to make it through with considerable safety margins, or, better yet, look for an intersection with traffic signals).