You are certainly not alone in wondering about this! I should ask: in what sense do you mean the question?
a) If your question -- "can a specific moment in time really have a rate of change?" -- is directed towards the physical world, and the words "time" or "moment" are to be taken as referring to those things from our daily experience, then I'd tell you not to forget what math does: it doesn't constitute the real world, it just models it.
Perhaps the space we live in is actually discrete; i.e. if you zoom in close enough, our world is made of atomic "cells", just like a Minecraft world. Suppose each cell is a cube $1.6 \times 10^{-45}$ meters (ten orders of magnitude below the Planck length) on an edge. We don't know whether this hypothesis is true or not: what experiment would disprove it? If it were true, then some things about real numbers that we learn in math (e.g. the idea the limit is based on, that for any positive number you name, I can always name a smaller one*) would be "wrong" for talking about objects on that size scale.
But it would still work just as well, as an approximation, for the things we currently use calculus for -- e.g. calculating where to aim our spaceships. The rocket equations themselves are never going to fit the situation exactly (have you accounted for that dust particle? and that one?), and the numbers we put into them are never going to be measured precisely.
A model cannot be judged right or wrong in itself; only the application of a model to a real-world situation can be judged, and then only in grades -- more appropriate or less appropriate. If speed comes in discrete chunks, then there may be no moment at which the volleyball, whose arc is described by $y = -x^2$, is ever moving at the $-4$ meters/second that calculus would predict at $x = 2$. Or maybe speed is continuous, and there is such a moment.
There's no way, even in principle, to tell, so we stick with the model we've got and change it only when it predicts the real world incorrectly.
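Whatever the physical truth, the calculus prediction is just the limit of average rates over shrinking intervals. Here's a quick numerical sketch using the volleyball arc $y = -x^2$ from above (the interval sizes are arbitrary choices for illustration):

```python
# Average rates of change of y = -x^2 over shrinking intervals near x = 2.
# As h shrinks, the average velocity (y(2+h) - y(2)) / h closes in on the
# -4 m/s that the derivative predicts.

def y(x):
    return -x**2

for h in [1.0, 0.1, 0.01, 0.001]:
    avg = (y(2 + h) - y(2)) / h
    print(f"h = {h:<6} average rate = {avg:.4f}")
```

No single line of output *is* the instantaneous rate; the $-4$ is the value the averages approach, which is exactly what the limit definition says.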
b) But to be less high-minded: it helps to have several ways to think about these things (and don't let anyone, including me, convince you that you have to think about it only their way).
As others have said, the derivative of a function $f(x)$ is a function $f'(x)$ which gives you the slope of the tangent line at $x$. If you believe that there can be a tangent line at a single point, then you can just think of that when others say "instantaneous rate of change".
*Here's the technical definition of a limit (ripped from Wikipedia), in case it helps. The statement
$$
\lim_{x \rightarrow 0} f(x) = L
$$
means that you can make $f(x)$ as close to $L$ as you like by making $x$ sufficiently close to $0$. With variables, that's:
For every $\epsilon > 0$, there exists a $\delta > 0$ such that if $0 < |x| < \delta$, then $|f(x) - L| < \epsilon$.
You can see how this would not work if there were a smallest positive real number -- then if I choose $\epsilon$ equal to that number, how are you going to make $|f(x) - L|$ smaller than it?
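To make the definition concrete, here is a small numerical sketch for the particular case $\lim_{x \to 0} x^2 = 0$ (my choice of example function): for any $\epsilon$, the response $\delta = \sqrt{\epsilon}$ works, since $0 < |x| < \delta$ gives $|x^2| < \delta^2 = \epsilon$.

```python
import math

# Epsilon-delta game for lim_{x -> 0} x^2 = 0.
# You challenge me with epsilon; I respond with delta = sqrt(epsilon)
# and spot-check that every sampled x with 0 < |x| < delta
# satisfies |f(x) - L| < epsilon.

def f(x):
    return x * x

L = 0.0
for eps in [0.5, 0.01, 1e-6]:
    delta = math.sqrt(eps)  # my response to the epsilon challenge
    xs = [delta * k / 1000 for k in range(1, 1000)]  # 0 < |x| < delta
    ok = all(abs(f(x) - L) < eps for x in xs + [-x for x in xs])
    print(f"eps = {eps:<8} delta = {delta:.4g}  all samples pass: {ok}")
```

Of course a finite sample proves nothing; the point is that the algebra ($|x|^2 < \delta^2 = \epsilon$) guarantees it for *every* $x$ in the interval, which is what the definition demands.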
Consider this example: You get into your car and drive around. The function $f(t)$ tells you how much distance you have covered by time $t$. The derivative of this function is your instantaneous velocity.
So, since you only asked for an interpretation, think about this: Did you ever sit in a car wondering how fast you were going 'at that instant'? Of course, technically speaking, the speedometer only gives you some average speed over the last second or so, but one still often thinks of it as the speed of the car 'right now'.
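You can simulate that speedometer. Below, $f(t)$ is a made-up distance function (my choice, purely for illustration: gentle acceleration from 5 m/s), and the "speedometer" averages over ever-shorter windows:

```python
# A made-up distance function: f(t) = 5t + 0.5t^2 meters,
# i.e. a car accelerating gently from 5 m/s.

def f(t):
    return 5 * t + 0.5 * t ** 2

t = 10.0  # "right now"
# A speedometer that averages over the last dt seconds:
for dt in [1.0, 0.1, 0.01]:
    avg_speed = (f(t) - f(t - dt)) / dt
    print(f"averaged over last {dt}s: {avg_speed:.3f} m/s")
```

As the averaging window shrinks, the readings close in on $f'(10) = 5 + 10 = 15$ m/s -- the number you intuitively call the speed 'right now'.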
Best Answer
It's perfectly fine to use intuition in applying mathematics - it's just that in mathematics itself we want rigorous definitions so we can actually prove stuff. We seek definitions that formalize our intuition about something.
Just consider the simple example of finding the derivative of $f(x) = x^2$. Using the "intuitive definition" it's not really clear that this should equal $2x$. You could of course look at a few examples and extrapolate from those that it should be true, but how can you really be sure? In contrast, the "hard" definition (which can also be considered a rather intuitive one) directly lets you compute the derivative.
The approach mathematicians often take is to start with some concept, state which properties it should have, formalize those, and then check whether the resulting definition really captures the intuition.
So we defined the mathematical concept of derivative the way we did, because it corresponds to our intuitive notion of rate of change and thus should be applicable in circumstances where the intuitive thing is asked for.
When applying math in physics, engineering, etc. you of course always have to consider whether it makes sense to model some real-life phenomenon via the mathematical idealized version: What assumptions go into a derivative? Are they compatible with the real world? Surely some notion of continuity is needed for a derivative. Is the real world continuous? We really don't know, and afaik (not a physicist) we can never find out. That's why physics is more than just building theories -- we also need to do experiments to see if our theory corresponds with the real world up to some acceptable margin of error. And judging from experiments and how successful we are in modelling the real world using differential calculus, it would seem that using the intuition behind derivatives in the real world isn't totally wrong.