This is a good question, given the way calculus is currently taught, which for me says more about the sad state of math education than about the material itself. All calculus textbooks and teachers claim that they are trying to teach what calculus is and how to use it. In the end, however, most exams mostly test the students' ability to turn a word problem into a formula and find the symbolic derivative of that formula. So it is not surprising that virtually all students, and not a few teachers, believe that calculus means symbolic differentiation and integration.
My view is almost exactly the opposite. I would like to see symbolic manipulation banished from, say, the first semester of calculus. Instead, I would like to see the first semester focused purely on what the derivative and definite integral (not the indefinite integral) are and what they are useful for. If you're not sure how this is possible without all the rules of differentiation and antidifferentiation, I suggest you take a look at the infamous "Harvard Calculus" textbook by Hughes-Hallett et al. For me, despite all the furor it created, it is by far the best modern calculus textbook out there, because it actually tries to teach students calculus as a useful tool rather than a set of mysterious rules that miraculously solve a canned set of problems.
I also dislike introducing the definition of a derivative using standard mathematical terminology such as "limit" and notation such as $h\rightarrow 0$. Another achievement of the Harvard Calculus book was to write a math textbook in plain English. Of course, this led to severe criticism that it was too "warm and fuzzy", but I totally disagree.
Perhaps the most important insight that the Harvard Calculus team had was that the key reason students don't understand calculus is because they don't really know what a function is. Most students believe a function is a formula and nothing more. I now tell my students to forget everything they were ever told about functions and to remember only that a function is a box: if you feed it an input (in calculus it will be a single number), it will spit out an output (in calculus it will be a single number).
Finally (I could write on this topic for a long time; if for some reason you want to read more of what I've written, just google my name with "calculus"), I dislike the word "derivative", which provides no hint of what a derivative is. My suggested replacement name is "sensitivity". The derivative measures the sensitivity of a function: it measures how sensitive the output is to small changes in the input. It is given by a ratio whose denominator is the change in the input and whose numerator is the induced change in the output. With this definition, it is not hard to show students why knowing the derivative can be very useful in many different contexts.
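In symbols, a minimal sketch of the "sensitivity" idea (the notation here is my own choice, not a quotation from any textbook): if the input of $f$ is nudged from $a$ to $a+\Delta x$, the induced change in the output is $\Delta f = f(a+\Delta x)-f(a)$, and the sensitivity of $f$ at $a$ is whatever the ratio
$$\frac{\Delta f}{\Delta x}=\frac{f(a+\Delta x)-f(a)}{\Delta x}$$
settles down to as the change in the input is made small; that limiting value is the derivative $f'(a)$.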
Defining the definite integral is even easier. With these definitions, explaining what the Fundamental Theorem of Calculus is and why you need it is also easy.
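To make that concrete, here is one hedged way to phrase it in the same spirit (again a sketch in my own notation): the definite integral $\int_a^b f(x)\,dx$ is the total amount accumulated when something piles up at rate $f(x)$ as the input runs from $a$ to $b$; it is what the sums
$$\sum_{i} f(x_i)\,\Delta x$$
settle down to as the pieces $\Delta x$ are made small. The Fundamental Theorem of Calculus then says that the total accumulated change of a quantity equals its net change:
$$\int_a^b F'(x)\,dx = F(b)-F(a).$$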
Only after I have made sure that students really understand what functions, derivatives, and definite integrals are would I broach the subject of symbolic computation. What everybody should try to remember is that symbolic computation is only one tool, and not necessarily the most important one, in the discipline of calculus, which itself is also merely a useful mathematical tool.
ADDED: What I think most mathematicians overlook is how large a conceptual leap it is to start studying functions (each of which is really a process) as mathematical objects in their own right, rather than just numbers. Until you give this leap its due respect and take the time to guide your students carefully through it, your students will never really appreciate how powerful calculus really is.
ADDED: I see that the function $\theta\mapsto \sin\theta$ is being mentioned. I would like to point out a simple question that very few calculus students and even teachers can answer correctly: Is the derivative of the sine function, where the angle is measured in degrees, the same as the derivative of the sine function, where the angle is measured in radians? In my department we audition all candidates for teaching calculus and often ask this question. So many people, including some with Ph.D.'s from good schools, couldn't answer this properly that I even tried it on a few really famous mathematicians. Again, the difficulty we all have with this question is for me a sign of how badly we ourselves learn calculus. Note, however, that if you use the definitions of function and derivative I give above, the answer is rather easy.
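For completeness, here is a worked sketch of that answer (a standard fact, written in notation of my own choosing): the degree-mode sine is a different box from the radian-mode sine. If $s(\theta)$ denotes the sine of an angle of $\theta$ degrees, then $s(\theta)=\sin\!\left(\tfrac{\pi\theta}{180}\right)$, so
$$s'(\theta)=\frac{\pi}{180}\cos\!\left(\frac{\pi\theta}{180}\right),$$
which is not the degree-mode cosine but a scaled-down copy of it. In "sensitivity" terms, the output of the degree-mode sine is far less sensitive to a one-unit change in the input, because one degree is a much smaller change of angle than one radian.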
Best Answer
For some reason, students I teach always love epsilon-delta (not that they write good epsilon-delta proofs per se), and the more "wrong" I teach it, the more they enjoy it. The "wrong" thing that I like to do is to define real numbers via Cauchy sequences right at the beginning, at least in a hand-wavy way.
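For readers who want the non-hand-wavy version, a brief sketch of the standard construction (my summary, not a quotation): a real number can be presented as a Cauchy sequence of rationals, i.e. a sequence $(q_1, q_2, q_3, \dots)$ of rational numbers with
$$\forall \varepsilon > 0\ \exists N\ \forall m,n > N:\ |q_m - q_n| < \varepsilon,$$
where two such sequences name the same real number when their termwise difference tends to $0$. The successive decimal truncations of $\pi$ form exactly such a sequence.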
Calling a real number "real" is Orwellian, really- none of you have ever seen a real number. You might estimate pi to 100 or 1000 or a million places after the decimal point, but you can never write it all down- you never know precisely what it is. Even numbers like 0 and 1 are unknown as real numbers. You can write "0.00000..." until you're blue in the face, but you'll never know if the number you've written was actually zero or not, because there might be a sneaky "...000001..." coming up just around the corner to bite you.
So real numbers come with "fuzz". They're inherently fuzzy. There's no way around it. You can pretend they're points on a line, and crash face-first into Zeno's paradoxes, or you can accept their fuzziness and work with it; and I argue that calculus is none other than that "second approach".
So my "epsilon" is the fuzz. It's all those digits of "pi" (or of "zero") you never wrote down. It's the tree falling when nobody is there to hear it. It's the gap between human knowledge and universal truth. As such, it's part of what a real number is- a real number (as observed by mortal beings) comes with fuzz. A continuous function from the reals to the reals, then, has to "respect the fuzz".
Anyway, counter-intuitive though it is (perhaps), this motivation has worked wonders for me in practice. And so I keep using it, and keep getting excited about it, and it's the best part of the course, year after year.
Added: Interpreting the verb "to motivate" in another way, I always discuss the history of the ideas in some depth (I learnt it myself from Wikipedia and books on the history of mathematics), and just how much people struggled, with no success, to find the "right" definition, until Bernard Bolzano (primarily a Catholic priest!) finally hit upon an idea that worked in 1817. What idea were they trying to capture? Why was it so hard? How come it took more than two thousand years (Zeno of Elea to Bolzano) to find the right idea?
I'll also discuss how the definition was reworked and distilled by many, many people- first its inventors, then mathematicians, then textbook writers- becoming more and more refined, and smaller and smaller, until what is left looks, to someone seeing it for the first time, like a small cold hard stone. It's only once you polish it (work it over in your mind, solve problems with it) and shine it under a bright light (make sense for yourself of all those nested quantifiers) that you can finally see it for what it is- a diamond.