A rigorous calculus textbook would probably help you "get into" Rudin. One example is Lang; Spivak is also much liked, and there's Hardy's Course of Pure Mathematics, whose first three editions are freely available and have lots of amusing exercises. However, Rudin covers material in chapters 9, 10, and 11 that isn't in Lang or Hardy (or, I assume, Spivak).
As for whether real analysis is a prerequisite for differential geometry: I'm afraid you do need mastery of the material in Rudin's chapter 9 (multidimensional derivatives, inverse functions, implicit functions) for modern differential geometry texts (e.g., Lee's Smooth Manifolds) to make sense to you. An alternative is a course of study that first treats 2- and 3-dimensional manifolds separately; a common text for this is Manfredo do Carmo's Differential Geometry of Curves and Surfaces.
Independently of all the above, for math learning you need to (i) learn theorem-proof style and (ii) work exercises, writing down the proofs in "rigorous" mathematician style.
Apostol's book is very good for first learning analysis. However, it leaves out a number of multivariable calculus topics, such as line and surface integrals, and vector analysis. These topics can be found in other books, including the same author's Calculus, Vol. 2.
Overall, Rudin's book has less content than Apostol's and less detailed proofs. The exercises in Rudin's book tend, more often than Apostol's, to require you to come up with ideas that are very different from those in the main text, or to perform more steps in a proof without hints. For some people, this is an advantage of Rudin, and for others a disadvantage.
I would say that Dieudonné's book is probably the best "reference", because it's very formal and systematic. (For example, the first definition given of the derivative is for a mapping between two Banach spaces.) It also discusses important results in the exercises. It is actually the first part of Dieudonné's nine-volume treatise on analysis. Because of its comprehensiveness, it wouldn't be a good first book to learn from for most people, with the exception of someone with very high ability and motivation.
You could also consider Zorich's two-volume book Mathematical Analysis. Generally, the first volume deals with differential and integral calculus in $\mathbf{R}$ and differential calculus in $\mathbf{R}^n$, and the second volume deals with various advanced topics. However, even the calculus material in the first volume is taught in a relatively advanced way (for example, using lim sup and lim inf to simplify proofs, or open and closed sets). This could be a good book if you want to both start analysis and learn multivariable calculus properly (i.e., with full proofs and difficult exercises).
Based on a very cursory glance at the book by Adams and Essex, I'd say that, compared to rigorous calculus books like those of Apostol and Spivak, it doesn't seem like great preparation for a course in analysis. There is much less theory, and the exercises are easier. So whether you'd be successful starting directly with Apostol's Mathematical Analysis depends a lot on you. If you find that it's difficult going, then you could try using a book like Ross's Elementary Analysis, which is intended for students who have little background with proofs.
In general, the push for rigor usually comes in response to an inability to prove the kinds of results one wishes to. It's usually relatively easy to demonstrate that there exist objects with certain properties, but you need precise definitions to prove that no such object exists. The classic example of this is non-computable problems and Turing machines. Until you sit down and say "this, precisely, and nothing else is what it means to be solved by computation," it's impossible to prove that something isn't a computation. So when people start asking "is there an algorithm that does $\ldots$?" for questions where the answer "should be" no, you suddenly need a precise definition. Similar things happened in real analysis.
In real analysis, as mentioned in an excellent comment, there was a shift in people's conception of what a function is. This broadened conception suddenly allowed a number of famous "counterexample" functions to be constructed, and these often require a reasonably rigorous understanding of the subject to construct or to analyze. The most famous is the everywhere continuous, nowhere differentiable Weierstrass function. Without a very precise definition of continuity and differentiability, demonstrating that this function is one and not the other is extremely hard. The quest for weird functions with unexpected properties, and unexpected combinations of properties, was one of the driving forces in developing precise conceptions of those properties.
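For reference, the Weierstrass function mentioned above is typically written (with Weierstrass's original conditions on the parameters) as:

```latex
% Weierstrass's everywhere-continuous, nowhere-differentiable function
W(x) = \sum_{n=0}^{\infty} a^n \cos\!\left(b^n \pi x\right),
\qquad 0 < a < 1,\quad b \text{ a positive odd integer},\quad ab > 1 + \tfrac{3\pi}{2}.
```

Each partial sum is smooth; continuity of the limit follows from uniform convergence (since $0 < a < 1$), while the condition $ab > 1 + \tfrac{3\pi}{2}$ makes the oscillations grow fast enough to destroy differentiability everywhere.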
Another topic that people were very interested in was infinite series. There are lots of weird results that can crop up if you're not careful with infinite series, as shown by a now-famous cautionary theorem:
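The quoted theorem did not survive in this excerpt; given the discussion that follows, it is almost certainly Riemann's rearrangement theorem, which can be stated as:

```latex
\textbf{Theorem (Riemann rearrangement).}
Let $\sum_{n=1}^{\infty} a_n$ be a conditionally convergent series of real
numbers. Then for every $L \in \mathbf{R} \cup \{-\infty, +\infty\}$ there is
a permutation $\sigma$ of the index set such that
\[
  \sum_{n=1}^{\infty} a_{\sigma(n)} = L .
\]
```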
This theorem means you have to be very careful when dealing with infinite sums; for a long time people weren't, and so they derived results that made no sense. Suddenly the usual free-wheeling algebraic manipulation approach to evaluating infinite sums was no longer okay, because sometimes such a manipulation changed the value of the sum. Instead, a more rigorous theory of series manipulation, along with concepts such as uniform and absolute convergence, had to be developed.
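The rearrangement phenomenon can be seen numerically. Here is a sketch in Python (function names are my own) that sums the alternating harmonic series $1 - \tfrac12 + \tfrac13 - \cdots = \ln 2$ in its natural order and in the classic "one positive, two negative" rearrangement, which converges to $\tfrac12 \ln 2$ instead:

```python
import math

def alternating_harmonic(n_terms):
    # Partial sum of 1 - 1/2 + 1/3 - 1/4 + ...  (converges to ln 2)
    return sum((-1) ** (k + 1) / k for k in range(1, n_terms + 1))

def rearranged(n_blocks):
    # Same terms, reordered as 1 - 1/2 - 1/4 + 1/3 - 1/6 - 1/8 + ...
    # Block k contributes +1/(2k-1) - 1/(4k-2) - 1/(4k); converges to (1/2) ln 2.
    total = 0.0
    pos, neg = 1, 2  # next odd denominator, next even denominator
    for _ in range(n_blocks):
        total += 1.0 / pos
        pos += 2
        total -= 1.0 / neg
        neg += 2
        total -= 1.0 / neg
        neg += 2
    return total

print(alternating_harmonic(100000))  # ~ ln 2      ~ 0.6931
print(rearranged(100000))            # ~ (1/2) ln 2 ~ 0.3466
```

Both sums use exactly the same terms; only the order differs, yet the limits disagree, exactly as the rearrangement theorem predicts for a conditionally (but not absolutely) convergent series.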
Here's an example of a problem surrounding an infinite product created by Euler:
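The example itself is missing from this excerpt; a representative one (my choice, not necessarily the one the author quoted) is Euler's product formula for the sine, which he used to solve the Basel problem:

```latex
\frac{\sin \pi x}{\pi x} = \prod_{n=1}^{\infty} \left(1 - \frac{x^2}{n^2}\right),
\qquad\text{whence, comparing coefficients of } x^2,\qquad
\sum_{n=1}^{\infty} \frac{1}{n^2} = \frac{\pi^2}{6}.
```

Euler obtained the product by treating $\sin \pi x$ as a "polynomial" with roots at the integers; justifying that step rigorously requires tools (such as the Weierstrass factorization theorem) that were developed only much later.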
Questions like this were very popular in the 1800s, when mathematicians were notably obsessed with infinite products and summations. However, most questions of this form require a very sophisticated understanding of analysis to handle (and weren't handled particularly well by the tools of the previous century).