Not really. For one thing, before you can "add", "subtract", "multiply", and "divide", you have to have something that you are "adding", "subtracting", "multiplying", or "dividing". So that "something" has to precede these operations. And for another, there are many notions that cannot really be expressed in these terms. To take an example that is familiar if you've taken calculus, the idea of "limit" is not one that can be expressed in terms of adding, subtracting, multiplying, or dividing, but rather is expressed in terms of "approaching", or of being "arbitrarily close to". (This is one reason why Newton and Leibniz are so rightly hailed as revolutionaries: the ideas of calculus were very much unlike what mathematics had been up to that point.)
However, one has to start somewhere. Historically, mathematics started from different starting points in different cultures; counting was certainly one of the first things to come up. But some cultures then realized that they needed more general ideas, ones that went beyond simply counting (with its addition, multiplication, etc.). The Greeks settled on geometry as their underlying structure, and in fact translated everything about numbers into statements about geometry: lines, points, circles, squares, etc. Other cultures did not.
These days we usually take the route of axiomatic theories, where we have some "primitive notions" that are not defined, and some rules about how we can talk about them and their properties, and then we try to build everything on top of that. One can start from the natural numbers (the "counting numbers") using, for instance, the Peano axioms, and start building from there. Then you are actually starting before you get to "addition", "multiplication", "subtraction", and "division", with only the notion of "number", "next one", and induction. Or we can have other starting points.
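To make the Peano-style starting point concrete, here is a minimal sketch (an illustration, not the axioms themselves, which are a logical theory): with only "zero" and "next one" (successor) as primitives, addition and multiplication can be *defined* by recursion, rather than assumed. All names here (`Nat`, `Zero`, `Succ`, etc.) are invented for the example.

```python
# Sketch: naturals from "zero" and "successor" alone; "+" and "*" are defined,
# not primitive. Recursion here mirrors definition by induction.
from dataclasses import dataclass

class Nat:
    pass

@dataclass(frozen=True)
class Zero(Nat):
    pass

@dataclass(frozen=True)
class Succ(Nat):
    pred: Nat  # the number this one is the successor of

def add(a: Nat, b: Nat) -> Nat:
    # a + 0 = a;  a + S(b) = S(a + b)
    if isinstance(b, Zero):
        return a
    return Succ(add(a, b.pred))

def mul(a: Nat, b: Nat) -> Nat:
    # a * 0 = 0;  a * S(b) = (a * b) + a
    if isinstance(b, Zero):
        return Zero()
    return add(mul(a, b.pred), a)

def to_int(n: Nat) -> int:
    # Count the successors, purely for readable output.
    k = 0
    while isinstance(n, Succ):
        n, k = n.pred, k + 1
    return k

two = Succ(Succ(Zero()))
three = Succ(two)
print(to_int(add(two, three)))  # 5
print(to_int(mul(two, three)))  # 6
```

The point of the sketch is the direction of definition: the operations are built on top of the primitives, not the other way around.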
Now, today, there is usually one of two starting points: Set Theory, and Category Theory. Both presume that we agree on the basics of logic (the meaning of things like "and", "or", "if... then...", and "not", the meaning of "is equal to", and things like that), and on the rules of inference, which are rules that let you go from some statements to others; for instance, the rule that says that if you know that A implies B, and you also know that A is true, then you know that B is true.
In Set Theory, you start with the idea of "set" and "element of"; these ideas are not actually defined, you just give some axioms that describe things you can say about them. From these axioms one can define numbers, addition, multiplication, etc. And also build other notions that are orthogonal to natural numbers and their usual operations.
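As an illustration of "defining numbers from sets", here is a sketch of the standard von Neumann construction, where $0$ is the empty set and $n+1$ is $n \cup \{n\}$, modeled with Python's immutable `frozenset` (the function name `von_neumann` is mine, chosen for the example):

```python
# Sketch of the von Neumann naturals: 0 = {}, n+1 = n ∪ {n}.
# Each number ends up being exactly the set of all smaller numbers.
def von_neumann(k: int) -> frozenset:
    n = frozenset()          # 0 is the empty set
    for _ in range(k):
        n = n | {n}          # successor: n ∪ {n}
    return n

three = von_neumann(3)
print(len(three))                     # 3: a number has itself-many elements
print(von_neumann(2) in three)        # True: 2 ∈ 3 (membership is "less than")
print(von_neumann(2) < three)         # True: 2 ⊂ 3 (also a proper subset)
```

Notice that "less than" falls out of the construction for free, as set membership.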
In Category Theory, you start with the idea of "objects" and "maps" between objects (again, not defined; you only get some axioms that tell you how you can talk about them); then you can use these notions to construct everything else in terms of them.
So, rather, in modern mathematics everything is either "sets all the way down" or "categories all the way down". If there is one common set of presuppositions, it's just basic logic.
The key point is that "proof" requires you to at least understand what theory you are assuming.
Both the Riemann Hypothesis and "$T$ is consistent" (where $T$ is any recursively enumerable theory) are $\Pi^0_1$ statements. This means that if there is a counterexample, then the statement is disprovable: the counterexample can be verified by a finite computation. Or, taking the contrapositive: if the statement is not disprovable, then it is true.
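The shape of a $\Pi^0_1$ statement is "for every $n$, $P(n)$", where $P(n)$ is mechanically checkable. Here is a sketch using Goldbach's conjecture (another well-known $\Pi^0_1$ statement) rather than RH itself, since RH needs more machinery to phrase this way; the function names are mine:

```python
# Sketch: why a false Π⁰₁ statement is disprovable. The statement is
# "for every n, P(n)" with P(n) checkable by a finite computation, so a
# counterexample, if one exists, would eventually be found and verified.

def is_prime(n: int) -> bool:
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def goldbach_holds(n: int) -> bool:
    # P(n): if n is even and >= 4, it is a sum of two primes
    # (vacuously true for all other n).
    if n < 4 or n % 2:
        return True
    return any(is_prime(p) and is_prime(n - p) for p in range(2, n // 2 + 1))

def find_counterexample(bound: int):
    # Search up to a finite bound. Any n returned here would be a genuine,
    # finitely verifiable disproof of the whole Π⁰₁ statement.
    for n in range(bound):
        if not goldbach_holds(n):
            return n
    return None

print(find_counterexample(10_000))  # None: no counterexample below 10000
```

The asymmetry is the whole point: falsity is witnessed by one finite check, but no finite search can establish truth, which is why such statements can be true yet unprovable in a given theory.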
But what does "true" mean? In the context of arithmetic, "true" means "true in the natural numbers". In other contexts, it is usually meaningless, or synonymous with "provable".
Yes. "$\sf PA$ is consistent" is a $\Pi^0_1$ statement, but we can also prove that it is a true statement, since $\Bbb N$ exists and is a model of $\sf PA$. Or at least that is the case if your metatheory is sufficiently strong (e.g. set theory à la $\sf ZFC$). The point is that $\sf PA$ itself cannot prove that $\sf PA$ is consistent, and that $\sf ZFC$ cannot prove that $\sf ZFC$ is consistent.
The difference between that and the Riemann Hypothesis is that RH does not talk about the consistency of any theory. So there is no natural candidate for a theory which is insufficient to prove the Riemann Hypothesis.
It is possible for a $\Pi^0_1$ statement to turn out to be equivalent to "$\sf ZFC$ is consistent", in which case it cannot be proved in $\sf ZFC$. But it would be a mind-shattering surprise to modern mathematics if RH turned out to be such a statement. So what we expect is that RH is either outright refutable in a weak theory like $\sf PA$ (if it is false), or else true in $\Bbb N$, which is "the thing we care about" for the Riemann Hypothesis.
Apples and oranges are actually a rather bad example. The reason why it doesn't make sense to add quantities with different dimensions, but it does make sense to multiply (or divide) them, is scale invariance.
Let $U$ be the unit of some quantity $u$, and $V$ the unit of another quantity $v$. Now say we change the scale of $U$, i.e. we instead use a different unit $U'$ such that $1U = 10U'$. For $V$ we do the same, only that there we choose $V'$ such that $1V = 5V'$. If we compute the sum $s$ of $u$ and $v$ in units $U$ and $V$, we get $$ s = u + v $$ If, instead, we compute the sum in units $U'$ and $V'$, we get $$ s' = 10\cdot u + 5\cdot v $$ Note that $s$ and $s'$ don't just differ by a constant factor, i.e. we can't convert $s$ from unit $U+V$ to $s'$ in unit $U'+V'$ without knowing the original values of $u$ and $v$.
Compare this to the situation of a product. If we compute the product $p$ of $u$ and $v$ in units $U$ and $V$, we get $$ p = u\cdot v $$ If, instead, we compute it in units $U'$ and $V'$, we get $$ p' = (10\cdot u) \cdot (5\cdot v ) = 50\cdot p \text{.} $$ So $p'$ is simply $p$, expressed in a different unit $P' = U'V'$, with $1P = 1UV = 50U'V' = 50P'$.
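The sum-versus-product asymmetry above can be checked numerically. In this sketch the values $u = 3$ and $v = 7$ are arbitrary example numbers, and the conversion factors 10 and 5 are the ones from the text:

```python
# Rescale units as in the text (1 U = 10 U', 1 V = 5 V') and compare how
# the sum and the product transform.
u, v = 3.0, 7.0          # values in units U and V (arbitrary examples)
u2, v2 = 10 * u, 5 * v   # the same physical quantities in units U' and V'

s, s2 = u + v, u2 + v2   # sums in old and new units
p, p2 = u * v, u2 * v2   # products in old and new units

print(p2 / p)  # 50.0 -- a fixed factor, independent of u and v: a unit conversion
print(s2 / s)  # 6.5  -- depends on u and v, so it is NOT a unit conversion
```

Changing $u$ and $v$ leaves the first ratio at exactly 50 but moves the second anywhere between 5 and 10, which is the scale-invariance failure the argument describes.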
So why do you want scale invariance? We want it because the scale of physical units is usually completely arbitrary. There's nothing fundamental about 1 meter, or 1 inch, or 1 volt; we just picked some reference value. But since the reference value is arbitrary, the actual physics must not change if we replace it by a different one. And it doesn't, so long as we only multiply and divide, but never add or subtract, values with different units, as the example above shows.
And this is also why apples and oranges are a bad example. We don't expect scale invariance for these, because apples and oranges are discrete objects, so there's a canonical definition of what "1 apple" means. So adding apples and oranges makes perfect sense, and we may, e.g., assign the result the unit "fruits".