[Edit: A bug in the original proof has been fixed, thanks to a comment by Francois Dorais.]
The answer is no. This kind of thing can be proved by what I call a "gas tank" argument. First, enumerate all Turing machines in some manner as $N_1, N_2, N_3, \ldots$. Then construct a sequence of Turing machines $M_1, M_2, M_3, \ldots$ as follows. On an input of length $n$, $M_i$ simulates $N_i$ (on empty input) for up to $n$ steps. If $N_i$ does not halt within that time, then $M_i$ halts immediately after the $n$th step. However, if $N_i$ halts within the first $n$ steps, then $M_i$ "runs out of gas" and starts behaving erratically, which in this context means (say) that it continues running for $n^e$ steps before halting, where $e$ is the number of steps that $N_i$ took to halt.
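The construction of $M_i$ can be sketched in Python. Here `simulate` is a hypothetical step-bounded universal simulator (not a real library function): `simulate(i, steps)` runs $N_i$ on empty input for up to `steps` steps, returning the number of steps $N_i$ took to halt, or `None` if it did not halt in time.

```python
def M_i(i, x, simulate):
    """Sketch of the i-th constructed machine, run on input x."""
    n = len(x)
    e = simulate(i, n)        # did N_i (on empty input) halt within n steps?
    if e is None:
        return "halt"         # no: halt right away, so M_i runs in time ~n
    for _ in range(n ** e):   # yes, in e steps: "run out of gas" and idle
        pass                  # for n^e steps before halting
    return "halt"
```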
Now if we had a program $P$ that could compute a polynomial upper bound on any polytime machine, then we could determine whether $N_i$ halts by calling $P$ on $M_i$, reading off the exponent $e$, and simulating $N_i$ for (at most) $e$ steps. If $N_i$ doesn't halt by then, then we know it will never halt: if $N_i$ halted in some number of steps $s > e$, then $M_i$ would run for $n^s$ steps on inputs of length $n$, exceeding the bound $n^e$ for all sufficiently large $n$.
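The reduction itself is short. In the sketch below, both helpers are hypothetical: `P` is the assumed bound-computer (returning the exponent $e$ of a correct polynomial bound $n^e$ on its argument's running time), and `simulate(i, steps)` reports whether $N_i$ halts within `steps` steps.

```python
def decides_halting(i, code_of_M_i, P, simulate):
    """Decide whether N_i halts, given the (impossible) bound-computer P."""
    e = P(code_of_M_i)       # exponent of a polynomial bound n^e on M_i
    return simulate(i, e)    # N_i halts iff it halts within e steps
```

Since the halting problem is undecidable, no such $P$ can exist.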
Of course, this proof technique is very general; for example, $M_i$ can be made to simulate any fixed machine $M$ as long as it has gas, but do something else when it runs out of gas. This proves that it is undecidable whether an arbitrary given machine behaves like $M$.
No, we can construct a computable tree with no computable paths such that there is a probabilistic Turing machine which with nonzero probability constructs a path.
The basic idea is this: kill off a small-measure subset of $2^\omega$ containing all computable reals, and pick our branch completely at random (so that our probability of success is exactly the measure of the complement of the set we kill off).
Here's one way to do it. Call a sequence $\sigma\in 2^{<\omega}$ dangerous if for some $e$, we have $\Phi_e(n)[\vert\sigma\vert]\downarrow=\sigma(n)$ for all $n<4^e$. Note that the set of dangerous strings is computable, and that every infinite computable sequence extends a dangerous string.
Now let $T$ be the tree of all non-dangerous strings, and use the probabilistic Turing machine which simply picks a random extension without thinking at all. Since the function $e\mapsto 4^e$ grows sufficiently fast, we have a positive probability of getting a path. However, $T$ clearly has no computable paths.
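The dangerousness test is indeed computable given a step-bounded evaluator. Here is a minimal sketch, where `phi(e, n, steps)` is an assumed helper that runs $\Phi_e(n)$ for up to `steps` steps and returns its output, or `None` if it has not yet converged:

```python
def is_dangerous(sigma, phi):
    """Test whether the bit string sigma (a list of 0/1) is dangerous.

    phi(e, n, steps) is a hypothetical step-bounded evaluator for the
    e-th partial computable function.
    """
    steps = len(sigma)                 # the bound |sigma| from the definition
    e = 0
    while 4 ** e <= len(sigma):        # only finitely many e can matter
        if all(phi(e, n, steps) == sigma[n] for n in range(4 ** e)):
            return True
        e += 1
    return False
```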
Re: question $2$, I don't have an answer but here's a quick observation: we can defeat the coin-flipping algorithm. (I suspect the answer to 2 is yes, by diagonalizing against probabilistic Turing machines - kill nodes to shrink the measure of branches built by a given machine - but I don't immediately see the details.)
Say $\sigma$ is risky if either:

- for some $e$, we have $\Phi_e(2^n)[\vert\sigma\vert]\downarrow=\sigma(2^n)$ for all $n<4^e$, or
- for some $m$ not of the form $2^k$, we have $\sigma(m)\not=1$ (say).
Again, take the tree of non-risky strings. Note that every computable sequence extends a risky string, so this tree has no computable paths, and every sequence with only non-risky initial segments has asymptotic-density-$1$ many $1$s, so the set of paths is null.
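The riskiness test can be sketched similarly. As before, `phi(e, n, steps)` is a hypothetical step-bounded evaluator for the $e$-th partial computable function, returning its output or `None` on non-convergence:

```python
def is_risky(sigma, phi):
    """Test whether the bit string sigma (a list of 0/1) is risky."""
    # Condition 2: some bit at a position not of the form 2^k is not 1.
    for m in range(len(sigma)):
        power_of_two = m > 0 and m & (m - 1) == 0
        if not power_of_two and sigma[m] != 1:
            return True
    # Condition 1: some phi_e predicts sigma at positions 2^n for all n < 4^e.
    steps = len(sigma)
    e = 0
    while 2 ** (4 ** e - 1) < len(sigma):   # largest position needed: 2^(4^e - 1)
        if all(phi(e, 2 ** n, steps) == sigma[2 ** n] for n in range(4 ** e)):
            return True
        e += 1
    return False
```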
Best Answer
EDIT: This was in a comment below, but I now think it should be part of the main answer:
There are two different ways to ask the question in the OP:
1. Is there a real number $r$ such that no polytime algorithm computes all the bits of $r$?
2. Is there a real number $r$ such that no individual bit of $r$ can be computed by a polytime algorithm?
The former is the question I answer below; the latter is trivial! Given any real $r$, and any $n$, there is a polytime (indeed, constant time) algorithm $p_{r, n}$ which computes the first $n$ bits of $r$ correctly. (Consider the silly algorithm which has the string $\langle r(0), r(1), \ldots, r(n)\rangle$ "hard-coded" in - on input $k$ for $k\le n$ this algorithm outputs $r(k)$, and on input $k$ for $k>n$ it outputs $0$ (say).) So in order to get anything interesting, we need to look at algorithms which attempt to compute all the bits simultaneously.
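For concreteness, the silly algorithm can be sketched in Python; the hard-coded prefix bits below are purely illustrative:

```python
def make_silly_algorithm(prefix):
    """Return a constant-time algorithm with r(0), ..., r(n) hard-coded."""
    def algorithm(k):
        return prefix[k] if k < len(prefix) else 0   # output 0 past the table
    return algorithm

p = make_silly_algorithm([1, 0, 1])   # pretend r starts 1, 0, 1, ...
```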
Note that noncomputable reals trivially satisfy the first question, so the right question to ask is: is there a *computable* real $r$ such that no polytime algorithm computes all the bits of $r$?
Here's an explicit construction of a computable real which is not computable in polynomial time:
Let $\{\varphi_e\}$ list all (partial) computable functions, and $\{p_e\}$ all polynomials.
We define a real number $r$ as follows: the $n$th bit of the binary expansion of $r$ is a $1$ iff $\varphi_i(n)$ does not halt and output $1$ in $\le p_j(n)$ steps (so, either doesn't halt in that time, or does halt and outputs something $\not=1$) - where $n=\langle i, j\rangle$. (Here "$\langle\cdot,\cdot\rangle$" denotes the Cantor pairing function.)
$r$ is computable (note that we run each computable function for only a bounded amount of time before deciding what to do for that bit), but $r$ is not computable in polynomial time since it diagonalizes against all polynomial-time computations: if $\varphi_i$ has running time bounded by $p_j$, then $\varphi_i(\langle i, j\rangle)\not=r(\langle i, j\rangle)$.
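The construction can be sketched as follows, with two loudly hypothetical ingredients: `phi(i, n, steps)` is an assumed step-bounded evaluator for the $i$-th partial computable function (returning its output, or `None` if it has not halted within `steps` steps), and $p_j(n) = n^j + j$ is one illustrative choice of enumeration of polynomial bounds.

```python
def cantor_unpair(n):
    """Invert the Cantor pairing function: n -> (i, j)."""
    w = int(((8 * n + 1) ** 0.5 - 1) // 2)
    t = w * (w + 1) // 2
    j = n - t
    return w - j, j

def bit_of_r(n, phi):
    """The n-th bit of r: 1 unless phi_i(n) outputs 1 within p_j(n) steps."""
    i, j = cantor_unpair(n)
    bound = n ** j + j                # p_j(n), an illustrative choice
    return 0 if phi(i, n, bound) == 1 else 1
```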
Note that nothing special about polynomials was used here: given any reasonable complexity class (for instance, any complexity class of the form "Running time bounded by some $f\in\mathcal{F}$" for $\mathcal{F}$ a computable set of computable total functions), there is a computable real whose bits are not computable by any algorithm in that class.
Going back to the second, "trivial" version of the question, it can actually be "de-trivialized" in an interesting way: look at how hard it is to get successively longer approximations to $r$. That is, the silly algorithm I describe which gets the first $n$ bits correctly has length $\approx n$, since it hard-codes those bits; can we do better?
This question forms the basis of Kolmogorov complexity. Roughly speaking, given a real $r$, let $K^r$ be the function whose $n$th value is the length of the shortest Turing machine which computes the first $n$ bits of $r$ correctly. Then we can ask (even for noncomputable $r$!): how quickly does $K^r$ grow? If $r$ is computable, then $K^r$ is eventually constant; if $r$ is not computable, however, things get really interesting. (See e.g. the notion of K-trivials, which are reals which are "almost computable" in the sense of Kolmogorov complexity.)
Now, this isn't quite what you're asking about - you'd want to look at $K_{poly}^r$, the function whose $n$th value is the length of the shortest polytime Turing machine which computes the first $n$ bits of $r$ correctly. This, and other resource-bounded variants of Kolmogorov complexity, don't appear to be as well studied - but see this article.