The conditional distribution you are looking for is the following
$$\mathbb{P}\Big[X_i=k \,\Big|\, \sum_{j=1}^n X_j=r\Big] =
\begin{cases}
\frac{n-r}{n}, & \text{if } k=0, \\
\frac{r}{n}, & \text{if } k=1.
\end{cases}$$
This is straightforward to prove. Without loss of generality, consider $X_1$ (all the variables have the same conditional distribution):
$$\mathbb{P}\Big[X_1=0\,\Big|\,\sum_{i=1}^nX_i=r\Big]=\frac{\mathbb{P}[X_1=0;\sum_{i=1}^nX_i=r]}{\mathbb{P}[\sum_{i=1}^nX_i=r]}=$$
$$=\frac{\mathbb{P}[X_1=0;\sum_{i=2}^nX_i=r]}{\mathbb{P}[\sum_{i=1}^nX_i=r]}=\frac{\mathbb{P}[X_1=0]\mathbb{P}[\sum_{i=2}^nX_i=r]}{\mathbb{P}[\sum_{i=1}^nX_i=r]}=$$
$$\frac{(1-p)\binom{n-1}{r}p^r(1-p)^{n-1-r}}{\binom{n}{r}p^r(1-p)^{n-r}}=\frac{(n-1)!r!(n-r)!}{r!(n-1-r)!n!}=\frac{n-r}{n}$$
A similar computation gives $\mathbb{P}[X_1=1\mid\sum_{i=1}^nX_i=r]=\frac{r}{n}$.
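As a quick numerical sanity check, here is a minimal Monte Carlo sketch in Python (the values of `n`, `p`, `r` are arbitrary illustrative choices) that estimates the conditional law of $X_1$ given $\sum_{i=1}^n X_i=r$ and compares it with $\frac{n-r}{n}$ and $\frac{r}{n}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, r = 10, 0.3, 4                        # illustrative parameters
trials = 200_000

X = rng.binomial(1, p, size=(trials, n))    # each row: n iid Bernoulli(p)
mask = X.sum(axis=1) == r                   # condition on the sum being r
x1 = X[mask, 0]                             # X_1 restricted to that event

print("P[X_1=0 | sum=r]:", (x1 == 0).mean(), "theory:", (n - r) / n)
print("P[X_1=1 | sum=r]:", (x1 == 1).mean(), "theory:", r / n)
```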
The answer is no.
Consider the probability space $(\Omega, \mathcal{F}, \mathbb{P})$, where $\Omega = \{0,\ldots,n\}$, $\mathcal{F} = \mathcal{P}(\Omega)$ and $\mathbb{P}: \mathcal{F} \to [0,1]$ defined as $\mathbb{P}(A) = \sum\limits_{i \in A} \binom ni p^i(1-p)^{n-i}$ for every $A \in \mathcal{F}$.
Consider now $X: \Omega \to \{0,\ldots, n\}$ as $X(i)=i$ (the identity function).
The distribution of $X$ is clearly binomial.
Suppose now that $X = X_1 + \ldots + X_n$ with $X_i$ iid with law Bernoulli of parameter $p$, all defined on $(\Omega, \mathcal{F}, \mathbb{P})$.
This would mean that there is $A \in \mathcal{F}$ such that $\mathbb{P}(A) = p$ (take, for instance, $A = \{X_1 = 1\}$).
And this must be true for every $p \in (0,1)$.
Consider the map $F: (0,1) \to \mathcal{P}(\{0,\ldots,n\})$ that for a given $p$ returns such a set $A$. Since the domain is infinite and the codomain is finite, there must be a set $A \subseteq \{0,\ldots,n\}$ such that for infinitely many $p$ we have $$\sum\limits_{i \in A} \binom ni p^i(1-p)^{n-i} = p$$
Therefore $\sum\limits_{i \in A} \binom ni x^i(1-x)^{n-i} = x$ for every real $x$: two polynomials of degree at most $n$ that agree at infinitely many points are identical.
Now, for example with $n=2$, the LHS can only equal one of $0,\;1,\;x^2,\;(1-x)^2,\;2x(1-x),\;1-x^2,\;1-(1-x)^2,\;1-2x(1-x)$ (one for each of the eight subsets $A$), none of which is the polynomial $x$.
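This last step can also be checked by brute force. Here is a small sketch (assuming `sympy` is available) that enumerates every subset $A \subseteq \{0,\ldots,n\}$ for $n=2$ and verifies that none of them satisfies the polynomial identity:

```python
from itertools import chain, combinations

import sympy as sp

x = sp.symbols("x")
n = 2
pmf = [sp.binomial(n, i) * x**i * (1 - x) ** (n - i) for i in range(n + 1)]

# all subsets A of {0, ..., n}
subsets = chain.from_iterable(combinations(range(n + 1), k) for k in range(n + 2))
for A in subsets:
    lhs = sp.expand(sum(pmf[i] for i in A))
    if sp.simplify(lhs - x) == 0:
        print("identity holds for A =", A)
        break
else:
    print("no subset A satisfies sum_{i in A} C(n,i) x^i (1-x)^(n-i) = x")
```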
Best Answer
In general, no. As @Did pointed out, we can construct a $\operatorname{Bin}(n,p)$ random variable on a probability space with $n+1$ elements. Consider (with $n=2$) $\Omega = \{0,1,2\}$ with \begin{align}\mathbb P(\{0\})&=(1-p)^2\\\mathbb P(\{1\})&= 2p(1-p)\\ \mathbb P(\{2\})&=p^2 \end{align} and $X(\omega)=\omega$. If $X=X_1+X_2$ where $X_1, X_2$ take values in $\{0,1\}$ then from \begin{align} 0 &= X(0) = X_1(0) + X_2(0)\\ 2 &= X(2) = X_1(2) + X_2(2) \end{align} we see that $X_1(0)=X_2(0)=0$ and $X_1(2)=X_2(2)=1$. Since $X_1(1)+X_2(1)=1$, one of the events $\{X_1=0,\,X_2=1\}$ and $\{X_1=1,\,X_2=0\}$ is empty, yet in either case both of the corresponding marginal events have positive probability (because $\mathbb P(\{0\})>0$ and $\mathbb P(\{2\})>0$), so $X_1$ and $X_2$ cannot be independent.
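The same obstruction can be verified exhaustively in a few lines of Python (the value `p = 0.3` is an arbitrary illustrative choice): enumerate every $\{0,1\}$-valued pair $(X_1,X_2)$ on $\Omega=\{0,1,2\}$ with $X_1+X_2=X$ and test independence of the events $\{X_1=1\}$ and $\{X_2=1\}$:

```python
from itertools import product

p = 0.3                                             # any p in (0,1); illustrative
P = {0: (1 - p) ** 2, 1: 2 * p * (1 - p), 2: p**2}  # law of X on Omega = {0,1,2}

def prob(event):
    """Probability of a set of outcomes."""
    return sum(P[w] for w in event)

# X(w) = w; enumerate all {0,1}-valued pairs (X1, X2) with X1 + X2 = X
for x1, x2 in product(product((0, 1), repeat=3), repeat=2):
    if all(x1[w] + x2[w] == w for w in range(3)):
        A = {w for w in range(3) if x1[w] == 1}     # event {X1 = 1}
        B = {w for w in range(3) if x2[w] == 1}     # event {X2 = 1}
        joint, prod = prob(A & B), prob(A) * prob(B)
        print(f"X1={x1}, X2={x2}: joint={joint:.4f}, "
              f"product={prod:.4f}, independent={abs(joint - prod) < 1e-12}")
```

The two pairs that survive the sum constraint both report `independent=False`, in line with the argument above.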
An example where the answer is "yes" is given by $\Omega = \{H,T\}^n$ (i.e. the outcomes are $n$ tosses of a coin), $$\mathbb P(\{\omega\}) = p^j (1-p)^{n-j},\; \omega\in\Omega,$$ where $j$ is the number of "heads" in $\omega$, and $X$ is again the number of "heads" in $\omega$. For $\omega = (\omega_1,\ldots,\omega_n)$, define $X_j$ by $X_j(\omega) = \mathsf 1_{\{\omega_j=H\}}$. It follows from the binomial theorem that $$\mathbb P(X_j=1) = p = 1 - \mathbb P(X_j=0),\; j=1,2,\ldots,n, $$ and by construction, $$X=\sum_{j=1}^n X_j$$ with the $X_j$ independent.
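A short simulation of this coin-toss construction (parameters again illustrative) checks both that each coordinate $X_j$ is Bernoulli($p$) and that their sum follows the $\operatorname{Bin}(n,p)$ pmf:

```python
from math import comb

import numpy as np

rng = np.random.default_rng(1)
n, p, trials = 5, 0.4, 200_000                   # illustrative parameters

# each omega is a row of n independent coin tosses (True = heads, prob. p)
Xj = (rng.random((trials, n)) < p).astype(int)   # coordinate indicators X_j
X = Xj.sum(axis=1)                               # X = X_1 + ... + X_n

print("P(X_j = 1) estimates:", Xj.mean(axis=0))  # each should be close to p
for k in range(n + 1):                           # X should match Binomial(n, p)
    print(k, round((X == k).mean(), 4),
          round(comb(n, k) * p**k * (1 - p) ** (n - k), 4))
```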