You're right: there is no closed form, but there is a generating function solution.
In combinatorics, the objects you are counting are called partitions, which are ways of writing one number as a sum of positive integers.
Interpret $n_k $ as the number of times the integer $k $ appears in the sum. Then
$$ \sum_{k =1}^E k \cdot n_k = E $$
means that the numbers sum up to $E $, so we are interested in partitions of $E $ (note that we can ignore $k=0$ here as it does not affect this sum).
The first condition, $\sum_{k=0}^E n_k = N $, or equivalently $\sum_{k=1}^E n_k \le N $, means that we are interested in partitions of $E $ with at most $N $ parts.
Now unfortunately there is no closed solution to this problem. There is, however, a generating function. We first make a transformation.
We claim that the number of partitions of $E $ into at most $N $ parts is the same as the number of partitions of $E $ into parts of size at most $N $. To see this, let $m_j = \sum_{k=j}^E n_k $ count the number of parts of size at least $j $. Clearly $m_j=0$ for $j>N $, and
$$ \sum_j m_j = \sum_j \sum_{k \ge j} n_k = \sum_k \sum_{j \le k} n_k = \sum_k k \cdot n_k = E, $$
so given our partition $(k \cdot n_k)_k $ [that is, $n_k$ copies of $k$ as $k$ ranges from $1$ to $E$] of $E $ with at most $N $ parts, we have formed a partition $(m_j)_j $ of $E $ with parts of size at most $N $. One can check that this is in fact a bijection, so we can count either type of partition.
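To see the bijection concretely, here is a short Python sketch (the function names are mine, purely for illustration). It enumerates the partitions of $E$ with at most $N$ parts, conjugates each one by computing $m_j = \sum_{k \ge j} n_k$, and checks that the images are distinct partitions with parts of size at most $N$:

```python
def partitions_at_most_n_parts(E, N):
    """Enumerate partitions of E as weakly decreasing tuples with at most N parts."""
    def gen(remaining, max_part, parts_left):
        if remaining == 0:
            yield ()
        elif parts_left > 0:
            for first in range(min(remaining, max_part), 0, -1):
                for rest in gen(remaining - first, first, parts_left - 1):
                    yield (first,) + rest
    return list(gen(E, E, N))

def conjugate(parts):
    """m_j = number of parts of size >= j (the transposed Young diagram)."""
    largest = parts[0] if parts else 0
    return tuple(sum(1 for p in parts if p >= j) for j in range(1, largest + 1))

E, N = 8, 3
small = partitions_at_most_n_parts(E, N)     # partitions with at most N parts
images = {conjugate(p) for p in small}       # their conjugates
assert all(q[0] <= N for q in images if q)   # parts of size at most N
assert len(images) == len(small)             # conjugation is injective here
```

Running this for other small $E$ and $N$ gives the same agreement, which is a good sanity check even though it is not, of course, a proof.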
Now for the generating function. A partition represents the (unordered) sum $m_1 + m_2 + ... + m_n = E$, where $n$ is the largest integer for which $m_n \ge 1$. One convenient way of representing such sums is through polynomial multiplication: $x^{m_1} \cdot x^{m_2} \cdot ... \cdot x^{m_n} = x^{E} $.
When multiplying out these monomials, we can group them together by their powers. So the monomials coming from parts of size $1$ can be collected into the factor $x^{1 \cdot 0} + x^{1 \cdot 1} + x^{1 \cdot 2} + ... + x^{1 \cdot s_1} + ... $, where in $1 \cdot s_1$, the $1$ indicates we are dealing with parts of size $1$, and the $s_1$ keeps track of how many there are. The cool thing is that this is a geometric series, so we have
$$ x^{1 \cdot 0} + x^{1 \cdot 1} + x^{1 \cdot 2} + ... + x^{1 \cdot s_1} + ... = \frac {1}{1-x^1}. $$
We can do this for every other part size, and the factor corresponding to the parts of size $j$ will be $\frac {1}{1-x^j} $.
Finally, to get the generating function, we multiply these factors together, because remember that adding the parts corresponds to multiplying the monomials. Since we only want parts of size up to $N $, we take the product over $1 \le j \le N $, giving
$$ F_N (x) = \prod_{j=1}^N (1-x^j)^{-1}. $$
This gives a power series, and the coefficient of $x^E $ counts the partitions of $E $ into parts of size at most $N $ (or, equivalently, with at most $N $ parts), which is what you are looking for.
Even more finally, what good is a generating function? Well, it gives a systematic way to approach these problems. Note that the function $F_N $ doesn't just encode the solution to your problem, but to the problem for every possible value of $E $, all at once! In practical terms, it gives an algorithm for finding the solution, since the power series can be computed efficiently. We humans can also use analytic tools to extract the asymptotics of the solution, although that is not something I can do off the top of my head, so I'll leave that part to someone else.
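As a sketch of the "computed efficiently" claim (my own illustration, not part of the original answer): multiplying a power series by $1/(1-x^j) = 1 + x^j + x^{2j} + \dots$ amounts to the simple in-place recurrence `coeff[e] += coeff[e - j]`, so all coefficients of $F_N$ up to $x^E$ come out in $O(NE)$ time:

```python
def partition_counts(E, N):
    """Coefficients of F_N(x) = prod_{j=1}^{N} 1/(1-x^j), truncated at x^E.

    coeff[e] = number of partitions of e into parts of size at most N
             = number of partitions of e with at most N parts.
    """
    coeff = [0] * (E + 1)
    coeff[0] = 1  # the empty partition of 0
    for j in range(1, N + 1):
        # Multiply the truncated series by 1/(1-x^j).
        for e in range(j, E + 1):
            coeff[e] += coeff[e - j]
    return coeff

print(partition_counts(8, 3))  # [1, 1, 2, 3, 4, 5, 7, 8, 10]
```

Note that one run indeed answers the question for every $E' \le E$ at once, exactly as the generating-function viewpoint promises.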
Here is a guide to understanding the binomial coefficient $t_i$.
The binomial in $t_i$ is basically the problem of arranging a word of $n_i+g_i-1$ letters, of which $n_i$ letters are, say, A (the bosons) and $g_i-1$ letters are, say, B (the bars that delimit the $g_i$ boxes). What you have then is a Mississippi problem (very well worked out in the early parts of combinatorics courses and books). Martin's book Counting: The Art of Enumerative Combinatorics can help you to understand these concepts.
Now to the problem. You can form $N!$ words from $N$ distinct letters. Consider now that the letters are not all different, and in fact that you have only two types of letters, say $n_A$ $A$'s and $n_B$ $B$'s such that $n_A+n_B=N$. Then $N!$ is an overcounting of all the possible distinct words that you can form (try this with, say, $3$ $A$'s and $2$ $B$'s so that you can convince yourself of this fact). By the division principle, the total number of words of length $N$ that you can form with $n_A$ $A$'s and $n_B$ $B$'s is
$$
\frac{(n_A+n_B)!}{n_A!n_B!}.
$$
Now it is easy to see that there are $n_A$ bosons and $n_B+1$ boxes (you need essentially $n_B$ "bars" to form $n_B+1$ boxes).
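You can confirm the division principle by brute force (a quick sketch, using the same $3$ $A$'s and $2$ $B$'s example suggested above): generate all $5!$ orderings of the letters, collapse duplicates with a set, and compare against the formula.

```python
from itertools import permutations
from math import factorial

def distinct_words(n_A, n_B):
    """Count distinct words with n_A A's and n_B B's by brute-force enumeration."""
    letters = "A" * n_A + "B" * n_B
    # permutations() treats equal letters as distinct, so duplicates
    # appear n_A! * n_B! times each; the set removes them.
    return len(set(permutations(letters)))

n_A, n_B = 3, 2
formula = factorial(n_A + n_B) // (factorial(n_A) * factorial(n_B))
assert distinct_words(n_A, n_B) == formula == 10
```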
The Bose-Einstein principle is that we consider the balls to be indistinguishable and the boxes to be distinguishable, and we assume that the distinguishable arrangements are equally likely.
The total number of possibilities (with these indistinguishable balls and distinguishable boxes) is $${ n+r-1 \choose r}.$$
This is easy to see. Picture, say, $r=5$ indistinguishable particles and $n=4$ cells. The $4$ cells can be represented by $3$ separating walls between the particles. (Think of $n-1$ separating walls in the general case.) If we represent both the walls and the balls by asterisks then we have the following picture $$********.$$ In order to get one arrangement of the $r=5$ particles in the $n=4$ cells we have to transform $r=5$ asterisks to balls, and the rest of the asterisks to walls: $$\bullet\bullet|\bullet|\bullet\bullet|.$$
That is, the total number of the arrangements equals the number of possibilities to choose $r$ objects from a set of $n+r-1$ similar objects. One could reformulate this statement the following way: we have $n+r-1$ asterisks and we have to choose $n-1$ objects to play the role of the walls. As a result, we have $${ n+r-1 \choose r}={ n+r-1 \choose n-1}.$$
Furthermore, if a given cell contains exactly $i$ balls, then the remaining number of cells is $n-1$ and the number of the remaining particles is $r-i$. We can repeat the argument above and say that the number of arrangements with $i$ balls in a given cell is $${ n+r-i-2 \choose r-i}={ n+r-i-2 \choose n-2}.$$ The probability of the desirable event is then $$\frac{{ n+r-i-2 \choose r-i}}{{ n+r-1 \choose r}}=\frac{{ n+r-i-2 \choose n-2}}{{ n+r-1 \choose n-1}}.$$
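This probability is easy to check numerically (a sketch with names of my own choosing): each Bose-Einstein arrangement is a size-$r$ multiset of cell labels, which `combinations_with_replacement` enumerates directly, and we simply count those in which a fixed cell holds exactly $i$ balls.

```python
from itertools import combinations_with_replacement
from math import comb

def bose_einstein_prob(n, r, i):
    """P(a fixed cell holds exactly i balls) when all arrangements of r
    indistinguishable balls in n distinguishable cells are equally likely."""
    # Each arrangement is a multiset of r cell labels, all equally likely.
    arrangements = list(combinations_with_replacement(range(n), r))
    favorable = sum(1 for a in arrangements if a.count(0) == i)
    return favorable / len(arrangements)

# The r=5 particles, n=4 cells example from the picture above, with i=2:
n, r, i = 4, 5, 2
exact = comb(n + r - i - 2, r - i) / comb(n + r - 1, r)
assert abs(bose_einstein_prob(n, r, i) - exact) < 1e-12  # both equal 10/56
```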