Here's how I interpret your example: there are a bunch of coins ($k$ initially), each being flipped every round until it comes up tails, at which point the coin is "out." You want to know, after $n$ rounds, the probability that exactly $j$ coins are still active, for $j=0,\ldots,k$. (The existence of multiple players seems irrelevant.)
In that case, this is pretty elementary: after $n$ rounds, each individual coin is still active with probability $p^n$, independently of the others, so the number of active coins follows a binomial distribution with $k$ trials and success probability $p^n$. Since you want to send $p$ to $1$ and $n$ to infinity, note that replacing $p$ by its square root and doubling $n$ gives you the same distribution.
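Explicitly, the probability that exactly $j$ of the $k$ coins are still active after $n$ rounds is
$$
\binom{k}{j}(p^n)^j(1-p^n)^{k-j},\qquad j=0,\ldots,k.
$$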
Your problem has a surprisingly fascinating generalization, which I believe is called the Galton-Watson process. Its solution has a very elegant representation in terms of generating functions, but I think there are very few examples in which the probabilities are simple to compute in general. Your instance is one of those. (The generalization: at each round, you have a certain number of individuals, each of which turns (probabilistically, independently) into a finite number of identical individuals. If the individuals are coins and each coin turns into one coin with probability p and zero coins with probability 1-p, and you begin with k coins, then we recover your example.)
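To sketch the generating-function representation in your instance: if $f(s)$ is the probability generating function of the number of offspring of a single individual, then the population after $n$ rounds, started from one individual, has generating function the $n$-fold composite $f^{(n)}(s)$, and starting from $k$ individuals gives $\bigl(f^{(n)}(s)\bigr)^k$. Here each coin has offspring generating function $f(s)=(1-p)+ps$, so
$$
f^{(n)}(s)=(1-p^n)+p^n s,\qquad \bigl(f^{(n)}(s)\bigr)^k=\sum_{j=0}^{k}\binom{k}{j}(p^n)^j(1-p^n)^{k-j}\,s^j,
$$
recovering the binomial distribution above.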
I want to propose a strategy in the limiting case $n=\infty$. Maybe this is better described as a limit of strategies, since I will allow a sequence of coin flips that are each assigned probability $\epsilon$ of success (where $\epsilon$ is infinitesimal). The total amount of probability we will "spend" before the next head appears will then be exponentially distributed, with mean 1.
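To see where the exponential comes from: the probability that no head appears while we spend $t$ of the budget in increments of $\epsilon$ is
$$
(1-\epsilon)^{t/\epsilon}\longrightarrow e^{-t}\qquad\text{as }\epsilon\to 0,
$$
so the amount of probability spent before the first head is exponential with mean $1$.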
I will denote by $f_k(x)$ the probability that my strategy results in success if we still need $k$ heads, and have $x$ probability remaining in our "budget." Here is how the strategy works:
If $x\geq k$, we assign probability 1 to the next $k$ flips. This results in $f_k(x)=1$.
If $x\in (k-1,k)$, then we assign the next flip probability $x-(k-1)$. If this flip lands heads, we will win with probability 1, since the remaining budget is then exactly $k-1$, enough to make the $k-1$ remaining flips certain. If the flip lands tails, we will win with probability $f_k(k-1)$. It follows that
$$
f_k(x)=(x-(k-1))+(k-x)f_k(k-1)
$$
Finally, if $x\leq k-1$, then we will assign probability $\epsilon$ to each subsequent flip, until we see a head. Integrating over the amount $t$ of budget spent before that head appears (if the budget runs out first, which happens with probability $e^{-x}$, we lose), this gives
$$
f_k(x)=\int_0^x e^{-t}f_{k-1}(x-t)\,dt
$$
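To evaluate $f_k$ numerically for larger $k$, one can discretize this recursion. Here's a sketch in Python/NumPy (the grid resolution, trapezoidal integration, and the name `f_tables` are my choices, not part of the argument); substituting $u=x-t$ turns the integral into the running integral $e^{-x}\int_0^x e^{u}f_{k-1}(u)\,du$, which is easy to accumulate on a grid:

```python
import numpy as np

def f_tables(k_max, n=10_000):
    """Evaluate f_1, ..., f_{k_max} on a grid over [0, k_max]
    by discretizing the recursion (a numerical sketch)."""
    x = np.linspace(0.0, float(k_max), n)
    h = x[1] - x[0]
    f = {1: np.minimum(x, 1.0)}                      # f_1(x) = min(x, 1)
    for k in range(2, k_max + 1):
        # Running integral: f_k(x) = e^{-x} * int_0^x e^u f_{k-1}(u) du,
        # valid for x <= k-1 (trapezoid rule on the grid).
        g = np.exp(x) * f[k - 1]
        cum = np.concatenate(([0.0], np.cumsum((g[1:] + g[:-1]) * h / 2)))
        fk = np.exp(-x) * cum
        # Linear piece on (k-1, k): f_k(x) = (x-(k-1)) + (k-x) f_k(k-1).
        c = np.interp(k - 1, x, fk)
        mid = (x > k - 1) & (x < k)
        fk[mid] = (x[mid] - (k - 1)) + (k - x[mid]) * c
        fk[x >= k] = 1.0                             # f_k(x) = 1 for x >= k
        f[k] = fk
    return x, f

x, f = f_tables(3)
# Spot-check against the closed form f_2(x) = -1 + x + e^{-x} on [0, 1]:
print(np.interp(0.5, x, f[2]), -1 + 0.5 + np.exp(-0.5))  # both ≈ 0.1065
```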
We can recursively compute $f_k(x)$ for any $k$. Each $f_k$ is a continuous, piecewise-analytic function. The first few values (computed with the help of Mathematica; I hope they're correct) are:
$$
f_1(x)=\begin{cases}
x&\text{ if }0\leq x\leq 1\newline
1&\text{ if }x>1
\end{cases}
$$
$$
f_2(x)=\begin{cases}
-1+x+e^{-x}&\text{ if }0\leq x\leq1\newline
-1+\frac{2}{e}+\left(1-\frac{1}{e}\right)x&\text{ if }1\leq x\leq 2\newline
1&\text{ if }x>2
\end{cases}
$$
$$
f_3(x)=\begin{cases}
-2+x+(x+2)e^{-x}&\text{ if }0\leq x\leq 1\newline
e^{-x}+\frac{3}{e}-2-\frac{x}{e}+x&\text{ if }1\leq x\leq 2\newline
-2+x+\frac{(1+e)(3-x)}{e^2}&\text{ if }2\leq x\leq 3\newline
1&\text{ if }x\geq 3
\end{cases}
$$
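As an independent sanity check on these formulas, here is a quick Monte Carlo simulation of the strategy itself (a sketch: the finite $\epsilon$ and the trial count are arbitrary choices, so expect agreement with the closed forms only to about two decimal places):

```python
import math
import random

def simulate(k, x, eps=1e-2):
    """One run of the strategy with budget x, needing k heads.
    eps stands in for the 'infinitesimal' flip probability, so the
    estimate carries an O(eps) bias on top of Monte Carlo noise."""
    while k > 0:
        if x >= k:                  # enough budget to make all k flips certain
            return True
        if x <= 0:                  # budget exhausted before k heads
            return False
        if x > k - 1:               # one flip with probability x - (k-1)
            p = x - (k - 1)
            x -= p
            if random.random() < p:
                return True         # budget left is exactly k-1: certain win
        else:                       # eps-flips until the next head
            x -= eps
            if random.random() < eps:
                k -= 1
    return True

def estimate(k, x, trials=100_000):
    return sum(simulate(k, x) for _ in range(trials)) / trials

# Compare with the closed form f_2(x) = -1 + x + e^{-x} on [0, 1]:
print(estimate(2, 0.5), -1 + 0.5 + math.exp(-0.5))  # both ≈ 0.107
```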
While I don't have a proof this strategy is optimal, I've got a heuristic argument that assigning probability $\epsilon$ to each flip is a good idea. If our budget is $x$, then whatever our strategy, the expected number of heads we will have seen when we exhaust our budget is $x$. If the desired number of heads is much larger than $x$, we will need to make the variance in the number of heads large. If we assign probabilities $p_1,p_2,\ldots$ to the coin flips (with $p_1+p_2+\ldots=x$), then the variance in the number of heads is $\sum p_i(1-p_i)$, which is bounded above by $x$. We can make the variance arbitrarily close to $x$ by taking each $p_i$ as small as possible.
The argument is a little different if the next head that appears could cause our remaining budget to be larger than the number of additional heads we need to win.
There is a whole field around questions like this called Bayesian statistics. It's been a while since I looked at this stuff, but here's what I remember.
Sadly, you do need to have some pre-determined view of what $p$ is. That is, before flipping the $n$ coins you have a distribution in mind for the value of $p$ (called the prior distribution). This distribution changes as you flip the coins (you get a posterior distribution).
For example, you might start out believing that the coin has a 50% chance of being fair and a 50% chance of coming up heads $2/3$ of the time. (You might believe this if you know the person you got the coin from has both types of coins and there is a 50% chance he's trying to trick you.)
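To make the updating concrete with that prior: if the first flip comes up heads, Bayes' rule gives
$$
P(\text{fair}\mid\text{heads})=\frac{\tfrac12\cdot\tfrac12}{\tfrac12\cdot\tfrac12+\tfrac12\cdot\tfrac23}=\frac{3}{7},
$$
so a single head already shifts your belief toward the biased coin.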
The interesting (or at least nice) case is when your prior distribution is a "conjugate prior," which basically means that your posterior distribution for $p$ is of the same parametric family as your prior distribution. I believe the conjugate prior here is the beta distribution, but you might want to google "conjugate prior" and "Bayesian statistics".
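For what it's worth, the beta update is very clean (this is the standard Beta-Binomial conjugacy, not anything specific to this question): if your prior on $p$ is $\mathrm{Beta}(a,b)$ and you then observe $h$ heads and $t$ tails, the posterior is
$$
p \mid \text{data} \sim \mathrm{Beta}(a+h,\; b+t),
$$
since the posterior density is proportional to $p^{a+h-1}(1-p)^{b+t-1}$.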
Hope that helps.