# How to reverse the $n$ choose $k$ formula?

If I want to find how many possible ways there are to choose $k$ out of $n$ elements, I know I can use the simple formula below:

$$\binom{n}{k} = \frac{n! }{ k!(n-k)! } .$$

What if I want to go the other way around though?

That is, I know I want to have $X$ possible combinations, and I want to find all the various pairs of $n$ and $k$ that will give me that number of combinations.

For example, if the number of combinations I want is $3$, I want a formula/method that finds that the only pairs giving that number of combinations are $(3,1)$ and $(3,2)$.

I know I could test all the possible pairs, but this would be impractical for large numbers.

But perhaps there's no easier way of doing this than the brute-force approach?

If $X$ is only as large as $10^7$ then this is straightforward to do with computer assistance. First note the elementary inequalities
$$\frac{n^k}{k!} \ge {n \choose k} \ge \frac{(n-k)^k}{k!}$$

which are close to tight when $k$ is small. If $X = {n \choose k}$, then it follows that
$$n \ge \sqrt[k]{k! X} \ge n-k$$

hence that
$$\sqrt[k]{k! X} + k \ge n \ge \sqrt[k]{k! X}$$

so for fixed $k$ you only have to check at most $k+1$ possible values of $n$, which is manageable when $k$ is small. You can speed up this process by factoring $X$ and applying Kummer's theorem, but computing binomial coefficients for small $k$ is straightforward, so this probably isn't necessary.
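The fixed-$k$ step above can be sketched as follows. This is my own illustrative code, not part of the original answer; it widens the search window by a couple of values on each side to guard against floating-point error in the $k$-th root.

```python
from math import comb, factorial

def candidates_for_k(X, k):
    """Return all n with C(n, k) == X, for a fixed k >= 1.

    Uses the bound (k! X)^(1/k) <= n <= (k! X)^(1/k) + k,
    so only about k + 1 values of n need to be tested.
    """
    lo = round((factorial(k) * X) ** (1 / k))
    # Widen the window slightly to absorb floating-point error in the root.
    return [n for n in range(max(k, lo - 2), lo + k + 3) if comb(n, k) == X]
```

For instance, `candidates_for_k(3, 2)` tests only a handful of values of $n$ and finds $n = 3$, matching the $(3,2)$ pair from the question.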

For larger $k$, note that you can always assume WLOG that $n \ge 2k$ since ${n \choose k} = {n \choose n-k}$, hence you can assume that
$$X = {n \choose k} \ge {2k \choose k} > \frac{4^k}{2k+1}$$

(see Erdős’ proof of Bertrand’s postulate for details on that last inequality). Consequently you only have to check logarithmically many values of $k$ (as a function of $X$). For example, if $X \le 10^7$ you only have to check up to $k = 14$.
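As a quick sanity check of that cutoff (again a sketch of my own, not the original answer's code), the largest $k$ worth examining can be computed directly from the bound:

```python
def max_k(X):
    """Largest k worth checking: once 4**k / (2k + 1) exceeds X,
    C(n, k) >= C(2k, k) > X for every n >= 2k, so no solution exists."""
    k = 1
    while 4**k / (2 * k + 1) <= X:
        k += 1
    return k - 1
```

Running `max_k(10**7)` returns $14$, agreeing with the claim above.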

In total, applying the above algorithm you only have to check $O(\log(X)^2)$ pairs $(n, k)$, and each check requires at worst $O(\log(X))$ multiplications of numbers at most as large as $X$, together with at worst a comparison of two numbers of size $O(X)$. So the above algorithm takes polynomial time in $\log(X)$.
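Putting the two bounds together, here is a minimal Python sketch of the whole search (my own illustration of the algorithm described above). It uses the exact central binomial coefficient $\binom{2k}{k}$ as the cutoff for $k$, which is slightly tighter than the $4^k/(2k+1)$ bound, and recovers the $k > n/2$ solutions via the mirror identity $\binom{n}{k} = \binom{n}{n-k}$:

```python
from math import comb, factorial

def inverse_binomial(X):
    """Return all pairs (n, k) with C(n, k) == X and 0 < k < n, for X >= 2.

    Searches only k with n >= 2k; the mirror pair (n, n - k)
    covers the remaining solutions.
    """
    pairs = set()
    k = 1
    # Stop once C(2k, k) > X: then C(n, k) > X for every n >= 2k.
    while comb(2 * k, k) <= X:
        lo = round((factorial(k) * X) ** (1 / k))
        # Window widened slightly against floating-point error in the root.
        for n in range(max(2 * k, lo - 2), lo + k + 3):
            if comb(n, k) == X:
                pairs.add((n, k))
                pairs.add((n, n - k))
        k += 1
    return sorted(pairs)
```

For the question's example, `inverse_binomial(3)` returns `[(3, 1), (3, 2)]`.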

Edit: It should be totally feasible to just factor $X$ at the sizes you're talking about, but if you want to apply the Kummer's-theorem part of the algorithm to larger $X$, you don't actually have to completely factor $X$: you can do the Kummer's-theorem comparisons on the fly by computing the greatest power of $2$ dividing $X$, then of $3$, then of $5$, etc., storing these as needed. As a second step, if neither $X$ nor the particular binomial coefficient ${n_0 \choose k_0}$ you're testing is divisible by a given small prime $p$, you can appeal to Lucas' theorem. Of course, you have to decide at some point when to stop testing small primes and just test for actual equality.