Ron Maimon is describing the program of proof-theoretic ordinal analysis.
First, as you observed in your addendum, it isn't interesting to find some encoding of an ordinal whose well-foundedness implies Con(ZFC), but rather an ordinal such that the well-foundedness of any representation implies Con(ZFC). One hopes that it suffices to consider natural representations of the ordinal, which has been true in practice, but is unproven (and probably unprovable, given the difficulty of making precise what counts as a natural representation).
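To see why "some encoding" is the wrong notion, here is a minimal sketch of the classic pathological construction: a computable linear order on the naturals whose well-foundedness implies Con(ZFC), even though (assuming Con(ZFC)) its order type is just $\omega$. The name `codes_zfc_proof_of_contradiction` is a hypothetical stub standing in for a real (computable, but tedious) proof-checker.

```python
def codes_zfc_proof_of_contradiction(k: int) -> bool:
    # Hypothetical stub: a real implementation would decode k as a formal
    # ZFC-proof and check whether it proves 0 = 1.  That check is computable.
    return False

def less(n: int, m: int) -> bool:
    """Compare n and m in the pathological order.

    As long as no number up to max(n, m) codes a ZFC-inconsistency proof,
    the order agrees with the usual order on the naturals; past such a code,
    the order is reversed, creating an infinite descending sequence.
    """
    bound = max(n, m)
    if any(codes_zfc_proof_of_contradiction(k) for k in range(bound + 1)):
        return m < n  # inconsistency code found below: flip the order
    return n < m      # no inconsistency code: usual order on omega
```

If ZFC is consistent, `less` is exactly the usual order on $\omega$; yet any proof that this particular encoding is well-founded yields Con(ZFC), since an inconsistency proof would give the order an infinite descending tail. So the well-foundedness of this encoding of $\omega$ is already as strong as Con(ZFC), which is why one quantifies over representations (or restricts to natural ones).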
It's possible to prove that the smallest ordinal which ZFC fails to prove well-founded is computable by noticing that the computable notations for ordinals provably well-founded in ZFC are a $\Sigma_1$ subset of the computable notations for ordinals, so certainly $\Sigma^1_1$, and by a result of Spector, any $\Sigma^1_1$ subset of the computable notations for ordinals is bounded.
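Spelled out, the boundedness result being invoked is Spector's $\Sigma^1_1$-boundedness theorem, which in Kleene's notation (with $\mathcal{O}$ the set of computable ordinal notations and $|a|$ the ordinal denoted by $a$) reads:

```latex
% Spector's \Sigma^1_1-boundedness theorem:
\text{If } A \subseteq \mathcal{O} \text{ is } \Sigma^1_1,
\text{ then } \sup\{\, |a| : a \in A \,\} < \omega_1^{CK}.
```

Applied to the $\Sigma_1$ (hence $\Sigma^1_1$) set of notations that ZFC proves well-founded, this bounds the supremum strictly below $\omega_1^{CK}$, so the proof-theoretic ordinal of ZFC is computable.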
As pointed out in the answer Timothy Chow links to above, it's typically true that this notion of proof-theoretic ordinal ends up being the same as ordinals with other nice properties (like implying Con(ZFC)), but there is no proof that this will always happen (and there can be none, since there are pathological examples showing it is not always true), nor a proof that covers ZFC.
However it's generally believed that for "natural" theories, including ZFC, the different notions of proof-theoretic ordinal will line up.
Finding an explicit description of the ordinal for ZFC is an active problem in proof theory, but progress has been very slow. The best known results are by Rathjen and Arai (separately) at the level of $\Pi^1_2$-comprehension (a subtheory of second order arithmetic, so much, much weaker than ZFC), and after nearly 20 years, those remain unpublished. The results in the area got extremely difficult and technical, and didn't seem to provide insight into the theories the way the smaller ordinals had, so it's not nearly as active an area as it once was. Wolfram Pohlers and his students still seem to be working in the area, and some other people seem to be thinking about other approaches rather than directly attacking it (Tim Carlson and Andreas Weiermann and their students come to mind).
Addressing your issue with Gödel: note that even the one-use $\omega$-rule is not computable. There's no obstacle to a non-recursive theory being complete.
Actually this is a bit more subtle than I'm giving it credit for, but the point still holds. Using your restricted $\omega$-rule, we do have finitary-ish proof objects: e.g. a proof of $PA+\omega_{PA}\vdash \forall x\varphi(x)$ is a Turing machine $\Phi_e$ such that for all $n$, $\Phi_e(n)$ is a $PA$-proof of $\varphi(n)$. In this sense, a proof is a finite object. However, telling whether $\Phi_e$ has this property is not computable! That is, proof-verification is non-effective.
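The situation can be sketched in code, using a toy stand-in for a proof format (the names `proof_object` and `is_pa_proof_of` are hypothetical placeholders, not a real proof-checker):

```python
def proof_object(n: int) -> str:
    # Plays the role of Phi_e: given n, emit a (here: fake, string-valued)
    # "PA-proof" of phi(n).  A real proof object would output formal proofs.
    return f"proof of phi({n})"

def is_pa_proof_of(candidate: str, n: int) -> bool:
    # Stand-in for a genuine PA proof-checker; checking any single
    # candidate proof against a single instance phi(n) is computable.
    return candidate == f"proof of phi({n})"

def check_up_to(bound: int) -> bool:
    """Verify the proof object on the first `bound` instances.

    Each individual check is computable, but the property actually
    required -- "for ALL n, proof_object(n) is a PA-proof of phi(n)" --
    quantifies over every n, so no finite amount of sampling decides it.
    """
    return all(is_pa_proof_of(proof_object(n), n) for n in range(bound))
```

Any finite prefix can be checked effectively, but the full property is $\Pi^0_2$ (the machine must halt on every input with a correct proof), and such properties are undecidable in general; that is the precise sense in which proof-verification for the restricted $\omega$-rule is non-effective.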
Note that things get even worse once we allow more $\omega$-ruliness . . .
As is well known, Hilbert had earlier posed the problem of establishing the consistency of first order arithmetic (with full induction), and had suggested the development of the method of "$\varepsilon$-substitutions" (also known as "$\varepsilon$-calculus") as a means of achieving this goal.
The first substantial advance in this direction was made by Ackermann in his 1925 doctoral dissertation (written under Hilbert's supervision) entitled Begründung des "tertium non datur" mittels der Hilbertschen Theorie der Widerspruchsfreiheit, which was published in the same year in the journal Mathematische Annalen. In this work, Ackermann proved the consistency of a system of arithmetic with induction only for quantifier-free formulae. Ackermann's work was later extended by von Neumann in the 1927 paper you have asked about, in which von Neumann developed further techniques that demonstrated the efficacy of $\varepsilon$-calculus. He also gave a detailed critique of Ackermann's earlier proof.
Eventually, Gentzen used another technique (cut-elimination) to give a consistency proof for first order arithmetic in 1936, which highlighted the role of the ordinal $\varepsilon_0$ in the consistency proof of arithmetic. Gentzen's breakthrough prompted Ackermann, in his 1940 paper in Mathematische Annalen, to formulate a proof of the consistency of arithmetic with the machinery of $\varepsilon$-calculus.
From what I can tell, techniques developed by von Neumann in his 1927 paper have now been absorbed into the folklore of the subject. The following papers provide excellent expository accounts of $\varepsilon$-calculus. They also contain pointers to the rich literature of the subject.
G. Mints, A method of epsilon substitution for the predicate logic with equality, Journal of Mathematical Sciences, 1997.
G. Moser and R. Zach, The epsilon calculus and Herbrand complexity, Studia Logica, 2006.