Arity of Primitive Recursive Functions

Tags: computability, functions, logic

I'm currently working through a few books on computability, and I'm having a little trouble with the primitive recursive scheme. When defining the primitive recursive scheme for unary functions, most authors give something like the following:

$\text{For a primitive recursive function $h$, the function $f$ defined by}$ $$f(0)=m $$ $$f(n+1)=h(n,f(n))$$
is primitive recursive, where $m$ is a zero-ary constant function and $h$ is a binary function.

My question is: can we view $h$ as a unary function, since it seems to depend only on $n$? I feel like I could define $g(n)=h(n,f(n))$, where $g$ is unary. This would then violate the requirement that the arity of $h$ be $k+1$ when the arity of $f$ is $k$. I'm just a little confused about how the arity requirement works in the unary case.

Edit: In addition to the above question, I'm curious about performing primitive recursion on zero-ary functions. For example, one of the authors I'm reading defines the zero-ary constant function $o'()=0$. He then proceeds to define the family of zero-ary constant functions $c_i'()=i$ as follows: $$c_0'(0)=o'()$$ and $$c_{n+1}'()=S(c_{n}'())$$ where $S$ is the successor function. It seems like this violates the primitive recursive scheme conditions as well, with $c_0'(0)$ having arity 1 and $c_{n+1}'()$ having the same arity as $S(c_{n}'())$. I'm guessing something is going on in the background, but I can't seem to see it.

Best Answer

You can indeed define $g(n)=h(n,f(n))$ (as I assume you intended to write) -- but in order to argue that this $g$ is primitive recursive, you need to already know that $f$ (as well as $h$) is primitive recursive, and for that you need to apply the primitive recursion rule, which depends on knowing that $h$ is primitive recursive.

Note well that what the primitive recursion rule demands as a premise is that $h$ is primitive recursive as a two-argument function. That is the function that describes how to combine $n$ and $f(n)$ in order to find the number you want to be $f(n+1)$. In principle this $h$ needs to be applicable to every pair of numbers, not just ones where the second element happens to be $f$ applied to the first one. If you can't give such a general rule for $h$, the primitive recursion construction does not -- by definition -- necessarily produce a primitive recursive $f$.
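To make the arity point concrete, here is a minimal Python sketch (names and the factorial example are mine, not from the post) of the recursion rule as an operator: it consumes the base value $m$ and the *binary* function $h$, and only then produces the unary $f$:

```python
def prim_rec(m, h):
    """Return the unary f defined by f(0) = m, f(n+1) = h(n, f(n)).

    Note that h must be defined on *every* pair (n, y), not just on
    pairs where y happens to equal f(n) -- that is exactly the
    binary-arity premise of the recursion rule.
    """
    def f(n):
        acc = m
        for k in range(n):
            acc = h(k, acc)
        return acc
    return f

# Example: the factorial, obtained with h(k, y) = (k + 1) * y
fact = prim_rec(1, lambda k, y: (k + 1) * y)
```

Trying instead to start from a unary `g(n) = h(n, f(n))` presupposes that `f` is already in hand, which is exactly the circularity described above.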


In response to the added material headed "edit": The construction you're quoting seems to be arguing for the theorem that every constant function is primitive recursive. This conclusion is certainly true, but the argument you're quoting does not use the primitive recursion rule at all. It works by induction at the metalevel, but does not use recursion as a building block for functions.
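The metalevel induction can be mirrored in a short Python sketch (my own illustration, taking the zero-ary zero function and successor as the given primitives): each $c_i'$ is obtained from the previous one by composition with $S$ alone, so no function in the family uses recursion internally:

```python
def o():
    """The zero-ary constant zero function, o'() = 0."""
    return 0

def S(x):
    """The successor function."""
    return x + 1

def const(i):
    """Metalevel construction of c_i': c_0' = o', c_{i+1}' = S o c_i'.

    The induction happens out here, at the metalevel; each returned
    function is just a fixed composition of i successors applied to o.
    """
    if i == 0:
        return o
    prev = const(i - 1)
    return lambda: S(prev())
```

Each `const(i)` is a zero-ary function built by composition only, matching the point that the argument never invokes the primitive recursion rule.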

In fact the same argument would work to prove this:

Define the class of supersimple functions by the following rules:

  1. The zero function is supersimple.
  2. The successor function is supersimple.
  3. All $n$-ary projection functions are supersimple.
  4. Every function that arises by composition of supersimple functions is supersimple.
  5. No other functions are supersimple.

Theorem. Every constant function $\mathbb N^n\to\mathbb N$ (where $n\ge 0$) is supersimple.

You should recognize rules 1-4 as exactly the same as the corresponding parts of the definition of primitive recursive functions. The primitive recursion rule is missing, but it should be clear that every "supersimple" function is necessarily also "primitive recursive".

And the reason why the constant functions are supersimple is exactly the same as the reason why they are primitive recursive.
