Discrete Mathematics – Applications of Finite Calculus

Tags: applications, discrete-mathematics, soft-question, summation

I'm reading through Concrete Mathematics [Graham, Knuth, Patashnik; 2nd edition], and in the section regarding Summation, they have a sub-section entitled "Finite and Infinite Calculus". In this section they introduce to the reader the concept of finite calculus, the discrete analog of the traditional infinite calculus.

Throughout the text, they use the following notation for finite calculus (I'm not sure whether this is standard notation, so I'd be grateful for any clarification):

$$\Delta f(x)\equiv f(x+1)-f(x)$$

This is described as the "finite analog of the derivative", in which we restrict ourselves to positive integer values of $h$.

They also introduce the following notation as the analogs of the anti-derivative and the definite integral:

$$g(x)=\Delta f(x) \iff \sum{g(x) \:\delta x}=f(x)+C$$

$$\sum_{a}^{b}{\Delta f(x)\:\delta x}=\left.f(x)\right|_{a}^{b}=f(b)-f(a)$$
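For concreteness (my own illustration, not from the book): taking $f(x)=x^{\underline{2}}=x(x-1)$ gives $\Delta f(x)=(x+1)x-x(x-1)=2x$, and since the definite sum $\sum_{a}^{b}{g(x)\:\delta x}$ simply means $\sum_{x=a}^{b-1}{g(x)}$ for integers $b\ge a$, the sum telescopes:

$$\sum_{a}^{b}{2x\:\delta x}=\sum_{x=a}^{b-1}{2x}=b(b-1)-a(a-1),$$

the discrete analog of $\int_{a}^{b}{2x\:dx}=b^{2}-a^{2}$.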

It then goes on to introduce various identities relating to these interesting operators, but never describes any application for them, aside from satisfying one's mathematical curiosity and shortening a few proofs later in the book (although even these seem to be somewhat contrived applications).

So my question is as follows:

What applications are there for finite calculus, in fields such as mathematics, computer science, physics, etc.?

Thanks in advance!

Best Answer

As I didn't really get a satisfactory answer (although thanks to user35071 for the link, and to those of you who commented), I have decided to answer my own question as best I can (if I've missed anything important, please let me know in the comments):

  1. Calculating closed-form summations from known expressions.
  2. Numerical approximation of continuous ("infinite") calculus: finite-difference methods, numerical solutions to ODEs and PDEs, etc.

It is often the case that we want to express the sum of some expression $f(x)$ in closed form, and this can be tricky using standard techniques.

For instance, the indefinite sum $\sum{x^{2}\:\delta x}$ can be computed using two known formulae:

$$x^{n}=\sum_{k}{\left\{n \atop k\right\}x^{\underline{k}}} \quad\text{and}\quad \sum{x^{\underline{m}}\:\delta x}=\frac{x^{\underline{m+1}}}{m+1}+C\quad(m\neq -1)\tag{1}$$

Where $\left\{n\atop k\right\}$ (read "$n$ subset $k$") is a Stirling number of the second kind, and $x^{\underline{k}}=x(x-1)\cdots(x-k+1)$ is the falling factorial (also written with the Pochhammer-style notation $(x)_{k}$, although in the special-functions literature $(x)_{k}$ usually denotes the rising factorial instead).

Expanding $x^{2}$ in terms of falling factorials, we get:

$$x^{2}=x^{\underline{2}}+x^{\underline{1}}\implies\sum{x^{2}\:\delta x}=\sum{x^{\underline{2}}\:\delta x}+\sum{x^{\underline{1}}\:\delta x}$$

Using our known formulae in $(1)$, we get:

$$\sum{x^{2}\:\delta x}=\frac{x^{\underline{3}}}{3}+\frac{x^{\underline{2}}}{2}=\frac{x(x-1)(x-2)}{3}+\frac{x(x-1)}{2}=\frac{x(x-1)(2x-1)}{6}$$

Evaluating this at the appropriate limits gives $\sum_{1}^{n+1}{x^{2}\:\delta x}=\left.\frac{x(x-1)(2x-1)}{6}\right|_{1}^{n+1}=\frac{n(n+1)(2n+1)}{6}$, which is the standard closed form for $\sum_{k=1}^{n}{k^{2}}$.
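As a quick sanity check, here is a short Python sketch (my own illustration, not from the book; the helper names `stirling2`, `falling`, and `sum_of_powers` are mine) that carries out exactly this procedure for a general power $n$: expand $x^{n}$ in falling factorials via the Stirling-number recurrence, apply the falling-power sum rule from $(1)$ term by term, and compare the resulting definite sum against a brute-force summation.

```python
def stirling2(n, k):
    """Stirling number of the second kind, via the recurrence
    {n,k} = k*{n-1,k} + {n-1,k-1}."""
    if n == 0 and k == 0:
        return 1
    if n == 0 or k == 0:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)


def falling(x, k):
    """Falling factorial x^(underline k) = x(x-1)...(x-k+1)."""
    prod = 1
    for i in range(k):
        prod *= x - i
    return prod


def sum_of_powers(n, a, b):
    """Evaluate the definite sum  sum_{x=a}^{b-1} x^n  by finite calculus:
    expand x^n in falling factorials, sum each term with
    x^(underline m) -> x^(underline m+1)/(m+1), then take F(b) - F(a)."""
    def F(x):
        # The division is exact: x^(underline k+1) is a product of k+1
        # consecutive integers, hence divisible by (k+1)!.
        return sum(stirling2(n, k) * falling(x, k + 1) // (k + 1)
                   for k in range(n + 1))
    return F(b) - F(a)


# Sanity check against brute force and the classical formula for n = 2.
N = 10
assert sum_of_powers(2, 1, N + 1) == sum(k ** 2 for k in range(1, N + 1))
assert sum_of_powers(2, 1, N + 1) == N * (N + 1) * (2 * N + 1) // 6
```

The same machinery produces Faulhaber-type closed forms for any power $n$, which is the real payoff of the falling-factorial expansion.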

If we don't restrict ourselves to integer step sizes, we can also use finite differences to numerically approximate derivatives and integrals of continuous functions. Several difference operators are commonly used in numerical approximations to calculus:

$$\Delta_{h}[f](x)=f(x+h)-f(x) \tag{2}$$ $$\nabla_{h}[f](x)=f(x)-f(x-h) \tag{3}$$ $$\delta_{h}[f](x)=f\left(x+\frac{h}{2}\right)-f\left(x-\frac{h}{2}\right)\tag{4}$$

Where $(2)$ is called the forward difference, $(3)$ the backward difference, and $(4)$ the central difference. These yield the following approximations to the derivative:

$$f'(x)\approx\frac{\Delta_{h}[f](x)}{h}\approx\frac{\nabla_{h}[f](x)}{h}\approx\frac{\delta_{h}[f](x)}{h} \tag{5}$$

These approximations are often used when a derivative must be evaluated numerically rather than analytically, for example when $f$ is known only at sampled data points.
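Here is a minimal Python sketch (my own illustration) of formula $(5)$, comparing the three difference quotients against the exact derivative of $\sin$ at $x=1$:

```python
import math


def forward_diff(f, x, h):
    """Forward difference quotient: (f(x+h) - f(x)) / h."""
    return (f(x + h) - f(x)) / h


def backward_diff(f, x, h):
    """Backward difference quotient: (f(x) - f(x-h)) / h."""
    return (f(x) - f(x - h)) / h


def central_diff(f, x, h):
    """Central difference quotient: (f(x+h/2) - f(x-h/2)) / h."""
    return (f(x + h / 2) - f(x - h / 2)) / h


# Compare against the exact derivative of sin at x = 1, i.e. cos(1).
x, h = 1.0, 1e-3
exact = math.cos(x)
for name, approx in [("forward", forward_diff),
                     ("backward", backward_diff),
                     ("central", central_diff)]:
    err = abs(approx(math.sin, x, h) - exact)
    print(f"{name:>8}: error = {err:.2e}")
```

The forward and backward quotients have $O(h)$ error, while the central quotient has $O(h^{2})$ error, which is why the central difference is usually preferred when $f$ can be evaluated on both sides of $x$.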
