[Math] Why aren’t all intervals called infinite

elementary-set-theory, order-theory, terminology

I'm a bit confused about the definition of finite sets/intervals. I know that a set $S$ is called finite when it has a finite number of elements, or formally, when there exists a bijection $f:S\to\{1,\dots,n\}$ for some natural number $n$.

However, the interval $(1,2)$ is called finite. I don't understand why; $(1,2)$ is not even countable, and there certainly does not exist a bijection $f:(1,2)\to\{1,\dots,n\}$ for any natural number $n$.

Why do we call $(a,b)$, with $a,b\in\mathbb{R}$, finite? Did we just agree to do so, or is it incorrect to use the interval $(a,b)$ as a set, as I did in the definition of finiteness above?

Thanks!

Best Answer

There are different ways to think about the size of a set. In the case of the real numbers, and specifically intervals, we can talk about their length (and generally, their Lebesgue measure in the case of measurable sets).
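To make the distinction concrete, the length of a bounded interval is simply the difference of its endpoints, which is exactly its Lebesgue measure:

```latex
\lambda\bigl((a,b)\bigr) = b - a,
\qquad\text{e.g.}\qquad
\lambda\bigl((1,2)\bigr) = 2 - 1 = 1 .
```

So $(1,2)$ has finite *length*, even though as a *set* it contains uncountably many points.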

If you think about the real numbers as a model of time or space, then the distance between you and the screen through which you are reading this is a finite interval. But in this model, based on the real numbers, it is an uncountable interval, not a finite set.


One thing to remember about terminology is that it should highlight, for the reader or listener, some relevant property. In the case of intervals, we already know they all have the same cardinality (at least the non-degenerate ones). So we are free to use "finite" or "infinite" to talk about their length (formally, their measure), placing the emphasis on that aspect rather than on their cardinality.
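The claim that all non-degenerate intervals share the same cardinality can be witnessed by explicit bijections; for instance:

```latex
% A linear bijection between (0,1) and any bounded interval (a,b):
f : (0,1) \to (a,b), \qquad f(x) = a + (b-a)x,
% and a bijection from (0,1) onto all of \mathbb{R}:
g : (0,1) \to \mathbb{R}, \qquad g(x) = \tan\!\Bigl(\pi\bigl(x - \tfrac{1}{2}\bigr)\Bigr).
```

Here $g$ first rescales $(0,1)$ onto $(-\pi/2,\pi/2)$ and then applies $\tan$, which is a bijection onto $\mathbb{R}$. So every non-degenerate interval has the cardinality of the continuum, whether its length is finite or infinite.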
