[Math] the difference between discrete and continuous mathematics

computer science · discrete mathematics

I am studying computer science and this has me absolutely flummoxed. The definition I can find is that discrete data is countable and that continuous is uncountable.

Examples are given stating that the integers are discrete and the real numbers are continuous. The rationals, it says, are controversial.

I understand which topics fall within the category of discrete mathematics, but I don't get why they do. Set theory especially: what if my universal set is the real numbers? How can I have a continuous realm of values and still consider my structure discrete?

Considering that both the integers and the reals belong to an unlimited realm of possibilities, I fail to see how either one can be considered countable; they are both infinite. How would you define discrete mathematics, and how would someone recognize discrete or continuous mathematics when faced with it?

Note: My knowledge of mathematics is discrete and very limited, so please explain this as you would to a 6-year-old.

Best Answer

The difference between countable and uncountable sets is well formalized and there is never any doubt. These are two different "sizes of infinity". You can read this page for more information on why countable is not the same as uncountable: http://en.wikipedia.org/wiki/Cantor%27s_diagonal_argument

But one intuition I think is really helpful, and which also links this with computer science, is that a countable set is a set whose elements are finitely describable. For instance, each integer can be written down on a piece of paper, so the set of integers is countable. This is what makes integers manageable by computers: since you can completely describe an integer in a finite way, you can always pass it to a computer as a finite sequence of $0$'s and $1$'s. This is also why the reals are not countable: to specify a real you might need to write down all of its decimals, which is an infinite sequence. This makes "continuous mathematics" ill-suited for automatic treatment by computers.
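The contrast above can be sketched concretely. Below, `to_bits` (a name chosen here for illustration) shows that any integer, however large, collapses to a finite string of $0$'s and $1$'s, while a real like $\sqrt{2}$ only ever yields a finite *prefix* of its infinitely many digits:

```python
from decimal import Decimal, getcontext

def to_bits(n: int) -> str:
    """Return the finite binary representation of a non-negative integer."""
    return bin(n)[2:]  # strip Python's '0b' prefix

print(to_bits(42))             # '101010' — six symbols and we are done
print(len(to_bits(10**100)))   # even a googol needs only 333 bits

# sqrt(2) has no finite decimal description: whatever precision we ask
# for, we only see a finite prefix of an infinite digit sequence.
getcontext().prec = 30
print(Decimal(2).sqrt())
```

The integer is handed to the machine *whole*; the real is only ever approximated.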

Of course this is very schematic and can be refined further, but the intuition is important. It is possible to prove formally: "every element of $E$ can be described by a finite amount of information $\implies$ $E$ is countable".
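The proof idea behind that implication can itself be sketched in code: the set of *all* finite bit strings is countable, because we can enumerate them one by one (by length, then lexicographically), so every finite description sits at some finite position in the list. This generator (a hypothetical helper written for this answer) makes the enumeration explicit:

```python
from itertools import count, product

def all_finite_bitstrings():
    """Yield every finite 0/1 string exactly once: shorter strings first,
    lexicographic order within each length. Every string appears at some
    finite position — which is exactly what "countable" means."""
    yield ""
    for length in count(1):
        for bits in product("01", repeat=length):
            yield "".join(bits)

gen = all_finite_bitstrings()
first_seven = [next(gen) for _ in range(7)]
print(first_seven)  # ['', '0', '1', '00', '01', '10', '11']
```

If every element of $E$ has a finite description, map each element to its description's position in this list: that gives an injection from $E$ into the natural numbers, so $E$ is countable.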

Following this intuition, the rationals are countable, because a rational $r$ can be given by two integers $a,b$ with $r=a/b$. This does not prevent the rationals from being an important tool of analysis, because every real can be approximated arbitrarily closely by rationals (we say $\mathbb Q$ is *dense* in $\mathbb R$). In fact, most computer algorithms dealing with "arbitrary" numbers actually deal only with rational numbers.
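Both halves of that paragraph can be demonstrated. The diagonal walk below (a standard Cantor-style enumeration, written here as an illustrative sketch) lists every positive rational at some finite step, and `Fraction.limit_denominator` from Python's standard library shows rationals closing in on $\pi$:

```python
from fractions import Fraction
from math import pi

def positive_rationals():
    """Enumerate the positive rationals: walk the grid of (numerator,
    denominator) pairs diagonal by diagonal, skipping duplicates such as
    2/4 == 1/2. Every a/b appears at some finite position."""
    seen = set()
    diag = 2
    while True:
        for a in range(1, diag):
            q = Fraction(a, diag - a)
            if q not in seen:
                seen.add(q)
                yield q
        diag += 1

gen = positive_rationals()
first_six = [next(gen) for _ in range(6)]
print(first_six)  # 1, 1/2, 2, 1/3, 3, 1/4

# Density: rationals with ever-larger denominators approximate pi
# arbitrarily well (e.g. the classical 22/7 appears for denominators <= 10).
for max_den in (10, 100, 1000):
    approx = Fraction(pi).limit_denominator(max_den)
    print(approx, float(approx) - pi)
```

The enumeration is the countability of $\mathbb Q$ made executable; the approximation loop is its density in $\mathbb R$ in action.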

As for the classification of mathematics into "discrete" and "continuous", the frontiers are really not well defined, and everything interacts with everything else, so it is almost impossible to give a sound definition; a big part of it is subjective. At best, some fields have a "flavour" that is mostly discrete (like graph theory) or mostly continuous (like analysis), but in both cases you may also need to consider the other side in order to get a good understanding (for instance, probability theory is used in graph theory).
