What’s wrong with the following definition of $\lim$ and derivative for the surreal numbers

Tags: analysis, derivatives, limits, surreal-numbers

I am reading about surreal numbers and was asking myself: what is wrong with the following definition of the limit and derivative for surreal functions?
Let $x$ and $p$ be surreal numbers and let $f$ be a function from the surreal numbers to themselves.

$$\lim_{x \rightarrow p} f(x) := \left\{ f\left(p-\frac{1}{2^k}\right) \,\middle|\, f\left(p+\frac{1}{2^k}\right) \right\},$$ where $k$ ranges over the natural numbers $1,2,3,\ldots$, provided the right-hand side defines a surreal number.
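To see the definition in action, here is a worked instance (my own example, not part of the question): for the identity function $f(x) = x$ at $p = 0$, the form is

$$\left\{ -\tfrac12, -\tfrac14, -\tfrac18, \ldots \,\middle|\, \ldots, \tfrac18, \tfrac14, \tfrac12 \right\} = 0,$$

since $0$ is the simplest surreal number lying strictly between all left options and all right options, so in this case the proposed limit agrees with the usual one.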

$$f'(x) := \lim_{\delta \rightarrow 0 } \frac{f(x+\delta)-f(x)}{\delta}$$

Is it possible to prove that this limit and derivative have some of the properties of the usual limit and the usual derivative?

Thanks for your help!

Best Answer

Two immediate issues with your definition:

  1. A form $\{L \mid R\}$ must satisfy the rule that every member of the left set is smaller than every member of the right set. Your definition only guarantees this when $f$ is increasing near $p$.

  2. If $f$ is an arbitrary "function" on the surreal numbers (strictly speaking a class function, since the surreal numbers form a proper class, not a set), you are sampling only countably many points on either side of $p$, which is not enough to determine the behavior of $f$ around $p$: no countable set is dense in the surreal numbers. This is essentially analogous to claiming that you can determine $\displaystyle\lim_{t \to 0} g(t)$ for a real function $g: \mathbb R \to \mathbb R$ from only the two values $g(-1)$ and $g(1)$, which is absurd.
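Both issues can be made concrete with small examples (mine, not the answerer's). For issue 1, take the decreasing function $f(x) = -x$ at $p = 0$: the proposed left options are $f(-2^{-k}) = 2^{-k}$ and the right options are $f(2^{-k}) = -2^{-k}$, so the form would be

$$\left\{ \tfrac12, \tfrac14, \tfrac18, \ldots \,\middle|\, \ldots, -\tfrac18, -\tfrac14, -\tfrac12 \right\},$$

in which every left option exceeds every right option, so it is not a valid form and denotes no surreal number. For issue 2, let $f(x) = x$ at the sampled points $\pm 2^{-k}$ and $f(x) = 17$ everywhere else: the definition still yields $\lim_{x \to 0} f(x) = 0$, even though $f$ takes the value $17$ at every other point near $0$.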
