(Algebraic) derivations $\frac{\partial}{\partial x_j}$ are linearly independent

abstract-algebra, linear-algebra, partial-derivative, tangent-line, tangent-spaces

I have a basic question about a proof that the (algebraic) derivations $\partial/\partial x_j$ form a basis of the $n$-dimensional linear space of derivations.

Let $M$ be a differentiable manifold and $a \in M$. Let $\bar f$ be the germ at $a$ of a differentiable function $f:U \to \mathbb{R}$ with $a \in U \subset M$. Let $\mathcal{E}_{M,a}$ be the set of germs at $a$ of differentiable functions $f:M \to \mathbb{R}$.

A derivation of $\mathcal{E}_{M,a}$ is a linear map $\delta: \mathcal{E}_{M,a} \to \mathbb{R}$ that satisfies the product rule $$\delta(\bar f \bar g) = \bar f(a)\delta(\bar g) + \bar g(a)\delta(\bar f)$$

for all $\bar f, \bar g \in \mathcal{E}_{M,a}$.
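
As a quick sanity check (a standard consequence of the definition, not part of the quoted text), the product rule together with linearity forces every derivation to vanish on constant germs: $$\delta(\bar 1) = \delta(\bar 1 \cdot \bar 1) = 1\cdot\delta(\bar 1) + 1\cdot\delta(\bar 1) = 2\,\delta(\bar 1) \implies \delta(\bar 1) = 0,$$ and hence $\delta(\bar c) = c\,\delta(\bar 1) = 0$ for every constant germ $\bar c$.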

We denote the set of all derivations of $\mathcal{E}_{M,a}$ by $$\operatorname{Der}\mathcal{E}_{M,a}$$

We consider the derivations of $\mathcal{E}_{\mathbb{R}^n,0}$.

Let $x_1,\dots,x_n$ be the coordinates of $\mathbb{R}^n$. We define the derivations $$\frac{\partial}{\partial x_j}: \mathcal{E}_{\mathbb{R}^n,0} \to \mathbb{R}$$ via $\bar f \mapsto \dfrac{\partial f}{\partial x_j}(0)$ (the partial derivative with respect to the variable $x_j$, evaluated at $0$).
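
(For completeness, and using only the usual Leibniz rule for partial derivatives, each $\partial/\partial x_j$ really is a derivation in the sense above: for germs $\bar f, \bar g$ one has $$\frac{\partial (fg)}{\partial x_j}(0) = f(0)\,\frac{\partial g}{\partial x_j}(0) + g(0)\,\frac{\partial f}{\partial x_j}(0),$$ which is exactly the product rule with $a = 0$.)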

Proposition: The derivations $\partial/\partial x_j$, $j = 1,\dots,n$, form a basis of the linear space $\operatorname{Der}\mathcal{E}_{\mathbb{R}^n,0}$.

Proof. We'll prove the linear independence of the $\partial/\partial x_j$. Suppose $$\sum_{j=1}^n a_j\frac{\partial}{\partial x_j} = 0,\quad a_j \in \mathbb{R}.\ (*) $$

Thus

$$ a_k = \sum_{j=1}^n a_j\frac{\partial \bar{x}_k}{\partial x_j} = 0 \ (**)$$ for all $k$.

I don't understand the implication from $(*)$ to $(**)$

Can someone please elaborate? I'd really appreciate any help.

Thanks in advance.

Best Answer

The statement in $(*)$ that $\sum a_j \dfrac{\partial}{\partial x_j} = 0$ means that if you apply this derivation to any germ of a function $[f]$, then you get the real number $0$. To prove linear independence, we need to prove each $a_k = 0$. So, in particular, let's choose $[f]$ to be the germ of one of the coordinate functions, $[x_k]$. Recall that $x_k: \Bbb{R}^n \to \Bbb{R}$ is the map $x_k(p^1, \dots, p^n) = p^k$.

Now, what does the partial derivative of the function $x_k$ with respect to the $j^{\text{th}}$ variable, evaluated at $0$, i.e. $\partial_j(x_k)(0)$, equal? It is $1$ if $j=k$ and $0$ otherwise (because if $j \neq k$ then the function $x_k$ doesn't even depend on the $j^{\text{th}}$ variable). Thus, \begin{align} 0 &= \left(\sum_{j=1}^n a_j \dfrac{\partial}{\partial x_j} \right)(x_k) \tag{by assumption}\\ &= \sum_{j=1}^n a_j \dfrac{\partial x_k}{\partial x_j}(0) \\ &= \sum_{j=1}^n a_j \delta_{jk} \\ &= a_k, \end{align} where $\delta_{jk}$ is the Kronecker delta symbol ($1$ if $j=k$, $0$ otherwise). This proves each $a_k = 0$ and hence proves linear independence.
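
To make the computation concrete, here is the case $n = 2$, $k = 1$ (a small illustrative example, not taken from the original argument): $$\left(a_1\frac{\partial}{\partial x_1} + a_2\frac{\partial}{\partial x_2}\right)(x_1) = a_1\,\frac{\partial x_1}{\partial x_1}(0) + a_2\,\frac{\partial x_1}{\partial x_2}(0) = a_1\cdot 1 + a_2\cdot 0 = a_1,$$ so the assumption that the derivation is the zero map forces $a_1 = 0$, and applying it to $x_2$ instead gives $a_2 = 0$.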


Looking at your comment, I hope you can first prove that with respect to the usual differentiable structure on $\Bbb{R}^n$, the coordinate functions $x_k : \Bbb{R}^n \to \Bbb{R}$ are indeed smooth. Hence, you can take their germ equivalence classes $[x_k]$. This will thus be an element of the space $\mathcal{E}_{\mathbb{R}^n,0}$. Hence, you can apply the derivation on this element.

So, strictly speaking, the derivation $\sum a_j \dfrac{\partial}{\partial x_j}$ should be evaluated on the germ $[x_k]$. But by definition, this is defined by its action on a representative in the equivalence class: $\bigg(\sum a_j \dfrac{\partial}{\partial x_j} \bigg)(x_k)$, and I hope you've proven somewhere that this definition is independent of which representative is chosen, and hence is well-defined. This justifies my argument above.
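
Concretely, the well-definedness comes down to the following observation (a standard argument, sketched here as a hint rather than quoted from the answer): if $f$ and $g$ represent the same germ, i.e. $f = g$ on some neighbourhood $V$ of $0$, then for each $j$ $$\frac{\partial f}{\partial x_j}(0) = \lim_{t\to 0}\frac{f(te_j)-f(0)}{t} = \lim_{t\to 0}\frac{g(te_j)-g(0)}{t} = \frac{\partial g}{\partial x_j}(0),$$ where $e_j$ denotes the $j^{\text{th}}$ standard basis vector and $te_j \in V$ for all sufficiently small $|t|$. So the value of $\sum a_j\,\partial/\partial x_j$ on a germ does not depend on the chosen representative.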
