So functions are just uncountably-infinite-dimensional vectors, and as such there's a nice generalization of the inner product between two functions (the integral of their product). Is there a similar generalization of the cross product to two functions?
[Math] Cross Product for functions
cross-product, functional-analysis, functions
Related Solutions
For finding the correct definition to apply, one needs to know whether the scalar product is taken to be anti-linear in its first or its second argument. Assuming the first convention, the relation one would want to preserve is that, for $\vec x=(x_1,x_2,x_3)$ and similarly for $\vec y, \vec z$, one still has $$ (\vec x \times \vec y)\cdot\vec z= \left|\begin{matrix}x_1&y_1&z_1\\x_2&y_2&z_2\\x_3&y_3&z_3\\\end{matrix} \right|. $$ Note that the determinant is linear in all of its columns, so the left-hand side needs to be an expression that is linear in the vector that appears directly as a column, which explains why one cannot use $\vec x\cdot(\vec y\times\vec z)$ instead, which is anti-linear in $\vec x$. Now it is easy to see that the coordinates of $\vec x \times \vec y$ should be taken to be the complex conjugates of the expressions in their usual definition, for instance $\overline{x_2y_3-x_3y_2}$ for the first coordinate.
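This convention can be checked numerically. The sketch below (not from the original answer) uses NumPy: with a scalar product anti-linear in its first argument (which is what `np.vdot` computes), defining $\vec x \times \vec y$ as the complex conjugate of the usual component formula preserves the determinant identity.

```python
import numpy as np

rng = np.random.default_rng(0)
# Three random complex vectors in C^3
x, y, z = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

def inner(u, v):
    return np.vdot(u, v)  # np.vdot conjugates its FIRST argument

cross_conj = np.conj(np.cross(x, y))             # conjugated cross product
det = np.linalg.det(np.column_stack([x, y, z]))  # determinant with columns x, y, z

# (x cross y) . z equals the determinant, as the answer requires
assert np.isclose(inner(cross_conj, z), det)
```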
One actually arrives at the same conclusion for a scalar product that is defined to be anti-linear in its second argument. However the identity that leads to this definition is different, namely the one which equates $\vec x\cdot(\vec y\times\vec z)$ to the above determinant.
If $x_1,\dotsc,x_{n-1} \in \mathbb{R}^n$, one defines $x_1 \times \cdots \times x_{n-1} \in \mathbb{R}^n$ to be the unique vector such that $$ \forall y \in \mathbb{R}^n, \quad \langle x_1 \times \cdots \times x_{n-1},y \rangle = \operatorname{det}(x_1,\dotsc,x_{n-1},y), $$ where the determinant is being viewed as a function of the rows or columns of the usual matrix argument, i.e., as the unique antisymmetric $n$-form $\operatorname{det} : \mathbb{R}^n \times \cdots \times \mathbb{R}^n \to \mathbb{R}$ such that $\det(e_1,\dotsc,e_n) = 1$ for $\{e_k\}$ the standard ordered basis of $\mathbb{R}^n$.
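The defining identity above translates directly into a computation: the $i$-th coordinate of $x_1 \times \cdots \times x_{n-1}$ is $\operatorname{det}(x_1,\dotsc,x_{n-1},e_i)$. A small NumPy sketch (the name `gen_cross` is ours, not standard):

```python
import numpy as np

def gen_cross(*vecs):
    """(n-1)-fold cross product in R^n via the defining determinant identity.

    The i-th coordinate is <x_1 x ... x x_{n-1}, e_i> = det(x_1,...,x_{n-1},e_i).
    """
    n = len(vecs) + 1
    cols = np.column_stack(vecs)  # n x (n-1) matrix of the given vectors
    return np.array([np.linalg.det(np.column_stack([cols, np.eye(n)[:, i]]))
                     for i in range(n)])

rng = np.random.default_rng(1)
x1, x2, x3 = rng.standard_normal((3, 4))  # three vectors in R^4
w = gen_cross(x1, x2, x3)
y = rng.standard_normal(4)

# Defining property: <x1 x x2 x x3, y> = det(x1, x2, x3, y) for every y
assert np.isclose(w @ y, np.linalg.det(np.column_stack([x1, x2, x3, y])))

# In R^3 it reduces to the ordinary cross product
a, b = rng.standard_normal((2, 3))
assert np.allclose(gen_cross(a, b), np.cross(a, b))
```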
Now, suppose that $x_1,\dotsc,x_{n-1} \in \mathbb{R}^n$ are linearly independent, and hence span a hyperplane $H$ (an $(n-1)$-dimensional subspace) in $\mathbb{R}^n$. Then, in particular, $x_1 \times \cdots \times x_{n-1} \neq 0$ is orthogonal to each $x_k$, and hence defines a non-zero normal vector to $H$; write $$x_1 \times \cdots \times x_{n-1} = \|x_1 \times \cdots \times x_{n-1}\|\hat{n}$$ for $\hat{n}$ the corresponding unit normal. Let $y \notin H$. Then $x_1,\dotsc,x_{n-1},y$ are linearly independent and span an $n$-dimensional parallelepiped $P$ with $n$-dimensional volume $$ |\operatorname{det}(x_1,\dotsc,x_{n-1},y)| = |\langle x_1 \times \cdots \times x_{n-1},y\rangle| = \|x_1 \times \cdots \times x_{n-1}\||\langle \hat{n},y\rangle|. $$ Now, with respect to the decomposition $\mathbb{R}^n = H^\perp \oplus H$, let $$ T = \begin{pmatrix} I_{H^\perp} & 0 \\ M & I_{H} \end{pmatrix} $$ for $M : H^\perp \to H$ given by $$M(c \hat{n}) = -c \langle \hat{n},y \rangle^{-1} P_H y = -c\langle \hat{n},y\rangle^{-1}(y-\langle\hat{n},y\rangle\hat{n}),$$ where $P_H(v)$ denotes the orthogonal projection of $v$ onto $H$. Then $T(P)$ is an $n$-dimensional parallelepiped with edge vectors $Tx_1 = x_1,\dotsc,Tx_{n-1}=x_{n-1}$ and $$ Ty = \langle \hat{n},y \rangle \hat{n} = P_{H^\perp} y = y - P_H y, $$ with the same volume as $P$. On the one hand, since $Ty = y - P_H y$ for $P_H y \in H = \{x_1 \times \cdots \times x_{n-1}\}^\perp$, $$ \operatorname{Vol}_n(T(P)) = |\operatorname{det}(Tx_1,\dotsc,Tx_{n-1},Ty)|\\ = |\operatorname{det}(x_1,\dotsc,x_{n-1},y-P_H y)|\\ = |\operatorname{det}(x_1,\dotsc,x_{n-1},y)|\\ = \|x_1 \times \cdots \times x_{n-1}\||\langle \hat{n},y\rangle|. $$ On the other hand, since $Ty \in H^\perp$, $T(P)$ is an honest cylinder with height $\|Ty\| = |\langle \hat{n},y\rangle|$ and base the $(n-1)$-dimensional parallelepiped $R$ spanned by $x_1,\dotsc,x_{n-1}$, so that $$ \operatorname{Vol}_n(T(P)) = \operatorname{Vol}_{n-1}(R)|\langle \hat{n},y\rangle|. 
$$ Thus, $$ \operatorname{Vol}_{n-1}(R)|\langle \hat{n},y\rangle| = \operatorname{Vol}_n(T(P)) = \|x_1 \times \cdots \times x_{n-1}\||\langle \hat{n},y\rangle|, $$ so that $$ \operatorname{Vol}_{n-1}(R) = \|x_1 \times \cdots \times x_{n-1}\|, $$ as required.
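The conclusion $\operatorname{Vol}_{n-1}(R) = \|x_1 \times \cdots \times x_{n-1}\|$ can be verified numerically by computing the base volume independently via the Gram determinant. A sketch for $n = 3$, where the generalized cross product is the ordinary `np.cross`:

```python
import numpy as np

rng = np.random.default_rng(2)
x1, x2 = rng.standard_normal((2, 3))

A = np.column_stack([x1, x2])            # edge vectors of the base R as columns
vol_R = np.sqrt(np.linalg.det(A.T @ A))  # Gram-determinant area formula

# Area of the parallelogram equals the norm of the cross product
assert np.isclose(vol_R, np.linalg.norm(np.cross(x1, x2)))
```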
EDIT: Theoretical Addendum
Let's see what $\phi x_1 \times \cdots \times \phi x_{n-1}$ is in terms of $x_1 \times \cdots \times x_{n-1}$ for $\phi$ a linear transformation on $\mathbb{R}^n$.
Define a linear map $T : (\mathbb{R}^n)^{\otimes(n-1)} \to (\mathbb{R}^n)^\ast$ by $$ T : x_1 \otimes \cdots \otimes x_{n-1} \mapsto \operatorname{det}(x_1,\dotsc,x_{n-1},\bullet), $$ so that if $S : \mathbb{R}^n \to (\mathbb{R}^n)^\ast$ is the isomorphism $v \mapsto \langle v,\bullet \rangle$, then $$ x_1 \times \cdots \times x_{n-1} = (S^{-1}T)(x_1 \otimes \cdots \otimes x_{n-1}). $$ Now, since the determinant is antisymmetric, so too is $T$, and hence $T$ descends to a linear map $T : \bigwedge^{n-1} \mathbb{R}^n \to (\mathbb{R}^n)^\ast$, $$ x_1 \wedge \cdots \wedge x_{n-1} \mapsto \operatorname{det}(x_1,\dotsc,x_{n-1},\bullet); $$ indeed, if $\operatorname{Vol} = e_1 \wedge \cdots \wedge e_n$ for $\{e_k\}$ the standard ordered basis for $\mathbb{R}^n$, then for any $y \in \mathbb{R}^n$, $$ \langle x_1 \times \cdots \times x_{n-1},y \rangle \operatorname{Vol} = \operatorname{det}(x_1,\dotsc,x_{n-1},y)\operatorname{Vol} = x_1 \wedge \cdots \wedge x_{n-1} \wedge y, $$ which, in fact, shows that $$ x_1 \times \cdots \times x_{n-1} = \ast (x_1 \wedge \cdots \wedge x_{n-1}), $$ where $\ast : \bigwedge^{n-1} \mathbb{R}^n \to \mathbb{R}^n$ is the relevant Hodge $\ast$-operator. Thus, a cross product is really an $(n-1)$-form in the orientation-dependent disguise given by the Hodge $\ast$-operator; in particular, it will really transform as an $(n-1)$-form, as we'll see now.
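The identity $x_1 \times \cdots \times x_{n-1} = \ast(x_1 \wedge \cdots \wedge x_{n-1})$ is easy to check concretely for $n = 3$. A sketch: the coefficients of $x \wedge y$ in the basis $e_2 \wedge e_3,\, e_3 \wedge e_1,\, e_1 \wedge e_2$, pushed through $\ast(e_2 \wedge e_3) = e_1$, $\ast(e_3 \wedge e_1) = e_2$, $\ast(e_1 \wedge e_2) = e_3$, are exactly the ordinary cross product.

```python
import numpy as np

rng = np.random.default_rng(3)
x, y = rng.standard_normal((2, 3))

wedge_coeffs = np.array([x[1]*y[2] - x[2]*y[1],   # coefficient of e2 ^ e3
                         x[2]*y[0] - x[0]*y[2],   # coefficient of e3 ^ e1
                         x[0]*y[1] - x[1]*y[0]])  # coefficient of e1 ^ e2

# Applying the Hodge star to x ^ y yields the ordinary cross product
assert np.allclose(wedge_coeffs, np.cross(x, y))
```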
Now, let $\phi : \mathbb{R}^n \to \mathbb{R}^n$ be linear. Observe that the adjugate matrix $\operatorname{Adj}(\phi)$ of $\phi$ can be invariantly defined as the unique linear transformation $\operatorname{Adj}(\phi) : \mathbb{R}^n \to \mathbb{R}^n$ such that for any $\omega \in \bigwedge^{n-1} \mathbb{R}^n$ and $y \in \mathbb{R}^n$, $$ (\wedge^{n-1}\phi)\omega \wedge y = \omega \wedge \operatorname{Adj}(\phi) y, $$ e.g., in our case, $$ x_1 \wedge \cdots \wedge x_{n-1} \wedge \operatorname{Adj}(\phi) y = (\wedge^{n-1}\phi)(x_1 \wedge \cdots \wedge x_{n-1}) \wedge y = \phi x_1 \wedge \cdots \wedge \phi x_{n-1} \wedge y, $$ and that, as a matrix, $\operatorname{Adj}(\phi) = \operatorname{Cof}(\phi)^T$, where $\operatorname{Cof}(\phi)$ denotes the cofactor matrix of $\phi$. Then for any $y$, $$ \langle \phi x_1 \times \cdots \times \phi x_{n-1},y \rangle \operatorname{Vol} = \operatorname{det}(\phi x_1,\cdots,\phi x_{n-1},y)\operatorname{Vol}\\ = \phi x_1 \wedge \cdots \wedge \phi x_{n-1} \wedge y\\ = (\wedge^{n-1}\phi)(x_1 \wedge \cdots \wedge x_{n-1}) \wedge y\\ = (x_1 \wedge \cdots \wedge x_{n-1}) \wedge \operatorname{Adj}(\phi)y\\ = \langle x_1 \times \cdots \times x_{n-1},\operatorname{Adj}(\phi)y \rangle \operatorname{Vol}\\ = \langle \operatorname{Cof}(\phi)(x_1 \times \cdots \times x_{n-1}),y \rangle \operatorname{Vol}, $$ and hence, since $y$ was arbitrary, $$ \phi x_1 \times \cdots \times \phi x_{n-1} = \operatorname{Cof}(\phi)(x_1 \times \cdots \times x_{n-1}) = (\ast \circ \wedge^{n-1}\phi \circ \ast^{-1})(x_1 \times \cdots \times x_{n-1}), $$ in terms of the Hodge $\ast$-operation and the invariantly defined $\wedge^{n-1}\phi$.
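The transformation law $\phi x \times \phi y = \operatorname{Cof}(\phi)(x \times y)$ can be checked numerically in $\mathbb{R}^3$. A sketch, using the standard identity $\operatorname{Cof}(\phi) = \det(\phi)\,\phi^{-T}$ for invertible $\phi$:

```python
import numpy as np

rng = np.random.default_rng(4)
phi = rng.standard_normal((3, 3))       # generically invertible
x, y = rng.standard_normal((2, 3))

cof = np.linalg.det(phi) * np.linalg.inv(phi).T  # Cof(phi) = det(phi) phi^{-T}

# The cross product transforms by the cofactor matrix, not by phi itself
assert np.allclose(np.cross(phi @ x, phi @ y), cof @ np.cross(x, y))
```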
Best Answer
Thinking of functions as uncountably-infinite-dimensional vectors may lead you wildly astray, so I'd be careful of that.
The cross product turns out to be a rather exceptional binary operation. For any vector space $V$, we can equip it with an inner product $\langle\cdot,\cdot\rangle:V\times V\to \mathbb R$ (we're just going to assume these are real vector spaces). However, in most circumstances we DO NOT have a cross-product-like operation from $V\times V\to V$. The notable exceptions are the cross products in $\mathbb R^3$ and $\mathbb R^7$ (see here).
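What makes these operations exceptional are the properties a cross product must satisfy: orthogonality to both factors and the Lagrange identity $\|x\times y\|^2 = \|x\|^2\|y\|^2 - (x\cdot y)^2$ (such a bilinear product exists only in dimensions 3 and 7). A sketch checking both properties in $\mathbb R^3$:

```python
import numpy as np

rng = np.random.default_rng(5)
x, y = rng.standard_normal((2, 3))
c = np.cross(x, y)

# Orthogonal to both factors
assert np.isclose(c @ x, 0.0) and np.isclose(c @ y, 0.0)
# Lagrange identity: |x x y|^2 = |x|^2 |y|^2 - (x . y)^2
assert np.isclose(c @ c, (x @ x) * (y @ y) - (x @ y) ** 2)
```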
You can lift the idea of an inner product to functions, though. Have you had any functional analysis? If you haven't, this idea may be a little hard to tackle.