I'm looking for a textbook that goes into as much detail as possible about the parallels between linear algebra in finite, countable, and continuous "spaces." Specific topics that I'm trying to get a better (more general) understanding of are, for example:

- the relationship between Hermitian matrices and self-adjoint operators in general;
- how the orthogonality and completeness relations can be described in a way that includes "normalization with the Dirac delta function" without making it a special case;
- how to understand "basis vectors" (such as with Dirac normalization) that don't appear to be "part of" the vector space that the normalized functions live in.
I hope my terminology makes sense. The whole point is that I'm just trying to learn this stuff, and I don't quite know what the correct terminology is 🙂
I've had basic courses in linear algebra and applied PDEs, and I see a lot of parallels, but in the books I have these parallels are (sometimes) mentioned but hardly ever emphasised.
I think that, for example, a book which aims to teach partial differential equations "from a linear algebra perspective" might be what I'm looking for.
Any suggestions appreciated.
Best Answer
Let me elaborate on my comment.
If you're looking for a "linear algebra" treatment of PDEs, you will end up in the field called functional analysis. Functional analysis is, roughly speaking, infinite-dimensional linear algebra. Here we work with function spaces, for example the space of square-integrable functions $L^2$ ($f$ is in $L^2$ if $\int |f|^2 < \infty$). One quickly sees that no finite set of functions spans the whole space, so the space is infinite dimensional.
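To make the infinite-dimensionality concrete, here is a small numerical sketch (my own illustration, not part of the original answer): the functions $e_n(x) = \sqrt{2/\pi}\,\sin(nx)$ are pairwise orthonormal in $L^2(0,\pi)$, and there is one for every $n = 1, 2, 3, \dots$, so no finite collection can span the space.

```python
import numpy as np

# Numerical sketch (illustration only): the functions
#   e_n(x) = sqrt(2/pi) * sin(n x),  n = 1, 2, 3, ...
# are orthonormal in L^2(0, pi).  Since there is one for every n,
# no finite collection of functions can span the whole space.
trap = getattr(np, "trapezoid", np.trapz)  # numpy >= 2.0 renamed trapz
x = np.linspace(0.0, np.pi, 20001)

def e(n):
    return np.sqrt(2.0 / np.pi) * np.sin(n * x)

def inner(f, g):
    # L^2 inner product <f, g> = integral of f * g over (0, pi)
    return trap(f * g, x)

# Gram matrix of the first five basis functions: approximately the identity
gram = np.array([[inner(e(m), e(n)) for n in range(1, 6)] for m in range(1, 6)])
print(np.round(gram, 5))
```

The Gram matrix comes out (to quadrature accuracy) as the $5 \times 5$ identity, and the same holds for any finite batch of the $e_n$.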
So we work with function spaces: in PDE theory we look for function spaces in which the solutions of our PDEs live, that is, we look for functions that satisfy the equation. It turns out that interpreting the derivative in the classical sense doesn't give you many tools to study certain properties of the solutions, like regularity (how "differentiable" your function is), and so on. So we interpret the derivative in a "weak" (or distributional) sense. In this way we have a larger class of functions satisfying our equation, and we can apply the tools of functional analysis to it. The resulting spaces are called Sobolev spaces. If you would like me to explain in more detail why we study those spaces, do ask.
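As a concrete example of a weak derivative (my own sketch, not from the answer): $|x|$ is not classically differentiable at $0$, but its weak derivative on $(-1,1)$ is $\operatorname{sign}(x)$, in the sense that $\int |x|\,\varphi'(x)\,dx = -\int \operatorname{sign}(x)\,\varphi(x)\,dx$ for every smooth test function $\varphi$ vanishing at the endpoints. A quick numerical check:

```python
import numpy as np

# Numerical sketch (illustration only): the weak derivative of |x| on
# (-1, 1) is sign(x).  By definition, for every smooth test function
# phi vanishing at the endpoints:
#   integral |x| * phi'(x) dx  ==  - integral sign(x) * phi(x) dx.
trap = getattr(np, "trapezoid", np.trapz)  # numpy >= 2.0 renamed trapz
x = np.linspace(-1.0, 1.0, 40001)  # odd point count: x = 0 is a grid node

# A (non-symmetric) test function vanishing at x = -1 and x = 1,
# together with its exact derivative:
phi = (1.0 - x**2) ** 2 * np.exp(x)
dphi = np.exp(x) * ((1.0 - x**2) ** 2 - 4.0 * x * (1.0 - x**2))

lhs = trap(np.abs(x) * dphi, x)    # integral of |x| * phi'
rhs = -trap(np.sign(x) * phi, x)   # - integral of sign(x) * phi
print(lhs, rhs)  # the two sides agree up to quadrature error
```

The point is that the identity only involves integrals of $|x|$ against derivatives of test functions, so it makes sense even though $|x|'$ does not exist classically at the kink.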
That was a short summary of why you need functional analysis. Another thing you will need is measure theory. Measure and integration theory studies a different type of integral than the one you're used to (the Riemann integral), namely the Lebesgue integral. This integral has many more nice properties; for example, there are theorems stating that under mild conditions on $f_n$ and $f$ (pointwise convergence plus a dominating integrable function, say) $$\int f_n \to \int f.$$ The Riemann integral also possesses this property, but under quite "unnatural" conditions (uniform convergence, for example).
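A minimal numerical illustration of this (my own example, using $f_n(x) = x^n$ on $[0,1]$): the $f_n$ converge to $0$ pointwise on $[0,1)$ but not uniformly, every $f_n$ is dominated by the integrable function $g \equiv 1$, and indeed $\int_0^1 f_n = \tfrac{1}{n+1} \to 0 = \int_0^1 0$, so the dominated convergence theorem applies even though uniform convergence fails.

```python
import numpy as np

# Numerical sketch (illustration only): f_n(x) = x^n on [0, 1].
#   * f_n -> 0 pointwise on [0, 1) but NOT uniformly (sup |f_n| = 1),
#   * every f_n is dominated by the integrable function g(x) = 1,
#   * yet the integrals converge: int f_n = 1/(n+1) -> 0 = int 0.
# Dominated convergence justifies the limit; uniform convergence fails.
trap = getattr(np, "trapezoid", np.trapz)  # numpy >= 2.0 renamed trapz
x = np.linspace(0.0, 1.0, 100001)

for n in (1, 10, 100, 1000):
    integral = trap(x**n, x)          # approximately 1/(n + 1)
    sup_norm = np.max(np.abs(x**n))   # exactly 1.0, attained at x = 1
    print(n, integral, sup_norm)
```

This is exactly the situation the Riemann-based theorems cannot handle, while the Lebesgue theory dispatches it in one line.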
I'm not sure exactly how far along you are, but if I understood correctly, you have an engineering-mathematics background.
So, you would need to know:

- Basic real analysis (some examples:)
- Measure theory
- Functional analysis
- Partial differential equations
This was a short list of suggestions. If you have any questions, do ask!