I'm taking a QM course and I'm trying to make sense of why observables are sometimes conjugated for no apparent reason in their inner products.
There is already a lot of confusion packed in here, so let's start with the basics. There are vectors, operators, and an inner product. If everything were finite dimensional, the vectors would be complex column vectors, the operators would be complex square matrices, and the inner product would be computed by transposing one vector, conjugating every entry in it, and then doing ordinary matrix multiplication.
Note right away that this inner product differs from the one in ordinary 3d space: firstly, it is complex valued; secondly, it is no longer symmetric. This is important and serious. We can no longer just say "the inner product of vectors A and B," so we need a clear (new) notation.
One notation is $\langle A | B \rangle.$ That means: take the column vector $|A\rangle$, transpose it into a row vector, complex conjugate every entry to get a new row vector $\langle A |$, and then take the natural (row-times-column) product.
So whenever you see a ket vector (e.g. $|A\rangle$) just think column vector. And whenever you see a bra vector (e.g. $\langle A|$) just think row vector. And think of the vectors $|A\rangle$ and $\langle A |$ as being transpose conjugates of each other.
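Here is a tiny numpy sketch of that column/row picture (the component values are made up for illustration):

```python
import numpy as np

# A ket as a column vector (toy 2-component example).
ket_A = np.array([[1 + 2j],
                  [3 - 1j]])

# The corresponding bra: the conjugate transpose, a row vector.
bra_A = ket_A.conj().T

# <A|A>: multiply the row by the column; the result is real and non-negative.
norm_sq = (bra_A @ ket_A)[0, 0]
assert np.isclose(norm_sq.imag, 0) and norm_sq.real >= 0
print(norm_sq.real)  # |1+2j|^2 + |3-1j|^2 = 5 + 10 = 15
```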
This operation of taking the conjugate transpose of one vector and multiplying by the other has these properties: it is linear in the second argument, conjugate linear in the first argument, and non-negative when you give it the same argument in both positions. Plus, if you put the two vectors in the opposite order, you get the complex conjugate of the original result. Those are really the properties you are looking for, and you can have them even when there isn't a finite dimensional basis. For instance, the operation that takes two functions, conjugates one, multiplies them together, and integrates is an infinite dimensional version of the same thing.
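Those four properties are easy to check numerically. A sketch with arbitrary toy vectors (note that `np.vdot` conjugates its first argument, matching the physics convention):

```python
import numpy as np

# Toy 3-dimensional complex vectors (illustrative values only).
A = np.array([1 + 2j, 0 - 1j, 3 + 0j])
B = np.array([2 - 1j, 1 + 1j, 0 + 2j])
c = 0.5 - 1.5j  # an arbitrary complex scalar

# <A|B> = (conjugate-transpose of A) times B.
ip = np.vdot(A, B)

# Linear in the second argument:
assert np.isclose(np.vdot(A, c * B), c * np.vdot(A, B))
# Conjugate linear in the first argument:
assert np.isclose(np.vdot(c * A, B), np.conj(c) * np.vdot(A, B))
# Swapping the arguments conjugates the result:
assert np.isclose(np.vdot(B, A), np.conj(np.vdot(A, B)))
# <A|A> is real and non-negative:
assert np.isclose(np.vdot(A, A).imag, 0) and np.vdot(A, A).real >= 0
```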
Right now I'm watching Dr. Susskind's lecture on youtube ( https://www.youtube.com/watch?v=8mi0PoPvLvs ) and around 13:16 he takes the inner product of the ket wavefunction $|\psi(x)\rangle$ and the state representing the particle at $x$, $|x\rangle$, to make $\langle x|\psi\rangle.$
That isn't actually what he does. He takes the (column) vector $|\psi\rangle$ and the (column) vector $|x\rangle$ and forms the scalar $\langle x|\psi\rangle=\psi(x).$ Each of the vectors $|\psi\rangle$ and $|x\rangle$ is an entire vector, and you are computing the complex scalar their inner product gives you. If you imagine fixing the vector $|\psi\rangle$ and having a different vector $|x\rangle$ for each position $x$, then you are effectively getting a complex number for each position. You are getting the wave function from the inner product.
And you are getting it the same way you get coefficients from a vector: by taking inner products with your basis vectors. You might be used to getting coefficients by just looking at a vector, but if we have an infinite dimensional basis we can't just write down an infinite number of coefficients. When you think about it, a coefficient is exactly what you get by taking the inner product with a basis vector (at least when the basis is orthonormal).
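A finite dimensional stand-in for this, with the standard basis of $\mathbb C^3$ playing the role of the $|x\rangle$ basis and an arbitrary `v` playing the role of $|\psi\rangle$:

```python
import numpy as np

# Arbitrary toy vector standing in for |psi>.
v = np.array([1 + 1j, 2 - 1j, 0 + 3j])

# The rows of the identity form an orthonormal basis, like the |x>'s.
basis = np.eye(3, dtype=complex)

# Each coefficient is the inner product <e_k|v>: conjugate e_k, then dot.
coeffs = np.array([np.vdot(e, v) for e in basis])

# For an orthonormal basis this recovers the components exactly,
# just as <x|psi> recovers the wave function psi(x).
assert np.allclose(coeffs, v)
```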
Doesn't this conjugate the 'x' term?
What does this have to do with anything? We are starting with a vector space that has an addition (like adding column vectors), a scaling of vectors by complex numbers (like scaling a column vector by a complex number), and an inner product (like the one described above); all it needs is the properties we want. We are getting the wave function from that. If we had used wave functions as the starting point, this would be circular. If you wanted to do that, you could define the inner product as an integral, but then you wouldn't have vectors like $|x\rangle.$
My question is why all of a sudden does he conjugate the position vector to complete the inner product?
The inner product is always (in physics conventions) conjugate linear in the first argument and linear in the second argument. But he's not doing anything beyond that; he's just taking their inner product. If $|x \rangle$ were itself a wave function, it would be a Dirac delta function, which is real, and hence conjugating it doesn't do anything.
He's just finding a coefficient of the vector $|\psi\rangle$ in the "basis" $\{|x\rangle\}.$
Right before that he took the inner product with two different state vectors, conjugating the $ x' $ term but not the $ x $ term. I guess I'm just trying to "visualize" why this is proper.
It is proper to take the inner product of any two vectors you like. It's just not a symmetric operation anymore. You do what you want; you get what you get.
This also happens when I find the projection of one vector onto another vector, but I'm just confused as to why this is "legal".
You can project A onto B by $\langle B |A \rangle | B\rangle/\langle B |B \rangle$ for exactly the same reason you normally can.
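A quick numerical sanity check of that projection formula (arbitrary toy vectors): the residual $|A\rangle - \text{proj}$ comes out orthogonal to $|B\rangle$, exactly as in real 3d space.

```python
import numpy as np

A = np.array([1 + 2j, 3 - 1j])
B = np.array([0 + 1j, 1 + 1j])

# Projection of A onto B: (<B|A>/<B|B>) |B>, with the conjugate falling on B.
proj = (np.vdot(B, A) / np.vdot(B, B)) * B

# The residual A - proj is orthogonal to B.
assert np.isclose(np.vdot(B, A - proj), 0)
```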
Really. Go review everything you know about linear algebra but where you aren't allowed to look at the coefficients (imagine there are too many to look at) and instead you can ask a magic device to compute the inner products and scaling and addition for you. That's all that is going on, except the inner product isn't symmetric anymore.
So why does Susskind do this?
It's how you take an inner product, plus some notation.
And why does this work?
Work at what? Practice doing linear algebra with complex column vectors. If you want to do calculus, so you can have derivatives and set up a differential equation, you need a sense of distance for your limits. One sense of distance between $A$ and $B$ is $\sqrt {\langle A-B|A-B\rangle},$ and you need that complex conjugate so that the quantity under the square root is a non-negative real number.
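To see why the conjugate is needed for a distance, compare the conjugated inner product against a plain (unconjugated) dot product on toy vectors; the latter can come out complex, so it can't be a squared distance:

```python
import numpy as np

A = np.array([1 + 1j, 2 + 0j])
B = np.array([0 + 1j, 1 - 1j])

# Distance from the inner product: sqrt(<A-B|A-B>); vdot conjugates A-B first,
# so the radicand is real and non-negative.
d = np.sqrt(np.vdot(A - B, A - B).real)
assert d >= 0

# Without the conjugation, the "squared distance" is complex, hence useless.
naive = np.dot(A - B, A - B)
assert abs(naive.imag) > 0
```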
I need to fully understand this as it's the foundation of quantum mechanics and I wouldn't be able to set up problems otherwise.
Practice with finite dimensional complex vector spaces first.
Best Answer
The inner product is invariant, and your apparent contradiction comes from the fact that the inner product does not, in general, correspond to multiplication of components.

The rule for finding the row vector of the bra corresponding to any ket is to take the conjugate transpose of the ket's column vector. An important distinction is that the column and row vectors corresponding to kets and bras are not equal to those kets and bras. Kets are elements of a Hilbert space $\mathcal H$ and bras are elements of the dual space $\mathcal H^*$. The corresponding column and row vectors of components in a given basis are representations, which belong to the matrix spaces $\mathbb C^{n\times1}$ and $\mathbb C^{1\times n}$ respectively (for an $n$-dimensional $\mathcal H$).
A general 2D ket $|\psi\rangle$ can be expanded in terms of a (not necessarily orthonormal) basis $B = \{|a\rangle,|b\rangle\}$ as
$$|\psi\rangle = \psi_a|a\rangle + \psi_b|b\rangle$$ This can be represented by a column vector of its components in this basis $$|\psi\rangle_B = \begin{bmatrix} \psi_a \\ \psi_b \end{bmatrix} \in \mathbb C^{2\times1}$$
The corresponding bra $\langle\psi|$ can be expanded in terms of the bra basis $B^* = \{\langle a|,\langle b|\}$ as $$\langle \psi| = \langle a|\psi_a^* + \langle b|\psi_b^*$$ This can be represented by a row vector of its components in this basis $$\langle \psi|_{B^*} = [\psi_a^*,\psi_b^*]\in \mathbb C^{1\times2}$$
which is indeed the conjugate transpose of the column representation of $|\psi\rangle$.
However it is important to note that the inner product of a bra and a ket is not equivalent to matrix multiplication of their row and column representations. We can see this by expanding the inner product $\langle\psi|\psi\rangle$ in terms of the bases
$$\langle\psi|\psi\rangle = (\langle a|\psi_a^* + \langle b|\psi_b^*)(\psi_a|a\rangle + \psi_b|b\rangle) = \psi_a^*\psi_a\langle a|a\rangle + \psi_a^*\psi_b\langle a|b\rangle + \psi_b^*\psi_a\langle b|a\rangle +\psi_b^*\psi_b\langle b|b\rangle$$
which in general is different from $$(\langle\psi|_{B^*})(|\psi\rangle_B) =[\psi_a^*,\psi_b^*]\begin{bmatrix}\psi_a \\ \psi_b\end{bmatrix} = \psi_a^*\psi_a + \psi_b^*\psi_b$$
The difference depends on the inner products of the basis vectors, and it's only in an orthonormal basis, where $\langle a|a\rangle = 1,\langle b|b\rangle = 1, \langle a|b\rangle = 0, \langle b|a\rangle = 0$, that the multiplication of components equals the inner product.
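A numerical sketch of that point, using a made-up non-orthonormal basis of $\mathbb C^2$: computing $\langle\psi|\psi\rangle$ through the Gram matrix $G_{jk}=\langle j|k\rangle$ of the basis agrees with the true inner product, while plain multiplication of components does not.

```python
import numpy as np

# Hypothetical non-orthonormal basis vectors |a>, |b> of C^2.
a = np.array([1 + 0j, 1 + 0j])
b = np.array([0 + 0j, 1 + 0j])

# Components of |psi> = psi_a|a> + psi_b|b> (arbitrary values).
psi_a, psi_b = 1 + 1j, 2 - 1j
psi = psi_a * a + psi_b * b

# The true inner product <psi|psi>, computed from the actual vector:
true_ip = np.vdot(psi, psi)

# The same value via the Gram matrix G_jk = <j|k> of the basis:
G = np.array([[np.vdot(a, a), np.vdot(a, b)],
              [np.vdot(b, a), np.vdot(b, b)]])
comps = np.array([psi_a, psi_b])
gram_ip = np.conj(comps) @ G @ comps

# Naive multiplication of components, correct only when G is the identity:
naive = np.conj(comps) @ comps

assert np.isclose(true_ip, gram_ip)
assert not np.isclose(true_ip, naive)
```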