[Physics] Trying to understand inner product notation

hilbert-space, notation, quantum-mechanics

I'm taking a QM course and I'm trying to make sense of why observables are sometimes conjugated for no apparent reason in their inner products.

Right now I'm watching Dr. Susskind's lecture on youtube ( http://www.youtube.com/watch?v=8mi0PoPvLvs ) and around 13:16 he takes the inner product of the ket wavefunction $$ \mid\psi(x)\rangle $$ and the state representing the particle at $x$, $$\mid x \rangle $$ to make $$\langle x \mid \psi \rangle $$ Doesn't this conjugate the $x$ term? My question is why all of a sudden does he conjugate the position vector to complete the inner product? Right before that he took the inner product with two different state vectors, conjugating the $ x' $ term but not the $ x $ term. I guess I'm just trying to "visualize" why this is proper.

This also happens when I find the projection of one vector onto another vector, but I'm just confused as to why this is "legal".

So why does Susskind do this? And why does this work? I need to fully understand this as it's the foundation of quantum mechanics and I wouldn't be able to set up problems otherwise.

Best Answer

I'm taking a QM course and I'm trying to make sense of why observables are sometimes conjugated for no apparent reason in their inner products.

There is a lot of confusion here already. Let's start with the basics. There are vectors, operators, and an inner product. If everything were finite dimensional, the vectors would be complex column vectors, the operators would be complex square matrices, and the inner product would be taken by transposing one vector, conjugating every term in it, and then doing regular matrix multiplication.
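If it helps to make that concrete, here is a minimal sketch (NumPy and the specific numbers are just my choice for illustration):

```python
import numpy as np

A = np.array([[1 + 2j], [3 - 1j]])   # the ket |A>: a complex column vector
B = np.array([[0 + 1j], [2 + 0j]])   # the ket |B>

bra_A = A.conj().T                   # the bra <A|: conjugate transpose of |A>
inner = (bra_A @ B).item()           # <A|B>, a single complex number

print(inner)                         # (1-2j)*(1j) + (3+1j)*2 = 8 + 3j
```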

Note right away that, unlike the dot product in regular 3d space, this inner product is different: it is complex valued, and it is no longer symmetric. This is important and serious. We can no longer just say "the inner product of vectors A and B." So we need a clear (new) notation.

One notation is $\langle A | B \rangle.$ That is just like taking the column vector $|A\rangle$, transposing it into a row vector, and complex conjugating every term in it to get a new row vector $\langle A |$, and then taking the ordinary matrix product of that row vector with $|B\rangle$.

So whenever you see a ket vector (e.g. $|A\rangle$) just think column vector. And whenever you see a bra vector (e.g. $\langle A|$) just think row vector. And think of the vectors $|A\rangle$ and $\langle A |$ as being transpose conjugates of each other.

This operation of taking the conjugate transpose of one vector and multiplying by the other has the following properties: it is linear in the second argument, conjugate linear in the first argument, and non-negative when you give it the same argument in both positions. Plus, if you put the two vectors in the opposite order you get the complex conjugate of what you got in the original order. Those are really the properties you are looking for, and you can have them even when there isn't a finite dimensional basis. For instance, the operation that takes two functions, conjugates one, multiplies them together, and then integrates is an infinite dimensional version of that.
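If you want to see those properties with actual numbers, a small check along these lines works (again assuming NumPy; the random vectors and the scalar are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

def ket(n=4):
    return rng.normal(size=n) + 1j * rng.normal(size=n)

def ip(a, b):
    return np.vdot(a, b)        # np.vdot conjugates its first argument

A, B, C = ket(), ket(), ket()
c = 2 - 3j

# linear in the second argument, conjugate linear in the first
assert np.isclose(ip(A, c * B + C), c * ip(A, B) + ip(A, C))
assert np.isclose(ip(c * A, B), np.conj(c) * ip(A, B))

# swapping the arguments complex-conjugates the result
assert np.isclose(ip(B, A), np.conj(ip(A, B)))

# the same vector in both slots gives a real, non-negative number
assert np.isclose(ip(A, A).imag, 0) and ip(A, A).real >= 0
```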

Right now I'm watching Dr. Susskind's lecture on youtube ( https://www.youtube.com/watch?v=8mi0PoPvLvs ) and around 13:16 he takes the inner product of the ket wavefunction $$ |\psi(x)\rangle $$ and the state representing the particle at $x$, $$ |x\rangle $$ to make $$ \langle x|\psi\rangle $$

That isn't actually what he does. He takes the (column) vector $ |\psi\rangle $ and the (column) vector $ |x\rangle $ and makes the scalar $\langle x|\psi\rangle=\psi(x).$ Each of the vectors $ |\psi\rangle $ and $ |x\rangle $ was an entire vector, and you are computing the complex scalar you get from their inner product. If you imagine fixing the vector $|\psi\rangle$ and having a different vector $|x\rangle$ for each position $x$, then you are effectively getting a complex number for each position. You are getting the wave function from the inner product.

And you are getting it the same way you get coefficients from a vector: by taking inner products with your basis vectors. You might be used to getting coefficients by just looking at a vector, but if we have an infinite dimensional basis we can't just write down an infinite number of coefficients. When you think about it, you get a coefficient by taking the inner product with a basis vector.
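In a finite dimensional toy version this is easy to check (assuming NumPy, with an arbitrary orthonormal basis built via QR): the coefficients really are the inner products.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
v = rng.normal(size=n) + 1j * rng.normal(size=n)     # some ket |v>

# build an orthonormal basis: the columns of Q from a QR decomposition
M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
Q, _ = np.linalg.qr(M)

coeffs = np.array([np.vdot(Q[:, i], v) for i in range(n)])   # c_i = <e_i|v>

# summing c_i |e_i> rebuilds |v>, so the inner products really are the coefficients
v_rebuilt = sum(coeffs[i] * Q[:, i] for i in range(n))
assert np.allclose(v_rebuilt, v)
```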

Doesn't this conjugate the 'x' term?

What does this have to do with anything? We are starting with a vector space that has an addition (like adding column vectors), a scaling of vectors by complex numbers (like scaling a column vector by a complex number), and an inner product (as described above); all that matters is that it has the properties we want. We are getting the wave function from that. If we started from wave functions this would be circular. If you wanted to do that, you could define the inner product as an integral, but then you wouldn't have vectors like $|x\rangle.$
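If you did go the integral route, a sketch of that inner product, approximated on a grid (assuming NumPy; the two Gaussian-type functions are just arbitrary examples), looks like this:

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

f = np.exp(-x**2 / 2) * np.exp(1j * x)     # two sample "wave functions"
g = np.exp(-(x - 1)**2 / 2) + 0j

inner_fg = np.sum(np.conj(f) * g) * dx     # conjugate one, multiply, integrate
inner_gf = np.sum(np.conj(g) * f) * dx

# the same conjugate-symmetry as in the finite dimensional case
assert np.isclose(inner_gf, np.conj(inner_fg))
```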

My question is why all of a sudden does he conjugate the position vector to complete the inner product?

The inner product is always (in physics) conjugate linear in the first argument and linear in the second argument. But he isn't doing anything special here; he is just taking their inner product. If $|x \rangle$ were itself a wave function it would be a Dirac delta function, which can be taken to be real, so conjugating it doesn't change anything.

He's just finding a coefficient of the vector $|\psi\rangle$ in the "basis" $\{|x\rangle\}.$
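A toy discretized version of exactly that coefficient-extraction (assuming NumPy, with an arbitrary 5-point grid standing in for the positions):

```python
import numpy as np

N = 5
psi = np.array([0.1 + 0.2j, 0.3 + 0j, -0.1j, 0.5 - 0.5j, 0.2 + 0j])   # |psi> on a grid

for i in range(N):
    x_ket = np.zeros(N, dtype=complex)
    x_ket[i] = 1.0                      # the grid version of "delta at x_i" is real
    coeff = np.vdot(x_ket, psi)         # <x_i|psi>; conjugating a real vector changes nothing
    assert np.isclose(coeff, psi[i])    # it just reads off psi(x_i)
```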

Right before that he took the inner product with two different state vectors, conjugating the $ x' $ term but not the $ x $ term. I guess I'm just trying to "visualize" why this is proper.

It is proper to take the inner product of any two vectors you feel like. It's just not a symmetric operation anymore. You do what you want, you get what you get.

This also happens when I find the projection of one vector onto another vector, but I'm just confused as to why this is "legal".

You can project $A$ onto $B$ by $\frac{\langle B | A \rangle}{\langle B | B \rangle}\,|B\rangle$ for exactly the same reason you normally can.
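A quick numerical sanity check of that formula (assuming NumPy, with arbitrary complex vectors): the leftover piece is orthogonal to $B$, just as in ordinary real space.

```python
import numpy as np

A = np.array([1 + 1j, 2 - 1j, 0 + 3j])
B = np.array([1 - 2j, 0 + 1j, 2 + 0j])

proj = (np.vdot(B, A) / np.vdot(B, B)) * B    # <B|A>/<B|B> times |B>

# what is left over is orthogonal to B
assert np.isclose(np.vdot(B, A - proj), 0)
```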

Really. Go review everything you know about linear algebra but where you aren't allowed to look at the coefficients (imagine there are too many to look at) and instead you can ask a magic device to compute the inner products and scaling and addition for you. That's all that is going on, except the inner product isn't symmetric anymore.

So why does Susskind do this?

It's how you take an inner product, together with some notation for it.

And why does this work?

Work at what? It's just practice doing linear algebra with complex column vectors. If you want to do calculus, so you can have derivatives and set up a differential equation, you need a notion of distance for your limits. One notion of distance between $A$ and $B$ is $\sqrt {\langle A-B|A-B\rangle}$, and you need that complex conjugate so that the square root gives a non-negative real number.
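As a small illustration (assuming NumPy, with arbitrary vectors) of why the conjugate matters for distance:

```python
import numpy as np

A = np.array([1 + 2j, 0 + 3j])
B = np.array([2 + 0j, 1 + 1j])

diff = A - B
dist = np.sqrt(np.vdot(diff, diff).real)   # <A-B|A-B> is real and >= 0

# without the conjugate the "squared distance" need not be real or non-negative
no_conj = np.sum(diff * diff)
print(dist, no_conj)                        # sqrt(10)  vs  (-6-8j)
```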

I need to fully understand this as it's the foundation of quantum mechanics and I wouldn't be able to set up problems otherwise.

Practice with finite dimensional complex vector spaces first.