===== Sept 16 (Wed) - What's so special about the Stationary State? =====

**Responsible party: joh04684, Spherical Chicken**
  
**To go back to the lecture note list, click [[lec_notes]]**\\
**previous lecture note: [[lec_notes_0914]]**\\
**next lecture note: [[lec_notes_0918]]**\\

**Main class wiki page: [[home]]**
=== Main Points ===

  * Stationary States
  * Expectation Values of Operators
  * Hamiltonians and Energy

== For a stationary state wave function ==

<math><\hat{H}>_{\Psi_n}(t) = \int_{-\infty}^{\infty}\Psi_n^*(x,t)\hat{H}\Psi_n(x,t)dx</math>

  * The integral removes the //x// dependence, but the result could still be a function of time.
  * If we calculate the time derivative <math>\frac{\partial}{\partial t} <\hat{H}>_{\Psi_n}</math>, it turns out to be zero (we can show this fairly easily), implying that <math><\hat{H}></math> is constant as a function of time.
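This claim is easy to check numerically. The following is my own sketch, not from the lecture: it assumes the infinite square well with <math>\hbar=m=L=1</math>, where <math>\psi_1(x)=\sqrt{2}\sin(\pi x)</math> and <math>E_1=\pi^2/2</math>, and uses a finite-difference second derivative for <math>\hat{H}</math> (the grid size and scheme are my choices).

```python
import math, cmath

# Numerical check (my example): for the stationary state
# Psi_1(x,t) = psi_1(x) exp(-i E_1 t) in the infinite square well
# (hbar = m = L = 1), <H> should equal E_1 at every time t.
N  = 2000                       # grid points on [0, 1]
dx = 1.0 / N
x  = [i * dx for i in range(N + 1)]
E1 = math.pi**2 / 2

def expect_H(t):
    """<H>_Psi(t) = integral Psi* (-1/2 d^2/dx^2) Psi dx, finite differences."""
    phase = cmath.exp(-1j * E1 * t)
    Psi = [math.sqrt(2) * math.sin(math.pi * xi) * phase for xi in x]
    total = 0.0
    for i in range(1, N):       # interior points; Psi = 0 at the walls
        HPsi = -0.5 * (Psi[i - 1] - 2 * Psi[i] + Psi[i + 1]) / dx**2
        total += (Psi[i].conjugate() * HPsi).real * dx
    return total

H0 = expect_H(0.0)
H1 = expect_H(1.7)              # any later time works the same way
print(H0, H1, E1)
```

The overall phase <math>e^{-iE_1t/\hbar}</math> cancels against its conjugate inside the integral, so the two printed expectation values agree with each other and with <math>E_1</math>.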

== For a general wave function, on the other hand ==

  * As an example of a non-stationary wave function, consider: <math>\Psi(x,t) = \frac{1}{\sqrt{2}}\psi_1(x)e^{\frac{-i E_1 t}{\hbar}} + \frac{1}{\sqrt{2}}\psi_2(x)e^{\frac{-i E_2 t}{\hbar}}</math>
  * Then the complex conjugate is: <math>\Psi^*(x,t) = \frac{1}{\sqrt{2}}\psi^*_1(x)e^{\frac{+i E_1 t}{\hbar}} + \frac{1}{\sqrt{2}}\psi^*_2(x)e^{\frac{+i E_2 t}{\hbar}}</math>
  * The cross terms of <math>\Psi^*(x,t)\Psi(x,t)</math> become <math>e^{i(\frac{E_1-E_2}{\hbar})t}</math> and <math>e^{-i(\frac{E_1-E_2}{\hbar})t}</math>
  * If you add these two, the time dependence still doesn't disappear.  However, in the case of <math><\hat{H}></math>, these cross terms do go away if you carry out the calculation, because <math>\psi_1(x)</math> and <math>\psi_2(x)</math> are orthogonal!
  * Note that if we define <math>\xi \equiv \frac{E_1 - E_2}{\hbar}t</math>, we get <math>e^{i\xi} + e^{-i\xi}</math> in calculations like the one above.
  * Knowing that <math>e^{\pm i\xi}=\cos{\xi} \pm i\sin{\xi}</math>, the imaginary parts cancel, but the real part, <math>2\cos{\xi}</math>, remains.
  * This still leaves us with a time-dependent portion.
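The contrast can be seen numerically. Again this is my own illustration (infinite square well, <math>\hbar=m=L=1</math>, finite-difference <math>\hat{H}</math>, all my choices): for the 50/50 superposition of <math>\psi_1</math> and <math>\psi_2</math>, the probability density at a fixed point oscillates in time, yet <math><\hat{H}></math> stays fixed at <math>\frac{E_1+E_2}{2}</math> because the cross terms integrate to zero.

```python
import math, cmath

# My illustration: Psi = (psi_1 e^{-i E_1 t} + psi_2 e^{-i E_2 t}) / sqrt(2)
# in the infinite square well, hbar = m = L = 1.
N  = 2000
dx = 1.0 / N
E1, E2 = math.pi**2 / 2, 2 * math.pi**2

def psi(n, xi):                       # square-well eigenfunctions
    return math.sqrt(2) * math.sin(n * math.pi * xi)

def Psi(t):
    p1 = cmath.exp(-1j * E1 * t) / math.sqrt(2)
    p2 = cmath.exp(-1j * E2 * t) / math.sqrt(2)
    return [psi(1, i * dx) * p1 + psi(2, i * dx) * p2 for i in range(N + 1)]

def expect_H(t):
    w = Psi(t)
    s = 0.0
    for i in range(1, N):             # finite-difference H applied to Psi
        HPsi = -0.5 * (w[i - 1] - 2 * w[i] + w[i + 1]) / dx**2
        s += (w[i].conjugate() * HPsi).real * dx
    return s

t1 = math.pi / (E2 - E1)              # half a beat period, xi = pi
dens0 = abs(Psi(0.0)[N // 4]) ** 2    # |Psi(x = 0.25, t)|^2
dens1 = abs(Psi(t1)[N // 4]) ** 2
H0, H1 = expect_H(0.0), expect_H(t1)
print(dens0, dens1, H0, H1)
```

The density at <math>x=0.25</math> changes dramatically between the two times (the <math>2\cos{\xi}</math> cross term flips sign), while the two <math><\hat{H}></math> values agree.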

== For a general operator with stationary states ==

  * <math><\hat{\Theta}>_{\Psi_n} = \int_{-\infty}^{\infty} \Psi_n^*(x,t)\hat{\Theta}\Psi_n(x,t) dx</math>, where <math>\Psi_n^*(x,t)</math> has an <math>e^{\frac{iE_nt}{\hbar}}</math> component, and <math>\Psi_n(x,t)</math> has an <math>e^{\frac{-iE_nt}{\hbar}}</math> component.
  * The time-dependent parts cancel, resulting in <math><\hat{\Theta}></math> being constant with respect to time.
  * Note: <math>\hat{\Theta}</math> is just a general operator, with no explicit time dependence assumed.  Also, the subscript //n// appended to the two <math>\Psi</math>'s implies we are dealing with stationary states.
  * From the above, we conclude that <math><\hat{\Theta}></math> is constant in time for a stationary state.

== For a general operator with non-stationary states ==

  * For an arbitrary <math>\Psi(x,t)</math>, we have: <math><\hat{\Theta}>_\Psi = \int_{-\infty}^{\infty} \Psi^*(x,t)\hat{\Theta}\Psi(x,t)dx</math>, which will not necessarily be constant with respect to time.
  * For example, consider <math>\Psi(x,t) = c_1\psi_1(x)e^{\frac{-iE_1t}{\hbar}}+c_2\psi_2(x)e^{\frac{-iE_2t}{\hbar}}</math>.
  * Following the same logic as in the previous section, we conclude that the time dependence which arises from cross terms will not necessarily disappear.  It turns out that if the operator represents a physical quantity which would be conserved in Classical Mechanics (like angular momentum, under certain conditions), the time dependence of its expectation value in QM will also be zero.
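A concrete operator where the cross terms survive is <math>\hat{x}</math>. The sketch below is my own example (infinite square well, <math>\hbar=m=L=1</math>): <math><x></math> is constant for the stationary state <math>\psi_1</math>, but sloshes back and forth for the 50/50 superposition of <math>\psi_1</math> and <math>\psi_2</math>.

```python
import math, cmath

# My example: <x> = integral x |Psi|^2 dx for Psi = sum_n c_n psi_n e^{-i E_n t}
# in the infinite square well, hbar = m = L = 1.
N  = 2000
dx = 1.0 / N
E1, E2 = math.pi**2 / 2, 2 * math.pi**2

def expect_x(c1, c2, t):
    s = 0.0
    for i in range(N + 1):
        xi = i * dx
        w = (c1 * math.sqrt(2) * math.sin(math.pi * xi) * cmath.exp(-1j * E1 * t)
           + c2 * math.sqrt(2) * math.sin(2 * math.pi * xi) * cmath.exp(-1j * E2 * t))
        s += xi * abs(w) ** 2 * dx
    return s

t1 = math.pi / (E2 - E1)              # half a beat period
# Stationary state psi_1: <x> = 1/2 at every time (phase cancels in |Psi|^2).
x_stat0, x_stat1 = expect_x(1, 0, 0.0), expect_x(1, 0, t1)
# 50/50 superposition: the cross term makes <x> oscillate.
c = 1 / math.sqrt(2)
x_sup0, x_sup1 = expect_x(c, c, 0.0), expect_x(c, c, t1)
print(x_stat0, x_stat1, x_sup0, x_sup1)
```

The stationary-state values are identical, while the superposition values differ noticeably between the two times.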

== Energy ==

  * <math><\hat{H}>_{\psi_n} = E_n</math>.  To show this, we can do the following.
  * We often write the Hamiltonian in this form: <math>\hat{H} = -\frac{\hbar^2}{2m}\partial^2_x + V</math>, where <math>\partial_x</math> is short-hand for <math>\frac{\partial}{\partial x}</math>
  * Then the Schroedinger Equation becomes <math>\hat{H}\psi_n = E_n\psi_n</math>.  Using this form, we can calculate the expectation value of the Hamiltonian in the following fashion: <math><\hat{H}>_{\psi_n} = \int \psi_n^*\hat{H}\psi_ndx = \int\psi_n^*E_n\psi_ndx</math>, where <math>E_n</math> is not an operator, but rather a number.  (See [[lec_notes_0914]].)
  * Thus this becomes: <math>E_n\int\psi_n^*\psi_ndx = E_n</math>
  * Since wave functions have to be normalized, we can assume <math>\int\psi_n^*\psi_ndx=1</math>.
  * If you calculate <math>\sigma^2 \equiv <\hat{H}^2> - <\hat{H}>^2</math>, you will find it to be zero.  If you don't see it immediately, you should think about this until you get zero.
  * This implies that there is no fluctuation: you get the **same** measurement every time.
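Here is a numerical version of that exercise (my own sketch, again for the square-well ground state with <math>\hbar=m=L=1</math>). It computes <math><\hat{H}^2></math> as <math>\int|\hat{H}\psi|^2dx</math>, which uses the Hermiticity of <math>\hat{H}</math> to avoid a fourth derivative; that rewriting is my shortcut, not from the lecture.

```python
import math

# My check: for the square-well ground state psi_1 (hbar = m = L = 1),
# sigma^2 = <H^2> - <H>^2 vanishes, so every energy measurement gives E_1.
# <H^2> is computed as integral |H psi|^2 dx (Hermiticity of H).
N  = 2000
dx = 1.0 / N
psi = [math.sqrt(2) * math.sin(math.pi * i * dx) for i in range(N + 1)]

H_avg, H2_avg = 0.0, 0.0
for i in range(1, N):                 # interior points; psi = 0 at the walls
    Hpsi = -0.5 * (psi[i - 1] - 2 * psi[i] + psi[i + 1]) / dx**2
    H_avg  += psi[i] * Hpsi * dx      # <H>
    H2_avg += Hpsi * Hpsi * dx        # <H^2> via Hermiticity

sigma2 = H2_avg - H_avg**2
print(H_avg, sigma2)
```

Since <math>\hat{H}\psi_1 = E_1\psi_1</math>, both <math><\hat{H}^2></math> and <math><\hat{H}>^2</math> come out as <math>E_1^2</math>, and the printed <math>\sigma^2</math> is zero to numerical precision.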
== How to find the c_i's ==

  * In general, if we accept the idea that the set of stationary state wave functions forms a "complete" set, any wave function can be expressed as <math>\psi(x) = \sum_n c_n\psi_n</math>
  * A big question may be, "How can we find out what the <math>c_i</math>'s are, knowing what the function <math>\psi(x)</math> is?"
  * To answer that question, Griffiths talks about "Fourier's trick."
  * I would say this is analogous to finding the //x// component of some vector <math>\vec{r}</math>, which we can obtain by calculating <math>\vec{r}\cdot\hat{x}</math>.  This method may sound overly complex because when <math>\vec{r}=[x, y, z]</math>, there is no need to come up with a more complex way to figure out what the //x//, //y// and //z// components are.  They are obviously //x//, //y// and //z//.  However, we sometimes encounter a vector space where no basis vectors are defined (wave functions are one such example), so that we cannot express the vector as <math>\vec{r}=[x, y, z]</math>.  The method we are discussing right now works even in such a case, or when the basis used to define the vector space and the basis along which you want to decompose a vector are different, as long as the dot product is defined, which is always the case (since the dot product is the way we define the lengths of vectors).
    * Suppose you have succeeded in decomposing a vector into its components; the following should hold: <math>\vec{r} = x\hat{x}+y\hat{y}+z\hat{z}</math>
    * This equation is a vector equation, which means that there are effectively 3 (or whatever the dimension of the vector space is) equations.  To find the values of //x//, //y// and //z//, it may be easier if we convert it into individual equations.  Taking a dot product with this vector equation is one way to accomplish this goal.  If we take the dot product with <math>\hat{x}</math>, for example, we will get <math>\hat{x}\cdot \vec{r} = \hat{x}\cdot(x\hat{x}+y\hat{y}+z\hat{z})</math>
    * If one realizes that <math>\hat{x}\cdot \hat{x}=1</math>, <math>\hat{x}\cdot \hat{y}=0</math> and <math>\hat{x}\cdot \hat{z}=0</math>, this is just //x//, //i.e.// <math>x=\hat{x}\cdot \vec{r}</math>

  * We have not learned this yet, but it turns out that in QM, applying <math>\int\psi_n^* ({\rm wave\ function})dx</math> to the wave function is the dot product, where the subscript //n// specifies the particular component (i.e. x, y, z, 1, 2, 3, etc.)
    * For example, the dot product between <math>\psi_n(x)</math> and <math>\psi_m(x)</math> would be <math><m|n>=\int\psi^*_m(x)\psi_n(x)dx=0</math> unless <math>m=n</math>.  Note that in QM, the notation <math><m|n></math> is used to represent the dot product between two wave functions; the wave functions are represented only by their indices, //m// and //n//, here.  We say that the two wave functions are orthogonal, and the dot product of two orthogonal vectors is zero, too.
  * Borrowing the idea used in the vector decomposition above, we take the dot product of the expansion <math>\psi(x) = \sum_n c_n\psi_n</math> with <math>\psi_m</math> and get <math>\int\psi_m^*\psi(x)dx = \int\psi_m^*(\sum_n c_n\psi_n)dx</math>
  * Since the <math>c_n</math>'s are constants, we can pull them out of the integral.  Then, due to orthogonality:
    * For <math>m \neq n</math>, we have <math>\int\psi_m^*\psi_ndx = 0</math>, and
    * For <math>m = n</math>, we have <math>\int\psi_m^*\psi_mdx = 1</math>
  * The only term in the sum which survives is the term where <math>m = n</math>, leaving us with <math>\sum_n c_n\int\psi_m^*\psi_ndx = c_m</math>
  * Thus, the left hand side becomes: <math>\int\psi_m^*\psi(x)dx = c_m</math>
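Fourier's trick is easy to try numerically. In my own example below (square-well eigenfunctions with <math>\hbar=m=L=1</math>; the coefficients 0.6 and 0.8 are chosen purely for illustration), we build <math>\psi = 0.6\,\psi_1 + 0.8\,\psi_2</math> and recover each <math>c_m</math> from the "dot product" <math>\int\psi_m^*\psi\,dx</math>.

```python
import math

# My illustration of Fourier's trick: reading off c_m with a dot product,
# exactly like reading off vector components with r . x_hat.
N  = 2000
dx = 1.0 / N

def psi_n(n, xi):                     # square-well eigenfunctions, L = 1
    return math.sqrt(2) * math.sin(n * math.pi * xi)

# psi = 0.6 psi_1 + 0.8 psi_2 (0.6^2 + 0.8^2 = 1, so psi is normalized)
psi = [0.6 * psi_n(1, i * dx) + 0.8 * psi_n(2, i * dx) for i in range(N + 1)]

def c(m):
    """Fourier's trick: c_m = integral psi_m(x) psi(x) dx (psi_m is real here)."""
    return sum(psi_n(m, i * dx) * psi[i] * dx for i in range(N + 1))

coeffs = [c(m) for m in (1, 2, 3)]
print(coeffs)                         # close to [0.6, 0.8, 0.0]
```

Orthogonality kills every term of the sum except <math>m=n</math>, so the projection onto <math>\psi_3</math> comes out zero even though we never "solved" for anything.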

**To go back to the lecture note list, click [[lec_notes]]**\\
**previous lecture note: [[lec_notes_0914]]**\\
**next lecture note: [[lec_notes_0918]]**\\
  
classes/2009/fall/phys4101.001/lec_notes_0916.1252763530.txt.gz · Last modified: 2009/09/12 08:52 by yk