**next lecture note: [[lec_notes_0916]]**\\
  
**Main class wiki: [[home]]**
  
=== Main Points ===
  * The general solution is a linear combination of separable solutions. Each <math>\psi_n(x)</math> has a corresponding exponential <math> \phi(t) </math> with an energy <math> E_n </math>, and a proportionality constant <math> c_n </math>, as written out below (Griffiths p26-28)
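Written out explicitly, this general solution takes the standard form
<math>\Psi(x,t) = \sum_n c_n\,\psi_n(x)\,{\rm e}^{-iE_n t/\hbar}</math>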
  
In the following, the <del>deleted</del> and //italicized// parts are Yuichi's edits.

The Time-Independent Schrödinger equation can be <del>separated</del> //mapped// into a matrix //equation for an eigenvector and an eigenvalue://
  
<math>[-\frac{\hbar^2}{2m}\frac{\partial^2}{\partial x^2} + V] \psi = E \psi  </math>
=>  <math> M \psi = \lambda \psi</math>
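One concrete way to picture this mapping: if the wave function is sampled at grid points <math>x_1, \dots, x_n</math> spaced by <math>\Delta x</math>, then <math>\psi</math> becomes a column of numbers and the second derivative becomes a combination of neighboring entries,
<math>\frac{\partial^2 \psi}{\partial x^2}\Big|_{x_j} \approx \frac{\psi_{j+1} - 2\psi_j + \psi_{j-1}}{(\Delta x)^2},</math>
so the bracketed operator acts like an <math>n \times n</math> matrix on that column vector.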
  
//This matrix equation can be interpreted in the following way.  When a matrix operates on a vector, the result is another vector.  In general, this resulting vector differs from the original vector in both direction and length.//
  
//For example, we can think of// a simple transformation //created by the familiar// 2-dimensional rotation matrix: <math>R = \left[\begin{array}{cc} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{array}\right]</math>
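To see the effect concretely, apply //R// to the unit vector along the first axis:
<math>\left[\begin{array}{cc} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{array}\right] \left[\begin{array}{c} 1 \\ 0 \end{array}\right] = \left[\begin{array}{c} \cos\theta \\ -\sin\theta \end{array}\right],</math>
which points in a different direction for a generic angle θ.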
  
-For most physics applications eigenvectors are perpendicular, so a vector **x** can be resolved into its perpendicular components projected onto the eigenvectors quite easily. +//With this matrix, all vectors are rotated by an angle θ and therefore change their directions.  
 +<del>For the Eigenvalue equation, the matrix //M// does //not// change the direction of the eigenvector <math>\psi</math>, only its magnitude; it is equivalent to multiplication by a //constant//.</del> 
 +Looking at the eigenvector equation above, it suggests that the two vectors, the original and the one after the transformation by the matrix, M, are in the same direction.  //i.e.// for the matrix M, vector// <math>\psi</math> //is a special vector which does not change its direction after being operated on by the matrix, M.// <del>operator that acts on the wave function and an eigenvector, which equal an eigenvalue multiplied by the same eigenvector</del>: 
 + 
//With this interpretation, the rotation matrix// R //shown above cannot have eigenvectors.  This is reflected in the fact that this rotation matrix does NOT have real eigenvalues; i.e., there are no vectors whose directions remain the same after the rotation.  However, if we broaden our scope and accept complex eigenvalues and vectors with complex components, even this matrix does have two eigenvalues and corresponding eigenvectors.  They are// <math>\cos\theta \pm i\sin\theta = {\rm e}^{\pm i\theta}</math> //and// <math>\left[\begin{array}{c} 1 \\ \pm i \end{array}\right]</math>//, respectively.//
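As a quick check, multiplying these vectors by //R// gives
<math>\left[\begin{array}{cc} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{array}\right] \left[\begin{array}{c} 1 \\ \pm i \end{array}\right] = \left[\begin{array}{c} \cos\theta \pm i\sin\theta \\ -\sin\theta \pm i\cos\theta \end{array}\right] = {\rm e}^{\pm i\theta} \left[\begin{array}{c} 1 \\ \pm i \end{array}\right],</math>
so each of these special vectors comes back multiplied by a constant, exactly as the eigenvector equation requires.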
//In order to see or emphasize this parallel between the time-independent Schrödinger equation and the eigenvector equation, we often write the Schrödinger equation in the following form://
<del>Where the Energy is the eigenvalue and the matrix is the Hamiltonian Operator:</del>
<math>H\psi = E\psi</math>
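Here //H// is shorthand for the operator in the square bracket of the earlier equation, the Hamiltonian operator:
<math>H = -\frac{\hbar^2}{2m}\frac{\partial^2}{\partial x^2} + V</math>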
//It may seem strange to be able to figure out both the unknown wave function AND the unknown energy value from a single equation.  It almost looks like two unknowns can be determined from one equation.  For the eigenvector equation, the same thing is happening.  The unknown eigenvector has //n// unknowns, in addition to one more unknown, the eigenvalue.  So altogether there are //n+1// unknowns, whereas the matrix equation represents only //n// equations.  How is this possible?  Doesn't it go against the basics of algebra?  It turns out that we are not able to determine everything about the eigenvector: we can determine only its direction, not its length.  The length must be determined by other means.  For our Schrödinger equation, the same thing happens - the normalization of the wave function is NOT determined by the equation itself.  Instead, the requirement that the total probability be 1 determines the "length" of the wave function.//
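The same point in equation form: since //H// is a linear operator, if <math>H\psi = E\psi</math> then <math>H(A\psi) = E(A\psi)</math> for any constant <math>A</math>, so the equation by itself cannot fix <math>A</math>; it is the normalization condition <math>\int |\psi|^2\,dx = 1</math> that pins down the "length" of the wave function (up to an overall phase).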
<del>We know from Linear Algebra that an //n// dimensional matrix //M// and the Eigenvalue/vector equation can be solved for <math>(n-1)</math> variables and <math>\lambda</math>. Multiplication by the matrix M represents a linear transformation of <math>\psi</math>, and the eigenvalue equation represents a transformation that maps all values of <math>\psi</math> to zero.</del>

For most physics applications, the matrix is Hermitian, and consequently its eigenvectors are perpendicular, so //they usually form an orthonormal basis with which all other vectors can be expressed as linear combinations.// A vector **x** can be <del>resolved</del> //decomposed// into its <del>perpendicular</del> components projected onto the eigenvectors quite easily. //Note that even if the eigenvectors are not orthogonal, as long as they are linearly **independent**, decomposition of vectors is possible, though figuring out the proper coefficients <math>c_n</math> will be trickier.//
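For an orthonormal set, the coefficients follow from a simple projection: if <math>f(x) = \sum_n c_n \psi_n(x)</math>, then <math>c_n = \int \psi_n^*(x)\, f(x)\, dx</math> (what Griffiths calls "Fourier's trick").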
  
The Hydrogen Atom has an infinite number of Energy levels, so an infinite number of eigenvalues are possible. This also implies that the transformation matrix //M// can be infinite-dimensional.
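For reference, the hydrogen energy levels are <math>E_n = -\frac{13.6\ {\rm eV}}{n^2}</math> with <math>n = 1, 2, 3, \dots</math>, one eigenvalue for every positive integer //n//.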
  
  * Every measurement of Energy will return the exact same value, //E// (quantified below).
  * more on this topic in [[lec_notes_0916|tomorrow's notes]].
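In terms of expectation values, this is the statement that for such a state <math>\langle H \rangle = E</math> and <math>\sigma_H^2 = \langle H^2 \rangle - \langle H \rangle^2 = E^2 - E^2 = 0</math>, so repeated energy measurements show no spread.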
  
**To go back to the lecture note list, click [[lec_notes]]**\\