Hilbert Subspace Calculations


This appendix collects formulae that express certain quantities associated with the Hilbert space Hilb(n) in terms of the corresponding quantities calculated for its "positive" and "negative" subspaces. This division is invariantly defined under the adjoint action of the group U(G, n) by its Cartan canonical decomposition.




Consider the effect on equation (13.8b) of the restriction to a subspace, expressed in the |n, k> basis. We break the calculation of (13.5) according to the "positive subspace" spanned by the N(n) eigenvectors


        |n, 0>, |n, 1>, ... |n, n-2>, where

             <n, k| G(n) |n, k> = 1  > 0

for k not= n-1; and the one-dimensional "negative subspace" spanned by |n, n-1>, where

     <n, n-1| G(n) |n, n-1>  = (1-n) < 0

An arbitrary vector is written

          | |x> |
     |> = |     |
          |  w  |

where |x> is an (n-1)-dimensional vector. Similarly, an n-dimensional linear functional is written ( '*' = complex conjugate ):

     | <x|   w* |

Then,

     <G(n)> = <x|x> - (n-1) |w|^2                  (G.1)


   which in QM would be the (squared) norm of the state |>.
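
As a quick numerical check of (G.1), the following Python/NumPy sketch (the
choice n = 4, the random test vector, and all variable names are illustrative,
not taken from the text) compares <G(n)>, computed on the full space with
G(n) = diag(1, ..., 1, 1-n), against the right-hand side of (G.1):

    import numpy as np

    n = 4
    G = np.diag([1.0] * (n - 1) + [1.0 - n])      # G(n) in the |n, k> basis
    x = np.random.randn(n - 1) + 1j * np.random.randn(n - 1)   # upper (n-1)-dim part |x>
    w = 0.7 - 0.3j                                # lower scalar component
    psi = np.concatenate([x, [w]])

    lhs = np.vdot(psi, G @ psi)                   # <G(n)> on the full space
    rhs = np.vdot(x, x) - (n - 1) * abs(w) ** 2   # right-hand side of (G.1)
    print(np.allclose(lhs, rhs))                  # True
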
   The general matrix A has the form


           | M_1    |v_1> |
     A   = |              |                       (G.2)
           | <u_1|   m_1  |

where M_1 is an (n-1) X (n-1) submatrix, |v_1> is an (n-1)-dimensional vector, <u_1| is a linear functional on an (n-1)-dimensional vector space, and m_1 is a scalar.

If A is Hermitean with respect to the Euclidean inner product, then m_1 = m_1* (m_1 is real) and <u_1| = <v_1|, where <v_1| is the conjugate transpose of |v_1>; further, M_1 is Hermitean with respect to the inherited Euclidean inner product on the subspace.
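
The Hermitean conditions just stated can be exhibited concretely. A minimal
sketch, assuming NumPy and using illustrative names (M1, v1, u1, m1) that are
not defined anywhere in the text, extracts the blocks (G.2) of a random
Hermitean A and confirms the stated relations:

    import numpy as np

    n = 4
    B = np.random.randn(n, n) + 1j * np.random.randn(n, n)
    A = (B + B.conj().T) / 2                # make A Hermitean

    M1 = A[:n-1, :n-1]                      # (n-1) X (n-1) submatrix M_1
    v1 = A[:n-1, n-1]                       # the column |v_1>
    u1 = A[n-1, :n-1]                       # the row <u_1|
    m1 = A[n-1, n-1]                        # the scalar m_1

    print(np.allclose(u1, v1.conj()))       # <u_1| = <v_1| for Hermitean A
    print(np.isclose(m1.imag, 0.0))         # m_1 is real
    print(np.allclose(M1, M1.conj().T))     # M_1 is Hermitean on the subspace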

   Now express some standard formulae in terms of the (n-1, 1) decomposition of Hilb(n).

   APPLY A MATRIX OPERATOR TO A VECTOR:


        |M_1    |v_1>| ||x>|       | M_1|x> + w|v_1> |
        |            | |   |   =   |                 |     (G.3)
        |<u_1|   m_1 | | w |       | <u_1|x> + w m_1 |
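
A numerical sketch of (G.3) in the same spirit, again with a random complex A
and purely illustrative names:

    import numpy as np

    n = 4
    A = np.random.randn(n, n) + 1j * np.random.randn(n, n)
    M1, v1 = A[:n-1, :n-1], A[:n-1, n-1]
    u1, m1 = A[n-1, :n-1], A[n-1, n-1]

    x = np.random.randn(n - 1) + 1j * np.random.randn(n - 1)
    w = 0.2 + 0.5j
    psi = np.concatenate([x, [w]])

    upper = M1 @ x + w * v1                 # M_1|x> + w|v_1>
    lower = u1 @ x + w * m1                 # <u_1|x> + w m_1
    print(np.allclose(A @ psi, np.concatenate([upper, [lower]])))   # True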


   APPLY A MATRIX OPERATOR TO A LINEAR FUNCTIONAL:


               |M_1    |v_1>|
      |<x| w*| |            |
               |<u_1|   m_1 |

         = |<x|M_1 + w*<u_1|   <x|v_1> + w*m_1|   (G.4)
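
Similarly, (G.4) can be checked by acting with the row ( <x|  w* ) on A from
the left; the sketch below is only illustrative:

    import numpy as np

    n = 4
    A = np.random.randn(n, n) + 1j * np.random.randn(n, n)
    M1, v1 = A[:n-1, :n-1], A[:n-1, n-1]
    u1, m1 = A[n-1, :n-1], A[n-1, n-1]

    x = np.random.randn(n - 1) + 1j * np.random.randn(n - 1)
    w = 0.2 + 0.5j
    bra = np.concatenate([x.conj(), [np.conj(w)]])      # the row ( <x|  w* )

    left  = x.conj() @ M1 + np.conj(w) * u1             # <x|M_1 + w*<u_1|
    right = x.conj() @ v1 + np.conj(w) * m1             # <x|v_1> + w* m_1
    print(np.allclose(bra @ A, np.concatenate([left, [right]])))    # True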


AN EXPECTATION VALUE:


            | M_1    |v_1>| | |x> |
   |<x| w*| |             | |     |
            | <u_1|   m_1 | |  w  |

        = <x|M_1|x> + w<x|v_1> + w*<u_1|x> + m_1|w|^2


If A is Hermitean, just replace u_1 with v_1. So, the expectation value becomes:

     <x|M_1|x> + w<x|v_1> + w*<v_1|x> + m_1|w|^2
or
     <x|M_1|x> + 2Re( w<x|v_1> ) + m_1|w|^2
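
The Hermitean form of the expectation value can be verified the same way; in
this sketch the Hermitean A, the vector, and all names are arbitrary test data:

    import numpy as np

    n = 4
    B = np.random.randn(n, n) + 1j * np.random.randn(n, n)
    A = (B + B.conj().T) / 2                            # Hermitean A
    M1, v1, m1 = A[:n-1, :n-1], A[:n-1, n-1], A[n-1, n-1].real

    x = np.random.randn(n - 1) + 1j * np.random.randn(n - 1)
    w = 0.4 - 0.9j
    psi = np.concatenate([x, [w]])

    expect  = np.vdot(psi, A @ psi)                     # ( <x|  w* ) A ( |x>  w )
    formula = (np.vdot(x, M1 @ x)                       # <x|M_1|x>
               + 2 * (w * (x.conj() @ v1)).real         # 2Re( w<x|v_1> )
               + m1 * abs(w) ** 2)                      # m_1|w|^2
    print(np.allclose(expect, formula))                 # True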


MULTIPLY TWO MATRICES:



  | M_1    |v_1>  ||  M_2   |v_2> |
  |               ||              | =
  | <u_1|   m_1   ||  <u_2|  m_2  |

                 | M_1 M_2 + |v_1><u_2|   M_1|v_2> + m_2|v_1> |
                 |                                            |
                 |<u_1|M_2 + m_1<u_2|    <u_1|v_2> + m_1 m_2  |

If both matrices are Hermitean, replace u_1 with v_1 and
u_2 with v_2.


                 |  M_1 M_2 + |v_1><v_2|    M_1|v_2> + m_2|v_1> |
                 |                                              |
                 |  <v_1|M_2 + m_1<v_2|    <v_1|v_2> + m_1 m_2  |
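
A sketch checking the block form of the product of two arbitrary matrices
(random complex test data, illustrative names):

    import numpy as np

    n = 4
    A1 = np.random.randn(n, n) + 1j * np.random.randn(n, n)
    A2 = np.random.randn(n, n) + 1j * np.random.randn(n, n)
    M1, v1, u1, m1 = A1[:n-1, :n-1], A1[:n-1, n-1], A1[n-1, :n-1], A1[n-1, n-1]
    M2, v2, u2, m2 = A2[:n-1, :n-1], A2[:n-1, n-1], A2[n-1, :n-1], A2[n-1, n-1]

    P = np.empty((n, n), dtype=complex)
    P[:n-1, :n-1] = M1 @ M2 + np.outer(v1, u2)     # M_1 M_2 + |v_1><u_2|
    P[:n-1, n-1]  = M1 @ v2 + m2 * v1              # M_1|v_2> + m_2|v_1>
    P[n-1, :n-1]  = u1 @ M2 + m1 * u2              # <u_1|M_2 + m_1<u_2|
    P[n-1, n-1]   = u1 @ v2 + m1 * m2              # <u_1|v_2> + m_1 m_2
    print(np.allclose(P, A1 @ A2))                 # True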


COMMUTATOR OF TWO MATRICES:

( [a,b] = "subcommutator" on the (n-1)-dim. subspace; in the block entries below, a scalar m_i combined with a matrix stands for m_i times the identity on that subspace )

  | | M_1    |v_1>| |  M_2   |v_2> | |
  | |             | |              | |        =
  | | <u_1|   m_1 |,|  <u_2|  m_2  | |

| [M_1,M_2] + (|v_1><u_2| - |v_2><u_1|)  (M_1 - m_1)|v_2> - (M_2 - m_2)|v_1> |
|                                                                            |
| <u_1|(M_2 - m_2) - <u_2|(M_1 - m_1)      <u_1|v_2> - <u_2|v_1>             |


If both matrices are Hermitean, take <u_1| = <v_1| and <u_2| = <v_2|,
eliminating <u_1| and <u_2|.  The commutator then has the form:


| [M_1,M_2] + (|v_1><v_2| - |v_2><v_1|)  (M_1 - m_1)|v_2> - (M_2 - m_2)|v_1> |
|                                                                            |
| <v_1|(M_2 - m_2) - <v_2|(M_1 - m_1)      <v_1|v_2> - <v_2|v_1>             |

The result is, of course, anti-Hermitean.
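
The commutator blocks can be checked likewise; in the sketch below m_1 and m_2
are written explicitly as multiples of the identity I on the (n-1)-dimensional
subspace (names illustrative):

    import numpy as np

    n = 4
    I = np.eye(n - 1)
    A1 = np.random.randn(n, n) + 1j * np.random.randn(n, n)
    A2 = np.random.randn(n, n) + 1j * np.random.randn(n, n)
    M1, v1, u1, m1 = A1[:n-1, :n-1], A1[:n-1, n-1], A1[n-1, :n-1], A1[n-1, n-1]
    M2, v2, u2, m2 = A2[:n-1, :n-1], A2[:n-1, n-1], A2[n-1, :n-1], A2[n-1, n-1]

    C = np.empty((n, n), dtype=complex)
    C[:n-1, :n-1] = (M1 @ M2 - M2 @ M1) + np.outer(v1, u2) - np.outer(v2, u1)
    C[:n-1, n-1]  = (M1 - m1 * I) @ v2 - (M2 - m2 * I) @ v1   # (M_1 - m_1)|v_2> - (M_2 - m_2)|v_1>
    C[n-1, :n-1]  = u1 @ (M2 - m2 * I) - u2 @ (M1 - m1 * I)   # <u_1|(M_2 - m_2) - <u_2|(M_1 - m_1)
    C[n-1, n-1]   = u1 @ v2 - u2 @ v1                         # <u_1|v_2> - <u_2|v_1>
    print(np.allclose(C, A1 @ A2 - A2 @ A1))                  # True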


THE SQUARE OF AN HERMITEAN MATRIX:

              | M        |v> |
    A    =    |              |
              |<v|        m  |

is then

              | M M + |v><v|    (M + m)|v>  |
    A^2    =  |                             |
              | <v|(M + m)     <v|v> + m^2   |
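
A sketch of the A^2 block formula for a random Hermitean A (illustrative
names; m is again understood as m times the identity I on the subspace):

    import numpy as np

    n = 4
    B = np.random.randn(n, n) + 1j * np.random.randn(n, n)
    A = (B + B.conj().T) / 2
    M, v, m = A[:n-1, :n-1], A[:n-1, n-1], A[n-1, n-1].real
    I = np.eye(n - 1)

    S = np.empty((n, n), dtype=complex)
    S[:n-1, :n-1] = M @ M + np.outer(v, v.conj())   # M M + |v><v|
    S[:n-1, n-1]  = (M + m * I) @ v                 # (M + m)|v>
    S[n-1, :n-1]  = v.conj() @ (M + m * I)          # <v|(M + m)
    S[n-1, n-1]   = np.vdot(v, v) + m ** 2          # <v|v> + m^2
    print(np.allclose(S, A @ A))                    # True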


Expectation value of the square of the not necessarily Hermitean
matrix A on the vector | |x>  w |:

     <A^2> = ( <x|M^2|x> + <x|v><u|x> )  +  w <x|(M + m)|v>
           +  w*<u|(M + m)|x>   +   ( <u|v> + m^2 ) |w|^2

If w = 0 (restriction of the vector to the (n-1)-dimensional upper subspace),

          = <x|M^2|x> + <x|v><u|x>

With A Hermitean:

     <A^2> = ( <x|M^2|x> + <x|v><v|x> )  +  w <x|(M + m)|v>
           +  w*<v|(M + m)|x>   +   ( <v|v> + m^2 ) |w|^2

            =  <x|M^2|x> + |<x|v>|^2 + 2 Re( w <x|(M + m)|v> )
            +   ( <v|v> + m^2 ) |w|^2

If w = 0,

      <A^2> = <x|M^2|x> + <x|v><v|x>

              = <x|M^2|x> + |<x|v>|^2
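
The Hermitean <A^2> formula, including its w = 0 restriction, can be tested
numerically; the sketch below uses arbitrary data and illustrative names:

    import numpy as np

    n = 4
    B = np.random.randn(n, n) + 1j * np.random.randn(n, n)
    A = (B + B.conj().T) / 2
    M, v, m = A[:n-1, :n-1], A[:n-1, n-1], A[n-1, n-1].real
    I = np.eye(n - 1)

    x = np.random.randn(n - 1) + 1j * np.random.randn(n - 1)
    for w in (0.6 - 0.2j, 0.0):                     # general w, then w = 0
        psi = np.concatenate([x, [w]])
        expect  = np.vdot(psi, A @ A @ psi)
        formula = (np.vdot(x, M @ M @ x)                            # <x|M^2|x>
                   + abs(np.vdot(v, x)) ** 2                        # |<x|v>|^2
                   + 2 * (w * (x.conj() @ (M + m * I) @ v)).real    # 2Re( w<x|(M + m)|v> )
                   + (np.vdot(v, v) + m ** 2) * abs(w) ** 2)        # ( <v|v> + m^2 )|w|^2
        print(np.allclose(expect, formula))         # True, True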


SQUARE OF THE EXPECTATION VALUE:

     <A>^2   =  <x|M|x>^2 + (w<x|v>)^2 + (w*<u|x>)^2 + (m |w|^2)^2
            + 2w<x|M|x><x|v> + 2w*<x|M|x><u|x> + 2m|w|^2 <x|M|x>
            + 2w*w<x|v><u|x> + 2mw |w|^2 <x|v>
            + 2mw* |w|^2 <u|x>

           =  <x|M|x>^2 + (w<x|v>)^2 + (w*<u|x>)^2 + (m |w|^2)^2
            + 2w<x|v>( <x|M|x> + m |w|^2 )
            + 2w*<u|x>( <x|M|x> + m |w|^2 )
            + 2|w|^2( m<x|M|x> + <x|v><u|x> )

     If w = 0

     <A>^2   =  <x|M|x>^2



If A is Hermitean and w not= 0:

     <A>^2   =  <x|M|x>^2 + (w<x|v>)^2 + (w*<v|x>)^2 + (m |w|^2)^2
            + 2w<x|M|x><x|v> + 2w*<x|M|x><v|x> + 2m|w|^2 <x|M|x>
            + 2w*w<x|v><v|x> + 2mw |w|^2 <x|v>
            + 2mw* |w|^2 <v|x>

           =  <x|M|x>^2 + (w<x|v>)^2 + (w*<v|x>)^2 + (m |w|^2)^2
            + 4Re( w<x|v> )( <x|M|x> + m |w|^2 )
            + 2|w|^2( m<x|M|x> + <x|v><v|x> )
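
Finally, a sketch of <A>^2 in the grouped form above, for a random, not
necessarily Hermitean A (all names illustrative):

    import numpy as np

    n = 4
    A = np.random.randn(n, n) + 1j * np.random.randn(n, n)
    M, v, u, m = A[:n-1, :n-1], A[:n-1, n-1], A[n-1, :n-1], A[n-1, n-1]

    x = np.random.randn(n - 1) + 1j * np.random.randn(n - 1)
    w = 0.3 + 0.8j
    psi = np.concatenate([x, [w]])

    a = np.vdot(x, M @ x)           # <x|M|x>
    b = w * (x.conj() @ v)          # w<x|v>
    c = np.conj(w) * (u @ x)        # w*<u|x>
    d = m * abs(w) ** 2             # m|w|^2
    expect_sq = np.vdot(psi, A @ psi) ** 2
    grouped = (a ** 2 + b ** 2 + c ** 2 + d ** 2
               + 2 * b * (a + d) + 2 * c * (a + d)
               + 2 * abs(w) ** 2 * (m * a + (x.conj() @ v) * (u @ x)))
    print(np.allclose(expect_sq, grouped))          # True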



