
Tuesday, July 14, 2015

15. Formalism of Quantum Mechanics: Vector Space



In quantum mechanics, the state of a system is described by an element of an abstract vector space, the so-called state space. In Dirac notation, an element of this space is called a ket and is denoted by the symbol $\left| {} \right\rangle$.
A vector space consists of a set of vectors | α >, | β >, | γ > ... together with a set of scalars (a, b, c …). In fact, we will be working with vectors that live in spaces of infinite dimension, and the scalars will be ordinary complex numbers. Two important operations are defined: vector addition and scalar multiplication.

Vector addition:
The ‘sum’ of any two vectors is another vector: | α > + | β > = | γ >
Commutative: | α > + | β > = | β > + | α >         
Associative: | α > + (| β > + | γ >) = (| α > + | β >) + | γ >                    
There is a zero vector | 0 > with the property: | α > + | 0 > = | α >     
The associated inverse vector | -α > has the property: | α > + | -α >  = | 0 >

Scalar multiplication:
The ‘product’ of a scalar with a vector is another vector: a | α > = | γ >
Distributive:     a (| α > + | β >) = a | α > + a | β >
                           (a + b) | α > = a | α > + b | α >
Associative: a (b | α >) = (ab) | α >
Multiplication by the scalars 0 and 1 has the properties: 0 | α > = | 0 >; 1 | α > = | α >; and the inverse vector can be written | -α > = (-1) | α >.
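As a minimal numerical sketch (not from the original notes), the axioms above can be checked on finite-dimensional complex vectors with NumPy; the specific vectors and scalars are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "kets" represented as complex component arrays, plus two complex scalars.
alpha = rng.standard_normal(3) + 1j * rng.standard_normal(3)
beta = rng.standard_normal(3) + 1j * rng.standard_normal(3)
a, b = 2 + 1j, -0.5 + 3j

# Commutativity of vector addition: |α> + |β> = |β> + |α>
assert np.allclose(alpha + beta, beta + alpha)

# Distributivity of scalar multiplication over vectors and scalars:
assert np.allclose(a * (alpha + beta), a * alpha + a * beta)
assert np.allclose((a + b) * alpha, a * alpha + b * alpha)

# Zero vector and inverse vector: |α> + |0> = |α>, |α> + (-1)|α> = |0>
zero = np.zeros(3)
assert np.allclose(alpha + zero, alpha)
assert np.allclose(alpha + (-1) * alpha, zero)
```

Of course, a numerical check on random vectors is not a proof; it only illustrates what the axioms assert.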

Two important definitions in linear algebra are linear combinations and linear dependence; both are closely related to systems of linear equations. A vector | v > is a linear combination of the vectors | α >, | β >, | γ > ... | ω > if there exist scalars (k1, k2, k3 … kn) such that:

| v > = k1 | α > + k2 | β > + k3 | γ > + ... + kn | ω >                                         (15.1)

that is, if the vector equation:

| v > = x1 | α > + x2 | β > + x3 | γ > + ...+ xn | ω >                                        (15.2)

has a solution, where the xi are the unknown scalars.
The vectors | α >, | β >, | γ > ... | ω > are linearly dependent if there exist scalars (k1, k2, k3 … kn), not all zero, such that:

k1 | α > + k2 | β > + k3 | γ > + ... + kn | ω > = 0                                      (15.3)

that is, if the vector equation:

x1 | α > + x2 | β > + x3 | γ > + ... + xn | ω > = 0                                   (15.4)

has a nonzero solution, where the xi are unknown scalars. Otherwise, the vectors are said to be linearly independent. For instance, in three dimensions the unit vector $\hat{k}$ is linearly independent of $\hat{i}$ and $\hat{j}$, but any vector in the xy-plane is linearly dependent on $\hat{i}$ and $\hat{j}$. A set of vectors is linearly independent if each one is linearly independent of all the rest. A collection of vectors spans the space if every vector can be written as a linear combination of the members of this set. A set of linearly independent vectors that spans the space is called a basis, and the number of vectors in any basis is the dimension of the space. With respect to a prescribed basis | e1 >, | e2 >, | e3 > ... | en >, any given vector is represented by the n-tuple of its components, | α > ↔ (a1, a2, a3 ... an):
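The $\hat{i}$, $\hat{j}$, $\hat{k}$ example above can be checked numerically: a set of vectors is linearly independent exactly when the matrix with those vectors as columns has full column rank. A short sketch with NumPy (the in-plane vector is made up for illustration):

```python
import numpy as np

# Standard unit vectors in three dimensions.
i_hat = np.array([1.0, 0.0, 0.0])
j_hat = np.array([0.0, 1.0, 0.0])
k_hat = np.array([0.0, 0.0, 1.0])
v_xy = np.array([2.0, -3.0, 0.0])  # an arbitrary vector in the xy-plane

# Full column rank <=> the columns are linearly independent.
M_indep = np.column_stack([i_hat, j_hat, k_hat])
M_dep = np.column_stack([i_hat, j_hat, v_xy])

print(np.linalg.matrix_rank(M_indep))  # 3: {i, j, k} is independent (a basis)
print(np.linalg.matrix_rank(M_dep))    # 2: v_xy = 2 i_hat - 3 j_hat is dependent
```

The rank test is just Eq. (15.4) in matrix form: a nonzero solution of M x = 0 exists precisely when the rank is below the number of columns.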

| α > = a1 | e1 > + a2 | e2 > + a3 | e3 > + ... + an | en >                                 (15.5)

It is often more convenient to work with the components than with the ‘abstract’ vectors. For instance, addition of two vectors can be done by adding the corresponding components:

| α > + | β > ↔ (a1 + b1, a2 + b2, a3 + b3 ...  an + bn)                                 (15.6)

Multiplication by a scalar can be simply done by multiplying each component:

c | α > ↔ (ca1, ca2, ca3 ... can)                                               (15.7)

The zero vector is represented by a string of zeroes:

| 0 > ↔ (0, 0, 0 ... 0)                                                     (15.8)

And the components of the inverse vector have their signs reversed:

| -α > ↔ (-a1, -a2, -a3 ... -an)                                                (15.9)
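The component rules in Eqs. (15.6)–(15.9) map directly onto array arithmetic; here is a brief sketch with NumPy (the component values are made up for illustration):

```python
import numpy as np

alpha = np.array([1 + 2j, 3 - 1j, 0.5j])
beta = np.array([2 - 1j, -1 + 1j, 4.0 + 0j])
c = 2 - 3j

# Eq. (15.6): addition is done component by component.
assert np.allclose(alpha + beta, [a + b for a, b in zip(alpha, beta)])

# Eq. (15.7): scalar multiplication scales each component.
assert np.allclose(c * alpha, [c * a for a in alpha])

# Eq. (15.8): the zero vector is a string of zeroes.
assert np.allclose(alpha + np.zeros(3), alpha)

# Eq. (15.9): the inverse vector has each component's sign reversed.
assert np.allclose(-alpha, (-1) * alpha)
```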

There are two kinds of vector products in three-dimensional space: the inner (dot) product and the cross product.
A vector space equipped with an inner product is called an inner product space. The inner product of two vectors | α > and | β >, written < α | β >, is a complex number with the following properties:

< β | α > = < α | β >*                                                   (15.10)

< α | α > ≥ 0, and < α | α > = 0 ↔ | α > = | 0 >                                 (15.11)

< α | (b | β > + c | γ >) = b < α | β > + c < α | γ >                                (15.12)

The inner product of any vector with itself is a non-negative number; therefore, its square root is real. We call this the norm, or the “length,” of the vector:

|| α || ≡ √(< α | α >)                                                   (15.13)

A vector whose norm is 1 is said to be normalized, and two vectors whose inner product is zero are called orthogonal. An orthonormal set is a collection of mutually orthogonal normalized vectors, defined as follows:

< αi | αj > ≡ δij                                                    (15.14)

It is always possible, and convenient, to choose an orthonormal basis, so that the inner product can be written in terms of the components:

< α | β > = $a_{1}^{*}{{b}_{1}}+a_{2}^{*}{{b}_{2}}+...+a_{n}^{*}{{b}_{n}}$                          (15.15)

And the squared norm of a vector becomes:

< α | α > = | a1 |2 + | a2 |2 + ... + | an |2                                   (15.16)

with the components themselves expressed in terms of the basis vectors < ei |:

ai = < ei | α >                                                      (15.17)
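Equations (15.10) and (15.15)–(15.17) can be illustrated numerically. NumPy's `vdot` conjugates its first argument, which is exactly the convention in Eq. (15.15); the vectors below are made up for illustration, with the standard basis of C³ playing the role of | ei >:

```python
import numpy as np

# Orthonormal basis: the standard basis of C^3.
e = np.eye(3, dtype=complex)

alpha = np.array([1 + 1j, 2.0 + 0j, -1j])
beta = np.array([0.5 + 0j, 1 - 1j, 3.0 + 0j])

# Eq. (15.15): <α|β> = Σ a_i* b_i; np.vdot conjugates the first factor.
ip = np.vdot(alpha, beta)
assert np.isclose(ip, sum(a.conjugate() * b for a, b in zip(alpha, beta)))

# Eq. (15.10): <β|α> = <α|β>*
assert np.isclose(np.vdot(beta, alpha), ip.conjugate())

# Eq. (15.16): the squared norm is Σ |a_i|^2, a real number.
assert np.isclose(np.vdot(alpha, alpha).real, sum(abs(a) ** 2 for a in alpha))

# Eq. (15.17): components are recovered as a_i = <e_i|α>.
assert np.allclose([np.vdot(e[i], alpha) for i in range(3)], alpha)
```

Note that `np.dot` would not conjugate and therefore would not implement a Hermitian inner product; `np.vdot` is the one that matches Dirac's convention.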

The question then arises: what is the angle between two vectors? In ordinary vector analysis the angle between two vectors is given by:

$\cos \theta =\frac{\vec{a}\cdot \vec{b}}{|\vec{a}||\vec{b}|}$                                                    (15.18)

And by means of the Schwarz inequality:

|< α | β >|2 ≤ < α | α > < β | β >                                                 (15.19)

The angle between | α > and | β > can be generalized by the formula:

$\cos \theta =\sqrt{\frac{<\alpha |\beta ><\beta |\alpha >}{<\alpha |\alpha ><\beta |\beta >}}$                                                 (15.20)
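The Schwarz inequality (15.19) is what guarantees that the quotient in Eq. (15.20) is at most 1, so the generalized cos θ is well defined. A short numerical check, again with made-up complex vectors:

```python
import numpy as np

alpha = np.array([1 + 1j, -2j, 0.5 + 0j])
beta = np.array([2.0 + 0j, 1 - 1j, 1j])

ab = np.vdot(alpha, beta)          # <α|β>
aa = np.vdot(alpha, alpha).real    # <α|α>, real and non-negative
bb = np.vdot(beta, beta).real      # <β|β>, real and non-negative

# Schwarz inequality, Eq. (15.19): |<α|β>|^2 <= <α|α><β|β>.
assert abs(ab) ** 2 <= aa * bb

# Generalized angle, Eq. (15.20); <α|β><β|α> = |<α|β>|^2 is real,
# so cos θ is a real number in [0, 1].
cos_theta = np.sqrt((ab * ab.conjugate()).real / (aa * bb))
print(cos_theta)
```

Unlike the ordinary dot-product formula (15.18), this generalized angle is always between 0 and π/2, since the square root discards any phase information in < α | β >.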
