Contents

Introduction
Inner Products and Norms
Decomposition of a Vector
Proof of the Cauchy-Schwarz Inequality
Alternate Proof
The Triangle Inequality
The Uncertainty Principle
References

The Cauchy-Schwarz Inequality

July 24, 2009

1 Introduction

The Cauchy-Schwarz Inequality states that any time you take an inner product
(i.e. dot product) of two vectors, the result can't be larger in magnitude
than the product of the two vectors' lengths. This is the crucial theorem
underlying Heisenberg's Uncertainty Principle.

For the simple vectors we draw in Physics 101, the inequality is obvious,
because the dot product of two vectors a and b is

a · b = |a||b| cos θ (1)

with θ the angle between the vectors. Because cos θ ≤ 1, the dot product is less
than or equal to the product of the lengths. Also, the dot product equals the
product of the lengths only when the vectors point in the same direction. Here,
we want to generalize this result to abstract vector spaces, working from the
axioms. Since we’re talking about abstract vectors from now on, rather than
little arrows in a plane, we’ll switch to the bra-ket notation.
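Before moving to abstract spaces, the concrete statement is easy to check numerically (a sketch using NumPy; the vectors a and b are arbitrary example choices, not from the text):

```python
import numpy as np

# Two arbitrary example vectors in the plane
a = np.array([3.0, 4.0])
b = np.array([1.0, 2.0])

dot = np.dot(a, b)                                # a . b
lengths = np.linalg.norm(a) * np.linalg.norm(b)   # |a||b|

# cos(theta) = (a . b)/(|a||b|) has magnitude at most 1,
# so the dot product can't exceed the product of the lengths.
print(abs(dot) <= lengths)  # True
```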

2 Inner Products and Norms

Vector spaces can exist without inner products, but they’re less interesting that
way because we can’t define orthogonality or take projections. Inner products
are also closely tied to norms. If a vector space has an inner product defined,
we can define the norm of a vector v by

|v| = √〈v | v〉 (2)

Alternatively, if we have a norm but no inner product (as might be the case
in a physical situation, where the norm is what you get by laying down a ruler),
we can define the inner product of two vectors by

4〈v | w〉 ≡ |v + w|² − |v − w|² (3)

Although it isn’t immediately obvious, this definition is linear in both vec-
tors. For a complex vector space, the appropriate definition is

4〈a | b〉 ≡ |a + b|² − |a − b|² − i(|a + ib|² − |a − ib|²). (4)
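The complex polarization identity can be verified directly (a sketch using NumPy; it assumes the physics convention for the inner product, conjugate-linear in the first slot, which is what `np.vdot` computes):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary complex example vectors
v = rng.normal(size=3) + 1j * rng.normal(size=3)
w = rng.normal(size=3) + 1j * rng.normal(size=3)

def norm_sq(u):
    # |u|^2 = <u|u>, which is always real
    return np.vdot(u, u).real

# Right-hand side of the complex polarization identity (equation 4)
rhs = (norm_sq(v + w) - norm_sq(v - w)
       - 1j * (norm_sq(v + 1j * w) - norm_sq(v - 1j * w)))

# Compare with 4<v|w> computed directly (np.vdot conjugates its first argument)
print(np.allclose(rhs, 4 * np.vdot(v, w)))  # True
```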


3 Decomposition of a Vector

When we have an inner product space, there are some vectors whose inner
products are especially simple: parallel and orthogonal vectors. Intuitively,
parallel vectors point in the same direction; formally, they are two vectors
for which the length of the sum is the sum of the lengths. Orthogonal
(or perpendicular) vectors are defined as having an inner product of zero. For
orthogonal vectors, the Pythagorean Theorem tells us that the square of the
length of the sum is the sum of the squares of the lengths.

Looking at any two vectors |y〉 and |x〉, it would be nice if we could under-
stand them simply in terms of parallel and orthogonal vectors, even though |y〉
and |x〉 are in general neither of these. What we’ll try to do then is take |y〉
apart into two pieces, one of which is parallel to |x〉, and the other orthogonal
to it.

Specifically, we want to find some scalar c such that

|y〉 = c|x〉 + |w〉 (5)

with

〈x | w〉 = 0 (6)

Dotting both sides of (5) with 〈x | gives

〈x | y〉 = c〈x | x〉 + 〈x | w〉

〈x | y〉 = c〈x | x〉 + 0

c = 〈x | y〉 / 〈x | x〉 (7)

Putting this result for c back into (5) yields

|y〉 = (〈x | y〉 / 〈x | x〉) |x〉 + |w〉. (8)

Geometrically, we’re creating a right triangle with | y〉 as the hypotenuse.
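The decomposition is easy to check numerically (a sketch with NumPy; the vectors x and y are arbitrary example choices):

```python
import numpy as np

# Arbitrary example vectors
x = np.array([2.0, 0.0, 1.0])
y = np.array([1.0, 3.0, 2.0])

# Parallel coefficient: c = <x|y> / <x|x>   (equation 7)
c = np.dot(x, y) / np.dot(x, x)

# Orthogonal remainder, so that y = c x + w   (equation 5)
w = y - c * x

print(np.isclose(np.dot(x, w), 0.0))  # True: <x|w> = 0
print(np.allclose(c * x + w, y))      # True: decomposition reproduces y
```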

4 Proof of the Cauchy-Schwarz Inequality

We suspect that

|〈x | y〉| ≤ |x||y| (9)

That's the Cauchy-Schwarz Inequality. We want to prove it from the axioms
of the inner product, which are

〈x | x〉 ≥ 0, and 〈x | x〉 = 0 ⇔ |x〉 = 0

〈x | y〉 = 〈y | x〉∗

〈x | αy + βz〉 = α〈x | y〉 + β〈x | z〉 (10)


Written out entirely in inner products, the Cauchy-Schwarz inequality is

|〈x | y〉| ≤ √(〈x | x〉〈y | y〉). (11)

Because both sides are nonnegative, the statement is equivalent to what
we get by squaring both sides:

〈x | y〉² ≤ 〈x | x〉〈y | y〉. (12)

We broke |y〉 down into parallel and perpendicular parts, so let's substitute
(8) into (12) and see what we've got. Since 〈x | w〉 = 0, the cross terms
vanish, and the right side becomes

〈x | y〉² ≤ 〈x | x〉 ( 〈w | w〉 + (〈x | y〉 / 〈x | x〉)² 〈x | x〉 ) (13)

Simplifying a little shows that the Cauchy-Schwarz Inequality is equivalent
to

〈x | y〉² ≤ 〈x | y〉² + 〈x | x〉〈w | w〉 (14)

which is obviously true, since 〈x | x〉〈w | w〉 ≥ 0. Further, equality obtains
only when |x〉 = 0 or |w〉 = 0, meaning |x〉 and |y〉 are linearly dependent.
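A quick numerical check of the inequality and its equality case (a NumPy sketch; the vectors are arbitrary example choices):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=4)
y = rng.normal(size=4)

# Cauchy-Schwarz: |<x|y>| <= |x||y| for any pair of vectors
lhs = abs(np.dot(x, y))
rhs = np.linalg.norm(x) * np.linalg.norm(y)
print(lhs <= rhs)  # True

# Equality holds when y is a multiple of x (i.e. w = 0 in equation 5)
y_parallel = 2.5 * x
print(np.isclose(abs(np.dot(x, y_parallel)),
                 np.linalg.norm(x) * np.linalg.norm(y_parallel)))  # True
```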

5 Alternate Proof

Here’s another proof that is a bit more clever. We’ll work in a real vector space
for this one.

The Cauchy-Schwarz Inequality is trivial if |x〉 and |y〉 are dependent. So
assume they are independent, meaning

∀λ ∈ R, λ|y〉 − |x〉 ≠ 0. (15)

Then the squared norm of this vector must be strictly positive:

∀λ ∈ R, λ²〈y | y〉 − 2λ〈x | y〉 + 〈x | x〉 > 0 (16)

The left hand side is quadratic in λ, but the quadratic has no roots because
it’s always greater than zero. Therefore the discriminant is negative. This
means

4〈x | y〉² − 4〈y | y〉〈x | x〉 < 0

〈x | y〉² < 〈x | x〉〈y | y〉 (17)

The square root of that is the Cauchy-Schwarz Inequality.
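The discriminant argument can also be checked numerically (a NumPy sketch; the independent vectors x and y are arbitrary choices):

```python
import numpy as np

# Arbitrary independent example vectors
x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

# q(lambda) = |lambda y - x|^2 = lambda^2 <y|y> - 2 lambda <x|y> + <x|x>
a = np.dot(y, y)
b = -2 * np.dot(x, y)
c = np.dot(x, x)

# q has no real roots (it's always positive), so its
# discriminant b^2 - 4ac must be negative.
disc = b**2 - 4 * a * c
print(disc < 0)  # True, i.e. <x|y>^2 < <x|x><y|y>
```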
