The P(φ)₂ Euclidean (Quantum) Field Theory
By Barry Simon PRINCETON UNIVERSITY PRESS
Copyright © 1974 Princeton University Press
All rights reserved.
ISBN: 978-0-691-08144-1
CHAPTER 1
GAUSSIAN RANDOM PROCESSES, Q-SPACE AND FOCK SPACE
If one considers the free relativistic quantum field, one is led quite naturally to an attempt at simultaneously diagonalizing the time zero fields which form a family of (unbounded) commuting observables. While this idea is implicit in some of the earliest work in quantum field theory (see the historical note in [71]), it was not until the 1950's that Friedrichs [50] and Segal [158] faced up to the measure theoretic complexities involved in a careful treatment of this diagonalization. In a series of papers [159-162], Segal developed the theory from this point of view and especially emphasized the connection with a variety of ideas from probability theory.
This simultaneous diagonalization of the time zero fields realizing them as functions on a measure space, generally called Q-space, played an important role from the earliest days of constructive quantum field theory (see e.g., [131, 62, 147, 168]). In that work, there was, as an underlying philosophy, the notion of Fock space as fundamental and Q-space as "derived." The opposite view is possible, and in line with our decision to emphasize probabilistic ideas, we take this point of departure in §§1.1, 2. In §1.3, we link this Q-space approach with Fock space. In the remainder of the chapter, we continue the development of the theory of Q-space.
In §1.1, we include a number of probabilistic terms – we emphasize that the term "full" set of random variables is not standard. Of course, the term "Wick product of random variables" is taken from field theory although it must appear in the probabilistic literature under another name. We also note that what we call a "random process" is often called a "random field" – "process" being used for a special case. And we always use the word "random" while in the probability literature both "random" and "stochastic" are used.
§1.1. Gaussian Random Variables
A probability measure space is a triple (M, Σ, μ) where M is a set of "points," Σ is a σ-field of subsets of M and μ is a (positive) measure on Σ normalized so that μ(M) = 1. Σ becomes a ring if the "sum" of two sets A, B ∈ Σ is their symmetric difference A Δ B = (A\B) ∪ (B\A) and if their "product" is A ∩ B. The family I_μ = {A ∈ Σ | μ(A) = 0} is an ideal and μ "lifts" to the quotient ring Σ/I_μ (called the "measurable sets modulo 0"). We call two probability measure spaces (M, Σ, μ) and (M′, Σ′, μ′) isomorphic if there is an isomorphism of the rings Σ/I_μ and Σ′/I_{μ′} taking the measure μ into μ′. Notice that an isomorphism specifies nothing about a correspondence between points in M and M′. This places the following emphasis on the theory: points in M play no basic role; rather, the basic objects are the elements of Σ/I_μ, i.e., families of subsets modulo 0, often called events.
A random variable is a measurable function, f, from M to R (with the measurable sets in R being the Borel sets); ∫ f dμ is called the expectation of f. Given a random variable, the measure induced on R by

    μ_f(Ω) = μ(f⁻¹[Ω])    (1.1)
is called the probability distribution for f and its (unnormalized) Fourier transform
    c_f(t) = ∫ e^{itx} dμ_f(x) = ∫ e^{itf} dμ    (1.2)
is called the characteristic function of f. The characteristic function of f is a wondrous thing. In the first place, it is possible to directly characterize which functions arise as characteristic functions:
Theorem 1.1 (Bochner's Theorem). A necessary and sufficient condition for a function c (from R to C) to be the characteristic function of a random variable is that c obey:
a) c(0) = 1
b) t ↦ c(t) is continuous
c) for any t₁, …, tₙ in R and z₁, …, zₙ in C,

    Σ_{i,j=1}^{n} z̄ᵢ zⱼ c(tᵢ − tⱼ) ≥ 0.    (1.3)
Remarks:
1) The continuity condition can be weakened; it follows from measurability and (1.3). We state it to keep the analogy with Minlos' theorem (below) as direct as possible.
2) A proof of Bochner's theorem can be found in many places, e.g., [145B].
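In modern numerical terms, the conditions of Theorem 1.1 are easy to spot-check for the Gaussian characteristic function (1.6) below. The following Python sketch (the variable names and sample points are our own, purely illustrative choices) verifies conditions (a) and (c) at randomly chosen points; condition (c) says the matrix c(tᵢ − tⱼ) must be positive semidefinite.

```python
import numpy as np

# Spot-check of Bochner's conditions for c(t) = exp(-a t^2 / 2), a = 1.5.
# Condition (c): the matrix M[i,j] = c(t_i - t_j) is positive semidefinite
# for every finite choice of points t_1, ..., t_n.
a = 1.5
c = lambda t: np.exp(-0.5 * a * t**2)

rng = np.random.default_rng(0)
t = rng.uniform(-5, 5, size=8)            # arbitrary sample points
M = c(t[:, None] - t[None, :])            # M[i,j] = c(t_i - t_j)

assert abs(c(0.0) - 1.0) < 1e-12          # condition (a)
# Here c is real and even, so M is real symmetric; eigvalsh applies.
assert np.linalg.eigvalsh(M).min() > -1e-10   # condition (c)
```

Continuity (condition (b)) is immediate, since c is a composition of continuous functions.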
The second nice feature of the characteristic function is its connection to the moments of f. The nth moment of f is given by
    ∫ fⁿ dμ = ∫ xⁿ dμ_f(x)    (1.4)
wherever the integral is absolutely convergent. Clearly f has moments of all orders if and only if c_f is C^∞ and in that case:

    ∫ fⁿ dμ = (−i)ⁿ (dⁿc_f/dtⁿ)(0).    (1.5)
Suppose two probability measure spaces (M, Σ, μ) and (M′, Σ′, μ′) are isomorphic under a map T: Σ/I_μ → Σ′/I_{μ′}. We say random variables f on M and f′ on M′ correspond under the isomorphism if and only if

    T(f⁻¹[Ω]) = (f′)⁻¹[Ω]

for all Borel sets Ω ⊂ R.
Remark:
To be more precise and to be consistent with our emphasis on events, we should define a random variable to be a map F from the Borel sets, B, of R to Σ/I_μ so that

    F(R) = M,  F(Ω ∩ Ω′) = F(Ω) ∩ F(Ω′),  F(⋃ₙ Ωₙ) = ⋃ₙ F(Ωₙ)

for countable families {Ωₙ} (all equalities modulo 0).
Given a measurable function f from M to R, F [equivalent to] f-1 defines a random variable in this sense. In some cases (e.g., model 2 of the next section) every F arises in this way, and in an intuitive sense, every such F "almost" comes from an f. In any event, we will slough over this point, and act as if every F does come from an f. In this way our proofs will be more natural looking although to be precise we should always translate into the "F-language."
One of the two basic objects of this section is the following:
Definition. A Gaussian random variable of mean 0 is a random variable, f, whose characteristic function has the form
    c_f(t) = exp(−½at²);  a ≥ 0    (1.6)
We henceforth drop the "of mean 0." The normalization constant ½ in (1.6) is chosen so that

    a = ∫ f² dμ    (1.7)
on account of (1.9). a is often called the variance of f. Notice that the probability distribution of a Gaussian random variable (henceforth g.r.v.) is also a Gaussian:
    dμ_f(x) = (2πa)^{−1/2} exp(−x²/2a) dx    (a > 0).    (1.8)
By expanding (1.6) in a power series and using (1.5), one immediately obtains all the moments of a g.r.v. of variance a, namely
    ∫ f^{2n+1} dμ = 0,    n = 0, 1, …    (1.9a)

    ∫ f^{2n} dμ = (2n)!/(2ⁿn!) aⁿ    (1.9b)
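The moment formulas (1.9) can be checked numerically. The Python sketch below (an illustration of ours, not part of the text) evaluates the moments of a mean-zero Gaussian of variance a by Gauss–Hermite quadrature and compares them to (1.9a) and (1.9b).

```python
import math
import numpy as np

# Check the Gaussian moment formulas (1.9) by Gauss-Hermite quadrature.
# For a mean-zero Gaussian of variance a, (1.9b) claims
#     E[f^(2n)] = (2n)! / (2^n n!) * a^n,  and odd moments vanish (1.9a).
a = 2.0
nodes, weights = np.polynomial.hermite.hermgauss(40)   # weight e^{-y^2}

def moment(k):
    # E[f^k] for f ~ N(0, a); substitute x = sqrt(2a) y in the integral.
    return (weights * (math.sqrt(2 * a) * nodes) ** k).sum() / math.sqrt(math.pi)

for n in range(5):
    assert abs(moment(2 * n + 1)) < 1e-9                          # (1.9a)
    exact = math.factorial(2 * n) / (2**n * math.factorial(n)) * a**n
    assert abs(moment(2 * n) - exact) < 1e-9 * max(1.0, exact)    # (1.9b)
```

With 40 quadrature nodes the rule is exact for polynomials of degree up to 79, so these checks hold to machine precision.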
The remainder of this section involves a discussion of a variety of notions, first for general random variables and then, in detail, for g.r.v.
* * *
If f₁, …, fₙ are random variables, then we define their joint probability distribution μ_{(f₁,…,fₙ)} on Rⁿ by

    μ_{(f₁,…,fₙ)}(Ω) = μ(F⁻¹[Ω])

where F: M → Rⁿ is given by

    F(m) = (f₁(m), …, fₙ(m))

and their joint characteristic function by

    c_{(f₁,…,fₙ)}(t₁, …, tₙ) = ∫ exp(i Σ_{j=1}^{n} tⱼfⱼ) dμ.
Definition. A finite family f₁, …, fₙ of random variables is called jointly Gaussian if and only if their joint characteristic function has the form
    c_{(f₁,…,fₙ)}(t₁, …, tₙ) = exp(−½ Σ_{i,j} aᵢⱼtᵢtⱼ)    (1.10)

where {aᵢⱼ} is a (symmetric) real positive definite matrix.
One immediately has analogs of (1.7) and (1.8)
    aᵢⱼ = ∫ fᵢ fⱼ dμ    (1.11)

which is called the covariance matrix for {fᵢ}. If {aᵢⱼ} is an invertible matrix and {bᵢⱼ} is its inverse, then

    dμ_{(f₁,…,fₙ)}(x) = (2π)^{−n/2} (det a)^{−1/2} exp(−½ Σ_{i,j} bᵢⱼxᵢxⱼ) dⁿx.    (1.12)
Thus it is the inverse covariance matrix that enters in dμ.
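This relationship between the covariance aᵢⱼ and the inverse covariance bᵢⱼ can be verified numerically in two dimensions: integrating xᵢxⱼ against the density built from b = a⁻¹ must return a itself. The Python sketch below (grid sizes and the sample matrix are our own illustrative choices) does this on a truncated grid.

```python
import numpy as np

# Numerical check of (1.11)-(1.12) for n = 2: integrating x_i x_j against
# the density built from the *inverse* covariance b = a^{-1} recovers the
# covariance matrix a itself.
a = np.array([[2.0, 0.6],
              [0.6, 1.0]])          # covariance matrix (positive definite)
b = np.linalg.inv(a)                # the inverse covariance enters the density

x = np.linspace(-8, 8, 401)
X, Y = np.meshgrid(x, x, indexing="ij")
quad = b[0, 0] * X**2 + 2 * b[0, 1] * X * Y + b[1, 1] * Y**2
dens = np.exp(-0.5 * quad) / (2 * np.pi * np.sqrt(np.linalg.det(a)))

dA = (x[1] - x[0]) ** 2
assert abs((dens * dA).sum() - 1.0) < 1e-6               # normalization
assert abs((X * Y * dens * dA).sum() - a[0, 1]) < 1e-4   # recovers a_12
assert abs((X * X * dens * dA).sum() - a[0, 0]) < 1e-4   # recovers a_11
```

The truncation to [−8, 8] is harmless here since the tails beyond ≈5.6 standard deviations carry negligible mass.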
Remarks:
1. It is not hard to show that f₁, …, fₙ are jointly Gaussian if and only if Σ_{i=1}^{n} aᵢfᵢ is a g.r.v. for any (a₁, …, aₙ) ∈ Rⁿ.
2. While we will not use Gaussian random variables of mean different from zero, they may well play a role in the future development of the theory, so we mention their definition: f is called a Gaussian random variable of mean m and variance a if its characteristic function has the form
    c_f(t) = exp(−½at² + imt).
a is, of course, no longer given by (1.7) but by
    a = ∫ f² dμ − (∫ f dμ)².
And similarly, f₁, …, fₙ are jointly Gaussian with general means if

    c_{(f₁,…,fₙ)}(t₁, …, tₙ) = exp(−½ Σ_{i,j} aᵢⱼtᵢtⱼ + i Σ_{j} mⱼtⱼ).
3. (1.9) extends directly (by power series expansion of (1.10) and the analog of (1.5)) to give:
Proposition 1.2 (Wick's Theorem). If f₁, …, f₂ₙ are (not necessarily distinct) jointly Gaussian random variables, then

    ∫ f₁ ⋯ f₂ₙ dμ = Σ_{pairings} a_{i₁j₁} ⋯ a_{iₙjₙ}    (1.13)

where aᵢⱼ = ∫ fᵢ fⱼ dμ and Σ_{pairings} means the sum over all (2n)!/2ⁿn! ways of writing 1, …, 2n as n distinct (unordered) pairs (i₁, j₁), …, (iₙ, jₙ).
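The pairing sum in (1.13) can be enumerated directly. The Python sketch below (a check of ours, with illustrative coefficients) generates all pairings of {0, …, 2n−1}, verifies the count (2n)!/2ⁿn!, and tests (1.13) in a case where the answer is known in closed form: taking fᵢ = cᵢ·g with g a standard g.r.v. gives aᵢⱼ = cᵢcⱼ and, by (1.9b), ∫ f₁⋯f₂ₙ dμ = (Π cᵢ)·(2n)!/2ⁿn!.

```python
import math

# Enumerate all pairings of {0, ..., 2n-1} and check Wick's theorem (1.13)
# for the jointly Gaussian family f_i = c_i * g, g a standard g.r.v.

def pairings(items):
    # Recursively generate all ways of splitting `items` into unordered pairs.
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for k in range(len(rest)):
        pair = (first, rest[k])
        for tail in pairings(rest[:k] + rest[k + 1:]):
            yield [pair] + tail

n = 3
c = [0.5, 1.0, 2.0, 1.5, 0.7, 1.2]           # coefficients c_1, ..., c_6
cov = lambda i, j: c[i] * c[j]               # a_ij = <f_i f_j>

all_p = list(pairings(list(range(2 * n))))
assert len(all_p) == math.factorial(2 * n) // (2**n * math.factorial(n))  # 15

wick_sum = sum(math.prod(cov(i, j) for (i, j) in p) for p in all_p)
expected = math.prod(c) * math.factorial(2 * n) / (2**n * math.factorial(n))
assert abs(wick_sum - expected) < 1e-12
```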
* * *
Next we turn to the notion of Wick powers and Wick products of random variables. We will deal with a fixed random variable, f, and let ⟨ · ⟩ denote the integral with respect to μ (expectation). Given a formal power series in f, i.e., a formal series where a) we don't worry about convergence, b) we don't identify two series which are identical by virtue of substituting in f (e.g., f and f² are distinct as formal power series even if f = 1), we define
    ∂/∂f (Σ_{n=0}^{∞} aₙfⁿ) = Σ_{n=1}^{∞} n aₙ f^{n−1}.
Definition. Let f be a random variable with finite moments. Then :fⁿ:, n = 0, 1, … is defined recursively by:

    :f⁰: = 1    (1.14a)

    ∂/∂f :fⁿ: = n :f^{n−1}:,    n = 1, 2, …    (1.14b)

    ⟨:fⁿ:⟩ = 0,    n = 1, 2, …    (1.14c)
:fⁿ: is called the nth Wick power of f. Notice that Wick powers depend on both f and the underlying measure. Thus, e.g.,

    :f: = f − ⟨f⟩

    :f²: = f² − 2⟨f⟩f − ⟨f²⟩ + 2⟨f⟩².
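The recursion (1.14) is mechanical enough to carry out symbolically. The Python/SymPy sketch below (our own illustration; the symbols m₁, m₂, … stand for the moments ⟨fᵏ⟩) integrates (1.14b) and fixes the constant by (1.14c), reproducing the two examples above for an arbitrary random variable with finite moments.

```python
import sympy as sp

# A symbolic sketch of the recursion (1.14). The moments <f^k> are kept as
# free symbols m[k] (with m[0] = 1), so the result holds for an arbitrary
# random variable, not only a Gaussian one.
f = sp.symbols("f")
m = [sp.Integer(1)] + [sp.symbols(f"m{k}") for k in range(1, 7)]  # m[k] = <f^k>

def expectation(poly):
    # <.> : replace each power f^k of the polynomial by the moment m[k].
    p = sp.Poly(poly, f)
    return sum(coef * m[k] for (k,), coef in p.terms())

def wick(n):
    # (1.14): d/df :f^n: = n :f^{n-1}: and <:f^n:> = 0.
    if n == 0:
        return sp.Integer(1)                       # (1.14a)
    p = sp.integrate(n * wick(n - 1), f)           # undo (1.14b)
    return sp.expand(p - expectation(p))           # enforce (1.14c)

# Reproduce the two examples in the text:
assert sp.expand(wick(1) - (f - m[1])) == 0
assert sp.expand(wick(2) - (f**2 - 2*m[1]*f - m[2] + 2*m[1]**2)) == 0
```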
Properties of Wick powers are found most easily using the formal generating function
    :exp(af): = Σ_{n=0}^{∞} aⁿ :fⁿ:/n!.    (1.15)
Clearly, by (1.14b)
    ∂/∂f :exp(af): = a :exp(af):
and by (1.14a, c)
    ⟨:exp(af):⟩ = 1.
Thus
    :exp(af): = exp(af)/⟨exp(af)⟩.    (1.16)
(1.16) holds in the sense of formal power series in a.
If f is a Gaussian random variable, (1.16) is especially useful because the formal power series converge (for example in L¹(M, dμ)) and

    ⟨exp(af)⟩ = exp(½a²⟨f²⟩).    (1.17)
(1.17) can be obtained by direct computation from (1.8), or by noting it holds if a = it (t real) on account of (1.6) and then analytically continuing, or by using (1.9). Thus, for a g.r.v. of variance ⟨f²⟩:
    :exp(af): = exp(af − ½a²⟨f²⟩).    (1.18a)
By multiplying the series for exp(af) and exp(−½a²⟨f²⟩) together, we find that

    :fⁿ: = Σ_{m=0}^{[n/2]} n!/(m!(n−2m)!) (−⟨f²⟩/2)ᵐ f^{n−2m}.    (1.18b)
Conversely,
    exp(af) = :exp(af): exp(½a²⟨f²⟩)    (1.19a)
so that
    fⁿ = Σ_{m=0}^{[n/2]} n!/(m!(n−2m)!) (⟨f²⟩/2)ᵐ :f^{n−2m}:.    (1.19b)
Remarks:
1. We emphasize that (1.17), (1.18) and (1.19) are for the special case of g.r.v.
2. If ⟨f²⟩ = 1, :fⁿ: = Hₙ(f) where Hₙ is the nth Hermite polynomial. This follows from (1.18a) and the fact that exp(ax − ½a²) is the generating function for the Hermite polynomials.
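The Hermite polynomials here are the "probabilists'" ones, Heₙ, satisfying He₀ = 1, He₁ = x, Heₙ₊₁ = x Heₙ − n Heₙ₋₁. The Python/SymPy sketch below (an illustration of ours) checks Remark 2 by comparing the expansion (1.18b) with ⟨f²⟩ = 1 against that recursion.

```python
import math
import sympy as sp

# Check Remark 2: for a g.r.v. with <f^2> = 1, the Wick power :f^n: given by
# the expansion (1.18b) equals the probabilists' Hermite polynomial He_n,
# generated by He_0 = 1, He_1 = x, He_{n+1} = x He_n - n He_{n-1}.
x = sp.symbols("x")

def wick_power(n, var=1):
    # (1.18b): :f^n: = sum_m n!/(m!(n-2m)!) (-var/2)^m f^{n-2m}
    return sum(
        sp.Rational(math.factorial(n), math.factorial(m) * math.factorial(n - 2*m))
        * sp.Rational(-1, 2)**m * var**m * x**(n - 2*m)
        for m in range(n // 2 + 1)
    )

He = [sp.Integer(1), x]
for n in range(1, 8):
    He.append(sp.expand(x * He[n] - n * He[n - 1]))

for n in range(8):
    assert sp.expand(wick_power(n) - He[n]) == 0
```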
3. If (M, Σ) supports two measures μ and ν so that f is a g.r.v. w.r.t. both μ and ν, we can form :fⁿ:_μ and :fⁿ:_ν and ask for transformation laws from one to the other. From (1.18a) we find

    :exp(af):_μ = :exp(af):_ν exp(½a²(⟨f²⟩_ν − ⟨f²⟩_μ))    (1.20a)

so that

    :fⁿ:_μ = Σ_{m=0}^{[n/2]} n!/(m!(n−2m)!) (½(⟨f²⟩_ν − ⟨f²⟩_μ))ᵐ :f^{n−2m}:_ν.    (1.20b)
4. (1.9), (1.13), (1.18) and (1.20) all generally go under the name of "Wick's theorem."
One can use (1.18a) to compute expectations of products of Wick powers. We will carry out the computation for the product of two Wick powers, but a similar method works for more than two factors. In particular, in Section 1.5, we will quote the result for the product of four Wick powers without proof.
Theorem 1.3. Let f and g be jointly Gaussian g.r.v. Then

    ⟨:fⁿ: :gᵐ:⟩ = δₙₘ n! ⟨fg⟩ⁿ.    (1.21)
Proof. Since af + bg is a g.r.v., (1.17) gives

    ⟨exp(af + bg)⟩ = exp(½a²⟨f²⟩ + ab⟨fg⟩ + ½b²⟨g²⟩).

Thus, by (1.18a),

    ⟨:exp(af): :exp(bg):⟩ = exp(ab⟨fg⟩).
(1.21) follows by expanding the exponentials.
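Theorem 1.3 lends itself to a Monte Carlo check. The Python sketch below (our own construction: f and g are unit-variance g.r.v. with correlation ρ, built from two independent standard normals) uses the fact from Remark 2 above that Wick powers of unit-variance g.r.v. are probabilists' Hermite polynomials, and compares empirical averages against δₙₘ n!⟨fg⟩ⁿ.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as H

# Monte Carlo check of Theorem 1.3:  <:f^n: :g^m:> = delta_{nm} n! <fg>^n.
# f, g are standard (unit-variance) g.r.v. with correlation rho, so their
# Wick powers are the probabilists' Hermite polynomials He_n (Remark 2).
rho, N = 0.6, 1_000_000
rng = np.random.default_rng(7)
z1, z2 = rng.standard_normal((2, N))
f = z1
g = rho * z1 + math.sqrt(1 - rho**2) * z2      # <fg> = rho, <g^2> = 1

def wick(n, samples):
    # :x^n: = He_n(x) for a unit-variance g.r.v.
    return H.hermeval(samples, [0] * n + [1])

for n in range(4):
    for m in range(4):
        emp = (wick(n, f) * wick(m, g)).mean()
        exact = math.factorial(n) * rho**n if n == m else 0.0
        assert abs(emp - exact) < 0.3          # loose Monte Carlo tolerance
```

The tolerance is deliberately loose; with 10⁶ samples the statistical error in these low-order averages is of order 10⁻².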
Remark:
(1.21) is special to g.r.v. In fact, if f is a random variable with ⟨f⟩ = 0 and ⟨:fⁿ: :fᵐ:⟩ = 0 if n ≠ m, then f is a g.r.v. To see this, note that ⟨:fⁿ: :f:⟩ = 0 allows us to compute ⟨fⁿ⁺¹⟩ in terms of ⟨fᵐ⟩, m ≤ n, and thus :fⁿ⁺¹: inductively.
* * *
Now consider several random variables f₁, …, fₖ. The Wick product :f₁^{n₁} ⋯ fₖ^{nₖ}: is defined recursively in n = n₁ + ⋯ + nₖ by the analogs of (1.14):

    :f₁⁰ ⋯ fₖ⁰: = 1,  ∂/∂fᵢ :f₁^{n₁} ⋯ fₖ^{nₖ}: = nᵢ :f₁^{n₁} ⋯ fᵢ^{nᵢ−1} ⋯ fₖ^{nₖ}:,  ⟨:f₁^{n₁} ⋯ fₖ^{nₖ}:⟩ = 0 (n ≥ 1).
With this definition one has a binomial theorem
    :(f + g)ⁿ: = Σ_{m=0}^{n} n!/(m!(n−m)!) :fᵐ g^{n−m}:.    (1.22)
There is also a multinomial theorem.
Warning! Not all algebraic relations are preserved by : :. For example, ff⁻¹ = f⁰, but if g = f⁻¹ and ⟨f⟩ = ⟨g⟩ = 0, then 0 = :fg: ≠ :f⁰: = 1.
Corollary 1.4
(a) If f₁, …, fₙ and g₁, …, gₘ are g.r.v. and n ≠ m, then

    ⟨:f₁ ⋯ fₙ: :g₁ ⋯ gₘ:⟩ = 0    (1.23a)

(b) If f₁, …, fₖ are g.r.v. with ⟨fᵢfⱼ⟩ = δᵢⱼ, then

    ⟨:f₁^{n₁} ⋯ fₖ^{nₖ}: :f₁^{m₁} ⋯ fₖ^{mₖ}:⟩ = δ_{n₁m₁} ⋯ δ_{nₖmₖ} n₁! ⋯ nₖ!.    (1.23b)
Proof. Follows from the multinomial theorem and Theorem 1.3.
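For k = 2 the orthogonality relation (1.23b) can be verified exactly by one-dimensional quadrature: since ⟨fᵢfⱼ⟩ = δᵢⱼ the fᵢ may be taken independent, the Wick monomial factorizes into a product of Hermite polynomials (Remark 2 above), and the inner product splits into one-dimensional Gaussian integrals. The Python sketch below (our illustration) does this by Gauss–Hermite quadrature.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as H

# Exact check of Corollary 1.4(b) for k = 2 orthonormal g.r.v.: the Wick
# monomial factorizes as :f1^n1 f2^n2: = He_n1(f1) He_n2(f2) for independent
# unit-variance f1, f2, and each Gaussian integral is done by Gauss-Hermite
# quadrature (weight e^{-y^2}; substitute x = sqrt(2) y).
nodes, weights = np.polynomial.hermite.hermgauss(30)
x = math.sqrt(2) * nodes
w = weights / math.sqrt(math.pi)          # now sum(w * p(x)) = E[p(f)]

def inner(n, m):
    # <He_n(f) He_m(f)> for a single standard g.r.v. f
    hn = H.hermeval(x, [0] * n + [1])
    hm = H.hermeval(x, [0] * m + [1])
    return float((w * hn * hm).sum())

for n1 in range(4):
    for m1 in range(4):
        for n2 in range(4):
            for m2 in range(4):
                val = inner(n1, m1) * inner(n2, m2)
                exact = (math.factorial(n1) * math.factorial(n2)
                         if (n1, n2) == (m1, m2) else 0.0)
                assert abs(val - exact) < 1e-8
```

With 30 nodes the quadrature is exact for polynomials of degree up to 59, so each one-dimensional factor ⟨HeₙHeₘ⟩ = δₙₘ n! is recovered to machine precision.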
* * *
Definition. Let (M, Σ, μ) be a probability measure space. Let V be a (real) vector space. A random process indexed by V is a map φ from V to the random variables on M, so that (almost everywhere):
    φ(αv + βw) = αφ(v) + βφ(w)  for all v, w ∈ V, α, β ∈ R.
Remarks:
1. Often V is a topological vector space and φ is required to be continuous when the random variables are given the topology of convergence in measure or (with restriction on the range of φ) an Lᵖ-topology.
2. In many applications, V is a vector space of functions on Rⁿ such as C₀^∞(Rⁿ) or S(Rⁿ) [145; Chapter V], in which case one thinks of φ as a "random-variable-valued distribution" and writes φ(f) = ∫ φ(x)f(x) dⁿx (formally).