Download Introduction to Probability (2nd Edition by Bertsekas) PDF

Title: Introduction to Probability (2nd Edition by Bertsekas)
Language: English
File Size: 17.1 MB
Total Pages: 539
Table of Contents
Title Page
Preface
Preface to the Second Edition
CONTENTS
1. Sample Space and Probability
	1.1 SETS
	1.2 PROBABILISTIC MODELS
	1.3 CONDITIONAL PROBABILITY
	1.4 TOTAL PROBABILITY THEOREM AND BAYES' RULE
	1.5 INDEPENDENCE
	1.6 COUNTING
	SUMMARY AND DISCUSSION
	PROBLEMS
2. Discrete Random Variables
	2.1 BASIC CONCEPTS
	2.2 PROBABILITY MASS FUNCTIONS
	2.3 FUNCTIONS OF RANDOM VARIABLES
	2.4 EXPECTATION, MEAN, AND VARIANCE
	2.5 JOINT PMFS OF MULTIPLE RANDOM VARIABLES
	2.6 CONDITIONING
	2.7 INDEPENDENCE
	2.8 SUMMARY AND DISCUSSION
	PROBLEMS
3. General Random Variables
	3.1 CONTINUOUS RANDOM VARIABLES AND PDFS
	3.2 CUMULATIVE DISTRIBUTION FUNCTIONS
	3.3 NORMAL RANDOM VARIABLES
	3.4 JOINT PDFS OF MULTIPLE RANDOM VARIABLES
	3.5 CONDITIONING
	3.6 THE CONTINUOUS BAYES' RULE
	3.7 SUMMARY AND DISCUSSION
	PROBLEMS
4. Further Topics on Random Variables
	4.1 DERIVED DISTRIBUTIONS
	4.2 COVARIANCE AND CORRELATION
	4.3 CONDITIONAL EXPECTATION AND VARIANCE REVISITED
	4.4 TRANSFORMS
	4.5 SUM OF A RANDOM NUMBER OF INDEPENDENT RANDOM VARIABLES
	4.6 SUMMARY AND DISCUSSION
	PROBLEMS
5. Limit Theorems
	5.1 MARKOV AND CHEBYSHEV INEQUALITIES
	5.2 THE WEAK LAW OF LARGE NUMBERS
	5.3 CONVERGENCE IN PROBABILITY
	5.4 THE CENTRAL LIMIT THEOREM
	5.5 THE STRONG LAW OF LARGE NUMBERS
	5.6 SUMMARY AND DISCUSSION
	PROBLEMS
6. The Bernoulli and Poisson Processes
	6.1 THE BERNOULLI PROCESS
	6.2 THE POISSON PROCESS
	6.3 SUMMARY AND DISCUSSION
	PROBLEMS
7. Markov Chains
	7.1 DISCRETE-TIME MARKOV CHAINS
	7.2 CLASSIFICATION OF STATES
	7.3 STEADY-STATE BEHAVIOR
	7.4 ABSORPTION PROBABILITIES AND EXPECTED TIME TO ABSORPTION
	7.5 CONTINUOUS-TIME MARKOV CHAINS
	7.6 SUMMARY AND DISCUSSION
	PROBLEMS
8. Bayesian Statistical Inference
	8.1 BAYESIAN INFERENCE AND THE POSTERIOR DISTRIBUTION
	8.2 POINT ESTIMATION, HYPOTHESIS TESTING, AND THE MAP RULE
	8.3 BAYESIAN LEAST MEAN SQUARES ESTIMATION
	8.4 BAYESIAN LINEAR LEAST MEAN SQUARES ESTIMATION
	8.5 SUMMARY AND DISCUSSION
	PROBLEMS
9. Classical Statistical Inference
	9.1 CLASSICAL PARAMETER ESTIMATION
	9.2 LINEAR REGRESSION
	9.3 BINARY HYPOTHESIS TESTING
	9.4 SIGNIFICANCE TESTING
	9.5 SUMMARY AND DISCUSSION
	PROBLEMS
INDEX
                        
Document Text Contents
Page 1

Introduction to Probability

SECOND EDITION

Dimitri P. Bertsekas and John N. Tsitsiklis

Massachusetts Institute of Technology

WWW site for book information and orders
http://www.athenasc.com

Athena Scientific, Belmont, Massachusetts

Page 269

(b) We have

$$M(s) = \sum_{k=a}^{b} \frac{1}{b-a+1}\, e^{sk} = \frac{e^{sa}}{b-a+1} \sum_{k=0}^{b-a} e^{sk} = \frac{e^{sa}\left(e^{s(b-a+1)} - 1\right)}{(b-a+1)\left(e^{s} - 1\right)},$$

while in the continuous uniform case,

$$M(s) = \mathbf{E}\left[e^{sX}\right] = \int_{a}^{b} \frac{e^{sx}}{b-a}\, dx = \frac{e^{sb} - e^{sa}}{s(b-a)}.$$
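
As a quick numerical sanity check (not part of the book's solution), the closed forms above can be compared against the transform computed directly from its definition; the values of a, b, and s below are arbitrary choices for the example.

import math

a, b, s = 2, 7, 0.3

# Discrete uniform over {a, a+1, ..., b}: sum the definition E[e^{sX}] term by term.
direct = sum(math.exp(s * k) for k in range(a, b + 1)) / (b - a + 1)
closed = math.exp(s * a) * (math.exp(s * (b - a + 1)) - 1) / ((b - a + 1) * (math.exp(s) - 1))
print(direct, closed)        # the two values agree

# Continuous uniform over [a, b]: approximate the integral by a midpoint sum.
n = 100_000
dx = (b - a) / n
direct_c = sum(math.exp(s * (a + (i + 0.5) * dx)) for i in range(n)) * dx / (b - a)
closed_c = (math.exp(s * b) - math.exp(s * a)) / (s * (b - a))
print(direct_c, closed_c)    # agree up to the discretization error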

Problem 40.* Suppose that the transform associated with a discrete random variable
X has the form

$$M(s) = \frac{A(e^s)}{B(e^s)},$$

where A(t) and B(t) are polynomials of the generic variable t. Assume that A(t) and
B(t) have no common roots and that the degree of A(t) is smaller than the degree of
B(t). Assume also that B(t) has distinct, real, and nonzero roots that have absolute
value greater than 1. Then it can be seen that M(s) can be written in the form

$$M(s) = \sum_{i=1}^{m} \frac{a_i}{1 - r_i e^s},$$

where 1/r_1, ..., 1/r_m are the roots of B(t) and the a_i are constants that are equal to

$$a_i = \lim_{e^s \to 1/r_i} \left(1 - r_i e^s\right) M(s), \qquad i = 1, \ldots, m.$$

(a) Show that the PMF of X has the form

P(X = k) = {f aiTf ' t= l
0,

if k = 0, 1 , . . . ,

otherwise.

Note: For large k, the PMF of X can be approximated by a-;TI ' where I is the
index corresponding to the largest ITi l (assuming I is unique) .

(b) Extend the result of part (a) to the case where M(s) = e^{bs} A(e^s)/B(e^s) and b is
an integer.

Solution. (a) We have, for all s such that |r_i| e^s < 1 for all i,

$$\frac{a_i}{1 - r_i e^s} = a_i \sum_{k=0}^{\infty} r_i^k e^{sk}.$$

Therefore,

$$M(s) = \sum_{k=0}^{\infty} \left( \sum_{i=1}^{m} a_i r_i^k \right) e^{sk},$$

Page 270

and by inverting this transform, we see that

$$P(X = k) = \sum_{i=1}^{m} a_i r_i^k$$

for k ≥ 0, and P(X = k) = 0 for k < 0. Note that if the coefficients a_i are nonnegative,
this PMF is a mixture of geometric PMFs.

(b) In this case, M(s) corresponds to the translation by b of a random variable whose
transform is A(e^s)/B(e^s) (cf. Example 4.25), so we have

$$P(X = k) = \begin{cases} \sum_{i=1}^{m} a_i r_i^{k-b}, & \text{if } k = b, b+1, \ldots, \\ 0, & \text{otherwise.} \end{cases}$$
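
A small numerical illustration of this result (my own sketch, not from the book): take M(s) = A(e^s)/B(e^s) to be the transform of an equal mixture of two geometric PMFs, recover the constants a_i from the limit formula in the problem statement, and check that the resulting PMF sums to 1. All concrete numbers below are assumptions chosen for the example.

import numpy as np

# Equal mixture of two geometric PMFs with ratios r1 = 0.5 and r2 = 0.25:
#   p(k) = 0.5*(1 - r1)*r1**k + 0.5*(1 - r2)*r2**k,   k = 0, 1, ...
# Its transform is A(t)/B(t) with t = e^s, where
#   B(t) = (1 - r1*t)*(1 - r2*t),
#   A(t) = 0.25*(1 - 0.25*t) + 0.375*(1 - 0.5*t).
r = np.array([0.5, 0.25])
a_true = np.array([0.25, 0.375])          # a_i = 0.5*(1 - r_i)

A = lambda t: 0.25 * (1 - 0.25 * t) + 0.375 * (1 - 0.5 * t)
B_prime = lambda t: -(r[0] + r[1]) + 2 * r[0] * r[1] * t   # derivative of B(t)

# a_i = lim_{e^s -> 1/r_i} (1 - r_i e^s) M(s); for distinct roots this limit
# equals -r_i * A(1/r_i) / B'(1/r_i).
a = np.array([-ri * A(1 / ri) / B_prime(1 / ri) for ri in r])
print(a, a_true)                          # the residues match the mixture weights

k = np.arange(60)
pmf = (a[:, None] * r[:, None] ** k).sum(axis=0)   # P(X = k) = sum_i a_i r_i^k
print(pmf.sum())                          # ~1.0; for large k, pmf[k] ~ a_1 * r_1**k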

SECTION 4.5. Sum of a Random Number of Independent Random
Variables

Problem 41. At a certain time, the number of people that enter an elevator is a
Poisson random variable with parameter λ. The weight of each person is independent
of every other person's weight, and is uniformly distributed between 100 and 200 lbs.
Let X_i be the fraction of 100 by which the ith person exceeds 100 lbs; e.g., if the 7th
person weighs 175 lbs., then X_7 = 0.75. Let Y be the sum of the X_i.

(a) Find the transform associated with Y.

(b) Use the transform to compute the expected value of Y.
(c) Verify your answer to part (b) by using the law of iterated expectations.
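
A Monte Carlo check of part (b), offered as a sketch rather than as the book's solution: each X_i is uniform on (0, 1), so the law of iterated expectations gives E[Y] = E[N] E[X_i] = λ/2; the value λ = 10 and the seed below are arbitrary assumptions.

import numpy as np

rng = np.random.default_rng(0)
lam, trials = 10.0, 200_000

n = rng.poisson(lam, size=trials)                # N: number of people entering
y = np.array([rng.random(k).sum() for k in n])   # Y = X_1 + ... + X_N
print(y.mean(), lam * 0.5)                       # both close to 5.0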

Problem 42. Construct an example to show that the sum of a random number of
independent normal random variables is not normal (even though a fixed sum is).
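
One way to see the phenomenon empirically (this particular counterexample is my illustration, not necessarily the book's intended one): let N take the values 0 and 1 with probability 1/2 each; the sum then has a point mass at 0 and cannot be normal.

import numpy as np

rng = np.random.default_rng(1)
trials = 100_000
n = rng.integers(0, 2, size=trials)                  # N is 0 or 1, each w.p. 1/2
y = np.where(n == 1, rng.normal(size=trials), 0.0)   # Y = X_1 + ... + X_N
print((y == 0.0).mean())                             # ~0.5: an atom at 0, so Y is not normal
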
Problem 43. A motorist goes through 4 lights, each of which is found to be red with
probability 1/2. The waiting times at each light are modeled as independent normal
random variables with mean 1 minute and standard deviation 1/2 minute. Let X be
the total waiting time at the red lights.

(a) Use the total probability theorem to find the PDF and the transform associated
with X, and the probability that X exceeds 4 minutes. Is X normal?

(b) Find the transform associated with X by viewing X as a sum of a random number
of random variables.
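
A simulation sketch for this problem (mine, not the book's solution): X is a mixture over the number of red lights, with an atom of probability (1/2)^4 at zero, so it cannot be normal; the same simulation gives a Monte Carlo estimate of P(X > 4). The seed and trial count are arbitrary.

import numpy as np

rng = np.random.default_rng(2)
trials = 1_000_000
red = rng.random((trials, 4)) < 0.5               # which of the 4 lights are red
waits = rng.normal(1.0, 0.5, size=(trials, 4))    # waiting time if a light is red
x = np.where(red, waits, 0.0).sum(axis=1)         # total waiting time X
print((x == 0.0).mean())    # ~(1/2)**4 = 0.0625: an atom at 0, so X is not normal
print((x > 4.0).mean())     # Monte Carlo estimate of P(X > 4)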

Problem 44. Consider the calculation of the mean and variance of a sum

$$Y = X_1 + \cdots + X_N,$$

where N is itself a sum of integer-valued random variables, i.e.,

Page 539

ATHENA SCIENTIFIC BOOKS
OPTIMIZATION AND COMPUTATION SERIES
