January 30, 2020
Math 318 Assignment 4: Due Wednesday, February 5 at start of class
Reminder: Test 1 will be held in class on Monday February 10, and will be based on the material covered in Assignments 1–4. No assignment will be given on February 5; Assignment 5 will be available on February 12.
 Problems to be handed in:
A particle of mass 1 g has a random velocity X that is uniformly distributed between 2 cm/s and 3 cm/s.
(a) Find the probability density function of T.
(b) Determine the mean and standard deviation of T in two ways:
    (i) using the p.d.f. of T from part (a);
    (ii) using the Law of the Unconscious Statistician and the uniform density of X.
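The definition of T appears truncated in this copy of the handout; a common version of this problem takes T to be the particle's kinetic energy, T = ½X² (mass 1 g, so T is in ergs). Under that assumption, a Monte Carlo sketch of the LOTUS computation:

```python
import random

# Monte Carlo check of E[T] and SD(T) via the Law of the Unconscious
# Statistician.  ASSUMPTION (the handout's definition of T is truncated):
# T = X**2 / 2, the kinetic energy of a 1 g particle, in ergs.
random.seed(0)
n = 200_000
xs = [random.uniform(2.0, 3.0) for _ in range(n)]
t_vals = [x * x / 2.0 for x in xs]           # g(X) = X²/2, no p.d.f. of T needed

mean_t = sum(t_vals) / n
var_t = sum((t - mean_t) ** 2 for t in t_vals) / n

# Exact values by LOTUS: E[T] = (1/2)E[X²] = (1/2)(19/3) = 19/6 ≈ 3.17,
# Var(T) = (1/4)(E[X⁴] − E[X²]²) = 47/90, so SD(T) ≈ 0.72.
print(mean_t, var_t ** 0.5)
```

Comparing the two printed values against the exact answers from part (b) is a useful check on the calculus.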
Let X1, X2, . . . , Xn be independent random variables, each with uniform distribution on (0, 1). Let M be the minimum of these random variables.
(a) Show that the cumulative distribution function of M is FM(x) = 1 − (1 − x)ⁿ for 0 ≤ x ≤ 1.
(b) Find the probability density function of M.
(c) Determine the mean and variance of M.
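As a sanity check on part (c), a short simulation sketch (the comparison values E[M] = 1/(n+1) and Var(M) = n/((n+1)²(n+2)) are the standard closed forms one should obtain):

```python
import random

# Empirical mean and variance of M = min(X1, ..., Xn) for Xi ~ U(0, 1).
random.seed(0)
n, trials = 5, 100_000
mins = [min(random.random() for _ in range(n)) for _ in range(trials)]

mean_m = sum(mins) / trials
var_m = sum((m - mean_m) ** 2 for m in mins) / trials

# For n = 5: E[M] = 1/6 ≈ 0.1667, Var(M) = 5/252 ≈ 0.0198.
print(mean_m, var_m)
```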


(The first is positive and the second is negative, consistent with the fact that X² increases when X increases and X² decreases when X decreases, whereas the opposite is true for Y and Y².)

The time T (in hours past noon) until the arrival of the first taxi has Exponential(5) distribution, and the time B until the first bus is independent with Exponential(3) distribution.
(a) Write down the joint probability density function of T and B.
(b) Find the probability that the first taxi arrives before the first bus.
(c) If you arrive at noon and take the first bus or taxi (whichever arrives first), what is the distribution of your waiting time (give the name and parameter(s))? [Hint: let X = min(T, B), and find P(X > y).]
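A simulation sketch of the last two parts, using the standard competing-exponentials facts P(T < B) = 5/(5+3) = 5/8 and min(T, B) ~ Exponential(8):

```python
import random

# T ~ Exponential(rate 5), B ~ Exponential(rate 3), independent.
random.seed(0)
trials = 100_000
taxi_first = 0
wait_total = 0.0
for _ in range(trials):
    t = random.expovariate(5.0)   # time until first taxi
    b = random.expovariate(3.0)   # time until first bus
    taxi_first += t < b           # event: taxi arrives before bus
    wait_total += min(t, b)       # waiting time for whichever comes first

# Expect ≈ 5/8 = 0.625 and ≈ 1/8 = 0.125 (mean of an Exponential(8)).
print(taxi_first / trials, wait_total / trials)
```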



A point is uniformly chosen in the unit disk x² + y² ≤ 1.
(a) Find the probability that its distance from the origin is less than r, for 0 ≤ r ≤ 1.
(b) Compute its expected distance from the origin.
(c) Let the coordinates of the point be (X, Y). Determine the marginal p.d.f. of X. Are X and Y independent?
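The answers to (a) and (b) can be checked by rejection sampling (a sketch, not required by the problem): keep uniform points from the enclosing square that land inside the disk.

```python
import math
import random

# Rejection sampling of uniform points in the unit disk, then an empirical
# check that P(distance < 0.5) = 0.5² = 0.25 and E[distance] = 2/3.
random.seed(0)
pts = []
while len(pts) < 50_000:
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    if x * x + y * y <= 1.0:      # accept only points inside the disk
        pts.append((x, y))

dists = [math.hypot(x, y) for x, y in pts]
p_half = sum(d < 0.5 for d in dists) / len(dists)
mean_dist = sum(dists) / len(dists)
print(p_half, mean_dist)          # expect ≈ 0.25 and ≈ 0.667
```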





Another way to represent points in this unit circle is via polar coordinates. We might try naively to generate uniform random points in the circle by first generating a random radius R uniformly between 0 and 1, and then by generating a random angle T uniformly between 0 and 2π. Generate 5000 such random pairs (R, T) and create a scatterplot of the resulting points in the standard xy-plane; that is, so that each pair gives the point (x, y) = (R cos T, R sin T). Compare this scatterplot to the one you created in (a). Does this look uniformly random?
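A sketch of this naive sampler (assuming matplotlib is available for the scatterplot); the resulting cloud is visibly denser near the origin, since half the points have R < 1/2 but that inner disk has only a quarter of the area:

```python
import math
import random
import matplotlib
matplotlib.use("Agg")              # non-interactive backend for saving files
import matplotlib.pyplot as plt

# Naive polar sampling: R ~ U(0,1) and T ~ U(0, 2π), independently.
random.seed(0)
xs, ys = [], []
for _ in range(5000):
    r = random.uniform(0.0, 1.0)
    theta = random.uniform(0.0, 2.0 * math.pi)
    xs.append(r * math.cos(theta))
    ys.append(r * math.sin(theta))

plt.scatter(xs, ys, s=2)
plt.gca().set_aspect("equal")
plt.title("Naive polar sampling of the unit disk")
plt.savefig("naive_polar.png")
```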
By definition, the density of uniformly random points in the circle with respect to polar coordinates is the function f(r, θ) for which, if A is a subset of the circle,

∫∫_A f(r, θ) dr dθ = (area of A)/π.

Using your knowledge of multivariate calculus, what must f(r, θ) be?
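A quick numerical sanity check on a candidate answer: any valid density must integrate to 1 over the whole circle. The sketch below tests the candidate f(r, θ) = r/π (the Jacobian factor r divided by the area π) with a midpoint Riemann sum:

```python
import math

# Midpoint-rule check that f(r, θ) = r/π integrates to 1 over
# 0 ≤ r ≤ 1, 0 ≤ θ < 2π.  The integrand is constant in θ, so the
# θ-integral simply contributes a factor of 2π.
nr = 1000
dr = 1.0 / nr
total = 0.0
for i in range(nr):
    r = (i + 0.5) * dr                          # midpoint of the i-th r-slice
    total += (r / math.pi) * dr * (2.0 * math.pi)

print(total)   # ≈ 1.0
```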

In this problem, print and submit code and plots, together with written answers to the questions.
Use Python to simulate a standard normal random variable 10000 times and make a plot of the running average. That is, if Xi is the ith simulated value, make a plot of

(X1 + · · · + Xn)/n

against n for n from 1 to 10000. Does this plot look like it converges to 0?
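One possible implementation (a sketch using the standard library's random.gauss, with matplotlib assumed available for the plot):

```python
import random
import matplotlib
matplotlib.use("Agg")              # non-interactive backend for saving files
import matplotlib.pyplot as plt

# Running average of 10000 standard normal simulations; by the law of
# large numbers it should settle down near 0.
random.seed(0)
n = 10_000
running, total = [], 0.0
for i in range(1, n + 1):
    total += random.gauss(0.0, 1.0)
    running.append(total / i)      # (X1 + ... + Xi) / i

plt.plot(range(1, n + 1), running)
plt.axhline(0.0, linestyle="--")
plt.xlabel("n")
plt.ylabel("running average")
plt.savefig("normal_running_average.png")
```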

Consider the model described in class where a spinner is located one unit of distance away from an infinite wall. The angle Y that the pointer makes with the perpendicular to the wall is uniformly distributed over the interval (−π/2, π/2). The distance X from the foot of the perpendicular to the point on the wall at which the spinner points is thus X = tan Y.
It was argued in class that X has no expectation (despite the fact that it might look as if it should have expected value 0). To illustrate this, use Python to simulate X 10000 times and make a plot of the running average of the simulated values. Does this plot look like it converges to 0? Does it look like it converges at all?
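A sketch of this simulation (again assuming matplotlib): X = tan Y with Y uniform on (−π/2, π/2) is a standard Cauchy variable, so occasional huge values keep knocking the running average around.

```python
import math
import random
import matplotlib
matplotlib.use("Agg")              # non-interactive backend for saving files
import matplotlib.pyplot as plt

# Running average of 10000 simulated values of X = tan(Y),
# Y ~ U(-π/2, π/2); X has no expectation, so no convergence is expected.
random.seed(0)
n = 10_000
running, total = [], 0.0
for i in range(1, n + 1):
    total += math.tan(random.uniform(-math.pi / 2, math.pi / 2))
    running.append(total / i)

plt.plot(range(1, n + 1), running)
plt.xlabel("n")
plt.ylabel("running average")
plt.savefig("cauchy_running_average.png")
```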
Recommended problems: These provide additional practice but are not to be handed in.
A. Chapter 2: 41 [2(n − 1)p(1 − p)], 43 [n/(m + 1)], 50, 51 [r/p], 57.
B. Chapter 5: 1 [e⁻¹, e⁻¹], 2 [6/µ], 3, 4 [assume that all service times are independent; 0, 1/27, 1/4].
 Optional problem for those interested in quantum mechanics and the uncertainty principle:


Consider a quantum mechanical system in state ψ, where ψ is a vector in the Hilbert space consisting of complex-valued square-integrable functions, with inner product (ψ1, ψ2) = ∫ ψ1(x)* ψ2(x) dx. Assume that ψ is normalized, so that (ψ, ψ) = 1. Observables are represented by self-adjoint linear operators on the Hilbert space. The expected value of an observable A is given by (ψ, Aψ) = ∫ ψ(x)* (Aψ)(x) dx. The standard deviation σ(A) of a measurement of the observable A is given by σ(A)² = (ψ, (A − (ψ, Aψ))²ψ). It is a general mathematical theorem that for any self-adjoint linear operators A and B, with commutator [A, B] = AB − BA,

(ψ, A²ψ)(ψ, B²ψ) ≥ (1/4) |(ψ, [A, B]ψ)|².

(You can prove this by adapting the proof of the Cauchy–Schwarz inequality, or see Lemma IV.6.1 of E. Prugovečki, Quantum Mechanics in Hilbert Space, Academic Press, 1971.)
The commutator of the position and momentum operators is [X, P] = ℏi. Using the above, show that σ(X)σ(P) ≥ ℏ/2. This is a statement of the uncertainty principle.
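A sketch of the intended final step, writing ℏ for the constant in the commutator [X, P] = ℏi: apply the general inequality not to X and P themselves but to the centered operators X − (ψ, Xψ) and P − (ψ, Pψ), which are still self-adjoint and have the same commutator, since the subtracted constants commute with everything. (The modulus on the right side is essential: for self-adjoint A and B the quantity (ψ, [A, B]ψ) is purely imaginary.)

```latex
\sigma(X)^2\,\sigma(P)^2
  = \bigl(\psi,\,(X-(\psi,X\psi))^2\psi\bigr)\,
    \bigl(\psi,\,(P-(\psi,P\psi))^2\psi\bigr)
  \;\ge\; \tfrac{1}{4}\,\bigl|\,(\psi,\,[X,P]\,\psi)\,\bigr|^2
  = \tfrac{1}{4}\,\bigl|\,\hbar i\,(\psi,\psi)\,\bigr|^2
  = \frac{\hbar^2}{4},
```

and taking square roots gives σ(X)σ(P) ≥ ℏ/2.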