3.2. Let X1, X2, X3, X4 be a random sample from a distribution with density function

$$f(x_i \mid \beta) = \begin{cases} \dfrac{1}{\beta}\, e^{-x_i/\beta}, & \text{for } x_i > 0 \\ 0, & \text{otherwise} \end{cases}$$

where $\beta > 0$.
3.2.1. Find the maximum likelihood estimator of $\beta$.
[6]
3.2.2. If the data from this random sample are 8.2, 9.1, 10.6 and 4.9, respectively, what is the maximum likelihood estimate of $\beta$?
[3]
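As a numerical sanity check (a sketch, not part of the question paper): for the density $f(x_i \mid \beta) = \frac{1}{\beta} e^{-x_i/\beta}$, setting the derivative of the log-likelihood to zero gives the sample mean as the maximiser, which a coarse grid search confirms.

```python
# Sketch: verify the exponential-mean MLE for 3.2.2 numerically.
# For f(x | beta) = (1/beta) * exp(-x/beta), the log-likelihood
# -n*log(beta) - sum(x)/beta is maximised at the sample mean.
import math

data = [8.2, 9.1, 10.6, 4.9]

def log_likelihood(beta, xs):
    """Log-likelihood of the exponential(mean = beta) model."""
    return sum(-math.log(beta) - x / beta for x in xs)

beta_hat = sum(data) / len(data)  # closed-form MLE: the sample mean

# A coarse grid search over beta in [1.00, 20.00] agrees with the closed form.
grid = [b / 100 for b in range(100, 2001)]
beta_grid = max(grid, key=lambda b: log_likelihood(b, data))

print(beta_hat)   # 8.2
print(beta_grid)  # 8.2 (to grid resolution)
```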
3.3. Observations Y1, ..., Yn are assumed to come from a model with $E(Y_i) = 2 + \theta x_i$, where $\theta$ is an unknown parameter and x1, x2, ..., xn are given constants. What is the least squares estimator of the parameter $\theta$?
[6]
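A sketch of the standard least-squares argument for a model of this form (an outline, not a full solution): minimise the sum of squared deviations and set its derivative with respect to $\theta$ to zero.

```latex
S(\theta) = \sum_{i=1}^{n} \left( y_i - 2 - \theta x_i \right)^2,
\qquad
\frac{dS}{d\theta} = -2 \sum_{i=1}^{n} x_i \left( y_i - 2 - \theta x_i \right) = 0
\;\Longrightarrow\;
\hat{\theta} = \frac{\sum_{i=1}^{n} x_i \,(y_i - 2)}{\sum_{i=1}^{n} x_i^2}.
```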
3.4. Suppose that $E(\hat{\theta}_1) = E(\hat{\theta}_2) = \theta$, $\mathrm{Var}(\hat{\theta}_1) = \sigma_1^2$ and $\mathrm{Var}(\hat{\theta}_2) = \sigma_2^2$. Furthermore, consider $\hat{\theta}_3 = a\hat{\theta}_1 + (1 - a)\hat{\theta}_2$, where $a$ is any constant. Then,
3.4.1. Show that $\hat{\theta}_3$ is an unbiased estimator of $\theta$.
[2]
3.4.2. Find the efficiency of $\hat{\theta}_1$ relative to $\hat{\theta}_3$.
[4]
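A sketch of the usual route (assuming, as is standard in such problems, that $\hat{\theta}_1$ and $\hat{\theta}_2$ are uncorrelated so their variances add):

```latex
E(\hat{\theta}_3) = a\,E(\hat{\theta}_1) + (1-a)\,E(\hat{\theta}_2)
= a\theta + (1-a)\theta = \theta,
\qquad
\mathrm{Var}(\hat{\theta}_3) = a^2 \sigma_1^2 + (1-a)^2 \sigma_2^2.
```

With the convention that the efficiency of $\hat{\theta}_1$ relative to $\hat{\theta}_3$ is the ratio of variances, $\mathrm{eff}(\hat{\theta}_1, \hat{\theta}_3) = \mathrm{Var}(\hat{\theta}_3)/\mathrm{Var}(\hat{\theta}_1) = \left(a^2 \sigma_1^2 + (1-a)^2 \sigma_2^2\right)/\sigma_1^2$.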
Question 4 [9 marks]
4. Let X1, X2, ..., Xn be independent Bernoulli random variables with probability of success $\theta$ and probability mass function

$$f(x_i \mid \theta) = \begin{cases} \theta^{x_i}(1-\theta)^{1-x_i}, & \text{for } x_i = 0, 1 \\ 0, & \text{otherwise} \end{cases}$$
Suppose $\theta$ has a beta prior distribution with parameters $\alpha$ and $\beta$, with probability density function

$$\pi(\theta) = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\,\Gamma(\beta)}\, \theta^{\alpha-1}(1-\theta)^{\beta-1}, \quad \text{for } 0 < \theta < 1;\ \alpha > 0 \text{ and } \beta > 0$$
If the squared error loss function is used, show that the Bayes estimator of $\theta$ is given by

$$\hat{\theta} = \frac{\sum_{i=1}^{n} x_i + \alpha}{\alpha + \beta + n}.$$
Hint: If $Y \sim \mathrm{Beta}(\alpha, \beta)$, then $E(Y) = \dfrac{\alpha}{\alpha + \beta}$.
[9]
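A sketch of the standard conjugate-prior route (an outline, not a full solution): multiply likelihood and prior and read off the posterior kernel.

```latex
\pi(\theta \mid x_1, \dots, x_n)
\;\propto\; \prod_{i=1}^{n} \theta^{x_i}(1-\theta)^{1-x_i}
\cdot \theta^{\alpha-1}(1-\theta)^{\beta-1}
\;=\; \theta^{\sum_{i=1}^{n} x_i + \alpha - 1}\,
(1-\theta)^{\,n - \sum_{i=1}^{n} x_i + \beta - 1},
```

i.e. the posterior is $\mathrm{Beta}\!\left(\sum x_i + \alpha,\; n - \sum x_i + \beta\right)$. Under squared error loss the Bayes estimator is the posterior mean, which by the hint equals $\left(\sum x_i + \alpha\right)/(\alpha + \beta + n)$.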
Question 5 [20 marks]
5. Let X1, X2, ..., Xn be a random sample from the exponential distribution with parameter $\theta$, and the probability density function of Xi is given by

$$f(x_i \mid \theta) = \begin{cases} \theta e^{-\theta x_i}, & \text{for } x_i > 0 \\ 0, & \text{otherwise} \end{cases}$$
5.1. Show that the mean and variance of Xi are $\frac{1}{\theta}$ and $\frac{1}{\theta^2}$, respectively.
[6]
Hint: $M_{X_i}(t) = \dfrac{\theta}{\theta - t}$, for $t < \theta$.
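A sketch of the moment computation from the hint (an outline, not a full solution): differentiate the moment generating function and evaluate at $t = 0$.

```latex
M'_{X_i}(t) = \frac{\theta}{(\theta - t)^2},
\qquad
M''_{X_i}(t) = \frac{2\theta}{(\theta - t)^3},
```

so $E(X_i) = M'_{X_i}(0) = \frac{1}{\theta}$, $E(X_i^2) = M''_{X_i}(0) = \frac{2}{\theta^2}$, and $\mathrm{Var}(X_i) = \frac{2}{\theta^2} - \frac{1}{\theta^2} = \frac{1}{\theta^2}$.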
5.2. Show that $\bar{X}$ is a minimum variance unbiased estimator (MVUE) of $\frac{1}{\theta}$.
[10]
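One common route (a sketch, assuming the Cramér–Rao approach is intended; Lehmann–Scheffé is an alternative): $\bar{X}$ is unbiased for $\tau(\theta) = \frac{1}{\theta}$, and its variance attains the Cramér–Rao lower bound. With $\ln f(x \mid \theta) = \ln\theta - \theta x$, the per-observation Fisher information is $I(\theta) = E\!\left[\left(\frac{1}{\theta} - X\right)^2\right] = \frac{1}{\theta^2}$, and

```latex
\mathrm{Var}(\bar{X}) = \frac{1}{n\theta^2},
\qquad
\frac{[\tau'(\theta)]^2}{n\,I(\theta)}
= \frac{(-1/\theta^2)^2}{n \cdot 1/\theta^2}
= \frac{1}{n\theta^2},
```

so $\bar{X}$ attains the bound and is MVUE of $\frac{1}{\theta}$.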
5.3. Show that $\bar{X}$ is also a consistent estimator of $\frac{1}{\theta}$.
[4]
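As an illustrative simulation (a sketch; the rate $\theta = 2$ and the sample sizes are arbitrary choices, not from the question): the sample mean of exponential draws settles near $\frac{1}{\theta}$ as $n$ grows, which is what consistency via the weak law of large numbers asserts.

```python
# Sketch: illustrate consistency of X-bar for 1/theta by simulation.
# theta = 2.0 is an arbitrary demo choice, so the target is 1/theta = 0.5.
import random

random.seed(42)
theta = 2.0

def sample_mean(n):
    """Mean of n draws from Exponential(rate = theta)."""
    return sum(random.expovariate(theta) for _ in range(n)) / n

for n in (100, 10_000, 1_000_000):
    print(n, sample_mean(n))  # drifts toward 0.5 as n grows
```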