**QUESTION**

3) Let X, Y be two continuous random variables with joint density function

f(x, y | q) = 8xy / q^4 if 0 < y < x < q

0 otherwise

We want to estimate q relative to Mean Square Error (MSE). From lecture we already know that X is a sufficient statistic for q and has density function

f_X(x | q) = 4x^3 / q^4 if 0 < x < q

0 otherwise

Consider the estimator d(x, y) = 3xy. Without computing any MSE, find an estimator that will have smaller MSE than the estimator d(x, y) = 3xy and explain why that should be true.
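The fact this exercise points at (replacing d(x, y) = 3xy by its conditional expectation given the sufficient statistic X, i.e. Rao-Blackwellization, cannot increase the MSE) can be illustrated by simulation. This is only a sketch, not the intended written answer: the closed form E[3XY | X] = 2X^2 used below is an assumption, derived by integrating y against the conditional density f(y | x) = 2y / x^2.

```python
import random

# Monte Carlo sketch of the Rao-Blackwell idea for this problem.
# Sampling uses the inverse-cdf method:
#   X has cdf (x/q)^4 on (0, q)         -> X = q * U ** 0.25
#   Y | X = x has cdf (y/x)^2 on (0, x) -> Y = x * V ** 0.5
# The conditioned estimator E[3XY | X] = 2X^2 is an assumed closed form
# (obtained by integrating y against f(y | x) = 2y / x^2).

def simulate_mse(q, n, seed=0):
    rng = random.Random(seed)
    se_raw = se_rb = 0.0
    for _ in range(n):
        x = q * rng.random() ** 0.25      # X ~ 4x^3 / q^4 on (0, q)
        y = x * rng.random() ** 0.5       # Y | X = x ~ 2y / x^2 on (0, x)
        se_raw += (3 * x * y - q) ** 2    # squared error of d(X, Y) = 3XY
        se_rb += (2 * x * x - q) ** 2     # squared error of E[d | X] = 2X^2
    return se_raw / n, se_rb / n

mse_raw, mse_rb = simulate_mse(q=1.0, n=100_000)
print(mse_raw, mse_rb)  # the conditioned estimator should show the smaller MSE
```

Since conditioning preserves the expectation, both estimators carry the same bias, so any MSE gap the simulation shows is pure variance reduction.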

4) We have one observation, Y, having pdf, for -1 ≤ q ≤ 1, equal to

f(y | q) = 2yq + 1 if -1/2 < y < 1/2

0 otherwise.

**NOTE**: For f(y | q) to be a density function, the only possible values of q must be in the closed interval -1 ≤ q ≤ 1.
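This note can be illustrated numerically (a sketch using a plain midpoint sum, no external libraries): f(y | q) = 2yq + 1 integrates to 1 over (-1/2, 1/2) for any q, but it stays nonnegative on that interval only when -1 ≤ q ≤ 1.

```python
# Check the NOTE: the candidate pdf f(y|q) = 2yq + 1 always integrates to 1
# on (-1/2, 1/2), but is a valid density only when it is also nonnegative
# there, which forces -1 <= q <= 1.

def integral(q, n=100_000):
    # midpoint Riemann sum of f(y|q) over (-1/2, 1/2); exact for linear f
    h = 1.0 / n
    return sum((2 * (-0.5 + (k + 0.5) * h) * q + 1) * h for k in range(n))

def nonnegative(q, n=1001):
    # evaluate f(y|q) on a grid covering [-1/2, 1/2]
    return all(2 * (-0.5 + k / (n - 1)) * q + 1 >= 0 for k in range(n))

assert abs(integral(2.0) - 1.0) < 1e-9   # integrates to 1 even for q = 2 ...
assert not nonnegative(2.0)              # ... but goes negative, so not a pdf
assert nonnegative(1.0)                  # boundary case q = 1 is still valid
```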

Consider the estimator W(y) defined as

W(y) = 1 if y ≥ 0

-1 if y < 0

a. Is W(y) an unbiased estimator of q?

b. Show that W(y) is an MLE of q.

## ANSWER

- a) To determine whether W(y) is an unbiased estimator of q, we calculate its expected value and check whether it equals q.

W(y) takes only two values: 1 when y ≥ 0 and -1 when y < 0. Its expected value is therefore

E[W(y)] = 1 * P(y ≥ 0) + (-1) * P(y < 0)

The pdf is nonzero only on -1/2 < y < 1/2, so we integrate over that interval:

P(y ≥ 0) = ∫[0, 1/2] (2yq + 1) dy = [y^2 q + y] evaluated from 0 to 1/2 = q/4 + 1/2

P(y < 0) = ∫[-1/2, 0] (2yq + 1) dy = [y^2 q + y] evaluated from -1/2 to 0 = 1/2 - q/4

Plugging these values into the expected value equation:

E[W(y)] = 1 * (q/4 + 1/2) + (-1) * (1/2 - q/4)

= q/4 + 1/2 - 1/2 + q/4

= q/2

Since E[W(y)] = q/2 ≠ q for every q ≠ 0, we conclude that W(y) is not an unbiased estimator of q.
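The expectation of W(y) can also be checked by simulation. A minimal sketch, using rejection sampling from f(y | q) = 2yq + 1 (which is bounded above by 1 + |q| on the interval):

```python
import random

# Monte Carlo check of E[W(Y)] for the pdf f(y|q) = 2yq + 1 on (-1/2, 1/2).
# Y is drawn by rejection sampling: propose y ~ Uniform(-1/2, 1/2) and accept
# with probability f(y|q) / (1 + |q|).

def sample_y(q, rng):
    while True:
        y = rng.uniform(-0.5, 0.5)
        if rng.random() * (1 + abs(q)) <= 2 * y * q + 1:
            return y

def estimate_mean_w(q, n, seed=0):
    rng = random.Random(seed)
    total = 0
    for _ in range(n):
        total += 1 if sample_y(q, rng) >= 0 else -1  # W(y) = ±1
    return total / n

print(estimate_mean_w(q=0.8, n=200_000))  # should be close to q/2 = 0.4
```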

- b) To show that W(y) is a maximum likelihood estimator (MLE) of q, we maximize the likelihood function over the parameter space -1 ≤ q ≤ 1.

With the single observation Y = y, where -1/2 < y < 1/2, the likelihood function is

L(q) = f(y | q) = 2yq + 1

which is linear in q. Taking the derivative with respect to q:

d/dq [L(q)] = 2y

This derivative does not depend on q, so for y ≠ 0 the likelihood has no interior critical point and must attain its maximum at a boundary of the interval [-1, 1]:

If y > 0, then L(q) is strictly increasing in q, so it is maximized at q = 1 = W(y).

If y < 0, then L(q) is strictly decreasing in q, so it is maximized at q = -1 = W(y).

If y = 0, then L(q) = 1 for every q, so every value in [-1, 1] maximizes the likelihood, including W(0) = 1.

In every case W(y) maximizes L(q) over -1 ≤ q ≤ 1. Hence W(y) is an MLE of q.
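Whether W(y) maximizes L(q) = 2yq + 1 over [-1, 1] can also be checked numerically with a simple grid search (a sketch; the grid resolution is arbitrary):

```python
# Grid-search check that q = W(y) maximizes the likelihood L(q) = 2yq + 1
# over the parameter space [-1, 1], for several sample values of y.

def W(y):
    return 1 if y >= 0 else -1

def argmax_likelihood(y, steps=2001):
    grid = [-1 + 2 * k / (steps - 1) for k in range(steps)]  # covers [-1, 1]
    return max(grid, key=lambda q: 2 * y * q + 1)

for y in (-0.4, -0.1, 0.1, 0.4):
    assert argmax_likelihood(y) == W(y)
print("W(y) maximizes the likelihood at every test point")
```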