
Question

A sample of 100 independent random numbers is taken from this distribution, and its average is used to estimate the mean of the distribution. What is the standard error of this estimate?


Answer


    Answer:

    The standard error in estimating the mean = 0.1 × (standard deviation of the distribution)

    Step-by-step explanation:

    The standard error of the sample mean, σₓ, is related to the standard deviation of the distribution, σ, by the relation

    σₓ = σ/(√n)

    n = sample size = 100

    σₓ = σ/(√100)

    σₓ = (σ/10) = 0.1σ

    Hence, the standard error in estimating the mean = 0.1 × (standard deviation of the distribution).
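
    As a quick numerical check (not part of the original answer), here is a minimal Python sketch using NumPy. It assumes an illustrative population with σ = 2 (the question leaves σ unspecified), simulates many samples of size 100, and confirms that the spread of the sample means matches σ/√100 = 0.1σ:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    sigma = 2.0       # assumed population standard deviation (illustrative only)
    n = 100           # sample size from the question
    trials = 100_000  # number of simulated samples

    # Draw `trials` samples of size n and take each sample's mean.
    # The specific distribution (normal with mean 5) is an arbitrary
    # choice; the standard-error formula sigma/sqrt(n) does not depend on it.
    means = rng.normal(loc=5.0, scale=sigma, size=(trials, n)).mean(axis=1)

    print("empirical SE of the mean: ", means.std(ddof=1))      # ~0.2
    print("theoretical sigma/sqrt(n):", sigma / np.sqrt(n))     # 0.2 = 0.1 * sigma
    ```

    The empirical standard deviation of the simulated sample means comes out very close to σ/10, matching the formula above.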
