z-value

(noun)

the standardized value of an observation, found by subtracting the mean from the observed value and then dividing the difference by the standard deviation; also called a $z$-score

Related Terms

  • standard deviation

Examples of z-value in the following topics:

  • Summary of Formulas

    • To find the kth percentile when the z-score is known: $k = \mu + (z)\sigma$ (see the sketch below)
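
The percentile formula above can be checked numerically. Below is a minimal sketch, assuming SciPy is available and using hypothetical values for the mean, standard deviation, and percentile (none of these numbers come from the excerpt).

```python
# Sketch of k = mu + (z) * sigma with hypothetical parameters,
# cross-checked against SciPy's inverse normal CDF.
from scipy.stats import norm

mu, sigma = 63.0, 5.0     # hypothetical population mean and SD
p = 0.90                  # looking for the 90th percentile

z = norm.ppf(p)           # z-score that cuts off the lower 90%
k = mu + z * sigma        # k = mu + (z) * sigma

print(round(k, 2))                                  # formula result
print(round(norm.ppf(p, loc=mu, scale=sigma), 2))   # SciPy agrees
```
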
  • The Standard Normal Distribution

    • The standard normal distribution is a normal distribution of standardized values called z-scores.
    • A z-score is measured in units of the standard deviation.
    • $x = \mu + (z)\sigma = 5 + (3)(2) = 11$
    • The transformation $z = (x - \mu)/\sigma$ produces the distribution $Z \sim N(0,1)$.
    • The value x comes from a normal distribution with mean µ and standard deviation σ.
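
The two transformations quoted above can be verified directly with the numbers given ($\mu = 5$, $\sigma = 2$, $z = 3$); a minimal sketch:

```python
# Reproducing the worked example: mu = 5, sigma = 2, z = 3.
mu, sigma, z = 5, 2, 3

x = mu + z * sigma           # x = mu + (z) * sigma = 11
z_back = (x - mu) / sigma    # the inverse transformation recovers z

print(x, z_back)             # 11 3.0
```
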
  • Normal probability table

    • A normal probability table, which lists Z scores and corresponding percentiles, can be used to identify a percentile based on the Z score (and vice versa).
    • Generally, we round Z to two decimals, identify the proper row in the normal probability table up through the first decimal, and then determine the column representing the second decimal value.
    • We can also find the Z score associated with a percentile.
    • For example, to identify Z for the 80th percentile, we look for the value closest to 0.8000 in the middle portion of the table: 0.7995.
    • We determine the Z score for the 80th percentile by combining the row and column Z values: 0.84.
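
The same lookup can be done in software with the standard normal CDF and its inverse; a minimal sketch using SciPy (assuming it is installed):

```python
# Table lookup for Z = 0.84 and for the 80th percentile, done numerically.
from scipy.stats import norm

print(round(norm.cdf(0.84), 4))   # 0.7995 -> percentile for Z = 0.84
print(round(norm.ppf(0.80), 2))   # 0.84   -> Z score for the 80th percentile
```
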
  • Z-Scores and Location in a Distribution

    • Thus, a positive $z$-score represents an observation above the mean, while a negative $z$-score represents an observation below the mean.
    • $z$-scores are also called standard scores, $z$-values, normal scores or standardized variables.
    • The use of "$z$" is because the normal distribution is also known as the "$z$ distribution."
    • The absolute value of $z$ represents the distance between the raw score and the population mean in units of the standard deviation.
    • Define $z$-scores and demonstrate how they are converted from raw scores
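
A minimal sketch of the sign convention and of $|z|$ as a distance in standard-deviation units, using hypothetical values for the mean and standard deviation:

```python
# Hypothetical distribution: mean 100, standard deviation 15.
mu, sigma = 100.0, 15.0

for x in (130.0, 70.0):          # one observation above the mean, one below
    z = (x - mu) / sigma
    print(x, z, abs(z))          # z > 0 above the mean, z < 0 below;
                                 # |z| = 2.0 SDs from the mean in both cases
```
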
  • Z-scores

    • The z-score tells you how many standard deviations the value x is above (to the right of) or below (to the left of) the mean, µ.
    • Values of x that are larger than the mean have positive z-scores and values of x that are smaller than the mean have negative z-scores.
    • The z-score for y = 4 is z = 2.
    • The values 50 − 6 = 44 and 50 + 6 = 56 are within 1 standard deviation of the mean 50.
    • The values 50 − 12 = 38 and 50 + 12 = 62 are within 2 standard deviations of the mean 50 (checked in the sketch below).
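
A minimal sketch checking the values quoted above. The z-score check assumes y = 4 comes from a distribution with mean 2 and standard deviation 1 (an assumption; the excerpt only states that z = 2); the second part uses mean 50 and standard deviation 6, so two standard deviations is 12.

```python
# Assumed distribution for y: mean 2, SD 1, so y = 4 gives z = 2.
y, mu_y, sigma_y = 4, 2, 1
print((y - mu_y) / sigma_y)                     # 2.0

# Mean 50 with SD 6: one- and two-SD ranges around the mean.
mu_x, sigma_x = 50, 6
print(mu_x - sigma_x, mu_x + sigma_x)           # 44 56 (within one SD)
print(mu_x - 2 * sigma_x, mu_x + 2 * sigma_x)   # 38 62 (within two SDs)
```
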
  • The chi-square test statistic

    • That is, $Z_1$, $Z_2$, $Z_3$, and $Z_4$ must be combined somehow to help determine if they – as a group – tend to be unusually far from zero.
    • $|Z_1| + |Z_2| + |Z_3| + |Z_4| = 4.58$
    • However, it is more common to add the squared values:
    • The test statistic $X^2$, which is the sum of the $Z^2$ values, is generally used for these reasons.
    • Using this distribution, we will be able to obtain a p-value to evaluate the hypotheses.
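
A sketch of the calculation described above: each $Z_i$ measures how far an observed count falls from its expected count, and $X^2$ is the sum of the squared $Z_i$. The observed and expected counts below are illustrative values, chosen so that $|Z_1| + |Z_2| + |Z_3| + |Z_4| \approx 4.58$ as in the excerpt.

```python
# Illustrative counts chosen so the |Z| values sum to about 4.58.
import math
from scipy.stats import chi2

observed = [205, 26, 25, 19]
expected = [198.0, 19.25, 33.0, 24.75]

z = [(o - e) / math.sqrt(e) for o, e in zip(observed, expected)]
print(round(sum(abs(zi) for zi in z), 2))       # ~4.58, as quoted above

x2 = sum(zi ** 2 for zi in z)                   # the test statistic X^2
df = len(observed) - 1                          # degrees of freedom
print(round(x2, 2), round(chi2.sf(x2, df), 3))  # X^2 and its p-value
```
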
  • When Does the Z-Test Apply?

    • For each significance level, the $Z$-test has a single critical value (for example, $1.96$ for a two-tailed test at the 5% level), which makes it more convenient than Student's $t$-test, whose critical values depend on the sample size.
    • We then calculate the standard score $Z = \frac{(T-\theta)}{s}$, from which one-tailed and two-tailed $p$-values can be calculated as $\varphi(-Z)$ (for upper-tailed tests), $\varphi(Z)$ (for lower-tailed tests), and $2\varphi(-\left|Z\right|)$ (for two-tailed tests), where $\varphi$ is the standard normal cumulative distribution function.
    • To calculate the standardized statistic $Z = \frac{X - \mu_0}{s}$, we need to either know or have an approximate value for $\sigma^2$, from which we can calculate $s^2 = \frac{\sigma^2}{n}$.
    • For larger sample sizes, the $t$-test procedure gives almost identical $p$-values as the $Z$-test procedure.
    • $Z$-tests focus on a single parameter, and treat all other unknown parameters as being fixed at their true values.
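
A minimal sketch of the calculation described above, with hypothetical values for the sample mean, null value, $\sigma$, and $n$ (none of these numbers come from the excerpt):

```python
import math
from scipy.stats import norm

x_bar = 102.3      # hypothetical sample mean
mu_0 = 100.0       # hypothetical null-hypothesis value
sigma = 12.0       # hypothetical (known) population SD
n = 81             # hypothetical sample size

s = sigma / math.sqrt(n)         # standard error: s**2 = sigma**2 / n
Z = (x_bar - mu_0) / s

p_upper = norm.cdf(-Z)           # upper-tailed p-value, phi(-Z)
p_lower = norm.cdf(Z)            # lower-tailed p-value, phi(Z)
p_two = 2 * norm.cdf(-abs(Z))    # two-tailed p-value, 2 * phi(-|Z|)
print(round(Z, 3), round(p_upper, 4), round(p_lower, 4), round(p_two, 4))
```
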
  • Standardizing with Z scores

    • Observations above the mean always have positive Z scores while those below the mean have negative Z scores.
    • If an observation is equal to the mean (e.g. an SAT score of 1500), then the Z score is 0.
    • One observation $x_1$ is said to be more unusual than another observation $x_2$ if the absolute value of its Z score is larger than the absolute value of the other observation's Z score: $|Z_1| > |Z_2|$.
    • 3.4: (a) Its Z score is given by $Z = (x - \mu)/\sigma = (5.19 - 3)/2 = 2.19/2 = 1.095$. (b) The observation x is 1.095 standard deviations above the mean. We know it must be above the mean since Z is positive.
    • 3.6: Because the absolute value of Z score for the second observation is larger than that of the first, the second observation has a more unusual head length.
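
A minimal sketch verifying the worked value from exercise 3.4 above and illustrating the $|Z_1| > |Z_2|$ comparison; the second observation here is hypothetical.

```python
mu, sigma = 3.0, 2.0         # mean and SD from the worked example above

x1 = 5.19
z1 = (x1 - mu) / sigma
print(round(z1, 3))          # 1.095 -> 1.095 SDs above the mean

x2 = 1.5                     # hypothetical second observation
z2 = (x2 - mu) / sigma
print(abs(z1) > abs(z2))     # True: x1 is the more unusual observation
```
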
  • Finding the Area Under the Normal Curve

    • In order to do this, we use a $z$-score table.
    • However, this is the probability that the value is less than 1.17 sigmas above the mean.
    • The difficulty arises because our table of values does not allow us to directly calculate $P(Z\leq -1.16)$.
    • This table gives the cumulative probability up to the standardized normal value $z$.
    • Interpret a $z$-score table to calculate the probability that a variable is within range in a normal distribution
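
A minimal sketch of the same lookups using the standard normal CDF; when a table lists only positive $z$ values, symmetry gives $P(Z \leq -z) = 1 - P(Z \leq z)$.

```python
from scipy.stats import norm

print(round(norm.cdf(1.17), 4))       # P(Z <= 1.17): cumulative probability
                                      # up to 1.17 SDs above the mean
print(round(1 - norm.cdf(1.16), 4))   # P(Z <= -1.16) via symmetry
print(round(norm.cdf(-1.16), 4))      # same value, computed directly
```
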
  • Solving Systems of Equations in Three Variables

    • The introduction of the variable z means that the graphed functions now represent planes, rather than lines.
    • Plug in these values to each of the equations to see that the solution satisfies all three of the equations.
    • Since the coefficient of z is already 1 in the first equation, solve for z to get:
    • Now that you have the value of y, work back up the equation.
    • Working up again, plug $(1,2)$ into the first substituted equation and solve for z:
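
The original equations are not reproduced in the excerpt above, so the following is a minimal sketch with a hypothetical system, chosen so that (as in the excerpt) the first equation has a z-coefficient of 1 and the x and y values of the solution are 1 and 2:

```python
import numpy as np

# Hypothetical system:
#   x +  y + z = 6
#  2x -  y + z = 3
#   x + 2y - z = 2
A = np.array([[1.0, 1.0, 1.0],
              [2.0, -1.0, 1.0],
              [1.0, 2.0, -1.0]])
b = np.array([6.0, 3.0, 2.0])

solution = np.linalg.solve(A, b)      # -> [1. 2. 3.]
print(solution)
print(np.allclose(A @ solution, b))   # plugging back satisfies all three
```
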
