continuous random variable

(noun)

a random variable that can take any value in an interval of the real numbers, and hence infinitely many values; typically obtained by measuring

Related Terms

  • random variable
  • standard deviation
  • discrete random variable

Examples of continuous random variable in the following topics:

  • Introduction

    • Continuous random variables have many applications.
    • The field of reliability depends on a variety of continuous random variables.
    • This chapter gives an introduction to continuous random variables and the many continuous distributions.
    • NOTE: The values of discrete and continuous random variables can be ambiguous.
    • If X is the distance you drive to work, then you measure values of X and X is a continuous random variable.
  • Two Types of Random Variables

    • A random variable $x$, and its distribution, can be discrete or continuous.
    • Random variables can be classified as either discrete (that is, taking any of a specified list of exact values) or as continuous (taking any numerical value in an interval or collection of intervals).
    • Continuous random variables, on the other hand, take on values that vary continuously within one or more real intervals, and have a cumulative distribution function (CDF) that is absolutely continuous.
    • Selecting a random number between 0 and 1 is an example of a continuous random variable, because there are infinitely many possibilities.
    • The image shows the probability density function (pdf) of the normal distribution, also called Gaussian or "bell curve", the most important continuous random distribution.
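The discrete/continuous distinction above can be sketched in code. This is an illustrative example, not from the source: a coin-flip count stands in for a discrete random variable and a uniform draw on $[0, 1)$ for a continuous one.

```python
import random

# Discrete: number of heads in 10 coin flips -- the value is one of a
# specified list of exact values, {0, 1, ..., 10}.
heads = sum(random.randint(0, 1) for _ in range(10))

# Continuous: a random number in [0, 1) -- uncountably many possible
# values, so any single exact value has probability zero.
u = random.random()

print(heads in range(11))  # heads is one of finitely many exact values
print(0.0 <= u < 1.0)      # u lies somewhere in a continuum
```
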
  • Continuous Probability Distributions

    • A continuous probability distribution is a representation of a variable that can take a continuous range of values.
    • If the distribution of $X$ is continuous, then $X$ is called a continuous random variable.
    • Intuitively, a continuous random variable is the one which can take a continuous range of values—as opposed to a discrete distribution, in which the set of possible values for the random variable is at most countable.
    • While for a discrete distribution an event with probability zero is impossible (e.g. rolling three and a half on a standard die is impossible, and has probability zero), this is not so for a continuous random variable: every individual value has probability zero, yet some value is always observed.
    • In theory, a probability density function is a function that describes the relative likelihood for a random variable to take on a given value.
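The two points above can be demonstrated numerically. A hypothetical sketch (not from the source), using the standard normal distribution and its CDF computed via the error function: the probability of any exact value is zero, while an interval carries positive probability.

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of the normal distribution, computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# For a continuous random variable, a single value has probability zero,
# but an interval can carry positive probability:
p_exact = normal_cdf(1.0) - normal_cdf(1.0)      # P(X = 1) -> 0
p_interval = normal_cdf(1.0) - normal_cdf(-1.0)  # P(-1 <= X <= 1), ~ 0.6827

print(p_exact)               # 0.0
print(round(p_interval, 4))
```
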
  • Recognizing and Using a Histogram

    • As mentioned, a histogram is an estimate of the probability distribution of a continuous variable.
    • To define probability distributions for the simplest cases, one needs to distinguish between discrete and continuous random variables.
    • In contrast, when a random variable takes values from a continuum, nonzero probabilities can be assigned only to intervals, never to individual values.
    • Intuitively, a continuous random variable is the one which can take a continuous range of values — as opposed to a discrete distribution, where the set of possible values for the random variable is, at most, countable.
    • If the distribution of $x$ is continuous, then $x$ is called a continuous random variable and, therefore, has a continuous probability distribution.
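A minimal sketch of the histogram-as-density-estimate idea above, using only the standard library (the uniform samples and bin count are illustrative assumptions): normalizing bin counts by the sample size times the bin width turns the histogram into an estimate of the pdf.

```python
import random

# A histogram of measured values estimates the pdf of a continuous
# random variable. Here: 10,000 draws from Uniform(0, 1) in 10 bins.
random.seed(42)
samples = [random.random() for _ in range(10_000)]

bins = 10
counts = [0] * bins
for x in samples:
    counts[min(int(x * bins), bins - 1)] += 1

# Dividing each count by (n * bin_width) gives a density estimate;
# for Uniform(0, 1) each bar should be close to the true density f(x) = 1.
densities = [c / (len(samples) * (1 / bins)) for c in counts]
print(densities)
```
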
  • Probability

    • Integration is commonly used in statistical analysis, especially when a random variable takes a continuum value.
    • In probability theory, a probability density function (pdf), or density of a continuous random variable, is a function that describes the relative likelihood for this random variable to take on a given value.
    • The probability for the random variable to fall within a particular region is given by the integral of this variable's probability density over the region.
    • A probability density function is most commonly associated with absolutely continuous univariate distributions.
    • For a continuous random variable $X$ with probability density function $f$, the probability that $X$ falls in the range $[a,b]$ is given by $P(a \le X \le b) = \int_a^b f(x)\,dx$.
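The integral above can be checked numerically. A hedged sketch (the exponential pdf and midpoint rule are choices for illustration, not from the source): approximating the integral of the pdf over $[a, b]$ and comparing against the closed-form exponential CDF.

```python
import math

# P(a <= X <= b) = integral of the pdf over [a, b], demonstrated with
# the exponential distribution (rate lam) as the example pdf.
def exp_pdf(x, lam=1.0):
    return lam * math.exp(-lam * x)

def prob_interval(a, b, pdf, n=100_000):
    """Approximate the integral of pdf over [a, b] with the midpoint rule."""
    h = (b - a) / n
    return sum(pdf(a + (i + 0.5) * h) for i in range(n)) * h

approx = prob_interval(0.0, 1.0, exp_pdf)
exact = 1 - math.exp(-1.0)  # closed-form exponential CDF at b = 1
print(approx, exact)
```
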
  • The Correction Factor

    • In probability theory, the expected value refers, intuitively, to the value of a random variable one would "expect" to find if one could repeat the random variable process an infinite number of times and take the average of the values obtained.
    • The weights used in computing this average are the probabilities in the case of a discrete random variable (a random variable that can take on only a finite or countably infinite number of values, such as a roll of a pair of dice), or the values of a probability density function in the case of a continuous random variable (a random variable that can assume any value in a continuum, such as the height of a person).
    • From a rigorous theoretical standpoint, the expected value of a continuous variable is the integral of the random variable with respect to its probability measure.
    • Thus, for a continuous random variable the expected value is the limit of the weighted sum, i.e. the integral.
    • Suppose we have a random variable X, which represents the number of girls in a family of three children.
  • Expected Value

    • In probability theory, the expected value refers, intuitively, to the value of a random variable one would "expect" to find if one could repeat the random variable process an infinite number of times and take the average of the values obtained.
    • The weights used in computing this average are the probabilities in the case of a discrete random variable (a random variable that can take on only a finite or countably infinite number of values, such as a roll of a pair of dice), or the values of a probability density function in the case of a continuous random variable (a random variable that can assume any value in a continuum, such as the height of a person).
    • From a rigorous theoretical standpoint, the expected value of a continuous variable is the integral of the random variable with respect to its probability measure.
    • Thus, for a continuous random variable the expected value is the limit of the weighted sum, i.e. the integral.
    • Suppose we have a random variable $X$, which represents the number of girls in a family of three children.
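The weighted-sum definition of expected value can be worked out for the example just given. A sketch assuming each child is equally likely to be a girl or a boy (an assumption for illustration): enumerating the eight equally likely outcomes and computing $E[X] = \sum_x x \cdot P(X = x)$.

```python
from fractions import Fraction
from itertools import product

# X = number of girls in a family of three children, each child assumed
# equally likely to be a girl ("G") or a boy ("B").
outcomes = list(product("GB", repeat=3))  # 8 equally likely outcomes
p = Fraction(1, len(outcomes))

# E[X] = sum over outcomes of (number of girls) * (probability of outcome)
expected = sum(p * "".join(o).count("G") for o in outcomes)
print(expected)  # 3/2
```
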
  • Expectation

    • We call a variable or process with a numerical outcome a random variable, and we usually represent this random variable with a capital letter such as X, Y, or Z.
    • The amount of money a single student will spend on her statistics books is a random variable, and we represent it by X.
    • The expected value for a random variable represents the average outcome.
    • It is also possible to compute the expected value of a continuous random variable (see Section 2.5).
    • The probability distribution for the random variable X, representing the bookstore's revenue from a single student.
  • Chance Processes

    • A stochastic process is a collection of random variables that is often used to represent the evolution of some random value over time.
    • In probability theory, a stochastic process (sometimes called a random process) is a collection of random variables that is often used to represent the evolution of some random value, or system, over time.
    • In other words, a stochastic process is a random function whose arguments are drawn from a range of continuously changing values.
    • Random variables are non-deterministic quantities whose values follow certain probability distributions.
    • Random variables corresponding to various times (or points, in the case of random fields) may be completely different.
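A stochastic process as described above can be sketched as a collection of random variables indexed by time. A hypothetical example (not from the source): a simple symmetric random walk $X_0, X_1, \ldots, X_n$, where each step is $\pm 1$ with equal probability.

```python
import random

random.seed(0)

def random_walk(n_steps):
    """One realization of a simple symmetric random walk starting at 0."""
    position = 0
    path = [position]
    for _ in range(n_steps):
        position += random.choice([-1, 1])  # each step is +/- 1
        path.append(position)
    return path

# Each X_t in the path is a random variable; together they form the process.
path = random_walk(20)
print(path)
```
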
  • Summary

    • Quantitative data (a number) can be discrete (you count it) or continuous (you measure it).

Except where noted, content and user contributions on this site are licensed under CC BY-SA 4.0 with attribution required.