orthogonal

Statistics

(adjective)

statistically independent, with reference to variates

Related Terms

  • two-way ANOVA
  • homoscedastic
  • least squares
  • polynomial regression
Physics

(adjective)

Of two objects, at right angles; perpendicular to each other.

Related Terms

  • interferometer
  • superposition principle
  • vector
Art History

(noun)

In linear perspective drawing, a diagonal line pointing to the vanishing point; sometimes referred to as a vanishing or convergence line.

Related Terms

  • colonnade
  • abacus
  • volute
  • dipteral
  • stoa
  • acanthus

Examples of orthogonal in the following topics:

  • A Geometrical Picture

    • Therefore all the elements in the null space are orthogonal to all the elements in the row space.
    • In mathematical terminology, the null space and the row space are orthogonal complements of one another.
    • Or, to say the same thing, they are orthogonal subspaces of $\mathbf{R}^m$.
    • Similarly, vectors in the left null space of a matrix are orthogonal to all the columns of this matrix.
    • This means that the left null space of a matrix is the orthogonal complement of the column space in $\mathbf{R}^n$.
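
The orthogonality of the four fundamental subspaces can be checked numerically. A minimal sketch, using a hypothetical rank-deficient matrix and NumPy's SVD to extract orthonormal bases for the row space and null space:

```python
import numpy as np

# A hypothetical 3x4 matrix (rank 2), used purely for illustration.
A = np.array([[1., 2., 3., 4.],
              [2., 4., 6., 8.],
              [1., 0., 1., 0.]])

# The SVD provides orthonormal bases for the fundamental subspaces.
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))

row_space = Vt[:rank]       # rows span the row space of A
null_space = Vt[rank:]      # rows span the null space of A

# Every row-space vector is orthogonal to every null-space vector.
print(np.allclose(row_space @ null_space.T, 0.0))   # True
```

The same construction with `U` in place of `Vt` gives the column space and left null space, which are orthogonal complements in the same way.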
  • Superposition and orthogonal projection

    • But the solution is especially simple if the $\mathbf{x}_i$ are orthogonal.
    • In this case we can find the coefficients easily by projecting onto the orthogonal directions:
    • If the basis functions $q_i(x)$ are "orthogonal", then we should be able to compute the Fourier coefficients by simply projecting the function $f(x)$ onto each of the orthogonal "vectors" $q_i(x)$.
    • Then we will say that two functions are orthogonal if their inner product is zero.
    • Now we simply need to show that the sines and cosines (or complex exponentials) are orthogonal.
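
The projection recipe can be sketched numerically. Assuming the basis $\sin(kx)$ on $[0, 2\pi]$ (an illustrative choice) and a test function with known coefficients, projecting onto each orthogonal direction recovers those coefficients:

```python
import numpy as np

# Periodic grid on [0, 2*pi); the rectangle rule is very accurate here.
x = np.linspace(0.0, 2*np.pi, 4000, endpoint=False)
dx = x[1] - x[0]

def inner(f, g):
    # Discrete approximation of the inner product: integral of f*g dx.
    return np.sum(f * g) * dx

# A test function with known coefficients: f = 2 sin(x) - 3 sin(4x).
f = 2*np.sin(x) - 3*np.sin(4*x)

# Project onto each orthogonal "vector" sin(kx); ||sin(kx)||^2 = pi.
coeffs = [inner(f, np.sin(k*x)) / np.pi for k in range(1, 6)]
print(np.round(coeffs, 3))   # approximately [2, 0, 0, -3, 0]
```

Because the sines are mutually orthogonal, each coefficient is found independently; no linear system needs to be solved.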
  • Some Special Matrices

    • A matrix $Q \in \mathbf{R}^{n \times n}$ is said to be orthogonal if $Q^T Q = I_n$.
    • So why are these matrices called orthogonal?
    • An orthogonal matrix has an especially nice geometrical interpretation.
    • Therefore an orthogonal matrix maps a vector into another vector of the same norm.
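
Both properties can be demonstrated with a rotation matrix, the standard illustrative example of an orthogonal matrix:

```python
import numpy as np

# A 2D rotation is orthogonal: Q^T Q = I.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(Q.T @ Q, np.eye(2)))   # True: Q is orthogonal

# Orthogonal matrices preserve the norm of any vector.
v = np.array([3.0, 4.0])
# Both norms equal 5 (up to floating-point rounding).
print(np.linalg.norm(Q @ v), np.linalg.norm(v))
```

This is the geometrical interpretation mentioned above: an orthogonal matrix rotates (or reflects) a vector without stretching it.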
  • The Linear Algebra of the DFT

    • The matrix $Q$ is almost orthogonal.
    • We have said that a matrix $A$ is orthogonal if $A A^T = A^T A = I$, where $I$ is the $N$-dimensional identity matrix.
    • For complex matrices we need to generalize this definition slightly; for complex matrices we will say that $A$ is orthogonal if $(A^T)^* A = A (A^T)^* = I$.
    • Once again, orthogonality saves us from having to solve a linear system of equations: since $Q^* = Q^{-1}$, we have
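
A minimal sketch of this property, building the DFT matrix explicitly. The $1/\sqrt{N}$ normalization is an assumption (conventions differ), chosen here so that $Q$ is exactly unitary and the inverse transform is just multiplication by $Q^*$:

```python
import numpy as np

# Normalized DFT matrix: Q[j, k] = exp(-2*pi*i*j*k/N) / sqrt(N).
N = 8
n = np.arange(N)
Q = np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)

# Unitary in the complex sense: (Q^T)* Q = I.
print(np.allclose(Q.conj().T @ Q, np.eye(N)))   # True

# Forward transform, then invert with the conjugate transpose --
# no linear system needs to be solved.
f = np.random.default_rng(0).standard_normal(N)
F = Q @ f
print(np.allclose(Q.conj().T @ F, f))           # True
```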
  • Experimental Design

    • Orthogonality: Orthogonality concerns the forms of comparison (contrasts) that can be legitimately and efficiently carried out.
    • Contrasts can be represented by vectors and sets of orthogonal contrasts are uncorrelated and independently distributed if the data are normal.
    • Because of this independence, each orthogonal treatment provides different information from the others.
    • If there are $T$ treatments and $T-1$ orthogonal contrasts, all the information that can be captured from the experiment is obtainable from the set of contrasts.
    • Outline the methodology for designing experiments in terms of comparison, randomization, replication, blocking, orthogonality, and factorial experiments
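
A small sketch of orthogonal contrasts, assuming $T = 3$ treatments; these particular contrast vectors are a common illustrative choice, not from the text:

```python
import numpy as np

# Two contrasts for T = 3 treatments (T - 1 = 2 contrasts in total).
c1 = np.array([1.0, -1.0,  0.0])   # treatment 1 vs treatment 2
c2 = np.array([0.5,  0.5, -1.0])   # mean of treatments 1,2 vs treatment 3

# Each is a valid contrast (coefficients sum to zero) ...
print(c1.sum(), c2.sum())   # 0.0 0.0

# ... and the pair is orthogonal, so they extract independent
# pieces of information from the experiment.
print(c1 @ c2)              # 0.0
```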
  • Properties of Spherical Harmonics and Legendre Polynomials

    • The Legendre polynomials and the spherical harmonics satisfy the following "orthogonality" relations.
    • We will see shortly that these properties are the analogs for functions of the usual orthogonality relations you already know for vectors.
    • Notice that the second relation is slightly different than the others; it says that for any given value of $m$, the polynomials $P_{\ell m}$ and $P_{\ell' m}$ are orthogonal.
    • To compute the coefficients of this expansion we use the orthogonality relation exactly as you would with an ordinary vector.
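
The Legendre orthogonality relation on $[-1, 1]$, $\int_{-1}^{1} P_\ell(x)\,P_{\ell'}(x)\,dx = \frac{2}{2\ell+1}\,\delta_{\ell\ell'}$, can be verified numerically; a sketch using Gauss–Legendre quadrature, which is exact for these polynomial products:

```python
import numpy as np
from numpy.polynomial.legendre import legval, leggauss

# Quadrature nodes and weights: exact for polynomials up to degree 99.
x, w = leggauss(50)

def P(l, x):
    # Legendre polynomial P_l, selected via its coefficient vector.
    return legval(x, [0.0]*l + [1.0])

# Gram matrix of inner products for l, l' = 0..3.
G = np.array([[np.sum(w * P(l, x) * P(lp, x)) for lp in range(4)]
              for l in range(4)])

expected = np.diag([2.0/(2*l + 1) for l in range(4)])
print(np.allclose(G, expected))   # True: off-diagonals vanish
```

The diagonal entries give the normalization constants $2/(2\ell+1)$ used when computing expansion coefficients by projection.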
  • Orthogonal decomposition of rectangular matrices

    • However, there is an amazingly useful generalization that pertains if we allow a different orthogonal matrix on each side of $A$.
    • And since $S$ is symmetric it has orthogonal eigenvectors $\mathbf{w}_i$ with real eigenvalues $\lambda_i$.
    • Keep in mind that the matrices $U$ and $V$ whose columns are the model and data eigenvectors are square (respectively $n \times n$ and $m \times m$) and orthogonal.
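
This decomposition, with a different orthogonal matrix on each side of a rectangular matrix, is the singular value decomposition; a sketch with a hypothetical random matrix:

```python
import numpy as np

# Hypothetical rectangular matrix, for illustration only.
rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))

# A = U @ S @ Vt with U (5x5) and V (3x3) both square and orthogonal.
U, s, Vt = np.linalg.svd(A)

print(np.allclose(U.T @ U, np.eye(5)))    # True: U orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(3)))  # True: V orthogonal

# Reassemble A from the factors (S is rectangular diagonal).
S = np.zeros((5, 3))
S[:3, :3] = np.diag(s)
print(np.allclose(U @ S @ Vt, A))         # True
```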
  • Eigenvectors and Orthogonal Projections

    • Above we said that the matrices $V$ and $U$ were orthogonal so that $V^T V = V V^T = I_m$ and $U^T U = U U^T = I_n$.
  • A Matrix Appears

    • are orthogonal.
    • This orthogonality is an absolutely fundamental property of the natural modes of vibration of linear mechanical systems.
  • Introduction to Least Squares

    • Since $A\mathbf{x}_{ls}$ is, by definition, confined to the column space of $A$, the error in fitting the data, $A\mathbf{x}_{ls} - \mathbf{y}$, must be in the orthogonal complement of the column space.
    • The orthogonal complement of the column space is the left null space, so $A\mathbf{x}_{ls} - \mathbf{y}$ must get mapped into zero by $A^T$:
    • Earlier, when we did orthogonal projections, the projecting vectors/matrices were orthogonal, so the $A^T A$ term would have been the identity; here it is not, but the outer-product structure in $A\mathbf{x}_{ls}$ is still evident.
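
The geometric argument translates directly into the normal equations $A^T(A\mathbf{x}_{ls} - \mathbf{y}) = 0$; a sketch with a hypothetical overdetermined system, cross-checked against NumPy's least-squares solver:

```python
import numpy as np

# Hypothetical overdetermined system: 6 equations, 3 unknowns.
rng = np.random.default_rng(2)
A = rng.standard_normal((6, 3))
y = rng.standard_normal(6)

# Solve the normal equations A^T A x = A^T y.
x_ls = np.linalg.solve(A.T @ A, A.T @ y)

# The residual lies in the left null space: orthogonal to the
# column space of A, so A^T maps it to zero.
residual = A @ x_ls - y
print(np.allclose(A.T @ residual, 0.0))   # True

# Same answer as the library least-squares routine.
print(np.allclose(x_ls, np.linalg.lstsq(A, y, rcond=None)[0]))  # True
```

Forming $A^T A$ explicitly is fine for a small sketch like this; in practice the QR or SVD-based solvers are preferred for numerical stability.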

Except where noted, content and user contributions on this site are licensed under CC BY-SA 4.0 with attribution required.