orthogonal

(adjective)

Of two objects, at right angles; perpendicular to each other.

Related Terms

  • interferometer
  • superposition principle
  • vector

Examples of orthogonal in the following topics:

  • A Geometrical Picture

    • Therefore all the elements in the null space are orthogonal to all the elements in the row space.
    • In mathematical terminology, the null space and the row space are orthogonal complements of one another.
    • Or, to say the same thing, they are orthogonal subspaces of $\mathbf{R}^{m}$ .
    • Similarly, vectors in the left null space of a matrix are orthogonal to all the columns of this matrix.
    • This means that the left null space of a matrix is the orthogonal complement of the column space in $\mathbf{R}^{n}$ .
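This complementarity is easy to verify numerically. The sketch below uses a small rank-one matrix invented for this illustration; every vector in its null space has zero dot product with every row:

```python
# A small rank-1 matrix, invented for this illustration: its row space is
# spanned by (1, 2, 3), so its null space is two-dimensional.
A = [[1, 2, 3],
     [2, 4, 6]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Two vectors spanning the null space of A (found by solving A x = 0 by hand).
null_basis = [[-2, 1, 0], [-3, 0, 1]]

# Every null-space vector is orthogonal to every row of A.
for x in null_basis:
    for row in A:
        assert dot(row, x) == 0
```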
  • Superposition and orthogonal projection

    • But the solution is especially simple if the $\mathbf{x}_i$ are orthogonal.
    • In this case we can find the coefficients easily by projecting onto the orthogonal directions:
    • If the basis functions $q_i(x)$ are "orthogonal", then we should be able to compute the Fourier coefficients by simply projecting the function $f(x)$ onto each of the orthogonal "vectors" $q_i(x)$ .
    • Then we will say that two functions are orthogonal if their inner product is zero.
    • Now we simply need to show that the sines and cosines (or complex exponentials) are orthogonal.
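The projection recipe can be checked numerically. This sketch approximates the inner product on $[0, 2\pi]$ with a simple midpoint-rule quadrature (the interval and the modes $\sin x$, $\sin 2x$ are example choices, not the text's specific basis):

```python
import math

def inner(f, g, n=10000):
    # Midpoint-rule approximation of the inner product on [0, 2*pi].
    h = 2 * math.pi / n
    return sum(f((k + 0.5) * h) * g((k + 0.5) * h) for k in range(n)) * h

s1 = lambda x: math.sin(x)
s2 = lambda x: math.sin(2 * x)

print(abs(inner(s1, s2)))  # ~0: distinct modes are orthogonal
print(inner(s1, s1))       # ~pi: the squared "length" of a mode
```

Since a mode's inner product with itself is $\pi$ rather than 1, the projection must divide by this normalization, exactly as one would with non-unit vectors.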
  • Some Special Matrices

    • A matrix $Q \in \mathbf{R}^{{n \times n}}$ is said to be orthogonal if $Q^TQ = I_n$ .
    • So why are these matrices called orthogonal?
    • An orthogonal matrix has an especially nice geometrical interpretation.
    • Therefore an orthogonal matrix maps a vector into another vector of the same norm.
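A minimal numerical sketch of this norm-preserving property, using a 2×2 rotation matrix (the angle 0.7 is arbitrary):

```python
import math

theta = 0.7  # arbitrary rotation angle for this example
Q = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def norm(v):
    return math.sqrt(sum(x * x for x in v))

v = [3.0, 4.0]
w = matvec(Q, v)
print(norm(v), norm(w))  # both ~5.0: Q rotates v without changing its length
```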
  • The Linear Algebra of the DFT

    • The matrix $Q$ is almost orthogonal.
    • We have said that a matrix $A$ is orthogonal if $A A^T = A^T A = I$, where $I$ is the N-dimensional identity matrix.
    • For complex matrices we need to generalize this definition slightly; we will say that a complex matrix $A$ is orthogonal (in standard terminology, unitary) if $(A^T)^* A = A (A^T)^* = I$ .
    • Once again, orthogonality saves us from having to solve a linear system of equations: since $Q^* = Q^{-1}$ , we have
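This generalized orthogonality is easy to check for the DFT "columns" $q_k[n] = e^{2\pi i k n / N}$. In the sketch below ($N = 8$ is chosen arbitrarily), the Hermitian inner product of distinct columns vanishes, while each column has squared length $N$ rather than 1, which is why $Q$ is only "almost" orthogonal:

```python
import cmath

N = 8  # transform length, chosen arbitrarily for this sketch
# Columns of the (unnormalized) DFT matrix: q_k[n] = exp(2*pi*i*k*n/N).
q = [[cmath.exp(2j * cmath.pi * k * n / N) for n in range(N)] for k in range(N)]

def hermitian_inner(u, v):
    # Complex inner product: conjugate the first argument.
    return sum(a.conjugate() * b for a, b in zip(u, v))

print(abs(hermitian_inner(q[1], q[3])))  # ~0: distinct columns are orthogonal
print(abs(hermitian_inner(q[2], q[2])))  # ~8: squared length is N, not 1
```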
  • Properties of Spherical Harmonics and Legendre Polynomials

    • The Legendre polynomials and the spherical harmonics satisfy the following "orthogonality" relations.
    • We will see shortly that these properties are the analogs for functions of the usual orthogonality relations you already know for vectors.
    • Notice that the second relation is slightly different from the others; it says that for any given value of $m$, the polynomials $P_{\ell m}$ and $P_{\ell ' m}$ are orthogonal.
    • To compute the coefficients of this expansion we use the orthogonality relation exactly as you would with an ordinary vector.
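As a numerical sketch of Legendre orthogonality (using $P_2$ and $P_3$ with a simple midpoint-rule quadrature on $[-1, 1]$): distinct Legendre polynomials integrate to zero against each other, while $\int_{-1}^{1} P_\ell^2 \, dx = 2/(2\ell + 1)$.

```python
def P2(x):
    return (3 * x * x - 1) / 2

def P3(x):
    return (5 * x ** 3 - 3 * x) / 2

def integrate(f, n=20000):
    # Midpoint-rule quadrature on [-1, 1].
    h = 2.0 / n
    return sum(f(-1 + (k + 0.5) * h) for k in range(n)) * h

print(abs(integrate(lambda x: P2(x) * P3(x))))  # ~0: P2 and P3 are orthogonal
print(integrate(lambda x: P2(x) ** 2))          # ~0.4 = 2/(2*2 + 1)
```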
  • Orthogonal decomposition of rectangular matrices

    • However, there is an amazingly useful generalization that pertains if we allow a different orthogonal matrix on each side of $A$ .
    • And since $S$ is symmetric it has orthogonal eigenvectors $\mathbf{w}_i$ with real eigenvalues $\lambda_i$ .
    • Keep in mind that the matrices $U$ and $V$ whose columns are the model and data eigenvectors are square (respectively $n \times n$ and $m \times m$ ) and orthogonal.
  • Eigenvectors and Orthogonal Projections

    • Above we said that the matrices $V$ and $U$ were orthogonal so that $V^T V = V V^T = I_m$ and $U^T U = U U^T = I_n$ .
  • A Matrix Appears

    • are orthogonal.
    • This orthogonality is an absolutely fundamental property of the natural modes of vibration of linear mechanical systems.
  • Introduction to Least Squares

    • Since $A \mathbf{x_{ls}}$ is, by definition, confined to the column space of $A$, the error in fitting the data, $A\mathbf{x_{ls}} - \mathbf{y}$, must lie in the orthogonal complement of the column space.
    • The orthogonal complement of the column space is the left null space, so $A\mathbf{x_{ls}} - \mathbf{y}$ must get mapped into zero by $A^T$ :
    • Before, when we did orthogonal projections, the projecting vectors/matrices were orthogonal, so the $A^TA$ term would have been the identity; here it is not, but the outer product structure in $A\mathbf{x_{ls}}$ is still evident.
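A worked sketch of this orthogonality condition: fit a line $y = c_0 + c_1 x$ to three made-up data points by solving the normal equations $A^T A \mathbf{c} = A^T \mathbf{y}$, then verify that $A^T$ maps the residual to zero.

```python
# Made-up data points for illustration.
xs = [0.0, 1.0, 2.0]
ys = [0.0, 1.0, 3.0]

A = [[1.0, x] for x in xs]  # columns: constant term, slope term

def matTvec(M, v):
    # Computes M^T v for a list-of-rows matrix M.
    return [sum(M[i][j] * v[i] for i in range(len(M))) for j in range(len(M[0]))]

# Form A^T A (2x2) and A^T y (2-vector) explicitly.
AtA = [[sum(A[i][j] * A[i][k] for i in range(len(A))) for k in range(2)]
       for j in range(2)]
Aty = matTvec(A, ys)

# Solve the 2x2 normal equations by Cramer's rule.
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
c0 = (Aty[0] * AtA[1][1] - AtA[0][1] * Aty[1]) / det
c1 = (AtA[0][0] * Aty[1] - Aty[0] * AtA[1][0]) / det

residual = [c0 + c1 * x - y for x, y in zip(xs, ys)]
print(matTvec(A, residual))  # ~[0, 0]: residual is orthogonal to A's columns
```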
  • A few examples

    • So, of course, to make our lives simple we will choose an orthogonal matrix.
    • And for symmetric matrices we nearly always choose to diagonalize with orthogonal matrices.
    • Are the eigenvectors orthogonal?
    • As we have seen, an orthogonal matrix corresponds to a rotation.
    • Consider the eigenvalue problem for a simple orthogonal matrix such as
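One such example (the matrix here is a 2×2 rotation, chosen for this sketch) shows what changes: a rotation has no real eigenvectors, and its eigenvalues $e^{\pm i\theta}$ are complex with unit modulus.

```python
import cmath
import math

theta = math.pi / 3  # sample rotation angle, chosen for this sketch
# For the 2x2 rotation matrix the characteristic polynomial is
#   lam^2 - 2*cos(theta)*lam + 1 = 0,
# whose roots are exp(+i*theta) and exp(-i*theta).
trace = 2 * math.cos(theta)
disc = cmath.sqrt(trace * trace - 4)
lams = [(trace + disc) / 2, (trace - disc) / 2]
for lam in lams:
    print(lam, abs(lam))  # complex eigenvalues on the unit circle
```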

Except where noted, content and user contributions on this site are licensed under CC BY-SA 4.0 with attribution required.