orthogonal
Statistics
statistically independent, with reference to variates
Physics
Of two objects, at right angles; perpendicular to each other.
Examples of orthogonal in the following topics:
A Geometrical Picture
- Therefore all the elements in the null space are orthogonal to all the elements in the row space.
- In mathematical terminology, the null space and the row space are orthogonal complements of one another.
- Or, to say the same thing, they are orthogonal subspaces of R^n.
- Similarly, vectors in the left null space of a matrix are orthogonal to all the columns of this matrix.
- This means that the left null space of a matrix is the orthogonal complement of its column space.
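The orthogonality of the null space and the row space can be checked numerically. Below is a minimal sketch using NumPy; the matrix values are an arbitrary choice for illustration, and the null-space basis is taken from the right singular vectors with (numerically) zero singular value.

```python
import numpy as np

# A small example matrix (hypothetical choice; any matrix works).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# Right singular vectors with zero singular value span the null space.
U, s, Vt = np.linalg.svd(A)
rank = np.sum(s > 1e-10)
null_basis = Vt[rank:].T          # columns span the null space of A

# Each row of A (a set of row-space generators) is orthogonal to the
# null space, so A @ null_basis is (numerically) zero.
print(np.allclose(A @ null_basis, 0.0))   # True
```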
Superposition and orthogonal projection
- But the solution is especially simple if the basis vectors are orthogonal.
- In this case we can find the coefficients easily by projecting the function onto the orthogonal directions:
- If the basis functions are "orthogonal", then we should be able to compute the Fourier coefficients by simply projecting the function onto each of the orthogonal "vectors."
- Then we will say that two functions are orthogonal if their inner product is zero.
- Now we simply need to show that the sines and cosines (or complex exponentials) are orthogonal.
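The projection recipe can be sketched numerically. In this hedged example the test function and grid are arbitrary choices: we build f(x) = 2 sin(x) + 3 cos(2x) and recover the coefficients 2 and 3 by projecting onto sin(x) and cos(2x), whose squared norm over one period is π.

```python
import numpy as np

# Arbitrary test function on [0, 2*pi]: f = 2*sin(x) + 3*cos(2x).
x = np.linspace(0.0, 2.0 * np.pi, 10001)
dx = x[1] - x[0]
f = 2.0 * np.sin(x) + 3.0 * np.cos(2.0 * x)

# Inner product <g, h> = integral of g*h over one period (Riemann sum);
# the norm-squared of sin(nx) and cos(nx) over [0, 2*pi] is pi.
a = np.sum(f * np.sin(x)) * dx / np.pi        # coefficient of sin(x)
b = np.sum(f * np.cos(2.0 * x)) * dx / np.pi  # coefficient of cos(2x)

# The projections recover the coefficients because the cross terms
# (sin against cos, different frequencies) integrate to zero.
print(abs(a - 2.0) < 1e-2, abs(b - 3.0) < 1e-2)   # True True
```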
Some Special Matrices
- A matrix Q is said to be orthogonal if Q^T Q = I.
- So why are these matrices called orthogonal?
- An orthogonal matrix has an especially nice geometrical interpretation.
- Therefore an orthogonal matrix maps a vector into another vector of the same norm.
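The norm-preserving property can be checked with a small sketch. The rotation matrix below is one standard example of an orthogonal matrix; the angle and test vector are arbitrary choices.

```python
import numpy as np

# A 2-D rotation matrix is orthogonal: Q.T @ Q = I.
theta = 0.7   # arbitrary angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x = np.array([3.0, 4.0])   # arbitrary vector with norm 5

# Orthogonality, and the geometric consequence: Q maps x to a vector
# of the same norm.
print(np.allclose(Q.T @ Q, np.eye(2)))                        # True
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))   # True
```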
The Linear Algebra of the DFT
- The DFT matrix is almost orthogonal.
- We have said that a matrix Q is orthogonal if Q^T Q = I, where I is the N-dimensional identity matrix.
- For complex matrices we need to generalize this definition slightly; for a complex matrix we will say that Q is orthogonal if Q^H Q = I, where Q^H is the conjugate (Hermitian) transpose.
- Once again, orthogonality saves us from having to solve a linear system of equations: since Q^H Q = I, we have Q^{-1} = Q^H.
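"Almost orthogonal" can be made concrete with a sketch. Assuming the standard DFT matrix with entries exp(-2πi jk/N), it satisfies F^H F = N I, so it is orthogonal up to the factor N, and the inverse transform is just F^H / N rather than a linear solve.

```python
import numpy as np

# The N x N DFT matrix with entries exp(-2j*pi*j*k/N); N is arbitrary.
N = 8
j, k = np.meshgrid(np.arange(N), np.arange(N))
F = np.exp(-2j * np.pi * j * k / N)

# F^H F = N I: orthogonal up to the scale factor N.
print(np.allclose(F.conj().T @ F, N * np.eye(N)))   # True

# So inversion needs no linear solve: x = F^H (F x) / N.
x = np.random.default_rng(0).standard_normal(N)
print(np.allclose(F.conj().T @ (F @ x) / N, x))     # True
```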
Experimental Design
- Orthogonality: Orthogonality concerns the forms of comparison (contrasts) that can be legitimately and efficiently carried out.
- Contrasts can be represented by vectors and sets of orthogonal contrasts are uncorrelated and independently distributed if the data are normal.
- Because of this independence, each orthogonal contrast provides different information from the others.
- If there are t treatments and t − 1 orthogonal contrasts, all the information that can be captured from the experiment is obtainable from the set of contrasts.
- Outline the methodology for designing experiments in terms of comparison, randomization, replication, blocking, orthogonality, and factorial experiments
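A small sketch of orthogonal contrasts, under the assumption of equal group sizes: with t = 3 treatments there are t − 1 = 2 orthogonal contrasts. The particular contrast vectors below are a hypothetical but standard choice (treatment 1 vs 2, then their mean vs treatment 3).

```python
import numpy as np

# Two contrasts for three treatments (equal group sizes assumed):
# c1 compares treatments 1 and 2; c2 compares their mean with treatment 3.
c1 = np.array([1.0, -1.0, 0.0])
c2 = np.array([0.5,  0.5, -1.0])

# Each is a valid contrast (coefficients sum to zero) ...
print(np.isclose(c1.sum(), 0.0), np.isclose(c2.sum(), 0.0))  # True True
# ... and they are orthogonal (zero dot product), so they carry
# independent pieces of information about the treatment means.
print(np.isclose(c1 @ c2, 0.0))                              # True
```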
Properties of Spherical Harmonics and Legendre Polynomials
- The Legendre polynomials and the spherical harmonics satisfy the following "orthogonality" relations.
- We will see shortly that these properties are the analogs for functions of the usual orthogonality relations you already know for vectors.
- Notice that the second relation is slightly different than the others; it says that for any given value of m, the associated Legendre polynomials of different degree are orthogonal.
- To compute the coefficients of this expansion we use the orthogonality relation exactly as you would with an ordinary vector.
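The Legendre orthogonality relation can be verified with a sketch. Using NumPy's Legendre-series utilities, the inner product of P_m and P_n over [-1, 1] is 0 for m ≠ n and 2/(2n + 1) for m = n; the helper name `legendre_inner` is ours, not part of any library.

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_inner(m, n):
    """Integral of P_m(x) * P_n(x) over [-1, 1]."""
    cm = [0] * m + [1]                  # coefficient vector selecting P_m
    cn = [0] * n + [1]                  # coefficient vector selecting P_n
    prod = legendre.legmul(cm, cn)      # Legendre series of the product
    anti = legendre.legint(prod)        # antiderivative as a Legendre series
    return legendre.legval(1.0, anti) - legendre.legval(-1.0, anti)

print(np.isclose(legendre_inner(2, 3), 0.0))         # True: orthogonal
print(np.isclose(legendre_inner(3, 3), 2.0 / 7.0))   # True: 2/(2n+1), n=3
```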
Orthogonal decomposition of rectangular matrices
- However, there is an amazingly useful generalization that pertains if we allow a different orthogonal matrix on each side of the matrix.
- And since the product of the matrix with its transpose is symmetric, it has orthogonal eigenvectors with real eigenvalues.
- Keep in mind that the matrices whose columns are the model and data eigenvectors are square and orthogonal.
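This decomposition is the singular value decomposition, which can be sketched directly. For an arbitrary 4 × 3 matrix, the SVD supplies a different orthogonal matrix on each side: U is 4 × 4 and V is 3 × 3.

```python
import numpy as np

# An arbitrary rectangular (4 x 3) matrix.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))

# Full SVD: A = U @ diag(s) @ Vt, with square orthogonal U and V.
U, s, Vt = np.linalg.svd(A)

print(np.allclose(U.T @ U, np.eye(4)))    # True: U is orthogonal (4 x 4)
print(np.allclose(Vt @ Vt.T, np.eye(3)))  # True: V is orthogonal (3 x 3)
# Reconstruction check: the decomposition reproduces A.
print(np.allclose(U[:, :3] @ np.diag(s) @ Vt, A))   # True
```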
Eigenvectors and Orthogonal Projections
- Above we said that the matrices U and V were orthogonal, so that U^T U = I and V^T V = I.
A Matrix Appears
- The eigenvectors are orthogonal.
- This orthogonality is an absolutely fundamental property of the natural modes of vibration of linear mechanical systems.
Introduction to Least Squares
- Since the fitted vector is, by definition, confined to the column space of the matrix, the error in fitting the data must be in the orthogonal complement of the column space.
- The orthogonal complement of the column space is the left null space, so the error must get mapped into zero by the transpose of the matrix:
- Before, when we did orthogonal projections, the projecting vectors/matrices were orthogonal, so the normal-equations term would have been the identity, but the outer product structure in the projection is evident.
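The least-squares geometry above can be sketched numerically. In this hedged example the matrix and data are arbitrary; we solve the normal equations A^T A x = A^T y and then check that the residual is orthogonal to the column space, and that the projection has the outer-product form A (A^T A)^{-1} A^T.

```python
import numpy as np

# An arbitrary overdetermined system: 6 equations, 2 unknowns.
rng = np.random.default_rng(2)
A = rng.standard_normal((6, 2))
y = rng.standard_normal(6)

# Least-squares solution from the normal equations A^T A x = A^T y.
x = np.linalg.solve(A.T @ A, A.T @ y)

# The residual e = y - A x lies in the left null space of A:
# it is mapped to zero by A^T (orthogonal to the column space).
e = y - A @ x
print(np.allclose(A.T @ e, 0.0))   # True

# Projection onto the column space, with its outer-product structure.
P = A @ np.linalg.inv(A.T @ A) @ A.T
print(np.allclose(P @ y, A @ x))   # True: P projects y onto A's columns
```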