Projective Geometry and Transformations of 3D
3.2.2 Lines
(3.13) can be understood from (3.4). Applying (3.13) to a line and itself, i.e. \((\mathcal{L} \mid \mathcal{L})\), yields (3.12). (3.12) can also be confirmed by evaluating the determinant directly. (3.14) can be derived from \(\mathbf{X} = \mathrm{L} \boldsymbol{\pi}\) by applying (3.2) twice.
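These identities are easy to check numerically. The following is a minimal sketch assuming numpy; the helper names (`plucker_matrix`, `plucker_coords`, `bilinear`) are mine and simply transcribe the Plücker matrix (3.8), the coordinate ordering \(\{l_{12}, l_{13}, l_{14}, l_{23}, l_{42}, l_{34}\}\), and the bilinear product (3.13).

```python
import numpy as np

def plucker_matrix(A, B):
    """Plucker matrix L = A B^T - B A^T of the line through the points A and B (3.8)."""
    return np.outer(A, B) - np.outer(B, A)

def plucker_coords(L):
    """Plucker line coordinates {l12, l13, l14, l23, l42, l34} read off the matrix L."""
    return np.array([L[0, 1], L[0, 2], L[0, 3], L[1, 2], L[3, 1], L[2, 3]])

def bilinear(l, m):
    """(L | M) = l12 m34 + m12 l34 + l13 m42 + m13 l42 + l14 m23 + m14 l23 (3.13)."""
    l12, l13, l14, l23, l42, l34 = l
    m12, m13, m14, m23, m42, m34 = m
    return l12*m34 + m12*l34 + l13*m42 + m13*l42 + l14*m23 + m14*l23

rng = np.random.default_rng(0)
A, B, C, D = rng.normal(size=(4, 4))          # four random points of P^3
l = plucker_coords(plucker_matrix(A, B))      # line through A, B
m = plucker_coords(plucker_matrix(C, D))      # line through C, D

# A line is coplanar with itself, so (L | L) = 0; this is twice the constraint (3.12).
assert np.isclose(bilinear(l, l), 0.0)

# (L | M) equals det[A B C D] (cf. (3.14)); both vanish exactly when the lines are coplanar.
assert np.isclose(bilinear(l, m), np.linalg.det(np.column_stack([A, B, C, D])))

# Two lines sharing a point are coplanar, so the bilinear product vanishes.
n = plucker_coords(plucker_matrix(A, C))
assert np.isclose(bilinear(l, n), 0.0)
```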
[Bor] covers some useful properties of Plücker line coordinates, expressed as matrix-vector products, that the book leaves out.
Define
Notice that (3.10) is equivalent to
The constraint (3.12) translates to
Given \(\mathbf{X} \in \mathbb{R}^4\), define
Observe that (3.8) and (3.9) can be rewritten respectively as
and
A point-only expression of the plane defined by the join of a point \(\mathbf{X}\) and line \(\mathrm{L}\) is
A plane-only expression of the point defined by the intersection of the line \(\mathrm{L}\) with the plane \(\boldsymbol{\pi}\) is
Note that this particular definition of Plücker line coordinates reveals a connection to cross products in \(\mathbb{R}^3\).
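The cross-product connection is also visible in the book's \(\mathrm{L} = \mathbf{A}\mathbf{B}^\top - \mathbf{B}\mathbf{A}^\top\): for finite points \(\mathbf{A} = (\mathbf{a}, 1)^\top\) and \(\mathbf{B} = (\mathbf{b}, 1)^\top\), the upper-left \(3 \times 3\) block of \(\mathrm{L}\) is the negated skew matrix of the moment \(\mathbf{a} \times \mathbf{b}\) and the last column is \(\mathbf{a} - \mathbf{b}\), i.e. the line direction up to sign. A minimal numerical sketch assuming numpy (the `skew` helper is mine), which also exercises the meet \(\mathbf{X} = \mathrm{L}\boldsymbol{\pi}\):

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix [v]_x such that [v]_x w = v x w."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

rng = np.random.default_rng(1)
a, b = rng.normal(size=(2, 3))                  # two finite points of R^3
A, B = np.append(a, 1.0), np.append(b, 1.0)     # their homogeneous coordinates
L = np.outer(A, B) - np.outer(B, A)             # Plucker matrix (3.8)

# Cross-product structure: moment a x b in the 3x3 block, direction a - b in the last column.
assert np.allclose(L[:3, :3], -skew(np.cross(a, b)))
assert np.allclose(L[:3, 3], a - b)

# Meet: X = L pi is the intersection of the line with a plane pi.
pi = rng.normal(size=4)
X = L @ pi
assert np.isclose(pi @ X, 0.0)                                  # X lies on the plane
assert np.linalg.matrix_rank(np.column_stack([A, B, X])) == 2   # X lies on the line
```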
3.2.4 Classification of Quadrics
[Yap] covers some useful properties of quadric surfaces that the book left out.
The zero set of a polynomial \(P(\mathbf{X}) = P(X_1, \ldots, X_n) \in \mathbb{Q}[X_1, \ldots, X_n]\) is a hypersurface in \(n\)-dimensional affine space. When \(P(\mathbf{X})\) is homogeneous, it is called a form and defines a hypersurface in \((n - 1)\)-dimensional projective space. When \(\deg(P) = 2\), the polynomial is quadratic and the corresponding hypersurface is a quadric.
(3.15) can be written as a polynomial:
Let \(\mathrm{Q}_u\) denote the upper-left \(3 \times 3\) submatrix of \(\mathrm{Q}\). The pair formed by the discriminant \(\Delta \mathrm{Q} = \det \mathrm{Q}\) and the subdiscriminant \(\Delta \mathrm{Q}_u = \det \mathrm{Q}_u\) is an invariant that can be used to categorize the point quadrics. Another invariant is the number of positive and negative eigenvalues. Let \(\sigma^+(\mathrm{Q})\) and \(\sigma^-(\mathrm{Q})\) denote the number of positive and negative eigenvalues of \(\mathrm{Q}\), respectively. The pair
\[\sigma(\mathrm{Q}) = \left( \sigma^+(\mathrm{Q}), \sigma^-(\mathrm{Q}) \right)\]
is called the inertia of \(\mathrm{Q}\). Note that \(\sigma^+(\mathrm{Q}) + \sigma^-(\mathrm{Q})\) is the rank of \(\mathrm{Q}\) while \(\sigma^+(\mathrm{Q}) - \sigma^-(\mathrm{Q})\) is the signature of \(\mathrm{Q}\).
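As a concrete example (my own, not from the book's text): the unit sphere \(X_1^2 + X_2^2 + X_3^2 - X_4^2 = 0\) has
\[\mathrm{Q} = \operatorname{diag}(1, 1, 1, -1), \qquad \Delta \mathrm{Q} = -1, \quad \Delta \mathrm{Q}_u = 1, \quad \sigma(\mathrm{Q}) = (3, 1), \quad \sigma(\mathrm{Q}_u) = (3, 0),\]
so its rank is \(4\) and its signature is \(2\).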
(i) Plücker Coordinates
(a)
The point of intersection of a line \(\mathrm{L}\) with a plane \(\boldsymbol{\pi}\) that is defined by a point \(\mathbf{X}\) and a line \(\mathrm{M}\) is
\[\mathrm{L} \mathrm{M}^* \mathbf{X},\]
since the join of \(\mathbf{X}\) and \(\mathrm{M}\) is \(\boldsymbol{\pi} = \mathrm{M}^* \mathbf{X}\) and the meet of \(\mathrm{L}\) with \(\boldsymbol{\pi}\) is \(\mathrm{L} \boldsymbol{\pi}\).
(b)
Observe that
By inspection and (3.2), \(\mathrm{M}^* \mathbf{X} = \boldsymbol{0}\) if and only if \(\mathbf{X}\) is on \(\mathrm{M}\).
Likewise, notice that
By inspection and (3.2), \(\mathrm{L} \boldsymbol{\pi} = \boldsymbol{0}\) if and only if \(\mathrm{L}\) is on \(\boldsymbol{\pi}\).
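Both incidence statements can be checked numerically. The sketch below assumes numpy; `plane_through` is my own helper that extracts the null vector of three stacked point coordinates, and the same line is built in both its dual and point Plücker forms:

```python
import numpy as np

def plane_through(X1, X2, X3):
    """Plane through three points: unit null vector of their stacked coordinates."""
    _, _, Vt = np.linalg.svd(np.stack([X1, X2, X3]))
    return Vt[-1]

rng = np.random.default_rng(3)
A, B, C, D, Y = rng.normal(size=(5, 4))

# Dual Plucker matrix of the line through A and B, built from two planes containing it (3.9).
P, Q = plane_through(A, B, C), plane_through(A, B, D)
M_star = np.outer(P, Q) - np.outer(Q, P)
assert np.allclose(M_star @ A, 0) and np.allclose(M_star @ B, 0)   # points on the line
assert not np.allclose(M_star @ Y, 0)                              # a generic point is not

# Point Plucker matrix of the same line: L pi = 0 exactly when the line lies on the plane pi.
L = np.outer(A, B) - np.outer(B, A)
assert np.allclose(L @ P, 0) and np.allclose(L @ Q, 0)
assert not np.allclose(L @ rng.normal(size=4), 0)
```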
(c)
Let \(\mathbf{P}\) and \(\mathbf{Q}\) denote two planes and \(\mathrm{L}^*\) their intersection (3.9). Recall that \(\mathrm{L}\) is on \(\boldsymbol{\pi}_\infty\) if and only if
where \(\mathbf{n}_\cdot\) corresponds to the plane normal of Euclidean geometry (3.2). By inspection, the foregoing only holds when the normals are parallel i.e. the planes are parallel.
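For a concrete illustration (my own choice of planes, assuming numpy): two distinct planes sharing the normal \(\mathbf{n}\) intersect in a line whose points all have a vanishing fourth coordinate, i.e. a line on \(\boldsymbol{\pi}_\infty\).

```python
import numpy as np

rng = np.random.default_rng(4)
n = rng.normal(size=3)                       # common plane normal
P = np.append(n, 1.0)                        # two distinct parallel planes
Q = np.append(n, -2.0)
L_star = np.outer(P, Q) - np.outer(Q, P)     # their line of intersection (3.9)

# Points on the line form the null space of [P; Q]; for parallel planes the two
# defining equations differ only in the constant term, which forces X4 = 0.
_, _, Vt = np.linalg.svd(np.stack([P, Q]))
for X in Vt[2:]:                             # a basis of points spanning the line
    assert np.allclose(L_star @ X, 0.0)      # X lies on the line (its join with X degenerates)
    assert np.isclose(X[3], 0.0)             # ... and on the plane at infinity
```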
(d)
Note that the following proof also holds for (3.13) and (3.14) because of (3.10).
Let \(\mathbf{P}\) and \(\mathbf{Q}\) denote two planes and \(\mathrm{L}^*\) their intersection (3.9), which satisfies (3.12). Likewise, define another line \(\hat{\mathrm{L}}^*\) as the intersection of the planes \(\hat{\mathbf{P}}\) and \(\hat{\mathbf{Q}}\). The points of intersection are
where \(\mathbf{n}_\cdot\) corresponds to the plane normal of Euclidean geometry (3.2). By inspection, the homogeneous coordinates \(\mathbf{X}\) and \(\hat{\mathbf{X}}\) are equivalent only when the lines are parallel. Otherwise, result 3.5 states that the lines do not intersect because \((\mathrm{L} \mid \hat{\mathrm{L}}) \neq 0\).
(ii) Projective Transformations
Since \(\mathrm{Q}\) is a real symmetric matrix, there exists an eigenvalue decomposition \(\mathrm{Q} = \mathrm{U} \mathrm{D} \mathrm{U}^\top\), where \(\mathrm{U}\) is a real orthogonal matrix and \(\mathrm{D}\) is a diagonal matrix whose entries are the eigenvalues of \(\mathrm{Q}\). Likewise, \(\mathrm{Q}_u = \mathrm{U}' \mathrm{D}' {\mathrm{U}'}^\top\).
Define \(\sigma = \sigma(\mathrm{Q})\) and \(\sigma_u = \sigma(\mathrm{Q}_u)\). Under a point transformation (3.16) with \(\mathrm{H} = \mathrm{U}^\top\) and \(\mathrm{H}_u = {\mathrm{U}'}^\top\), which diagonalize \(\mathrm{Q}\) and \(\mathrm{Q}_u\), the point quadrics of interest are the following (a numerical check appears after the list):
ellipsoid \(\left( \sigma = (3, 1), \sigma_u = (3, 0) \right)\),
paraboloid \(\left( \sigma = (3, 1), \sigma_u = (2, 0) \right)\),
hyperboloid of two sheets \(\left( \sigma = (3, 1), \sigma_u = (2, 1) \right)\), and
hyperboloid of one sheet \(\left( \sigma = (2, 2), \sigma_u = (2, 1) \right)\).
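These inertias can be checked numerically. A minimal sketch assuming numpy; the canonical affine representatives noted in the comments are my own choices, and `inertia` simply counts eigenvalue signs.

```python
import numpy as np

def inertia(M):
    """Numbers of positive and negative eigenvalues (sigma+, sigma-) of a symmetric matrix."""
    w = np.linalg.eigvalsh(M)
    return int(np.sum(w > 1e-9)), int(np.sum(w < -1e-9))

quadrics = {
    "ellipsoid":                 np.diag([1.0, 1, 1, -1]),      # x^2 + y^2 + z^2 = 1
    "elliptic paraboloid":       np.array([[1.0, 0, 0, 0],
                                           [0, 1, 0, 0],
                                           [0, 0, 0, -0.5],
                                           [0, 0, -0.5, 0]]),   # z = x^2 + y^2
    "hyperboloid of two sheets": np.diag([1.0, 1, -1, 1]),      # z^2 - x^2 - y^2 = 1
    "hyperboloid of one sheet":  np.diag([1.0, 1, -1, -1]),     # x^2 + y^2 - z^2 = 1
}

for name, Q in quadrics.items():
    print(f"{name:28s} sigma = {inertia(Q)}, sigma_u = {inertia(Q[:3, :3])}")
# ellipsoid                    sigma = (3, 1), sigma_u = (3, 0)
# elliptic paraboloid          sigma = (3, 1), sigma_u = (2, 0)
# hyperboloid of two sheets    sigma = (3, 1), sigma_u = (2, 1)
# hyperboloid of one sheet     sigma = (2, 2), sigma_u = (2, 1)
```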
Observe that a projectivity (2.17) cannot map an ellipsoid to a hyperboloid of one sheet because a point transformation (3.16) sends \(\mathrm{Q}\) to \(\mathrm{H}^{-\top} \mathrm{Q} \mathrm{H}^{-1}\), a congruence that preserves \(\sigma\) by Sylvester's law of inertia (up to the overall sign of \(\mathrm{Q}\)), and \((3, 1) \neq (2, 2)\).
By inspection, a projectivity can map an ellipsoid to an elliptic paraboloid or a hyperboloid of two sheets because all three share \(\sigma = (3, 1)\); only \(\sigma_u\) differs, and \(\sigma_u\) is not a projective invariant.
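The following sketch (assuming numpy; the coordinate-swapping projectivity is my own choice of example) makes both halves of the argument concrete: a random projectivity leaves \(\sigma\) unchanged, while swapping the third and fourth coordinates carries the unit sphere to a hyperboloid of two sheets.

```python
import numpy as np

def inertia(M):
    # Counts of positive and negative eigenvalues, as in the previous sketch.
    w = np.linalg.eigvalsh(M)
    return int(np.sum(w > 1e-9)), int(np.sum(w < -1e-9))

Q_sphere = np.diag([1.0, 1, 1, -1])

# Sylvester's law of inertia: Q' = H^{-T} Q H^{-1} has the same inertia as Q for any
# invertible H, so sigma is a projective invariant (up to the overall sign of Q).
rng = np.random.default_rng(2)
H = rng.normal(size=(4, 4))
Q_mapped = np.linalg.inv(H).T @ Q_sphere @ np.linalg.inv(H)
assert inertia(Q_mapped) == inertia(Q_sphere) == (3, 1)

# An explicit projectivity swapping X3 and X4 maps the unit sphere X1^2+X2^2+X3^2-X4^2 = 0
# to X1^2+X2^2-X3^2+X4^2 = 0, i.e. the hyperboloid of two sheets z^2 - x^2 - y^2 = 1.
H_swap = np.eye(4)[[0, 1, 3, 2]]
Q_h2 = np.linalg.inv(H_swap).T @ Q_sphere @ np.linalg.inv(H_swap)
assert np.allclose(Q_h2, np.diag([1, 1, -1, 1]))
# No projectivity can reach the hyperboloid of one sheet, whose inertia is (2, 2).
```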
(iii) Screw Decomposition
Let \(\mathbf{I}\) denote an identity matrix whose size is determined by the surrounding context. Recall that the eigenvalues of a Euclidean transformation \(\mathbf{A} = \begin{bmatrix} \mathbf{R} & \mathbf{t}\\ \mathbf{0}^\top & 1 \end{bmatrix}\) can be determined from the roots of the characteristic polynomial
\[\det\left( \mathbf{A} - \lambda \mathbf{I} \right) = (1 - \lambda) \det\left( \mathbf{R} - \lambda \mathbf{I} \right) = 0.\]
Notice that the rotation matrix \(\mathbf{R}\) can be defined as
\[\mathbf{R} = \begin{bmatrix} \cos\theta & -\sin\theta & 0\\ \sin\theta & \cos\theta & 0\\ 0 & 0 & 1 \end{bmatrix}\]
without loss of generality because any rotation matrix \(\mathbf{R}'\) can be decomposed as
\[\mathbf{R}' = \mathbf{P} \mathbf{R} \mathbf{P}^\top,\]
where \(\mathbf{P}\) denotes a change of basis according to Euler's Displacement Theorem [Hab]. This could also be justified through examining the trace
\[\operatorname{trace}\left( \mathbf{R}' \right) = \operatorname{trace}\left( \mathbf{P} \mathbf{R} \mathbf{P}^\top \right) = \operatorname{trace}\left( \mathbf{R} \right) = 1 + 2 \cos\theta.\]
Hence the remaining eigenvalues are the roots of
\[\lambda^2 - 2 \lambda \cos\theta + 1 = 0.\]
Therefore, the eigenvalues of \(\mathbf{A}\) are \(\lambda \in \left\{ 1, 1, e^{i \theta}, e^{-i \theta} \right\}\).
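A quick numerical confirmation, assuming numpy; `rotation` is a Rodrigues-formula helper of my own, and the particular \(\theta\), axis, and translation are arbitrary.

```python
import numpy as np

def rotation(axis, theta):
    """Rodrigues' formula: rotation by theta about a unit axis."""
    a = axis / np.linalg.norm(axis)
    K = np.array([[0, -a[2], a[1]],
                  [a[2], 0, -a[0]],
                  [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

rng = np.random.default_rng(5)
theta = 0.7
R = rotation(rng.normal(size=3), theta)
t = rng.normal(size=3)

A = np.eye(4)
A[:3, :3], A[:3, 3] = R, t                   # Euclidean transformation [R t; 0 1]

# The spectrum of A is {1, 1, e^{i theta}, e^{-i theta}}.
eig = np.linalg.eigvals(A)
for lam in [1.0, 1.0, np.exp(1j * theta), np.exp(-1j * theta)]:
    assert np.min(np.abs(eig - lam)) < 1e-6

# The trace of R pins down theta: trace(R) = 1 + 2 cos(theta).
assert np.isclose(np.trace(R), 1 + 2 * np.cos(theta))
```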
Recall that the characteristic vectors (eigenvectors) of \(\mathbf{A}\) must satisfy the equation
\[\mathbf{A} \tilde{\mathbf{v}} = \begin{bmatrix} \mathbf{R} & \mathbf{t}\\ \mathbf{0}^\top & 1 \end{bmatrix} \begin{bmatrix} \mathbf{v}\\ \upsilon \end{bmatrix} = \lambda \begin{bmatrix} \mathbf{v}\\ \upsilon \end{bmatrix}.\]
Let \(\mathbf{a}\) denote the direction of the rotation axis (i.e. \(\mathbf{R} \mathbf{a} = \mathbf{a}\)), and decompose the translation vector into \(\mathbf{t} = \mathbf{t}_\parallel + \mathbf{t}_\perp\) where \(\mathbf{t}_\parallel = (\mathbf{t} \cdot \mathbf{a}) \mathbf{a}\) and \(\mathbf{t}_\perp = \mathbf{t} - \mathbf{t}_\parallel\).
\(\lambda_1 = \lambda_2 = 1\)
Regardless of whether \(\mathbf{t}\) is orthogonal to \(\mathbf{a}\) or not, \(\tilde{\mathbf{v}}_1 = \begin{bmatrix} \mathbf{a}\\ 0 \end{bmatrix}\) is clearly an eigenvector of \(\mathbf{A}\). Furthermore, one cannot evaluate
\[\mathbf{v} = -\upsilon \left( \mathbf{R} - \mathbf{I} \right)^{-1} \mathbf{t}\]
directly because \(\mathbf{R} - \mathbf{I}\) is a singular matrix. This also applies to \(\mathbf{R}'\) because
\[\mathbf{R}' - \mathbf{I} = \mathbf{P} \mathbf{R} \mathbf{P}^\top - \mathbf{P} \mathbf{P}^\top = \mathbf{P} \left( \mathbf{R} - \mathbf{I} \right) \mathbf{P}^\top\]
is likewise singular.
When \(\mathbf{a} \cdot \mathbf{t} = 0\), the resulting homogeneous system
\[\left( \mathbf{A} - \mathbf{I} \right) \tilde{\mathbf{v}} = \begin{bmatrix} \mathbf{R} - \mathbf{I} & \mathbf{t}\\ \mathbf{0}^\top & 0 \end{bmatrix} \begin{bmatrix} \mathbf{v}\\ \upsilon \end{bmatrix} = \mathbf{0}\]
is underdetermined and has infinitely many solutions.
When \(\mathbf{a} \cdot \mathbf{t} \neq 0\), there are not enough free variables to choose \(\upsilon \in \mathbb{R} \setminus \{0\}\) and still satisfy
\[\left( \mathbf{R} - \mathbf{I} \right) \mathbf{v} + \upsilon \mathbf{t} = \mathbf{0}\]
because \(\mathbf{t}_\perp \cdot \mathbf{a} = 0\): premultiplying by \(\mathbf{a}^\top\) annihilates the first term and leaves \(\upsilon \left( \mathbf{a} \cdot \mathbf{t} \right) = 0\), which forces \(\upsilon = 0\). Hence the only eigenvector available is \(\tilde{\mathbf{v}}_2 = \begin{bmatrix} \mathbf{a}\\ 0 \end{bmatrix}\).
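The dimension of the \(\lambda = 1\) eigenspace can be checked numerically in both cases. A minimal sketch assuming numpy; the helper names and the particular choices of \(\mathbf{R}\) and \(\mathbf{t}\) are mine.

```python
import numpy as np

def euclidean(R, t):
    """Assemble A = [R t; 0 1]."""
    A = np.eye(4)
    A[:3, :3], A[:3, 3] = R, t
    return A

def unit_eigenspace_dim(A):
    """Dimension of the null space of A - I, i.e. of the lambda = 1 eigenspace."""
    s = np.linalg.svd(A - np.eye(4), compute_uv=False)
    return int(np.sum(s < 1e-9))

theta = 0.9
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0, 0, 1]])                    # rotation about the axis a = (0, 0, 1)

# a . t = 0: the homogeneous system is underdetermined, so the eigenspace is 2-dimensional.
assert unit_eigenspace_dim(euclidean(R, np.array([1.0, 2.0, 0.0]))) == 2
# a . t != 0: only [a; 0] remains, so the eigenspace is 1-dimensional.
assert unit_eigenspace_dim(euclidean(R, np.array([1.0, 2.0, 3.0]))) == 1
```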
\(\lambda_3 = e^{i \theta}, \lambda_4 = e^{-i \theta}\)
By inspection, the simplest pair of eigenvectors would have \(\upsilon = 0\). Consequently, the corresponding eigenvectors \(\tilde{\mathbf{v}}_3 = \begin{bmatrix} \mathbf{v}_3\\ 0 \end{bmatrix}\) and \(\tilde{\mathbf{v}}_4 = \begin{bmatrix} \mathbf{v}_4\\ 0 \end{bmatrix}\) span a plane that is orthogonal to the axis of rotation \(\mathbf{a}\). To see this, notice that
\[\mathbf{a}^\top \mathbf{v}_i = \left( \mathbf{R} \mathbf{a} \right)^\top \left( \mathbf{R} \mathbf{v}_i \right) = \lambda_i \, \mathbf{a}^\top \mathbf{v}_i, \qquad i \in \{3, 4\}.\]
Since \(\lambda_3 \neq 1\) and \(\lambda_4 \neq 1\) whenever \(\theta \neq 0\), it follows that \(\mathbf{a}^\top \mathbf{v}_3 = \mathbf{a}^\top \mathbf{v}_4 = 0\), i.e. both eigenvectors are orthogonal to the rotation axis.
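A small numerical check of the orthogonality claim (assuming numpy; `rotation` is the same Rodrigues-formula helper as in the earlier sketch):

```python
import numpy as np

def rotation(axis, theta):
    # Rodrigues' formula: rotation by theta about a unit axis.
    a = axis / np.linalg.norm(axis)
    K = np.array([[0, -a[2], a[1]],
                  [a[2], 0, -a[0]],
                  [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

a = np.array([1.0, 2.0, 2.0]) / 3.0          # unit rotation axis
R = rotation(a, 1.1)

w, V = np.linalg.eig(R)
for lam, v in zip(w, V.T):
    if not np.isclose(lam, 1.0):             # the complex pair e^{+- i theta}
        assert np.isclose(a @ v, 0.0)        # each is orthogonal to the rotation axis
```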
References