Gaussian curvature of a 2D function graph z = f(x, y)
You may want to look at the article where I define Principal/Gaussian/Mean curvature (that article focuses on triangle mesh surfaces, but the general definitions apply to all kinds of surfaces).
Formula
The Gaussian curvature of a surface described by a function $z = f(x, y)$ can be calculated using the formula:
\[ K = \frac{{f_{xx} \cdot f_{yy} - (f_{xy})^2}}{{(1 + (f_x)^2 + (f_y)^2)^2}} = {{\det (Hess(f))} \over {{{(1 + f_x^2 + f_y^2)}^2}}} \]Where:
$ \begin{align*} & f_{xx} \text{ is the second partial derivative of } f \text{ with respect to } x \\ & f_{yy} \text{ is the second partial derivative of } f \text{ with respect to } y \\ & f_{xy} \text{ is the mixed partial derivative of } f \text{ with respect to } x \text{ and } y \\ & f_x \text{ is the first partial derivative of } f \text{ with respect to } x \\ & f_y \text{ is the first partial derivative of } f \text{ with respect to } y \text{, and} \\ & K \text{ is the Gaussian curvature} \\ & Hess(f) \text{ is the Hessian matrix of } f \end{align*} $
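As a quick sanity check of the formula, here is a small sympy sketch (my own illustration, not from the paper cited below) that applies it to the upper hemisphere of the unit sphere, which is known to have constant Gaussian curvature $K = 1$:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = sp.sqrt(1 - x**2 - y**2)  # graph of the upper unit hemisphere

# First and second partial derivatives of f:
fx, fy = sp.diff(f, x), sp.diff(f, y)
fxx, fyy, fxy = sp.diff(f, x, 2), sp.diff(f, y, 2), sp.diff(f, x, y)

# Gaussian curvature formula for a graph z = f(x, y):
K = (fxx*fyy - fxy**2) / (1 + fx**2 + fy**2)**2
print(sp.simplify(K))  # 1 -> constant curvature of the unit sphere
```

Any other closed-form surface with known curvature (a plane, a cylinder-like graph) works the same way and makes a cheap regression test for a hand-derived formula.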
Alternate formulation
From "Consistent Computation of First- and Second-Order Differential Quantities for Surface Meshes"
$$ K = \frac{ \det (Hess(f))} { {\ell}^4 } $$ Where: $$ \begin{aligned} &\ell &=& \quad \| \vec s_u \times \vec s_v \| \\ &\ell &=& \quad \sqrt{(1 + f_x^2 + f_y^2)} \\ &\ell^2 &=& \quad (1 + f_x^2 + f_y^2) \\ &\ell^4 &=& \quad (1 + f_x^2 + f_y^2)^2 \\ \end{aligned} $$And $\vec s$ is the parametric function defined below, with $\vec s_u$ and $\vec s_v$ respectively its partial derivatives in $u$ and $v$:
$$ \vec s(u,v) = \left \{ \begin{aligned} x(u,v) &= u \\ y(u,v) &= v \\ z(u,v) &= f(u,v) \end{aligned} \right . $$
Remarks
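The identity $\ell^2 = 1 + f_x^2 + f_y^2$ can be checked symbolically for a generic $f$; a small sympy sketch:

```python
import sympy as sp

u, v = sp.symbols('u v', real=True)
f = sp.Function('f', real=True)(u, v)  # generic graph function

s = sp.Matrix([u, v, f])               # the graph parametrization s(u, v)
su, sv = s.diff(u), s.diff(v)          # tangent vectors s_u and s_v

# ell^2 = || s_u x s_v ||^2, computed as a dot product to stay symbolic:
ell_sq = su.cross(sv).dot(su.cross(sv))

fx, fy = f.diff(u), f.diff(v)
print(sp.expand(ell_sq - (1 + fx**2 + fy**2)))  # 0
```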
The observations that follow are particularly useful in the context of solving optimization problems, i.e. finding the minimum of a function.
As highlighted by this stackexchange answer, it is interesting to notice the relationship between the curvature and the Hessian matrix.
Looking at the curvature formula in the specific case of a point $p(x,y)$ where the gradient is null, $\nabla f = 0$ (i.e. $f_x=0$ and $f_y=0$), the Gaussian curvature of $f$ is simply the determinant of the Hessian matrix. Since the determinant is the product of the eigenvalues, the Gaussian curvature of $f$ is also the product of the eigenvalues $\lambda_1$ and $\lambda_2$ of the Hessian when $\nabla f = 0$.
Now if the eigenvalues are both positive and $\nabla f = 0$, the Gaussian curvature is positive, which implies $f$ is bowl-shaped, as opposed to saddle-shaped. By definition, the Hessian is positive definite when its eigenvalues are all positive. So a positive definite Hessian at a point $p$ of $f$ where $\nabla f(p) = 0$ necessarily means the function is bowl-shaped. What's more, as we see with the mean curvature of $z = f(x,y)$, positive eigenvalues also mean it is convex (oriented "up", as opposed to concave, which points downwards). This property is used to find the minimum of a function in optimization problems, even in higher dimensions.
Note on principal curvatures: since $K = \lambda_1 \lambda_2$ at $\nabla f(p) = 0$, and by the very definition of the Gaussian curvature $K = \kappa_1 \kappa_2$, we can see that each eigenvalue represents a principal curvature: $\kappa_1 = \lambda_1$ and $\kappa_2 = \lambda_2$.
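To illustrate, here is a small sympy sketch with a made-up example, $f(x,y) = 2x^2 + 3y^2$, whose gradient vanishes at the origin; there, $K$, $\det(Hess(f))$, and the product of the eigenvalues all coincide:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = 2*x**2 + 3*y**2            # bowl with a critical point at the origin

H = sp.hessian(f, (x, y))      # Hessian matrix of f
p = {x: 0, y: 0}               # critical point: grad f = 0 here

fx, fy = sp.diff(f, x), sp.diff(f, y)
K = (fx*0 + sp.det(H)) / (1 + fx**2 + fy**2)**2  # graph curvature formula

print(K.subs(p))                      # 24 -> K at the critical point
print(sp.det(H.subs(p)))              # 24 -> det(Hess) agrees with K
print(sorted(H.subs(p).eigenvals()))  # [4, 6] -> product is also 24
```

The eigenvalues 4 and 6 are the principal curvatures at the origin, and both being positive confirms the bowl (convex) shape.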
Derivation
To derive this curvature formula, we treat $z = f(x,y)$ as a special case of a parametric surface $\vec s(u,v) = [x(u,v), y(u,v), z(u,v)]$:
$$ \vec s(u,v) = \left \{ \begin{aligned} x(u,v) &= u \\ y(u,v) &= v \\ z(u,v) &= f(u,v) \end{aligned} \right . $$
$ \begin{aligned} \mathbf{s}_u &= [1, 0, f_x] \\ \mathbf{s}_v &= [0, 1, f_y]\\ \mathbf{s}_{uu} &= [0, 0, f_{xx}] \\ \mathbf{s}_{uv} &= [0, 0, f_{xy}]\\ \mathbf{s}_{vv} &= [0, 0, f_{yy}] \end{aligned} $
Now we start from the general formulation of the Gaussian curvature for a parametric surface:
$$ K = \frac{ (\mathbf{s}_{uu} \cdot \mathbf{n})(\mathbf{s}_{vv} \cdot \mathbf{n}) - (\mathbf{s}_{uv} \cdot \mathbf{n})^2} { (\mathbf{s}_u \cdot \mathbf{s}_u)(\mathbf{s}_v \cdot \mathbf{s}_v) - (\mathbf{s}_u \cdot \mathbf{s}_v)^2} $$
First let's develop the denominator:
$$ \left \{ \begin{aligned} (\mathbf{s}_u \cdot \mathbf{s}_u) &= 1^2 + 0^2 + (f_x)^2 &= 1 + (f_x)^2 \\ (\mathbf{s}_v \cdot \mathbf{s}_v) &= 0^2 + 1^2 + (f_y)^2 &= 1 + (f_y)^2 \\ (\mathbf{s}_u \cdot \mathbf{s}_v)^2 &= (1 \times 0 + 0 \times 1 + f_x f_y)^2 &= (f_x f_y)^2 \\ \end{aligned} \right . $$
Then:
$$ \begin{aligned} &(\mathbf{s}_u \cdot \mathbf{s}_u)(\mathbf{s}_v \cdot \mathbf{s}_v) - (\mathbf{s}_u \cdot \mathbf{s}_v)^2\\ =& (1 + (f_x)^2) ( 1 + (f_y)^2) - (f_x f_y)^2 \\ =& 1 + (f_y)^2 + (f_x)^2 + (f_x f_y)^2 - (f_x f_y)^2 \\ =& 1 + (f_y)^2 + (f_x)^2 \\ \end{aligned} $$Now the numerator:
$$ \left \{ \begin{aligned} (\mathbf{s}_{uu} \cdot \mathbf{n}) &= 0 \times n_x + 0 \times n_y + n_z \ f_{xx} = n_z \ f_{xx} \\ (\mathbf{s}_{vv} \cdot \mathbf{n}) &= 0 \times n_x + 0 \times n_y + n_z \ f_{yy} = n_z \ f_{yy} \\ (\mathbf{s}_{uv} \cdot \mathbf{n})^2 &= (0 \times n_x + 0 \times n_y + n_z \ f_{xy})^2 = (n_z \ f_{xy})^2\\ \end{aligned} \right . $$
Then:
$$ \begin{aligned} &(\mathbf{s}_{uu} \cdot \mathbf{n})(\mathbf{s}_{vv} \cdot \mathbf{n}) - (\mathbf{s}_{uv} \cdot \mathbf{n})^2\\ =& (n_z \ f_{xx}) ( n_z \ f_{yy}) - (n_z \ f_{xy})^2 \\ =& (n_z)^2 ( f_{xx} \ f_{yy}) - (n_z)^2 ( f_{xy})^2 \\ =& (n_z)^2 ( f_{xx} \ f_{yy} - (f_{xy})^2) \\ \end{aligned} $$Let's develop the normal, which is the normalized cross product of the velocities along $u$ and $v$:
$$ \frac{\vec s_u \times \vec s_v}{\|\vec s_u \times \vec s_v \|} $$So:
$$ \begin{aligned} \vec s_u \times \vec s_v &= [1,0,f_x] \times [0,1,f_y] \\ &= [-f_x, -f_y, 1] \end{aligned} $$Normalized:
$$ \vec n = \frac{[-f_x, -f_y, 1]} { \sqrt{(f_x)^2 + (f_y)^2 + 1} } $$Finally gluing it all together:
$$ \frac{ \left ( \frac{1} { \sqrt{(f_x)^2 + (f_y)^2 + 1}} \right )^2 \left ( f_{xx} \ f_{yy} - (f_{xy})^2 \right ) } { 1 + (f_y)^2 + (f_x)^2 } $$et voilà:
$$ \frac{ ( f_{xx} \ f_{yy} - (f_{xy})^2) } { (1 + (f_y)^2 + (f_x)^2)^2 } $$
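The whole derivation can also be verified symbolically. Here is a sympy sketch that builds the general parametric formula for the graph parametrization and checks it matches the final closed form for a generic $f(u,v)$:

```python
import sympy as sp

u, v = sp.symbols('u v', real=True)
f = sp.Function('f', real=True)(u, v)  # generic graph function

s = sp.Matrix([u, v, f])               # graph parametrization s(u, v)
su, sv = s.diff(u), s.diff(v)          # first derivatives (tangents)
suu, suv, svv = su.diff(u), su.diff(v), sv.diff(v)  # second derivatives

nvec = su.cross(sv)                    # un-normalized normal
n = nvec / sp.sqrt(nvec.dot(nvec))     # unit normal

# General Gaussian curvature of a parametric surface:
K_param = (suu.dot(n)*svv.dot(n) - suv.dot(n)**2) / \
          (su.dot(su)*sv.dot(sv) - su.dot(sv)**2)

# Closed form derived above for z = f(x, y):
fx, fy = f.diff(u), f.diff(v)
fxx, fyy, fxy = f.diff(u, 2), f.diff(v, 2), f.diff(u, v)
K_graph = (fxx*fyy - fxy**2) / (1 + fx**2 + fy**2)**2

print(sp.simplify(K_param - K_graph))  # 0 -> both formulas agree
```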