Polynomial SOS


In mathematics, a form (i.e. a homogeneous polynomial) h(x) of degree 2m in the real n-dimensional vector x is a sum of squares of forms (SOS) if and only if there exist forms g_1(x),\ldots,g_k(x) of degree m such that


h(x)=\sum_{i=1}^k g_i(x)^2 .

Explicit sufficient conditions for a form to be SOS have been found.[1][2] However, not every nonnegative form is SOS; the Motzkin form x_1^4x_2^2+x_1^2x_2^4-3x_1^2x_2^2x_3^2+x_3^6 is a classical counterexample. Nevertheless, every real nonnegative form can be approximated as closely as desired (in the l_1-norm of its coefficient vector) by a sequence of forms \{f_\epsilon\} that are SOS.[3]

Square matricial representation (SMR)

Establishing whether a form h(x) is SOS amounts to solving a convex optimization problem. Indeed, any h(x) can be written as


h(x)=x^{\{m\}'}\left(H+L(\alpha)\right)x^{\{m\}}

where x^{\{m\}} is a vector containing a basis for the forms of degree m in x (such as all monomials of degree m in x), the prime ′ denotes the transpose, H is any symmetric matrix satisfying


h(x)=x^{\left\{m\right\}'}Hx^{\{m\}}

and L(\alpha) is a linear parameterization of the linear space


\mathcal{L}=\left\{L=L':~x^{\{m\}'} L x^{\{m\}}=0\right\}.

The dimension of the vector x^{\{m\}} is given by


\sigma(n,m)=\binom{n+m-1}{m}

whereas the dimension of the vector \alpha is given by


\omega(n,2m)=\frac{1}{2}\sigma(n,m)\left(1+\sigma(n,m)\right)-\sigma(n,2m).
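These counting formulas are easy to evaluate. A minimal stdlib-only Python sketch (the function names sigma and omega are ours), checked against the two examples worked out below:

```python
from math import comb

def sigma(n, m):
    # Number of monomials of degree m in n variables:
    # sigma(n, m) = C(n + m - 1, m)
    return comb(n + m - 1, m)

def omega(n, two_m):
    # Number of free parameters alpha in L(alpha) for a form of degree 2m:
    # omega(n, 2m) = sigma(n, m)(1 + sigma(n, m))/2 - sigma(n, 2m)
    s = sigma(n, two_m // 2)
    return s * (s + 1) // 2 - sigma(n, two_m)

# Degree-4 form in two variables (first example below): one parameter alpha_1.
print(sigma(2, 2), omega(2, 4))   # 3 1
# Degree-4 form in three variables (second example below): six parameters.
print(sigma(3, 2), omega(3, 4))   # 6 6
```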

Then, h(x) is SOS if and only if there exists a vector \alpha such that


H + L(\alpha) \ge 0,

meaning that the matrix H + L(\alpha) is positive semidefinite. This is a linear matrix inequality (LMI) feasibility test, which is a convex optimization problem. The expression h(x)=x^{\{m\}'}\left(H+L(\alpha)\right)x^{\{m\}} was introduced in [4] under the name square matricial representation (SMR) in order to establish whether a form is SOS via an LMI. This representation is also known as the Gram matrix.[5]
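The feasibility test hinges on checking positive semidefiniteness. In practice the search over \alpha is delegated to an SDP solver; as a stdlib-only Python sketch of the underlying check, a symmetric matrix can be tested for positive semidefiniteness with a Cholesky-style elimination:

```python
def is_psd(M, tol=1e-9):
    """Test whether a symmetric matrix (list of lists) is positive
    semidefinite, via symmetric Gaussian elimination: PSD iff every
    pivot is nonnegative and each (near-)zero pivot has a zero row."""
    n = len(M)
    A = [row[:] for row in M]            # work on a copy
    for k in range(n):
        if A[k][k] < -tol:
            return False                 # negative pivot: not PSD
        if A[k][k] <= tol:
            # zero pivot: the rest of row k must vanish as well
            if any(abs(A[k][j]) > tol for j in range(k + 1, n)):
                return False
            continue
        for i in range(k + 1, n):
            f = A[i][k] / A[k][k]
            for j in range(k + 1, n):
                A[i][j] -= f * A[k][j]
    return True

# H + L(alpha) from the first example below, at alpha_1 = 1:
print(is_psd([[1, 0, -1], [0, 1, 0], [-1, 0, 1]]))   # True
print(is_psd([[1, 0, 0], [0, -1, 0], [0, 0, 1]]))    # False
```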

Examples

  • Consider the form of degree 4 in two variables h(x)=x_1^4-x_1^2x_2^2+x_2^4. We have

m=2,~x^{\{m\}}=\left(\begin{array}{c}x_1^2\\x_1x_2\\x_2^2\end{array}\right),
~H+L(\alpha)=\left(\begin{array}{ccc}
1&0&-\alpha_1\\0&-1+2\alpha_1&0\\-\alpha_1&0&1
\end{array}\right).
Since there exists \alpha such that H+L(\alpha)\ge 0, namely \alpha_1=1, it follows that h(x) is SOS.
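This conclusion can be checked directly. Factoring H+L(\alpha) at \alpha_1=1 as g_1g_1'+g_2g_2' with g_1=(1,0,-1)' and g_2=(0,1,0)' gives the explicit decomposition h(x)=(x_1^2-x_2^2)^2+(x_1x_2)^2 (this factorization is derived here, not given in the text). A quick numerical check in Python:

```python
import random

def h(x1, x2):
    return x1**4 - x1**2 * x2**2 + x2**4

def sos(x1, x2):
    # h as an explicit sum of squares, read off from the factorization
    # H + L(1) = g1*g1' + g2*g2' with g1 = (1, 0, -1)', g2 = (0, 1, 0)':
    return (x1**2 - x2**2) ** 2 + (x1 * x2) ** 2

random.seed(0)
for _ in range(1000):
    x1, x2 = random.uniform(-5, 5), random.uniform(-5, 5)
    assert abs(h(x1, x2) - sos(x1, x2)) < 1e-8 * max(1.0, abs(h(x1, x2)))
print("h(x) == (x1^2 - x2^2)^2 + (x1*x2)^2 at 1000 random points")
```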
  • Consider the form of degree 4 in three variables h(x)=2x_1^4-2.5x_1^3x_2+x_1^2x_2x_3-2x_1x_3^3+5x_2^4+x_3^4. We have

m=2,~x^{\{m\}}=\left(\begin{array}{c}x_1^2\\x_1x_2\\x_1x_3\\x_2^2\\x_2x_3\\x_3^2\end{array}\right),
~H+L(\alpha)=\left(\begin{array}{cccccc}
2&-1.25&0&-\alpha_1&-\alpha_2&-\alpha_3\\
-1.25&2\alpha_1&0.5+\alpha_2&0&-\alpha_4&-\alpha_5\\
0&0.5+\alpha_2&2\alpha_3&\alpha_4&\alpha_5&-1\\
-\alpha_1&0&\alpha_4&5&0&-\alpha_6\\
-\alpha_2&-\alpha_4&\alpha_5&0&2\alpha_6&0\\
-\alpha_3&-\alpha_5&-1&-\alpha_6&0&1
\end{array}\right).
Since H+L(\alpha)\ge 0 for \alpha=(1.18,-0.43,0.73,1.13,-0.37,0.57), it follows that h(x) is SOS.
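The SMR of this example can be verified numerically: for any \alpha, the matrix above must reproduce h(x) when sandwiched between x^{\{m\}'} and x^{\{m\}}. A stdlib-only Python check at the \alpha given in the text:

```python
import random

def h(x1, x2, x3):
    return (2*x1**4 - 2.5*x1**3*x2 + x1**2*x2*x3 - 2*x1*x3**3
            + 5*x2**4 + x3**4)

a = [1.18, -0.43, 0.73, 1.13, -0.37, 0.57]   # alpha from the text
M = [  # H + L(alpha)
    [2,     -1.25,      0,        -a[0], -a[1],  -a[2]],
    [-1.25,  2*a[0],    0.5+a[1],  0,    -a[3],  -a[4]],
    [0,      0.5+a[1],  2*a[2],    a[3],  a[4],  -1],
    [-a[0],  0,         a[3],      5,     0,     -a[5]],
    [-a[1], -a[3],      a[4],      0,     2*a[5], 0],
    [-a[2], -a[4],     -1,        -a[5],  0,      1],
]

random.seed(1)
for _ in range(500):
    x1, x2, x3 = (random.uniform(-2, 2) for _ in range(3))
    v = [x1*x1, x1*x2, x1*x3, x2*x2, x2*x3, x3*x3]   # x^{m}
    q = sum(v[i] * M[i][j] * v[j] for i in range(6) for j in range(6))
    assert abs(q - h(x1, x2, x3)) < 1e-8
print("x^{m}' (H + L(alpha)) x^{m} reproduces h(x) at 500 random points")
```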

Generalizations

Matrix SOS

A matrix form F(x) (i.e., a matrix whose entries are forms) of dimension r \times r and degree 2m in the real n-dimensional vector x is SOS if and only if there exist matrix forms G_1(x),\ldots,G_k(x) of degree m such that


F(x)=\sum_{i=1}^k G_i(x)'G_i(x) .

Matrix SMR

Establishing whether a matrix form F(x) is SOS amounts to solving a convex optimization problem. Indeed, similarly to the scalar case, any F(x) can be written according to the SMR as


F(x)=\left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right)

where \otimes is the Kronecker product of matrices, H is any symmetric matrix satisfying


F(x)=\left(x^{\{m\}}\otimes I_r\right)'H\left(x^{\{m\}}\otimes I_r\right)

and L(\alpha) is a linear parameterization of the linear space


\mathcal{L}=\left\{L=L':~\left(x^{\{m\}}\otimes I_r\right)'L\left(x^{\{m\}}\otimes I_r\right)=0\right\}.

The dimension of the vector \alpha is given by


\omega(n,2m,r)=\frac{1}{2}r\left(\sigma(n,m)\left(r\sigma(n,m)+1\right)-(r+1)\sigma(n,2m)\right).
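As a sanity check (a stdlib-only sketch; the function names are ours), for r = 1 this dimension formula reduces to the scalar formula \omega(n,2m) given earlier:

```python
from math import comb

def sigma(n, m):
    # Number of monomials of degree m in n variables
    return comb(n + m - 1, m)

def omega_scalar(n, two_m):
    # Scalar case: omega(n, 2m) = sigma(1 + sigma)/2 - sigma(n, 2m)
    s = sigma(n, two_m // 2)
    return s * (s + 1) // 2 - sigma(n, two_m)

def omega_matrix(n, two_m, r):
    # Matrix case: omega(n, 2m, r) = r(sigma(r*sigma + 1) - (r+1)sigma(n, 2m))/2
    s = sigma(n, two_m // 2)
    return r * (s * (r * s + 1) - (r + 1) * sigma(n, two_m)) // 2

# For r = 1 the matrix formula must agree with the scalar one:
for n in range(1, 6):
    for m in range(1, 5):
        assert omega_matrix(n, 2 * m, 1) == omega_scalar(n, 2 * m)
print("omega(n, 2m, 1) == omega(n, 2m) for all tested n, m")
```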

Then, F(x) is SOS if and only if there exists a vector \alpha such that the following LMI holds:


H+L(\alpha) \ge 0.

The expression F(x)=\left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right) was introduced in [6] in order to establish whether a matrix form is SOS via an LMI.

Noncommutative polynomial SOS

Consider the free algebra R⟨X⟩ generated by the n noncommuting letters X = (X_1,\ldots,X_n) and equipped with the involution T, such that T fixes R and X_1,\ldots,X_n and reverses words formed by X_1,\ldots,X_n. By analogy with the commutative case, the noncommutative symmetric polynomials f are the noncommutative polynomials of the form f=f^T. A symmetric noncommutative polynomial f is said to be matrix-positive if, whenever it is evaluated at a tuple of real symmetric matrices of any dimension r \times r, the resulting matrix is positive semidefinite.

A noncommutative polynomial f is SOS if there exist noncommutative polynomials h_1,\ldots,h_k such that

 f(X) = \sum_{i=1}^{k} h_i(X)^T h_i(X).

Surprisingly, in the noncommutative scenario a noncommutative polynomial is SOS if and only if it is matrix-positive.[7] Moreover, there exist algorithms to decompose matrix-positive polynomials into sums of squares of noncommutative polynomials.[8]
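The easy direction (SOS implies matrix-positive) can be illustrated numerically. Taking h(X)=X_1X_2-X_2X_1 and f=h^Th (this example polynomial is our choice, not from the text), evaluating at any pair of real symmetric matrices A, B yields (AB-BA)^T(AB-BA), which is positive semidefinite. A stdlib-only Python sketch:

```python
import random

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(A):
    return [list(col) for col in zip(*A)]

def random_symmetric(n):
    R = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
    return [[R[i][j] + R[j][i] for j in range(n)] for i in range(n)]

def f(A, B):
    # f(X1, X2) = (X1 X2 - X2 X1)^T (X1 X2 - X2 X1), evaluated at matrices
    AB, BA = mat_mul(A, B), mat_mul(B, A)
    h = [[x - y for x, y in zip(r1, r2)] for r1, r2 in zip(AB, BA)]
    return mat_mul(transpose(h), h)

random.seed(2)
n = 4
A, B = random_symmetric(n), random_symmetric(n)
F = f(A, B)
for _ in range(200):
    v = [random.uniform(-1, 1) for _ in range(n)]
    quad = sum(v[i] * F[i][j] * v[j] for i in range(n) for j in range(n))
    assert quad >= -1e-9   # quadratic form of a PSD matrix is nonnegative
print("f(A, B) is positive semidefinite for the sampled matrices")
```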

References

