Basis pursuit


Basis pursuit is the mathematical optimization problem of the form

$$\min_{x}\|x\|_{1}\quad \text{subject to}\quad y = Ax,$$

where x is an N-dimensional solution vector (signal), y is an M-dimensional vector of observations (measurements), A is an M × N transform matrix (usually called the measurement matrix), and M < N. Basis pursuit is NP-hard.

It is usually applied in cases where there is an underdetermined system of linear equations y = Ax that must be satisfied exactly, and the sparsest solution in the L1 sense (i.e., the solution of minimal L1 norm) is desired.
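
As a small illustrative example (added here for clarity, not part of the original article), consider a single equation in two unknowns:

$$A = \begin{pmatrix} 1 & 2 \end{pmatrix}, \qquad y = 2, \qquad Ax = x_1 + 2x_2 = 2.$$

Every solution has the form $x = (2 - 2t,\, t)$, so $\|x\|_1 = |2 - 2t| + |t|$, which is minimized at $t = 1$. Both $(2, 0)$ and $(0, 1)$ have a single nonzero entry, but basis pursuit selects $x = (0, 1)$ because its L1 norm is 1 rather than 2.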

When it is desirable to trade off exact equality of Ax and y in exchange for a sparser x, basis pursuit denoising is preferred.

Basis pursuit problems can be converted to linear programming problems in polynomial time and vice versa, making the two types of problems polynomially equivalent.

Equivalence to Linear Programming

A basis pursuit problem can be converted to a linear programming problem by first noting that

$$\|x\|_{1} = |x_{1}| + |x_{2}| + \ldots + |x_{n}| = (x_{1}^{+} + x_{1}^{-}) + (x_{2}^{+} + x_{2}^{-}) + \ldots + (x_{n}^{+} + x_{n}^{-}),$$

where $x_{i}^{+}, x_{i}^{-} \geq 0$. This construction is derived from the constraint $x_{i} = x_{i}^{+} - x_{i}^{-}$, where the value of $|x_{i}|$ is intended to be stored in $x_{i}^{+}$ or $x_{i}^{-}$ depending on whether $x_{i}$ is greater or less than zero, respectively. Although a range of $x_{i}^{+}$ and $x_{i}^{-}$ values can satisfy this constraint, an optimal solution always has at least one of $x_{i}^{+}$, $x_{i}^{-}$ equal to zero: otherwise subtracting $\min(x_{i}^{+}, x_{i}^{-})$ from both would preserve the constraints and strictly reduce the objective. Solvers such as the simplex algorithm therefore return solutions satisfying the relation $|x_{i}| = x_{i}^{+} + x_{i}^{-}$.

From this expansion, the problem can be recast in canonical form as:

$$\begin{aligned}
&\text{Find a vector} && (\mathbf{x}^{+}, \mathbf{x}^{-}) \\
&\text{that minimizes} && \mathbf{1}^{T}\mathbf{x}^{+} + \mathbf{1}^{T}\mathbf{x}^{-} \\
&\text{subject to} && A\mathbf{x}^{+} - A\mathbf{x}^{-} = \mathbf{y} \\
&\text{and} && \mathbf{x}^{+}, \mathbf{x}^{-} \geq \mathbf{0}.
\end{aligned}$$
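
The following is a minimal sketch of this conversion in Python, solving the resulting linear program with SciPy's linprog. It is an illustration added here, not part of the original article; the random measurement matrix A, the sparse signal x_true, and the observation vector y are assumed test data.

    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)
    M, N = 10, 30                             # underdetermined: M < N
    A = rng.standard_normal((M, N))           # M x N measurement matrix (assumed test data)
    x_true = np.zeros(N)
    x_true[[2, 11, 25]] = [1.5, -2.0, 0.7]    # sparse signal (assumed test data)
    y = A @ x_true                            # observations

    # Linear program in the split variables z = (x^+, x^-), both nonnegative:
    #   minimize   1^T x^+ + 1^T x^-
    #   subject to A x^+ - A x^- = y,  x^+, x^- >= 0
    c = np.ones(2 * N)                        # objective vector (all ones)
    A_eq = np.hstack([A, -A])                 # equality constraint matrix [A, -A]
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")

    x_hat = res.x[:N] - res.x[N:]             # recombine x = x^+ - x^-
    print("L1 norm:", np.abs(x_hat).sum(), "recovery error:", np.linalg.norm(x_hat - x_true))

With enough measurements relative to the sparsity of x_true, the recovered x_hat typically coincides with the sparse signal; when it does not, the program still returns a feasible solution of minimal L1 norm.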

Notes

  1. Natarajan, B. K. (April 1995). "Sparse Approximate Solutions to Linear Systems". SIAM Journal on Computing. 24 (2): 227–234. doi:10.1137/S0097539792240406. ISSN 0097-5397.
  2. Tillmann, A. M. (2015). "Equivalence of Linear Programming and Basis Pursuit". PAMM (Proceedings in Applied Mathematics and Mechanics). 15: 735–738. doi:10.1002/PAMM.201510351.

References & further reading

  • Stephen Boyd, Lieven Vandenberghe: Convex Optimization, Cambridge University Press, 2004, ISBN 9780521833783, p. 337
  • Simon Foucart, Holger Rauhut: A Mathematical Introduction to Compressive Sensing. Springer, 2013, ISBN 9780817649487, pp. 77–110
