Abstract: |
Constrained least squares problems arise in many data-driven applications, such as dynamical system inference. For high-dimensional input data, the memory and computation costs become prohibitive in practice. We employ the so-called sketching strategy: a random sketching matrix projects the least squares problem onto a space of much lower sketching dimension while preserving the approximation accuracy.
Tensor structure is often present in the data matrices of least squares problems, including linearized inverse problems and tensor decompositions. In this work, we utilize a general class of row-wise tensorized sub-Gaussian matrices as sketching matrices in constrained optimization, owing to this sketching design's compatibility with tensor structures. We provide theoretical guarantees on the sketching dimension in terms of the error criterion and the failure probability. Our theory is demonstrated on several concrete examples, including unconstrained linear regression and sparse recovery problems.
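The sketching strategy described in the abstract can be sketched in a few lines of NumPy. This is a hypothetical minimal illustration, not the paper's row-wise tensorized construction: it uses a plain Gaussian (sub-Gaussian) sketching matrix on an unconstrained least squares problem, and all dimensions and names below are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Overdetermined least squares problem: minimize ||Ax - b||_2.
n, d, m = 2000, 20, 200          # rows, columns, sketch dimension (m << n)
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Sub-Gaussian sketching matrix S of size m x n, scaled so E[S^T S] = I.
S = rng.standard_normal((m, n)) / np.sqrt(m)

# Solve the sketched problem min_x ||S(Ax - b)||_2 instead of the full one.
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

# With high probability the sketched residual is close to the optimal one.
r_exact = np.linalg.norm(A @ x_exact - b)
r_sketch = np.linalg.norm(A @ x_sketch - b)
print(r_sketch / r_exact)
```

The solve now costs O(m d^2) instead of O(n d^2), while the residual ratio stays near 1 with high probability; the paper's contribution concerns sketching matrices with tensor structure and constrained problems, which this toy example omits.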