Recent advances in sensing technology have led to an explosion in the amount of data that can be harvested from systems. However, to benefit from the availability of this information, engineers must confront the “curse of dimensionality.” Classical system design techniques are not directly applicable to “data-deluged” scenarios, due both to their poor scaling properties and to their inability to handle structural constraints on the information flow. Motivated by these difficulties, many research efforts in the past few years have been devoted to developing computationally tractable approaches to handling “Big Data.” These ideas include exploiting the so-called concentration of measure (inherent underlying sparsity) and self-similarity (a high degree of spatio-temporal correlation in the data).
In this dissertation, using recent results from semi-algebraic optimization, Q-parameterization, compressive sensing, and manifold geometry, we show that many seemingly hard problems involving Big Data can be relaxed to tractable convex optimizations, in many cases with optimality certificates. Based on this analysis, computationally attractive convex tools are proposed for tasks including sensor selection, sparse controller design, robust regression, and covariance feature propagation. The potential of these tools is illustrated through numerical and practical experiments, with comparisons against existing methods. Finally, some open problems and directions for future research are discussed.
Advisor: Mario Sznaier