We consider the task of recovering a sparse signal from a few measurements. The analysis of such recovery problems is often split into two viewpoints. On the one hand, statistical analysis focuses on the ability to recover the signal from as few measurements as possible and on the sensitivity of the estimator to perturbations. On the other hand, optimization analysis is concerned with solving the problem efficiently using an iterative algorithm. In this talk, we show that both analyses rely on the same characteristic: the behavior of the optimization objective around its minimizers. On the statistical side, we show that the minimal conically restricted singular value of the observation matrix, which controls the recovery threshold and the estimator's sensitivity to perturbations, also parametrizes an error bound on the recovery problem. On the optimization side, we develop restart schemes for classical optimization algorithms that exploit this error bound to obtain linear convergence guarantees. Overall, this means that a single quantity controls both the statistical performance and the optimization complexity of sparse recovery problems with the ℓ1, group, or nuclear norm. The insights gathered from recovery problems may also serve the analysis of non-convex problems, such as non-linear dynamical systems, for which I will briefly present a recent analysis.