Due to Covid-19, we decided to hold the defense fully online via Zoom:
https://univ-grenoble-alpes-fr.zoom.us/j/94395402286?pwd=UVJtZExEdEZhUzB...
Jury committee:
Abstract: In this thesis, we develop a framework to reduce the dimensionality of composite optimization problems with sparsity-inducing regularizers. Based on the identification property of proximal methods, we first develop a "sketch-and-project" method that uses projections based on the structure of the current point. This method allows us to work with random low-dimensional subspaces instead of the full space when the final solution is sparse. Second, we place ourselves in the context of delay-tolerant asynchronous proximal methods and use our dimension-reduction technique to decrease the total size of communications. However, this technique is proven to converge only for well-conditioned problems, both in theory and in practice. We therefore investigate wrapping it into a proximal reconditioning framework. This leads to a theoretically backed algorithm that is guaranteed to cost less in terms of communications than its non-sparsified version; we show in practice that it also yields faster runtime convergence when the problem is sufficiently sparse.
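To make the dimension-reduction idea of the abstract a little more concrete, here is a minimal, hypothetical Python/NumPy sketch of a proximal-gradient loop where each iteration only updates a random low-dimensional set of coordinates built from the support of the current iterate plus a few randomly drawn extra coordinates. This is only an illustration of the sparsity-driven subspace idea, not the exact algorithm developed in the thesis; all function names and parameters below are assumptions.

# Illustrative sketch only (assumptions): not the exact method from the thesis.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sketched_prox_grad(grad_f, x0, step, lam, n_extra=20, n_iter=500, seed=0):
    """Proximal gradient where each iteration updates only a random
    low-dimensional set of coordinates: the support of the current
    iterate (its "identified" structure) plus n_extra random coordinates."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(n_iter):
        support = np.flatnonzero(x)                  # structure of the current point
        extras = rng.choice(x.size, size=n_extra, replace=False)
        coords = np.union1d(support, extras)         # random low-dimensional subspace
        g = grad_f(x)                                # only g[coords] is actually needed
        x[coords] = soft_threshold(x[coords] - step * g[coords], step * lam)
    return x

# Toy lasso problem: min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
x_true = np.zeros(200)
x_true[:5] = 1.0
b = A @ x_true
L = np.linalg.norm(A, 2) ** 2                        # Lipschitz constant of the gradient
x_hat = sketched_prox_grad(lambda x: A.T @ (A @ x - b),
                           np.zeros(200), step=1.0 / L, lam=0.1)
print("nonzero coordinates of the sketched solution:", np.flatnonzero(x_hat))

In this toy run, the updated set of coordinates stays small because the iterate is sparse, which mimics how the sparsity of the solution can be exploited to work in random low-dimensional subspaces (and, in the distributed setting, to shrink the size of communicated updates).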