Statistics Seminar: Compute Faster and Learn Better: Machine Learning via Nonconvex Model-based Optimization
Dr. Tuo Zhao, Assistant Professor, School of Industrial and Systems Engineering, Georgia Institute of Technology
Nonconvex optimization arises naturally in many machine learning problems. Machine learning researchers exploit various nonconvex formulations to gain modeling flexibility, estimation robustness, adaptivity, and computational scalability. Although classical computational complexity theory shows that nonconvex optimization is NP-hard in the worst case, practitioners have proposed numerous heuristic optimization algorithms that achieve outstanding empirical performance in real-world applications. To bridge this gap between practice and theory, we propose a new generation of model-based optimization algorithms and theories that incorporate statistical thinking into modern optimization. Specifically, when designing practical computational algorithms, we take the underlying statistical models into consideration. Our algorithms exploit hidden geometric structures behind many nonconvex optimization problems and can obtain global optima with the desired statistical properties in polynomial time with high probability.
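As an illustrative sketch of the phenomenon the abstract describes (not the speaker's algorithm), consider rank-one matrix factorization: the objective below is nonconvex, yet its geometry is benign, so plain gradient descent from a random initialization reaches a global optimum with high probability. The problem setup, step size, and iteration count here are assumptions chosen for demonstration.

```python
import numpy as np

# Benign nonconvex problem: min_x f(x) = ||x x^T - M||_F^2 with M = v v^T.
# Despite nonconvexity, every local minimum is global (x = +/- v), so
# gradient descent from a random start succeeds with high probability.

rng = np.random.default_rng(0)
d = 20
v = rng.standard_normal(d)
M = np.outer(v, v)                    # rank-one target matrix

x = 0.1 * rng.standard_normal(d)      # random initialization
step = 0.01 / np.linalg.norm(M, 2)    # small step size scaled to the problem

for _ in range(5000):
    grad = 4 * (np.outer(x, x) - M) @ x   # gradient of f at x
    x = x - step * grad

# After convergence, x recovers v up to sign and the residual is near zero.
residual = np.linalg.norm(np.outer(x, x) - M)
print(residual)
```

This toy example only illustrates the kind of hidden geometric structure the abstract alludes to; the talk's model-based algorithms and their statistical guarantees are developed for far more general settings.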