Document Type: Campus Access Thesis
Degree Name: Master of Science (MS)
Advisor: Dan A. Simovici
The least squares problem is one of the most important regression problems in statistics and machine learning. In this thesis, we present an Averaging Projected Stochastic Gradient Descent (APSGD) algorithm to solve the large-scale least squares problem. APSGD improves on Stochastic Gradient Descent (SGD) by exploiting the constraint that the linear regression line passes through the mean point of all the data points. It achieves the best regret bound, O(log T), and the fastest convergence speed among all first-order approaches. Empirical studies confirm the effectiveness of APSGD by comparing it with state-of-the-art methods.
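The core idea in the abstract (an SGD step followed by a projection onto the constraint that the fitted line passes through the mean point of the data, with averaging of the iterates) can be sketched roughly as below. This is an illustrative reconstruction, not the thesis's actual algorithm: the function name, the step-size schedule, and the use of a simple running average are all assumptions.

```python
import numpy as np

def apsgd_sketch(X, y, epochs=50, seed=0):
    """Illustrative sketch of projected SGD for least squares.

    After each SGD step, the weights are projected onto the hyperplane
    {w : x_bar @ w = y_bar}, encoding the constraint that the fitted
    line passes through the mean point of the data; the iterates are
    then averaged. Details (step sizes, averaging scheme) are
    assumptions, not the thesis's exact method.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    x_bar, y_bar = X.mean(axis=0), y.mean()
    w = np.zeros(d)
    w_avg = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            lr = 1.0 / (10.0 + t)                    # decaying step size (assumption)
            grad = (X[i] @ w - y[i]) * X[i]          # gradient of 0.5 * (x_i.w - y_i)^2
            w = w - lr * grad
            # Euclidean projection onto {w : x_bar @ w = y_bar}
            w = w - ((x_bar @ w - y_bar) / (x_bar @ x_bar)) * x_bar
            w_avg += (w - w_avg) / t                 # running average of the iterates
    return w_avg
```

To model an intercept, append a column of ones to `X`; the constraint then says exactly that the regression line passes through the mean point (x_bar, y_bar). Because the constraint is affine, every projected iterate satisfies it, and so does their average.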
Mu, Yang, "Averaging Projected Stochastic Gradient Descent for Large Scale Square Problem" (2012). Graduate Masters Theses. 149.