We consider the question of fitting an approximately optimal manifold to data under two differing conditions. In the first, data are drawn i.i.d. from an arbitrary probability distribution (not necessarily concentrated near a low-dimensional manifold). In the second, we consider data whose Hausdorff distance to some manifold is small. In the former case we perform an exhaustive search over the space of manifolds at a certain granularity; within each granule, an efficient convex optimization is performed. In the latter case there is no exhaustive search: approximate tangent spaces are constructed, which are then used to build mollified local projection operators. These operators are composed to give a map onto the output manifold. If time permits, we will also discuss applications of these ideas to the question of constructing a Riemannian manifold that is close to a finite metric space in Gromov-Hausdorff distance when the finite metric space satisfies appropriate conditions. (This includes joint work with Charles Fefferman, Sergei Ivanov, Yaroslav Kurylev, Matti Lassas and Sanjoy Mitter.)
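As a rough illustration of the tangent-space step (not the speaker's exact construction), approximate tangent spaces at a point are commonly estimated by PCA on a local neighborhood of the samples; the function name and all parameters below are hypothetical, a minimal sketch assuming data in Euclidean space:

```python
import numpy as np

def estimate_tangent_space(points, center, k, dim):
    """Estimate a dim-dimensional tangent space at `center` by PCA
    on the k nearest sample points (a common heuristic, not the
    talk's specific construction)."""
    dists = np.linalg.norm(points - center, axis=1)
    nbrs = points[np.argsort(dists)[:k]]      # k nearest neighbors of center
    centered = nbrs - nbrs.mean(axis=0)       # center the neighborhood
    # Top principal directions of the local covariance approximate the tangent space.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[:dim]                           # rows form an orthonormal tangent basis

# Example: noisy samples near the unit circle in R^2 (a 1-dimensional manifold).
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 500)
samples = np.c_[np.cos(theta), np.sin(theta)] + 0.01 * rng.normal(size=(500, 2))
basis = estimate_tangent_space(samples, np.array([1.0, 0.0]), k=30, dim=1)
# Near (1, 0) the circle's tangent line is close to the vertical direction.
```

Projecting points onto such locally estimated tangent spaces, and smoothly blending (mollifying) the projections across overlapping neighborhoods, is one way a map onto an output manifold can be assembled.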
Dr. Hari Narayanan received his Dual Degree in Electrical Engineering from IIT Bombay in 2003, and his M.S. and Ph.D. from the University of Chicago in 2006 and 2009, respectively. After a postdoctoral stint at MIT, he was a faculty member at the University of Washington, Seattle, for a few years before joining the School of Technology and Computer Science at TIFR, Mumbai, in January 2017. His research interests are in manifold learning, Markov chain algorithms, and related topics in machine learning.