In this talk, we propose a method for obtaining a "good" estimate of the gradient of a function on a high-dimensional space. Often such functions are not sensitive in all coordinates, so the gradient is almost sparse. We discuss a method for gradient estimation that combines ideas from Spall's Simultaneous Perturbation Stochastic Approximation (SPSA) with compressive sensing. The aim is to obtain a "good" estimator without too many function evaluations. Simulations illustrating the performance of the method and comparing it with plain SPSA are discussed. Applications to estimating the gradient outer product matrix, as well as to standard optimization problems, are also illustrated via simulations.
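For readers unfamiliar with SPSA, the core idea is that a single pair of function evaluations, taken at points perturbed simultaneously in all coordinates, yields an unbiased estimate of the full gradient. A minimal sketch of this two-point estimator is below (the averaging over multiple samples and the `c`, `num_samples` parameters are illustrative choices, not the speaker's method; the compressive-sensing recovery step that exploits sparsity is not shown):

```python
import numpy as np

def spsa_gradient(f, x, c=1e-3, num_samples=10, rng=None):
    """Estimate grad f(x) by averaging SPSA two-point estimates.

    Each sample perturbs every coordinate at once with a Rademacher
    vector delta, so only two function evaluations are needed per
    sample, regardless of the dimension of x.
    """
    rng = np.random.default_rng() if rng is None else rng
    g = np.zeros_like(x)
    for _ in range(num_samples):
        delta = rng.choice([-1.0, 1.0], size=x.size)  # Rademacher perturbation
        # Two-point estimate: same scalar difference divided
        # coordinate-wise by the perturbation.
        g += (f(x + c * delta) - f(x - c * delta)) / (2.0 * c * delta)
    return g / num_samples
```

Each sample is unbiased but noisy; averaging reduces the variance. When the true gradient is almost sparse, the talk's proposal is to recover it from fewer such measurements using compressive-sensing techniques instead of brute-force averaging.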
Neeraja Sahasrabudhe is an alumna of the Indian Statistical Institute, Bangalore. She received her Ph.D. from the University of Padova, Italy. She is currently a postdoctoral fellow in the Department of Electrical Engineering, IIT Bombay.