Katya Scheinberg

Positions: 

Full Professor
Harvey E. Wagner Endowed Chair Professor
ADVANCE Chair 2014-2015

Education: 

Ph.D., Columbia University

Research Area: 

Developing practical algorithms (and their theoretical analysis) for various problems in continuous optimization
Convex Optimization
Derivative-Free Optimization
Machine Learning
Quadratic Programming

Classes Taught: 

Optimization Models and Applications
Optimization Methods in Machine Learning
Derivative-Free Optimization

Interests: 

Mathematical programming
Linear, convex and nonlinear optimization
Derivative-free optimization
Application of optimization to machine learning
Statistics and support vector machines
Conic programming
Interior point methods
Matrix factorizations in interior point methods
Sensitivity analysis

Professor Scheinberg joined the department from the Industrial Engineering and Operations Research Department at Columbia University in 2010. A native of Moscow, she earned her master’s degree in operations research from Lomonosov Moscow State University in 1992 and received her Ph.D. in operations research from Columbia in 1997. Scheinberg was a Research Staff Member at the IBM T.J. Watson Research Center, where she worked on a range of applied and theoretical problems in optimization, until returning to Columbia as a visiting professor. Her main research area is the development of practical algorithms (and their theoretical analysis) for various problems in continuous optimization, including convex optimization, derivative-free optimization, machine learning, and quadratic programming. In 2008 she published the book Introduction to Derivative Free Optimization, co-authored with Andrew R. Conn and Luis N. Vicente.

Publications: 

Introduction to Derivative Free Optimization with A. R. Conn and L. N. Vicente. Available from SIAM Series on Mathematical Programming. December 2008 / approx. xii + 277 pages / Softcover / ISBN: 978-0-898716-68-0.
Practical Inexact Proximal Quasi-Newton Method with Global Complexity Analysis, with Xiaocheng Tang, 2014, submitted.
Least-squares approach to risk parity in portfolio selection, with X. Bai and R. Tutuncu, 2013, to appear in Journal of Computational Finance.
Aligning ligand binding cavities by optimizing superposed volume, with B. Chen and R. Chen, in BIBM 2012.
Convergence of trust-region methods based on probabilistic models, with A. Bandeira and L.N. Vicente, to appear in SIOPT.
Fast first-order methods for composite convex optimization with backtracking, with D. Goldfarb, FOCM, 2014, 14: 389-417.
Efficient Block-coordinate Descent Algorithms for the Group Lasso, with Z. Qin and D. Goldfarb, Math Prog. Comp., 2013, Volume 5, Issue 2, pp 143-169.
Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization, with A. Bandeira and L.N. Vicente, Math. Prog., Series B, (2012), 134, pp 223-257.
On partially sparse recovery, with A. Bandeira and L.N. Vicente, 2011.
Fast alternating linearization methods for minimizing the sum of two convex functions, with D. Goldfarb and S. Ma, 2013, Math. Prog. Series A, 141: pp 349-382.
Sparse Inverse Covariance Selection via Alternating Linearization Methods, with D. Goldfarb and S. Ma, NIPS 2010
SINCO – a greedy coordinate ascent method for sparse inverse covariance selection problem, with I. Rish, 2009.
Optimization Methods for Sparse Inverse Covariance Selection Problem, with S. Ma, in S. Sra, S. Nowozin, and S. J. Wright, editors: Optimization for Machine Learning, MIT Press, 2010.
Row by row method for semidefinite programming, with Z. Wen, D. Goldfarb and S. Ma, submitted, 2009.
A Derivative-Free Algorithm for Least-Squares Minimization, with H. Zhang and A.R. Conn, submitted, 2009.
Self-correcting geometry in model-based algorithms for derivative-free unconstrained optimization with Ph. L. Toint, to appear, 2009.
A MAP approach to learning sparse Gaussian Markov networks, with N. Bani Asadi, I. Rish, D. Kanevsky, B. Ramabhadran, ICASSP 2009.
Global Convergence of General Derivative-Free Trust-Region Algorithms to First and Second Order Critical Points, with A.R. Conn and L.N. Vicente, SIAM J. on Optimization, (2009).
Geometry of Sample Sets in Derivative Free Optimization: polynomial regression and incomplete interpolation, with A.R. Conn and L.N. Vicente, IMA Journal of Numerical Analysis 2008 28(4):721-748.
Geometry of Interpolation Sets in Derivative Free Optimization, with A.R. Conn and L.N. Vicente, Mathematical Programming, 111 (2008), 141-172.
Product-Form LDL^T Factorizations in Interior-Point Methods for Convex Quadratic Programming, with D. Goldfarb, IMA Journal of Numerical Analysis 2008 28(4):806-826.
IBM Research TRECVID-2006 Video Retrieval System, with M. Campbell, S. Ebadollahi, D. Joshi, M. Naphade, A. Natsev, J. Seidl, J. R. Smith, J. Tesic and L. Xie.
Detecting Generic Visual Events with Temporal Cues, with Lexing Xie, Dong Xu, Shahram Ebadollahi, Shih-Fu Chang, John R. Smith, in Proc. 40th Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, CA, October 2006.
An Efficient Implementation of an Active Set Method for SVM, J. of Machine Learning Research 7 (2006) 2237-2257.
(Conference version: Incas: An incremental active set method for SVM, with Shai Fine, 2002.)
Product-form Cholesky factorization in interior point methods for second-order cone programming, with D. Goldfarb, Mathematical Programming, v. 103 (2005), pp. 153-179.
A product-form Cholesky factorization method for handling dense columns in interior point methods for linear programming, with D. Goldfarb, Mathematical Programming, v. 99 (2004), pp. 1-34.
Incremental learning and selective sampling via parametric optimization framework for SVM, with S. Fine, in “Advances in Neural Information Processing Systems” 14, MIT Press, (2002) 705-711.
Efficient SVM training using low-rank Kernel representations, with S. Fine, J. of Machine Learning Research, Special issue on Kernel methods, 2(2001) 243-264.
Manual for FORTRAN software package DFO v1.2, (2001).
Parametric linear semidefinite programming, in “Handbook on Semidefinite Programming”, eds. H. Wolkowicz, R. Saigal, L. Vandenberghe, pp 92–110. Kluwer, 2000.
On parametric semidefinite programming, with D. Goldfarb, Applied Numerical Mathematics, v. 29(3), pp. 361-377, 1999.
Modified barrier-penalty functions for constrained minimization problems, with D. Goldfarb, R. Polyak and B. Yusefovich, Computational Optimization and Applications, v. 14(1), pp. 55-74, 1999.
A derivative free optimization algorithm in practice, with A. R. Conn and Ph. L. Toint, Proceedings of 7th AIAA/USAF/NASA/ISSMO Symposium on Multidisciplinary Analysis and Optimization, St. Louis, MO, 1998.
Interior point trajectories in semidefinite programming, with D. Goldfarb, SIAM J. on Optimization v. 8(4), pp. 871-886, 1998.
Recent progress in unconstrained nonlinear optimization without derivatives, with A.R. Conn and Ph. L. Toint, Mathematical Programming v. 79 (1997) 397-414.
On the convergence of derivative-free methods for unconstrained optimization, with A.R. Conn and Ph. L. Toint, “Approximation Theory and Optimization: Tributes to M. J. D. Powell”, eds. A. Iserles and M. Buhmann, pp 83–108, 1997.
Issues related to interior point methods for linear and semidefinite programming., PhD Thesis, Dept. of IEOR, Columbia University, 1997.
Extension of Karmarkar’s algorithm to convex quadratically constrained quadratic problems, with A. Nemirovskii, Mathematical Programming, v. 72 (1996) 273-289.