Acceleration of solution methods is often possible only when a problem possesses favorable mathematical properties, such as smoothness, strong convexity, or an error-bound condition. But what happens when a problem is poorly conditioned and lacks this convenient structure? One of my research interests focuses precisely on this challenge: can we reshape the mathematical landscape of a difficult problem to make it easier to solve? My work on feasibility problems demonstrates that, through carefully chosen preconditioning and rescaling techniques, a computationally challenging system can be transformed into one that algorithms handle efficiently, and the solution then mapped back to the original problem. This geometric transformation approach has achieved exponential speedups, showing that well-designed mathematical reformulations can fundamentally expand what is computationally tractable.
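The transform-solve-map-back pattern can be seen in miniature on a toy least-squares system (a sketch for illustration only, not the algorithms or problem classes from my papers): a simple diagonal rescaling repairs the conditioning of a badly column-scaled instance, after which plain gradient descent converges quickly and the solution is mapped back to the original variables.

```python
import numpy as np

# Toy illustration of the rescale-then-map-back idea (a sketch only,
# not the methods from my papers). A badly column-scaled least-squares
# system stalls plain gradient descent; a diagonal rescaling fixes the
# conditioning, and the solution is mapped back afterwards.

def grad_descent_lsq(A, b, tol=1e-8, max_iter=100_000):
    """Gradient descent on 0.5*||Ax - b||^2 with constant step 1/L."""
    L = np.linalg.norm(A.T @ A, 2)          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for k in range(1, max_iter + 1):
        g = A.T @ (A @ x - b)
        if np.linalg.norm(g) < tol:
            return x, k                     # converged in k iterations
        x -= g / L
    return x, max_iter                      # hit the iteration cap

rng = np.random.default_rng(0)
scales = np.array([1.0, 1e-2, 1e2, 2.0])    # wildly different column scales
A = rng.standard_normal((20, 4)) * scales
x_true = rng.standard_normal(4)
b = A @ x_true

_, iters_raw = grad_descent_lsq(A, b)       # ill-conditioned: stalls

D = np.linalg.norm(A, axis=0)               # diagonal preconditioner
y, iters_pre = grad_descent_lsq(A / D, b)   # unit-norm columns: fast
x_rec = y / D                               # map back: x = D^{-1} y

print(f"iterations without / with rescaling: {iters_raw} / {iters_pre}")
```

The two-step pattern is the same one the feasibility-problem work carries out, there with far more sophisticated geometric rescalings than a diagonal one.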

I am also interested in developing projection-free first-order methods for solving large-scale constrained optimization problems. These methods avoid computationally expensive projection operations that can become bottlenecks in high-dimensional settings. One example of my work in this area focuses on problems where satisfying constraints is just as critical as optimizing the objective function. In machine learning, for example, fairness constraints may determine whether a classifier can be deployed in practice, while in operations research, risk or resource limitations often represent hard boundaries that cannot be violated. For these applications, I have developed first-order optimization methods that guarantee feasibility throughout the entire optimization process, not just at convergence. Using level-set formulations, these methods trace out a path of feasible solutions that can adapt to stochastic or high-dimensional environments while avoiding projection steps. By seamlessly integrating feasibility preservation with scalability, these algorithms enable reliable solution of constrained problems in domains where constraint satisfaction is paramount.
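The projection-free principle these methods share is easiest to see in the classical Frank-Wolfe method (shown here purely as an illustration; my level-set methods are different algorithms): over the probability simplex, the linear minimization oracle returns a vertex, and every iterate is a convex combination of vertices, so feasibility holds at every step without a single projection.

```python
import numpy as np

# Classical illustration of the projection-free principle
# (Frank-Wolfe; not my level-set algorithms). Over the probability
# simplex the linear minimization oracle returns a vertex e_i, and
# each iterate is a convex combination of vertices, so every iterate
# is feasible by construction.

def frank_wolfe_simplex(grad, x0, steps=20_000):
    x = x0.copy()
    for k in range(steps):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0               # LMO: best vertex of the simplex
        gamma = 2.0 / (k + 2)               # standard open-loop step size
        x = (1 - gamma) * x + gamma * s     # convex combination: stays feasible
        assert abs(x.sum() - 1.0) < 1e-9 and (x >= -1e-12).all()
    return x

c = np.array([0.1, 0.2, 0.3, 0.4])          # minimizer, chosen inside the simplex
x = frank_wolfe_simplex(lambda x: 2 * (x - c), np.full(4, 0.25))
print(x)                                    # approaches c without leaving the simplex
```

The in-loop assertion makes the point of the paragraph above concrete: feasibility is maintained throughout the run, not merely at convergence.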

A third thread of my research develops first-order methods for large-scale Approximate Dynamic Programming (ADP) models, which provide powerful frameworks for sequential decision-making under uncertainty. Despite their potential, ADP models remain notoriously difficult to solve at scale. First-order methods, with their modest memory requirements and natural scalability, are well suited to these problems, yet efficient algorithms tailored to ADP structure remain relatively underexplored. My research addresses this gap by designing algorithms that strategically exploit the mathematical structure inherent in ADP formulations while preserving their theoretical performance guarantees. The goal is to enable the solution of high-dimensional decision problems that would otherwise be computationally intractable, thereby expanding the practical reach of ADP methodology across diverse application domains.
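For context, a textbook sketch (not an algorithm from my work) of the object ADP approximates: the Bellman fixed point V = r + γPV. On a toy Markov chain the exact fixed-point iteration can be run and checked against the closed-form solution; at scale, ADP must replace the tabular value vector with a compact parameterization, which is exactly where structure-exploiting first-order methods enter.

```python
import numpy as np

# Textbook sketch for context (not an algorithm from my work): the
# Bellman fixed point V = r + gamma * P @ V that ADP approximates.
# On a tiny Markov chain we iterate the exact operator and verify it
# against the closed-form solution of the linear system.

rng = np.random.default_rng(1)
n, gamma = 6, 0.9
P = rng.random((n, n))
P /= P.sum(axis=1, keepdims=True)           # row-stochastic transition matrix
r = rng.random(n)                           # per-state rewards

V_star = np.linalg.solve(np.eye(n) - gamma * P, r)   # (I - gamma P) V* = r

V = np.zeros(n)                             # fixed-point (value) iteration:
for _ in range(400):                        # a gamma-contraction in sup norm,
    V = r + gamma * P @ V                   # so the error shrinks geometrically

err = np.max(np.abs(V - V_star))
print(f"sup-norm error after 400 sweeps: {err:.2e}")
```

With a handful of states the tabular vector V and even the direct solve are trivial; the difficulty my research targets begins when the state space is too large for either.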

Submitted and Working Papers

Published Papers