We propose a novel algorithm, termed soft quasi-Newton (soft QN), for optimization in the presence of bounded noise. Traditional quasi-Newton algorithms are vulnerable to such noise-induced perturbations. To develop a more robust quasi-Newton method, we replace the secant condition in the matrix optimization problem for the Hessian update with a penalty term in its objective and derive a closed-form update formula. A key feature of our approach is its ability to maintain positive definiteness of the Hessian inverse approximation throughout the iterations. Furthermore, we establish the following properties of soft QN: it recovers the BFGS method under specific limits, it treats positive and negative curvature equally, and it is scale invariant. Collectively, these features enhance the efficacy of soft QN in noisy environments. For strongly convex objective functions and Hessian approximations obtained using soft QN, we develop an algorithm that exhibits linear convergence toward a neighborhood of the optimal solution even when gradient and function evaluations are subject to bounded perturbations. Through numerical experiments, we demonstrate that soft QN consistently outperforms state-of-the-art methods across a range of scenarios.
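To make the mechanism concrete, the display below contrasts the standard variational characterization of BFGS with a penalized ("soft") variant of the kind sketched above. The weighted norm $\|\cdot\|_W$, the penalty weight $\mu > 0$, and the exact form of the penalized objective are illustrative assumptions here, not the paper's precise formulation. With $s_k = x_{k+1} - x_k$ and $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$, BFGS selects the new inverse-Hessian approximation as

$$H_{k+1} = \arg\min_{H = H^{\top}} \; \lVert H - H_k \rVert_W \quad \text{subject to} \quad H y_k = s_k,$$

whereas a soft update of the type described removes the hard secant constraint and penalizes its violation in the objective, e.g.

$$H_{k+1} = \arg\min_{H = H^{\top}} \; \lVert H - H_k \rVert_W + \frac{1}{\mu}\,\lVert H y_k - s_k \rVert^2, \qquad \mu > 0,$$

so that noisy curvature pairs $(s_k, y_k)$ perturb the update smoothly rather than being interpolated exactly; under this parameterization, letting $\mu \to 0$ enforces the secant condition and recovers BFGS, consistent with the limiting behavior stated above.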