I will discuss an iterative algorithm for continuous, unconstrained nonlinear optimization that extends a well-known approach for iteratively solving systems of linear equations in numerical linear algebra. In this approach, linear combinations are taken of iterates generated by a simple process (called the "preconditioning process") so as to minimize a residual, which can significantly accelerate the convergence of the simple process. I will explain how this approach can be generalized to continuous optimization problems by considering nonlinear versions of the acceleration mechanism and adding a line search for globalization, leading to a provably convergent algorithm for certain preconditioning processes. The performance of the algorithm will be illustrated on low-rank tensor approximation problems, which have applications in signal processing and data mining.
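The acceleration mechanism described above can be sketched in its best-known linear-algebra form, Anderson mixing: recent iterates of a simple fixed-point process are combined with least-squares coefficients that minimize the residual. This is a minimal illustrative sketch only, assuming a generic fixed-point map `g`; it omits the nonlinear generalization and the globalizing line search from the talk, and all names (`anderson_accelerate`, `m`, the test map) are hypothetical.

```python
import numpy as np

def anderson_accelerate(g, x0, m=5, tol=1e-10, max_iter=100):
    """Accelerate the fixed-point iteration x <- g(x) by combining the
    last m iterates of the simple process with coefficients chosen to
    minimize the residual g(x) - x in the least-squares sense."""
    x = np.asarray(x0, dtype=float)
    G, F = [], []  # recent outputs g(x_k) of the simple process and residuals
    for _ in range(max_iter):
        f = g(x) - x                      # residual of the simple process
        if np.linalg.norm(f) < tol:
            break
        G.append(x + f)                   # iterate produced by the simple process
        F.append(f)
        G, F = G[-m:], F[-m:]             # keep a short history window
        if len(F) > 1:
            dF = np.column_stack([F[i + 1] - F[i] for i in range(len(F) - 1)])
            dG = np.column_stack([G[i + 1] - G[i] for i in range(len(G) - 1)])
            # least-squares coefficients minimizing the combined residual
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
            x = G[-1] - dG @ gamma        # accelerated linear combination
        else:
            x = G[-1]                     # plain step of the simple process
    return x

# Toy usage: a contractive affine map whose fixed point solves (I - A) x = b.
A = np.array([[0.5, 0.2], [0.1, 0.4]])
b = np.array([1.0, 1.0])
g = lambda x: A @ x + b
x_star = anderson_accelerate(g, np.zeros(2))
```

For an affine map like the one above, combining iterates this way is closely related to Krylov methods such as GMRES, which is why the accelerated process can converge in far fewer steps than the plain iteration.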