Quasi-Newton methods are among the most efficient approaches for solving large-scale optimization problems. We introduce a hybrid optimization method that combines Gauss-Newton (GN) and limited-memory BFGS (L-BFGS) directions with projection and a Wolfe line search for solving nonnegative tensor least squares (NN-TLS) problems. The method adaptively blends the GN and L-BFGS directions through a mixing parameter governed by the size of the residual, so that it is globally convergent while retaining fast local convergence. Convergence is established under mild conditions. Numerical experiments demonstrate the efficiency of the new method.
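As a purely illustrative sketch of the kind of residual-based blending referred to above (the notation and the specific weighting rule here are assumptions, not the paper's exact update), the search direction can be written as a convex combination
\[
  d_k = (1-\theta_k)\, d_k^{\mathrm{GN}} + \theta_k\, d_k^{\mathrm{LBFGS}},
  \qquad \theta_k \in [0,1],
\]
where the mixing parameter $\theta_k$ grows with the residual norm $\|r(x_k)\|$, favoring the L-BFGS direction far from a solution (large residual) and the GN direction near a solution (small residual), where GN enjoys fast local convergence.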