Conjugate gradient methods are among the most efficient methods for solving optimization problems. In this paper, a new conjugate gradient method is proposed as a convex combination of the Hager-Zhang and Dai-Yuan nonlinear conjugate gradient methods. The proposed method satisfies the sufficient descent condition and possesses global convergence properties under the strong Wolfe conditions. Numerical results on some benchmark test problems demonstrate the efficiency of the method.
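For orientation, a hybrid of this kind is usually built from the standard Hager-Zhang and Dai-Yuan parameters; the sketch below shows one common form of such a convex combination, where the choice of the mixing parameter $\theta_k \in [0,1]$ is left open and is not the specific rule adopted in this paper. With $g_k = \nabla f(x_k)$ and $y_k = g_{k+1} - g_k$, the two parameters are
\[
\beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^{\top} y_k},
\qquad
\beta_k^{HZ} = \frac{1}{d_k^{\top} y_k}\Bigl(y_k - 2 d_k \frac{\|y_k\|^2}{d_k^{\top} y_k}\Bigr)^{\!\top} g_{k+1},
\]
and the search direction is generated by
\[
d_0 = -g_0, \qquad
d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad
\beta_k = (1-\theta_k)\,\beta_k^{HZ} + \theta_k\,\beta_k^{DY},
\]
with the step length along $d_k$ computed by a line search satisfying the strong Wolfe conditions.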