21 research outputs found

    A comparison on classical-hybrid conjugate gradient method under exact line search

    One of the popular approaches to modifying the Conjugate Gradient (CG) method is hybridization. In this paper, a new hybrid CG method is introduced and its performance is compared to that of the classical Rivaie-Mustafa-Ismail-Leong (RMIL) and Syarafina-Mustafa-Rivaie (SMR) methods. The proposed hybrid CG is constructed as a convex combination of the RMIL and SMR methods. Performance is analyzed under the exact line search. The comparison shows that the hybrid CG is promising and outperforms the classical RMIL and SMR methods in terms of the number of iterations and central processing unit (CPU) time
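    The convex-combination idea above can be sketched in a few lines. Since the abstract does not give the RMIL and SMR coefficient formulas, the classical Fletcher-Reeves and Polak-Ribière formulas are used below as stand-ins, and the exact line search is taken in closed form on a quadratic test problem; this is a sketch of the hybridization pattern, not the paper's method.

```python
import numpy as np

def hybrid_cg(A, b, x0, theta=0.5, tol=1e-8, max_iter=200):
    """Convex-combination hybrid CG on the quadratic 0.5 x^T A x - b^T x.

    beta is a convex combination of two classical formulas (Fletcher-Reeves
    and Polak-Ribiere, used here as stand-ins for RMIL and SMR, whose
    formulas are not given in the abstract).  Exact line search has a
    closed form for a quadratic objective."""
    x = x0.astype(float)
    g = A @ x - b                              # gradient of the quadratic
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ (A @ d))       # exact line search step
        x = x + alpha * d
        g_new = A @ x - b
        beta_fr = (g_new @ g_new) / (g @ g)            # Fletcher-Reeves
        beta_pr = (g_new @ (g_new - g)) / (g @ g)      # Polak-Ribiere
        beta = theta * beta_fr + (1 - theta) * beta_pr # convex combination
        d = -g_new + beta * d
        g = g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = hybrid_cg(A, b, np.zeros(2))
```

    The mixing weight `theta` plays the role of the convex-combination parameter; on a quadratic with exact line search the stand-in formulas coincide, so the iteration reduces to exact CG.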

    A New Conjugate Gradient Algorithm with Sufficient Descent Property for Unconstrained Optimization

    A new nonlinear conjugate gradient formula, which satisfies the sufficient descent condition, is proposed for solving unconstrained optimization problems. The global convergence of the algorithm is established under the weak Wolfe line search. Numerical experiments show that the new WWPNPRP+ algorithm is competitive with the SWPPRP+, SWPHS+, and WWPDYHS+ algorithms
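    The sufficient descent condition mentioned above requires g_k^T d_k <= -c ||g_k||^2 for some constant c > 0. A minimal check, with the constant c and the test vectors chosen for illustration (the paper's constant is not given in the abstract):

```python
import numpy as np

def satisfies_sufficient_descent(g, d, c=0.01):
    """Sufficient descent condition: g^T d <= -c * ||g||^2 for some c > 0.
    The constant c here is illustrative; the paper's constant is not given."""
    return g @ d <= -c * (g @ g)

g = np.array([1.0, -2.0])
ok = satisfies_sufficient_descent(g, -g, c=1.0)   # steepest descent: g^T d = -||g||^2
```

    Steepest descent satisfies the condition with c = 1; a direction that is not downhill relative to g fails it for any positive c.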

    A New Hybrid Conjugate Gradient Method: The HS-CD Method for Solving Unconstrained Optimization Problems

    The conjugate gradient method is an effective iterative method for solving unconstrained optimization problems. In this paper, a new hybrid conjugate gradient method, New Hybrid 4 (NH4), is proposed by combining the Hestenes-Stiefel (HS) and Conjugate Descent (CD) methods. The construction follows the ideas of previously proposed hybrids: the Polak-Ribière-Polyak - Fletcher-Reeves method (NH1), the Hestenes-Stiefel - Dai-Yuan method (NH2), and the Liu-Storey - Conjugate Descent method (NH3). The HS and CD methods each have their own shortcomings, which the combination is intended to offset. Numerical results comparing the new HS-CD (NH4) method with the earlier methods show that it is highly efficient for solving unconstrained nonlinear problems. The method is also proved to be globally convergent under the Wolfe conditions and to satisfy the descent condition at every iteration
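    The HS and CD coefficient formulas being combined are standard; how NH4 mixes them is not specified in the abstract, so the clipping rule `max(0, min(beta_HS, beta_CD))` below is an assumption, modeled on the pattern of earlier hybrids such as NH1. An exact line search on a quadratic test problem is used for brevity (the paper's analysis uses the Wolfe conditions).

```python
import numpy as np

A = np.diag([1.0, 10.0, 100.0])   # ill-conditioned quadratic test problem
b = np.array([1.0, 1.0, 1.0])

def grad(x):
    return A @ x - b

def cg_hs_cd(x0, tol=1e-8, max_iter=300):
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ (A @ d))      # exact step on a quadratic
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        b_hs = (g_new @ y) / (d @ y)          # Hestenes-Stiefel coefficient
        b_cd = (g_new @ g_new) / (-(d @ g))   # Conjugate Descent coefficient
        beta = max(0.0, min(b_hs, b_cd))      # assumed hybrid rule (not from the paper)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

x = cg_hs_cd(np.zeros(3))
```

    Clipping at zero acts as an automatic restart, which is the usual motivation for hybrids of this family.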

    A Self-Adjusting Spectral Conjugate Gradient Method for Large-Scale Unconstrained Optimization

    This paper presents a hybrid spectral conjugate gradient method for large-scale unconstrained optimization, which possesses a self-adjusting property. Under the standard Wolfe conditions, its global convergence result is established. Preliminary numerical results are reported on a set of large-scale problems in CUTEr to show the convergence and efficiency of the proposed method
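    A spectral CG direction has the form d = -theta * g + beta * d_prev. The paper's self-adjusting formulas for theta and beta are not given in the abstract, so the sketch below uses a Barzilai-Borwein-style spectral parameter and the Fletcher-Reeves coefficient as illustrative choices.

```python
import numpy as np

def spectral_cg_direction(g_new, g_old, s, d_old):
    """Spectral CG search direction d = -theta * g + beta * d_prev.

    theta is a Barzilai-Borwein-style spectral parameter and beta the
    Fletcher-Reeves coefficient -- illustrative stand-ins, since the
    paper's self-adjusting formulas are not given in the abstract."""
    y = g_new - g_old                          # gradient difference
    theta = (s @ s) / (s @ y)                  # spectral (BB) parameter
    beta = (g_new @ g_new) / (g_old @ g_old)   # Fletcher-Reeves coefficient
    return -theta * g_new + beta * d_old

# Toy step data (illustrative): s = x_new - x_old, y = g_new - g_old.
g_old = np.array([2.0, 2.0])
g_new = np.array([4.0, 4.0])
s = np.array([1.0, 1.0])
d = spectral_cg_direction(g_new, g_old, s, -g_old)
```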

    Modelling and Using Response Times in Online Courses

    Each time a learner in a self-paced online course seeks to answer an assessment question, it takes some time for the student to read the question and arrive at an answer to submit. If multiple attempts are allowed, and the first answer is incorrect, it takes some time to provide a second answer. Here we study the distribution of such "response times." We find that the log-normal statistical model for such times, previously suggested in the literature, holds for online courses. Users who, according to this model, tend to take longer on submits are more likely to complete the course, have a higher level of engagement, and achieve a higher grade. This finding can be the basis for designing interventions in online courses, such as MOOCs, which would encourage "fast" users to slow down
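    Under the log-normal model, log response times are normally distributed, so the maximum-likelihood fit is just the sample mean and standard deviation of the log-times. A sketch on synthetic data (the parameter values are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "response times" from a log-normal model: log(t) ~ N(mu, sigma^2).
mu_true, sigma_true = 3.0, 0.8          # illustrative values, not from the paper
times = rng.lognormal(mean=mu_true, sigma=sigma_true, size=20_000)

# Maximum-likelihood fit of the log-normal: mean and std of the log-times.
log_t = np.log(times)
mu_hat, sigma_hat = log_t.mean(), log_t.std()

# A per-user "slowness" score could then be that user's mean log response
# time, the quantity the paper relates to completion, engagement, and grade.
```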

    A Conjugate Gradient Type Method for the Nonnegative Constraints Optimization Problems

    We consider nonnegative-constrained optimization problems. It is well known that conjugate gradient methods are efficient for solving large-scale unconstrained optimization problems due to their simplicity and low storage. Combining the modified Polak-Ribière-Polyak method proposed by Zhang, Zhou, and Li with the Zoutendijk feasible direction method, we propose a conjugate gradient type method for solving nonnegative-constrained optimization problems. If the current iterate is a feasible point, the direction generated by the proposed method is always a feasible descent direction at that iterate. Under appropriate conditions, we show that the proposed method is globally convergent. We also present numerical results to show the efficiency of the proposed method
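    The feasibility idea can be illustrated with the simplest case: at a point with x >= 0, a direction stays feasible for small steps only if components sitting on the boundary do not move negative. The sketch below projects the steepest-descent direction onto the feasible cone and computes the largest feasible step; this illustrates feasibility only and is not the paper's direction, which combines a modified PRP formula with the Zoutendijk method.

```python
import numpy as np

def feasible_descent_direction(x, g, eps=1e-12):
    """Project the steepest-descent direction -g onto the feasible cone of
    {x >= 0}: components at the boundary (x_i = 0) that would move negative
    are zeroed out.  Illustrative only; not the paper's hybrid direction."""
    d = -g
    d[(x <= eps) & (d < 0)] = 0.0
    return d

def max_feasible_step(x, d):
    """Largest alpha with x + alpha * d >= 0."""
    neg = d < 0
    if not np.any(neg):
        return np.inf
    return np.min(-x[neg] / d[neg])

x = np.array([0.0, 2.0, 1.0])          # feasible point, first coordinate active
g = np.array([1.0, -1.0, 2.0])         # gradient at x (illustrative values)
d = feasible_descent_direction(x, g)   # -> [0., 1., -2.]
alpha_max = max_feasible_step(x, d)    # -> 0.5
```

    A line search would then be restricted to step lengths in (0, alpha_max].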

    Generalized Forward Sufficient Dimension Reduction for Categorical and Ordinal Responses

    We present a forward sufficient dimension reduction method for categorical or ordinal responses by extending the outer product of gradients and the minimum average variance estimator to the multinomial generalized linear model. Previous work in this direction extends forward regression to binary responses and is applied in a pairwise manner to multinomial data, which is less efficient than our approach. Like other forward regression-based sufficient dimension reduction methods, our approach avoids the relatively stringent distributional requirements of inverse regression alternatives. We show consistency of the proposed estimator and derive its convergence rate. We develop an algorithm for our method based on repeated applications of available algorithms for forward regression, and propose a clustering-based tuning procedure to estimate the tuning parameters. The effectiveness of our estimator and algorithms is demonstrated via simulations and applications
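    The classical outer-product-of-gradients (OPG) estimator that this work extends fits a local linear regression around each sample point, averages the outer products of the fitted slopes, and takes the top eigenvectors as the dimension-reduction directions. A minimal continuous-response sketch of that building block (the bandwidth rule and test function are illustrative; the paper's multinomial-GLM extension is not reproduced here):

```python
import numpy as np

def opg_directions(X, y, d, h=None):
    """Classical OPG: local linear slopes, averaged as outer products,
    followed by an eigendecomposition.  Continuous-response building block
    only; the paper extends this idea to multinomial GLMs."""
    n, p = X.shape
    if h is None:
        h = n ** (-1.0 / (p + 4))          # rule-of-thumb bandwidth (assumption)
    M = np.zeros((p, p))
    for i in range(n):
        diff = X - X[i]
        w = np.exp(-np.sum(diff ** 2, axis=1) / (2 * h ** 2))  # Gaussian kernel
        Z = np.hstack([np.ones((n, 1)), diff])  # local linear design matrix
        W = Z * w[:, None]
        coef, *_ = np.linalg.lstsq(W.T @ Z, W.T @ y, rcond=None)
        grad = coef[1:]                         # local slope (gradient) estimate
        M += np.outer(grad, grad)
    vals, vecs = np.linalg.eigh(M / n)
    return vecs[:, -d:]                         # top-d eigenvectors

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)  # depends on X only through x1
B = opg_directions(X, y, d=1)
```

    Because the regression function depends on X only through its first coordinate, the estimated direction should align closely with the first coordinate axis.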