
    Comparison descent directions for Conjugate Gradient Method

    In this manuscript we begin with a theoretical analysis of the gradient method, one of the earliest descent methods, and from it identify the strengths of the conjugate gradient methods. Taking an objective function, we determine the values that optimize it by means of different methods, noting the geometric differences between them. Several systems are used as tests; in each case we obtain their solution and measure the speed at which they converge under the conjugate gradient methods proposed by Hestenes-Stiefel and Fletcher-Reeves.
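The Hestenes-Stiefel and Fletcher-Reeves methods differ only in how the parameter β, which mixes the previous search direction into the new one, is computed from successive gradients. A minimal sketch (not the manuscript's code) on a quadratic test system, where the exact line search step is available in closed form:

```python
import numpy as np

def cg_quadratic(A, b, x0, beta_rule="FR", tol=1e-10, max_iter=100):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with nonlinear-CG update rules.  Gradient: g = A x - b."""
    x = x0.astype(float)
    g = A @ x - b
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        Ad = A @ d
        alpha = -(g @ d) / (d @ Ad)        # exact line search for a quadratic
        x = x + alpha * d
        g_new = A @ x - b
        if beta_rule == "FR":              # Fletcher-Reeves
            beta = (g_new @ g_new) / (g @ g)
        else:                              # Hestenes-Stiefel
            y = g_new - g
            beta = (g_new @ y) / (d @ y)
        d = -g_new + beta * d
        g = g_new
    return x
```

For a strictly convex quadratic with exact line searches the two rules produce the same iterates; their behavior diverges on general nonlinear objectives, which is where comparisons such as the one in this manuscript become informative.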


    Job shop scheduling with artificial immune systems

    The job shop scheduling problem is complex due to its dynamic environment. When the information about jobs and machines is predefined and no unexpected events occur, the job shop is static. However, real scheduling environments are always dynamic, with constantly changing information and various uncertainties. This study addresses this complex job shop scheduling environment by applying artificial immune system (AIS) theory together with a switching strategy that, based on the system status, changes from a sequencing approach to a dispatching approach. AIS is a biologically inspired computational paradigm that simulates the mechanisms of the biological immune system. It therefore offers appealing features that distinguish it from other evolutionary intelligent algorithms, such as self-learning, long-lasting memory, cross-reactive response, discrimination of self from non-self, fault tolerance, and strong adaptability to the environment. These features are used in this study to solve the job shop scheduling problem. When the job shop environment is static, a sequencing approach based on the clonal selection and immune network theories of AIS is applied. This approach performs well, especially on small problems in terms of computation time, and its long-lasting memory is shown to accelerate the algorithm's convergence and reduce computation time. When unexpected events occasionally arrive and disrupt the static environment, an extended deterministic dendritic cell algorithm (DCA) based on the DCA theory of AIS is proposed to manage the rescheduling process, balancing the efficiency and stability of the system. When disturbances occur continuously, such as continuous job arrivals, the sequencing approach is switched to a dispatching approach that uses priority dispatching rules (PDRs).
The immune network theory of AIS is then applied to build an idiotypic network model of PDRs that governs the application of the various dispatching rules. Experiments show that the proposed network model adapts strongly to the dynamic job shop scheduling environment.
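The priority dispatching rules (PDRs) mentioned above can be illustrated with a minimal single-machine sketch. The rule names used here (SPT, EDD) are classic textbook examples and the job representation is a simplifying assumption, not the study's idiotypic network model:

```python
import heapq

def dispatch(jobs, rule="SPT"):
    """Sequence jobs on one machine as they arrive, picking the next job
    by a priority dispatching rule: SPT = shortest processing time,
    EDD = earliest due date.  A job is (arrival_time, proc_time, due_date)."""
    key = {"SPT": lambda j: j[1], "EDD": lambda j: j[2]}[rule]
    jobs = sorted(jobs)                     # order by arrival time
    t, i = 0.0, 0
    pending, order = [], []
    while i < len(jobs) or pending:
        # admit every job that has arrived by the current time t
        while i < len(jobs) and jobs[i][0] <= t:
            heapq.heappush(pending, (key(jobs[i]), jobs[i]))
            i += 1
        if not pending:                     # machine idle: jump to next arrival
            t = jobs[i][0]
            continue
        _, job = heapq.heappop(pending)     # best job under the chosen rule
        order.append(job)
        t += job[1]                         # process it to completion
    return order
```

Dispatching decides one job at a time using only currently available information, which is why it suits the continuously disturbed environment described above better than committing to a full sequence in advance.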

    A vision-based optical character recognition system for real-time identification of tractors in a port container terminal

    Automation has been seen as a promising way to increase the productivity of modern sea port container terminals. The potential increases in throughput and work efficiency, together with the reduction of labor cost, have led stakeholders to pursue automation across terminal operations. A container handling process that is readily amenable to automation is the deployment and control of gantry cranes in the container yard, where operations such as truck identification, loading and unloading of containers, and job management are still performed primarily by hand in a typical terminal. To support the overall automation of gantry crane operation, we devised an approach for the real-time identification of tractors through recognition of the number plates mounted on top of the tractor cabin. With this crucial piece of information, remote or automated yard operations can then be performed. A machine vision-based system is introduced in which these number plates are read and identified in real time while the tractors operate in the terminal. In this paper, we present the design and implementation of the system and highlight the major difficulties encountered, including the recognition of character information printed on the number plates under poor image integrity. Working solutions that address these problems are proposed and incorporated into the overall identification system.
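One common building block of optical character recognition under controlled conditions is nearest-template matching on binarized, fixed-width glyphs. This toy sketch only illustrates that general idea; the template shapes and cell layout here are hypothetical, and the paper's actual pipeline is not reproduced:

```python
import numpy as np

# Hypothetical 3x3 binary glyphs standing in for real character templates.
TEMPLATES = {
    "0": np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]]),
    "1": np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]]),
}

def read_plate(img, char_w=3):
    """Split a binarized plate image into fixed-width cells and label each
    cell with the template at minimal Hamming distance (fewest differing
    pixels)."""
    out = []
    for c in range(0, img.shape[1], char_w):
        cell = img[:, c:c + char_w]
        best = min(TEMPLATES, key=lambda k: int(np.sum(TEMPLATES[k] != cell)))
        out.append(best)
    return "".join(out)
```

Real systems must first handle the harder steps the paper discusses: locating the plate, correcting perspective, and coping with poor image integrity before any per-character matching is possible.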

    Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization

    In this paper we propose a fundamentally different conjugate gradient method, in which the well-known parameter βk is computed via an approximation of the Hessian/vector product through finite differences. For the search direction computation, the method uses a forward difference approximation to the Hessian/vector product combined with a careful choice of the finite difference interval. For the step length computation we suggest an acceleration scheme that improves the efficiency of the algorithm. Under common assumptions, the method is proved to be globally convergent. It is shown that for uniformly convex functions the convergence of the accelerated algorithm remains linear, but the reduction in function values is significantly improved. Numerical comparisons with conjugate gradient algorithms including CONMIN by Shanno and Phua [D.F. Shanno, K.H. Phua, Algorithm 500, minimization of unconstrained multivariate functions, ACM Trans. Math. Softw. 2 (1976) 87–94], SCALCG by Andrei [N. Andrei, Scaled conjugate gradient algorithms for unconstrained optimization, Comput. Optim. Appl. 38 (2007) 401–416; N. Andrei, Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Optim. Methods Softw. 22 (2007) 561–571; N. Andrei, A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Appl. Math. Lett. 20 (2007) 645–650], the new conjugacy condition and related new conjugate gradient methods of Li, Tang and Wei [G. Li, C. Tang, Z. Wei, New conjugacy condition and related new conjugate gradient methods for unconstrained optimization, J. Comput. Appl. Math. 202 (2007) 523–539], and the truncated Newton method TN by Nash [S.G. Nash, Preconditioning of truncated-Newton methods, SIAM J. Sci. Stat. Comput. 6 (1985) 599–616], using a set of 750 unconstrained optimization test problems, show that the suggested algorithm outperforms these conjugate gradient algorithms as well as TN.
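The forward-difference Hessian/vector approximation at the heart of the method replaces an exact product H(x)v with one extra gradient evaluation. A minimal sketch of that approximation; the interval heuristic below is a common choice scaled by machine precision, not necessarily the authors' exact formula:

```python
import numpy as np

def hess_vec(grad, x, v, delta=None):
    """Forward-difference approximation to the Hessian/vector product:
    H(x) v ≈ (∇f(x + δ v) − ∇f(x)) / δ.
    Requires v != 0.  `delta` defaults to a heuristic that balances
    truncation error against floating-point cancellation."""
    if delta is None:
        delta = (2.0 * np.sqrt(np.finfo(float).eps)
                 * (1.0 + np.linalg.norm(x)) / np.linalg.norm(v))
    return (grad(x + delta * v) - grad(x)) / delta
```

For a quadratic objective the gradient is linear, so the forward difference is exact up to rounding; on general functions the choice of δ governs the accuracy, which is why the paper emphasizes a careful choice of the finite difference interval.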