
    An integrated assignment, routing, and speed model for roadway mobility and transportation with environmental, efficiency, and service goals

    Managing all the mobility and transportation services with autonomous vehicles for users of a smart city requires determining the assignment of the vehicles to the users and their routing in conjunction with their speed. Such decisions must ensure low emissions, efficiency, and high service quality, while also considering the impact on traffic congestion caused by other vehicles in the transportation network. In this paper, we first propose an abstract trilevel multi-objective formulation architecture to model all vehicle routing problems with assignment, routing, and speed decision variables and conflicting objective functions. Such an architecture guides the development of subproblems, relaxations, and solution methods. We also propose a way of integrating the various urban transportation services by introducing a constraint on the speed variables that takes into account the traffic volume generated across the different services. Based on the formulation architecture, we introduce a (bilevel) problem where assignment and routing are at the upper level and speed is at the lower level. To address the challenge of dealing with routing problems on urban road networks, we develop an algorithm that alternates between the assignment-routing problem on an auxiliary complete graph and the speed optimization problem on the original non-complete graph. The computational experiments show the effectiveness of the proposed approach in determining approximate Pareto fronts among the conflicting objectives.
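
    To make the alternation concrete, here is a minimal Python sketch (not the authors' implementation): the upper level routes each request by shortest travel time at the current speeds, standing in for assignment-routing on the auxiliary complete graph, while the lower level updates speeds through a hypothetical congestion rule driven by the traffic volume each edge carries. The toy network, the requests, and the speed rule are all illustrative assumptions.

```python
import networkx as nx
from collections import Counter

# Toy road network: edge lengths in km, initial speeds in km/h (assumed values).
road = nx.Graph()
for u, v, km in [(0, 1, 2.0), (1, 2, 3.0), (0, 2, 6.0), (2, 3, 1.5)]:
    road.add_edge(u, v, km=km, speed=30.0)

requests = [(0, 3), (1, 3)]  # (origin, destination) trips to route

for it in range(5):  # alternate the upper (routing) and lower (speed) levels
    for u, v, d in road.edges(data=True):
        d["time"] = d["km"] / d["speed"]
    # Upper level: shortest-time routing at the current speeds.
    routes = [nx.shortest_path(road, o, t, weight="time") for o, t in requests]
    # Lower level: a hypothetical congestion rule -- edges slow down with the
    # traffic volume routed on them, which may trigger rerouting next pass.
    load = Counter(tuple(sorted(e)) for r in routes for e in zip(r, r[1:]))
    for u, v, d in road.edges(data=True):
        d["speed"] = 30.0 / (1 + 0.5 * load.get(tuple(sorted((u, v))), 0))
    print(it, routes)
```

    In the paper the lower level is a genuine speed optimization with emission and service objectives; the rule above merely shows how the two levels exchange information across iterations.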

    A simheuristic algorithm for solving an integrated resource allocation and scheduling problem

    Modern companies have to face challenging configuration issues in their manufacturing chains. One of these challenges is related to the integrated allocation and scheduling of resources such as machines, workers, and energy. These integrated optimization problems are difficult to solve, but they can be even more challenging when real-life uncertainty is considered. In this paper, we study an integrated allocation and scheduling optimization problem with stochastic processing times. A simheuristic algorithm is proposed in order to effectively solve this integrated and stochastic problem. Our approach relies on the hybridization of simulation with a metaheuristic to deal with the stochastic version of the allocation-scheduling problem. A series of numerical experiments illustrates the efficiency of our methodology as well as its potential applications in real-life enterprise settings.
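
    A self-contained sketch of the simheuristic idea follows (hedged: a generic simulated-annealing metaheuristic whose objective is estimated by Monte Carlo simulation; the lognormal processing times, the move operator, and all parameters are illustrative assumptions, not the paper's algorithm).

```python
import math
import random

def simulate_makespan(assignment, means, n_machines, reps=50):
    """Monte Carlo estimate of the expected makespan under lognormal times."""
    total = 0.0
    for _ in range(reps):
        loads = [0.0] * n_machines
        for job, machine in enumerate(assignment):
            loads[machine] += random.lognormvariate(math.log(means[job]), 0.25)
        total += max(loads)
    return total / reps

def simheuristic(means, n_machines, iters=500, temp=1.0, cooling=0.995):
    """Simulated annealing over machine assignments, scored by simulation."""
    n = len(means)
    cur = [random.randrange(n_machines) for _ in range(n)]
    cur_cost = simulate_makespan(cur, means, n_machines)
    best, best_cost = cur[:], cur_cost
    for _ in range(iters):
        cand = cur[:]
        cand[random.randrange(n)] = random.randrange(n_machines)  # move one job
        cost = simulate_makespan(cand, means, n_machines)
        # Accept improvements always, worsenings with a temperature-driven chance.
        if cost < cur_cost or random.random() < math.exp((cur_cost - cost) / temp):
            cur, cur_cost = cand, cost
            if cost < best_cost:
                best, best_cost = cand[:], cost
        temp *= cooling
    return best, best_cost

random.seed(0)
print(simheuristic(means=[3, 5, 2, 7, 4, 6], n_machines=2))
```

    The key design point is the hybridization itself: the metaheuristic proposes candidate solutions, while the simulation layer scores them under uncertainty rather than with a deterministic objective.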

    Derivative-free methods for mixed-integer nonsmooth constrained optimization

    In this paper, we consider mixed-integer nonsmooth constrained optimization problems whose objective/constraint functions are available only as the output of a black-box zeroth-order oracle (i.e., an oracle that does not provide derivative information), and we propose a new derivative-free linesearch-based algorithmic framework to suitably handle such problems. We first describe a scheme for bound-constrained problems that combines a dense sequence of directions (to handle the nonsmoothness of the objective function) with primitive directions (to handle discrete variables). Then, we embed an exact penalty approach in the scheme to suitably manage nonlinear (possibly nonsmooth) constraints. We analyze the global convergence properties of the proposed algorithms toward stationary points and report the results of extensive numerical experiments on a set of mixed-integer test problems.
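
    As a rough illustration of the linesearch-plus-primitive-directions idea, the toy coordinate search below probes continuous variables with an expanding derivative-free stepsize and integer variables with unit steps. It is a simplified sketch under strong assumptions: it omits the dense direction sequence and the exact penalty handling of constraints described above.

```python
def df_linesearch(f, x, is_int, alpha=1.0, tol=1e-6, max_iters=200):
    """Toy mixed-variable coordinate search: continuous coordinates use an
    expanding derivative-free linesearch, integer coordinates use unit steps."""
    x = list(x)
    fx = f(x)
    for _ in range(max_iters):
        improved = False
        for i in range(len(x)):
            step = 1 if is_int[i] else alpha
            for sgn in (+1, -1):
                y = x[:]
                y[i] += sgn * step
                if f(y) < fx:
                    while not is_int[i]:          # expansion along the direction
                        z = y[:]
                        z[i] += sgn * step
                        if f(z) >= f(y):
                            break
                        y = z
                    x, fx, improved = y, f(y), True
                    break
        if not improved:
            if alpha <= tol:                      # continuous stepsize exhausted
                break
            alpha /= 2.0
    return x, fx

# Example: a nonsmooth objective with one continuous and one integer variable.
f = lambda v: abs(v[0] - 1.3) + (v[1] - 3) ** 2
print(df_linesearch(f, x=[0.0, 0], is_int=[False, True]))
```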

    On the Selection of Common Factors for Macroeconomic Forecasting

    We address the problem of selecting the common factors that are relevant for forecasting macroeconomic variables. In economic forecasting using diffusion indexes, the factors are ordered according to their importance, in terms of relative variability, and are the same for each variable to predict, i.e., the process of selecting the factors is not supervised by the predictand. We propose a simple and operational supervised method, based on selecting the factors according to their significance in the regression of the predictand on the predictors. Given a potentially large number of predictors, we consider linear transformations obtained by principal components analysis. The orthogonality of the components implies that the standard t-statistics for the inclusion of a particular component are independent, and thus applying a selection procedure that takes into account the multiplicity of the hypothesis tests is both correct and computationally feasible. We focus on three main multiple testing procedures: Holm's sequential method, controlling the family-wise error rate; the Benjamini-Hochberg method, controlling the false discovery rate; and a procedure for incorporating prior information on the ordering of the components, based on weighting the p-values according to the eigenvalues associated with the components. We compare the empirical performance of these methods with the classical diffusion index (DI) approach proposed by Stock and Watson, conducting a pseudo-real-time forecasting exercise that assesses the predictions of 8 macroeconomic variables using factors extracted from a U.S. dataset consisting of 121 quarterly time series. The overall conclusion is that nature is tricky, but essentially benign: the information that is relevant for prediction is effectively condensed by the first few factors. However, variable selection, leading to the exclusion of some of the low-order principal components, can yield a sizable improvement in forecasting in specific cases. Only in one instance, real personal income, were we able to detect a significant contribution from high-order components.
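
    A hedged sketch of the supervised selection step on synthetic data: regress the predictand on all principal components, then keep the components whose t-test p-values survive a Benjamini-Hochberg screen at level q. The synthetic data, the level q = 0.10, and the OLS details are illustrative assumptions, not the paper's code.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))                     # synthetic predictors
y = X @ (rng.standard_normal(30) * 0.1) + rng.standard_normal(200)
y = y - y.mean()

# Principal components of the standardized predictors.
Xc = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
F = Xc @ Vt.T                                          # component scores

# OLS of y on all components; orthogonal columns give independent t-statistics.
beta, *_ = np.linalg.lstsq(F, y, rcond=None)
dof = len(y) - F.shape[1]
sigma2 = ((y - F @ beta) ** 2).sum() / dof
se = np.sqrt(sigma2 / (F ** 2).sum(axis=0))
pvals = 2 * stats.t.sf(np.abs(beta / se), dof)

# Benjamini-Hochberg: reject the k smallest p-values, where k is the largest
# index with p_(k) <= k * q / m.
q, m = 0.10, len(pvals)
order = np.argsort(pvals)
passed = pvals[order] <= q * np.arange(1, m + 1) / m
k = passed.nonzero()[0].max() + 1 if passed.any() else 0
print("selected components:", np.sort(order[:k]))
```

    Because the component scores are orthogonal, the t-statistics can be screened jointly without worrying about correlation between tests, which is what makes the multiple-testing step both valid and cheap.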

    New methods for simulation-based optimization and applications to emergency department management

    The development of novel efficient algorithmic frameworks using simulation to provide solutions to real-world problems is prompted by the need to accurately represent the complex and uncertain processes of real systems, such as Emergency Departments (EDs). The resulting Simulation-Based Optimization (SBO) methodology has been receiving increasing attention in recent years, aiming to develop algorithms that do not require first-order information and support both continuous and integer variables. The trade-off between long-term goals and short-term decisions, as well as the computational cost of evaluating the black-box functions involved, determines whether to use exact Derivative-Free Optimization (DFO) algorithms, which provide optimal solutions at the cost of long running times, or metaheuristic methods, which return solutions quickly but without optimality guarantees. Important SBO problems arise in ED management, where there is strong interest in studying the impact of both the overcrowding phenomenon and sudden peaks in patient arrivals on everyday operations. To this end, further SBO approaches may be required to estimate the ED arrival rate and to recover missing information from the real datasets in order to build Discrete Event Simulation (DES) models with a high level of reliability. In this thesis, SBO is used with a twofold goal: on the one hand, to propose methodological contributions from an algorithmic point of view, namely a metaheuristic-based algorithm to solve a specific SBO problem and a globally convergent DFO method for mixed-integer nonsmooth constrained optimization problems, which frequently arise in practice; on the other hand, to develop SBO approaches that improve the accuracy of a DES model representing an ED. In particular, an integer nonlinear black-box optimization problem is solved to determine the best piecewise-constant approximation of the time-varying arrival rate function by finding the optimal partition of the 24 hours into a suitable number of unequally spaced intervals. Black-box constraints are adopted to ensure the validity of the Nonhomogeneous Poisson process, which is commonly used in the literature to model the ED arrival process. Moreover, a model calibration procedure is proposed to estimate the incomplete information in the ED patient flow by minimizing the deviation between the real data and the simulation output. The resulting DES model is used for solving a simulation-based resource allocation problem to determine the optimal settings of the ED unit devoted to low-complexity patients, with the objective of reducing the overcrowding level without using an excessive amount of resources. Two real case studies are considered to demonstrate the effectiveness of the proposed methodology.
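
    As a toy illustration of the arrival-rate subproblem, the sketch below scores a candidate partition of the day by its nonhomogeneous Poisson log-likelihood with per-interval maximum-likelihood rates, and improves the breakpoints by a naive local search. The synthetic arrivals, the number of intervals, and the search procedure are assumptions; the thesis treats this as a black-box integer problem with validity constraints.

```python
import math
import random

def nhpp_loglik(arrivals, breaks):
    """NHPP log-likelihood of a piecewise-constant rate with MLE rates
    (count / interval length) on the partition induced by `breaks`."""
    edges = [0.0] + list(breaks) + [24.0]
    ll = 0.0
    for a, b in zip(edges, edges[1:]):
        n = sum(a <= t < b for t in arrivals)
        if n:                                   # at the MLE, lam * (b - a) = n
            ll += n * math.log(n / (b - a)) - n
    return ll

random.seed(1)
# Synthetic arrivals over [0, 24): density increasing through the day.
arrivals = [random.uniform(0, 24) ** 0.7 * 24 ** 0.3 for _ in range(300)]

breaks = [6.0, 12.0, 18.0]                      # start from equally spaced cuts
best = nhpp_loglik(arrivals, breaks)
for _ in range(200):                            # naive local search on breakpoints
    i = random.randrange(len(breaks))
    cand = sorted(breaks[:i] + [breaks[i] + random.uniform(-1, 1)] + breaks[i + 1:])
    ll = nhpp_loglik(arrivals, cand)
    if 0 < cand[0] and cand[-1] < 24 and ll > best:
        breaks, best = cand, ll
print("breakpoints:", [round(b, 2) for b in breaks], "loglik:", round(best, 1))
```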

    Unveiling Fermi unidentified sources with machine learning

    The study of astrophysical sources of gamma rays can be very useful for analyzing the behavior of matter in extreme conditions, which are difficult to reproduce in the laboratory. The Fermi telescope is currently the most sensitive gamma-ray telescope in orbit. Its technological improvements have led to a large increase in the number of detected sources. However, in many cases these sources are still unidentified, i.e., their nature is unknown. The identification process requires long and expensive observation campaigns, which can however be accelerated using new data analysis techniques such as Machine Learning. In this thesis we therefore analyze the potential and limits of various Machine Learning techniques for the classification of the unidentified sources of the Fermi telescope.
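
    A generic, hedged illustration of the classification task: a random forest trained on catalog-style features to separate two source classes and then used to score unidentified sources probabilistically. The feature names (spectral index, curvature, variability) and all data here are synthetic stand-ins, not Fermi catalog data, and the thesis compares several techniques rather than this single one.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 400
# Hypothetical features per source: spectral index, curvature, variability.
pulsars = np.column_stack([rng.normal(1.4, 0.3, n), rng.normal(0.6, 0.2, n),
                           rng.normal(1.0, 0.5, n)])
blazars = np.column_stack([rng.normal(2.2, 0.3, n), rng.normal(0.1, 0.2, n),
                           rng.normal(3.0, 1.0, n)])
X = np.vstack([pulsars, blazars])
y = np.array([0] * n + [1] * n)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))

# Probabilistic predictions for (synthetic) unidentified sources.
clf.fit(X, y)
unid = rng.normal([1.8, 0.3, 2.0], [0.5, 0.3, 1.0], size=(5, 3))
print("P(blazar-like):", clf.predict_proba(unid)[:, 1].round(2))
```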

    The United Nations for industrial development: notes on the scenario and details of a case study

    This contribution examines the role of the United Nations in the dynamics of industrial development.