16 research outputs found

    Análisis del alto flujo vehicular para una vía de acceso a Medellín usando simulación basada en agentes

    Get PDF
    In Medellín, the road that crosses Robledo Park plays an important role as an access route to several neighborhoods in the west of the city and to municipalities of Antioquia's western subregion. Hundreds of vehicles travel along it every day, and since it is effectively the only access road, traffic becomes slow at certain times of day. Although alternative routes have been sought, no satisfactory option has been found, as the only alternative is equally or more congested. This paper analyses the road-congestion problem on the Robledo Park road in the east-west direction using agent-based simulation. To build the simulation model, individual drivers are characterized by relevant traits; the effect of increasing road capacity by widening the road is then evaluated, with the corresponding validation of the results obtained for this alternative. The results show that widening the road eases and improves mobility conditions, increasing travel speed by up to 20% and doubling the vehicular capacity of the route at peak hours
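    As a hedged illustration of the kind of agent-based traffic model this abstract describes, the sketch below implements the classic Nagel-Schreckenberg cellular-automaton rules for a single-lane ring road; the road length, vehicle count and braking probability are illustrative assumptions, not the authors' calibrated values.

```python
import random

# Illustrative single-lane ring-road model using the Nagel-Schreckenberg rules
# (an assumption for this sketch, not the authors' calibrated model).
ROAD_CELLS = 200   # assumed road length in cells
N_VEHICLES = 60    # assumed demand level
V_MAX = 5          # maximum speed, cells per time step
P_SLOW = 0.3       # random braking probability

positions = sorted(random.sample(range(ROAD_CELLS), N_VEHICLES))
speeds = [0] * N_VEHICLES

def step(positions, speeds):
    n = len(positions)
    new_speeds = []
    for i in range(n):
        gap = (positions[(i + 1) % n] - positions[i] - 1) % ROAD_CELLS
        v = min(speeds[i] + 1, V_MAX)          # accelerate towards the speed limit
        v = min(v, gap)                        # do not run into the car ahead
        if v > 0 and random.random() < P_SLOW:
            v -= 1                             # random slowdown (driver behaviour)
        new_speeds.append(v)
    new_positions = [(p + v) % ROAD_CELLS for p, v in zip(positions, new_speeds)]
    order = sorted(range(n), key=lambda i: new_positions[i])
    return [new_positions[i] for i in order], [new_speeds[i] for i in order]

for _ in range(500):
    positions, speeds = step(positions, speeds)

print("mean speed after 500 steps:", sum(speeds) / len(speeds))
```

    Raising the effective capacity (more cells or lanes) while holding demand fixed raises the mean speed in such a model, which is the qualitative effect the abstract reports for the widened road.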

    Pareto Optimal Solutions for Stochastic Dynamic Programming Problems via Monte Carlo Simulation

    Get PDF
    A heuristic algorithm is proposed for a class of stochastic discrete-time continuous-variable dynamic programming problems subject to non-Gaussian disturbances. Instead of using the expected value of the objective function, the random nature of the decision variables is retained throughout the process, and Pareto fronts weighted by all quantiles of the objective function are determined. Decision makers are thus able to choose whichever quantile they wish. This idea is carried out using Monte Carlo simulations embedded in an approximate algorithm proposed for deterministic dynamic programming problems. The new method is tested on instances of the classical inventory control problem. The results obtained attest to the efficiency and efficacy of the algorithm in solving these important stochastic optimization problems
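    A minimal sketch of the quantile idea on an inventory example, assuming an order-up-to policy with lost sales, made-up cost parameters and exponentially distributed (non-Gaussian) demand; it is not the paper's algorithm, only an illustration of reporting cost quantiles instead of expected cost.

```python
import random

# Illustrative sketch, not the paper's algorithm: evaluate order-up-to inventory
# policies by Monte Carlo and report cost quantiles, so that a decision maker can
# compare policies at any chosen quantile instead of only at the expected cost.
HOLDING_COST = 1.0    # assumed cost per unit left in stock per period
SHORTAGE_COST = 5.0   # assumed penalty per unit of unmet demand
PERIODS = 12
N_RUNS = 2000

def simulate_cost(order_up_to):
    total = 0.0
    for _ in range(PERIODS):
        demand = random.expovariate(1 / 20.0)     # non-Gaussian demand (assumed)
        leftover = order_up_to - demand           # stock is replenished each period
        if leftover >= 0:
            total += HOLDING_COST * leftover
        else:
            total += SHORTAGE_COST * (-leftover)
    return total

for level in (20, 30, 40):
    costs = sorted(simulate_cost(level) for _ in range(N_RUNS))
    quantile = lambda p: costs[int(p * (N_RUNS - 1))]
    print(f"order-up-to {level}: median cost {quantile(0.5):.1f}, "
          f"90% quantile {quantile(0.9):.1f}")
```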

    Automating warm-up length estimation

    Get PDF
    There are two key issues in assuring the accuracy of estimates of performance obtained from a simulation model. The first is the removal of any initialisation bias; the second is ensuring that enough output data are produced to obtain an accurate estimate of performance. This paper is concerned with the first issue, and more specifically with warm-up estimation. Our aim is to produce an automated procedure, suitable for inclusion in commercial simulation software, for estimating the length of the warm-up period and hence removing initialisation bias from simulation output data. The paper describes the extensive literature search carried out to find and assess the various existing warm-up methods, and the process of short-listing and testing candidate methods. In particular it details the extensive testing of the MSER-5 warm-up method. © 2010 Operational Research Society Ltd. All rights reserved
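    A minimal sketch of the MSER-5 heuristic mentioned above, assuming a single replication of output data, batches of five observations and the usual restriction of the truncation point to the first half of the batched series.

```python
import numpy as np

def mser5_warmup(series, batch_size=5):
    """Estimate warm-up length with the MSER-5 heuristic (sketch).

    The series is averaged into non-overlapping batches of `batch_size`, then
    the truncation point d is chosen to minimise the half-width statistic
    MSER(d) = s_d^2 / (n - d), where the variance and mean use only the
    retained batches; d is restricted to the first half of the batched series.
    """
    x = np.asarray(series, dtype=float)
    n_batches = len(x) // batch_size
    batches = x[: n_batches * batch_size].reshape(n_batches, batch_size).mean(axis=1)

    best_d, best_stat = 0, float("inf")
    for d in range(n_batches // 2):
        tail = batches[d:]
        stat = tail.var(ddof=0) / len(tail)
        if stat < best_stat:
            best_d, best_stat = d, stat
    return best_d * batch_size   # warm-up length in original observations

# Example: an initial bias decaying towards a steady state around 10.
rng = np.random.default_rng(1)
data = 10 + 5 * np.exp(-np.arange(1000) / 50) + rng.normal(0, 1, 1000)
print("suggested warm-up length:", mser5_warmup(data))
```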

    Measures of Similarity Between Objects Based on Qualitative Shape Descriptions

    Get PDF
    A computational approach for comparing qualitative shape descriptions (QSDs) of objects within digital images is presented. First, the dissimilarity of qualitative features of shape is measured: (i) intuitively using conceptual neighbourhood diagrams; and (ii) mathematically using interval distances. Then, a similarity measure between QSDs is defined and tested using images of different categories of the MPEG-7-CE-Shape-1 library, images of tiles used to build mosaics, and a collection of Clipart images. The results obtained show the effectiveness of the similarity measure defined, which is invariant to translations, rotations and scaling, and which implicitly manages deformation of shape parts and incompleteness
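    A hedged sketch of the interval-distance idea: qualitative labels for one shape feature (here a hypothetical angle vocabulary and interval mapping, not the paper's definitions) are compared through an interval distance, and a similarity between two descriptions is the complement of the mean normalised dissimilarity.

```python
# Hypothetical mapping of a qualitative "angle" feature to intervals in degrees
# (an assumed encoding for illustration, not the paper's exact QSD model).
ANGLE_INTERVALS = {
    "very_acute": (0, 40),
    "acute": (40, 85),
    "right": (85, 95),
    "obtuse": (95, 140),
    "very_obtuse": (140, 180),
}

def interval_distance(a, b):
    """Distance between two closed intervals: 0 if they overlap,
    otherwise the gap between the closest endpoints."""
    (a_lo, a_hi), (b_lo, b_hi) = a, b
    if a_hi < b_lo:
        return b_lo - a_hi
    if b_hi < a_lo:
        return a_lo - b_hi
    return 0.0

def angle_dissimilarity(label_a, label_b):
    d = interval_distance(ANGLE_INTERVALS[label_a], ANGLE_INTERVALS[label_b])
    return d / 180.0          # normalise by the full angular range

def similarity(desc_a, desc_b):
    """Similarity between two descriptions given as equal-length lists of labels."""
    diffs = [angle_dissimilarity(a, b) for a, b in zip(desc_a, desc_b)]
    return 1.0 - sum(diffs) / len(diffs)

print(similarity(["acute", "right", "obtuse"], ["acute", "obtuse", "obtuse"]))
```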

    A statistical process control approach to selecting a warm-up period for a discrete-event simulation

    No full text
    The selection of a warm-up period for a discrete-event simulation continues to be problematic. A variety of selection methods have been devised, and are briefly reviewed. It is apparent that no one method can be recommended above any other. A new approach, based upon the principles of statistical process control, is described (SPC method). Because simulation output data are often highly autocorrelated and potentially non-normal, the batch means method is employed in constructing the control chart. The SPC method is tested on seven data sets and encouraging results are obtained concerning its accuracy. The approach is also discussed with respect to its ease of implementation, simplicity, generality of use and requirements for parameter estimation
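    A sketch in the spirit of the SPC approach, under stated assumptions (batch means of fixed size, 3-sigma limits estimated from the second half of the run, warm-up ending after the last out-of-control early batch); this is not the paper's exact procedure.

```python
import numpy as np

def spc_warmup(series, batch_size=25):
    """Warm-up estimate via batch means and a control chart (illustrative sketch)."""
    x = np.asarray(series, dtype=float)
    n_batches = len(x) // batch_size
    means = x[: n_batches * batch_size].reshape(n_batches, batch_size).mean(axis=1)

    # Centre line and 3-sigma limits from the second half, assumed in control.
    ref = means[n_batches // 2:]
    centre, sigma = ref.mean(), ref.std(ddof=1)
    ucl, lcl = centre + 3 * sigma, centre - 3 * sigma

    # Warm-up ends after the last early batch mean falling outside the limits.
    out = [i for i, m in enumerate(means[: n_batches // 2]) if not (lcl <= m <= ucl)]
    last_out = (out[-1] + 1) if out else 0
    return last_out * batch_size

rng = np.random.default_rng(7)
data = 50 + 30 * np.exp(-np.arange(2000) / 150) + rng.normal(0, 5, 2000)
print("estimated warm-up length:", spc_warmup(data))
```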

    Statistical Process Control for Depropanizer Column at Petronas Gas Berhad, Kerteh

    Get PDF
    The final report gives the details of Final Year Project II, "Statistical Process Control for Depropanizer Column at Petronas Gas Berhad (PGB), Kerteh". The report is divided into five main chapters: Introduction, Literature Review, Methodology, Results & Discussions, and Conclusion. Statistical Process Control (SPC) is in high demand in industry, as it provides a way to monitor process behaviour and enables analysis of the variations in the process that may affect the quality of the end product. In this project, the focus is on the depropanizer column at Petronas Gas Berhad (PGB), Kerteh. The short-term targets are to apply SPC to the stated column, to measure and analyse the variation in the processes, to monitor the consistency of the processes used to manufacture the product as designed, and to suggest the best way of controlling all the variables at the column; the long-term target is to implement the results in industry. Several inputs have to be considered to complete the project: (a) calculated data surrounding the depropanizer column, including all tag names, (b) tag names and descriptions, (c) a description of the depropanizer column, and (d) a flow sheet for the column showing all the tag names surrounding it. Two main tools are required to execute the project: (a) SPSS software and (b) LIMS, the Laboratory Information Management System at PGB, Kerteh. An early analysis of all the variables was carried out in Microsoft Excel; the discussion concerns the input variables affecting the output variable, which in this case is the C3 composition, and some problems were identified while analysing the data in Excel. The analysis in SPSS includes descriptive statistics, histograms, correlations, scatter plots, crosstabs, one-way ANOVA, paired t-tests and linear regression. The analysis of the SPSS results focuses on the critical component of the overhead product composition, C3, and the input data most likely to affect it: (i) reflux flow (4FC6203.PV), (ii) energy input to the column (4TI6231.PV), and (iii) feed conditions (4FY62022.PV, 4TI6009.PV). It is shown that these four main input variables have a very strong relationship with the C3 composition of the overhead product and that they come from a common population mean; increasing or decreasing their values has a great impact on the C3 composition. Since the samples come from a common population mean, an optimum operating condition can be derived from the average data in order to maintain the C3 composition at the desired value (98.48 mole%): reflux flow (4FC6203.PV) 112.96 m³/hr; reboiler temperature / energy input to the column (4TI6231.PV) 116.03 °C; feed flowrate (4FY62022.PV) 143.36 m³/hr; feed temperature (4TI6009.PV) 95.81 °C. These suggested optimum operating conditions must, however, be checked again so that they do not violate the design operating conditions. This project will not only improve the existing process control of the column but also improve the quality of the end product and reduce the cost of operating the column
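    As a hedged illustration of SPC monitoring applied to a composition variable such as the overhead C3 content, the sketch below computes a standard individuals/moving-range (I-MR) control chart; the sample values are made up and the chart constants (2.66 and 3.267) are the usual I-MR factors, not plant data.

```python
import numpy as np

# Illustrative I-MR control chart for a monitored composition variable.
# The measurements below are invented, not PGB plant data.
c3 = np.array([98.5, 98.4, 98.6, 98.3, 98.5, 98.7, 98.4, 98.2, 98.5, 98.6])

moving_range = np.abs(np.diff(c3))   # ranges between consecutive observations
mr_bar = moving_range.mean()
centre = c3.mean()

ucl_x = centre + 2.66 * mr_bar       # individuals chart limits (3/d2 factor)
lcl_x = centre - 2.66 * mr_bar
ucl_mr = 3.267 * mr_bar              # moving-range chart upper limit (D4 factor)

print(f"X chart:  CL={centre:.2f}  UCL={ucl_x:.2f}  LCL={lcl_x:.2f}")
print(f"MR chart: CL={mr_bar:.3f}  UCL={ucl_mr:.3f}")
print("out-of-control points:",
      [i for i, v in enumerate(c3) if not (lcl_x <= v <= ucl_x)])
```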

    Análise de uma metodologia de apoio à decisão na seleção de parâmetros de simulação

    Get PDF
    With the advancement of technology, computer simulation is a tool increasingly used by companies and even at a personal level. Simulation makes it possible, within a few minutes, to get a perspective on how a model, a production line or a network will behave in real terms, and to simulate days, months or years of a virtual idea in a short time. For a simulation to produce accurate results, three important parameters must be considered before the final runs; setting them correctly makes the results more accurate and gives the simulation a more positive impact on the real system. These three parameters are the warm-up period, the number of replications and the run length. This dissertation therefore develops a broad literature review of the existing methods for estimating these three parameters and proposes a methodology for their estimation, intended to be intuitive and easy to implement in any simulation study
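    A minimal sketch of one common rule for the number-of-replications parameter discussed above: add replications until the confidence-interval half-width falls within a target relative precision of the mean. The replication function is a placeholder returning a noisy performance value, and a normal critical value is used instead of Student's t for simplicity.

```python
import random
import statistics
from statistics import NormalDist

TARGET_PRECISION = 0.05   # half-width no more than 5% of the mean (assumed target)
CONFIDENCE = 0.95
z = NormalDist().inv_cdf(1 - (1 - CONFIDENCE) / 2)   # normal approximation

def run_replication():
    # Placeholder for one simulation replication returning a performance measure.
    return random.gauss(100, 15)

results = [run_replication() for _ in range(5)]      # small initial sample
while True:
    mean = statistics.fmean(results)
    half_width = z * statistics.stdev(results) / len(results) ** 0.5
    if half_width / mean <= TARGET_PRECISION:
        break
    results.append(run_replication())

print(f"replications needed: {len(results)}  mean={mean:.1f}  half-width={half_width:.1f}")
```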

    Veri birleştirmeye dayalı parçacık filtreleme ile gerçek zamanlı hareket izleme

    Get PDF
    (Full-text access granted in accordance with the law published in the Official Gazette of 06.03.2018, no. 30352, and the directive of 18.06.2018 on the electronic collection, organisation and accessibility of graduate theses.) Today there are many systems serving people in fields such as transportation, communication and entertainment, as well as scientific research platforms, automation systems and military technologies that countries acquire on the way to becoming globally deterrent powers, together with the information technologies surrounding all of these systems. The technical problems that arise while developing such systems usually have to be solved in real time, or close to it, with an acceptable error margin. Indeed, these systems can be regarded as living systems: depending on their operating environment, they constantly interact with the outside world through software built on the principles and algorithms that developers establish as a result of comprehensive studies. Such systems include mechanisms that turn the data collected from the outside by a wide variety of sensors into meaningful information and put it to appropriate use. Changing needs and conditions, expectations of higher living standards and the inquisitive nature of human beings require new and more capable systems and services to be introduced with each passing day. Hence, modern systems equipped with better hardware and sensors and supported by intelligent software matter today more than ever. From the developers' point of view, the most important element in the operating cycle of such systems is a subsystem that provides the information describing the state of the system at any moment and thus helps the appropriate action to be taken. In many cases, however, this information cannot be obtained without restrictions. In this context, state quantities that cannot be measured directly are estimated from measurements made with noisy sensors; such applications rely on methods from the fields of estimation and filtering. Real-time motion tracking, the main subject of this work, is a sub-problem that must be solved for systems developed in many areas, such as scientific research, military technology and entertainment. In recent years, researchers have concentrated on probabilistic filtering in general, and particle filtering in particular, to solve this problem. In this study, a particle-filtering application based on fusing data from different sources is used, under certain assumptions, to improve the real-time motion-tracking process carried out by the built-in algorithms of a device equipped with an RGB-D sensor
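    A minimal bootstrap particle-filter sketch with two fused measurement sources, as a hedged stand-in for the sensor-fusion idea; the 1-D random-walk motion model, noise levels and product-of-likelihoods fusion are illustrative assumptions, not the thesis's RGB-D pipeline.

```python
import numpy as np

# Bootstrap particle filter tracking a 1-D position from two noisy sensors
# (illustrative assumptions throughout, not the thesis's setup).
rng = np.random.default_rng(0)
N_PARTICLES = 500
PROCESS_STD = 0.2          # random-walk motion noise
SENSOR_STD = (0.5, 1.0)    # noise levels of the two fused sensors

def gaussian_likelihood(z, x, std):
    return np.exp(-0.5 * ((z - x) / std) ** 2)

particles = rng.normal(0.0, 1.0, N_PARTICLES)
true_pos = 0.0
for t in range(50):
    true_pos += 0.1                                    # object drifts to the right
    z1 = true_pos + rng.normal(0, SENSOR_STD[0])       # measurement from sensor 1
    z2 = true_pos + rng.normal(0, SENSOR_STD[1])       # measurement from sensor 2

    particles += rng.normal(0, PROCESS_STD, N_PARTICLES)           # predict
    weights = (gaussian_likelihood(z1, particles, SENSOR_STD[0])   # fuse both sensors
               * gaussian_likelihood(z2, particles, SENSOR_STD[1]))
    weights /= weights.sum()

    estimate = np.sum(weights * particles)                         # weighted-mean state
    idx = rng.choice(N_PARTICLES, size=N_PARTICLES, p=weights)     # resample
    particles = particles[idx]

print(f"true position {true_pos:.2f}, filter estimate {estimate:.2f}")
```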

    Demand and Capacity Modelling of Acute Services Using Simulation and Optimization Techniques

    Get PDF
    The difficulty hospital management have experienced over the past decade in balancing demand and capacity needs has reached an unprecedented level in the UK. Due to a shortage of capacity, hospitals are unable to treat patients; in some cases patients are transferred to other hospitals, outpatient referrals are delayed, and accident and emergency (A&E) waiting times are prolonged. It is therefore time to do things differently, because the status quo is not an option. A whole-hospital decision support system (DSS) was developed to assess and respond to the needs of local populations. The model integrates every component of a hospital (including A&E and all outpatient and inpatient specialties) to aid efficient and effective use of scarce resources; an individual service or specialty cannot be assumed to be independent, as they are all interconnected. It is clear from the literature that a generic hospital simulation model of this scope has never been developed before, so this is an innovative DSS. Using Hospital Episode Statistics and local datasets, 768 forecasting models for the 28 outpatient and inpatient specialties were developed to capture demand. A variety of forecasting models (ARIMA, exponential smoothing, stepwise linear regression and STLF) were built for each outpatient and inpatient specialty, including the A&E department. The best forecasting methods and periods were selected by comparing the 4 forecasting methods and 3 periods (daily, weekly and monthly) according to forecast accuracy measured by the mean absolute scaled error (MASE). Demand forecasts were then used as input to the simulation model for the entire hospital (all specialties). The generic hospital simulation model takes into account all specialties and the interactions amongst A&E, outpatient and inpatient specialties. Six hundred observed frequency distributions, all based on age groups, were established for the simulation model. Using further inputs (e.g. financial data and numbers of follow-ups), the hospital was modelled to measure key output metrics for strategic planning. This decision support system eliminates the deficiencies of current and past studies in modelling hospitals within a single framework. A new output metric, the 'demand coverage ratio', was developed to measure the percentage of patients who are admitted and discharged with the available resources of the associated specialty. In addition, a full factorial experimental design with 4 factors (A&E attendances, elective and non-elective admissions, and outpatient attendances) at 2 levels (possible 5% and 10% demand increases) was carried out to investigate the effects of demand increases on the key outputs (demand coverage ratio, bed occupancy rate and total revenue). Each factor was found to affect total revenue, as was the interaction between elective and non-elective admissions. The demand coverage ratio is affected by changes in outpatient demand as well as by A&E arrivals and non-elective admissions, while A&E arrivals, non-elective admissions and elective admissions are, in that order, the most important factors for bed occupancy rates. After an exhaustive review of the literature, we found that no whole-hospital model combining forecasting, simulation and optimization techniques had previously been developed.
A linear optimization model was developed to estimate the required bed capacity and staffing needs of a mid-size hospital in England (using key outputs from the forecasting and forecasting-simulation stages) for each elective and non-elective inpatient specialty. In conclusion, these results provide key decision makers with a decision support tool for short- and long-term strategic planning, helping them make rational and realistic plans. This hospital decision support system can become a crucial instrument for decision makers seeking efficient hospital services in England and other parts of the world
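    A brief sketch of the mean absolute scaled error (MASE) used above to rank forecasting methods: test-period errors are scaled by the in-sample mean absolute error of a one-step naive forecast. The series and forecasts below are made-up numbers, for illustration only.

```python
import numpy as np

def mase(train, actual, forecast):
    """Mean absolute scaled error: test MAE divided by the in-sample MAE of a
    one-step naive forecast computed on the training series."""
    train, actual, forecast = map(np.asarray, (train, actual, forecast))
    naive_mae = np.mean(np.abs(np.diff(train)))   # scale from the training data
    return np.mean(np.abs(actual - forecast)) / naive_mae

train = [120, 132, 128, 141, 150, 147, 155]       # e.g. weekly attendances (invented)
actual = [158, 162, 160]
model_a = [156, 159, 163]                          # hypothetical model forecasts
model_b = [150, 147, 155]

print("model A MASE:", round(mase(train, actual, model_a), 3))
print("model B MASE:", round(mase(train, actual, model_b), 3))
```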