
    Multi-Objective Mission Route Planning Using Particle Swarm Optimization

    The Mission Routing Problem (MRP) is the selection of a vehicle path that starts at a point, crosses enemy terrain defended by radar sites to reach the target(s), and returns to a safe destination (usually the starting point). The MRP is a three-dimensional, multi-objective path search with constraints such as fuel expenditure, time limits, multiple targets, and radar sites posing different levels of risk. It can severely task all the resources (people, hardware, software) of the system trying to compute the possible routes, and operational planning systems may take longer to generate a solution than the time available. Since time is critical in the MRP, a solution must be reached within a relatively short time; a solution that takes days to compute is of little use, because the information may become invalid in the meantime. Particle Swarm Optimization (PSO) is an Evolutionary Algorithm (EA) technique that searches for optimal solutions to complex problems using particles that interact with each other. Both PSO and the Ant System (AS) have been shown to provide good solutions to the Traveling Salesman Problem (TSP). This thesis presents PSO_AS, a synthesis of PSO and AS, as a new approach for solving the MRP: it explores the MRP search space stochastically and produces good solutions.
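    As a purely illustrative sketch of the canonical PSO update that PSO_AS builds on (not the PSO_AS hybrid itself), the following minimizes a toy cost function; the objective, parameter values, and names are assumptions, not taken from the thesis.

```python
# Minimal sketch of canonical Particle Swarm Optimization (PSO) on a toy cost
# function. It illustrates the particle/velocity update only; it is not the
# PSO_AS hybrid described in the thesis, and all parameters are illustrative.
import numpy as np

def sphere(x):
    """Toy objective: squared distance from the origin."""
    return float(np.sum(x ** 2))

def pso(cost, dim=3, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-10, 10, size=(n_particles, dim))   # particle positions
    vel = np.zeros_like(pos)                               # particle velocities
    pbest = pos.copy()                                     # personal best positions
    pbest_val = np.array([cost(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()             # global best position

    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        # velocity update: inertia plus pull toward personal and global bests
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([cost(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(pbest_val.min())

best_x, best_f = pso(sphere)
print(best_x, best_f)
```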

    The Global sphere reconstruction (GSR) - Demonstrating an independent implementation of the astrometric core solution for Gaia

    Context. The Gaia ESA mission will estimate the astrometric and physical data of more than one billion objects, providing the largest and most precise catalog of absolute astrometry in the history of Astronomy. The core of this process, the so-called global sphere reconstruction, is the reduction of a subset of these objects which will be used to define the celestial reference frame. As the Hipparcos mission showed, and as is inherent to all kinds of absolute measurements, possible errors in the data reduction can hardly be identified from the catalog, thus potentially introducing systematic errors in all derived work. Aims. Following up on the lessons learned from Hipparcos, our aim is to develop an independent sphere reconstruction method that helps guarantee the quality of the astrometric results without fully reproducing the main processing chain. Methods. Given the infeasibility of a complete replica of the data reduction pipeline, an astrometric verification unit (AVU) was instituted by the Gaia Data Processing and Analysis Consortium (DPAC). One of its tasks is to implement and operate an independent global sphere reconstruction (GSR), parallel to the baseline one (AGIS, the Astrometric Global Iterative Solution) but limited to the primary stars and intended for validation purposes, to compare the two results, and to report on any significant differences. Results. Tests performed on simulated data show that GSR is able to reproduce, at the sub-μas level, the results of the AGIS demonstration run presented in Lindegren et al. (2012). Conclusions. Further development is ongoing to improve the treatment of real data and the software modules that compare the AGIS and GSR solutions to identify possible discrepancies above the tolerance level set by the accuracy of the Gaia catalog. Comment: Accepted for publication in Astronomy & Astrophysics.
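    At its core, a global sphere reconstruction amounts to a very large, sparse least-squares problem solved iteratively. The sketch below is a toy illustration only: the random design matrix, dimensions, noise level, and use of the generic LSQR solver are assumptions for illustration and not the AGIS or GSR implementations.

```python
# Toy illustration of the kind of large sparse least-squares problem a global
# astrometric solution must solve. The design matrix, dimensions, and noise
# level are made-up stand-ins; the real pipelines couple millions of stellar,
# attitude, and calibration unknowns.
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(1)
n_obs, n_params = 5000, 200               # observations x unknown parameters
A = sparse_random(n_obs, n_params, density=0.05, random_state=1, format="csr")
x_true = rng.normal(size=n_params)        # "true" astrometric parameters
b = A @ x_true + 1e-3 * rng.normal(size=n_obs)   # observations with noise

# LSQR minimizes ||Ax - b||_2 without forming the dense normal equations,
# which matters when A is far too large to factorize directly.
x_est = lsqr(A, b, atol=1e-10, btol=1e-10)[0]
print("max parameter error:", np.abs(x_est - x_true).max())
```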

    An optimization framework for solving capacitated multi-level lot-sizing problems with backlogging

    This paper proposes two new mixed integer programming models for capacitated multi-level lot-sizing problems with backlogging, whose linear programming relaxations provide good lower bounds on the optimal solution value. We show that both of these strong formulations yield the same lower bounds. In addition to these theoretical results, we propose a new, effective optimization framework that achieves high-quality solutions in reasonable computational time. Computational results show that the proposed optimization framework is superior to other well-known approaches on several important performance dimensions.
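    As a hedged illustration of the modelling style only (not the paper's strong multi-level formulations), the sketch below states a single-item, single-level, capacitated lot-sizing model with backlogging in PuLP; the demand, capacity, and cost figures are made up.

```python
# A minimal single-item, capacitated lot-sizing model with backlogging in PuLP.
# This is a simplified stand-in for the multi-level formulations in the paper:
# one item, one level, illustrative data and costs.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

T = range(4)                      # planning periods
demand   = [60, 100, 140, 90]     # illustrative demand per period
capacity = [120, 120, 120, 120]   # production capacity per period
setup, prod, hold, back = 500, 2.0, 1.0, 5.0   # illustrative unit costs

m = LpProblem("lot_sizing_with_backlogging", LpMinimize)
x = LpVariable.dicts("produce",   T, lowBound=0)       # production quantity
y = LpVariable.dicts("setup",     T, cat=LpBinary)     # setup indicator
I = LpVariable.dicts("inventory", T, lowBound=0)       # end-of-period stock
B = LpVariable.dicts("backlog",   T, lowBound=0)       # end-of-period backlog

m += lpSum(setup * y[t] + prod * x[t] + hold * I[t] + back * B[t] for t in T)

for t in T:
    prev_I = I[t - 1] if t > 0 else 0
    prev_B = B[t - 1] if t > 0 else 0
    # inventory balance: carried stock minus carried backlog plus production
    m += prev_I - prev_B + x[t] - demand[t] == I[t] - B[t]
    m += x[t] <= capacity[t] * y[t]        # production only if a setup occurs

m += B[T[-1]] == 0                         # all demand met by the horizon end
m.solve()
print("cost:", value(m.objective))
print("plan:", [x[t].value() for t in T])
```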

    Dynamic lot sizing and tool management in automated manufacturing systems

    The overall aim of this study is to show that there is a critical interface between the lot sizing and tool management decisions, and that these two problems cannot be viewed in isolation. We propose five alternative algorithms to solve the lot sizing, tool allocation, and machining conditions optimization problems simultaneously. The first algorithm is an exact algorithm which finds the global optimum solution, and the others are heuristics equipped with a look-ahead mechanism to guarantee at least local optimality. The computational results indicate that the amount of improvement is statistically significant for a set of randomly generated problems. The magnitude of cost savings is dependent on the system parameters.
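    As an illustrative building block only (the study's algorithms additionally optimize tool allocation and machining conditions, which are not modelled here), the sketch below solves the uncapacitated single-item lot-sizing core with the classic Wagner-Whitin dynamic program; demands and costs are made up.

```python
# Wagner-Whitin dynamic program for uncapacitated, single-item dynamic lot
# sizing. This is only the lot-sizing core; tool allocation and machining
# conditions are outside this sketch. Costs and demands are illustrative.
def wagner_whitin(demand, setup_cost, holding_cost):
    """Return (minimum cost, production periods) for dynamic lot sizing."""
    T = len(demand)
    best = [0.0] + [float("inf")] * T   # best[t] = min cost to cover periods 0..t-1
    choice = [0] * (T + 1)              # period in which the covering lot is produced
    for t in range(1, T + 1):
        for s in range(t):              # produce in period s for periods s..t-1
            hold = sum(holding_cost * (k - s) * demand[k] for k in range(s, t))
            cost = best[s] + setup_cost + hold
            if cost < best[t]:
                best[t], choice[t] = cost, s
    # recover the production periods by walking the choices backwards
    plan, t = [], T
    while t > 0:
        plan.append(choice[t])
        t = choice[t]
    return best[T], sorted(plan)

print(wagner_whitin([20, 50, 10, 50, 50], setup_cost=100, holding_cost=1))
```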

    Modelling the pacemaker in event-B: towards methodology for reuse

    The cardiac pacemaker is one of the system modelling problems posed to the Formal Methods community by the Grand Challenge for Dependable Systems Evolution [JOW:06]. The pacemaker is an intricate safety-critical system that supports and moderates the dysfunctional heart's intrinsic electrical control system. This paper focusses on (i) the problem (requirements) domain specification and its mapping to solution (implementation) domain models, (ii) the significant commonality of behaviour between its many operating modes, emphasising the potential for reuse, and (iii) development and verification of models. We introduce the problem and model three of the operating modes in the problem domain using a state machine notation. We then map each of these models into a solution domain state machine notation, designed as shorthand for a refinement-based solution domain development in the Event-B formal language and its RODIN toolkit.
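    As a purely illustrative aside (not the paper's Event-B models), the sketch below encodes a single-chamber, roughly VVI-like operating mode as a small state machine; the states, timing constant, and transition rule are assumptions for illustration only.

```python
# Illustrative sketch of a single-chamber pacing mode as a tiny state machine
# (roughly VVI-like: pace the ventricle unless an intrinsic beat is sensed
# within the timeout). Not the paper's Event-B model; states and timing are
# assumptions for illustration.
from enum import Enum, auto

class State(Enum):
    WAITING = auto()   # waiting for an intrinsic beat
    PACED = auto()     # an artificial pace was delivered
    SENSED = auto()    # an intrinsic beat was detected

LOWER_RATE_INTERVAL_MS = 1000   # pace if no beat is sensed within this window

def step(state, ms_since_last_beat, beat_sensed):
    """One transition of the mode's state machine."""
    if beat_sensed:
        return State.SENSED                      # inhibit pacing on a sensed beat
    if ms_since_last_beat >= LOWER_RATE_INTERVAL_MS:
        return State.PACED                       # deliver a pace at the timeout
    return State.WAITING                         # keep waiting

print(step(State.WAITING, 400, beat_sensed=False))   # State.WAITING
print(step(State.WAITING, 1000, beat_sensed=False))  # State.PACED
print(step(State.WAITING, 700, beat_sensed=True))    # State.SENSED
```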

    Using spreadsheets in production planning in a pharmaceutical company

    Dissertation presented as the partial requirement for obtaining a Master's degree in Information Management, specialization in Information Systems and Technologies Management. In the technological era, a successful company is one that integrates Information Technology (IT) with its business; otherwise it risks being unable to survive in today's market against competition that is heavily shaped by IT. However, integrating IT with business is not simple, due to several factors: the available resources, choosing the right solution, top management support, time constraints, and achieving proper user training and adoption. It is not wise to wait until all these obstacles are overcome when some of the available resources, such as Microsoft Office tools, can ease several business processes until the needed system is implemented and used. In a supply chain, production at a supplier or manufacturer usually follows a production plan that is typically created by the supply planning department. A production plan relies on a demand forecast, and a demand forecast usually relies on historical data; but market demand changes and a forecast does not always match demand, so whenever the forecast changes, production plans are updated accordingly (Graves, 2011). Therefore, optimizing the supply chain requires building a strong relationship between the supply chain partners, because their collaboration becomes vital in such a scenario. This collaboration means that the partners of the supply chain must share their information with each other (Groznik & Maslaric, 2012). Such information can concern the customer's inventory stock levels communicated to the supplier, which helps in optimizing the reorder level, defined as "the point at which the company will reorder stock" (Meng, 2006), resulting in production plans that better match market demand. However, these processes can hardly be done and managed manually; they actually require an IT system integrated with the supply chain to achieve the expected results. Aligning IT with the supply chain and using e-business to manage the relationship between suppliers and customers can lower costs, because IT supports collaboration and coordination through easy information sharing between the partners of the supply chain (Auramo, Kauremaa, & Tanskanen, 2005). Moreover, using IT in a supply chain does not need to be costly or difficult: spreadsheets, for instance, can be used for inventory planning, defined as "figuring out what your inventory should be (not counting what you have)" (Estep, 2012). Even though using spreadsheet tools such as Microsoft Office does not require purchasing an IT system, it is still a form of integrating IT with a business process that can significantly improve the supply chain.
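    As a small illustration of the kind of calculation such spreadsheets can host, the sketch below computes a reorder level from lead-time demand plus safety stock; the formula and figures are generic assumptions, not taken from the dissertation's planning model.

```python
# One common way a reorder level can be computed, shown as a tiny script rather
# than a spreadsheet formula. The formula (lead-time demand plus safety stock)
# and the figures are illustrative assumptions.
def reorder_level(avg_daily_demand, lead_time_days, safety_stock):
    """Stock level at which a replenishment order should be placed."""
    return avg_daily_demand * lead_time_days + safety_stock

# e.g. 120 units/day average demand, 7-day supplier lead time, 300 units buffer
print(reorder_level(avg_daily_demand=120, lead_time_days=7, safety_stock=300))  # 1140
```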

    Improvements In computed tomography perfusion output using complex singular value decomposition and the maximum slope algorithm

    OBJECTIVE: To determine whether complex singular value decomposition (cSVD), used as preprocessing in the maximum slope algorithm, reduces image noise in the resulting physiologic parametric images. Noise is expected to decrease in the parametric maps of cerebral blood flow (CBF) and cerebral blood volume (CBV) compared with the same algorithm and data set with no cSVD applied. MATERIALS AND METHODS: Ten patients who presented with stroke symptoms underwent a combined total of 15 CT perfusion studies (n=15). These patients were determined to have occlusions resulting in a prolonged arrival time of blood to the brain, and their DICOM data files were selected on the basis of this increased arrival delay. We compared the output of the CBF and CBV estimation calculations with and without cSVD preprocessing. Image noise was assessed by calculating the standard deviation within regions of interest copied to specific areas of grey matter, white matter, and CSF space; a decrease in standard deviation indicates an improvement in the noise level of the resulting images. Mean values within the regions of interest were expected to be similar between the cSVD and standard groups, indicating minimal bias. RESULTS: Between the standard processing method and the cSVD method, standard deviation (SD) reductions were seen in both CBF and CBV values across all three ROIs. In grey matter, the SD of CBV was reduced by an average of 0.0034 mL/100g, while the SD of CBF was reduced by an average of 0.073 mL/100g/min. In white matter, the SD of CBV was reduced on average by 0.0041 mL/100g and the SD of CBF by 0.073 mL/100g/min. In CSF ROIs, SD reductions averaged 0.0047 mL/100g for CBV and 0.074 mL/100g/min for CBF. Bias within the CBV calculations was at most minimal, as indicated by no significant changes in mean calculated values; the CBF calculations showed a large downward bias in mean values. CONCLUSIONS: Applying cSVD as preprocessing for CT perfusion imaging studies is an effective method of noise reduction. In CBV calculations, cSVD noise reduction results in overall improvement. In CBF calculations, cSVD, while effective in noise reduction, caused mean values to be statistically lower than with the standard method. There is currently no evaluation of which values are more physiologically accurate. Simulations of the effect of noise on CBF showed a positive correlation, suggesting that the CBF algorithm itself is sensitive to the noise level.
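    As a rough illustration of the two ideas involved (an SVD-based denoising step followed by a maximum slope estimate), the sketch below applies a plain truncated SVD to synthetic curves; it is not the clinical complex-SVD implementation evaluated in this study, and the data, sampling interval, truncation rank, and scaling are all assumptions.

```python
# Hedged sketch of a maximum slope CBF estimate with a simple truncated-SVD
# denoising step standing in for the study's complex-SVD preprocessing. The
# synthetic curves and every parameter here are illustrative assumptions.
import numpy as np

def truncated_svd_denoise(curves, rank=3):
    """Low-rank approximation of a (voxels x time) matrix of tissue curves."""
    U, s, Vt = np.linalg.svd(curves, full_matrices=False)
    s[rank:] = 0.0                      # discard the noisiest components
    return (U * s) @ Vt

def max_slope_cbf(tissue_curves, aif, dt_s=1.0):
    """CBF proportional to the max tissue upslope divided by the AIF peak."""
    slopes = np.gradient(tissue_curves, dt_s, axis=1)   # d(attenuation)/dt
    return slopes.max(axis=1) / aif.max()               # one value per voxel

# toy data: 100 voxels, 40 time points, gamma-like bolus shapes plus noise
rng = np.random.default_rng(0)
t = np.arange(40.0)
aif = t ** 3 * np.exp(-t / 3.0)
tissue = 0.2 * np.stack([np.roll(aif, 4)] * 100) + rng.normal(0, 5, (100, 40))

cbf_noisy = max_slope_cbf(tissue, aif)
cbf_denoised = max_slope_cbf(truncated_svd_denoise(tissue), aif)
print(cbf_noisy.std(), cbf_denoised.std())   # denoised map should vary less
```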