Distributed simulation with COTS simulation packages: A case study in health care supply chain simulation
The UK National Blood Service (NBS) is a publicly funded body responsible for distributing blood and associated products. A discrete-event simulation of the NBS supply chain in the Southampton area has been built using the commercial off-the-shelf simulation package (CSP) Simul8. This models the relationship in the health care supply chain between the NBS Processing, Testing and Issuing (PTI) facility and its associated hospitals. However, as the number of hospitals increases, simulation run time becomes inconveniently large. To address this problem with distributed simulation, researchers have used techniques informed by SISO's CSPI PDG to create a version of Simul8 compatible with the High Level Architecture (HLA). The NBS supply chain model was subsequently divided into several sub-models, each running in its own copy of Simul8. Experimentation shows that this distributed version performs better than its standalone, conventional counterpart as the number of hospitals increases.
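The core idea behind HLA-style distributed simulation is conservative time management: each federate (here, a sub-model running in its own copy of Simul8) may only process local events up to a lower bound on the timestamp of any message it could still receive from the others. The sketch below is an illustrative toy of that rule, not the actual Simul8/CSPI PDG implementation; the `Federate`, `lbts`, and `step` names are hypothetical.

```python
import heapq

class Federate:
    """A logical process (e.g. the PTI facility or a hospital) with its
    own clock and future-event list."""
    def __init__(self, name, lookahead):
        self.name = name
        self.lookahead = lookahead  # minimum delay before it can affect others
        self.clock = 0.0
        self.events = []            # min-heap of (time, action)

    def schedule(self, t, action):
        heapq.heappush(self.events, (t, action))

def lbts(others):
    """Lower Bound on Time Stamp: the earliest simulation time at which
    any other federate could still send us a message."""
    return min(f.clock + f.lookahead for f in others)

def step(fed, others):
    """Conservative advance: process only those local events whose
    timestamps are safely below the LBTS guarantee."""
    bound = lbts(others)
    processed = []
    while fed.events and fed.events[0][0] <= bound:
        t, action = heapq.heappop(fed.events)
        fed.clock = t
        processed.append((t, action))
    return processed
```

For example, if a hospital federate sits at time 0 with lookahead 1.0, the PTI federate may safely process its event at time 0.5 but must block before its event at time 2.0 until the hospital's clock advances.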
Department of Homeland Security Science and Technology Directorate: Developing Technology to Protect America
In response to a congressional mandate, and in consultation with the Department of Homeland Security's (DHS) Science and Technology Directorate (S&T), the National Academy conducted a review of S&T's effectiveness and efficiency in addressing homeland security needs. This review paid particular attention to identifying any unnecessary duplication of effort, and to opportunity costs arising from an emphasis on homeland security-related research. Under the direction of the National Academy Panel, the study team reviewed a wide variety of documents related to S&T and homeland security-related research in general. The team also conducted interviews with more than 200 individuals, including S&T officials and staff, officials from other DHS component agencies, other federal agencies engaged in homeland security-related research, and experts from outside government in science policy, homeland security-related research, and other scientific fields.

Key Findings

The results of this effort indicated that S&T faces a significant challenge in marshaling the resources of multiple federal agencies to work together to develop a homeland security-related strategic plan for all agencies. Yet the importance of this role should not be underestimated. The very process of working across agencies to develop and align the federal homeland security research enterprise around a forward-focused plan is critical to ensuring that future efforts support a common vision and goals, and that the metrics by which to measure national progress, and make changes as needed, are in place.
Comparing conventional and distributed approaches to simulation in complex supply-chain health systems
Decision making in modern supply chains can be extremely daunting due to their complex nature. Discrete-event simulation is a technique that can support decision making by providing what-if analysis and evaluation of quantitative data. However, modelling supply chain systems can result in very large and complicated models that take a long time to run, even on today's powerful desktop computers. Distributed simulation has been suggested as a possible solution to this problem, by enabling the use of multiple computers to run models. To investigate this claim, this paper presents experiences in implementing a simulation model with a 'conventional' approach and with a distributed approach. This study takes place in a healthcare setting: the supply chain of blood from donor to recipient. The study compares conventional and distributed execution times of a supply chain model simulated in the simulation package Simul8. The results show that the execution time of the conventional approach increases almost linearly with the size of the system and with the simulation run period, whereas the execution time of the distributed approach grows more gradually with system size and run period and appears to offer a practical alternative. On the basis of this, the paper concludes that distributed simulation can be successfully applied in certain situations.
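The scaling behaviour described above can be illustrated with a toy standalone discrete-event model: the number of events that a single event list must process grows with both the number of hospitals and the run period, which is why the conventional model's execution time climbs with system size. This is a minimal sketch under assumed exponential interarrival times, not the paper's Simul8 model; `simulate` and its parameters are hypothetical.

```python
import heapq
import random

def simulate(num_hospitals, run_period, mean_interarrival=1.0, seed=42):
    """Toy standalone model: `num_hospitals` sites each generate blood-order
    events with exponential interarrival times.  Returns the number of
    events processed -- a rough proxy for execution time, which grows with
    both system size and run period."""
    rng = random.Random(seed)
    fel = []  # future event list shared by the whole model
    for h in range(num_hospitals):
        heapq.heappush(fel, (rng.expovariate(1.0 / mean_interarrival), h))
    processed = 0
    while fel and fel[0][0] <= run_period:
        t, h = heapq.heappop(fel)
        processed += 1
        # each processed order schedules the hospital's next order
        heapq.heappush(fel, (t + rng.expovariate(1.0 / mean_interarrival), h))
    return processed
```

Doubling the number of hospitals roughly doubles the event count, and so the work per run; a distributed approach splits this single event list across several cooperating models.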
Report on the Second Catalog Interoperability Workshop
The events, resolutions, and recommendations of the Second Catalog Interoperability Workshop, held at JPL in January 1988, are discussed. This workshop dealt with the issues of standardization and communication among directories, catalogs, and inventories in the earth and space science data management environment. The Directory Interchange Format, being constructed as a standard for the exchange of directory information among participating data systems, is discussed. Involvement in the interoperability effort by NASA, NOAA, USGS, and NSF is described, and plans for future interoperability are considered. The NASA Master Directory prototype is presented and critiqued, and options for additional capabilities are debated.
An ECOOP web portal for visualising and comparing distributed coastal oceanography model and in situ data
As part of a large European coastal operational oceanography project (ECOOP), we have developed a web portal for the display and comparison of model and in situ marine data. The distributed model and in situ datasets are accessed via an Open Geospatial Consortium Web Map Service (WMS) and Web Feature Service (WFS) respectively. These services were developed independently and readily integrated for the purposes of the ECOOP project, illustrating the ease of interoperability resulting from adherence to international standards. The key feature of the portal is the ability to display co-plotted time series of the in situ and model data and to quantify the misfits between the two. By using standards-based web technology we allow the user to quickly and easily explore over twenty model data feeds and compare these with dozens of in situ data feeds, without being concerned with the low-level details of differing file formats or the physical location of the data. Scientific and operational benefits of this work include model validation, quality control of observations, data assimilation, and decision support in near real time. In these areas it is essential to be able to bring different data streams together from often disparate locations.
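Because WMS and WFS are simple key-value HTTP protocols, a portal client needs little more than URL construction to request a rendered model layer or a set of in situ features. The sketch below shows the shape of such requests; the base URLs are placeholders (the ECOOP feeds are not assumed to be public), and the layer and feature-type names are hypothetical.

```python
from urllib.parse import urlencode

# Placeholder endpoints for illustration only -- not real ECOOP services.
WMS_BASE = "https://example.org/ecoop/wms"
WFS_BASE = "https://example.org/ecoop/wfs"

def wms_getmap_url(layer, bbox, width=512, height=512, time=None):
    """Build an OGC WMS 1.3.0 GetMap request for a gridded model layer."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(map(str, bbox)),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    if time:
        params["TIME"] = time  # ISO 8601 string selects a model timestep
    return WMS_BASE + "?" + urlencode(params)

def wfs_getfeature_url(type_name, max_features=100):
    """Build an OGC WFS 1.1.0 GetFeature request for in situ observations."""
    params = {
        "SERVICE": "WFS",
        "VERSION": "1.1.0",
        "REQUEST": "GetFeature",
        "TYPENAME": type_name,
        "MAXFEATURES": max_features,
    }
    return WFS_BASE + "?" + urlencode(params)
```

A co-plotting client would issue one request of each kind for the same region and time window, then overlay the WFS feature values on the WMS map image.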