
    Insights from computational modelling and simulation towards promoting public health among African countries

    One of the problems facing several African countries is the rising trend of road traffic mortality. This has been a major concern, and its negative impact on public health cannot be overstated. A particular issue is the high number of casualties recorded annually as a result of over-speeding, overtaking at dangerous bends, driving under the influence of alcohol, and nonchalant driver attitudes. The aim of this research is to apply knowledge of finite state machines, modeling, and simulation to design and implement a novel prototype of an advanced traffic light system, towards promoting public health among African countries. Here, we specified and built a model of an advanced wireless traffic control system to complement existing traffic control systems in African countries. This prototype is named the Advanced Wireless Traffic Control System (AWTCS). We developed the model using an event-driven programming approach, with technical details based on a finite state automaton transition algorithm. It is expected that the AWTCS will advance the teaching of modeling, simulation, and public safety by offering trainees an advanced pedagogical product. It will also strengthen collaboration across Computer Science, Public Health, and Electrical Engineering. Keywords: public health, public safety, modelling, simulation, pr
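An event-driven finite-state controller of the kind the abstract describes can be sketched as follows. The states, events, and transition table here are illustrative assumptions, not taken from the paper's AWTCS design.

```python
class TrafficLightFSM:
    """Minimal finite-state traffic light, driven by external events."""

    # transition table: (current_state, event) -> next_state
    TRANSITIONS = {
        ("RED", "timer"): "GREEN",
        ("GREEN", "timer"): "YELLOW",
        ("YELLOW", "timer"): "RED",
    }

    def __init__(self, state="RED"):
        self.state = state

    def handle(self, event):
        """Event-driven step: apply a transition if one is defined,
        otherwise stay in the current state."""
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state
```

Because all behavior lives in the transition table, extending the prototype (e.g. adding a pedestrian-request event) means adding table entries rather than changing control flow, which is the usual appeal of the finite-state approach.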

    Research and Education in Computational Science and Engineering

    Over the past two decades the field of computational science and engineering (CSE) has penetrated both basic and applied research in academia, industry, and laboratories to advance discovery, optimize systems, support decision-makers, and educate the scientific and engineering workforce. Informed by centuries of theory and experiment, CSE performs computational experiments to answer questions that neither theory nor experiment alone is equipped to answer. CSE provides scientists and engineers of all persuasions with algorithmic inventions and software systems that transcend disciplines and scales. Carried on a wave of digital technology, CSE brings the power of parallelism to bear on troves of data. Mathematics-based advanced computing has become a prevalent means of discovery and innovation in essentially all areas of science, engineering, technology, and society; and the CSE community is at the core of this transformation. However, a combination of disruptive developments, including the architectural complexity of extreme-scale computing, the data revolution that engulfs the planet, and the specialization required to follow the applications to new frontiers, is redefining the scope and reach of the CSE endeavor. This report describes the rapid expansion of CSE and the challenges to sustaining its bold advances. The report also presents strategies and directions for CSE research and education for the next decade. Comment: Major revision, to appear in SIAM Review

    Advanced Cyberinfrastructure for Science, Engineering, and Public Policy

    Progress in many domains increasingly benefits from our ability to view systems through a computational lens, i.e., using computational abstractions of the domains, and from our ability to acquire, share, integrate, and analyze disparate types of data. These advances would not be possible without advanced data and computational cyberinfrastructure and tools for data capture, integration, analysis, modeling, and simulation. However, despite, and perhaps because of, advances in "big data" technologies for data acquisition, management, and analytics, the other largely manual and labor-intensive aspects of the decision-making process, e.g., formulating questions, designing studies, organizing, curating, connecting, correlating and integrating cross-domain data, drawing inferences, and interpreting results, have become the rate-limiting steps to progress. Advancing the capability and capacity for evidence-based improvements in science, engineering, and public policy requires support for (1) computational abstractions of the relevant domains coupled with computational methods and tools for their analysis, synthesis, simulation, visualization, sharing, and integration; (2) cognitive tools that leverage and extend the reach of human intellect, and partner with humans on all aspects of the activity; (3) nimble and trustworthy data cyberinfrastructures that connect and manage a variety of instruments, multiple interrelated data types and associated metadata, data representations, processes, protocols, and workflows, and that enforce applicable security and data access and use policies; and (4) organizational and social structures and processes for collaborative and coordinated activity across disciplinary and institutional boundaries. Comment: A Computing Community Consortium (CCC) white paper, 9 pages. arXiv admin note: text overlap with arXiv:1604.0200

    Multiscale modeling of rapid granular flow with a hybrid discrete-continuum method

    Both discrete and continuum models have been widely used to study rapid granular flow: the discrete model is accurate but computationally expensive, whereas the continuum model is computationally efficient but of doubtful accuracy in many situations. Here we propose a hybrid discrete-continuum method that profits from the merits, and discards the drawbacks, of both models. The continuum model is used in the regions where it is valid, and the discrete model in the regions where the continuum description fails; the two are coupled via dynamic exchange of parameters in the overlap regions. Simulation of granular channel flow demonstrates that the proposed hybrid discrete-continuum method is nearly as accurate as the discrete model, at much lower computational cost.
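The coupling idea, exchanging parameters in an overlap region, can be illustrated with a deliberately simplified 1D toy. The geometry, fields, and relaxation rate below are invented for illustration; the paper's actual coupling scheme is more elaborate.

```python
import numpy as np

np.random.seed(0)  # fixed seed so the toy run is reproducible

n_cells = 10
overlap = slice(4, 6)                  # cells described by both models
continuum_v = np.ones(n_cells)         # continuum velocity field, one value per cell
particles_x = np.random.uniform(4.0, 6.0, size=50)   # particles live in the overlap
particles_v = np.random.normal(1.2, 0.1, size=50)    # particle velocities

# discrete -> continuum: overwrite overlap cells with particle averages
for c in range(*overlap.indices(n_cells)):
    in_cell = (particles_x >= c) & (particles_x < c + 1)
    if in_cell.any():
        continuum_v[c] = particles_v[in_cell].mean()

# continuum -> discrete: relax particle velocities toward the local cell value
cells = particles_x.astype(int)
particles_v += 0.5 * (continuum_v[cells] - particles_v)
```

After one exchange, the overlap cells carry the particle-averaged velocity while cells outside the overlap keep their continuum values, which is the essential division of labor in the hybrid method.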

    Minimum Information About a Simulation Experiment (MIASE)

    Reproducibility of experiments is a basic requirement for science. Minimum Information (MI) guidelines have proved a helpful means of enabling reuse of existing work in modern biology. The Minimum Information Required in the Annotation of Models (MIRIAM) guidelines promote the exchange and reuse of biochemical computational models. However, information about a model alone is not sufficient to enable its efficient reuse in a computational setting. Advanced numerical algorithms and complex modeling workflows used in modern computational biology make reproduction of simulations difficult. It is therefore essential to define the core information necessary to perform simulations of those models. The Minimum Information About a Simulation Experiment (MIASE, Glossary in Box 1) describes the minimal set of information that must be provided to make the description of a simulation experiment available to others. It includes the list of models to use and their modifications, all the simulation procedures to apply and in which order, the processing of the raw numerical results, and the description of the final output. MIASE allows for the reproduction of any simulation experiment. The provision of this information, along with a set of required models, guarantees that the simulation experiment represents the intention of the original authors. Following MIASE guidelines will thus improve the quality of scientific reporting, and will also allow collaborative, more distributed efforts in computational modeling and simulation of biological processes.
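The four classes of information MIASE asks for (models and their modifications, ordered simulation procedures, post-processing of raw results, and the final output) can be rendered schematically as a plain data structure. The field names and values below are illustrative inventions, not part of the MIASE guidelines themselves; in practice this information is exchanged in formats such as SED-ML.

```python
# Schematic MIASE-style description of one simulation experiment.
experiment = {
    # 1) models to use, with any modifications applied to them
    "models": [
        {"source": "model.xml",
         "changes": [{"parameter": "k1", "value": 0.5}]},
    ],
    # 2) simulation procedures, in the order they are applied
    "simulations": [
        {"algorithm": "deterministic ODE solver",
         "start": 0, "end": 100, "points": 1000},
    ],
    # 3) processing of the raw numerical results
    "post_processing": ["normalize raw trajectories"],
    # 4) description of the final output
    "outputs": [
        {"type": "timecourse plot", "curves": ["time vs. species A"]},
    ],
}
```

A reader given such a description, plus the referenced model files, should be able to re-run the experiment and reproduce the reported output, which is the guarantee MIASE aims for.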

    A stability condition for turbulence model: From EMMS model to EMMS-based turbulence model

    The closure problem of turbulence is still a challenging issue in turbulence modeling. In this work, a stability condition is used to close turbulence. Specifically, we regard single-phase flow as a mixture of turbulent and non-turbulent fluids, separating the structure of turbulence. Subsequently, according to the picture of the turbulent eddy cascade, the energy contained in turbulent flow is decomposed into different parts and then quantified. A turbulence stability condition, similar to the principle of the energy-minimization multi-scale (EMMS) model for gas-solid systems, is formulated to close the dynamic constraint equations of turbulence, allowing the heterogeneous structural parameters of turbulence to be optimized. We call this model the "EMMS-based turbulence model", and use it to construct the corresponding turbulent viscosity coefficient. To validate the EMMS-based turbulence model, it is used to simulate two classical benchmark problems: lid-driven cavity flow and turbulent flow with forced convection in an empty room. The numerical results show that the EMMS-based turbulence model improves the accuracy of turbulence modeling because it accounts for the principle of compromise in the competition between viscosity and inertia. Comment: 26 pages, 13 figures, 2 tables
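The general pattern, closing an under-determined set of constraint equations by selecting the state that extremizes a stability objective, can be shown with a toy calculation. The constraint and objective below are invented stand-ins, not the paper's actual turbulence equations.

```python
import numpy as np

def g(x, y):
    """Toy constraint linking two 'structure parameters'; g = 0 must hold."""
    return x + y - 1.0

def objective(x, y):
    """Toy stability objective (stand-in for e.g. an energy criterion)."""
    return x**2 + 2.0 * y**2

# The single constraint leaves a one-parameter family of admissible states.
# Scan that family and keep the state that minimizes the objective.
xs = np.linspace(0.0, 1.0, 1001)
ys = 1.0 - xs                          # enforce g(x, y) = 0 exactly
best = float(xs[np.argmin(objective(xs, ys))])
```

For this toy objective the analytic minimum on the constraint line is at x = 2/3, which the grid scan recovers; in the EMMS-style model the same selection principle picks out the heterogeneous structure parameters of the flow.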

    Agent-based modeling: a systematic assessment of use cases and requirements for enhancing pharmaceutical research and development productivity.

    A crisis continues to brew within the pharmaceutical research and development (R&D) enterprise: productivity continues declining as costs rise, despite ongoing, often dramatic scientific and technical advances. To reverse this trend, we offer various suggestions for both the expansion and broader adoption of modeling and simulation (M&S) methods. We suggest strategies and scenarios intended to enable new M&S use cases that directly engage R&D knowledge generation and build actionable mechanistic insight, thereby opening the door to enhanced productivity. What M&S requirements must be satisfied to open that door and begin reversing the productivity decline? Can current methods and tools fulfill the requirements, or are new methods necessary? We draw on the relevant, recent literature to provide and explore answers. In so doing, we identify essential, key roles for agent-based and other methods. We assemble a list of requirements necessary for M&S to meet the diverse needs distilled from a collection of research, review, and opinion articles. We argue that to realize its full potential, M&S should be actualized within a larger information technology framework, a dynamic knowledge repository, wherein models of various types execute, evolve, and increase in accuracy over time. We offer some details of the issues that must be addressed for such a repository to accrue the capabilities needed to reverse the productivity decline.
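The agent-based methods the article highlights encode mechanism at the level of individual entities and let population behavior emerge. A minimal sketch, with an entirely hypothetical "cell exposed to a drug dose" agent (the rules, rates, and parameters are invented, not drawn from the article):

```python
import random

class Cell:
    """Hypothetical agent: a simulated cell exposed to a drug dose."""

    def __init__(self, sensitivity):
        self.sensitivity = sensitivity  # in [0, 1); higher = easier to kill

    def step(self, dose):
        """One round of agent behavior; returns the surviving agent(s)."""
        if dose * self.sensitivity > random.random():
            return []                                   # cell death
        if random.random() < 0.1:
            return [self, Cell(self.sensitivity)]       # cell division
        return [self]

def simulate(dose, n_cells=100, steps=20):
    """Return the surviving population size after `steps` rounds."""
    random.seed(0)  # fixed seed so repeated runs are comparable
    population = [Cell(random.random()) for _ in range(n_cells)]
    for _ in range(steps):
        population = [c for cell in population for c in cell.step(dose)]
    return len(population)
```

Comparing `simulate(dose=0.0)` with `simulate(dose=1.0)` yields an emergent dose-response relationship that was never written down as an equation, which is the kind of mechanism-level, hypothesis-generating use case the article argues M&S should serve within a dynamic knowledge repository.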