Simulation in manufacturing and business: A review
Copyright © 2009 Elsevier B.V. This paper reports the results of a review of simulation applications published within peer-reviewed literature between 1997 and 2006 to provide an up-to-date picture of the role of simulation techniques within manufacturing and business. The review is characterised by three factors: wide coverage, broad scope of the simulation techniques, and a focus on real-world applications. A structured methodology was followed to narrow down the search from around 20,000 papers to 281. Results include interesting trends and patterns. For instance, although discrete event simulation is the most popular technique, it has lower stakeholder engagement than other techniques, such as system dynamics or gaming. This is highly correlated with modelling lead time and purpose. Considering application areas, modelling is mostly used in scheduling. Finally, this review shows an increasing interest in hybrid modelling as an approach to cope with complex enterprise-wide systems.
AUGUR: Forecasting the Emergence of New Research Topics
Being able to rapidly recognise new research trends is strategic for many stakeholders, including universities, institutional funding bodies, academic publishers and companies. The literature presents several approaches to identifying the emergence of new research topics, which rely on the assumption that the topic is already exhibiting a certain degree of popularity and consistently referred to by a community of researchers. However, detecting the emergence of a new research area at an embryonic stage, i.e., before the topic has been consistently labelled by a community of researchers and associated with a number of publications, is still an open challenge. We address this issue by introducing Augur, a novel approach to the early detection of research topics. Augur analyses the diachronic relationships between research areas and is able to detect clusters of topics that exhibit dynamics correlated with the emergence of new research topics. Here we also present the Advanced Clique Percolation Method (ACPM), a new community detection algorithm developed specifically for supporting this task. Augur was evaluated on a gold standard of 1,408 debutant topics in the 2000-2011 interval and outperformed four alternative approaches in terms of both precision and recall.
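The base technique that ACPM extends, clique percolation, can be sketched in a few lines. The following is a minimal illustration of classic k-clique percolation with k = 3 (communities are unions of triangles that share an edge), not ACPM itself; the topic labels and graph are invented for the example.

```python
from itertools import combinations

def k3_clique_percolation(edges):
    """Classic clique percolation with k = 3: communities are unions
    of triangles that share an edge (k-1 = 2 common nodes)."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    # Enumerate all triangles (3-cliques).
    nodes = sorted(adj)
    triangles = [frozenset(t) for t in combinations(nodes, 3)
                 if all(b in adj[a] for a, b in combinations(t, 2))]
    # Merge triangles that share an edge, using union-find.
    parent = list(range(len(triangles)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i, j in combinations(range(len(triangles)), 2):
        if len(triangles[i] & triangles[j]) == 2:
            parent[find(i)] = find(j)
    comms = {}
    for i, t in enumerate(triangles):
        comms.setdefault(find(i), set()).update(t)
    return list(comms.values())

# Toy topic co-occurrence graph (hypothetical topic labels);
# edges link topics that co-occur in publications.
edges = [("deep_learning", "neural_networks"),
         ("deep_learning", "gpu_computing"),
         ("neural_networks", "gpu_computing"),
         ("semantic_web", "ontologies"),
         ("semantic_web", "linked_data"),
         ("ontologies", "linked_data"),
         ("gpu_computing", "linked_data")]  # weak bridge, no triangle
communities = k3_clique_percolation(edges)
print(communities)  # two separate topic clusters
```

Note that the single bridging edge does not merge the two clusters, since percolation requires a shared triangle edge, which is exactly why the method resists chaining unrelated topics together.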
From bedside to bench: Comroe and Dripps revisited
Twenty-five years ago a paper published in Science by Julius Comroe and Robert Dripps purported to demonstrate that 41 per cent of all articles judged to be essential for later clinical advances were not clinically oriented at the time of the study and 62 per cent of key articles were the result of basic research.
Since that analysis, support for basic research has increased in the G7 countries. In the UK, Research Council expenditure on basic research has increased from a low of £444 million (or 42 per cent of total civil R&D) in 1991/92 to £769 million (or 61 per cent of total civil R&D) in 1998/99. Although it would be difficult to argue that Comroe and Dripps were directly responsible for a strategic shift (or drift) in the type of science supported by research funders, their arguments are often cited (albeit at times implicitly) in support of the increased funding for basic biomedical research.
In 1987 Richard Smith wrote a critical paper reassessing Comroe and Dripps. His main argument was that the original study was in itself 'unscientific' and that it should be 'followed by bigger and better studies'. This study is, in part, an answer to that challenge.
Given the increased support for basic research, and the apparent importance based on the work of Comroe and Dripps, we felt it was important to investigate Smith's comments by replicating Comroe and Dripps's study and at the same time try to improve upon the methodology. The current project had two objectives:
1. To see if Comroe and Dripps's original methodology was 'replicable'.
2. To validate the key findings of Comroe and Dripps.
By looking at neonatal intensive care (NIC), we concluded that Comroe and Dripps's study, as reported, is not repeatable, reliable or valid, and thus is an insufficient evidence base for increased expenditure on basic biomedical research. We did, however, develop an alternative methodology which used bibliographic databases and bibliometric techniques to describe the research underpinning five of the most important clinical advances in NIC, as identified through a Delphi survey.
Using the revised bibliometric protocol, we demonstrated that after a time-lag of about 17 years, between 2 and 21 per cent of research underpinning the clinical advances could be described as basic. This observation is at odds with Comroe and Dripps's finding that 62 per cent of key research articles judged to be essential for later clinical advances were the result of basic research.
In reaching this conclusion we are acutely aware of the significant limitations to the revised methodology and, therefore, we caution against the over-interpretation of our results. However, we would argue that there needs to be a greater understanding of how basic research supports healthcare and hope this report will inform part of this wider debate. R&D Directorate of the NHS Executive London; Wellcome Trust
Theoretical studies of the historical development of the accounting discipline: a review and evidence
Many existing studies of the development of accounting thought have either been atheoretical or have adopted Kuhn's model of scientific growth. The limitations of this 35-year-old model are discussed. Four different general neo-Kuhnian models of scholarly knowledge development are reviewed and compared with reference to an analytical matrix. The models are found to be mutually consistent, with each focusing on a different aspect of development. A composite model is proposed. Based on a hand-crafted database, author co-citation analysis is used to map empirically the entire literature structure of the accounting discipline during two consecutive time periods, 1972-81 and 1982-90. The changing structure of the accounting literature is interpreted using the proposed composite model of scholarly knowledge development.
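At its core, the author co-citation analysis described above reduces to counting how often two authors appear together in the same paper's reference list; the resulting pair counts form the raw matrix that mapping techniques then cluster. A minimal sketch (author names and reference lists below are hypothetical):

```python
from itertools import combinations
from collections import Counter

def author_cocitation_counts(reference_lists):
    """Count how often two authors are cited together ('co-cited')
    in the same paper's reference list."""
    counts = Counter()
    for refs in reference_lists:
        # sorted() gives each pair a canonical key; set() deduplicates
        # multiple citations of the same author within one paper.
        for a, b in combinations(sorted(set(refs)), 2):
            counts[(a, b)] += 1
    return counts

# Each inner list: the authors cited by one paper (invented data).
papers = [["Kuhn", "Watts", "Zimmerman"],
          ["Kuhn", "Watts"],
          ["Watts", "Zimmerman"]]
counts = author_cocitation_counts(papers)
print(counts[("Kuhn", "Watts")])  # co-cited in two papers -> 2
```

In a full study, these counts would be normalised (e.g., into a correlation or cosine matrix) before clustering or multidimensional scaling.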
Fixing Rule 702: The PCAST Report and Steps to Ensure the Reliability of Forensic Feature-Comparison Methods in the Criminal Courts
In response to PCAST's recommendation, the Standing Advisory Committee on Evidence Rules convened a meeting on forensic expert testimony, Daubert, and Rule 702 on October 27, 2017, at Boston College Law School to inform itself about the issues. The meeting included presentations by twenty-six speakers (including myself) and discussion among the attendees. The purpose of this Article is to summarize aspects of the PCAST report relevant to its recommendation to the Standing Advisory Committee on Evidence Rules and to propose a path forward with respect to Rule 702.
The Child is Father of the Man: Foresee the Success at the Early Stage
Understanding the dynamic mechanisms that drive high-impact scientific work (e.g., research papers, patents) is a long-debated research topic and has many important implications, ranging from personal career development and recruitment search to the jurisdiction of research resources. Recent advances in characterizing and modeling scientific success have made it possible to forecast the long-term impact of scientific work, where data mining techniques, supervised learning in particular, play an essential role. Despite much progress, several key algorithmic challenges in relation to predicting long-term scientific impact have largely remained open. In this paper, we propose a joint predictive model to forecast the long-term scientific impact at the early stage, which simultaneously addresses a number of these open challenges, including scholarly feature design, non-linearity, domain heterogeneity and dynamics. In particular, we formulate it as a regularized optimization problem and propose effective and scalable algorithms to solve it. We perform extensive empirical evaluations on large, real scholarly data sets to validate the effectiveness and the efficiency of our method.
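The abstract frames the predictor as a regularized optimization problem. As a generic illustration of that formulation only (not the authors' joint model; the features and data below are synthetic), ridge regression maps early-stage features of a paper to its long-term impact with an L2 penalty:

```python
import numpy as np

# Synthetic data: 200 "papers", 3 early-stage features each
# (e.g., early citations, venue score, author track record -- all invented).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=200)  # long-term impact (synthetic)

lam = 1.0  # regularization strength
# Closed-form solution of  min_w ||Xw - y||^2 + lam * ||w||^2
w = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
print(np.round(w, 2))  # close to true_w, shrunk slightly toward zero
```

The L2 term keeps the solve well-conditioned when features are correlated, which is one standard reason abstracts of this kind adopt a regularized formulation; the actual paper layers domain-heterogeneity and dynamics on top of such a base objective.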
Methodology for profiling literature in healthcare simulation
The publications that relate to the application of simulation to healthcare have steadily increased over the years. These publications are scattered amongst various journals that belong to several subject categories, including Operational Research, Health Economics and Pharmacokinetics. The simulation techniques that are applied to the study of healthcare problems are also varied. The aim of this study is to present a methodology for profiling literature in healthcare simulation. In our methodology, we have considered papers on healthcare that were published between 1970 and 2007 in journals with impact factors belonging to various subject categories, reporting on the application of four simulation techniques, namely Monte Carlo Simulation, Discrete-Event Simulation, System Dynamics and Agent-Based Simulation. The methodology has the following objectives: (a) to categorise the papers under the different simulation techniques and identify the healthcare problems that each technique is employed to investigate; (b) to profile, within our dataset, variables such as authors, article citations, etc.; (c) to identify turning-point (strategically important) papers and authors through co-citation analysis of references cited by the papers in our dataset. The focus of the paper is on the literature profiling methodology, and not the results that have been derived through the application of this methodology. The authors hope that the methodology presented here will be used to conduct similar work in not only healthcare but also other research domains.