Merging expert and empirical data for rare event frequency estimation: pool homogenisation for empirical Bayes models
Empirical Bayes provides one approach to estimating the frequency of rare events as a weighted average of the frequency of an event and that of a pool of events. The pool will draw upon, for example, events with similar precursors. The higher the degree of homogeneity of the pool, the more accurate the Empirical Bayes estimator. We propose and evaluate a new method using homogenisation factors under the assumption that events are generated from a homogeneous Poisson process. The homogenisation factors are scaling constants, which can be elicited through structured expert judgement and used to align the frequencies of different events, hence homogenising the pool. The estimation error relative to the homogeneity of the pool is examined theoretically, indicating that reduced error is associated with greater pool homogeneity. The effects of misspecified expert assessments of the homogenisation factors are examined theoretically and through simulation experiments. Our results show that the proposed Empirical Bayes method using homogenisation factors is robust under different degrees of misspecification.
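The shrinkage idea in the abstract can be sketched in a few lines. This is a generic gamma-Poisson empirical Bayes form, not the paper's exact estimator; the function name, the `prior_strength` parameter, and the pseudo-exposure heuristic are assumptions made for illustration.

```python
def eb_rate(n_target, t_target, counts, exposures, factors, prior_strength=10.0):
    """Empirical Bayes estimate of a rare event's occurrence rate.

    factors[i] is the expert-elicited ratio of pool member i's event
    frequency to the target event's frequency; multiplying member i's
    exposure by it expresses that exposure on the target's scale,
    homogenising the pool.
    """
    pooled_events = n_target + sum(counts)
    pooled_exposure = t_target + sum(t * c for t, c in zip(exposures, factors))
    pool_rate = pooled_events / pooled_exposure
    # Gamma-Poisson shrinkage: treat the pool as `prior_strength` units of
    # pseudo-exposure observed at the pooled rate (a common EB heuristic,
    # assumed here rather than taken from the paper).
    return (n_target + prior_strength * pool_rate) / (t_target + prior_strength)
```

With zero observed events for the target, the estimate is pulled towards the pooled rate instead of collapsing to zero, which is the practical benefit of the weighted-average form.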
Predicting with sparse data
It is well known that effective prediction of project cost related factors is an important aspect of software engineering. Unfortunately, despite extensive research over more than 30 years, this remains a significant problem for many practitioners. A major obstacle is the absence of reliable and systematic historic data, yet this is a sine qua non for almost all proposed methods: statistical, machine learning or calibration of existing models. In this paper we describe our sparse data method (SDM) based upon a pairwise comparison technique and Saaty's Analytic Hierarchy Process (AHP). Our minimum data requirement is a single known point. The technique is supported by a software tool known as DataSalvage. We show, for data from two companies, how our approach, based upon expert judgement, adds value to expert judgement by producing significantly more accurate and less biased results. A sensitivity analysis shows that our approach is robust to pairwise comparison errors. We then describe the results of a small usability trial with a practising project manager. From this empirical work we conclude that the technique is promising and may help overcome some of the present barriers to effective project prediction.
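The AHP core of the method above can be sketched as follows: expert pairwise ratios form a comparison matrix, its principal eigenvector gives relative project sizes, and the single known data point anchors them to absolute values. This is a minimal sketch of the general AHP mechanism, not the SDM/DataSalvage implementation.

```python
def ahp_estimates(pairwise, known_index, known_value, iters=100):
    """pairwise[i][j]: expert's judged ratio of project i's effort to j's.
    Returns absolute effort estimates anchored at a single known point."""
    n = len(pairwise)
    w = [1.0] * n
    for _ in range(iters):
        # Power iteration towards the principal eigenvector.
        w_new = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w_new)
        w = [x / s for x in w_new]
    # Scale relative weights so the known project matches its known effort.
    scale = known_value / w[known_index]
    return [x * scale for x in w]
```

For a perfectly consistent matrix the eigenvector reproduces the true ratios exactly; with real expert input, the eigenvector averages out inconsistencies across the pairwise judgements.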
A Methodology for Variability Reduction in Manufacturing Cost Estimating in the Automotive Industry based on Design Features
Organised by: Cranfield University

Small to medium manufacturing companies are coming to realise the increasing importance of performing fast and accurate cost estimates at the early stages of projects to address customers' requests for quotation. However, they cannot afford the implementation of knowledge-based cost estimating software. This paper explains the development and validation of a consistent methodology for the cost estimating of manufactured parts (focused on pistons) based on design features. The research enabled the identification of the sources of variability in cost estimates, the main one being the lack of formal procedures for cost estimating in manufacturing SMEs. Finally, a software prototype was developed that reduces the variability in cost estimates by defining a formal procedure, following the most appropriate cost estimating techniques.

Mori Seiki - The Machine Tool Company
Estimating past inhalation exposure to asbestos: a tool for risk attribution and disease screening
Introduction:
Late presentation is common in mesothelioma. Reliable assessment of past exposure to asbestos is a necessary first step for risk attribution and for the development of a future screening programme. Such a programme could maximise access to trials of novel therapies and would pave the way for development of novel chemoprophylaxis strategies. This paper describes a method for individual exposure reconstruction along with data from a validation study.
Methods:
The exposure assessment method uses only descriptive information about the circumstances of the work that could be obtained from questioning the worker. The assessment is based on the tasks carried out and includes parameters for substance emission potential, activity emission potential, the effectiveness of any local control measures, passive emission, the fractional time the asbestos source is active and the efficiency of any respiratory protection worn.
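The multiplicative structure of the assessment can be illustrated directly from the parameters listed above. The parameterisation below is an assumption for illustration only; the paper's calibrated model and parameter scales are not reproduced here.

```python
def task_exposure(substance_emission, activity_emission, control_efficiency,
                  passive_emission, time_fraction_active, rpe_efficiency):
    """Illustrative multiplicative exposure model for one task.

    Inputs mirror the parameters named in the abstract: emission potentials,
    local-control effectiveness, passive emission, fraction of time the
    source is active, and respiratory protective equipment efficiency.
    All values here are dimensionless sketches, not calibrated quantities.
    """
    # Active source strength after local control measures.
    active = substance_emission * activity_emission * (1.0 - control_efficiency)
    # Time-weighted source plus background passive emission.
    source = time_fraction_active * active + passive_emission
    # Exposure inhaled after respiratory protection.
    return source * (1.0 - rpe_efficiency)
```

Because each factor multiplies the others, uncertainty in any one parameter propagates proportionally into the estimate, which is consistent with the roughly factor-of-ten agreement reported in the results.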
Results:
There was a good association between the estimated and measured exposure levels. Pearson's correlation coefficient between the log-transformed measurements and estimates from the model was 0.86, and 95% of the estimated individual values were within about a factor of ten of the associated measured value. The method described would be suitable for pre-selecting individuals at high risk of malignant pleural mesothelioma for screening using appropriate tools and/or enrolment in clinical trials of chemo-prophylaxis.
Discussion:
This method is of potential clinical value in developing novel treatment approaches for mesothelioma. Pilot studies to test this approach are urgently needed.
Estimating the cost of a new technology intensive automotive product: A case study approach.
Estimating the cost of new technology intensive products is very ad hoc within the automotive industry. There is a need to develop a systematic approach to cost estimating, which will make the estimates more realistic. This research proposes a methodology that uses parametric, analogy and detailed estimating techniques to enable a cost to be built up for an automotive powertrain product with a high content of new technology. The research defines a process for segregating new or emerging technologies from current technologies to enable the various costing techniques to be utilised. The cost drivers derived from an internal combustion engine's characteristics to facilitate a cost estimate for high-volume production are also presented. A process to enable a costing expert either to build an estimate for the new technology under analysis or to use a comparator and then develop a variant for the new system is also discussed. Due to the open nature of the term "new technology", research is also conducted to provide a meaningful definition applicable to the automotive industry and this pro
The use of function points to find cost analogies
Finding effective techniques for the early estimation of project effort remains an important, and frustratingly elusive, research objective for the software development community. We have conducted an empirical study of 21 real-time projects for a major software developer. The study collected a range of counts and measures derived from specification documents, including a derivative of Function Points intended for highly constrained systems. Notwithstanding the fact that the projects were drawn from a comparatively stable environment, traditional approaches for building prediction systems (for example, regression analysis) failed to yield a useful predictive model. By contrast, estimation based upon the automated search for analogous projects produced more accurate estimates. How much this is a characteristic of this particular dataset and how much these findings might be more generally replicated is uncertain. Nevertheless, these results should act as encouragement for follow-up research on a much under-utilised estimation technique.
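Analogy-based estimation of the kind described above reduces to a nearest-neighbour search over project feature vectors, with effort predicted from the closest completed projects. This is a minimal sketch of the technique, using unweighted Euclidean distance; it is not the study's tooling, and the choice of `k` and distance metric are assumptions.

```python
def analogy_estimate(target, projects, efforts, k=2):
    """projects: historical feature vectors (e.g. function-point counts);
    efforts: their known efforts. Predict the target's effort as the mean
    effort of its k nearest analogues."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    # Rank historical projects by similarity to the target specification.
    ranked = sorted(range(len(projects)), key=lambda i: dist(target, projects[i]))
    nearest = ranked[:k]
    return sum(efforts[i] for i in nearest) / k
```

Unlike regression, this makes no assumption about the functional form linking the measures to effort, which may explain why the search for analogies succeeded where regression failed on this dataset.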
Quantified Risk and Uncertainty Analysis
The legal requirement in the UK for the duty holder of a chemical process plant to demonstrate that risk is as low as reasonably practicable (ALARP) means that quantified risk assessments (QRAs) must be accurate and robust and that identified risks are adequately mitigated. Bayesian belief networks (BBNs) are an emerging technique which can be used to determine the likelihood of an event in support of the QRA process. It is a statistical method involving estimating the probability distribution for a given hypothesis. The most interesting features which distinguish this QRA technique from all the others are:
• it can analyse complex systems of any given number of variables and their dependability within a single analysis;
• it can analyse parameters over a range of probability values for any given set of conditions, providing a better understanding in terms of sensitivity analysis;
• it engages expert judgement and learning from previous events to update the probability distribution, thus improving QRA accuracy; and
• it is not just restricted to fault analysis and can be used to support plant operational decision making using a quantified approach.
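A toy two-parent network shows the mechanics behind these features: expert-supplied conditional probabilities are combined by enumeration to give the likelihood of a top event, and the same network answers diagnostic queries by Bayesian updating. All node names and probabilities below are hypothetical illustrations, not values from any QRA.

```python
# Hypothetical network: an initiating fault F and an alarm failure A feed a
# top event E, with expert-supplied conditional probabilities.
p_fault = 0.01
p_alarm_fail = 0.05
p_event_given = {  # P(E=1 | F, A); zero rows encode "no event without a fault"
    (1, 1): 0.9,
    (1, 0): 0.2,
    (0, 1): 0.0,
    (0, 0): 0.0,
}

def p_event():
    """Marginal probability of the top event by enumeration over parents."""
    total = 0.0
    for f in (0, 1):
        for a in (0, 1):
            pf = p_fault if f else 1 - p_fault
            pa = p_alarm_fail if a else 1 - p_alarm_fail
            total += pf * pa * p_event_given[(f, a)]
    return total

def p_fault_given_event():
    """Diagnostic query P(F=1 | E=1): the Bayesian updating a BBN supports."""
    joint = sum(p_fault * (p_alarm_fail if a else 1 - p_alarm_fail)
                * p_event_given[(1, a)] for a in (0, 1))
    return joint / p_event()
```

Observing new evidence (here, that the event occurred) revises the probability of each cause, which is the "learning from previous events" feature listed above; real plant models simply have many more nodes and a dedicated inference engine.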
Multivariate reliability modelling with empirical Bayes inference
Recent developments in technology permit detailed descriptions of system performance to be collected and stored. Consequently, more data are available about the occurrence, or non-occurrence, of events across a range of classes through time. Typically this implies that reliability analysis has more information about the exposure history of a system within different classes of events. For highly reliable systems, there may be relatively few failure events. Thus there is a need to develop statistical inference to support reliability estimation when there is a low ratio of failures relative to event classes. In this paper we show how Empirical Bayes methods can be used to estimate a multivariate reliability function for a system by modelling the vector of times to realise each failure root cause.
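The low-failure-count setting above is where empirical Bayes earns its keep: fitting a prior across the root-cause classes and shrinking each class's sparse rate estimate towards it. The sketch below uses a method-of-moments gamma prior with per-class Poisson counts over a common exposure; it is a generic univariate-per-class illustration, not the paper's multivariate model.

```python
def eb_class_rates(counts, exposure):
    """counts: failures per root-cause class over a common exposure time.
    Fit a gamma prior across classes by method of moments, then return the
    posterior mean rate for each class."""
    k = len(counts)
    rates = [n / exposure for n in counts]
    mean = sum(rates) / k
    var = sum((r - mean) ** 2 for r in rates) / k
    # Subtract the Poisson sampling noise from the observed between-class
    # variance; floor it to keep the prior proper (an assumed safeguard).
    var_between = max(var - mean / exposure, 1e-12)
    beta = mean / var_between   # gamma prior rate
    alpha = mean * beta         # gamma prior shape
    # Gamma-Poisson conjugacy: posterior mean per class.
    return [(alpha + n) / (beta + exposure) for n in counts]
```

A class with zero observed failures still gets a positive rate, borrowed from the other classes, which is exactly the behaviour needed when failures are rare relative to the number of event classes.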
The price of risk in construction projects: contingency approximation model (CAM)
Little attention has been focussed on a precise definition and evaluation mechanism for project management risk specifically related to contractors. When bidding, contractors traditionally price risks using unsystematic approaches. The high business failure rate recorded in the industry may indicate that the current unsystematic mechanisms contractors use for building up contingencies are inadequate. The reluctance of some contractors to include a price for risk in their tenders when bidding for work competitively may also not be a useful approach. Here, instead, we first define the meaning of contractor contingency, and then we develop a straightforward quantitative technique that contractors can use to estimate a price for project risk. This model will help contractors analyse their exposure to project risks, and help them express the risk in monetary terms for management action. When bidding for work, they can decide how to allocate contingencies strategically in a way that balances risk and reward.
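A systematic contingency price of the kind argued for above is often built from a risk register of (probability, cost impact) pairs. The sketch below computes the expected monetary value plus a Monte Carlo percentile for a chosen risk appetite; it is a generic illustration of pricing risk in monetary terms, not the paper's CAM model.

```python
import random

def contingency_emv(risks):
    """risks: (probability, cost_impact) pairs from the tender risk register.
    Expected monetary value of the identified risks."""
    return sum(p * impact for p, impact in risks)

def contingency_percentile(risks, level=0.8, trials=10000, seed=1):
    """Monte Carlo contingency at a confidence level: each risk occurs
    independently with its probability in each simulated project."""
    rng = random.Random(seed)
    totals = sorted(
        sum(impact for p, impact in risks if rng.random() < p)
        for _ in range(trials)
    )
    return totals[int(level * trials) - 1]
```

The gap between the EMV and a high percentile makes the contractor's risk/reward trade-off explicit: pricing at the EMV wins more bids but leaves the tail exposure uncovered.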