Towards automatic Markov reliability modeling of computer architectures
The analysis and evaluation of reliability measures using time-varying Markov models is required for Processor-Memory-Switch (PMS) structures that have competing processes such as standby redundancy and repair, or renewal processes such as transient or intermittent faults. The task of generating these models is tedious and prone to human error due to the large number of states and transitions involved in any reasonable system. Therefore model formulation is a major analysis bottleneck, and model verification is a major validation problem. The general unfamiliarity of computer architects with Markov modeling techniques further increases the necessity of automating the model formulation. This paper presents an overview of the Automated Reliability Modeling (ARM) program, under development at NASA Langley Research Center. ARM will accept as input a description of the PMS interconnection graph, the behavior of the PMS components, the fault-tolerant strategies, and the operational requirements. The output of ARM will be the reliability or availability Markov model formulated for direct use by evaluation programs. The advantages of such an approach are (a) utility to a large class of users, not necessarily expert in reliability analysis, and (b) a lower probability of human error in the computation.
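The class of models ARM targets can be illustrated with a minimal sketch (hypothetical rates, not ARM output or its algorithm): a two-unit standby system with repair, whose Kolmogorov forward equations are integrated numerically to give the reliability R(t).

```python
# Minimal sketch of a Markov reliability model (hypothetical parameters):
# a two-unit standby system with repair, as a 3-state continuous-time chain.
# State 0: active + standby ok; state 1: one unit failed (under repair);
# state 2: both failed (absorbing system-failure state).

def standby_reliability(lam, mu, t_end, dt=1e-3):
    """Integrate the Kolmogorov forward equations by explicit Euler steps.

    lam: per-unit failure rate; mu: repair rate; returns R(t_end) = P0 + P1.
    """
    p0, p1, p2 = 1.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        d0 = -lam * p0 + mu * p1              # leave 0 on failure, return on repair
        d1 = lam * p0 - (lam + mu) * p1       # enter from 0, leave by repair/failure
        d2 = lam * p1                         # second failure is absorbing
        p0, p1, p2 = p0 + d0 * dt, p1 + d1 * dt, p2 + d2 * dt
    return p0 + p1  # probability the system has not yet failed

# Repair (mu > 0) raises reliability relative to the no-repair case,
# which has the closed form R(t) = exp(-lam*t) * (1 + lam*t).
r_repair = standby_reliability(lam=0.01, mu=0.5, t_end=100.0)
r_no_repair = standby_reliability(lam=0.01, mu=0.0, t_end=100.0)
```

Automating exactly this step, building the state space and rate matrix from a PMS description rather than by hand, is the bottleneck the paper describes.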
An integrative computational model for intestinal tissue renewal
Objectives

The luminal surface of the gut is lined with a monolayer of epithelial cells that acts as a nutrient absorptive engine and protective barrier. To maintain its integrity and functionality, the epithelium is renewed every few days. Theoretical models are powerful tools that can be used to test hypotheses concerning the regulation of this renewal process, to investigate how its dysfunction can lead to loss of homeostasis and neoplasia, and to identify potential therapeutic interventions. Here we propose a new multiscale model for crypt dynamics that links phenomena occurring at the subcellular, cellular and tissue levels of organisation.

Methods

At the subcellular level, deterministic models characterise molecular networks, such as cell-cycle control and Wnt signalling. The output of these models determines the behaviour of each epithelial cell in response to intra-, inter- and extracellular cues. The modular nature of the model enables us to easily modify individual assumptions and analyse their effects on the system as a whole.

Results

We perform virtual microdissection and labelling-index experiments, evaluate the impact of various model extensions, obtain new insight into clonal expansion in the crypt, and compare our predictions with recent mitochondrial DNA mutation data.

Conclusions

We demonstrate that relaxing the assumption that stem-cell positions are fixed enables clonal expansion and niche succession to occur. We also predict that the presence of extracellular factors near the base of the crypt alone suffices to explain the observed spatial variation in nuclear beta-catenin levels along the crypt axis.
Are Britain's railways costing too much? Perspectives based on TFP comparisons with British Rail: 1963-2002.
Following the Hatfield accident in October 2000, the cost of running Britain's railways has increased very sharply, leading to considerable debate about whether current cost levels are reasonable. This paper seeks to inform this debate by assessing post-Hatfield cost and TFP levels (2000/01 to 2001/02) against the historical precedents set by British Rail and the early experience of the newly-privatised industry (1963 to 1999/00). The results show that industry cash costs rose by 47% between 1999/00, the last financial year before Hatfield, and 2001/02 - but, surprisingly, with train operating costs (TOCs and freight operators) accounting for 42% of this growth. The results also show that the post-Hatfield cost spike is unprecedented when compared against historical benchmarks, indicating that recent cost rises cannot simply be explained by the investment cycle or so-called 'bow-wave' effects. Furthermore, according to the preferred models, post-Hatfield productivity levels are lower than at any time over the last four decades. Analysis of long-term data on quality and safety measures indicates that an excessive focus on rail safety may offer part of the explanation for the recent cost growth, with the emphasis on safety also resulting in less attention to punctuality and reliability.
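TFP comparisons of this kind typically divide an output quantity index by an input quantity index. A minimal Törnqvist-style sketch with made-up figures (not the paper's data, outputs, or weighting scheme) shows the mechanics:

```python
import math

def tornqvist_tfp(outputs0, outputs1, out_shares, inputs0, inputs1, in_shares):
    """Törnqvist TFP index between two periods: output index / input index,
    using (here, fixed) revenue shares and cost shares as log-weights.
    In practice the shares are usually averaged across the two periods."""
    log_out = sum(s * math.log(q1 / q0)
                  for s, q0, q1 in zip(out_shares, outputs0, outputs1))
    log_in = sum(s * math.log(x1 / x0)
                 for s, x0, x1 in zip(in_shares, inputs0, inputs1))
    return math.exp(log_out - log_in)

# Hypothetical rail example: train-km and passenger-km as outputs,
# staff and track as inputs. TFP > 1 means productivity grew.
tfp = tornqvist_tfp([100, 500], [105, 540], [0.4, 0.6],
                    [80, 60], [82, 60], [0.7, 0.3])
```

Here outputs grow faster than inputs, so the index exceeds one; a post-Hatfield cost spike with flat output would push the same index sharply below one.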
Non-equilibrium dynamics of stochastic point processes with refractoriness
Stochastic point processes with refractoriness appear frequently in the
quantitative analysis of physical and biological systems, such as the
generation of action potentials by nerve cells, the release and reuptake of
vesicles at a synapse, and the counting of particles by detector devices. Here
we present an extension of renewal theory to describe ensembles of point
processes with time-varying input. This is made possible by a representation in
terms of occupation numbers of two states: active and refractory. The dynamics
of these occupation numbers follows a distributed delay differential equation.
In particular, our theory enables us to uncover the effect of refractoriness on
the time-dependent rate of an ensemble of encoding point processes in response
to modulation of the input. We present exact solutions that demonstrate generic
features, such as stochastic transients and oscillations in the step response
as well as resonances, phase jumps and frequency doubling in the transfer of
periodic signals. We show that a large class of renewal processes can indeed be
regarded as special cases of the model we analyze. Hence our approach
represents a widely applicable framework to define and analyze non-stationary
renewal processes.
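The basic effect of refractoriness on an ensemble rate can be seen in a toy discrete-time simulation (our own illustration with hypothetical parameters, not the authors' delay-differential formulation): units fire as Poisson processes except during an absolute refractory period, and the steady ensemble rate drops below the nominal rate according to the classic dead-time formula rate/(1 + rate*tau_ref).

```python
import random

def ensemble_rate(n_units=4000, rate=50.0, tau_ref=0.005, t_end=0.4,
                  dt=0.0005, seed=1):
    """Simulate an ensemble of Poisson-like units with an absolute refractory
    period; return the mean firing rate (Hz) over the second half of the run."""
    rng = random.Random(seed)
    ref_steps = round(tau_ref / dt)       # bins a unit stays refractory
    p = rate * dt                         # firing probability per eligible bin
    steps = int(t_end / dt)
    count_from = steps // 2               # discard the initial transient
    last = [-ref_steps] * n_units         # all units eligible at t = 0
    spikes = 0
    for step in range(steps):
        for i in range(n_units):
            if step - last[i] >= ref_steps and rng.random() < p:
                last[i] = step
                if step >= count_from:
                    spikes += 1
    return spikes / (n_units * (steps - count_from) * dt)

# Dead-time prediction: 50 / (1 + 50 * 0.005) = 40 Hz, well below nominal 50 Hz
r = ensemble_rate()
```

The synchronous start also produces the damped population-rate oscillation the abstract mentions; averaging over the second half of the run removes it.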
Report : review of the literature : maintenance and rehabilitation costs for roads (Risk-based Analysis)
Realistic estimates of short- and long-term (strategic) budgets for road asset maintenance and rehabilitation should consider the stochastic characteristics of asset condition across the road network, so that the overall variability of road condition data is taken into account.
Probability theory has been used to assess life-cycle costs for bridge infrastructure by Kong and Frangopol (2003), Zayed et al. (2002), Liu and Frangopol (2004), Noortwijk and Frangopol (2004), and Novick (1993). Salem et al. (2003) cited the importance of collecting and analysing existing data on total costs for all life-cycle phases of existing infrastructure, including bridges and roads, and of using realistic methods to calculate the probable useful life of these assets. Zayed et al. (2002) reported conflicting results in life-cycle cost analysis using deterministic and stochastic methods. Frangopol et al. (2001) suggested that additional research was required to develop better life-cycle models and tools to quantify the risks and benefits associated with infrastructure.
It is evident from the review of the literature that there is very limited information on methodologies that use the stochastic characteristics of asset condition data to assess budgets/costs for road maintenance and rehabilitation (Abaza 2002; Salem et al. 2003; Zhao et al. 2004). Given this gap in the research literature, this report describes and summarises the methodologies presented in each publication and also suggests a methodology for the current research project funded under the Cooperative Research Centre for Construction Innovation (CRC CI), project no. 2003-029-C.
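The deterministic-versus-stochastic contrast reported by Zayed et al. can be sketched generically (hypothetical numbers, not figures from any cited study): because discounting is convex in asset life, treating life as a random variable rather than a point estimate changes the expected discounted rehabilitation cost.

```python
import random

def deterministic_lcc(cost, life, rate):
    """Present value of one rehabilitation placed at the mean asset life."""
    return cost / (1 + rate) ** life

def stochastic_lcc(cost, life_low, life_high, rate, n=100_000, seed=7):
    """Expected present value when asset life is uniform on [low, high],
    estimated by simple Monte Carlo sampling."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        life = rng.uniform(life_low, life_high)
        total += cost / (1 + rate) ** life
    return total / n

# Hypothetical pavement rehabilitation: $1m cost, mean life 15 years, 5% rate.
det = deterministic_lcc(1_000_000, 15.0, 0.05)
sto = stochastic_lcc(1_000_000, 10.0, 20.0, 0.05)
# By Jensen's inequality the stochastic expectation exceeds the point estimate.
```

This is the simplest possible version of the point; the cited bridge studies layer condition-deterioration models and maintenance policies on top of the same idea.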
Equivalent random analysis of a buffered optical switch with general interarrival times
We propose an approximate analytic model of an optical switch with fibre delay lines and wavelength converters by employing Equivalent Random Theory. General arrival traffic is modelled by means of Gamma-distributed interarrival times. The analysis is formulated in terms of virtual traffic flows within the optical switch from which we derive expressions for burst blocking probability, fibre delay line occupancy and mean delay. Emphasis is on approximations that give good numerical efficiency so that the method can be useful for formulating dimensioning problems for large-scale networks. Numerical solution values from the proposed analysis method compare well with results from a discrete-event simulation of an optical burst switch
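Equivalent Random Theory works by replacing non-Poisson ("peaked") traffic with an equivalent Poisson load offered to fictitious extra servers; its core building block is the Erlang B loss formula, shown here via the standard stable recursion (our own illustration, not the paper's full switch model):

```python
def erlang_b(offered_load, servers):
    """Erlang B blocking probability for Poisson traffic on a full-availability
    group, computed with the numerically stable recursion
    B(n) = A*B(n-1) / (n + A*B(n-1)), B(0) = 1."""
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

# Classic check: 10 erlangs offered to 10 servers blocks about 21.5% of calls;
# in an optical switch the "servers" would be wavelengths/delay-line positions.
blocking = erlang_b(10.0, 10)
```

The recursion runs in O(servers) time with no factorials, which is what makes it usable inside the kind of large-scale dimensioning loops the abstract has in mind.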
Assessing the efficient cost of sustaining Britain's rail network: Perspectives based on zonal comparisons
The objective of this paper is to inform the debate on how efficiency targets for Network Rail (formerly Railtrack) should be set during the 2002/03 Interim Review and beyond. Given the problems experienced during the 2000 Periodic Review, which focused on external benchmarks, we propose an internal benchmarking approach, drawing on data for seven geographical Zones within Railtrack (over the period 1995/96 to 2001/02). Our approach mirrors the yardstick competition method used in other UK regulated industries. Three efficiency measurement techniques are applied to this data (DEA; COLS; SFA). Our results suggest that Railtrack (as a whole) delivered substantial improvements in productivity in the early years after privatisation, although these savings were largely offset by the post-Hatfield cost increases. However, looking forward, Zonal efficiency differences suggest that the company could make significant savings in future years by applying (its own) best practice consistently across the network
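Of the three techniques named, COLS (corrected ordinary least squares) is the simplest to sketch: fit OLS to log cost against log output, shift the line down to the best performer, and read each unit's efficiency as its distance from that frontier. The data below are made up for illustration, not the paper's Zonal dataset:

```python
import math

def cols_efficiency(outputs, costs):
    """Toy one-regressor COLS: returns efficiency scores in (0, 1],
    where 1.0 means the unit sits on the shifted cost frontier."""
    x = [math.log(q) for q in outputs]
    y = [math.log(c) for c in costs]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
            / sum((xi - mx) ** 2 for xi in x))
    alpha = my - beta * mx
    residuals = [yi - (alpha + beta * xi) for xi, yi in zip(x, y)]
    best = min(residuals)  # cost frontier: shift to the lowest-cost unit
    return [math.exp(best - r) for r in residuals]

# Seven hypothetical "zones": cost roughly proportional to output
outputs = [100, 120, 90, 150, 110, 130, 95]
costs = [200, 250, 180, 310, 240, 260, 210]
scores = cols_efficiency(outputs, costs)
```

DEA and SFA differ in how they build the frontier (piecewise-linear envelopment and a composed-error regression, respectively), but all three yield the same kind of per-Zone score that the paper compares.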