
    Distinguishing Hidden Markov Chains

    Hidden Markov Chains (HMCs) are commonly used mathematical models of probabilistic systems. They are employed in various fields such as speech recognition, signal processing, and biological sequence analysis. We consider the problem of distinguishing two given HMCs based on an observation sequence that one of the HMCs generates. More precisely, given two HMCs and an observation sequence, a distinguishing algorithm is expected to identify the HMC that generates the observation sequence. Two HMCs are called distinguishable if for every ε > 0 there is a distinguishing algorithm whose error probability is less than ε. We show that one can decide in polynomial time whether two HMCs are distinguishable. Further, we present and analyze two distinguishing algorithms for distinguishable HMCs. The first algorithm makes a decision after processing a fixed number of observations, and it exhibits two-sided error. The second algorithm processes an unbounded number of observations, but it has only one-sided error. The error probability of both algorithms decays exponentially with the number of processed observations. We also provide an algorithm for distinguishing multiple HMCs. Finally, we discuss an application in stochastic runtime verification. Comment: This is the full version of a LICS'16 paper
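    The fixed-length, two-sided-error scheme described above can be sketched as a likelihood comparison: run the forward algorithm under each HMC and output the model that assigns the observations the higher probability. This is a minimal illustration with made-up toy models, not the paper's exact algorithm or analysis.

    ```python
    # Sketch: distinguish two HMCs by comparing forward-algorithm likelihoods.
    # The two toy 2-state models below are illustration values, not from the paper.

    def forward_likelihood(init, trans, emit, obs):
        """P(obs) under an HMC given as (initial dist, transition, emission) matrices."""
        n = len(init)
        alpha = [init[s] * emit[s][obs[0]] for s in range(n)]
        for o in obs[1:]:
            alpha = [sum(alpha[s] * trans[s][t] for s in range(n)) * emit[t][o]
                     for t in range(n)]
        return sum(alpha)

    def distinguish(hmc1, hmc2, obs):
        """Return 1 or 2: the HMC more likely to have generated obs (two-sided error)."""
        l1 = forward_likelihood(*hmc1, obs)
        l2 = forward_likelihood(*hmc2, obs)
        return 1 if l1 >= l2 else 2

    # Toy models over observations {0, 1}: hmc1 emits mostly 0s, hmc2 mostly 1s.
    hmc1 = ([1.0, 0.0], [[0.5, 0.5], [0.5, 0.5]], [[0.9, 0.1], [0.9, 0.1]])
    hmc2 = ([1.0, 0.0], [[0.5, 0.5], [0.5, 0.5]], [[0.1, 0.9], [0.1, 0.9]])

    print(distinguish(hmc1, hmc2, [0, 0, 1, 0, 0]))  # → 1
    ```

    On a mostly-0 observation sequence the first model wins; the paper shows that for distinguishable HMCs the error probability of such a fixed-length test decays exponentially in the sequence length.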

    Ultraviolet Spectrometric Characterization of Metal Decontamination

    The viability of an ultraviolet fiber-optic method for quantifying contamination levels on metal substrates was demonstrated. Contaminated metal samples obtained from industrial processing operations were cleaned with alternative solvents that are replacing chlorofluorocarbons, which are being phased out. Cleaned samples with varying amounts of oils remaining on them were analyzed for residual contamination using four testing methods: X-ray spectroscopy, infrared spectroscopy, ultraviolet spectroscopy, and gravimetric analysis. Multivariate data analysis techniques were used to prepare a model from the ultraviolet and infrared data. Both principal component regression (PCR) and partial least squares (PLS) methods were used in the analysis. A one-factor model was found to be adequate for the infrared data; interpretation of the results using different types of cross validation showed that two factors were required for the ultraviolet model. Cleanliness levels were also determined using electron spectroscopy for chemical analysis (ESCA) and gravimetric methods. The results from all four methods were expressed as amount of contamination in milligrams per unit area. The values obtained were compared against actual contamination data, and the sum of squares of errors was calculated. The predicted residual sum of squares of error for an equal number of comparison points for IR, UV, ESCA, and gravimetry were 0.20, 0.84, 1.39, and 2.54 (mg/ft²)², respectively. The predictive ability of each method was related to its cost and ease of use. The ultraviolet instrumentation tested in this study ranked second in predictive ability and second in cost, making it a viable option for cleanliness testing purposes
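    Principal component regression, one of the two multivariate methods named above, can be sketched as: project mean-centered spectra onto their leading principal components, regress the response on those scores, and map the coefficients back to wavelength space. The spectra and contamination values below are synthetic illustration data, not measurements from the study.

    ```python
    import numpy as np

    # Minimal PCR sketch: SVD of centered spectra, regression on leading scores.

    def pcr_fit(X, y, n_components):
        """Fit y ≈ X @ beta using only the leading principal components of X."""
        x_mean, y_mean = X.mean(axis=0), y.mean()
        Xc, yc = X - x_mean, y - y_mean
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        V = Vt[:n_components].T              # loadings of the leading components
        scores = Xc @ V                      # project spectra onto components
        gamma, *_ = np.linalg.lstsq(scores, yc, rcond=None)
        beta = V @ gamma                     # map back to wavelength space
        return beta, x_mean, y_mean

    def pcr_predict(X, beta, x_mean, y_mean):
        return (X - x_mean) @ beta + y_mean

    # Synthetic "spectra" (30 samples, 4 channels) with a known relationship.
    rng = np.random.default_rng(0)
    true_beta = np.array([0.5, -0.2, 0.0, 0.1])
    X = rng.normal(size=(30, 4))
    y = X @ true_beta + 0.01 * rng.normal(size=30)

    beta, xm, ym = pcr_fit(X, y, n_components=2)
    pred = pcr_predict(X, beta, xm, ym)
    press = float(np.sum((y - pred) ** 2))   # residual sum of squares of error
    ```

    Cross validation, as used in the study, would repeat this fit while holding out samples in order to choose the number of factors.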

    EAC: A program for the error analysis of STAGS results for plates

    A computer code is now available for estimating the error in results from the STAGS finite element code for a shell unit consisting of a rectangular orthotropic plate. This memorandum contains basic information about the computer code EAC (Error Analysis and Correction) and describes the connection between the input data for the STAGS shell units and the input data necessary to run the error analysis code. The STAGS code returns a set of nodal displacements and a discrete set of stress resultants; the EAC code returns a continuous solution for displacements and stress resultants. The continuous solution is defined by a set of generalized coordinates computed in EAC. The theory and the assumptions that determine the continuous solution are also outlined in this memorandum. An example of application of the code is presented, and instructions for its usage on the Cyber and VAX machines are provided

    Crustal deformation of the Central Indian Ocean, south of Sri Lanka as inferred from gravity and magnetic data

    Bathymetry, gravity, and magnetic data across the Central Indian Ocean Basin (CIOB), along a W–E track between 5°N and 1°N latitude and 77°E and 90°E longitude, are used to identify crustal deformation due to tectonic features such as the Comorin Ridge, the 85°E Ridge, the Ninety East Ridge, and major fracture zones. The tectonic features were interpreted along the North Central Indian Ocean using 2D gravity modelling to understand the origin and tectonic activity of the subsurface features. The Comorin Ridge is associated with gravity anomalies of small amplitude, varying from 25 to 30 mGal relative to the ridge relief, which suggests that the ridge is compensated at deeper depths. The focus of the present study is to prepare a reasonable crustal model of the Central Indian Ocean using gravity and magnetic data. The crustal depths of the CIOB determined from gravity data using the spectral method are compared with the 2D gravity modelling results. The crustal depths obtained from the spectral method correlate well with the results obtained from 2D gravity modelling. The average basement depths for the profiles were obtained as ~5 km, deviating approximately 1–2 km from the mean. In the case of the Curie isotherm, the crustal depths vary from 9 to 12 km for all magnetic profiles, which may indicate deformation
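    One common form of the spectral method referred to above estimates source depth from the slope of the radially averaged log power spectrum: for an ensemble of sources at depth z, ln P(k) falls off roughly as a straight line in wavenumber k, with slope −2z (k in rad/km). The sketch below fits that line by least squares on synthetic values; the formula choice and all numbers are illustrative assumptions, not the study's data or exact procedure.

    ```python
    # Hedged sketch: basement depth from the slope of ln P(k) vs k,
    # assuming ln P(k) ≈ C - 2*z*k (Spector–Grant-style statistical model).

    def depth_from_spectrum(k, ln_power):
        """Least-squares slope of ln P versus k; depth = -slope / 2 (km)."""
        n = len(k)
        mk = sum(k) / n
        mp = sum(ln_power) / n
        slope = sum((ki - mk) * (pi - mp) for ki, pi in zip(k, ln_power)) \
                / sum((ki - mk) ** 2 for ki in k)
        return -slope / 2.0

    # Synthetic spectrum for sources at 5 km depth: ln P = 3 - 2*5*k.
    k = [0.05 * i for i in range(1, 11)]           # wavenumbers, rad/km
    ln_power = [3.0 - 2.0 * 5.0 * ki for ki in k]  # exact straight line

    print(round(depth_from_spectrum(k, ln_power), 2))  # → 5.0
    ```

    On real survey profiles the spectrum is noisy, so the fit is made over a chosen wavenumber band and different bands resolve different interfaces (basement versus Curie depth).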

    A Structural and Behavioral Analysis Approach for Process Model Evaluation.

    Development of process-driven business applications can be supported by process modeling efforts in order to bridge the gap between business requirements and system specifications. However, deviating purposes of business process modeling initiatives have led to considerable problems in aligning related models at different abstraction levels and from different perspectives. Verifying the consistency of such related models is a major challenge for process modeling theory and practice. Our contribution is a concept called behavioral profile, which summarizes the essential behavioral constraints of a process model. We show that these profiles can be computed efficiently, i.e., in cubic time for sound free-choice Petri nets w.r.t. their number of places and transitions. In addition, Support Vector Machines (SVMs) are used to evaluate behavioral and structural consistency with greater confidence
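    The relations making up a behavioral profile can be illustrated over a simple trace set rather than a Petri net: from the weak order relation (a occurs before b in some trace), each task pair is classified as strict order, exclusiveness, or interleaving. This is an illustrative sketch of the concept, not the paper's cubic-time algorithm for sound free-choice nets.

    ```python
    # Sketch: derive behavioral-profile relations from observed traces.
    from itertools import combinations

    def behavioral_profile(traces):
        """Classify each task pair as strict order / exclusive / interleaving."""
        tasks = sorted({t for tr in traces for t in tr})
        weak = set()                       # (a, b): a occurs before b somewhere
        for tr in traces:
            for i in range(len(tr)):
                for j in range(i + 1, len(tr)):
                    weak.add((tr[i], tr[j]))
        profile = {}
        for a, b in combinations(tasks, 2):
            ab, ba = (a, b) in weak, (b, a) in weak
            if ab and ba:
                profile[(a, b)] = "interleaving"
            elif ab:
                profile[(a, b)] = "strict order"
            elif ba:
                profile[(a, b)] = "reverse strict order"
            else:
                profile[(a, b)] = "exclusive"
        return profile

    traces = [("a", "b", "c"), ("a", "c", "b"), ("a", "d")]
    p = behavioral_profile(traces)
    print(p[("a", "b")])  # → strict order   (a always precedes b)
    print(p[("b", "c")])  # → interleaving   (b and c occur in either order)
    print(p[("b", "d")])  # → exclusive      (never in the same trace)
    ```

    Consistency of two related models can then be checked by comparing the classifications their profiles assign to shared task pairs.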

    RoBuSt: A Crash-Failure-Resistant Distributed Storage System

    In this work we present the first distributed storage system that is provably robust against crash failures issued by an adaptive adversary, i.e., for each batch of requests the adversary can decide, based on the entire system state, which servers will be unavailable for that batch. Despite up to γ·n^(1/log log n) crashed servers, with γ > 0 constant and n denoting the number of servers, our system can correctly process any batch of lookup and write requests (with at most a polylogarithmic number of requests issued at each non-crashed server) in at most a polylogarithmic number of communication rounds, with at most polylogarithmic time and work at each server and only a logarithmic storage overhead. Our system is based on previous work by Eikel and Scheideler (SPAA 2013), who presented IRIS, a distributed information system that is provably robust against the same kind of crash failures. However, IRIS is only able to serve lookup requests. Handling both lookup and write requests has turned out to require major changes in the design of IRIS. Comment: Revised full version

    Effectiveness of water for dust control on Las Vegas Valley soils

    In this thesis, five simple and rapid methods were developed to investigate the effectiveness of water in controlling dust emitted from different soil surfaces undergoing either wind erosion or construction activity. The Poof, Scrape, and Small Wind Tunnel tests determine the variation in dust emissions from dry, abraded, and wetted surfaces with respect to elapsed time. The Poof and Scrape tests were able to distinguish among soil groups classified by their Particulate Emission Potential (PEP) and surface conditions. The Pie Pan tests found one PEP group to have weaker crusts than the others. Simplified and standard infiltration tests gave comparable results and showed significant rate differences among the hydrologic soil groups. Rapid volumetric moisture measurements combined with bulk density measurements gave results comparable to the ASTM standard method

    Allen Linear (Interval) Temporal Logic --Translation to LTL and Monitor Synthesis--

    The relationship between two well-established formalisms for temporal reasoning is first investigated, namely between Allen's interval algebra (or Allen's temporal logic, abbreviated ATL) and linear temporal logic (LTL). A discrete variant of ATL is defined, called Allen linear temporal logic (ALTL), whose models are ω-sequences of timepoints, as in LTL. It is shown that any ALTL formula can be linearly translated into an equivalent LTL formula, thus enabling the use of LTL techniques and tools when requirements are expressed in ALTL. Then the monitoring problem for ALTL is discussed, showing that it is NP-complete despite the fact that the similar problem for LTL is EXPSPACE-complete. An effective monitoring algorithm for ALTL is given, which has been implemented and experimented with in the context of planning applications
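    Monitoring one Allen relation over an ω-sequence prefix can be illustrated directly: for "interval A before interval B", with each interval encoded as a boolean per timepoint, a monitor reports a violation as soon as A holds at or after B's start. This encoding and the monitor are assumptions for illustration, not the paper's ALTL-to-LTL translation or its monitor synthesis.

    ```python
    # Sketch: online check of the Allen relation "A before B" on a finite prefix.
    # a_trace[i] / b_trace[i]: does interval A / B hold at timepoint i?

    def before_violated(a_trace, b_trace):
        """Report a violation of 'A before B' as soon as one is detectable."""
        b_started = False
        for a, b in zip(a_trace, b_trace):
            b_started = b_started or b
            if b_started and a:       # A holds at or after B's start: relation fails
                return True
        return False

    # A ends before B begins: no violation on this prefix.
    print(before_violated([True, True, False, False],
                          [False, False, True, True]))   # → False

    # A still holds when B starts: violation detected at timepoint 2.
    print(before_violated([True, True, True, False],
                          [False, False, True, True]))   # → True
    ```

    A full ALTL monitor tracks all pairwise interval relations of a formula this way; the paper's NP-completeness result concerns deciding whether some continuation of the prefix can still satisfy the formula.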