
    Quantification of temporal fault trees based on fuzzy set theory

    © Springer International Publishing Switzerland 2014. Fault tree analysis (FTA) has been modified in different ways to make it capable of performing quantitative and qualitative safety analysis with temporal gates, thereby overcoming its limitation in capturing sequential failure behaviour. However, for many systems it is often very difficult to obtain exact failure rates of components, owing to the increased complexity of systems, the scarcity of the necessary statistical data, and similar factors. To overcome this problem, this paper presents a methodology based on fuzzy set theory for quantifying temporal fault trees. This makes the imprecision in the available failure data more explicit and helps to obtain a range of most probable values for the top event probability.
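
    The abstract describes the approach only at a high level. As a purely illustrative sketch of how imprecise failure data can be propagated through a fault tree, the snippet below assumes triangular fuzzy probabilities and ordinary (non-temporal) AND/OR gates with the common endpoint-wise approximation; the event names and numbers are hypothetical, and the paper's temporal gates and fuzzy failure rates are not reproduced here.

```python
# Minimal sketch, not the paper's exact method: triangular fuzzy
# probabilities encoded as (low, most likely, high) tuples, propagated
# endpoint-wise through static AND/OR gates.

def fuzzy_and(*events):
    """AND gate: endpoint-wise product of triangular fuzzy probabilities."""
    low, mid, high = 1.0, 1.0, 1.0
    for a, b, c in events:
        low, mid, high = low * a, mid * b, high * c
    return (low, mid, high)

def fuzzy_or(*events):
    """OR gate: one minus the product of complements, endpoint-wise."""
    low, mid, high = 1.0, 1.0, 1.0
    for a, b, c in events:
        low, mid, high = low * (1 - a), mid * (1 - b), high * (1 - c)
    return (1 - low, 1 - mid, 1 - high)

# Hypothetical basic events with imprecise failure probabilities.
sensor = (0.010, 0.015, 0.022)
valve = (0.004, 0.006, 0.009)
backup_pump = (0.020, 0.030, 0.045)

# Top event: sensor fails OR (valve AND backup pump both fail).
top = fuzzy_or(sensor, fuzzy_and(valve, backup_pump))
print(top)  # a fuzzy range of most probable values for the top event
```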

    Multigraded Castelnuovo-Mumford Regularity

    We develop a multigraded variant of Castelnuovo-Mumford regularity. Motivated by toric geometry, we work with modules over a polynomial ring graded by a finitely generated abelian group. As in the standard graded case, our definition of multigraded regularity involves the vanishing of graded components of local cohomology. We establish the key properties of regularity: its connection with the minimal generators of a module and its behavior in exact sequences. For an ideal sheaf on a simplicial toric variety X, we prove that its multigraded regularity bounds the equations that cut out the associated subvariety. We also provide a criterion for testing if an ample line bundle on X gives a projectively normal embedding. Comment: 30 pages, 5 figures.
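
    For orientation, the classical standard-graded definition that the multigraded notion refines can be stated as follows (the notation here is standard, not necessarily the paper's):

```latex
% Standard-graded Castelnuovo-Mumford regularity, for comparison only.
% S = k[x_0,\dots,x_n], \mathfrak{m} = (x_0,\dots,x_n), M a finitely
% generated graded S-module.
\[
  M \text{ is } m\text{-regular} \iff
  H^i_{\mathfrak{m}}(M)_j = 0
  \quad \text{for all } i \ge 0 \text{ and all } j > m - i,
\]
\[
  \operatorname{reg}(M) = \min \{\, m : M \text{ is } m\text{-regular} \,\}.
\]
% The multigraded definition replaces the single integer m by a region of
% the grading group while keeping the same vanishing pattern for graded
% components of local cohomology.
```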

    Hennessy-Milner Logic with Greatest Fixed Points as a Complete Behavioural Specification Theory

    There are two fundamentally different approaches to specifying and verifying properties of systems. The logical approach makes use of specifications given as formulae of temporal or modal logics and relies on efficient model checking algorithms; the behavioural approach exploits various equivalence or refinement checking methods, provided the specifications are given in the same formalism as the implementations. In this paper we provide translations between the logical formalism of Hennessy-Milner logic with greatest fixed points and the behavioural formalism of disjunctive modal transition systems. We also introduce a new quotient operation for these equivalent formalisms, which is adjoint to structural composition and allows synthesis of missing specifications from partial implementations. This is a substantial generalisation of the quotient for deterministic modal transition systems defined in earlier papers.
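
    The adjunction mentioned in the abstract can be written compactly; the refinement and composition symbols below are conventional choices, not necessarily the paper's notation:

```latex
% Quotient as the adjoint of structural composition: for specifications
% S and T, the quotient T / S is the most general specification X whose
% composition with S still refines T.
\[
  S \parallel X \leq T \iff X \leq T / S
\]
% Read \leq as refinement and \parallel as structural composition; this is
% what allows a missing component specification X to be synthesised from a
% partial implementation S and an overall specification T.
```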

    Banks' risk assessment of Swedish SMEs

    Building on the literatures on asymmetric information and risk taking, this paper applies conjoint experiments to investigate lending officers' probabilities of supporting credit to established or existing SMEs. Using a sample of 114 Swedish lending officers, we test hypotheses concerning how information on the borrower's ability to repay the loan, the alignment of risk preferences, and risk sharing affect their willingness to grant credit. The results suggest that features that reduce the risk to the bank and shift the risk to the borrower have the largest impact. The paper highlights the interaction between factors that influence the credit decision. Implications for SMEs, banks and research are discussed.

    Class and rank of differential modules

    A differential module is a module equipped with a square-zero endomorphism. This structure underpins complexes of modules over rings, as well as differential graded modules over graded rings. We establish lower bounds on the class (a substitute for the length of a free complex) and on the rank of a differential module in terms of invariants of its homology. These results specialize to basic theorems in commutative algebra and algebraic topology. One instance is a common generalization of the equicharacteristic case of the New Intersection Theorem of Hochster, Peskine, P. Roberts, and Szpiro, concerning complexes over noetherian commutative rings, and of a theorem of G. Carlsson on differential graded modules over graded polynomial rings. Comment: 27 pages. Minor changes, mainly stylistic. To appear in Inventiones Mathematicae.
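
    Spelling out the definition from the first sentence in standard notation (the notation, not the content, is mine):

```latex
% A differential module over a ring R is a pair
\[
  (D, \delta), \qquad \delta \colon D \to D, \qquad \delta^2 = 0,
\]
% with homology
\[
  H(D) = \ker \delta \,/\, \operatorname{im} \delta .
\]
% A complex (F_\bullet, \partial) yields a differential module by folding,
% D = \bigoplus_i F_i with \delta = \partial, which is the sense in which
% this structure underpins complexes and DG modules.
```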

    Coarse-grained entanglement classification through orthogonal arrays

    Classification of entanglement in multipartite quantum systems is an open problem, solved so far only for bipartite systems and for systems composed of three and four qubits. We propose here a coarse-grained classification of entanglement in systems consisting of N subsystems with an arbitrary number of internal levels each, based on properties of orthogonal arrays with N columns. In particular, we investigate in detail a subset of highly entangled pure states which contains all states defining maximum distance separable codes. To illustrate the methods presented, we analyze systems of four and five qubits, as well as heterogeneous tripartite systems consisting of two qubits and one qutrit or one qubit and two qutrits. Comment: 38 pages, 1 figure.
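
    The standard correspondence between orthogonal arrays and pure states, on which such classifications are built, maps each row of the array to a basis ket; the small example below (notation mine, and only the simplest case) shows how a two-row array over three columns yields the three-qubit GHZ state:

```latex
% Each row of an orthogonal array with N columns over d symbols becomes a
% computational-basis ket, and the state is the equal-weight superposition
% of the rows:
\[
  \begin{pmatrix} 0 & 0 & 0 \\ 1 & 1 & 1 \end{pmatrix}
  \;\longmapsto\;
  |\Phi\rangle = \tfrac{1}{\sqrt{2}} \bigl( |000\rangle + |111\rangle \bigr).
\]
% Arrays with more rows, more columns, and higher strength yield the highly
% entangled states referred to in the abstract.
```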

    Modelling winter organic aerosol at the European scale with CAMx : evaluation and source apportionment with a VBS parameterization based on novel wood burning smog chamber experiments

    We evaluated a modified VBS (volatility basis set) scheme to treat biomass-burning-like organic aerosol (BBOA) implemented in CAMx (Comprehensive Air Quality Model with extensions). The updated scheme was parameterized with novel wood combustion smog chamber experiments using a hybrid VBS framework which accounts for a mixture of wood burning organic aerosol precursors and their further functionalization and fragmentation in the atmosphere. The new scheme was evaluated for one of the winter EMEP intensive campaigns (February-March 2009) against aerosol mass spectrometer (AMS) measurements performed at 11 sites in Europe. We found a considerable improvement in the modelled organic aerosol (OA) mass compared to our previous model application, with the mean fractional bias (MFB) reduced from 61 to 29 %. We performed model-based source apportionment studies and compared the results against positive matrix factorization (PMF) analysis performed on OA AMS data. Both the model and the observations suggest that OA was mainly of secondary origin at almost all sites. Modelled secondary organic aerosol (SOA) contributions to total OA varied from 32 to 88 % (with an average contribution of 62 %), and absolute concentrations were generally under-predicted. Modelled primary hydrocarbon-like organic aerosol (HOA) and primary biomass-burning-like aerosol (BBPOA) fractions contributed to a lesser extent (HOA from 3 to 30 %, and BBPOA from 1 to 39 %), with average contributions of 13 and 25 %, respectively. Modelled BBPOA fractions were found to represent 12 to 64 % of the total residential-heating-related OA, with increasing contributions at stations located in the northern part of the domain. Source apportionment studies were performed to assess the contribution of residential and non-residential combustion precursors to the total SOA. The non-residential combustion and road transportation sectors contributed about 30-40 % to SOA formation (with increasing contributions at urban and near-industrialized sites), whereas residential combustion (mainly related to wood burning) contributed to a larger extent, around 60-70 %. Contributions to OA from residential combustion precursors in different volatility ranges were also assessed: our results indicate that residential combustion gas-phase precursors in the semivolatile range (SVOC) contributed from 6 to 30 %, with higher contributions predicted at stations located in the southern part of the domain. On the other hand, the oxidation products of higher-volatility precursors (the sum of intermediate-volatility compounds (IVOCs) and volatile organic compounds (VOCs)) contributed from 15 to 38 %, with no specific gradient among the stations. Although the new parameterization leads to better agreement between model results and observations, it still under-predicts the SOA fraction, suggesting that uncertainties in the new scheme and other sources and/or formation mechanisms remain to be elucidated. Moreover, a more detailed characterization of the semivolatile components of the emissions is needed. Peer reviewed
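
    The mean fractional bias quoted above is not defined in the abstract; the formula below is the definition commonly used in air-quality model evaluation and is assumed here, with M_i the modelled and O_i the observed concentrations over N paired values:

```latex
% Mean fractional bias (commonly used definition, assumed here):
\[
  \mathrm{MFB} = \frac{2}{N} \sum_{i=1}^{N} \frac{M_i - O_i}{M_i + O_i}
  \times 100\,\% .
\]
% Values closer to zero indicate better agreement; negative values indicate
% that the model under-predicts the observations on average.
```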

    A Very Sensitive 21cm Survey for Galactic High-Velocity HI

    Very sensitive HI 21cm observations have been made in 860 directions at dec >= -43deg in search of weak, Galactic, high-velocity HI emission lines at moderate and high Galactic latitudes. One-third of the observations were made toward extragalactic objects. The median 4-sigma detection level is NHI = 8x10^{17} cm^-2 over the 21' telescope beam. High-velocity HI emission is detected in 37% of the directions; about half of the lines could not have been seen in previous surveys. The median FWHM of detected lines is 30.3 km/s. High-velocity HI lines are seen down to the sensitivity limit of the survey, implying that there are likely lines at still lower values of NHI. The weakest lines have kinematics and a distribution on the sky similar to those of the strong lines, and thus do not appear to be a new population. Most of the emission originates from objects which are extended over several degrees; few appear to be compact sources. At least 75%, and possibly as many as 90%, of the lines are associated with one of the major high-velocity complexes. The Magellanic Stream extends at least 10 deg to higher Galactic latitude than previously thought and is more extended in longitude as well. Although there are many lines with low column density, their numbers do not increase as rapidly as NHI^-1, so most of the HI mass in the high-velocity cloud phenomenon likely resides in the more prominent clouds. The bright HI features may be mere clumps within larger structures, and not independent objects. Comment: 88 pages, includes 22 figures. Accepted for publication in ApJ Suppl., June 200
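
    The column density limit quoted above can be related to the observed line profile through the standard optically thin conversion (not stated in the abstract, but conventional for 21cm work), with the brightness temperature T_B in kelvin and the velocity integral in km/s:

```latex
% Optically thin 21cm conversion from integrated brightness temperature to
% HI column density (standard relation, assumed here):
\[
  N_{\mathrm{HI}}\;[\mathrm{cm^{-2}}]
  = 1.823 \times 10^{18} \int T_B(v)\, dv \qquad [T_B \text{ in K},\ v \text{ in km\,s}^{-1}].
\]
```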

    Honk against homophobia : rethinking relations between media and sexual minorities

    The theory of “symbolic annihilation” or “symbolic violence” has been used in academic literature to describe the way in which sexual minorities have been ignored, trivialized, or condemned by the media. This article aims to de-center research from issues of media representation to consider the capacity for minority groups to proactively use new media and its various avenues for interactivity, social networking, and feedback to fight social exclusion. This work suggests that new media has become a space in which the nominally marginal in society may acquire “social artillery”, a term used to describe how sexual minorities utilize their expanding and more readily accessible social connections in digital space to combat instances of homophobia. The research draws on the results of an inquiry into the relation between media and a regional youth social justice group in Australia tackling homophobia. The research demonstrates that the group is becoming increasingly adept and comfortable with using a cross-section of media platforms to fulfill their own objectives, rather than seeing themselves as passive subjects of media representation. This article argues that this sets an example for other socially excluded groups looking to renegotiate their relation with the media in regional areas.