
    Structural aspects of tetanus toxin


    "The Predication Semantics Model: The Role of Predicate: Class in Text Comprehension and Recall"

    This paper presents and tests the predication semantics model, a computational model of text comprehension. It goes beyond previous case grammar approaches to text comprehension by employing a propositional rather than a rigid hierarchical tree notion, attempting to maintain a coherent set of propositions in working memory. The authors' assertion is that predicate class contains semantic information that readers use to make generally accurate predictions about a given proposition. Thus, the main purpose of the model, which works as a series of input and reduction cycles, is to explore the extent to which predicate categories play a role in reading comprehension and recall. In the reduction phase of the model, the propositions entered into memory during the input phase are pruned while coherence is maintained among those that remain. In an examination of working memory at the end of each cycle, the computational model maintained coherence for 70% of cycles. The model appeared prone to serial dependence in its errors: the coherence problem appears to occur because (unlike real readers) the simulation does not reread when necessary. Overall, the experiment suggested that the predication semantics model is robust. The results suggested that the model emulates a primary process in text comprehension: predicate categories provide semantic information that helps to initiate and control automatic processes in reading, and allows people to grasp the gist of a text even when they have only minimal background knowledge. While the model needs refinement in several areas presenting minor problems (for example, it lacks a sufficiently complex memory to ensure that when the simulation goes wrong it does not, as at present, stay wrong for successive intervals), its success even at the current restrictive level of detail demonstrates the importance of the semantic information in predicate categories.
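
    The input/reduction cycle described above can be illustrated with a short sketch. The Python fragment below is a minimal, hypothetical rendering of the idea (the proposition format, working-memory capacity, and the argument-overlap coherence test are all assumptions, not the authors' actual model): propositions enter working memory, and the reduction phase prunes the least connected ones while checking whether coherence survives.

```python
"""Minimal sketch of an input/reduction cycle over propositions.

Illustrative only: the proposition format, the capacity limit, and the
argument-overlap coherence test are assumptions, not the authors'
actual predication semantics model.
"""
from dataclasses import dataclass

@dataclass(frozen=True)
class Proposition:
    predicate_class: str          # e.g. "state", "event", "causal"
    arguments: tuple              # concept tokens the predicate relates

WM_CAPACITY = 4                   # assumed working-memory limit

def is_coherent(memory):
    """Every proposition shares at least one argument with another one."""
    for p in memory:
        others = [q for q in memory if q is not p]
        if others and not any(set(p.arguments) & set(q.arguments) for q in others):
            return False
    return True

def cycle(working_memory, incoming):
    """One input/reduction cycle: add new propositions, then prune."""
    memory = working_memory + incoming                     # input phase
    recent_args = set().union(*(set(p.arguments) for p in incoming)) if incoming else set()
    while len(memory) > WM_CAPACITY:                       # reduction phase
        # Drop the proposition whose arguments overlap least with the
        # most recent input (a crude coherence heuristic).
        memory.sort(key=lambda p: len(set(p.arguments) & recent_args))
        memory.pop(0)
    return memory, is_coherent(memory)

# Tiny usage example with made-up propositions.
wm, ok = cycle([], [Proposition("event", ("dog", "bark")),
                    Proposition("state", ("dog", "loud"))])
print(ok, wm)
```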

    Stable carbon, nitrogen and sulphur isotope analysis of permafrost preserved human hair from rescue excavations (2009, 2010) at the precontact site of Nunalleq, Alaska

    Acknowledgments: This work was funded by an Arts and Humanities Research Council (AH/K006029/1) grant awarded to Rick Knecht, Kate Britton and Charlotta Hillerdal (Aberdeen); an AHRC-LabEx award (AH/N504543/1) to KB, RK, Keith Dobney (Liverpool) and Isabelle Sidéra (Nanterre); the Carnegie Trust for the Universities of Scotland (travel grant to KB); and the Max Planck Institute for Evolutionary Anthropology. The onsite collection of samples was carried out by staff and students from the University of Aberdeen, volunteer excavators and the residents of Quinhagak. Logistical and planning support for fieldwork was provided by Qanirtuuq Incorporated, Quinhagak, Alaska, and the people of Quinhagak, whom we also thank for sampling permissions. Special thanks to Warren Jones and Qanirtuuq Incorporated (especially Michael Smith and Lynn Church), and to all Nunalleq project team members, in Aberdeen and at other institutions, particularly Charlotta Hillerdal and Edouard Masson-Maclean (Aberdeen) for comments on earlier versions of this manuscript, and also to Véronique Forbes, Ana Jorge, Carly Ameen and Ciara Mannion (Aberdeen) for their input. Thanks also to Michelle Alexander (York). Finally, thank you to Ian Scharlotta (Alberta) for inviting us to contribute to this special issue, to the Editor, and to three anonymous reviewers, whose suggestions and recommended changes to an earlier version of this manuscript greatly improved the paper.

    FPGA Reliability and Failure Rate Analysis for Launch and Space Vehicles Environments

    Field Programmable Gate Array (FPGA) integrated circuits (ICs) are among the key electronic components in the complex avionic systems of today's sophisticated launch and space vehicles, largely due to their superb reprogrammable and reconfigurable capabilities combined with relatively low non-recurring engineering (NRE) costs and short design cycles. Consequently, FPGAs are prevalent ICs in communication protocols and control signal commands. This paper will demonstrate guidelines to estimate FPGA failure rates for ascent and in-space operations. The guidelines will account for hardware and radiation-induced failures, as well as Bayesian updates to failure rates. The hardware contribution of the approach accounts for physical failures of the FPGA IC; the radiation portion will expand on FPGA susceptibility to different space radiation environments.
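
    The "Bayesian updates to failure rates" mentioned above can be sketched with a standard gamma-Poisson conjugate update. The sketch below is generic rather than the paper's specific guideline, and the prior parameters and test data are invented for illustration; with a conjugate gamma prior, the posterior mean shifts toward the observed rate as exposure hours accumulate.

```python
"""Hedged sketch: Bayesian update of a constant failure rate.

Generic gamma-Poisson conjugate update, offered only as an illustration
of a 'Bayesian update to a failure rate'; the prior parameters and the
observed data below are made up.
"""

def update_failure_rate(alpha_prior, beta_prior, failures, exposure_hours):
    """Return posterior gamma parameters and mean failure rate (per hour)."""
    alpha_post = alpha_prior + failures
    beta_post = beta_prior + exposure_hours
    return alpha_post, beta_post, alpha_post / beta_post

# Hypothetical prior: mean 1e-6 failures/hour with weak evidence.
alpha0, beta0 = 0.5, 5.0e5
# Hypothetical test campaign: 0 failures over 2e6 device-hours.
a, b, lam = update_failure_rate(alpha0, beta0, failures=0, exposure_hours=2.0e6)
print(f"posterior mean failure rate: {lam:.2e} per hour")
```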

    A Prototype Comparison of Human Trafficking Warning Signs: U.S. Midwest Frontline Workers’ Perceptions

    Guided by the cognitive prototype approach, this article examines the prototype structure of frontline workers’ perceptions of warning-sign indicators of human trafficking. Online survey responses from a range of workplace sectors were analyzed using multiple-group confirmatory factor analysis (MG-CFA) for three groups. These groups were based on respondents’ self-reported human trafficking experiences: no witness (no encounter with human trafficking), sex trafficking witness, and labor trafficking witness. The MG-CFA revealed a three-factor structure (physical condition, reproductive health, and personal risk) representing the participants’ perceptions of the warning signs. Further analysis showed group-level mean (latent intercept) and variance differences between the prototype structures of the three witness groups. The final structural model results indicate that these group-level prototype differences can be explained by two organizational resource variables: identification protocol and training. The results are discussed in light of the current empirical literature on human trafficking identification, stereotypical frames of victimhood, and policy practices.
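
    As a rough, hypothetical illustration of the group comparison described above, the sketch below fits the same three-factor CFA separately in each witness group using the semopy package. The indicator names and input file are placeholders, and a full MG-CFA would additionally impose measurement-invariance constraints across groups, which this simplification omits.

```python
"""Rough sketch of a per-group CFA comparison (not full MG-CFA).

Assumes a CSV of survey items plus a 'witness_group' column; the item
names and three-factor structure below are placeholders for the
warning-sign indicators described above.
"""
import pandas as pd
import semopy

MODEL = """
physical_condition =~ item1 + item2 + item3
reproductive_health =~ item4 + item5 + item6
personal_risk =~ item7 + item8 + item9
"""

data = pd.read_csv("warning_sign_survey.csv")   # hypothetical file
for group, subset in data.groupby("witness_group"):
    model = semopy.Model(MODEL)
    model.fit(subset.drop(columns=["witness_group"]))
    estimates = model.inspect()                 # loadings and variances
    print(group)
    print(estimates.head())
```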

    Common Cause Failure Modeling in Space Launch Vehicles

    Common Cause Failures (CCFs) are a known and documented phenomenon that defeats system redundancy. CCFs are dependent failures that can be caused, for example, by system environments, manufacturing, transportation, storage, maintenance, and assembly. Since many factors contribute to CCFs, they can be reduced but are difficult to eliminate entirely. Furthermore, failure databases sometimes fail to differentiate between independent and dependent (common cause) failures. Because common cause failure data is limited in the aerospace industry, the Probabilistic Risk Assessment (PRA) Team at Bastion Technology Inc. is estimating CCF risk using generic data collected by the Nuclear Regulatory Commission (NRC). Consequently, common cause risk estimates based on this database are highly uncertain when applied to other industry applications. Therefore, it is important to account for a range of values for independent and CCF risk and to communicate the uncertainty to decision makers. There is an existing methodology for reducing CCF risk during design, which includes a checklist of more than 40 factors grouped into eight categories. Using this checklist, an approach is being investigated to produce a beta factor estimate that quantitatively relates these factors. In this paper, the checklist will be tailored to space launch vehicles, a quantitative approach will be described, and an example of the method will be presented.
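
    The beta factor idea can be sketched as follows. In the standard beta factor model, a component failure rate lambda is split into an independent portion (1 - beta) * lambda and a common cause portion beta * lambda. The checklist categories, weights, and score-to-beta mapping in the sketch below are placeholders rather than the paper's actual checklist.

```python
"""Sketch of a checklist-weighted beta-factor estimate.

The beta-factor split itself is standard; the eight category names,
weights, bounds, and score-to-beta mapping below are placeholders.
"""

CATEGORY_WEIGHTS = {            # hypothetical weights, sum to 1.0
    "separation": 0.15, "diversity": 0.15, "environment": 0.15,
    "manufacturing": 0.10, "assembly": 0.10, "maintenance": 0.15,
    "testing": 0.10, "procedures": 0.10,
}
BETA_MIN, BETA_MAX = 0.001, 0.1   # assumed plausible range for beta

def estimate_beta(scores):
    """Map per-category defence scores in [0, 1] (1 = strong defence)
    to a beta factor: stronger defences push beta toward BETA_MIN."""
    defence = sum(CATEGORY_WEIGHTS[c] * scores[c] for c in CATEGORY_WEIGHTS)
    return BETA_MAX - defence * (BETA_MAX - BETA_MIN)

def split_failure_rate(lam_total, beta):
    """Return (independent rate, common cause rate) per the beta factor model."""
    return (1.0 - beta) * lam_total, beta * lam_total

scores = {c: 0.7 for c in CATEGORY_WEIGHTS}       # hypothetical assessment
beta = estimate_beta(scores)
lam_ind, lam_ccf = split_failure_rate(1.0e-5, beta)
print(f"beta={beta:.4f}, independent={lam_ind:.2e}/h, CCF={lam_ccf:.2e}/h")
```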

    Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making

    Field Programmable Gate Array Failure Rate Estimation Guidelines for Launch Vehicle Fault Tree Models

    The complex electronic and avionic systems of today's launch vehicles heavily utilize the Field Programmable Gate Array (FPGA) integrated circuit (IC). FPGAs are prevalent ICs in communication protocols such as MIL-STD-1553B and in control signal commands such as solenoid/servo valve actuations. This paper will demonstrate guidelines to estimate FPGA failure rates for a launch vehicle; the guidelines will account for hardware, firmware, and radiation-induced failures. The hardware contribution of the approach accounts for physical failures of the IC, FPGA memory, and clock. The firmware portion will provide guidelines on the high-level FPGA programming language and ways to account for software/code reliability growth. The radiation portion will provide guidelines on environment susceptibility as well as on tailoring other launch vehicle programs' historical data to a specific launch vehicle.
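
    A minimal sketch of how the three contributions might roll up into a single fault tree basic event is shown below, assuming constant (exponential) failure rates that add in series; all numerical values are invented for illustration and are not from the paper.

```python
"""Sketch: combine hardware, firmware, and radiation contributions into
one FPGA basic-event failure rate, then convert to a failure probability
over a mission phase. Assumes constant failure rates; numbers are made up.
"""
import math

lambda_hw = 2.0e-7      # physical IC, memory, clock failures (per hour)
lambda_fw = 5.0e-8      # firmware/code contribution after reliability growth
lambda_rad = 8.0e-8     # radiation-induced failures in the flight environment

lambda_fpga = lambda_hw + lambda_fw + lambda_rad   # series combination

mission_hours = 0.5     # assumed ascent-phase duration
p_fail = 1.0 - math.exp(-lambda_fpga * mission_hours)
print(f"FPGA basic-event probability over ascent: {p_fail:.3e}")
```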