Can the Heinrich ratio be used to predict harm from medication errors?
The purpose of this study was to establish whether, for medication errors, there exists a fixed Heinrich ratio between the number of incidents which did not result in harm, the number that caused minor harm, and the number that caused serious harm. If this were the case, it would be very useful in estimating any changes in harm following an intervention. Serious harm resulting from medication errors is relatively rare, so it can take a great deal of time and resources to detect a significant change. If the Heinrich ratio exists for medication errors, then it would be possible, and far easier, to measure the much more frequent incidents that did not result in harm and the extent to which they changed following an intervention; any reduction in harm could be extrapolated from this.
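The extrapolation idea can be sketched in a few lines. This is a minimal illustration only: it assumes a fixed ratio holds, and uses Heinrich's classic 300:29:1 industrial ratio purely as a placeholder, not as a finding of the study.

```python
# Sketch: extrapolating harm counts from no-harm incident counts under
# an assumed fixed Heinrich ratio. The default ratio (300:29:1) is the
# classic industrial figure, used here only for illustration.

def extrapolate_serious_harm(no_harm_count, ratio=(300, 29, 1)):
    """Given a count of no-harm incidents and a fixed
    (no harm : minor harm : serious harm) ratio, estimate the
    implied numbers of minor- and serious-harm incidents."""
    no_harm, minor, serious = ratio
    scale = no_harm_count / no_harm
    return minor * scale, serious * scale

# Example: if 600 no-harm incidents are observed before an intervention
# and 450 after, the implied change in serious harm is:
_, before_serious = extrapolate_serious_harm(600)
_, after_serious = extrapolate_serious_harm(450)
print(before_serious - after_serious)  # 0.5 fewer serious-harm events
```

The point of the study's question is exactly whether such a fixed ratio is justified; if it is not, this extrapolation is invalid.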
Some properties of the theory of n-categories
Consider the Dwyer-Kan localization of the category of weak n-categories divided by the n-equivalences. We propose a list of properties that this simplicial category is likely to have, and conjecture that these properties characterize it up to equivalence. We show, using these properties, how to obtain the morphism (n-1)-categories between two points in an object of this localization, and how to obtain the composition map between the morphism objects.
Measuring Syntactic Complexity in Spoken and Written Learner Language: Comparing the Incomparable?
Spoken and written language are two modes of language. When learners aim at higher skill levels, the expected outcome of successful second language learning is usually to become a fluent speaker and writer who can produce accurate and complex language in the target language. There is an axiomatic difference between speech and writing, but together they form the essential parts of learners’ L2 skills. The two modes have their own characteristics, and there are differences between native and nonnative language use. For instance, hesitations and pauses are not visible in the end result of the writing process, but they are characteristic of nonnative spoken language use. The present study is based on the analysis of L2 English spoken and written productions of 18 L1 Finnish learners, with a focus on syntactic complexity. As earlier spoken language segmentation units mostly come from fluency studies, we conducted an experiment with a new unit, the U-unit, and examined how using this unit as the basis of spoken language segmentation affects the results. According to the analysis, written language was more complex than spoken language. However, the difference in the level of complexity was greatest when the traditional units, T-units and AS-units, were used in segmenting the data. Using the U-unit revealed that spoken language may, in fact, be closer to written language in its syntactic complexity than earlier studies had suggested. Therefore, further research is needed to discover whether the differences in spoken and written learner language are primarily due to the nature of these modes or, rather, to the units and measures used in the analysis.
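The abstract's central methodological point — that the choice of segmentation unit changes the complexity score — can be shown with a toy length-based measure. The unit boundaries and the measure (mean words per unit) below are illustrative assumptions; the study's T-, AS-, and U-unit definitions are linguistic criteria not reproduced here.

```python
# Minimal sketch: the same text segmented into coarser vs. finer units
# yields different scores on a length-based complexity measure.

def mean_unit_length(units):
    """Mean number of word tokens per segmentation unit."""
    lengths = [len(u.split()) for u in units]
    return sum(lengths) / len(lengths)

# Coarser segmentation (one clause complex kept together):
coarse_units = ["she said that she was tired", "so she left early"]
# Finer segmentation of the same words:
fine_units = ["she said", "that she was tired", "so she left early"]

print(mean_unit_length(coarse_units))         # 5.0
print(round(mean_unit_length(fine_units), 2))  # 3.33
```

Identical language data thus looks "more complex" or "less complex" depending only on how it is cut into units, which is the comparability problem the study raises.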
Defining determinism
The article puts forward a branching-style framework for the analysis of determinism and indeterminism of scientific theories, starting from the core idea that an indeterministic system is one whose present allows for more than one alternative possible future. We describe how a definition of determinism stated in terms of branching models supplements and improves current treatments of determinism of theories of physics. In these treatments, we identify three main approaches: one based on the study of (differential) equations, one based on mappings between temporal realizations, and one based on branching models. We first give an overview of these approaches and show that current orthodoxy advocates a combination of the mapping- and the equations-based approaches. After giving a detailed formal explication of a branching-based definition of determinism, we consider three concrete applications and end with a formal comparison of the branching- and the mapping-based approach. We conclude that the branching-based definition of determinism most usefully combines formal clarity, connection with an underlying philosophical notion of determinism, and relevance for the practical assessment of theories.
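The core branching idea — indeterminism means the present admits more than one possible future — has a direct finite analogue for discrete transition systems. The sketch below is an assumption-laden toy, not the article's formal framework, which concerns branching models of physical theories.

```python
# Sketch of the core branching intuition for a finite transition system:
# a system is deterministic iff no state has two distinct successors.
# The transition relations are toy examples, not drawn from the article.

def is_deterministic(transitions):
    """transitions: set of (state, next_state) pairs.
    Deterministic iff every state has at most one successor."""
    successors = {}
    for s, t in transitions:
        successors.setdefault(s, set()).add(t)
    return all(len(ts) <= 1 for ts in successors.values())

print(is_deterministic({("a", "b"), ("b", "c")}))              # True
print(is_deterministic({("a", "b"), ("a", "c"), ("b", "c")}))  # False: "a" branches
```

In the branching picture, the second system's state "a" is a moment with two alternative possible futures, which is exactly the indeterminism criterion the article starts from.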
The duality diagram in data analysis: Examples of modern applications
Today's data-heavy research environment requires the integration of different sources of information into structured data sets that cannot be analyzed as simple matrices. We introduce an old technique, known in European data analysis circles as the Duality Diagram Approach, put to new uses through a variety of metrics and ways of combining different diagrams together. This issue of the Annals of Applied Statistics contains contemporary examples of how this approach provides solutions to hard problems in data integration. We present here the genesis of the technique and how it can be seen as a precursor of the modern kernel-based approaches.
Comment: Published at http://dx.doi.org/10.1214/10-AOAS408 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
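The duality diagram is commonly presented as a triplet (X, Q, D): a data matrix X, a metric Q on the variables, and row weights D. The sketch below assumes this standard formulation (the abstract itself does not spell it out) and shows ordinary PCA emerging as the special case Q = I, D = I/n; it is a minimal illustration, not the paper's implementation.

```python
# Sketch of the duality diagram triplet (X, Q, D): data matrix X (n x p),
# metric Q on variables (p x p), row weights D (n x n). Generalized PCA
# diagonalizes the Q-weighted cross-product X^T D X Q.
import numpy as np

def duality_pca(X, Q, D, k=2):
    """Return the top-k eigenvalues/eigenvectors of X^T D X Q."""
    V = X.T @ D @ X                      # weighted variable cross-product
    eigvals, eigvecs = np.linalg.eig(V @ Q)
    order = np.argsort(eigvals.real)[::-1]
    return eigvals.real[order][:k], eigvecs.real[:, order][:, :k]

# With Q = I and D = I/n this reduces to PCA on the (uncentered)
# covariance matrix X^T X / n:
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 3))
vals, vecs = duality_pca(X, np.eye(3), np.eye(10) / 10)
print(vals)  # top two eigenvalues of X^T X / 10
```

Choosing other metrics Q and weights D recovers other classical methods from the same diagram, which is the unifying point the paper develops.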
Termination of rewrite relations on λ-terms based on Girard's notion of reducibility
In this paper, we show how to extend the notion of reducibility introduced by Girard for proving the termination of β-reduction in the polymorphic λ-calculus, to prove the termination of various kinds of rewrite relations on λ-terms, including rewriting modulo some equational theory and rewriting with matching modulo βη, by using the notion of computability closure. This provides a powerful termination criterion for various higher-order rewriting frameworks, including Klop's Combinatory Reduction Systems with simple types and Nipkow's Higher-order Rewrite Systems.