
    Organization of atomic bond tensions in model glasses

    In order to understand whether internal stresses in glasses are correlated or randomly distributed, we study the organization of atomic bond tensions (normal forces between pairs of atoms). Measurements of the invariants of the atomic bond tension tensor in simulated 2D and 3D binary Lennard-Jones glasses reveal new and unexpected correlations and provide support for Alexander's conjecture about the non-random character of internal stresses in amorphous solids.
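As a rough illustration of the quantities this abstract studies, the sketch below computes pairwise normal forces from a standard Lennard-Jones potential in reduced units and accumulates a per-atom bond tension tensor, whose rotation-invariant trace and determinant are the kind of invariants measured. The potential parameters and toy configuration are illustrative, not taken from the paper.

```python
import numpy as np

EPS, SIG = 1.0, 1.0  # Lennard-Jones parameters (reduced units, illustrative)

def bond_tension(r):
    """Normal (central) force between an LJ pair at separation r.
    Derived from U(r) = 4*EPS*((SIG/r)**12 - (SIG/r)**6)."""
    sr6 = (SIG / r) ** 6
    return 24.0 * EPS * (2.0 * sr6 ** 2 - sr6) / r

def bond_tension_tensor(positions, cutoff=2.5):
    """Per-atom tensor t_ab = sum over neighbours j of f_ij * rhat_a * rhat_b."""
    n, dim = positions.shape
    t = np.zeros((n, dim, dim))
    for i in range(n):
        for j in range(i + 1, n):
            rij = positions[j] - positions[i]
            r = np.linalg.norm(rij)
            if r < cutoff:
                rhat = rij / r
                contrib = bond_tension(r) * np.outer(rhat, rhat)
                t[i] += contrib
                t[j] += contrib
    return t

# Rotation-invariant summaries of each atom's tensor: trace and determinant.
pos = np.array([[0.0, 0.0], [1.2, 0.0], [0.6, 1.0]])  # toy 2D configuration
t = bond_tension_tensor(pos)
invariants = [(np.trace(ti), np.linalg.det(ti)) for ti in t]
```

At the LJ minimum r = 2^(1/6) σ the normal force vanishes, which is a quick sanity check on the force expression.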

    The Computational Complexity of Knot and Link Problems

    We consider the problem of deciding whether a polygonal knot in 3-dimensional Euclidean space is unknotted, i.e., capable of being continuously deformed without self-intersection so that it lies in a plane. We show that this problem, the {\sc unknotting problem}, is in {\bf NP}. We also consider the {\sc splitting problem}: determining whether two or more such polygons can be split, or continuously deformed without self-intersection so that they occupy both sides of a plane without intersecting it. We show that it is also in {\bf NP}. Finally, we show that the problem of determining the genus of a polygonal knot (a generalization of the problem of determining whether it is unknotted) is in {\bf PSPACE}. We also give exponential worst-case running time bounds for deterministic algorithms to solve each of these problems. These algorithms are based on the use of normal surfaces and decision procedures due to W. Haken, with recent extensions by W. Jaco and J. L. Tollefson. Comment: 32 pages, 1 figure

    Collapsible Pushdown Automata and Recursion Schemes

    We consider recursion schemes (not assumed to be homogeneously typed, and hence not necessarily safe) and use them as generators of (possibly infinite) ranked trees. A recursion scheme is essentially a finite typed deterministic term rewriting system that generates, when one applies the rewriting rules ad infinitum, an infinite tree, called its value tree. A fundamental question is to provide an equivalent description of the trees generated by recursion schemes by a class of machines. In this paper we answer this open question by introducing collapsible pushdown automata (CPDA), which are an extension of deterministic (higher-order) pushdown automata. A CPDA generates a tree as follows. One considers its transition graph, unfolds it and contracts its silent transitions, which leads to an infinite tree which is finally node-labelled thanks to a map from the set of control states of the CPDA to a ranked alphabet. Our contribution is to prove that these two models, higher-order recursion schemes and collapsible pushdown automata, are equi-expressive for generating infinite ranked trees. This is achieved by giving effective transformations in both directions.

    Study protocol: developing a decision system for inclusive housing: applying a systematic, mixed-method quasi-experimental design

    Background Identifying the housing preferences of people with complex disabilities is a much needed, but under-developed area of practice and scholarship. Despite the recognition that housing is a social determinant of health and quality of life, there is an absence of empirical methodologies that can practically and systematically involve consumers in this complex service delivery and housing design market. A rigorous process for making effective and consistent development decisions is needed to ensure resources are used effectively and the needs of consumers with complex disability are properly met. Methods/Design This 3-year project aims to identify how the public and private housing market in Australia can better respond to the needs of people with complex disabilities whilst simultaneously achieving key corporate objectives. First, using the Customer Relationship Management framework, qualitative (Nominal Group Technique) and quantitative (Discrete Choice Experiment) methods will be used to quantify the housing preferences of consumers and their carers. A systematic mixed-method, quasi-experimental design will then be used to quantify the development priorities of other key stakeholders (e.g., architects, developers, Government housing services etc.) in relation to inclusive housing for people with complex disabilities. Stakeholders randomly assigned to Group 1 (experimental group) will participate in a series of focus groups employing Analytical Hierarchical Process (AHP) methodology. Stakeholders randomly assigned to Group 2 (control group) will participate in focus groups employing existing decision making processes to inclusive housing development (e.g., Risk, Opportunity, Cost, Benefit considerations). Using comparative stakeholder analysis, this research design will enable the AHP methodology (a proposed tool to guide inclusive housing development decisions) to be tested. 
Discussion It is anticipated that the findings of this study will enable stakeholders to incorporate consumer housing preferences into commercial decisions. Housing designers and developers will benefit from the creation of a parsimonious set of consumer-led housing preferences by which to make informed investments in future housing and to contribute to future housing policy. The research design has not been applied in the Australian research context or elsewhere, and will provide a much-needed blueprint for market investment to develop viable, consumer-directed inclusive housing options for people with complex disability.
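The Analytical Hierarchical Process mentioned in the protocol derives priority weights from a reciprocal pairwise-comparison matrix, conventionally via its principal eigenvector. A minimal sketch, with a hypothetical comparison of three housing criteria on Saaty's 1-9 scale (the criteria and judgements are invented for illustration, not taken from the study):

```python
import numpy as np

def ahp_priorities(pairwise, iters=100):
    """Principal-eigenvector priority weights for a reciprocal
    pairwise-comparison matrix (power iteration, normalized each step)."""
    n = pairwise.shape[0]
    w = np.ones(n) / n
    for _ in range(iters):
        w = pairwise @ w
        w /= w.sum()
    return w

# Hypothetical comparison of accessibility vs. location vs. cost:
# accessibility is moderately preferred to location (3) and strongly
# preferred to cost (5); the matrix is reciprocal by construction.
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])
weights = ahp_priorities(A)  # weights sum to 1, largest first here
```

Comparing the resulting weight vectors across stakeholder groups is what the protocol's comparative stakeholder analysis would operate on.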

    Using redundancy to cope with failures in a delay tolerant network

    We consider the problem of routing in a delay tolerant network (DTN) in the presence of path failures. Previous work on DTN routing has focused on using precisely known network dynamics, which does not account for message losses due to link failures, buffer overruns, path selection errors, unscheduled delays, or other problems. We show how to split, replicate, and erasure code message fragments over multiple delivery paths to optimize the probability of successful message delivery. We provide a formulation of this problem and solve it for two cases: a 0/1 (Bernoulli) path delivery model where messages are either fully lost or delivered, and a Gaussian path delivery model where only a fraction of a message may be delivered. Ideas from the modern portfolio theory literature are borrowed to solve the underlying optimization problem. Our approach is directly relevant to solving similar problems that arise in replica placement in distributed file systems and virtual node placement in DHTs. In three different simulated DTN scenarios covering a wide range of applications, we show the effectiveness of our approach in handling failures.
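Under the 0/1 (Bernoulli) path model, the quantity being optimized can be computed directly: given an allocation of erasure-coded fragments to independent paths, delivery succeeds when at least k fragments arrive. The sketch below evaluates that probability for one allocation; the path probabilities and coding parameters are invented for illustration, and the paper's contribution is choosing the allocation (portfolio-style), not this evaluation.

```python
def delivery_probability(alloc, probs, k):
    """P(at least k fragments arrive) under a Bernoulli path model:
    path i delivers all alloc[i] of its fragments w.p. probs[i], else none."""
    dist = [1.0]  # dist[m] = probability that exactly m fragments have arrived
    for x, p in zip(alloc, probs):
        new = [0.0] * (len(dist) + x)
        for m, q in enumerate(dist):
            new[m] += q * (1 - p)   # path fails: no fragments
            new[m + x] += q * p     # path succeeds: all x fragments arrive
        dist = new
    return sum(dist[k:])

# Hypothetical example: 4 coded fragments spread over 3 paths; any 2 of the
# 4 fragments suffice to decode the message (a (4, 2) erasure code).
p = delivery_probability(alloc=[2, 1, 1], probs=[0.9, 0.8, 0.7], k=2)
# -> 0.956: delivery fails only if the 2-fragment path fails AND at most
#    one of the single-fragment paths succeeds.
```

Maximizing this probability over allocations, subject to a total-fragment budget, is the portfolio-style optimization the abstract refers to.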

    Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science

    Abstract Background Many interventions found to be effective in health services research studies fail to translate into meaningful patient care outcomes across multiple contexts. Health services researchers recognize the need to evaluate not only summative outcomes but also formative outcomes to assess the extent to which implementation is effective in a specific setting, prolongs sustainability, and promotes dissemination into other settings. Many implementation theories have been published to help promote effective implementation. However, they overlap considerably in the constructs included in individual theories, and a comparison of theories reveals that each is missing important constructs included in other theories. In addition, terminology and definitions are not consistent across theories. We describe the Consolidated Framework for Implementation Research (CFIR) that offers an overarching typology to promote implementation theory development and verification about what works where and why across multiple contexts. Methods We used a snowball sampling approach to identify published theories that were evaluated to identify constructs based on strength of conceptual or empirical support for influence on implementation, consistency in definitions, alignment with our own findings, and potential for measurement. We combined constructs across published theories that had different labels but were redundant or overlapping in definition, and we parsed apart constructs that conflated underlying concepts. Results The CFIR is composed of five major domains: intervention characteristics, outer setting, inner setting, characteristics of the individuals involved, and the process of implementation. 
Eight constructs were identified related to the intervention (e.g., evidence strength and quality), four constructs were identified related to outer setting (e.g., patient needs and resources), 12 constructs were identified related to inner setting (e.g., culture, leadership engagement), five constructs were identified related to individual characteristics, and eight constructs were identified related to process (e.g., plan, evaluate, and reflect). We present explicit definitions for each construct. Conclusion The CFIR provides a pragmatic structure for approaching complex, interacting, multi-level, and transient states of constructs in the real world by embracing, consolidating, and unifying key constructs from published implementation theories. It can be used to guide formative evaluations and build the implementation knowledge base across multiple studies and settings.

    Modelling feedbacks between human and natural processes in the land system

    The unprecedented use of Earth's resources by humans, in combination with increasing variability in natural processes over the past century, is affecting the evolution of the Earth system. To better understand natural processes and their potential future trajectories requires improved integration with and quantification of human processes. Similarly, to mitigate risk and facilitate socio-economic development requires a better understanding of how the natural system (e.g. climate variability and change, extreme weather events, and processes affecting soil fertility) affects human processes. Our understanding of these interactions and feedbacks between human and natural systems has been formalized through a variety of modelling approaches. However, a common conceptual framework or set of guidelines to model human-natural-system feedbacks is lacking. The presented research lays out a conceptual framework that includes representing model coupling configuration in combination with the frequency of interaction and coordination of communication between coupled models. Four different approaches used to couple representations of the human and natural system are presented in relation to this framework, which vary in the processes represented and in the scale of their application. From the development and experience associated with the four models of coupled human-natural systems, the following eight lessons were identified that, if taken into account, may increase the success of future coupled human-natural-system model developments: (1) leverage the power of sensitivity analysis with models, (2) remember modelling is an iterative process, (3) create a common language, (4) make code open-access, (5) ensure consistency, (6) reconcile spatio-temporal mismatch, (7) construct homogeneous units, and (8) recognize that incorporating feedback increases non-linearity and variability. 
Following a discussion of feedbacks, a way forward to expedite model coupling and increase the longevity and interoperability of models is given, which suggests the use of wrapper container software, a standardized application programming interface (API), the incorporation of standard names, the mitigation of sunk costs by creating interfaces to multiple coupling frameworks, and the adoption of reproducible workflow environments to wire the pieces together.
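The standardized-API suggestion can be sketched concretely. The interface below is a minimal illustration loosely in the spirit of initialize/update/finalize model APIs; all class and variable names are invented here, and the two toy "models" stand in for far richer human and natural system components. The coupler exchanges state every step, i.e. tight synchronous coupling; looser configurations would only change the exchange frequency.

```python
class ModelInterface:
    """Hypothetical minimal coupling API: any model exposing these five
    methods can be wired to any other, regardless of what it simulates."""
    def initialize(self, config): ...
    def update(self, dt): ...
    def get_value(self, name): ...
    def set_value(self, name, value): ...
    def finalize(self): ...

class ClimateModel(ModelInterface):
    """Toy 'natural system': soil moisture decays each step."""
    def initialize(self, config):
        self.moisture = config.get("moisture", 0.8)
    def update(self, dt):
        self.moisture *= 0.9
    def get_value(self, name):
        return getattr(self, name)
    def set_value(self, name, value):
        setattr(self, name, value)
    def finalize(self): pass

class CropModel(ModelInterface):
    """Toy 'human system': accumulated yield responds to moisture."""
    def initialize(self, config):
        self.yield_ = 0.0
        self.moisture = config.get("moisture", 0.5)
    def update(self, dt):
        self.yield_ += dt * self.moisture
    def get_value(self, name):
        return getattr(self, name)
    def set_value(self, name, value):
        setattr(self, name, value)
    def finalize(self): pass

# Coupler: run both models in lockstep and pass state across the interface.
climate, crop = ClimateModel(), CropModel()
climate.initialize({}); crop.initialize({})
for _ in range(10):
    climate.update(1.0)
    crop.set_value("moisture", climate.get_value("moisture"))  # feedback link
    crop.update(1.0)
climate.finalize(); crop.finalize()
```

Because the coupler only sees `ModelInterface`, either model can be swapped for a different implementation without touching the other, which is the interoperability argument the abstract makes.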

    Peak Stir Zone Temperatures during Friction Stir Processing

    The stir zone (SZ) temperature cycle was measured during the friction stir processing (FSP) of NiAl bronze plates. The FSP was conducted using a tool design with a smooth concave shoulder and a 12.7-mm step-spiral pin. Temperature sensing was accomplished using sheathed thermocouples embedded in the tool path within the plates, while simultaneous optical pyrometry measurements of surface temperatures were also obtained. Peak SZ temperatures were 990 °C to 1015 °C (0.90 to 0.97 T_melt) and were not affected by preheating to 400 °C, although the dwell time above 900 °C was increased by the preheating. Thermocouple data suggested little variation in peak temperature across the SZ, although thermocouples initially located on the advancing sides and at the centerlines of the tool traverses were displaced to the retreating sides, precluding direct assessment of the temperature variation across the SZ. Microstructure-based estimates of local peak SZ temperatures have been made on these and on other similarly processed materials. Altogether, the peak-temperature determinations from these different measurement techniques are in close agreement.
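The homologous-temperature figures quoted above are ratios of absolute temperatures, which is easy to get wrong when working in Celsius. A minimal sketch of the conversion, using an assumed solidus near 1060 °C purely for illustration (the paper's own melting-temperature reference is not given in this abstract):

```python
def homologous(t_c, t_melt_c):
    """Peak temperature as a fraction of the melting temperature,
    with both temperatures converted to kelvin before dividing."""
    return (t_c + 273.15) / (t_melt_c + 273.15)

# Measured peak range from the abstract, against an assumed 1060 °C solidus:
frac_low = homologous(990.0, 1060.0)    # ~0.95
frac_high = homologous(1015.0, 1060.0)  # ~0.97
```

The exact fractions depend on which melting temperature (solidus vs. liquidus) is used, which is why the abstract reports a 0.90 to 0.97 range rather than a single value.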

    Science Objectives for an X-Ray Microcalorimeter Observing the Sun

    We present the science case for a broadband X-ray imager with high-resolution spectroscopy, including simulations of X-ray spectral diagnostics of both active regions and solar flares. This is part of a trilogy of white papers discussing science, instrument (Bandler et al. 2010), and missions (Bookbinder et al. 2010) to exploit major advances recently made in transition-edge sensor (TES) detector technology that enable resolution better than 2 eV in an array that can handle high count rates. Combined with a modest X-ray mirror, this instrument would combine arcsecond-scale imaging with high-resolution spectra over a field of view sufficiently large for the study of active regions and flares, enabling a wide range of studies such as the detection of microheating in active regions, ion-resolved velocity flows, and the presence of non-thermal electrons in hot plasmas. It would also enable more direct comparisons between solar and stellar soft X-ray spectra, a waveband in which (unusually) we currently have much better stellar data than we do for the Sun.