
    Lithospheric controls on melt production during continental breakup at slow rates of extension: Application to the North Atlantic

    Rifted margins form from extension and breakup of the continental lithosphere. If this extension is coeval with a region of hotter lithosphere, then it is generally assumed that a volcanic margin would follow. Here we present the results of numerical simulations of rift margin evolution by extending continental lithosphere above a thermal anomaly. We find that unless the lithosphere is thinned prior to the arrival of the thermal anomaly, or half-spreading rates are more than ~50 mm yr⁻¹, the lithosphere acts as a lid to the hot material. The thermal anomaly cools significantly by conduction before having an effect on decompression melt production. If the lithosphere is thinned by the formation of extensional basins then the thermal anomaly advects into the thinned region and leads to enhanced decompression melting. In the North Atlantic a series of extensional basins off the coast of northwest Europe and Greenland provide the required thinning. This observation suggests that volcanic margins that show slow rates of extension only occur where there is the combination of a thermal anomaly and previous regional thinning of the lithosphere.
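
    For orientation only, two textbook scalings frame the competition this abstract describes: the conductive cooling time of a layer of thickness L, t ≈ L²/κ, and the distance heat diffuses during a rifting episode, d ≈ √(κt). The Python sketch below uses generic illustrative values (the diffusivity, lid thickness, extension rates and amount of extension are assumptions, not the paper's model inputs) to show that a slow rift lasts tens of Myr, long enough for conduction to remove heat from a sub-lithospheric anomaly over a few tens of kilometres, whereas a fast rift does not.

```python
# Order-of-magnitude scalings only; all values are generic illustrative
# assumptions, not inputs to the paper's numerical simulations.
import math

KAPPA = 1e-6                       # thermal diffusivity of rock, m^2/s
SECONDS_PER_MYR = 3.15e13

def cooling_time_myr(thickness_m):
    """Conductive cooling timescale t ~ L^2 / kappa for a layer of thickness L."""
    return thickness_m ** 2 / KAPPA / SECONDS_PER_MYR

def diffusion_length_km(duration_myr):
    """Distance heat diffuses in a given time, d ~ sqrt(kappa * t)."""
    return math.sqrt(KAPPA * duration_myr * SECONDS_PER_MYR) / 1e3

print(f"100 km lid: conductive cooling over ~{cooling_time_myr(100e3):.0f} Myr")
for half_rate in (5, 50):                   # slow vs. fast half-spreading, mm/yr
    duration = 200.0 / (2 * half_rate)      # km / (mm/yr) = Myr, for 200 km of extension
    print(f"{half_rate:2d} mm/yr: rifting lasts ~{duration:.0f} Myr; "
          f"heat diffuses ~{diffusion_length_km(duration):.0f} km in that time")
```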

    Controls on the location of compressional deformation on the NW European margin

    The distribution of Cenozoic compressional structures along the NW European margin has been compared with maps of the thickness of the crystalline crust derived from a compilation of seismic refraction interpretations and gravity modelling, and with the distribution of high-velocity lower crust and/or partially serpentinized upper mantle detected by seismic experiments. Only a subset of the mapped compressional structures coincides with areas susceptible to lithospheric weakening as a result of crustal hyperextension and partial serpentinization of the upper mantle. Notably, partially serpentinized upper mantle is well documented beneath the central part of the southern Rockall Basin, but compressional features are sparse in that area. Where compressional structures have formed but the upper mantle is not serpentinized, simple rheological modelling suggests an alternative weakening mechanism involving ductile lower crust and lithospheric decoupling. The presence of pre-existing weak zones (associated with the properties of the gouge and overpressure in fault zones) and local stress magnitude and orientation are important contributing factors.
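
    The "simple rheological modelling" referred to above is usually a yield-strength-envelope calculation: at each depth the crust supports no more differential stress than the smaller of a frictional (Byerlee-type) limit and a temperature-dependent power-law creep strength, so a warm, quartz-dominated lower crust comes out weak and can decouple the upper crust from the mantle. The sketch below is a generic illustration of that idea; the geotherm, friction factor and flow-law constants are assumed values, not the parameters used in the study.

```python
# Minimal yield-strength-envelope sketch; every parameter value below is an
# illustrative assumption, not taken from the paper.
import math

RHO, G = 2800.0, 9.81            # crustal density (kg/m^3), gravity (m/s^2)
GEOTHERM = 25.0                  # assumed linear geotherm, K/km
STRAIN_RATE = 1e-15              # tectonic strain rate, 1/s
R = 8.314                        # gas constant, J/(mol K)
# Power-law creep of the common form sigma = (strain_rate/A)^(1/n) * exp(Q/(nRT)),
# with constants of the order published for wet quartzite (assumed here).
A, N, Q = 1.1e-4, 4.0, 223e3     # MPa^-n s^-1, stress exponent, J/mol
FRICTION_FACTOR = 0.85           # crude Byerlee-type proportionality (assumed)

def brittle_mpa(depth_km):
    """Frictional strength, roughly proportional to overburden pressure."""
    return FRICTION_FACTOR * RHO * G * depth_km * 1e3 / 1e6

def ductile_mpa(depth_km, surface_temp_c=10.0):
    """Creep strength at the assumed geotherm temperature for this depth."""
    temp_k = 273.15 + surface_temp_c + GEOTHERM * depth_km
    return (STRAIN_RATE / A) ** (1.0 / N) * math.exp(Q / (N * R * temp_k))

for z in range(5, 40, 5):
    b, d = brittle_mpa(z), ductile_mpa(z)
    regime = "ductile" if d < b else "brittle"
    print(f"{z:2d} km: frictional {b:6.0f} MPa, creep {d:10.1f} MPa -> {regime}")
```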

    Naturally Rehearsing Passwords

    We introduce quantitative usability and security models to guide the design of password management schemes: systematic strategies to help users create and remember multiple passwords. In the same way that security proofs in cryptography are based on complexity-theoretic assumptions (e.g., hardness of factoring and discrete logarithm), we quantify usability by introducing usability assumptions. In particular, password management relies on assumptions about human memory, e.g., that a user who follows a particular rehearsal schedule will successfully maintain the corresponding memory. These assumptions are informed by research in cognitive science and validated through empirical studies. Given rehearsal requirements and a user's visitation schedule for each account, we use the total number of extra rehearsals that the user would have to do to remember all of his passwords as a measure of the usability of the password scheme. Our usability model leads us to a key observation: password reuse benefits users not only by reducing the number of passwords that the user has to memorize, but more importantly by increasing the natural rehearsal rate for each password. We also present a security model which accounts for the complexity of password management with multiple accounts and associated threats, including online, offline, and plaintext password leak attacks. Observing that current password management schemes are either insecure or unusable, we present Shared Cues, a new scheme in which the underlying secret is strategically shared across accounts to ensure that most rehearsal requirements are satisfied naturally while simultaneously providing strong security. The construction uses the Chinese Remainder Theorem to achieve these competing goals.
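
    The usability measure described in this abstract (the number of extra rehearsals implied by a rehearsal requirement and a visitation schedule) can be illustrated with a small simulation. The expanding rehearsal intervals, accounts and visit frequencies below are invented for illustration and are not the paper's parameters; the actual Shared Cues construction additionally uses the Chinese Remainder Theorem to decide how secrets are shared across accounts.

```python
# Sketch of an "extra rehearsals" count under an expanding rehearsal schedule.
# The schedule, accounts and visit times are illustrative assumptions.
from bisect import bisect_left

def extra_rehearsals(visit_days, horizon_days, base=1.0, growth=2.0):
    """Count rehearsal windows not covered by a natural visit.

    Windows follow an expanding schedule: [0, base), then each window is
    `growth` times longer. A window is naturally satisfied if at least one
    visit to an account using the secret falls inside it.
    """
    visits = sorted(visit_days)
    extra, start, length = 0, 0.0, base
    while start < horizon_days:
        end = min(start + length, horizon_days)
        i = bisect_left(visits, start)
        if i >= len(visits) or visits[i] >= end:
            extra += 1                      # user must rehearse on purpose
        start, length = end, length * growth
    return extra

# A secret shared across accounts is rehearsed whenever ANY of them is visited,
# so pooling visit times raises the natural rehearsal rate.
email_visits = list(range(0, 365))          # visited daily
bank_visits = list(range(0, 365, 30))       # visited monthly
print("bank-only secret:", extra_rehearsals(bank_visits, 365))
print("secret shared with email:", extra_rehearsals(bank_visits + email_visits, 365))
```

    On these made-up schedules the monthly account alone leaves a few early rehearsal windows unsatisfied, while sharing its secret with a daily-use account removes all extra rehearsals, which is the "natural rehearsal" benefit of reuse that the abstract highlights.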

    Building Performance Assessment Protocol for Timber Dwellings – Conducting Thermography Tests on Live Construction sites

    This paper introduces the pan-Wales (UK) Home-Grown Homes (HGH) project (2018 to 2020), which focusses on three areas of improvement for delivering high-performance, affordable and healthy homes. The HGH project is funded by Powys County Council, through the European Regional Development Fund’s Agricultural Stream. The HGH project is being delivered by Woodknowledge Wales in a consortium with Cardiff Metropolitan University (CMU), TRADA and Coed Cymru, with seven work packages. ‘More and Better Homes from Wood’ (work package WP3) focusses on the assessment of building performance for dwellings using timber, and is being delivered by a multi-disciplinary team at CMU through the Sustainable and Resilient Built Environment (SuRBe) group. This paper discusses the context and need for the HGH project as Wales launched its low carbon agenda in March 2019. The focus of this paper is on introducing the building performance assessment (BPA) protocol to be implemented by SuRBe across several housing case studies in Wales, through the design, in-construction and occupancy phases, to address thermal and fire (TaF) performance issues, and impacts on occupants’ quality of life, comfort and safety. Preliminary results of in-construction testing on a live construction site are presented, together with the challenges of conducting thermography tests whilst construction is in progress and under spring weather conditions in the UK (April 2019). This paper will be useful for academics, architects, building contractors, housing developers and professionals undertaking building performance assessment and evaluation on live construction sites.
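
    One recurring constraint behind the weather difficulties mentioned above is that quantitative thermography needs a sustained indoor-outdoor temperature difference, which mild spring conditions often fail to provide. The check below is a generic illustration; the ~10 K rule of thumb and the surface temperature factor fRsi are standard survey practice rather than thresholds taken from the HGH protocol.

```python
# Generic pre-survey checks for building thermography; the threshold and the
# example temperatures are common guidance values, assumed for illustration.

def ok_to_survey(t_inside_c, t_outside_c, min_delta_k=10.0):
    """A useful thermal image generally needs a sustained inside-outside
    temperature difference; ~10 K is a common rule of thumb."""
    return (t_inside_c - t_outside_c) >= min_delta_k

def surface_temperature_factor(t_surface_c, t_inside_c, t_outside_c):
    """fRsi = (Tsi - Te) / (Ti - Te); low values flag thermal bridging or
    condensation risk at the imaged surface."""
    return (t_surface_c - t_outside_c) / (t_inside_c - t_outside_c)

# Mild April afternoon on a live site (illustrative numbers)
print(ok_to_survey(t_inside_c=18.0, t_outside_c=11.0))              # False: too mild
print(round(surface_temperature_factor(16.5, 20.0, 5.0), 2))        # 0.77 on a colder day
```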

    Development and external validation study of a melanoma risk prediction model incorporating clinically assessed naevi and solar lentigines

    Background: Melanoma risk prediction models could be useful for matching preventive interventions to patients’ risk. Objectives: To develop and validate a model for incident first‐primary cutaneous melanoma using clinically assessed risk factors. Methods: We used unconditional logistic regression with backward selection from the Australian Melanoma Family Study (461 cases and 329 controls), in which age, sex and city of recruitment were kept in each step, and we externally validated it using the Leeds Melanoma Case–Control Study (960 cases and 513 controls). Candidate predictors included clinically assessed whole‐body naevi and solar lentigines, and self‐assessed pigmentation phenotype, sun exposure, family history and history of keratinocyte cancer. We evaluated the predictive strength and discrimination of the model risk factors using odds per age‐ and sex‐adjusted SD (OPERA) and the area under the curve (AUC), and calibration using the Hosmer–Lemeshow test. Results: The final model included the number of naevi ≥ 2 mm in diameter on the whole body, solar lentigines on the upper back (a six‐level scale), hair colour at age 18 years and personal history of keratinocyte cancer. Naevus count was the strongest risk factor; the OPERA was 3·51 [95% confidence interval (CI) 2·71–4·54] in the Australian study and 2·56 (95% CI 2·23–2·95) in the Leeds study. The AUC was 0·79 (95% CI 0·76–0·83) in the Australian study and 0·73 (95% CI 0·70–0·75) in the Leeds study. The Hosmer–Lemeshow test P‐value was 0·30 in the Australian study and < 0·001 in the Leeds study. Conclusions: This model had good discrimination and could be used by clinicians to stratify patients by melanoma risk for the targeting of preventive interventions. What's already known about this topic? Melanoma risk prediction models may be useful in prevention by tailoring interventions to personalized risk levels. For reasons of feasibility, time and cost, many melanoma prediction models use self‐assessed risk factors. However, individuals tend to underestimate their naevus numbers. What does this study add? We present a melanoma risk prediction model, which includes clinically‐assessed whole‐body naevi and solar lentigines, and self‐assessed risk factors including pigmentation phenotype and history of keratinocyte cancer. This model performs well on discrimination (the model's ability to distinguish between individuals with and without melanoma) and may assist clinicians to stratify patients by melanoma risk for targeted preventive interventions.
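
    The development-and-validation workflow summarized above (fit a logistic model on one study, then check discrimination and calibration on an external study) can be sketched as follows. The data here are synthetic and the three predictors are arbitrary stand-ins; the published model uses clinically assessed naevus counts, solar lentigines, hair colour at age 18 years and keratinocyte cancer history, selected by backward elimination.

```python
# Sketch of external validation of a logistic risk model on synthetic data.
# Predictors, effect sizes and sample sizes are invented for illustration.
import numpy as np
from scipy.stats import chi2
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 3))                          # stand-ins for naevi, lentigines, etc.
y = rng.binomial(1, 1 / (1 + np.exp(-(X @ np.array([1.2, 0.6, 0.3])))))

model = LogisticRegression().fit(X[:600], y[:600])   # "development" study
p = model.predict_proba(X[600:])[:, 1]               # "external validation" study
print("AUC:", round(roc_auc_score(y[600:], p), 3))   # discrimination

def hosmer_lemeshow(y_true, p_hat, groups=10):
    """Chi-square over deciles of predicted risk; a small p-value signals poor calibration."""
    order = np.argsort(p_hat)
    stat = 0.0
    for g in np.array_split(order, groups):
        observed, expected = y_true[g].sum(), p_hat[g].sum()
        stat += (observed - expected) ** 2 / (expected * (1 - p_hat[g].mean()) + 1e-9)
    return 1 - chi2.cdf(stat, groups - 2)

print("Hosmer-Lemeshow p-value:", round(hosmer_lemeshow(y[600:], p), 3))  # calibration
```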

    Non-crossing dependencies: Least effort, not grammar

    The use of null hypotheses (in a statistical sense) is common in the hard sciences but not in theoretical linguistics. Here the null hypothesis that the low frequency of syntactic dependency crossings is expected by an arbitrary ordering of words is rejected. It is shown that this would require star dependency structures, which are both unrealistic and too restrictive. The hypothesis of the limited resources of the human brain is revisited. Stronger null hypotheses taking into account actual dependency lengths for the likelihood of crossings are presented. These hypotheses suggest that crossings are likely to decrease when dependencies are shortened. A hypothesis based on pressure to reduce dependency lengths is more parsimonious than a principle of minimization of crossings or a grammatical ban that is totally dissociated from the general and non-linguistic principle of economy.
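
    The "arbitrary ordering of words" null hypothesis can be made concrete with a small Monte Carlo experiment: take a dependency tree, count crossings in the attested order, and compare with the counts obtained under random permutations of the word positions. The tree below is a made-up projective example, not data from the paper.

```python
# Count dependency crossings in the given order vs. random orderings of the words.
# The example tree (positions 0..5) is invented for illustration.
import random

def crossings(edges):
    """Number of pairs of dependencies whose arcs cross in the current linear order."""
    count = 0
    for idx, (a, b) in enumerate(edges):
        lo1, hi1 = min(a, b), max(a, b)
        for c, d in edges[idx + 1:]:
            lo2, hi2 = min(c, d), max(c, d)
            # arcs cross when exactly one endpoint of each lies inside the other
            if lo1 < lo2 < hi1 < hi2 or lo2 < lo1 < hi2 < hi1:
                count += 1
    return count

edges = [(1, 0), (1, 3), (3, 2), (3, 5), (5, 4)]     # head-dependent pairs by position
n_words = 6
print("observed crossings:", crossings(edges))

random.seed(0)
null_counts = []
for _ in range(10_000):
    perm = list(range(n_words))
    random.shuffle(perm)                             # an arbitrary ordering of the words
    null_counts.append(crossings([(perm[a], perm[b]) for a, b in edges]))
print("mean crossings under random order:", sum(null_counts) / len(null_counts))
```

    The attested (projective) order of this toy tree has zero crossings, while random orderings of the same tree typically produce one or more; the abstract's stronger null hypotheses refine this comparison by conditioning on dependency lengths.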