225 research outputs found

    Binary pattern tile set synthesis is NP-hard

    In the field of algorithmic self-assembly, a long-standing unproven conjecture has been the NP-hardness of binary pattern tile set synthesis (2-PATS). The k-PATS problem is that of designing a tile assembly system with the smallest number of tile types which will self-assemble an input pattern of k colors. Of both theoretical and practical significance, k-PATS has been studied in a series of papers which have shown k-PATS to be NP-hard for k = 60, k = 29, and then k = 11. In this paper, we settle the fundamental conjecture that 2-PATS is NP-hard, concluding this line of study. While most of our proof relies on standard mathematical proof techniques, one crucial lemma makes use of a computer-assisted proof, a relatively novel but increasingly utilized paradigm for deriving proofs of complex mathematical problems. This tool is especially powerful for attacking combinatorial problems, as exemplified by the proof of the four color theorem by Appel and Haken (later simplified by Robertson, Sanders, Seymour, and Thomas) and the recent important advance on the Erdős discrepancy problem by Konev and Lisitsa using computer programs. We utilize a massively parallel algorithm, turning an otherwise intractable portion of our proof into a program requiring approximately a year of computation time and bringing the use of computer-assisted proofs to a new scale. We fully detail the algorithm employed by our code, and make the code freely available online.
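To make the PATS setting concrete, here is a minimal sketch in Python of the standard model (a hypothetical toy tile set of my own, not the authors' construction). In PATS a tile attaches once its west and south neighbours are in place, so a deterministic tile set is a lookup table (west_glue, south_glue) → (north_glue, east_glue, colour); the NP-hard problem is finding the *smallest* such table that assembles a given pattern.

```python
def assembles(pattern, tiles, south_glues, west_glues):
    """True iff `tiles` deterministically self-assembles `pattern`.

    pattern     : list of rows (row 0 at the bottom), entries are colours
    tiles       : dict (west_glue, south_glue) -> (north_glue, east_glue, colour)
    south_glues : glues presented upward by the seed row, one per column
    west_glues  : glues presented rightward by the seed column, one per row
    """
    north = list(south_glues)            # glue each column currently exposes upward
    for y, row in enumerate(pattern):
        west = west_glues[y]             # glue currently exposed rightward in this row
        for x, colour in enumerate(row):
            tile = tiles.get((west, north[x]))
            if tile is None or tile[2] != colour:
                return False             # no tile attaches here, or wrong colour
            north[x], west = tile[0], tile[1]
    return True

# Four tile types suffice for a 2-colour checkerboard: the colour equals the
# south glue, the north glue flips it, and the west glue passes through east.
tiles = {(w, s): (1 - s, w, s) for w in (0, 1) for s in (0, 1)}
checker = [[(x + y) % 2 for x in range(4)] for y in range(4)]
print(assembles(checker, tiles, [x % 2 for x in range(4)], [0] * 4))  # → True
```

An exhaustive search over tile sets of increasing size would then find the minimum; the abstract's point is that deciding this minimum is NP-hard already for 2 colours.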

    On Models and Code: A Unified Approach to Support Large-Scale Deductive Program Verification

    Despite the substantial progress in the area of deductive program verification over the last years, it still remains a challenge to use deductive verification on large-scale industrial applications. In this abstract, I analyse why this is the case, and I argue that in order to solve this, we need to soften the border between models and code. This has two important advantages: (1) it makes it easier to reason about the high-level behaviour of programs using deductive verification, and (2) it allows reasoning about incomplete applications during the development process. I discuss how the first steps towards this goal are supported by verification techniques within the VerCors project, and I sketch the future steps that are necessary to realise this goal.

    A formally verified compiler back-end

    This article describes the development and formal verification (proof of semantic preservation) of a compiler back-end from Cminor (a simple imperative intermediate language) to PowerPC assembly code, using the Coq proof assistant both for programming the compiler and for proving its correctness. Such a verified compiler is useful in the context of formal methods applied to the certification of critical software: the verification of the compiler guarantees that the safety properties proved on the source code hold for the executable compiled code as well.
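A toy analogue (in Python, purely illustrative) of the semantic-preservation statement: compiling an arithmetic expression to stack-machine code must not change its value. CompCert proves this once and for all in Coq; a sketch like this can only test it on examples.

```python
def compile_expr(e):
    """('num', n) | ('add' | 'mul', lhs, rhs)  ->  stack-machine code."""
    if e[0] == "num":
        return [("push", e[1])]
    return compile_expr(e[1]) + compile_expr(e[2]) + [(e[0],)]

def eval_expr(e):                      # source-language semantics
    if e[0] == "num":
        return e[1]
    l, r = eval_expr(e[1]), eval_expr(e[2])
    return l + r if e[0] == "add" else l * r

def run(code):                         # target (stack machine) semantics
    stack = []
    for op in code:
        if op[0] == "push":
            stack.append(op[1])
        else:
            r, l = stack.pop(), stack.pop()
            stack.append(l + r if op[0] == "add" else l * r)
    return stack[-1]

e = ("add", ("num", 2), ("mul", ("num", 3), ("num", 4)))
assert eval_expr(e) == run(compile_expr(e)) == 14   # semantic preservation
```

The verified compiler's theorem is exactly this equality, quantified over *all* source programs rather than checked on samples.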

    Inequitable access to substance abuse treatment services in Cape Town, South Africa

    BACKGROUND: Despite high levels of substance use disorders in Cape Town, substance abuse treatment utilization is low among people from disadvantaged communities in Cape Town, South Africa. To improve substance abuse treatment utilization, it is important to identify any potential barriers to treatment initiation so that interventions to reduce these barriers can be implemented. To date, substance abuse research has not examined the factors associated with substance abuse treatment utilization within developing countries. Using the Behavioural Model of Health Services Utilization as an analytic framework, this study aimed to redress this gap by examining whether access to substance abuse treatment is equitable, and the profile of variables associated with treatment utilization, for people from poor communities in Cape Town, South Africa. METHODS: This study used a case-control design to compare 434 individuals with substance use disorders from disadvantaged communities who had accessed treatment with 555 controls who had not, on a range of predisposing, treatment need and enabling/restricting variables thought to be associated with treatment utilization. A hierarchical logistic regression was conducted to assess the unique contribution that the treatment need, predisposing and enabling/restricting variable blocks made to substance abuse treatment utilization. RESULTS: Findings revealed that non-need enabling/restricting variables accounted for almost as large a proportion of the variance in service utilization as the need for treatment variables. These enabling/restricting variables also attenuated the influence of the treatment need and predisposing variable domains on the chances of treatment utilization. Several enabling/restricting variables emerged as powerful partial predictors of utilization, including competing financial priorities, geographic access barriers and awareness of treatment services. Perceived severity of drug use (a need for treatment variable) was also a partial predictor of utilization. CONCLUSIONS: Findings point to inequitable access to substance abuse treatment services among people from poor South African communities, with non-need factors being significant determinants of treatment utilization. In these communities, treatment utilization can be enhanced by (i) expanding the existing repertoire of services to include low-threshold services that target individuals with less severe problems; (ii) providing food and transport vouchers as part of contingency management efforts, thereby reducing some of the financial and geographic access barriers; (iii) introducing community-based mobile outpatient treatment services that are geographically accessible; and (iv) employing community-based outreach workers who focus on improving awareness of where, when and how to access existing treatment services.
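The hierarchical (blockwise) entry used in the analysis can be sketched on synthetic data: fit the "need" block first, then add an "enabling/restricting" variable and measure the gain in model fit. The variable names and data below are illustrative, not the study's; a plain gradient-ascent fit stands in for a statistics package.

```python
import math
import random

def fit_logistic(X, y, steps=2000, lr=0.3):
    """Plain gradient-ascent logistic regression; returns (weights, log-likelihood)."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(steps):
        grad = [0.0] * p
        for xi, yi in zip(X, y):
            pred = 1 / (1 + math.exp(-sum(wj * xj for wj, xj in zip(w, xi))))
            for j in range(p):
                grad[j] += (yi - pred) * xi[j]
        w = [wj + lr * gj / n for wj, gj in zip(w, grad)]
    ll = 0.0
    for xi, yi in zip(X, y):
        pr = 1 / (1 + math.exp(-sum(wj * xj for wj, xj in zip(w, xi))))
        ll += yi * math.log(pr) + (1 - yi) * math.log(1 - pr)
    return w, ll

random.seed(0)
need = [[1.0, random.gauss(0, 1)] for _ in range(300)]   # intercept + perceived severity
enabling = [[random.gauss(0, 1)] for _ in range(300)]    # e.g. a geographic-access barrier
y = [1 if 0.8 * xi[1] - 1.0 * zi[0] + random.gauss(0, 1) > 0 else 0
     for xi, zi in zip(need, enabling)]

_, ll_need = fit_logistic(need, y)                                    # block 1 only
_, ll_full = fit_logistic([xi + zi for xi, zi in zip(need, enabling)], y)
assert ll_full > ll_need   # the enabling/restricting block adds explanatory power
```

The study's finding corresponds to the second block improving fit nearly as much as the first, rather than merely refining it.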

    Earliest evidence of pollution by heavy metals in archaeological sites

    Homo species were exposed to a new biogeochemical environment when they began to occupy caves. Here we report the first evidence of palaeopollution through geochemical analyses of heavy metals in four renowned archaeological caves of the Iberian Peninsula spanning the last million years of human evolution. Heavy metal contents reached high values due to natural (guano deposition) and anthropogenic factors (e.g. combustion) in restricted cave environments. The earliest anthropogenic pollution evidence is related to Neanderthal hearths from Gorham's Cave (Gibraltar), one of the first milestones in the so-called "Anthropocene". According to their heavy metal concentrations, these sediments meet the present-day standards of "contaminated soil". Together with the former, the Gibraltar Vanguard Cave shows Zn and Cu pollution ubiquitous across highly anthropic levels, pointing to these elements as potential proxies for human activities. Pb concentrations in Magdalenian and Bronze Age levels at the El Pirulejo site can be similarly interpreted. Despite these high pollution levels, the contaminated soils might not have posed a major threat to Homo populations. Altogether, the data presented here indicate a long-term exposure of Homo to these elements, via fires, fumes and their ashes, which could have played a certain role in environmental-pollution tolerance, a hitherto neglected influence.
    Francisco J. Jiménez Palacios and the Analytical Chemistry Department (Sevilla University) are gratefully acknowledged for their help in the use of the Carbolite electric oven. A.G.-A. was supported by a Marie Curie Intra-European Fellowship of the 7th Framework Programme for Research, Technological Development and Demonstration (European Commission). R.B. is a Beatriu de Pinós-A post-doctoral fellowship recipient (Generalitat de Catalunya and COFUND Marie Curie Actions, EU-FP7). This work was also partially financed by projects 19434/PI/14 Fundación Séneca, HARP2013-44269P, CGL-BOS-2012-34717, CGL2012-38434-C03-03 and CGL2012-38358 Ministerio de Economía y Competitividad, 2014 SGR 900 and 2014/100573 Generalitat de Catalunya-AGAUR, RNM 432 Research Group 179 (Junta de Andalucía) and MEXT-Japan.

    Handheld computers for self-administered sensitive data collection: A comparative study in Peru

    BACKGROUND: Low-cost handheld computers (PDAs) potentially represent an efficient tool for collecting sensitive data in surveys. The goal of this study is to evaluate the quality of sexual behavior data collected with handheld computers in comparison with paper-based questionnaires. METHODS: A PDA-based program for data collection was developed using open-source tools. In two cross-sectional studies, we compared data concerning sexual behavior collected with paper forms to data collected with PDA-based forms in Ancon (Lima). RESULTS: The first study enrolled 200 participants (18–29 years). General agreement between data collected with paper format and handheld computers was 86%. Agreement for categorical variables was between 70.5% and 98.5% (Kappa: 0.43–0.86), while agreement for numeric variables was between 57.1% and 79.8% (Spearman: 0.76–0.95). Agreement and correlation were higher among those who had completed at least high school than among those with less education. The second study enrolled 198 participants. Rates of response to sensitive questions were similar between the two kinds of questionnaires. However, the numbers of inconsistencies (p = 0.0001) and missing values (p = 0.001) were significantly higher in paper questionnaires. CONCLUSION: This study showed the value of handheld computers for collecting sensitive data, since a high level of agreement between paper and PDA responses was reached. In addition, fewer inconsistencies and missing values were found with the PDA-based system. This study has demonstrated that it is feasible to develop a low-cost application for handheld computers, and that PDAs are a feasible alternative for collecting field data in a developing country.
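The agreement statistic behind the figures above is Cohen's kappa, which discounts the agreement expected by chance. A minimal sketch for one categorical question answered on paper and again on a PDA (the responses here are made up for illustration):

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical responses."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                   # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] * cb[k] for k in set(a) | set(b)) / n ** 2    # chance agreement
    return (po - pe) / (1 - pe)

paper = ["yes", "yes", "no", "no", "yes", "no", "yes", "no", "yes", "yes"]
pda   = ["yes", "yes", "no", "yes", "yes", "no", "yes", "no", "no",  "yes"]
print(round(cohens_kappa(paper, pda), 2))  # → 0.58
```

Here raw agreement is 80%, but kappa is only 0.58 because two-category answers already agree often by chance, which is why the abstract reports both figures.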

    A randomized trial of an intervention to improve use and adherence to effective coronary heart disease prevention strategies

    BACKGROUND: Efficacious strategies for the primary prevention of coronary heart disease (CHD) are underused and, when used, have low adherence. Existing efforts to improve use of and adherence to these efficacious strategies have been so intensive that they are impractical for clinical practice. METHODS: We conducted a randomized trial of a CHD prevention intervention (including a computerized decision aid and automated tailored adherence messages) at one university general internal medicine practice. After obtaining informed consent and collecting baseline data, we randomized patients (men and women aged 40-79 with no prior history of cardiovascular disease) to either the intervention or usual care. We then saw them for two additional study visits over 3 months. For intervention participants, we administered the decision aid at the primary study visit (1 week after the baseline visit) and then mailed 3 tailored adherence reminders at 2, 4, and 6 weeks. We assessed our outcomes, including the predicted 10-year likelihood of angina, myocardial infarction, and CHD death (CHD risk) and self-reported adherence, between groups at 3-month follow-up. Data collection occurred from June 2007 through December 2009. All study procedures were IRB approved. RESULTS: We randomized 160 eligible patients (81 intervention; 79 control) and followed 96% to study conclusion. Mean predicted CHD risk at baseline was 11.3%. The intervention increased self-reported adherence to chosen risk-reducing strategies by 25 percentage points (95% CI 8% to 42%), with the biggest effect for aspirin. It also changed predicted CHD risk by -1.1% (95% CI -0.16% to -2%), with a larger effect in a pre-specified subgroup of high-risk patients. CONCLUSION: A computerized intervention that involves patients in CHD decision making and supports adherence to effective prevention strategies can improve adherence and reduce predicted CHD risk. Clinical trials registration: ClinicalTrials.gov NCT00494052 (http://www.clinicaltrials.gov/ct2/show/NCT00494052).
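One back-of-the-envelope way to read the effect size above: an absolute reduction in 10-year predicted CHD risk of 1.1 percentage points corresponds to a "number needed to treat" of roughly 91. This is a standard interpretation of an absolute risk reduction, not a figure the trial reports.

```python
# Absolute risk reduction (ARR) taken from the abstract: -1.1 percentage points.
arr = 0.011

# NNT = 1 / ARR: patients who would need the intervention, on this predicted-risk
# scale, per CHD event avoided over 10 years.
nnt = 1 / arr
print(round(nnt))  # → 91
```

Since the risk change here is model-predicted rather than an observed event rate, the NNT is only a heuristic for gauging the size of the effect.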

    Complement in glomerular injury

    In recent years, research into the role of complement in the immunopathogenesis of renal disease has broadened our understanding of the fragile balance between the protective and harmful functions of the complement system. Interventions into the complement system in various models of immune-mediated renal disease have resulted in both favourable and unfavourable effects and will allow us to precisely define the level of the complement cascade at which a therapeutic intervention will result in an optimal effect. The discovery of mutations of complement regulatory molecules has established a role of complement in the haemolytic uremic syndrome and membranoproliferative glomerulonephritis, and genotyping for mutations of the complement system is already leaving the research laboratory and entering clinical practice. These clinical discoveries have resulted in the creation of relevant animal models which may provide crucial information for the development of highly specific therapeutic agents. Research into the role of complement in proteinuria has helped us to understand pathways of inflammation which ultimately lead to renal failure irrespective of the underlying renal disease, and is of major importance for the majority of renal patients. Complement science is a highly exciting area of translational research and will hopefully result in meaningful therapeutic advances in the near future.