    Extending High-throughput Technologies: The Automation of Mechanism Discovery. Investigations into the Mode of Action of the thio-Michael Reaction

    This project concerns the investigation of the thio-Michael reaction (see scheme). The approach applied a process-development methodology to chemical discovery, rather than more traditional research methods. Stage One involved investigating rates of addition with and without different catalysts to develop an understanding of the thio-Michael system; further studies with asymmetric catalysts were also carried out. Stage Two used adiabatic calorimetry and advanced reaction modelling to provide a complete understanding of the thio-Michael reaction under study. From this, we propose a new, self-accelerating mechanism in which the product may catalyse the collapse of one observable intermediate to the product. The exact mode of this autocatalytic step is not yet clear, but it is the subject of ongoing studies, as is the generality of this new mechanistic insight into the Michael addition reaction.
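
    As a rough illustration of the self-accelerating behaviour described above, the Python sketch below integrates a minimal two-term autocatalytic rate law in which product P accelerates the collapse of an intermediate I. The rate constants, initial concentrations and two-path scheme are hypothetical placeholders, not the fitted model from this study.

        # Minimal sketch of an autocatalytic step: I -> P, accelerated by P.
        # Hypothetical rate law: d[P]/dt = k1*[I] + k2*[I]*[P]
        # k1, k2 and initial concentrations are illustrative only.

        def simulate(i0=1.0, p0=0.0, k1=1e-3, k2=0.5, dt=0.01, t_end=50.0):
            """Integrate the two-species system with simple Euler steps."""
            i, p, t = i0, p0, 0.0
            history = []
            while t <= t_end:
                rate = k1 * i + k2 * i * p   # uncatalysed + product-catalysed paths
                i -= rate * dt
                p += rate * dt
                t += dt
                history.append((t, p))
            return history

        if __name__ == "__main__":
            trace = simulate()
            # The conversion curve is sigmoidal: a slow induction period,
            # then rapid self-acceleration once enough product has formed.
            for t, p in trace[::500]:
                print(f"t = {t:6.2f}  [P] = {p:.3f}")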

    Zinc calixarene complexes for the ring opening polymerization of cyclic esters

    Reaction of Zn(C₆F₅)₂·toluene (two equivalents) with 1,3-dipropoxy-p-tert-butyl-calix[4]arene (L¹H₂) led to the isolation of the complex [{Zn(C₆F₅)}₂L¹] (1), whilst similar use of Zn(Me)₂ resulted in the known complex [{Zn(Me)}₂L¹] (2). Treatment of L¹H₂ with in situ prepared Zn{N(SiMe₃)₂}₂ in refluxing toluene led to the isolation of the compound [(Na)ZnN(SiMe₃)₂L¹] (3). The stepwise reaction of L¹H₂ and sodium hydride, followed by ZnCl₂ and finally NaN(SiMe₃)₂, yielded the compound [Zn{N(SiMe₃)₂}₂L¹] (4). The reaction between three equivalents of Zn(C₆F₅)₂·toluene and oxacalix[3]arene (L²H₃) at room temperature formed the compound {[Zn(C₆F₅)]₃L²} (5); heating of 5 in acetonitrile caused the ring opening of the parent oxacalix[3]arene and rearrangement to afford the complex [(L²)Zn₆(C₆F₅)(R)(RH)OH]·5MeCN, where R = [C₆F₅CH₂-(p-ᵗBuPhenolate-CH₂OCH₂–)₂–p-ᵗBuPhenolate-CH₂O]³⁻ (6). The molecular structures of the new complexes 1, 3 and 6, together with that of the known complex 2, whose solid-state structure has not previously been reported, have been determined. Compounds 1 and 3–5 have been screened for the ring opening polymerization (ROP) of ε-caprolactone (ε-CL) and rac-lactide. Compounds featuring a Zn–C₆F₅ fragment were found to be poor ROP pre-catalysts, as they did not react with benzyl alcohol to form an alkoxide. By contrast, compound 4, which contains a zinc silylamide linkage, was the most active of the zinc-based calix[4]arene compounds screened and was capable of ROP at ambient temperature, with 65% conversion over 4 h.

    Evidence-Based Dialogue Maps as a research tool to evaluate the quality of school pupils’ scientific argumentation

    This pilot study focuses on the potential of Evidence-based Dialogue Mapping as a participatory action research tool to investigate young teenagers’ scientific argumentation. Evidence-based Dialogue Mapping is a technique for representing an argumentative dialogue graphically through Questions, Ideas, Pros, Cons and Data. Our research objective is to better understand the usage of Compendium, a Dialogue Mapping software tool, as both (1) a learning strategy to scaffold school pupils’ argumentation and (2) a method to investigate the quality of their argumentative essays. The participants were a science teacher-researcher, a knowledge mapping researcher and 20 pupils, 12–13 years old, in a summer science course for “gifted and talented” children in the UK. This study draws on multiple data sources: a discussion forum, the science teacher-researcher’s and pupils’ Dialogue Maps, pupil essays, and reflective comments about the uses of mapping for writing. Through qualitative analysis of two case studies, we examine the role of Evidence-based Dialogue Maps as a mediating tool in scientific reasoning: as conceptual bridges for linking and making knowledge intelligible; as support for the linearisation task of generating a coherent document outline; as a reflective aid to rethinking reasoning in response to teacher feedback; and as a visual language for making arguments tangible via cartographic conventions.
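
    For readers unfamiliar with the notation, a Dialogue Map is essentially a typed graph of IBIS-style nodes. The Python sketch below models that structure; it is an illustrative data model only, not Compendium's actual internal representation, and the example content is invented.

        # Minimal sketch of an IBIS-style dialogue map (Questions, Ideas,
        # Pros, Cons, Data). Illustrative data model, not Compendium's.
        from dataclasses import dataclass, field

        NODE_TYPES = {"Question", "Idea", "Pro", "Con", "Data"}

        @dataclass
        class Node:
            kind: str                                     # one of NODE_TYPES
            text: str
            children: list = field(default_factory=list)  # responses to this node

            def add(self, kind, text):
                assert kind in NODE_TYPES
                child = Node(kind, text)
                self.children.append(child)
                return child

        # Hypothetical example: a pupil's argument about an essay question.
        root = Node("Question", "Should mobile phones be allowed in school?")
        idea = root.add("Idea", "Allow them, but only outside lessons")
        idea.add("Pro", "Pupils can contact parents in emergencies")
        idea.add("Con", "Risk of distraction between lessons")
        idea.add("Data", "Survey: 70% of pupils own a phone")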

    GPU-based volume deformation.

    Are decisions using cost-utility analyses robust to choice of SF-36/SF-12 preference-based algorithm?

    BACKGROUND: Cost-utility analysis (CUA) using SF-36/SF-12 data has been facilitated by the development of several preference-based algorithms. The purpose of this study was to illustrate how decision-making could be affected by the choice of preference-based algorithms for the SF-36 and SF-12, and to provide some guidance on selecting an appropriate algorithm. METHODS: Two sets of data were used: (1) a clinical trial of adult asthma patients; and (2) a longitudinal study of post-stroke patients. Incremental costs were assumed to be $2000 per year over standard treatment, and QALY gains realized over a 1-year period. Ten published algorithms were identified, denoted by first author: Brazier (SF-36), Brazier (SF-12), Shmueli, Fryback, Lundberg, Nichol, Franks (3 algorithms), and Lawrence. Incremental cost-utility ratios (ICURs) for each algorithm, stated in dollars per quality-adjusted life year ($/QALY), were ranked and compared between datasets. RESULTS: In the asthma patients, estimated ICURs ranged from Lawrence's SF-12 algorithm at $30,769/QALY (95% CI: $26,316 to $36,697) to Brazier's SF-36 algorithm at $63,492/QALY (95% CI: $48,780 to $83,333). ICURs for the stroke cohort varied slightly more dramatically. The MEPS-based algorithm by Franks et al. provided the lowest ICUR at $27,972/QALY (95% CI: $20,942 to $41,667). The Fryback and Shmueli algorithms provided ICURs that were greater than $50,000/QALY and did not have confidence intervals that overlapped with most of the other algorithms. The ICUR-based ranking of algorithms was strongly correlated between the asthma and stroke datasets (r = 0.60). CONCLUSION: SF-36/SF-12 preference-based algorithms produced a wide range of ICURs that could potentially lead to different reimbursement decisions. Brazier's SF-36 and SF-12 algorithms have a strong methodological and theoretical basis and tended to generate relatively higher ICUR estimates, considerations that support a preference for these algorithms over the alternatives. The "second-generation" algorithms developed from scores mapped from other indirect preference-based measures tended to generate lower ICURs that would promote greater adoption of new technology. There remains a need for an SF-36/SF-12 preference-based algorithm based on the US general population that has strong theoretical and methodological foundations.
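
    To make the comparison above concrete, an ICUR is simply the incremental cost divided by the incremental QALYs. The short Python sketch below reproduces two of the reported figures from the assumed $2000 incremental cost; the QALY gains used are back-calculated for illustration, not taken from the study's data.

        # Worked illustration of the ICUR calculation behind the comparison:
        # ICUR = incremental cost / incremental QALYs. Incremental cost is
        # the $2000/year assumed in the study; the QALY gains below are
        # back-calculated from the reported ICURs purely for illustration.
        INCREMENTAL_COST = 2000.0  # $/year over standard treatment

        qaly_gain_by_algorithm = {
            "Lawrence (SF-12)": 0.065,   # -> ~ $30,769/QALY
            "Brazier (SF-36)":  0.0315,  # -> ~ $63,492/QALY
        }

        for name, dq in qaly_gain_by_algorithm.items():
            icur = INCREMENTAL_COST / dq
            print(f"{name}: ${icur:,.0f}/QALY")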

    Categorical Colormap Optimization with Visualization Case Studies

    Mapping a set of categorical values to different colors is an elementary technique in data visualization. Users of visualization software routinely rely on the default colormaps provided by a system, or on colormaps suggested by software such as ColorBrewer. In practice, users often have to select a set of colors in a semantically meaningful way (e.g., based on conventions, color metaphors, and logological associations), and consequently would like to ensure that their perceptual differentiation is optimized. In this paper, we present an algorithmic approach for maximizing the perceptual distances among a set of given colors. We address two technical problems in the optimization: (i) the phenomenon of local maxima that halts the optimization too soon, and (ii) the arbitrary reassignment of colors that leads to the loss of the original semantic association. We pay particular attention to different types of constraints that users may wish to impose during the optimization process. To demonstrate the effectiveness of this work, we tested the technique in two case studies. To reach out to a wider range of users, we also developed a web application called Colourmap Hospital.
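
    A minimal Python sketch of the general idea follows, assuming a simple hill-climbing strategy and Euclidean distance in RGB as a crude stand-in for a perceptual colour metric (the paper's algorithm and metric may differ): each colour is nudged in place, the assignment of colours to categories is never permuted, and only moves that raise the minimum pairwise distance are accepted.

        # Illustrative sketch of categorical colormap optimization: nudge
        # each colour to maximize the minimum pairwise distance. Euclidean
        # RGB distance stands in for a perceptual metric; indices are never
        # permuted, so each category keeps its semantic colour association.
        import random

        def min_pairwise_distance(colors):
            return min(
                sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5
                for i, c1 in enumerate(colors) for c2 in colors[i + 1:]
            )

        def optimize(colors, steps=5000, step_size=8, seed=0):
            rng = random.Random(seed)
            best = [list(c) for c in colors]
            best_score = min_pairwise_distance(best)
            for _ in range(steps):
                cand = [c[:] for c in best]
                k = rng.randrange(len(cand))      # perturb one colour at a time
                cand[k] = [min(255, max(0, v + rng.randint(-step_size, step_size)))
                           for v in cand[k]]
                score = min_pairwise_distance(cand)
                if score > best_score:            # hill-climb on the min distance
                    best, best_score = cand, score
            return best, best_score

        # Two near-identical reds and two near-identical blues get pushed apart.
        seeds = [(200, 30, 30), (210, 60, 40), (30, 30, 200), (40, 60, 210)]
        tuned, score = optimize(seeds)
        print(score, tuned)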

    Glyph visualization: A fail-safe design scheme based on quasi-hamming distances

    In many spatial and temporal visualization applications, glyphs provide an effective means for encoding multivariate data. However, because glyphs are typically small, they are vulnerable to various perceptual errors. This article introduces the concept of a quasi-Hamming distance in the context of glyph design and examines the feasibility of estimating the quasi-Hamming distance between a pair of glyphs and the minimal Hamming distance for a glyph set. The authors demonstrate the design concept by developing a file-system event visualization that can depict the activities of multiple users.
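
    As a rough illustration, the Python sketch below computes a weighted ("quasi") minimal Hamming distance for a small glyph set encoded as discrete visual channels. The channels and weights here are hypothetical; the article's actual quasi-Hamming measure is grounded in how perceptually confusable the visual channels are.

        # Illustrative check of the minimal (quasi-)Hamming distance of a
        # glyph set. Each glyph is a vector of discrete visual channels.
        # The per-channel weights, which crudely model unequal perceptual
        # confusability, are hypothetical placeholders.

        def quasi_hamming(g1, g2, weights):
            return sum(w for a, b, w in zip(g1, g2, weights) if a != b)

        def min_distance(glyphs, weights):
            pairs = [(g1, g2) for i, g1 in enumerate(glyphs) for g2 in glyphs[i + 1:]]
            return min(quasi_hamming(g1, g2, weights) for g1, g2 in pairs)

        # Channels: (shape, colour, size); weight = penalty for differing.
        glyphs = [("circle", "red", "small"),
                  ("circle", "blue", "large"),
                  ("square", "red", "large")]
        weights = (1.0, 0.8, 0.5)   # size differences are easiest to misread

        print(min_distance(glyphs, weights))  # larger minimum = more error-tolerant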

    Monte Carlo Tree Search applied to co-operative problems

    This paper describes an experiment to see how standard Monte Carlo Tree Search handles simple co-operative problems with no prior or provided knowledge. These problems are set in a simple grid world containing goals, doors and buttons, as well as walls that cannot be walked through. Two agents have to reach every goal present on the map. For a door to be open, an agent must be standing on at least one of the buttons linked to it. When laid out correctly, the world requires each agent to do certain things at certain times in order to achieve the goal. With no modification to allow communication between the two agents, Monte Carlo Tree Search performs well and very 'purposefully' when given enough computational time.
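
    For reference, the standard MCTS loop the experiment relies on has four phases: selection, expansion, simulation and backpropagation. The Python sketch below shows a generic UCT variant against a hypothetical State interface (legal_moves, apply, is_terminal, reward); the paper's grid world, with its goals, doors, buttons and two agents, would sit behind such an interface.

        # Generic sketch of the Monte Carlo Tree Search loop (UCT variant).
        # The State interface is a hypothetical stand-in, not the paper's code.
        import math, random

        class Node:
            def __init__(self, state, parent=None, move=None):
                self.state, self.parent, self.move = state, parent, move
                self.children, self.visits, self.value = [], 0, 0.0
                self.untried = list(state.legal_moves())

            def uct_child(self, c=1.4):
                return max(self.children, key=lambda n: n.value / n.visits
                           + c * math.sqrt(math.log(self.visits) / n.visits))

        def mcts(root_state, iterations=1000):
            root = Node(root_state)
            for _ in range(iterations):
                node = root
                # 1. Selection: descend fully expanded nodes by UCT.
                while not node.untried and node.children:
                    node = node.uct_child()
                # 2. Expansion: try one unexplored move.
                if node.untried:
                    move = node.untried.pop(random.randrange(len(node.untried)))
                    child = Node(node.state.apply(move), parent=node, move=move)
                    node.children.append(child)
                    node = child
                # 3. Simulation: random rollout to a terminal state.
                state = node.state
                while not state.is_terminal():
                    state = state.apply(random.choice(state.legal_moves()))
                # 4. Backpropagation: credit the reward up the path.
                reward = state.reward()
                while node:
                    node.visits += 1
                    node.value += reward
                    node = node.parent
            return max(root.children, key=lambda n: n.visits).move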

    Modelling the impact of specific food policy options on coronary heart disease and stroke deaths in Ireland

    Objective: To estimate the potential reduction in cardiovascular disease (CVD) mortality achievable by decreasing salt, trans fat and saturated fat consumption, and by increasing fruit and vegetable (F/V) consumption, in Irish adults aged 25–84 years for 2010. Design: Modelling study using the validated IMPACT Food Policy Model across two scenarios; sensitivity analysis was undertaken. The first, conservative scenario assumed reductions in dietary salt of 1 g/day, trans fat of 0.5% of energy intake and saturated fat of 1% of energy intake, and an increase in F/V intake of 1 portion/day. The second, more substantial but politically feasible scenario assumed reductions in dietary salt of 3 g/day, trans fat of 1% of energy intake and saturated fat of 3% of energy intake, and an increase in F/V intake of 3 portions/day. Setting: Republic of Ireland. Outcomes: Coronary heart disease (CHD) and stroke deaths prevented. Results: The small, conservative changes in food policy could result in approximately 395 fewer cardiovascular deaths per year: approximately 190 (minimum 155, maximum 230) fewer CHD deaths in men, 50 (minimum 40, maximum 60) fewer CHD deaths in women, 95 (minimum 75, maximum 115) fewer stroke deaths in men, and 60 (minimum 45, maximum 70) fewer stroke deaths in women. Approximately 28%, 22%, 23% and 26% of the 395 fewer deaths could be attributed to decreased consumption of trans fat, saturated fat and dietary salt, and to increased F/V consumption, respectively. The 395 fewer deaths represent an overall 10% reduction in CVD mortality. Modelling the more substantial but feasible food policy options, we estimated that CVD mortality could be reduced by up to 1070 deaths/year, representing an overall 26% decline in CVD mortality. Conclusions: A considerable CVD burden is attributable to excess consumption of saturated fat, trans fat and salt, and to insufficient fruit and vegetable intake. There are significant opportunities for Government and industry to reduce CVD mortality through effective, evidence-based food policies.
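
    A quick check of the attribution arithmetic in the Results above, in Python, using the rounded figures quoted in the text (the four shares sum to roughly 100% of the 395 deaths prevented per year under the conservative scenario):

        # Quick check of the attribution arithmetic in the Results: the four
        # quoted percentages applied to the ~395 deaths prevented per year
        # (all figures are the rounded values from the text).
        total_deaths_prevented = 395

        share = {                     # fraction of the 395 fewer deaths
            "trans fat": 0.28,
            "saturated fat": 0.22,
            "dietary salt": 0.23,
            "fruit and vegetables": 0.26,
        }

        for policy, frac in share.items():
            print(f"{policy:>20}: ~{frac * total_deaths_prevented:.0f} deaths/year")
        print(f"shares sum to {sum(share.values()):.0%} (rounding)")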