
    The impact of alkyl chain purity on lipid based nucleic acid delivery systems – is the utilization of lipid components with technical grade justified?

    The physicochemical properties and transfection efficacies of two samples of a cationic lipid have been investigated and compared in 2D (monolayers at the air/liquid interface) and 3D (aqueous bulk dispersions) model systems using different techniques. The samples differ only in their chain composition, owing to the purity of the oleylamine chain precursor. Lipid 8 (synthesized cost-efficiently from technical-grade oleylamine) shows lateral phase separation in the Langmuir layers; however, the amount of attached DNA, determined by IRRAS, is the same for both samples. In 3D systems, lipid 8p forms cubic phases, which disappear after addition of DNA. At physiological temperatures, both lipids (alone and in mixture with cholesterol) assemble into lamellar aggregates and exhibit comparable DNA delivery efficiency. This study demonstrates that non-lamellar structures are not compulsory for high transfection rates. The results justify the utilization of oleyl chains of technical grade in the synthesis of cationic transfection lipids

    Galleria mellonella as a host model to study Candida glabrata virulence and antifungal efficacy

    This is the author accepted manuscript. The final version is available from Taylor & Francis via the DOI in this record. This work was supported in part by the Wellcome Trust Strategic Award for Medical Mycology and Fungal Immunology 097377/Z/11/

    Customer Focused Price Optimisation

    Tesco want to better understand how to set online prices for their general merchandise (i.e. not groceries or clothes) in the UK. Because customers can easily compare prices from different retailers, we expect they will be very sensitive to price, so it is important to get it right. There are four aspects of the problem.
    • Forecasting: estimating the customer demand as a function of the price chosen (especially hard for products with no sales history or infrequent sales).
    • Objective function: what exactly should Tesco aim to optimise? Sales volume? Profit? Profit margin? Conversion rates?
    • Optimisation: how to choose prices for many related products to optimise the chosen objective function.
    • Evaluation: how to demonstrate that the chosen prices are optimal, especially to people without a mathematical background.
    Aggregate sales data were provided for about 400 products over about 2 years so that quantitative approaches could be tested. For some products, competitors' prices were also provided
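    The forecasting and optimisation aspects described above can be illustrated with a minimal sketch. Everything below is hypothetical (a constant-elasticity demand model with made-up numbers, and a single-product grid search), not Tesco's actual model or data:

```python
# Illustrative sketch: choose the price that maximises expected profit
# under an assumed constant-elasticity demand model. All numbers
# (reference price, elasticity, unit cost) are hypothetical.

def demand(price, ref_price=10.0, ref_demand=100.0, elasticity=-2.5):
    """Expected units sold at `price`, scaled from a reference point."""
    return ref_demand * (price / ref_price) ** elasticity

def profit(price, unit_cost=6.0):
    """Expected profit at `price` for a single product."""
    return (price - unit_cost) * demand(price)

# Grid search over candidate prices: a simple stand-in for the
# "Optimisation" step. A real system would optimise many related
# products jointly and use a fitted, not assumed, demand curve.
candidates = [round(6.0 + 0.1 * i, 2) for i in range(1, 100)]
best_price = max(candidates, key=profit)
```

    For constant elasticity ε and unit cost c, the profit-maximising price is c·ε/(ε + 1), which the grid search recovers; switching the objective to sales volume or margin, as the abstract asks, simply changes the `key` function.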

    Measurement of event-by-event transverse momentum and multiplicity fluctuations using strongly intensive measures Δ[P_T, N] and Σ[P_T, N] in nucleus-nucleus collisions at the CERN Super Proton Synchrotron

    Results from the NA49 experiment at the CERN SPS are presented on event-by-event transverse momentum and multiplicity fluctuations of charged particles, produced at forward rapidities in central Pb+Pb interactions at beam momenta 20A, 30A, 40A, 80A, and 158A GeV/c, as well as in systems of different size (p+p, C+C, Si+Si, and Pb+Pb) at 158A GeV/c. This publication extends the previous NA49 measurements of the strongly intensive measure Φ_{p_T} by a study of the recently proposed strongly intensive measures of fluctuations Δ[P_T, N] and Σ[P_T, N]. In the explored kinematic region, transverse momentum and multiplicity fluctuations show no significant energy dependence in the SPS energy range. However, a remarkable system-size dependence is observed for both Δ[P_T, N] and Σ[P_T, N], with the largest values measured in peripheral Pb+Pb interactions. The results are compared with NA61/SHINE measurements in p+p collisions, as well as with predictions of the UrQMD and EPOS models. Comment: 12 pages, 14 figures, to be submitted to PR
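    For reference, the strongly intensive measures named above are commonly defined as follows (the standard convention of Gorenstein and Gaździcki; the normalization C_Δ = C_Σ = ⟨N⟩ω(p_T), with ω(p_T) the scaled variance of the single-particle p_T spectrum, is one common choice and is not quoted from this abstract):

```latex
% Scaled variance of a quantity A
\omega(A) = \frac{\langle A^2\rangle - \langle A\rangle^2}{\langle A\rangle}

% Strongly intensive fluctuation measures
\Delta[P_T, N] = \frac{1}{C_\Delta}
  \Big[\langle N\rangle\,\omega(P_T) - \langle P_T\rangle\,\omega(N)\Big]

\Sigma[P_T, N] = \frac{1}{C_\Sigma}
  \Big[\langle N\rangle\,\omega(P_T) + \langle P_T\rangle\,\omega(N)
       - 2\big(\langle P_T N\rangle - \langle P_T\rangle\langle N\rangle\big)\Big]
```

    Both combinations are constructed so that trivial volume (system-size) fluctuations cancel, which is why a residual system-size dependence, as reported above, is physically interesting.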

    Antideuteron and deuteron production in mid-central Pb+Pb collisions at 158A GeV

    Production of deuterons and antideuterons was studied by the NA49 experiment in the 23.5% most central Pb+Pb collisions at the top SPS energy of √s_NN = 17.3 GeV. Invariant yields for d̄ and d were measured as a function of centrality in the center-of-mass rapidity range −1.2 < y < −0.6. Results for d̄ (d) together with previously published p̄ (p) measurements are discussed in the context of the coalescence model. The coalescence parameters B_2 were deduced as a function of transverse momentum p_t and collision centrality. Comment: 9 figures
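    In the coalescence model mentioned above, the parameter B_2 is conventionally defined by relating the invariant deuteron yield to the square of the (anti)proton yield evaluated at half the deuteron momentum (standard definition, not quoted from this abstract):

```latex
E_d \,\frac{d^3 N_d}{dp_d^3}
  = B_2 \left( E_p \,\frac{d^3 N_p}{dp_p^3} \right)^{\!2}
    \Bigg|_{\vec p_p = \vec p_d / 2}
```

    Since B_2 scales inversely with the effective source volume, its centrality dependence probes the size of the particle-emitting system.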

    Confronting the Challenge of Modeling Cloud and Precipitation Microphysics

    In the atmosphere, microphysics refers to the microscale processes that affect cloud and precipitation particles and is a key linkage among the various components of Earth's atmospheric water and energy cycles. The representation of microphysical processes in models continues to pose a major challenge, leading to uncertainty in numerical weather forecasts and climate simulations. In this paper, the problem of treating microphysics in models is divided into two parts: (i) how to represent the population of cloud and precipitation particles, given the impossibility of simulating all particles individually within a cloud, and (ii) uncertainties in the microphysical process rates owing to fundamental gaps in knowledge of cloud physics. The recently developed Lagrangian particle-based method is advocated as a way to address several conceptual and practical challenges of representing particle populations using traditional bulk and bin microphysics parameterization schemes. For addressing critical gaps in cloud physics knowledge, sustained investment for observational advances from laboratory experiments, new probe development, and next-generation instruments in space is needed. Greater emphasis on laboratory work, which has apparently declined over the past several decades relative to other areas of cloud physics research, is argued to be an essential ingredient for improving process-level understanding. More systematic use of natural cloud and precipitation observations to constrain microphysics schemes is also advocated. Because it is generally difficult to quantify individual microphysical process rates from these observations directly, this presents an inverse problem that can be viewed from the standpoint of Bayesian statistics. Following this idea, a probabilistic framework is proposed that combines elements from statistical and physical modeling. Besides providing rigorous constraint of schemes, there is an added benefit of quantifying uncertainty systematically. Finally, a broader hierarchical approach is proposed to accelerate improvements in microphysics schemes, leveraging the advances described in this paper related to process modeling (using Lagrangian particle-based schemes), laboratory experimentation, cloud and precipitation observations, and statistical methods
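    The Bayesian inverse-problem viewpoint described above can be sketched minimally: infer a posterior over an uncertain process-rate parameter from indirect observations run through a forward model. Everything below is hypothetical (a toy exponential-depletion forward model and synthetic observations), meant only to illustrate the framing, not any specific microphysics scheme:

```python
import math

# Toy forward model: exponential depletion of a cloud-water quantity
# at an uncertain process rate (a stand-in for a microphysics scheme).
def forward_model(rate, t):
    return math.exp(-rate * t)

# Synthetic "observations": a known true rate plus fixed small offsets,
# mimicking noisy indirect measurements.
true_rate = 0.5
times = [1.0, 2.0, 3.0]
errors = [0.02, -0.01, 0.015]
obs = [forward_model(true_rate, t) + e for t, e in zip(times, errors)]
sigma = 0.05  # assumed observation-error standard deviation

# Grid-based posterior: uniform prior times Gaussian likelihood.
grid = [0.01 * i for i in range(1, 200)]  # candidate rates

def log_like(rate):
    return sum(-0.5 * ((o - forward_model(rate, t)) / sigma) ** 2
               for t, o in zip(times, obs))

weights = [math.exp(log_like(r)) for r in grid]
total = sum(weights)
posterior_mean = sum(r * w for r, w in zip(grid, weights)) / total
```

    The spread of the posterior, not just its mean, is what delivers the systematic uncertainty quantification the abstract highlights as an added benefit.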