    Universality and Evolution of TMDs

    In this talk, we summarize how QCD evolution can be exploited to improve the treatment of transverse momentum dependent (TMD) parton distribution and fragmentation functions. The methods allow existing non-perturbative fits to be turned into fully evolved TMDs that are consistent with a complete TMD-factorization formalism over the full range of kT. We argue that evolution is essential to the predictive power of calculations that utilize TMD parton distribution and fragmentation functions, especially TMD observables that are sensitive to transverse spin. Comment: To appear in the proceedings of the Third International Workshop on Transverse Polarization Phenomena in Hard Scattering (Transversity 2011), in Veli Losinj, Croatia, 29 August - 2 September 2011. 5 pages, 1 figure
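
    For orientation, the evolution referred to here is usually written in impact-parameter (b_T) space. A schematic of the standard Collins-Soper-Sterman/Collins-style equations (added here for context, not taken from the talk itself; conventions vary between formalisms) is:

        \[
        \frac{\partial \ln \tilde{F}(x, b_T; \mu, \zeta)}{\partial \ln \sqrt{\zeta}} = \tilde{K}(b_T; \mu),
        \qquad
        \frac{\mathrm{d} \ln \tilde{F}(x, b_T; \mu, \zeta)}{\mathrm{d} \ln \mu} = \gamma_F\big(\alpha_s(\mu); \zeta/\mu^2\big),
        \]

    with the momentum-space TMD recovered over the full range of kT by the Fourier transform

        \[
        F(x, k_T; \mu, \zeta) = \int \frac{\mathrm{d}^2 \mathbf{b}_T}{(2\pi)^2} \, e^{i \mathbf{k}_T \cdot \mathbf{b}_T} \, \tilde{F}(x, b_T; \mu, \zeta).
        \]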

    General practitioners' perceptions of effective health care

    Objectives: To explore general practitioners' perceptions of effective health care and its application in their own practice; to examine how these perceptions relate to assumptions about clinicians' values and behaviour implicit in the evidence based medicine approach. Design: A qualitative study using semistructured interviews. Setting: Eight general practices in the North Thames region that were part of the Medical Research Council General Practice Research Framework. Participants: 24 general practitioners, three from each practice. Main outcome measures: Respondents' definitions of effective health care, reasons for not practising effectively according to their own criteria, sources of information used to answer clinical questions about patients, reasons for making changes in clinical practice. Results: Three categories of definitions emerged: clinical, patient related, and resource related. Patient factors were the main reason given for not practising effectively; others were lack of time, doctors' lack of knowledge and skills, lack of resources, and "human failings." Main sources of information used in situations of clinical uncertainty were general practitioner partners and hospital doctors. Contact with hospital doctors and observation of hospital practice were just as likely as information from medical and scientific literature to bring about changes in clinical practice. Conclusions: The findings suggest that the central assumptions of the evidence based medicine paradigm may not be shared by many general practitioners, making its application in general practice problematic. The promotion of effective care in general practice requires a broader vision and a more pragmatic approach which takes account of practitioners' concerns and is compatible with the complex nature of their work.

    Material Flow Analysis: Outcome Focus (MFA:OF) for Elucidating the Role of Infrastructure in the Development of a Liveable City

    Engineered infrastructures (i.e., utilities, transport & digital) underpin modern society. Delivering services via these is especially challenging in cities, where differing infrastructures form a web of interdependencies. There must be a step change in how infrastructures deliver services to cities if those cities are to be liveable in the future (i.e., provide for citizen wellbeing, produce less CO2 & ensure the security of the resources they use). Material Flow Analysis (MFA) is a useful methodology for understanding how infrastructures transfer resources to, within and from cities and contribute to the city’s metabolism. Liveable Cities, a five-year research programme, was established to identify & test radical engineering interventions leading to liveable cities of the future. In this paper, the authors propose an outcome-focussed variation on the MFA methodology (MFA:OF), evidenced through work on the resource flows of Birmingham, UK. These flows include water, energy, food & carbon-intensive materials (e.g., steel, paper, glass), as well as their associated waste. The contribution MFA:OF makes to elucidating the interactions & interdependencies between the flows is highlighted, and suggestions are made for how it can contribute to the (radical) rethinking of the engineered infrastructure associated with such flows.
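
    The bookkeeping at the heart of any MFA is a mass balance over each resource flow. As a minimal, purely illustrative sketch (flow names and figures below are hypothetical, not Birmingham data):

        # Minimal sketch of an MFA mass balance for one resource flow.
        # All flow names and numbers are hypothetical, not data from the paper.

        def mass_balance(inflows, outflows, stock_change, tol=1e-6):
            """Check conservation: inflows = outflows + change in stock."""
            residual = sum(inflows.values()) - sum(outflows.values()) - stock_change
            return residual, abs(residual) < tol

        water = {
            "inflows": {"mains_supply": 340.0, "rainfall_harvested": 12.0},  # ML/day
            "outflows": {"wastewater": 310.0, "evaporative_loss": 30.0},
            "stock_change": 12.0,  # net gain held in reservoirs and tanks
        }

        residual, balanced = mass_balance(water["inflows"], water["outflows"],
                                          water["stock_change"])
        print(f"residual = {residual:+.1f} ML/day, balanced = {balanced}")

    A non-zero residual flags either an unmeasured flow or inconsistent source data, which is what makes the balance useful when tracing interdependencies between flows.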

    The latent process decomposition of cDNA microarray data sets

    We present a new computational technique (a software implementation, data sets, and supplementary information are available at http://www.enm.bris.ac.uk/lpd/) which enables the probabilistic analysis of cDNA microarray data, and we demonstrate its effectiveness in identifying features of biomedical importance. A hierarchical Bayesian model, called latent process decomposition (LPD), is introduced in which each sample in the data set is represented as a combinatorial mixture over a finite set of latent processes, which are expected to correspond to biological processes. Parameters in the model are estimated using efficient variational methods. This type of probabilistic model is most appropriate for the interpretation of measurement data generated by cDNA microarray technology. For determining informative substructure in such data sets, the proposed model has several important advantages over the standard use of dendrograms. First, it can objectively assess the optimal number of sample clusters. Second, it can represent samples and gene expression levels using a common set of latent variables (dendrograms cluster samples and gene expression values separately, which amounts to two distinct reduced-space representations). Third, in contrast to standard cluster models, observations are not assigned to a single cluster; thus, for example, gene expression levels are modeled via combinations of the latent processes identified by the algorithm. We show that this new method compares favorably with alternative cluster analysis methods. To illustrate its potential, we apply the proposed technique to several microarray data sets for cancer. For these data sets it successfully decomposes the data into known subtypes and indicates possible further taxonomic subdivision, in addition to highlighting, in a wholly unsupervised manner, the importance of certain genes which are known to be medically significant. To demonstrate its wider applicability, we also illustrate its performance on a microarray data set for yeast.
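
    To make the generative model concrete, here is a stripped-down maximum-likelihood analogue of LPD written as plain EM in NumPy (the paper itself uses a variational Bayesian treatment; this toy version is illustrative only):

        import numpy as np

        def lpd_em(X, K, n_iter=200, seed=0):
            """Toy EM for an LPD-like admixture model.

            X: (A samples, G genes) expression matrix; K: number of latent
            processes. Each value X[a, g] is modelled by picking a process
            k ~ theta[a], then drawing x ~ N(mu[g, k], sigma2[g, k])."""
            rng = np.random.default_rng(seed)
            A, G = X.shape
            theta = np.full((A, K), 1.0 / K)            # per-sample process weights
            mu = X.mean(0)[:, None] + rng.normal(0, X.std(), (G, K))
            sigma2 = np.full((G, K), X.var(0).mean())
            for _ in range(n_iter):
                # E-step: responsibility of each process for each value
                log_p = (-0.5 * (X[:, :, None] - mu[None]) ** 2 / sigma2[None]
                         - 0.5 * np.log(2 * np.pi * sigma2[None])
                         + np.log(theta)[:, None, :])   # shape (A, G, K)
                log_p -= log_p.max(-1, keepdims=True)   # numerical stability
                r = np.exp(log_p)
                r /= r.sum(-1, keepdims=True)
                # M-step: closed-form updates
                theta = r.mean(1)
                w = r.sum(0) + 1e-12
                mu = (r * X[:, :, None]).sum(0) / w
                sigma2 = (r * (X[:, :, None] - mu[None]) ** 2).sum(0) / w + 1e-6
            return theta, mu, sigma2

    The rows of theta are the per-sample decompositions over latent processes: each sample is a mixture rather than a member of a single cluster, which is the property the abstract highlights over dendrograms.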

    Multivariable Repetitive-predictive Controllers using Frequency Decomposition

    Repetitive control is a methodology for the tracking of a periodic reference signal. This paper develops a new approach to repetitive control systems design using receding horizon control with frequency decomposition of the reference signal. Moreover, design and implementation issues for this form of repetitive predictive control are investigated from the perspectives of controller complexity and the effects of measurement noise. The analysis is supported by a simulation study on a multi-input multi-output robot arm, where the model has been constructed from measured frequency response data, and by experimental results from application to an industrial AC motor.
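
    The frequency-decomposition step can be illustrated in a few lines: take one period of the reference, extract its dominant harmonics with an FFT, and let each retained harmonic be handled by its own internal-model term. (The signal below is hypothetical; this is a sketch of the idea, not the controller design from the paper.)

        import numpy as np

        def dominant_harmonics(ref, n_keep):
            """Return the n_keep largest Fourier components of one period
            of a reference signal as (harmonic index, complex coefficient)."""
            spectrum = np.fft.rfft(ref) / len(ref)
            order = np.argsort(np.abs(spectrum))[::-1][:n_keep]
            return [(int(k), spectrum[k]) for k in sorted(order)]

        # Hypothetical periodic reference: fundamental plus two harmonics plus noise.
        N = 512
        t = np.arange(N) / N
        ref = (np.sin(2 * np.pi * t) + 0.4 * np.sin(2 * np.pi * 3 * t + 0.5)
               + 0.2 * np.sin(2 * np.pi * 5 * t)
               + 0.05 * np.random.default_rng(0).normal(size=N))
        for k, c in dominant_harmonics(ref, 3):
            print(f"harmonic {k}: amplitude ~ {2 * abs(c):.3f}")

    Truncating the decomposition to the dominant harmonics is one way to trade tracking accuracy against controller complexity and noise sensitivity, the trade-off the paper investigates.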

    INS3D: An incompressible Navier-Stokes code in generalized three-dimensional coordinates

    The operation of the INS3D code, which computes steady-state solutions to the incompressible Navier-Stokes equations, is described. The flow solver utilizes a pseudocompressibility approach combined with an approximate factorization scheme. This manual describes the key operating features needed to orient new users, including the organization of the code, a description of the input parameters, a description of each subroutine, and sample problems. Details of more extended operations, including possible code modifications, are given in the appendix.
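
    In schematic form, the pseudocompressibility (artificial compressibility) approach replaces the divergence-free constraint with a pseudo-time pressure equation (this outline is the standard formulation, not INS3D's exact discretization, which adds the approximate factorization on top):

        \[
        \frac{\partial p}{\partial \tau} + \beta \, \nabla \cdot \mathbf{u} = 0,
        \qquad
        \frac{\partial \mathbf{u}}{\partial \tau} + (\mathbf{u} \cdot \nabla)\mathbf{u}
        = -\nabla p + \nu \nabla^{2} \mathbf{u},
        \]

    where beta is the artificial compressibility parameter. Marching to a steady state in pseudo-time tau drives the pressure derivative to zero, recovering the incompressible condition \nabla \cdot \mathbf{u} = 0.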

    Old and New Fields on Super Riemann Surfaces

    The "new fields" or "superconformal functions" on N=1 super Riemann surfaces introduced recently by Rogers and Langer are shown to coincide with the Abelian differentials (plus constants), viewed as a subset of the functions on the associated N=2 super Riemann surface. We confirm that, as originally defined, they do not form a super vector space. Comment: 9 pages, LaTeX. Published version: minor changes for clarity, two new references

    High-Performance Bioinstrumentation for Real-Time Neuroelectrochemical Traumatic Brain Injury Monitoring

    Traumatic brain injury (TBI) has been identified as an important cause of death and severe disability in all age groups, particularly in children and young adults. Central to TBI's devastation is a delayed secondary injury, which occurs in 30–40% of TBI patients each year while they are in the hospital Intensive Care Unit (ICU). Secondary injuries reduce the survival rate after TBI and usually occur within 7 days post-injury. State-of-the-art monitoring of secondary brain injuries benefits from the acquisition of high-quality and time-aligned electrical data, i.e., ElectroCorticoGraphy (ECoG) recorded by means of strip electrodes placed on the brain's surface, and neurochemical data obtained via rapid-sampling microdialysis and microfluidics-based biosensors measuring brain tissue levels of glucose, lactate and potassium. This article progresses the field of multi-modal monitoring of the injured human brain by presenting the design and realization of a new, compact, medical-grade amperometry, potentiometry and ECoG recording bioinstrumentation. Our combined TBI instrument enables the high-precision, real-time neuroelectrochemical monitoring of TBI patients who have undergone craniotomy neurosurgery and are treated under sedation in the ICU. Electrical and neurochemical test measurements are presented, confirming the high performance of the reported TBI bioinstrumentation.
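
    A practical detail behind the "time-aligned" requirement is that the electrical and chemical channels arrive at very different rates. A minimal resampling sketch (channel names and rates are illustrative, not the instrument's actual interface):

        import numpy as np

        def align_to_ecog(ecog_t, chem_t, chem_v):
            """Interpolate a slow biosensor channel onto the ECoG time base so
            both modalities share one sample grid (np.interp holds end values)."""
            return np.interp(ecog_t, chem_t, chem_v)

        # Hypothetical rates: ECoG at 1 kHz, microdialysis lactate at 1 Hz.
        ecog_t = np.arange(0, 10, 0.001)             # seconds
        chem_t = np.arange(0, 10, 1.0)
        chem_v = 1.2 + 0.05 * np.sin(0.3 * chem_t)   # mmol/L, made-up trace
        lactate_aligned = align_to_ecog(ecog_t, chem_t, chem_v)
        print(lactate_aligned.shape)                 # (10000,), matches the ECoG grid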