
    Tilt testing

    Tilt testing can help to diagnose unexplained syncope by precipitating an episode during cardiac monitoring. The Italian protocol, now the most widely used, involves giving sublingual nitroglycerine after 15 min of passive tilt while monitoring beat-to-beat blood pressure (BP) and recording on video. Tilt testing is time-consuming, but it is clinically useful and can guide therapy. Complications are rare. Syncope types include: vasovagal syncope, where BP falls after more than 3 min of tilt-up and the heart rate falls later; classic orthostatic hypotension, where there is an immediate, progressive BP fall with minimal heart rate change; delayed orthostatic hypotension, with a late BP fall after a stable phase but little or no heart rate rise; psychogenic pseudosyncope, with apparent loss of consciousness but no BP fall and a moderate heart rate rise; and postural orthostatic tachycardia syndrome, where there is a significant heart rate rise but no BP fall.
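The response patterns above amount to a small decision procedure. As a rough illustration only (the function name and all numeric cut-offs, such as the 3-minute onset and the 30 bpm rise, are assumptions for the sketch, not clinical criteria from the text):

```python
def classify_tilt_response(bp_falls, bp_fall_onset_min, hr_delta_bpm,
                           apparent_loc=False):
    """Classify a tilt-test response using the patterns described above.

    bp_falls          -- True if a clear blood-pressure fall is observed
    bp_fall_onset_min -- minutes after tilt-up at which BP starts to fall
    hr_delta_bpm      -- change in heart rate from baseline (bpm)
    apparent_loc      -- apparent loss of consciousness during the test

    The numeric thresholds here are illustrative, not diagnostic criteria.
    """
    if not bp_falls:
        if hr_delta_bpm >= 30:             # significant HR rise, no BP fall
            return "postural orthostatic tachycardia syndrome"
        if apparent_loc:                   # apparent LOC, BP stable
            return "psychogenic pseudosyncope"
        return "normal / non-diagnostic"
    if bp_fall_onset_min > 3 and hr_delta_bpm < 0:
        return "vasovagal syncope"         # late BP fall, then HR falls
    if bp_fall_onset_min <= 1 and abs(hr_delta_bpm) < 10:
        return "classic orthostatic hypotension"
    return "delayed orthostatic hypotension"
```

In practice classification rests on the full beat-to-beat traces and clinical judgement; the sketch only encodes the qualitative patterns listed in the abstract.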

    Approaching invisibility : experiencing the photographs and writings of Minor White

    University of Technology, Sydney. Faculty of Design, Architecture and Building. Within his published writings on photography, Minor White (1908-1976) makes frequent use of the term 'invisible'. While his use of this term is always suggestive, often provocative, and sometimes allusive, his meaning is rarely made clear. Nonetheless, White appears to refer to intangible aspects of photography that go beyond the visible elements of photographs themselves. This thesis aims to elucidate White's use of the term 'invisible' by determining (i) precisely what he is referring to in his use of this word, (ii) where the 'invisible' resides, and (iii) how it is encountered. In order to achieve these objectives, a close examination and analysis of the writings of White is made, with particular emphasis given to fifteen identified uses of the term 'invisible'. Since White's use of this term is always open to interpretation, it is first necessary to establish a comprehensive foundation from which explanations can be made. Hence, the first chapter of the thesis provides a brief overview of the formative years of White's life up until 1946. On the basis that six of the fifteen uses of the term 'invisible' refer directly to Alfred Stieglitz and/or his theory of 'equivalence', an analysis of 'equivalence' from the perspectives of Stieglitz and White respectively is given in chapters two and three. The theory of 'equivalence' invests a photograph with an ability to express more than its literal representation; in doing so, the viewer's subjective experience is paramount. In addition to analysis of the writings of Stieglitz and White, the writings of post-Stieglitz photographic critics and commentators such as Peter Bunnell, Joel Eisinger, Allan Sekula and John Szarkowski are also examined.
The thesis then assigns each of White's uses of this term to one of three categories developed in my research, which I have named 'extra-invisibility', 'intra-invisibility' and 'inter-invisibility'. It will thus be shown that the majority of the occasions on which White uses the term 'invisible' pertain both to the viewer's experience of photographs and to the affective qualities of the photograph. While the meaning of White's term 'invisible' is not always the same, the thesis concludes that the usage that dominates within his writing pertains to feeling states that are evoked within a viewer's internal world via his or her interaction with a photograph. How such experiences of invisibility are encountered is thus determined by the viewer's personal background and approach to the photograph, by the social context in which the image is seen, and, to some extent, by the visible elements of the photograph itself.

    Spatio-temporal Models of Lymphangiogenesis in Wound Healing

    Several studies suggest that one possible cause of impaired wound healing is failed or insufficient lymphangiogenesis, that is, the formation of new lymphatic capillaries. Although many mathematical models have been developed to describe the formation of blood capillaries (angiogenesis), very few have been proposed for the regeneration of the lymphatic network. Lymphangiogenesis is a markedly different process from angiogenesis, occurring at different times and in response to different chemical stimuli. Two main hypotheses have been proposed: 1) lymphatic capillaries sprout from existing interrupted ones at the edge of the wound, in analogy to the blood angiogenesis case; 2) lymphatic endothelial cells first pool in the wound region following the lymph flow and then, once sufficiently populated, start to form a network. Here we present two PDE models describing lymphangiogenesis according to these two different hypotheses. Further, we include the effect of advection due to interstitial flow and lymph flow coming from open capillaries. The variables represent different cell densities and growth factor concentrations, and where possible the parameters are estimated from biological data. The models are then solved numerically and the results are compared with the available biological literature.
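The building blocks of such models are advection-diffusion-reaction equations for cell densities. As a generic illustration only (this is not the authors' model; the scheme, parameters D, v, r, K, and the periodic boundary are all assumptions for the sketch), a minimal explicit time step for one density c(x, t) obeying dc/dt = D d²c/dx² − v dc/dx + r c (1 − c/K) might look like:

```python
import numpy as np

def step(c, dx, dt, D=1e-3, v=0.05, r=0.5, K=1.0):
    """One explicit time step of a toy 1-D advection-diffusion-reaction
    equation with logistic growth.  Periodic boundaries via np.roll;
    upwind differencing for the advection term (v > 0 assumed)."""
    lap = (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2   # diffusion
    adv = (c - np.roll(c, 1)) / dx                           # upwind advection
    return c + dt * (D * lap - v * adv + r * c * (1 - c / K))

x = np.linspace(0.0, 1.0, 200)
c = np.exp(-((x - 0.2) ** 2) / 0.005)   # initial pulse of cells near one edge
for _ in range(500):
    c = step(c, dx=x[1] - x[0], dt=1e-3)
```

The chosen dt keeps the explicit scheme stable (D·dt/dx² well below 0.5 and v·dt/dx well below 1); a real model of this kind would couple several such equations for cell densities and growth factor concentrations.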

    SNAIL vs vitamin D receptor expression in colon cancer: therapeutic implications

    Vitamin D analogues with reduced hypercalcemic activity are under clinical investigation for use against colon cancer and other neoplasias. However, only a subset of patients responds to this therapy, most probably due to loss of vitamin D receptor (VDR) expression during tumour progression. Recent data show that the SNAIL transcription factor represses VDR expression, and thus abolishes the antiproliferative and prodifferentiation effects of VDR ligands in cultured cancer cells and their antitumour action in xenografted mice. Accordingly, upregulation of SNAIL in human colon tumours is associated with downregulation of VDR. These findings suggest that SNAIL may be associated with loss of responsiveness to vitamin D analogues and may thus be used as an indicator of patients who are unlikely to respond to this therapy.

    Plant Trait Diversity Buffers Variability in Denitrification Potential over Changes in Season and Soil Conditions

    BACKGROUND: Denitrification is an important ecosystem service that removes nitrogen (N) from N-polluted watersheds, buffering soil, stream, and river water quality from excess N by returning N to the atmosphere before it reaches lakes or oceans and leads to eutrophication. The denitrification enzyme activity (DEA) assay is widely used for measuring denitrification potential. Because DEA is a function of enzyme levels in soils, most ecologists studying denitrification have assumed that DEA is less sensitive to ambient levels of nitrate (NO3-) and soil carbon, and thus less variable over time, than field measurements. In addition, plant diversity has been shown to have strong effects on microbial communities and belowground processes and could potentially alter the functional capacity of denitrifiers. Here, we examined three questions: (1) Does DEA vary through the growing season? (2) If so, can we predict DEA variability with environmental variables? (3) Does plant functional diversity affect DEA variability? METHODOLOGY/PRINCIPAL FINDINGS: The study site is a restored wetland in North Carolina, US, with native wetland herbs planted in monocultures or mixes of four or eight species. We found that denitrification potentials for soils collected in July 2006 were significantly greater than for soils collected in May and late August 2006 (p<0.0001). Similarly, microbial biomass standardized DEA rates were significantly greater in July than in May and August (p<0.0001). Of the soil variables measured (soil moisture, organic matter, total inorganic nitrogen, and microbial biomass), none consistently explained the pattern observed in DEA through time. There was no significant relationship between DEA and plant species richness or functional diversity. However, the seasonal variance in microbial biomass standardized DEA rates was significantly inversely related to plant species functional diversity (p<0.01).
CONCLUSIONS/SIGNIFICANCE: These findings suggest that higher plant functional diversity may support a more constant level of DEA through time, buffering the ecosystem from changes in season and soil conditions.
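The variance-vs-diversity analysis can be sketched in a few lines. Everything below is an illustrative placeholder (synthetic numbers, a simple Pearson correlation rather than the study's statistical test):

```python
import numpy as np

# Per-plot seasonal variance of DEA rates vs plant functional diversity.
# All numbers are synthetic placeholders, NOT data from the study.
rng = np.random.default_rng(0)
functional_diversity = np.repeat([1.0, 2.0, 4.0], 8)   # e.g. groups per plot
# draws whose spread shrinks with diversity, mimicking the reported pattern
dea_by_season = np.array([rng.normal(10.0, 4.0 / fd, size=3)
                          for fd in functional_diversity])  # May/July/August

seasonal_var = dea_by_season.var(axis=1, ddof=1)       # variance across seasons
r = np.corrcoef(functional_diversity, seasonal_var)[0, 1]
print(f"correlation(diversity, seasonal variance) = {r:.2f}")
```

A negative correlation here would mirror the inverse relationship the abstract reports; the real analysis would of course use the measured rates and an appropriate significance test.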

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise description of findings and reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall's tau for dichotomous variables or Jonckheere-Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was found to be most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was found to be a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by multiple grades of surgeons to facilitate audit, training assessment and research.
It provides a tool for reporting operative findings, disease severity and technical difficulty and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
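AUROC for an ordinal predictor such as a difficulty grade has a simple rank-sum (Mann-Whitney) form. A minimal sketch, with made-up grades and outcomes (this is not the study's code or data):

```python
import numpy as np
from scipy.stats import rankdata

def auroc(score, outcome):
    """AUROC: probability that a random positive case outranks a random
    negative one (ties counted as 0.5), via the rank-sum identity."""
    score = np.asarray(score, dtype=float)
    outcome = np.asarray(outcome, dtype=bool)
    r = rankdata(score)                        # average ranks, ties shared
    n_pos, n_neg = outcome.sum(), (~outcome).sum()
    return (r[outcome].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

grades  = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5]   # hypothetical difficulty grades
convert = [0, 0, 0, 0, 0, 1, 0, 1, 1, 1]   # hypothetical conversion outcome
print(f"AUROC = {auroc(grades, convert):.2f}")
```

With many tied grades (as with a 1-5 scale) this tie-aware form matters; a naive pairwise comparison without the 0.5 tie credit would understate the discrimination.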

    Jet energy measurement with the ATLAS detector in proton-proton collisions at √s = 7 TeV

    The jet energy scale and its systematic uncertainty are determined for jets measured with the ATLAS detector at the LHC in proton-proton collision data at a centre-of-mass energy of √s = 7 TeV, corresponding to an integrated luminosity of 38 pb-1. Jets are reconstructed with the anti-kt algorithm with distance parameters R = 0.4 or R = 0.6. Jet energy and angle corrections are determined from Monte Carlo simulations to calibrate jets with transverse momenta pT ≥ 20 GeV and pseudorapidities |η| < 4.5. The jet energy systematic uncertainty is estimated using the single isolated hadron response measured in situ and in test-beams, exploiting the transverse momentum balance between central and forward jets in events with dijet topologies, and studying systematic variations in Monte Carlo simulations. The jet energy uncertainty is less than 2.5% in the central calorimeter region (|η| < 0.8) for jets with 60 ≤ pT < 800 GeV, and is at most 14% for pT < 30 GeV in the most forward region 3.2 ≤ |η| < 4.5. The jet energy is validated for jet transverse momenta up to 1 TeV to the level of a few percent using several in situ techniques, by comparing to a well-known reference such as the recoiling photon pT, the sum of the transverse momenta of tracks associated to the jet, or a system of low-pT jets recoiling against a high-pT jet. More sophisticated jet calibration schemes are presented, based on calorimeter cell energy density weighting or hadronic properties of jets, aiming for an improved jet energy resolution and a reduced flavour dependence of the jet response. The systematic uncertainty of the jet energy determined from a combination of in situ techniques is consistent with the one derived from single hadron response measurements over a wide kinematic range. The nominal corrections and uncertainties are derived for isolated jets in an inclusive sample of high-pT jets.
Special cases such as event topologies with close-by jets, or selections of samples with an enhanced content of jets originating from light quarks, heavy quarks or gluons, are also discussed and the corresponding uncertainties are determined. © 2013 CERN for the benefit of the ATLAS collaboration.
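The dijet pT-balance idea mentioned above reduces to simple arithmetic: the asymmetry between a well-calibrated reference jet and a probe jet implies a relative response. A minimal sketch with illustrative momenta (not the analysis's actual definitions or numbers):

```python
def pt_asymmetry(pt_probe, pt_ref):
    """Balance asymmetry A = (probe - ref) / mean(probe, ref)."""
    return (pt_probe - pt_ref) / (0.5 * (pt_probe + pt_ref))

def relative_response(asym):
    """Invert A = 2(x - 1)/(x + 1) for x = pT_probe / pT_ref."""
    return (2 + asym) / (2 - asym)

# e.g. a forward probe jet measuring 95 GeV against a 100 GeV central reference
a = pt_asymmetry(95.0, 100.0)
print(relative_response(a))   # relative response of 0.95
```

In the real calibration this is done in bins of pT and η, averaging the asymmetry over many events before inverting, so that resolution fluctuations cancel.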

    Measurement of the inclusive and dijet cross-sections of b-jets in pp collisions at sqrt(s) = 7 TeV with the ATLAS detector

    The inclusive and dijet production cross-sections have been measured for jets containing b-hadrons (b-jets) in proton-proton collisions at a centre-of-mass energy of sqrt(s) = 7 TeV, using the ATLAS detector at the LHC. The measurements use data corresponding to an integrated luminosity of 34 pb^-1. The b-jets are identified using either a lifetime-based method, in which secondary decay vertices of b-hadrons in jets are reconstructed using information from the tracking detectors, or a muon-based method, in which the presence of a muon is used to identify semileptonic decays of b-hadrons inside jets. The inclusive b-jet cross-section is measured as a function of transverse momentum in the range 20 < pT < 400 GeV and rapidity in the range |y| < 2.1. The bbbar-dijet cross-section is measured as a function of the dijet invariant mass in the range 110 < m_jj < 760 GeV, the azimuthal angle difference between the two jets, and the angular variable chi in two dijet mass regions. The results are compared with next-to-leading-order QCD predictions. Good agreement is observed between the measured cross-sections and the predictions obtained using POWHEG + Pythia. MC@NLO + Herwig shows good agreement with the measured bbbar-dijet cross-section. However, it does not reproduce the measured inclusive cross-section well, particularly for central b-jets with large transverse momenta. Final version published in the European Physical Journal.

    Comparing De Novo Genome Assembly: The Long and Short of It

    Recent advances in DNA sequencing technology and their focal role in Genome Wide Association Studies (GWAS) have rekindled a growing interest in the whole-genome sequence assembly (WGSA) problem, thereby inundating the field with a plethora of new formalizations, algorithms, heuristics and implementations. And yet, scant attention has been paid to comparative assessments of these assemblers' quality and accuracy. No commonly accepted and standardized method for comparison exists yet. Even worse, widely used metrics to compare the assembled sequences emphasize only size, poorly capturing contig quality and accuracy. This paper addresses these concerns: it highlights common anomalies in assembly accuracy through a rigorous study of several assemblers, compared under both standard metrics (N50, coverage, contig sizes, etc.) as well as a more comprehensive metric (Feature-Response Curves, FRC) that is introduced here; FRC transparently captures the trade-offs between contigs' quality and their sizes. For this purpose, most of the publicly available major sequence assemblers – both for low-coverage long (Sanger) and high-coverage short (Illumina) read technologies – are compared. These assemblers are applied to microbial (Escherichia coli, Brucella, Wolbachia, Staphylococcus, Helicobacter) and partial human genome sequences (Chr. Y), using sequence reads of various read-lengths, coverages, accuracies, and with and without mate-pairs. It is hoped that, based on these evaluations, computational biologists will identify innovative sequence assembly paradigms, bioinformaticists will determine promising approaches for developing "next-generation" assemblers, and biotechnologists will formulate more meaningful design desiderata for sequencing technology platforms. A new software tool for computing the FRC metric has been developed and is available through the AMOS open-source consortium.
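The N50 statistic criticised above as size-only is easy to state precisely: it is the contig length at which contigs of that length or longer cover at least half of the total assembly. A minimal illustrative helper (standard definition; this is not the paper's FRC tool):

```python
def n50(contig_lengths):
    """N50: largest length L such that contigs of length >= L together
    contain at least half of the total assembled bases."""
    total = sum(contig_lengths)
    running = 0
    for length in sorted(contig_lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length

print(n50([100, 80, 50, 30, 20]))   # total 280; 100+80 >= 140, so N50 = 80
```

The example shows why N50 says nothing about correctness: joining two contigs across a misassembly raises N50 while making the assembly worse, which is exactly the gap the FRC metric is meant to close.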

    Pretense and Imagination

    Issues of pretense and imagination are of central interest to philosophers, psychologists, and researchers in allied fields. In this entry, we provide a roadmap of some of the central themes around which discussion has been focused. We begin with an overview of pretense, imagination, and the relationship between them. We then shift our attention to four specific topics where the disciplines' research programs have intersected or where additional interactions could prove mutually beneficial: the psychological underpinnings of performing pretense and of recognizing pretense, the cognitive capacities involved in imaginative engagement with fictions, and the real-world impact of make-believe. In the final section, we discuss more briefly a number of other mental activities that arguably involve imagining, including counterfactual reasoning, delusions, and dreaming.