51 research outputs found

    Reliability and validity of methods in the assessment of cold-induced shivering thermogenesis

    Get PDF
    Purpose: To compare two analytical methods for the estimation of the shivering onset inflection point, segmental regression and visual inspection of data, and to assess the test–retest reliability and validity of four metrics of shivering measurement: oxygen uptake (V̇O2), electromyography (EMG), mechanomyography (MMG) and the bedside shivering assessment scale (BSAS). Methods: Ten volunteers attended three identical experimental sessions involving passive deep-body cooling via cold water immersion at 10 °C. V̇O2, EMG, and MMG were continuously assessed, while the time elapsed at each BSAS stage was recorded. Metrics were graphed as a function of time and rectal temperature (Tre). Inflection points for intermittent and constant shivering were visually identified for every graph and compared to segmental regression. Results: Excellent agreement was seen between segmental regression and visual inspection (ICC, 0.92). All measurement metrics presented good-to-excellent test–retest reliability (ICCs > 0.75 and > 0.90, respectively), with the exception of visual identification of intermittent shivering for V̇O2 measurement (ICC, 0.73) and segmental regression for EMG measurement (ICC, 0.74). In the assessment of signal-to-noise ratio (SNR), EMG showed the largest SNR at the point of shivering onset, followed by MMG and finally V̇O2. Conclusions: Segmental regression provides a successful analytical method for identifying shivering onset. Good-to-excellent reliability can be seen across V̇O2, EMG, MMG, and BSAS, yet given the observed lag times and SNRs, along with the known advantages/disadvantages of each metric, it is recommended that no single metric be used in isolation. An integrative, real-time measure of shivering is proposed.
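    The segmental (two-segment) regression used above to locate the shivering-onset inflection point can be sketched as a simple breakpoint search: fit one least-squares line to each side of every candidate breakpoint and keep the split with the lowest total squared error. This is a minimal illustration on an invented synthetic trace; the function names and data are assumptions for the sketch, not taken from the study.

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for one segment."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return slope, my - slope * mx

def segment_sse(xs, ys):
    """Sum of squared residuals of one fitted segment."""
    slope, icpt = fit_line(xs, ys)
    return sum((y - (slope * x + icpt)) ** 2 for x, y in zip(xs, ys))

def segmental_regression(ts, ys):
    """Return the breakpoint whose two-segment fit minimises total SSE."""
    best_i = min(range(2, len(ts) - 2),   # keep at least 2 points per segment
                 key=lambda i: segment_sse(ts[:i], ys[:i]) +
                               segment_sse(ts[i:], ys[i:]))
    return ts[best_i]

# Synthetic trace: a flat "baseline" metric that starts rising at t = 20,
# standing in for V̇O2/EMG/MMG plotted against time.
ts = list(range(40))
ys = [1.0 if t < 20 else 1.0 + 0.3 * (t - 20) for t in ts]
onset = segmental_regression(ts, ys)   # lands at (or adjacent to) t = 20
```

A real analysis would run this against the metric graphed as a function of time or Tre, as the abstract describes, rather than a noiseless synthetic series.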

    Independent and combined impact of hypoxia and acute inorganic nitrate ingestion on thermoregulatory responses to the cold

    Get PDF
    Purpose: This study assessed the impact of normobaric hypoxia and acute nitrate ingestion on shivering thermogenesis, cutaneous vascular control, and thermometrics in response to cold stress. Method: Eleven male volunteers underwent passive cooling at 10 °C air temperature across four conditions: (1) normoxia with placebo ingestion, (2) hypoxia (0.130 FiO2) with placebo ingestion, (3) normoxia with 13 mmol nitrate ingestion, and (4) hypoxia with nitrate ingestion. Physiological metrics were assessed as a rate of change over 45 min to determine heat loss, and at the point of shivering onset to determine the thermogenic thermoeffector threshold. Result: Independently, hypoxia expedited shivering onset time (p = 0.05) due to a faster cooling rate as opposed to a change in central thermoeffector thresholds. Specifically, compared to normoxia, hypoxia increased skin blood flow (p = 0.02), leading to an increased core-cooling rate (p = 0.04) and delta change in rectal temperature (p = 0.03) over 45 min, yet the same rectal temperature at shivering onset (p = 0.9). Independently, nitrate ingestion delayed shivering onset time (p = 0.01), mediated by a change in central thermoeffector thresholds, independent of changes in peripheral heat exchange. Specifically, compared to placebo ingestion, no difference was observed in skin blood flow (p = 0.5), core-cooling rate (p = 0.5), or delta change in rectal temperature (p = 0.7) over 45 min, while nitrate reduced rectal temperature at shivering onset (p = 0.04). No interaction was observed between hypoxia and nitrate ingestion. Conclusion: These data improve our understanding of how hypoxia and nitric oxide modulate cold thermoregulation.

    Subjective responses to display bezel characteristics

    Get PDF
    © 2015. This manuscript version is made available under the CC-BY-NC-ND 4.0 license: http://creativecommons.org/licenses/by-nc-nd/4.0/. High quality flat panel computer displays (FPDs) with high resolution screens are now commonplace, and black, grey, white, beige and silver surrounds (‘bezels’), matt or glossy, are in widespread use. It has been suggested that bezels with high reflectance, or with a high gloss, could cause eyestrain, and we have investigated this issue. Twenty office workers (unaware of the study purpose) used six different FPDs, for a week each, at their own desk. These displays were identical apart from the bezel colour (black, white or silver) and shininess (matt or glossy). Participants completed questionnaires about their visual comfort at the end of each week, and were fully debriefed in lunch-time focus groups at the end of the study. For the white and the silver bezels, the glossiness of the bezel was not an issue of concern. The participants were significantly less content with the glossy black surround than with the matt black surround, and in general the glossy black bezel was the least-liked of all those used. With the possible exception of this surround, there was no evidence of significantly increased visual discomfort, indicative of eyestrain, as a result of high or low bezel reflectance, or of high glossiness.

    How well do second-year students learn physical diagnosis? Observational study of an objective structured clinical examination (OSCE)

    Get PDF
    BACKGROUND: Little is known about using the Objective Structured Clinical Examination (OSCE) in physical diagnosis courses. The purpose of this study was to describe student performance on an OSCE in a physical diagnosis course. METHODS: Cross-sectional study at Harvard Medical School, 1997–1999, for 489 second-year students. RESULTS: Average total OSCE score was 57% (range 39–75%). Among clinical skills, students scored highest on patient interaction (72%), followed by examination technique (65%), abnormality identification (62%), history-taking (60%), patient presentation (60%), physical examination knowledge (47%), and differential diagnosis (40%) (p < .0001). Among 16 OSCE stations, scores ranged from 70% for arthritis to 29% for calf pain (p < .0001). Teaching sites accounted for larger adjusted differences in station scores, up to 28%, than in skill scores (9%) (p < .0001). CONCLUSIONS: Students scored higher on interpersonal and technical skills than on interpretive or integrative skills. Station scores identified specific content that needs improved teaching.

    Between the Vinča and Linearbandkeramik worlds: the diversity of practices and identities in the 54th–53rd centuries cal BC in south-west Hungary and beyond

    Get PDF
    Szederkény-Kukorica-dűlő is a large settlement in south-east Transdanubia, Hungary, excavated in advance of road construction, which is notable for its combination of pottery styles, variously including Vinča A, Ražište and LBK, and longhouses of a kind otherwise familiar from the LBK world. Formal modelling of its date establishes that the site probably began in the later 54th century cal BC, lasting until the first decades of the 52nd century cal BC. Occupation, featuring longhouses, pits and graves, probably began at the same time on the east and west parts of the settlement, the central part starting a decade or two later; the western part was probably abandoned last. Vinča pottery is predominantly associated with the east and central parts of the site, and Ražište pottery with the west. Formal modelling of the early history and diaspora of longhouses in the LBK world suggests their emergence in the Formative LBK of Transdanubia c. 5500 cal BC and then rapid diaspora in the middle of the 54th century cal BC, associated with the ‘earliest’ (älteste) LBK. The adoption of longhouses at Szederkény thus appears to come a few generations after the start of the diaspora. Rather than explaining the mixture of things, practices and perhaps people at Szederkény by reference to problematic notions such as hybridity, we propose instead a more fluid and varied vocabulary including combination and amalgamation, relationships and performance in the flow of social life, and networks; this makes greater allowance for diversity and interleaving in a context of rapid change.

    Oncogenic BRAF, unrestrained by TGFβ-receptor signalling, drives right-sided colonic tumorigenesis

    Get PDF
    Right-sided (proximal) colorectal cancer (CRC) has a poor prognosis and a distinct mutational profile, characterized by oncogenic BRAF mutations and aberrations in mismatch repair and TGFβ signalling. Here, we describe a mouse model of right-sided colon cancer driven by oncogenic BRAF and loss of epithelial TGFβ-receptor signalling. The proximal colonic tumours that develop in this model exhibit a foetal-like progenitor phenotype (Ly6a/Sca1+) and, importantly, lack expression of Lgr5 and its associated intestinal stem cell signature. These features are recapitulated in human BRAF-mutant, right-sided CRCs and represent fundamental differences between left- and right-sided disease. Microbial-driven inflammation supports the initiation and progression of these tumours with foetal-like characteristics, consistent with their predilection for the microbe-rich right colon and their antibiotic sensitivity. While MAPK-pathway activating mutations drive this foetal-like signature via ERK-dependent activation of the transcriptional coactivator YAP, the same foetal-like transcriptional programs are also initiated by inflammation in a MAPK-independent manner. Importantly, in both contexts, epithelial TGFβ-receptor signalling is instrumental in suppressing the tumorigenic potential of these foetal-like progenitor cells.

    Circulating microRNAs in sera correlate with soluble biomarkers of immune activation but do not predict mortality in ART treated individuals with HIV-1 infection: A case control study

    Get PDF
    Introduction: The use of anti-retroviral therapy (ART) has dramatically reduced HIV-1 associated morbidity and mortality. However, HIV-1 infected individuals have increased rates of morbidity and mortality compared to the non-HIV-1 infected population, and this appears to be related to end-organ diseases collectively referred to as Serious Non-AIDS Events (SNAEs). Circulating miRNAs are reported as promising biomarkers for a number of human disease conditions, including those that constitute SNAEs. Our study sought to investigate the potential of selected miRNAs in predicting mortality in HIV-1 infected ART treated individuals. Materials and Methods: A set of miRNAs was chosen based on published associations with human disease conditions that constitute SNAEs. This case-control study compared 126 cases (individuals who died whilst on therapy) and 247 matched controls (individuals who remained alive). Cases and controls were ART treated participants of two pivotal HIV-1 trials. The relative abundance of each miRNA in serum was measured by RT-qPCR. Associations with mortality (all-cause, cardiovascular and malignancy) were assessed by logistic regression analysis. Correlations between miRNAs and CD4+ T cell count, hs-CRP, IL-6 and D-dimer were also assessed. Results: None of the selected miRNAs was associated with all-cause, cardiovascular or malignancy mortality. The levels of three miRNAs (miR-21, -122 and -200a) correlated with IL-6, while miR-21 also correlated with D-dimer. Additionally, the abundance of miR-31, -150 and -223 correlated with baseline CD4+ T cell count, while the same three miRNAs plus miR-145 correlated with nadir CD4+ T cell count. Discussion: No associations with mortality were found with any circulating miRNA studied. These results cast doubt onto the effectiveness of circulating miRNAs as early predictors of mortality, or of the major underlying diseases that contribute to mortality, in participants treated for HIV-1 infection.

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Get PDF
    Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (> 3 mo apart) eGFR < 60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, with progressively higher chances in the medium and high risk (risk score >= 5, 505 events) groups. Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance, allowing patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD.
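    The construction described above (adjusted incidence rate ratios for nine categorical predictors, scaled to integer points and summed, then banded into low/medium/high groups) can be sketched as follows. The point values and the band cut-offs are invented placeholders for illustration, not the published D:A:D weights; in the published model some reference categories carry negative points, which is why baseline scores (median -2 in the abstract) can fall below zero, a detail this sketch omits.

```python
# Illustrative point weights for the nine categorical predictors named in
# the abstract. These numbers are made up for the sketch; they are NOT the
# published D:A:D coefficients.
ILLUSTRATIVE_POINTS = {
    "older_age": 4,
    "intravenous_drug_use": 2,
    "hepatitis_c_coinfection": 2,
    "lower_baseline_egfr": 4,
    "female": 1,
    "lower_cd4_nadir": 2,
    "hypertension": 2,
    "diabetes": 2,
    "cardiovascular_disease": 2,
}

def risk_score(patient):
    """Sum the points of every risk factor flagged True in the record."""
    return sum(points for factor, points in ILLUSTRATIVE_POINTS.items()
               if patient.get(factor, False))

def risk_group(score):
    """Band a score into low/medium/high; the cut-offs are assumptions."""
    if score >= 5:
        return "high"
    return "medium" if score >= 0 else "low"

patient = {"hepatitis_c_coinfection": True, "diabetes": True}
group = risk_group(risk_score(patient))   # score 4 -> "medium"
```

In practice the weights come from a multivariable regression on the derivation cohort, and the banded groups are then checked against independent cohorts, as the external validation in the Royal Free and SMART/ESPRIT data does here.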
