
    Reading, Trauma and Literary Caregiving 1914-1918: Helen Mary Gaskell and the War Library

    This article is about the relationship between reading, trauma and responsive literary caregiving in Britain during the First World War. Its analysis of two little-known documents describing the history of the War Library, begun by Helen Mary Gaskell in 1914, exposes a gap in the scholarship of war-time reading; generates a new narrative of "how," "when," and "why" books went to war; and foregrounds gender in its analysis of the historiography. The Library of Congress's T. W. Koch discovered Gaskell's ground-breaking work in 1917 and reported its successes to the American Library Association. The British newspaper The Times also covered Gaskell's library, yet researchers working on reading during the war have routinely neglected her distinct model and method, skewing the research base on war-time reading and its association with trauma and caregiving. In the article's second half, a literary case study of a popular war novel demonstrates the extent of the "bitter cry for books." The success of Gaskell's intervention is examined alongside H. G. Wells's representation of textual healing. Reading is shown to offer sick, traumatized and recovering combatants emotional and psychological caregiving in ways that Gaskell could not always have predicted and that are not visible in the literary/historical record.

    A critical experimental study of the classical tactile threshold theory

    Background: The tactile sense is used in a variety of applications involving tactile human-machine interfaces. In a significant number of publications the classical threshold concept plays a central role in modelling and explaining psychophysical experimental results, such as stochastic resonance (SR) phenomena. In SR, noise enhances the detection of sub-threshold stimuli, and the phenomenon is explained by stating that the amplitude required to exceed the sensory threshold barrier can be reached by adding noise to a sub-threshold stimulus. We designed an experiment to test the validity of the classical vibrotactile threshold. Using a second-choice experiment, we show that individuals can order sensory events below the level known as the classical threshold. If the observer's sensory system were not activated by stimuli below the threshold, a second choice could not be above chance level; nevertheless, our experimental results are above that chance level, contradicting the definition of the classical tactile threshold.
    Results: We performed a three-alternative forced choice detection experiment on 6 subjects, asking them for first and second choices. In each trial, only one of the intervals contained a stimulus and the others contained only noise. Under the classical threshold assumptions, a correct second-choice response corresponds to a guess with a statistical frequency of 50%. Results show an average of 67.35% (SD = 1.41%) for the second-choice response, which is not explained by the classical threshold definition. Additionally, for low stimulus amplitudes, correct second-choice detection is above chance level for any detectability level.
    Conclusions: If detection exists below the classical threshold level, then models that explain the SR phenomenon, or any other tactile perception phenomenon, in terms of the psychophysical classical threshold are not valid. We conclude that a more suitable model of the tactile sensory system is needed.
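
    To make the chance-level argument concrete, here is a minimal Python sketch of the test the abstract implies, with hypothetical trial counts: under the classical threshold model, a second choice in a three-alternative task is a pure guess between the two remaining intervals, so correct second choices should occur at a rate of 50%.

```python
# Sketch: is second-choice accuracy above the 50% guessing level that the
# classical threshold model predicts for a 3AFC task? Counts are hypothetical;
# the study reports an average second-choice accuracy of 67.35%.
from scipy.stats import binomtest

n_trials = 200    # second-choice trials (first choice was wrong); hypothetical
n_correct = 135   # correct second choices; hypothetical

# Under the classical threshold model, p = 0.5 (guess between two intervals).
result = binomtest(n_correct, n_trials, p=0.5, alternative="greater")
print(f"observed: {n_correct / n_trials:.1%}, one-sided p-value: {result.pvalue:.2g}")
```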

    New insights into ion regulation of cephalopod molluscs: a role of epidermal ionocytes in acid-base regulation during embryogenesis

    The constraints of an active life in a pelagic habitat led to numerous convergent morphological and physiological adaptations that enable cephalopod molluscs and teleost fishes to compete for similar resources. Here we show for the first time that such convergent developments are also found in the ontogenetic progression of ion regulatory tissues: as in teleost fish, epidermal ionocytes scattered on the skin and yolk sac of cephalopod embryos appear to be responsible for ionic and acid-base regulation before gill epithelia become functional. Ion and acid-base regulation is crucial in cephalopod embryos, as they are surrounded by a hypercapnic egg fluid with a pCO2 of 0.2-0.4 kPa. Epidermal ionocytes were characterized via immunohistochemistry, in situ hybridization and vital dye staining techniques. We found one group of cells that is recognized by Concanavalin A and MitoTracker and that also expresses Na+/H+ exchangers (NHE) and Na+/K+-ATPase. Similar to findings obtained in teleosts, these NHE3-rich cells take up sodium in exchange for protons, illustrating the energetic superiority of NHE-based proton excretion in marine systems. In vivo electrophysiological techniques demonstrated that acid equivalents are secreted by the yolk and skin integument. Intriguingly, epidermal ionocytes of cephalopod embryos are ciliated, as demonstrated by scanning electron microscopy, suggesting a dual function of epithelial cells in water convection and ion regulation. These findings add significantly to our mechanistic understanding of hypercapnia tolerance in marine organisms, as they demonstrate that marine taxa identified as powerful acid-base regulators during hypercapnic challenges already exhibit strong acid-base regulatory abilities during embryogenesis.

    Major depression, fibromyalgia and labour force participation: A population-based cross-sectional study

    BACKGROUND: Previous studies have documented an elevated frequency of depressive symptoms and disorders in fibromyalgia, but have not examined the association between this comorbidity and occupational status. The purpose of this study was to describe these epidemiological associations using a national probability sample. METHODS: Data from cycle 1.1 of the Canadian Community Health Survey (CCHS), a large-scale national general health survey, were used. The prevalence of major depression in subjects reporting that they had been diagnosed with fibromyalgia by a health professional was estimated and then stratified by demographic variables. Logistic regression models predicting labour force participation were also examined. RESULTS: The annual prevalence of major depression was three times higher in subjects with fibromyalgia (22.2%, 95% CI 19.4–24.9) than in those without this condition (7.2%, 95% CI 7.0–7.4). The association persisted despite stratification for demographic variables. Logistic regression models indicated that each condition had an independent negative effect on labour force participation. CONCLUSION: Fibromyalgia and major depression commonly co-occur and may be related to each other at a pathophysiological level; however, each syndrome is independently and negatively associated with labour force participation. A strength of this study is that it was conducted in a large probability sample of the general population. The main limitations are its cross-sectional design and its reliance on self-reported diagnoses of fibromyalgia.
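
    As a rough illustration of the modelling approach described above, the following Python sketch fits a logistic regression of labour force participation on fibromyalgia and major depression using simulated data; the prevalences and effect sizes are assumptions chosen to mimic the abstract's figures, not the CCHS microdata.

```python
# Sketch: logistic regression with fibromyalgia and major depression as
# independent predictors of labour force participation (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
fibro = rng.binomial(1, 0.03, n)                         # assumed prevalence
dep = rng.binomial(1, np.where(fibro == 1, 0.22, 0.07))  # 22% vs 7%, as reported
age = rng.integers(18, 65, n)

# Assumed independent negative effects of each condition on participation.
logit_p = 1.5 - 1.0 * fibro - 0.8 * dep - 0.01 * (age - 40)
participate = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

df = pd.DataFrame({"participate": participate, "fibro": fibro, "dep": dep, "age": age})
model = smf.logit("participate ~ fibro + dep + age", data=df).fit(disp=False)
print(model.params)   # both condition coefficients should come out negative
```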

    Methods for comprehensive chromosome screening of oocytes and embryos: capabilities, limitations, and evidence of validity

    Preimplantation aneuploidy screening of cleavage stage embryos using fluorescence in situ hybridization (FISH) may no longer be considered the standard of care in reproductive medicine. Over the last few years, there has been considerable development of novel technologies for comprehensive chromosome screening (CCS) of the human genome. Notable methodologies include whole genome amplification, metaphase- and array-based comparative genomic hybridization, single nucleotide polymorphism microarrays, and quantitative real-time PCR. As these methods become more integral to treating patients with infertility, it is critical that clinicians and scientists obtain a better understanding of their capabilities and limitations. This article reviews these technologies and the evidence of their validity.

    Collaborative denoising autoencoder for high glycated haemoglobin prediction.

    A pioneering study is presented demonstrating that the presence of high glycated haemoglobin (HbA1c) levels in a patient's blood can be reliably predicted from routinely collected clinical data. This paves the way for early detection of Type-2 Diabetes Mellitus (T2DM) and could save healthcare providers the major cost associated with administering and assessing clinical tests for HbA1c. A novel collaborative denoising autoencoder framework is used to address this challenge. The framework builds an independent denoising autoencoder model for each of the high and low HbA1c levels, extracting feature representations in the latent space. A baseline model using just three features (patient age, triglyceride level and glucose level) achieves a 76% F1-score with an SVM classifier. The collaborative denoising autoencoder uses 78 features and predicts HbA1c level with an 81% F1-score.
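
    The abstract does not give the architecture, but a denoising autoencoder of the kind it describes can be sketched as follows in Python/PyTorch; the layer sizes, noise level and latent dimension are assumptions, and the "collaborative" framework would train one such model per HbA1c class and classify from the latent representations.

```python
# Sketch of a single denoising autoencoder over 78 clinical features.
import torch
import torch.nn as nn

class DenoisingAutoencoder(nn.Module):
    def __init__(self, n_features=78, n_latent=16, noise_std=0.1):
        super().__init__()
        self.noise_std = noise_std
        self.encoder = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU(),
                                     nn.Linear(32, n_latent))
        self.decoder = nn.Sequential(nn.Linear(n_latent, 32), nn.ReLU(),
                                     nn.Linear(32, n_features))

    def forward(self, x):
        noisy = x + self.noise_std * torch.randn_like(x)  # corrupt the input
        z = self.encoder(noisy)                           # latent features
        return self.decoder(z), z                         # reconstruction + code

model = DenoisingAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(64, 78)                                   # stand-in batch
recon, z = model(x)
loss = nn.functional.mse_loss(recon, x)                   # reconstruct the clean input
loss.backward(); opt.step()
```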

    Concurrent Exposure of Bottlenose Dolphins (Tursiops truncatus) to Multiple Algal Toxins in Sarasota Bay, Florida, USA

    Sentinel species such as bottlenose dolphins (Tursiops truncatus) can be impacted by large-scale mortality events due to exposure to marine algal toxins. In the Sarasota Bay region (Gulf of Mexico, Florida, USA), the bottlenose dolphin population is frequently exposed to harmful algal blooms (HABs) of Karenia brevis and the neurotoxic brevetoxins (PbTx; BTX) produced by this dinoflagellate. Live dolphins sampled during capture-release health assessments performed in this region tested positive for two HAB toxins: brevetoxin and domoic acid (DA). Over a ten-year study period (2000–2009), we determined that bottlenose dolphins are exposed to brevetoxin and/or DA on a nearly annual basis (DA: 2004, 2005, 2006, 2008, 2009; brevetoxin: 2000, 2004, 2005, 2008, 2009), with 36% of all animals testing positive for brevetoxin (n = 118), 53% positive for DA (n = 83), and several individuals (14%) testing positive for both neurotoxins in at least one tissue/fluid. There have been no previously published reports of DA in southwestern Florida marine mammals; however, the May 2008 health assessment coincided with a Pseudo-nitzschia pseudodelicatissima bloom that was the likely source of the DA observed in seawater and live dolphin samples. Concurrently, both DA and brevetoxin were observed in common prey fish. Although no Pseudo-nitzschia bloom was identified the following year, DA was identified in seawater, fish, sediment, snails, and dolphins. DA concentrations in feces were positively correlated with hematologic parameters, including increased total white blood cell (p = 0.001) and eosinophil (p < 0.001) counts. Our findings demonstrate that dolphins within Sarasota Bay are commonly exposed to two algal toxins, and provide the impetus to further explore the potential long-term impacts on bottlenose dolphin health.
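
    The toxin-hematology correlation reported at the end of the abstract can be illustrated with a short Python sketch; a rank correlation is one reasonable choice of test (the abstract does not name the method used), and the values below are hypothetical placeholders rather than the study's measurements.

```python
# Sketch: rank correlation between faecal DA concentration and total
# white blood cell count (hypothetical values).
from scipy.stats import spearmanr

da_feces = [0.5, 1.2, 3.4, 0.1, 2.8, 4.0]   # DA, ng/g (hypothetical)
wbc = [6.1, 7.0, 9.8, 5.5, 8.9, 10.4]       # WBC, 10^3 cells/uL (hypothetical)

rho, p = spearmanr(da_feces, wbc)
print(f"rho = {rho:.2f}, p = {p:.3f}")      # positive rho suggests an association
```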

    Computed tomographic pulmonary angiography and pulmonary embolism: predictive value of a d-dimer assay

    Background: Computed tomographic pulmonary angiography (CTPA) is increasingly used as the first investigation for suspected pulmonary embolism (PE). The investigation has high predictive value, but it is resource and time intensive and exposes patients to considerable radiation. Our aim was to assess the potential value of a negative d-dimer assay to exclude pulmonary emboli and reduce the number of CTPAs performed.
    Methods: All CTPAs performed in a Scottish secondary care hospital over a fourteen-month period were retrospectively reviewed. Collected data included the presence or absence of PE, d-dimer results and patient demographics. PE-positive CTPAs were reviewed by a specialist panel.
    Results: Pulmonary emboli were reported for 66/405 (16.3%) CTPAs, and d-dimer tests were performed for 216 (53%). 186/216 (86%) patients had a positive and 30 (14%) a negative d-dimer result. The panel agreed that 5/66 (7.6%) examinations were false positives. The d-dimer assay's negative predictive value was 93.3% (95% CI 76.5%-98.8%) based on the original number of positive CTPAs, and 100% (95% CI 85.9%-100%) based on expert review. Significant non-PE intrapulmonary pathology was reported for 312/405 (77.0%) CTPAs, including 13 new diagnoses of carcinoma.
    Conclusions: We found that a low d-dimer score excluded all pulmonary emboli once a specialist panel review had identified the initial false positive reports. However, current evidence-based guidelines still recommend that clinicians combine a d-dimer result with a validated clinical risk score when selecting suitable patients for CTPA. This may result in better use of limited resources, prevent patients from being exposed to unnecessary irradiation and prevent potential complications from iodinated contrast.
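
    The negative predictive value and a confidence interval can be reproduced from the counts in the abstract; the Python sketch below uses an exact (Clopper-Pearson) interval, which is an assumption, since the abstract does not state which interval method was used.

```python
# Sketch: NPV of the d-dimer assay with an exact 95% CI, using counts from
# the abstract (30 negative d-dimers; 93.3% implies 28 true negatives).
from statsmodels.stats.proportion import proportion_confint

n_negative_ddimer = 30
n_true_negative = 28

npv = n_true_negative / n_negative_ddimer
low, high = proportion_confint(n_true_negative, n_negative_ddimer,
                               alpha=0.05, method="beta")  # Clopper-Pearson
print(f"NPV = {npv:.1%} (95% CI {low:.1%}-{high:.1%})")
```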

    A mechanistic model of infection: why duration and intensity of contacts should be included in models of disease spread

    Background: Mathematical models and simulations of disease spread often assume a constant per-contact transmission probability. This assumption ignores heterogeneity in transmission probabilities, e.g. due to the varying intensity and duration of potentially contagious contacts, and ignoring such heterogeneities might lead to erroneous conclusions from simulation results. In this paper, we show how a mechanistic model of disease transmission differs from the commonly used assumption of a constant per-contact transmission probability.
    Methods: We present an exposure-based, mechanistic model of disease transmission that reflects heterogeneities in contact duration and intensity. Based on empirical contact data, we calculate the expected number of secondary cases induced by an infector (i) for the mechanistic model and (ii) under the classical assumption of a constant per-contact transmission probability. The results of both approaches are compared for different basic reproduction numbers R0.
    Results: The outcomes of the mechanistic model differ significantly from those obtained under the assumption of a constant per-contact transmission probability. In particular, cases with many different contacts have much lower expected numbers of secondary cases under the mechanistic model than under the common assumption, because the proportion of long, intensive contacts in the contact dataset decreases as the total number of contacts increases.
    Conclusion: The importance of highly connected individuals, so-called super-spreaders, for disease spread appears to be overestimated when a constant per-contact transmission probability is assumed. This holds particularly for diseases with low basic reproduction numbers. Simulations of disease spread should weight contacts by duration and intensity.
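
    To illustrate why weighting contacts matters, the following Python sketch contrasts an exposure-based, dose-response transmission probability with a constant per-contact counterpart for a single infector; the contact data and the rate constant lam are hypothetical, not the paper's empirical dataset.

```python
# Sketch: per-contact infection risk under (i) a constant probability and
# (ii) an exposure-based model where risk grows with duration * intensity.
import numpy as np

# (duration in hours, intensity on a 0-1 scale); hypothetical contacts
contacts = np.array([(8.0, 0.9), (0.2, 0.3), (0.1, 0.2), (0.5, 0.4), (0.1, 0.1)])
duration, intensity = contacts[:, 0], contacts[:, 1]

lam = 0.3                                            # assumed transmission rate
p_mech = 1.0 - np.exp(-lam * duration * intensity)   # dose-response per contact
p_const = p_mech.mean()                              # matched constant probability

print("mechanistic:", np.round(p_mech, 3))           # one long contact dominates
print("constant:   ", round(p_const, 3))
# An individual with many brief contacts accrues p_const per contact under the
# classical assumption but near-zero risk per contact under the mechanistic
# model, so the super-spreader role of high contact numbers is overstated.
```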