34 research outputs found

    The Poetic Witness: Owen and Sassoon's Perspective on the Human Condition in WWI

    Get PDF
    This thesis aims to examine the impact of the "War Poets" on the perception of war and the human condition through the lens of literature. These poets, primarily active during the First and Second World Wars, produced works that reflect their direct experiences in battle and the profound emotions associated with armed conflict. Analysis of their poems allows us to understand how these authors shaped the way society has approached and understood war, providing a humanistic perspective in a context often dominated by political and strategic considerations. The study delves into the poetic works of Owen and Sassoon to dissect their unique perspectives on the human condition during WWI. Their verses act as a lens through which the physical and psychological toll of warfare is examined, revealing the dehumanizing effects of combat, the loss of innocence, and the fragmentation of individual and collective identities. By analyzing their poems, including Owen's "Dulce et Decorum Est" and Sassoon's "The General," we gain insight into the moral outrage, disillusionment, and existential crisis that characterized their wartime experiences. These war poets did not merely critique war in isolation; they influenced societal perceptions. Their poems were published and widely read during and after the war, contributing to a growing anti-war sentiment. The horrors depicted in their verses ignited public outrage, fostering a sense of urgency to prevent future conflicts. Public awareness of the suffering endured by soldiers on the frontlines led to greater support for veterans, including improved medical care and psychological support. This societal shift can be seen as a direct response to the stark realities unveiled by the war poets.

    Maximizing entropy of image models for 2-D constrained coding

    Get PDF
    This paper considers estimating and maximizing the entropy of two-dimensional (2-D) fields with application to 2-D constrained coding. We consider Markov random fields (MRF), which have a non-causal description, and the special case of Pickard random fields (PRF). The PRF are 2-D causal finite context models, which define stationary probability distributions on finite rectangles and thus allow for calculation of the entropy. We consider two binary constraints: we revisit the hard square constraint, which forbids neighboring 1s, and provide novel results for the constraint that no 2 × 2 square is uniform, i.e., contains all 0s or all 1s. The maximum values of the entropy for the constraints are estimated, and binary PRF satisfying the constraints are characterized and optimized w.r.t. the entropy. The maximum binary PRF entropy is 0.839 bits/symbol for the no uniform squares constraint. The entropy of the Markov random field defined by the 2-D constraint is estimated to be (upper bounded by) 0.8570 bits/symbol using the iterative technique of Belief Propagation on 2 × 2 finite lattices. Based on combinatorial bounding techniques, the maximum entropy for the constraint was determined to be 0.848 bits/symbol.
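For intuition, the entropy (capacity) of such a 2-D constraint can be approximated by one-dimensional transfer-matrix computations on strips of fixed width. The following is a minimal sketch of that standard technique for the no-uniform-2 × 2-square constraint; it is not the Pickard-field or belief-propagation machinery used in the paper:

```python
import math
from itertools import product

def strip_entropy(n, iters=500):
    """Entropy per symbol of the 'no uniform 2 x 2 square' constraint on an
    infinite strip of width n, from the transfer-matrix growth rate."""
    rows = list(product((0, 1), repeat=n))

    def ok(r1, r2):
        # A vertical pair of rows is admissible if no 2 x 2 block
        # within it has four equal cells (all 0s or all 1s).
        return all(not (r1[i] == r1[i + 1] == r2[i] == r2[i + 1])
                   for i in range(n - 1))

    T = [[1 if ok(a, b) else 0 for b in rows] for a in rows]
    # Power iteration for the dominant eigenvalue lambda_max;
    # the strip entropy is log2(lambda_max) / n bits/symbol.
    v = [1.0] * len(rows)
    lam = 1.0
    for _ in range(iters):
        w = [sum(T[i][j] * v[j] for j in range(len(rows)))
             for i in range(len(rows))]
        lam = max(w)
        v = [x / lam for x in w]
    return math.log2(lam) / n
```

Widening the strip tightens the estimate towards the 2-D capacity (the 0.848 bits/symbol figure above); for example, `strip_entropy(2)` evaluates to log2((3+√17)/2)/2 ≈ 0.916.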

    Anharmonic Effects on the Squeezing of Axion Perturbations

    Full text link
    It is assumed in standard cosmology that the Universe underwent a period of inflation in its earliest phase, providing the seeds for structure formation through vacuum fluctuations of the inflaton scalar field. These fluctuations are stretched by the quasi-exponential expansion of the Universe and become squeezed. From an observational point of view, if we consider Gaussian states, the expectation value of physical quantities in a squeezed state is indistinguishable from a classical average over a stochastic distribution. This renders cosmological perturbations arising from quantum fluctuations of free fields effectively identical to those with a classical origin. Cosmological squeezing has been studied extensively in the literature; however, most works have focused on nearly free fields. The aim of this paper is to deepen the understanding of the quantum-to-classical transition by considering the effect of self-interactions. For this purpose, we study axion-like fields. In particular, we follow the evolution of the axion's fluctuation modes from horizon exit during inflation to the radiation-dominated epoch. We compute Bogoliubov coefficients and squeezing parameters, which are linked to the axion particle number and isocurvature perturbation. We find that quantum mechanical particle production and the squeezing of the perturbations are enhanced if one accounts for anharmonic effects, i.e., the effect of higher-order terms in the potential. This effect becomes particularly strong towards the hilltop of the potential. Comment: 21 pages + appendices, 10 figures.
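For orientation, the standard free-field relations linking the Bogoliubov coefficients to the produced particle number and the squeezing amplitude can be written as follows (notation illustrative; the anharmonic corrections studied in the paper enter through the modified evolution of the mode functions, hence through α_k and β_k themselves):

```latex
% In-modes and out-modes are related by a Bogoliubov transformation
% that preserves the commutation relations, |\alpha_k|^2 - |\beta_k|^2 = 1:
a_k^{\mathrm{out}} = \alpha_k\, a_k^{\mathrm{in}}
                   + \beta_k^{*}\, a_{-k}^{\mathrm{in}\,\dagger},
\qquad
n_k = |\beta_k|^2 = \sinh^2 r_k ,
% so the particle number per mode n_k fixes the squeezing parameter r_k.
```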

    Improvement of ALT decay kinetics by all-oral HCV treatment: Role of NS5A inhibitors and differences with IFN-based regimens

    Get PDF
    Background: Intracellular HCV-RNA reduction is a proposed mechanism of action of direct-acting antivirals (DAAs), alternative to hepatocyte elimination by pegylated-interferon plus ribavirin (PR). We modeled ALT and HCV-RNA kinetics in cirrhotic patients treated with currently used all-DAA combinations to evaluate their mode of action and cytotoxicity compared with telaprevir (TVR)+PR. Study design: Mathematical modeling of ALT and HCV-RNA kinetics was performed in 111 HCV-1 cirrhotic patients, 81 treated with all-DAA regimens and 30 with TVR+PR. Kinetic models and Cox analysis were used to assess determinants of ALT decay and normalization. Results: HCV-RNA kinetics was biphasic, reflecting a mean effectiveness in blocking viral production >99.8%. The first phase of viral decline was faster in patients receiving NS5A inhibitors compared to TVR+PR or sofosbuvir+simeprevir (p<0.001), reflecting higher efficacy in blocking assembly/secretion. The second phase, denoted δ and attributed to infected-cell loss, was faster in patients receiving TVR+PR or sofosbuvir+simeprevir compared to NS5A inhibitors (0.27 vs 0.21 d⁻¹, respectively, p = 0.0012). In contrast, the rate of ALT normalization, denoted λ, was slower in patients receiving TVR+PR or sofosbuvir+simeprevir compared to NS5A inhibitors (0.17 vs 0.27 d⁻¹, respectively, p<0.001). There was no significant association between the second phase of viral decline and the ALT normalization rate, and, for a given level of viral reduction, ALT normalization was more profound in patients receiving DAAs, and NS5A inhibitors in particular, than TVR+PR. Conclusions: Our data support a process of HCV clearance by all-DAA regimens potentiated by NS5A inhibitors, relying less upon hepatocyte death than IFN-containing regimens. This may underlie a process of "cell cure" by DAAs, leading to a fast improvement of liver homeostasis.
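As a schematic illustration of the fitted quantities, the biphasic viral decline and the exponential ALT normalization can be sketched as two-exponential forms (a minimal sketch, not the study's actual model; parameter values below are placeholders, not the fitted estimates):

```python
import math

def hcv_rna(t, v0, eps, c, delta):
    """Phenomenological biphasic HCV-RNA decline: a fast first phase set by
    free-virion clearance c, then a slower second phase whose log-slope is
    the infected-cell loss rate delta; eps is the effectiveness in blocking
    viral production (>0.998 in the study)."""
    return v0 * (eps * math.exp(-c * t) + (1.0 - eps) * math.exp(-delta * t))

def alt_level(t, alt0, alt_norm, lam):
    """Exponential ALT normalization at rate lam (the lambda above),
    relaxing from baseline alt0 towards the normal value alt_norm."""
    return alt_norm + (alt0 - alt_norm) * math.exp(-lam * t)
```

In this form the late-time log-slope of `hcv_rna` equals delta and ALT relaxes to its normal value at rate lam, mirroring the δ and λ compared between regimens above.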

    TRAM (Transcriptome Mapper): database-driven creation and analysis of transcriptome maps from multiple sources

    Get PDF
    Background: Several tools have been developed to perform global gene expression profile data analysis, to search for specific chromosomal regions whose features meet defined criteria, and to study neighbouring gene expression. However, most of these tools are tailored for a specific use in a particular context (e.g. they are species-specific, or limited to a particular data format) and they typically accept only gene lists as input. Results: TRAM (Transcriptome Mapper) is a new general tool that allows the simple generation and analysis of quantitative transcriptome maps, starting from any source listing gene expression values for a given gene set (e.g. expression microarrays), implemented as a relational database. It includes a parser able to assign univocal and updated gene symbols to gene identifiers from different data sources. Moreover, TRAM is able to perform intra-sample and inter-sample data normalization, including an original variant of quantile normalization (scaled quantile), useful to normalize data from platforms with highly different numbers of investigated genes. When in 'Map' mode, the software generates a quantitative representation of the transcriptome of a sample (or of a pool of samples) and identifies whether segments of defined lengths are over/under-expressed compared to the desired threshold. When in 'Cluster' mode, the software searches for a set of over/under-expressed consecutive genes. Statistical significance for all results is calculated with respect to genes localized on the same chromosome or to all genome genes. Transcriptome maps showing differential expression between two sample groups, relative to two different biological conditions, may be easily generated. We present the results of a biological model test, based on a meta-analysis comparison between a sample pool of human CD34+ hematopoietic progenitor cells and a sample pool of megakaryocytic cells. Biologically relevant chromosomal segments and gene clusters with differential expression during the differentiation toward megakaryocytes were identified. Conclusions: TRAM is designed to create, and statistically analyze, quantitative transcriptome maps, based on gene expression data from multiple sources. The release includes a FileMaker Pro database management runtime application and is freely available at http://apollo11.isto.unibo.it/software/, along with preconfigured implementations for mapping of human, mouse and zebrafish transcriptomes.
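The classic quantile-normalization step that the scaled-quantile variant builds on can be sketched as follows (plain Python; the scaled variant additionally rescales the reference distribution so platforms with different gene counts can be compared, an extension not shown here):

```python
def quantile_normalize(samples):
    """Classic quantile normalization for equal-length expression vectors:
    the k-th smallest value in each sample is replaced by the mean of the
    k-th smallest values across all samples (ties not handled specially)."""
    sorted_cols = [sorted(s) for s in samples]
    # Reference distribution: mean of each rank across samples.
    ref = [sum(col[k] for col in sorted_cols) / len(samples)
           for k in range(len(samples[0]))]
    out = []
    for s in samples:
        order = sorted(range(len(s)), key=lambda i: s[i])  # indices by rank
        normalized = [0.0] * len(s)
        for rank, i in enumerate(order):
            normalized[i] = ref[rank]
        out.append(normalized)
    return out
```

After this step every sample shares the same value distribution, so inter-sample comparisons reflect rank changes only.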

    From Data to Software to Science with the Rubin Observatory LSST

    Full text link
    The Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST) dataset will dramatically alter our understanding of the Universe, from the origins of the Solar System to the nature of dark matter and dark energy. Much of this research will depend on the existence of robust, tested, and scalable algorithms, software, and services. Identifying and developing such tools ahead of time has the potential to significantly accelerate the delivery of early science from LSST. Developing these collaboratively, and making them broadly available, can enable more inclusive and equitable collaboration on LSST science. To facilitate such opportunities, a community workshop entitled "From Data to Software to Science with the Rubin Observatory LSST" was organized by the LSST Interdisciplinary Network for Collaboration and Computing (LINCC) and partners, and held at the Flatiron Institute in New York, March 28-30th 2022. The workshop included over 50 in-person attendees invited from over 300 applications. It identified seven key software areas of need: (i) scalable cross-matching and distributed joining of catalogs, (ii) robust photometric redshift determination, (iii) software for determination of selection functions, (iv) frameworks for scalable time-series analyses, (v) services for image access and reprocessing at scale, (vi) object image access (cutouts) and analysis at scale, and (vii) scalable job execution systems. This white paper summarizes the discussions of this workshop. It considers the motivating science use cases, identified cross-cutting algorithms, software, and services, their high-level technical specifications, and the principles of inclusive collaborations needed to develop them. 
    We provide it as a useful roadmap of needs, as well as to spur action and collaboration between groups and individuals looking to develop reusable software for early LSST science. Comment: White paper from the "From Data to Software to Science with the Rubin Observatory LSST" workshop.
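To make area (i) concrete, a toy version of catalog cross-matching might look like the sketch below (hypothetical names; brute force only). Scalable implementations of the kind the workshop calls for would replace the pairwise scan with spatial indexing, e.g. k-d trees or HEALPix/HTM sky partitioning, and distributed joins:

```python
import math

def ang_sep(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (haversine form; inputs in degrees)."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    h = (math.sin((d2 - d1) / 2.0) ** 2
         + math.cos(d1) * math.cos(d2) * math.sin((r2 - r1) / 2.0) ** 2)
    return math.degrees(2.0 * math.asin(math.sqrt(h)))

def crossmatch(cat1, cat2, radius_deg):
    """Match each (ra, dec) source in cat1 to its nearest neighbour in cat2,
    keeping pairs within radius_deg. O(N*M) brute force: fine for toys,
    hopeless at survey scale, hence the need for scalable cross-matching."""
    matches = []
    for i, (ra1, dec1) in enumerate(cat1):
        best = min(range(len(cat2)),
                   key=lambda j: ang_sep(ra1, dec1, *cat2[j]))
        if ang_sep(ra1, dec1, *cat2[best]) <= radius_deg:
            matches.append((i, best))
    return matches
```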

    Overexpression of the Cytokine BAFF and Autoimmunity Risk

    Get PDF
    BACKGROUND: Genomewide association studies of autoimmune diseases have mapped hundreds of susceptibility regions in the genome. However, only for a few association signals has the causal gene been identified, and for even fewer have the causal variant and underlying mechanism been defined. Coincident associations of DNA variants affecting both the risk of autoimmune disease and quantitative immune variables provide an informative route to explore disease mechanisms and drug-targetable pathways. METHODS: Using case-control samples from Sardinia, Italy, we performed a genomewide association study in multiple sclerosis followed by TNFSF13B locus-specific association testing in systemic lupus erythematosus (SLE). Extensive phenotyping of quantitative immune variables, sequence-based fine mapping, cross-population and cross-phenotype analyses, and gene-expression studies were used to identify the causal variant and elucidate its mechanism of action. Signatures of positive selection were also investigated. RESULTS: A variant in TNFSF13B, encoding the cytokine and drug target B-cell activating factor (BAFF), was associated with multiple sclerosis as well as SLE. The disease-risk allele was also associated with up-regulated humoral immunity through increased levels of soluble BAFF, B lymphocytes, and immunoglobulins. The causal variant was identified: an insertion-deletion variant, GCTGT→A (in which A is the risk allele), yielded a shorter transcript that escaped microRNA inhibition and increased production of soluble BAFF, which in turn up-regulated humoral immunity. Population genetic signatures indicated that this autoimmunity variant has been evolutionarily advantageous, most likely by augmenting resistance to malaria. CONCLUSIONS: A TNFSF13B variant was associated with multiple sclerosis and SLE, and its effects were clarified at the population, cellular, and molecular levels.
    (Funded by the Italian Foundation for Multiple Sclerosis and others.) Supported by grants (2011/R/13 and 2015/R/09, to Dr. Cucca) from the Italian Foundation for Multiple Sclerosis; contracts (N01-AG-1-2109 and HHSN271201100005C, to Dr. Cucca) from the Intramural Research Program of the National Institute on Aging, National Institutes of Health (NIH); a grant (FaReBio2011 “Farmaci e Reti Biotecnologiche di Qualità,” to Dr. Cucca) from the Italian Ministry of Economy and Finance; a grant (633964, to Dr. Cucca) from the Horizon 2020 Research and Innovation Program of the European Union; a grant (U1301.2015/AI.1157.BE Prat. 2015-1651, to Dr. Cucca) from Fondazione di Sardegna; grants (“Centro per la ricerca di nuovi farmaci per malattie rare, trascurate e della povertà” and “Progetto collezione di composti chimici ed attività di screening,” to Dr. Cucca) from Ministero dell’Istruzione, dell’Università e della Ricerca; grants (HG005581, HG005552, HG006513, and HG007022, to Dr. Abecasis) from the National Human Genome Research Institute; a grant (9-2011-253, to Dr. Todd) from JDRF; a grant (091157, to Dr. Todd) from the Wellcome Trust; a grant (to Dr. Todd) from the National Institute for Health Research (NIHR); and the NIHR Cambridge Biomedical Research Centre. Dr. Idda was a recipient of a Master and Back fellowship from the Autonomous Region of Sardinia.

    Used Cooking Oils in the Biogas Chain: A Technical and Economic Assessment

    Get PDF
    The current concerns about global energy security, climate change, and environmental pollution are among the major drivers of the growing interest in renewable energy. In this framework, agro-food energy systems are at the center of a twofold debate: on the one hand, they represent a key option for energy production; on the other, their sustainability is threatened by the expansion of the bioenergy market, which could lead to negative social and environmental consequences. The aim of this work is to evaluate, through a case study, the technical and economic feasibility of replacing energy crops (ECs) with used cooking oil (UCO) in a full-scale anaerobic digestion (AD) plant. For this purpose, a full-scale plant performing AD was monitored for two years. Three scenarios were developed and compared to evaluate the impacts and potential benefits in terms of land saving if ECs are substituted with UCO. Results highlighted a reduction in land use of over 50% when UCO is introduced in co-digestion with ECs. The lack of an appropriate legislative framework limits the utilization of used cooking oils (UCOs) in AD, with a consequent missed opportunity for biogas plant owners, for whom UCO could represent an important alternative feedstock.