
    Decoding type I interferon response dynamics using microfluidics and modeling


    'Ane end of an auld song?': macro and micro perspectives on written Scots in correspondence during the union of parliaments debates

    This thesis examines the relationship between political identity and linguistic variation from a diachronic perspective. Specifically, it explores the use of written Scots features in the personal correspondence of Scottish politicians active during the Union of the Parliaments debates. By 1700, written Scots had steadily retreated from most text-types in the face of ongoing anglicisation, but at the same time the Union debates sparked heated discussion around questions of nationality and Scotland's separate identity. I consider the extent to which the use of Scots features may have been influenced by such discourse, but also how they may have become indexical markers used to lay claim to these ideologies. Drawing on the frameworks of First, Second and Third Wave perspectives on variation, and combining quantitative, macro-social methods with micro-social analysis, the thesis explores broad socio-political factors alongside plausible stylistic intentions in conditioning or influencing the linguistic behaviour of these writers. The first analysis examines variation in the corpus temporally, using the chronologically organised clustering technique Variability-based Neighbor Clustering (VNC; Gries and Hilpert, 2008) to measure Scots features over time. The crucial years of the debates (1700-1707) are compared with correspondence on either side, and the VNC analysis identifies heightened use of Scots falling within the key years of the debates. The following macro-social analysis explores the factors driving this variation quantitatively, using a number of different statistical models to examine the data from various perspectives. Probabilities of Scots are found to correlate with certain political factors, though in complex and multilayered ways that reflect the composite nature of the historical figures operating in the Scottish parliament. The third analysis focuses on the features of written Scots itself and how these pattern in aggregate and across the individual authors who comprise the corpus. Findings suggest the persistence of written Scots was not driven by a single feature or set of tokens; rather, authors varied widely in their range and proportion of different variants. Finally, the micro-analysis examines the intra-writer variation of four individuals representing different political interests, exploring their Scots use across various recipients. Close inspection of features within particular extracts and letters suggests the subtle social and stylistic functions Scots had acquired for these writers. Its occurrence was found to reflect but also constitute the macro-social patterns identified earlier. Taken together, the results indicate that the use of Scots features was both influenced by, and contributed to, the political and ideological loyalties these writers harboured. Moreover, they tentatively suggest a process of reinterpretation was underway, in which Scots features were becoming a resource that could be selectively employed for particular indexical and communicative purposes.
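
    As a rough illustration of the clustering step described above, the sketch below implements the general idea behind Variability-based Neighbor Clustering: agglomerative clustering restricted so that only chronologically adjacent clusters may merge, which keeps cluster boundaries aligned with the timeline. The feature rates and the variability measure (standard deviation) are invented for the example and are not taken from the thesis.

        # Minimal sketch of Variability-based Neighbor Clustering (VNC):
        # agglomerative clustering in which only chronologically adjacent
        # clusters may merge.  The data and the variability measure
        # (standard deviation) are illustrative assumptions.
        import numpy as np

        def vnc(values):
            """Return the merge history for a time-ordered series of scores."""
            clusters = [[v] for v in values]          # one cluster per period
            history = []
            while len(clusters) > 1:
                # cost of merging each pair of *adjacent* clusters
                costs = [np.std(clusters[i] + clusters[i + 1])
                         for i in range(len(clusters) - 1)]
                i = int(np.argmin(costs))             # cheapest neighbour merge
                history.append((i, i + 1, costs[i]))
                clusters[i:i + 2] = [clusters[i] + clusters[i + 1]]
            return history

        # Hypothetical yearly rates of Scots features per 1,000 words
        rates = [4.1, 4.3, 7.9, 8.2, 8.0, 3.9, 3.5]
        for left, right, cost in vnc(rates):
            print(f"merge clusters {left} and {right} (variability {cost:.2f})")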

    An Iterative CT Reconstruction Algorithm for Fast Fluid Flow Imaging

    The study of fluid flow through solid matter by computed tomography (CT) imaging has many applications, ranging from petroleum and aquifer engineering to biomedical, manufacturing, and environmental research. To avoid motion artifacts, current experiments are often limited to slow fluid flow dynamics, which severely limits the applicability of the technique. In this paper, a new iterative CT reconstruction algorithm for improved temporal/spatial resolution in the imaging of fluid flow through solid matter is introduced. The proposed algorithm exploits prior knowledge in two ways. First, the time-varying object is assumed to consist of stationary regions (the solid matter) and dynamic regions (the fluid flow). Second, the attenuation curve of a particular voxel in the dynamic region is modeled by a piecewise constant function over time, which is in accordance with the actual advancing fluid/air boundary. Quantitative and qualitative results on different simulation experiments and a real neutron tomography data set show that, in comparison with state-of-the-art algorithms, the proposed algorithm allows reconstruction from substantially fewer projections per rotation without loss of image quality. The temporal resolution can therefore be substantially increased, and fluid flow experiments with faster dynamics can be performed.
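
    The piecewise constant temporal prior can be illustrated with a minimal sketch: fit a single step function to one voxel's attenuation curve by testing every possible change point and keeping the least-squares best fit. The synthetic curve and the single-step assumption are illustrative only; the paper couples this kind of model with the full iterative reconstruction.

        # Sketch of the piecewise-constant temporal prior for one voxel in the
        # dynamic (fluid) region: fit a single step function to its attenuation
        # curve by exhaustively testing every change point.  Data are synthetic.
        import numpy as np

        def fit_step(curve):
            """Return (change_point, before_value, after_value) minimising SSE."""
            best_fit, best_sse = None, np.inf
            for t in range(1, len(curve)):
                before, after = curve[:t].mean(), curve[t:].mean()
                sse = ((curve[:t] - before) ** 2).sum() + ((curve[t:] - after) ** 2).sum()
                if sse < best_sse:
                    best_fit, best_sse = (t, before, after), sse
            return best_fit

        # Synthetic attenuation curve: air until the fluid front arrives at t = 12
        rng = np.random.default_rng(0)
        curve = np.concatenate([np.full(12, 0.02), np.full(8, 0.35)]) + rng.normal(0, 0.01, 20)
        print(fit_step(curve))   # change point near 12, levels near 0.02 and 0.35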

    Extending defoe for the efficient analysis of historical texts at scale

    Funding: This work was partly funded by the Data-Driven Innovation Programme as part of the Edinburgh and South East Scotland City Region Deal, by the University of Edinburgh, and by the Google Cloud Platform research credits program. This paper presents the new facilities provided in defoe, a parallel toolbox for querying a wealth of digitised newspapers and books at scale. defoe has been extended to work with further Natural Language Processing (NLP) tools such as the Edinburgh Geoparser, to store the preprocessed text in several storage facilities, and to support different types of queries and analyses. We have also extended the collection of XML schemas supported by defoe, increasing the versatility of the tool for the analysis of digital historical textual data at scale. Finally, we have conducted several studies in which we worked with humanities and social science researchers who posed complex and interesting questions to large-scale digital collections. Results show that defoe allows researchers to conduct their studies and obtain results faster, while all the large-scale text mining complexity is handled automatically by defoe.
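
    The abstract does not show defoe's query interface, so the sketch below is only a hypothetical illustration of the kind of parallel keyword query such a toolbox runs over a digitised corpus. The function names and the use of Python multiprocessing are assumptions made for the example, not the actual defoe API.

        # Hypothetical illustration of a parallel keyword query over digitised
        # text; this is NOT the defoe API, and all names here are invented.
        from collections import Counter
        from multiprocessing import Pool
        import re

        def count_terms(args):
            """Count occurrences of each target term in one document's text."""
            text, terms = args
            tokens = re.findall(r"[a-z]+", text.lower())
            counts = Counter(tokens)
            return {term: counts[term] for term in terms}

        def run_query(documents, terms, workers=4):
            """Aggregate per-document term counts across a corpus in parallel."""
            with Pool(workers) as pool:
                partials = pool.map(count_terms, [(doc, terms) for doc in documents])
            total = Counter()
            for partial in partials:
                total.update(partial)
            return dict(total)

        if __name__ == "__main__":
            corpus = ["The harvest failed in Edinburgh...", "Parliament met again..."]
            print(run_query(corpus, ["edinburgh", "parliament"]))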

    Pore REconstruction and Segmentation (PORES) method for improved porosity quantification of nanoporous materials

    Electron tomography is currently a versatile tool for investigating the connection between the structure and properties of nanomaterials. However, a quantitative interpretation of electron tomography results is still far from straightforward. In particular, accurate quantification of pore space is hampered by artifacts introduced in all steps of the processing chain, i.e., acquisition, reconstruction, segmentation and quantification. Furthermore, most common approaches require subjective manual user input. In this paper, the PORES algorithm “POre REconstruction and Segmentation” is introduced; it is a tailor-made, integral approach for the reconstruction, segmentation, and quantification of porous nanomaterials. The PORES processing chain starts by calculating a reconstruction with a nanoporous-specific reconstruction algorithm: the Simultaneous Update of Pore Pixels by iterative REconstruction and Simple Segmentation (SUPPRESS) algorithm. It assigns the interior region to the pores during reconstruction, while reconstructing the remaining region by reducing the error with respect to the acquired electron microscopy data. The SUPPRESS reconstruction can be plugged directly into the remaining processing chain of the PORES algorithm, resulting in accurate individual pore quantification and full sample pore statistics. The proposed approach was extensively validated on both simulated and experimental data, indicating its ability to generate accurate statistics of nanoporous materials.
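
    A toy sketch of the SUPPRESS idea follows: a SIRT-like iteration in which voxels inside a known interior pore mask are pinned to zero attenuation, while the remaining voxels are updated to reduce the error against the measured projections. The two-angle row/column projector and the phantom are simplifications invented for the example and are not the paper's projector or data.

        # Toy sketch of the SUPPRESS idea: pore voxels stay "empty" during a
        # SIRT-like iterative reconstruction; the rest is fit to the data.
        import numpy as np

        def project(img):
            return np.concatenate([img.sum(axis=0), img.sum(axis=1)])  # 2 angles

        def backproject(sino, shape):
            n = shape[0]
            return np.tile(sino[:n], (n, 1)) + np.tile(sino[n:, None], (1, n))

        def suppress_like(sino, pore_mask, n_iter=50, shape=(8, 8)):
            x = np.zeros(shape)
            relax = 1.0 / (2 * shape[0])                 # SIRT-style relaxation
            for _ in range(n_iter):
                x += relax * backproject(sino - project(x), shape)
                x[pore_mask] = 0.0                       # interior classified as pore
                x = np.clip(x, 0, None)                  # attenuation is nonnegative
            return x

        # Toy phantom: solid frame with an empty (pore) interior
        phantom = np.ones((8, 8)); phantom[2:6, 2:6] = 0.0
        mask = np.zeros_like(phantom, dtype=bool); mask[2:6, 2:6] = True
        print(np.round(suppress_like(project(phantom), mask), 2))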

    Region based 4D tomographic image reconstruction: Application to cardiac x-ray CT

    X-ray computed tomography (CT) is a powerful tool for noninvasive cardiac imaging. However, radiation dose is a major issue. In this paper, we propose an iterative reconstruction method that reduces the radiation dose without compromising image quality. This is achieved by exploiting prior knowledge in two ways: the reconstructed object is assumed to consist of both stationary and dynamic regions over time, and the dynamic region is assumed to have sparse structure after a suitable sparsifying space-time transform. Experiments on simulated data and a real micro-CT cardiac mouse dataset show that, with comparable image quality, the radiation dose can be substantially reduced compared to conventional acquisition/reconstruction protocols.
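
    The two priors can be illustrated generically, as in the sketch below: the stationary region is collapsed to its temporal average, and temporal finite differences of the dynamic region are soft-thresholded to enforce sparsity after a space-time transform. The choice of transform (temporal differences) and the threshold are assumptions for the example, not the paper's exact formulation or solver.

        # Generic sketch of the two priors used in region-based 4D reconstruction:
        # temporal averaging of the stationary region and soft-thresholding of
        # temporal finite differences in the dynamic region.
        import numpy as np

        def soft_threshold(z, lam):
            return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

        def apply_priors(frames, dynamic_mask, lam=0.05):
            """frames: (T, H, W) image sequence; dynamic_mask: (H, W) boolean."""
            out = frames.copy()
            # Stationary region: collapse to the temporal mean
            out[:, ~dynamic_mask] = frames[:, ~dynamic_mask].mean(axis=0)
            # Dynamic region: sparsify temporal differences, then re-integrate
            dyn = frames[:, dynamic_mask]                  # (T, n_dynamic_voxels)
            diffs = soft_threshold(np.diff(dyn, axis=0), lam)
            out[1:, dynamic_mask] = dyn[0] + np.cumsum(diffs, axis=0)
            return out

        frames = np.random.default_rng(1).normal(0.5, 0.02, (10, 16, 16))
        mask = np.zeros((16, 16), dtype=bool); mask[6:10, 6:10] = True
        print(apply_priors(frames, mask).shape)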

    A comprehensive platform for quality control of botanical drugs (PhytomicsQC): a case study of Huangqin Tang (HQT) and PHY906

    <p>Abstract</p> <p>Background</p> <p>Establishing botanical extracts as globally-accepted polychemical medicines and a new paradigm for disease treatment, requires the development of high-level quality control metrics. Based on comprehensive chemical and biological fingerprints correlated with pharmacology, we propose a general approach called PhytomicsQC to botanical quality control.</p> <p>Methods</p> <p>Incorporating the state-of-the-art analytical methodologies, PhytomicsQC was employed in this study and included the use of liquid chromatography/mass spectrometry (LC/MS) for chemical characterization and chemical fingerprinting, differential cellular gene expression for bioresponse fingerprinting and animal pharmacology for <it>in vivo </it>validation. A statistical pattern comparison method, Phytomics Similarity Index (PSI), based on intensities and intensity ratios, was used to determine the similarity of the chemical and bioresponse fingerprints among different manufactured batches.</p> <p>Results</p> <p>Eighteen batch samples of Huangqin Tang (HQT) and its pharmaceutical grade version (PHY906) were analyzed using the PhytomicsQC platform analysis. Comparative analysis of the batch samples with a clinically tested standardized batch obtained values of PSI similarity between 0.67 and 0.99.</p> <p>Conclusion</p> <p>With rigorous quality control using analytically sensitive and comprehensive chemical and biological fingerprinting, botanical formulations manufactured under standardized manufacturing protocols can produce highly consistent batches of products.</p