
    Meso-scale FDM material layout design strategies under manufacturability constraints and fracture conditions

    In the manufacturability-driven design (MDD) perspective, manufacturability of the product or system is the most important of the design requirements. In addition to being able to ensure that complex designs (e.g., topology optimization) are manufacturable with a given process or process family, MDD also helps mechanical designers to take advantage of unique process-material effects generated during manufacturing. One of the most recognizable examples of this comes from the scanning-type family of additive manufacturing (AM) processes; the most notable and familiar member of this family is the fused deposition modeling (FDM) or fused filament fabrication (FFF) process. This process works by selectively depositing uniform, approximately isotropic beads or elements of molten thermoplastic material (typically structural engineering plastics) in a series of pre-specified traces to build each layer of the part. There are many interesting 2-D and 3-D mechanical design problems that can be explored by designing the layout of these elements. The resulting structured, hierarchical material (which is both manufacturable and customized layer-by-layer within the limits of the process and material) can be defined as a manufacturing process-driven structured material (MPDSM). This dissertation explores several practical methods for designing these element layouts for 2-D and 3-D meso-scale mechanical problems, focusing ultimately on design-for-fracture. Three different fracture conditions are explored: (1) cases where a crack must be prevented or stopped, (2) cases where the crack must be encouraged or accelerated, and (3) cases where cracks must grow in a simple pre-determined pattern. 
To support the design of MPDSMs under fracture conditions, several new design tools were developed and refined: a mapping method for the FDM manufacturability constraints; three major literature reviews; the collection, organization, and analysis of several large (qualitative and quantitative) multi-scale datasets on the fracture behavior of FDM-processed materials; some new experimental equipment; and a fast and simple g-code generator based on commercially-available software. The refined design method and rules were experimentally validated using a series of case studies (involving both design and physical testing of the designs) at the end of the dissertation. Finally, a simple design guide for practicing engineers who are experts in neither advanced solid mechanics nor process-tailored materials was developed from the results of this project.

    Process intensification of oxidative coupling of methane


    Likelihood Asymptotics in Nonregular Settings: A Review with Emphasis on the Likelihood Ratio

    This paper reviews the most common situations in which one or more of the regularity conditions underlying classical likelihood-based parametric inference fail. We identify three main classes of problems: boundary problems, indeterminate parameter problems -- which include non-identifiable parameters and singular information matrices -- and change-point problems. The review focuses on the large-sample properties of the likelihood ratio statistic. We emphasize analytical solutions and acknowledge software implementations where available. We furthermore give summary insight into the tools available to derive the key results. Other approaches to hypothesis testing and connections to estimation are listed in the annotated bibliography of the Supplementary Material.
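The simplest of the boundary problems illustrates why the classical chi-squared limit fails: when a parameter is constrained to be non-negative and the true value sits on the boundary, the likelihood ratio statistic converges to the 50:50 mixture 0.5·χ²₀ + 0.5·χ²₁ rather than to χ²₁. A minimal simulation sketch of this textbook case (not an example taken from the paper; all names and parameter values are illustrative):

```python
import random

# Test H0: mu = 0 against H1: mu > 0 for N(mu, 1) data with mu restricted
# to [0, inf). The MLE is max(xbar, 0), so the LRT statistic is
# n * max(xbar, 0)**2, and under H0 its large-sample law is the mixture
# 0.5 * chi2_0 + 0.5 * chi2_1: it places probability 0.5 on exactly zero.

def lrt_statistic(n, rng):
    xbar = sum(rng.gauss(0.0, 1.0) for _ in range(n)) / n
    return n * max(xbar, 0.0) ** 2

rng = random.Random(1)
stats = [lrt_statistic(50, rng) for _ in range(10_000)]
prop_at_zero = sum(s == 0.0 for s in stats) / len(stats)  # close to 0.5 under H0
```

Comparing the positive part of `stats` against χ²₁ quantiles (rather than treating the whole sample as χ²₁) recovers correct test sizes in this setting.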

    Properties of a model of sequential random allocation

    Probabilistic models of allocating shots to boxes according to a certain probability distribution have commonly been used for processes involving agglomeration. Such processes are of interest in many areas of research such as ecology, physiology, chemistry and genetics. Time could be incorporated into the shots-and-boxes model by considering multiple layers of boxes through which the shots move, where the layers represent the passing of time. Such a scheme with multiple layers, each with a certain number of occupied boxes, is naturally associated with a random tree. It lends itself to genetic applications where the number of ancestral lineages of a sample changes through the generations. This multiple-layer scheme also allows us to explore the difference in the number of occupied boxes between layers, which gives a measure of how quickly merges are happening. In particular, results are derived for the multiple-layer scheme corresponding to those known for a single-layer scheme, where, under certain conditions, the limiting distribution of the number of occupied boxes is either Poisson or normal. To provide motivation and demonstrate which methods work well, a detailed study of a small, finite example is provided. A common approach for establishing a limiting distribution for a random variable of interest is to first show that it can be written as a sum of independent Bernoulli random variables, as this then allows us to apply standard central limit theorems. Additionally, it allows us to, for example, provide an upper bound on the distance to a Poisson distribution. One way of showing that a random variable can be written as a sum of independent Bernoulli random variables is to show that its probability generating function (p.g.f.) has all real roots. Various methods are presented and considered for proving that the p.g.f. of the number of occupied boxes in any given layer of the scheme has all real roots.
By considering small finite examples, some of these methods could be ruled out for general N. Finally, the scheme for general N boxes and n shots is considered, where again a uniform allocation of shots is used. It is shown that, under certain conditions, the distribution of the number of occupied boxes tends towards either a normal or Poisson limit. Equivalent results are also demonstrated for the distribution of the difference in the number of occupied boxes between consecutive layers.
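For the single-layer uniform scheme described above, the expected number of occupied boxes has the closed form N(1 − (1 − 1/N)ⁿ), which a short simulation reproduces. A minimal sketch (the parameter values are illustrative, not drawn from the thesis):

```python
import random
from collections import Counter

def occupied_boxes(n_shots, n_boxes, rng):
    """Throw n_shots uniformly into n_boxes; return the number of occupied boxes."""
    return len({rng.randrange(n_boxes) for _ in range(n_shots)})

# Empirical distribution of the number of occupied boxes over many trials.
rng = random.Random(42)
N, n, trials = 1000, 50, 10_000
counts = Counter(occupied_boxes(n, N, rng) for _ in range(trials))

# Exact mean under uniform allocation: each box is empty with probability
# (1 - 1/N)**n, so E[occupied] = N * (1 - (1 - 1/N)**n).
exact_mean = N * (1 - (1 - 1 / N) ** n)
empirical_mean = sum(k * c for k, c in counts.items()) / trials
```

In the sparse regime (n much smaller than N, as here) the number of *unoccupied* target collisions is small and the occupied-box count concentrates near n, consistent with the Poisson-limit results for the difference statistics.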

    Modelling uncertainties for measurements of the H → γγ Channel with the ATLAS Detector at the LHC

    The Higgs boson to diphoton (H → γγ) branching ratio is only 0.227 %, but this final state has yielded some of the most precise measurements of the particle. As measurements of the Higgs boson become increasingly precise, greater importance is placed on the factors that constitute the uncertainty. Reducing the effects of these uncertainties requires an understanding of their causes. The research presented in this thesis aims to illuminate how uncertainties on simulation modelling are determined and proposes novel techniques for deriving them. The upgrade of the FastCaloSim tool, used for simulating events in the ATLAS calorimeter at a rate far exceeding that of the nominal detector simulation, Geant4, is described. The integration of a method that allows the toolbox to emulate the accordion geometry of the liquid argon calorimeters is detailed. This tool allows for the production of larger samples while using significantly fewer computing resources. A measurement of the total Higgs boson production cross-section multiplied by the diphoton branching ratio (σ × Bγγ) is presented, where this value was determined to be (σ × Bγγ)obs = 127 ± 7 (stat.) ± 7 (syst.) fb, in agreement with the Standard Model prediction. The signal and background shape modelling is described; the contribution of the background modelling uncertainty to the total uncertainty ranges from 2.4 % to 18 %, depending on the Higgs boson production mechanism. A method for estimating the number of events in a Monte Carlo background sample required to model the shape is detailed. It was found that the nominal γγ background sample needed to be enlarged by a factor of 3.60 to model the background adequately at a confidence level of 68 %, or by a factor of 7.20 at a confidence level of 95 %. Based on this estimate, 0.5 billion additional simulated events were produced, substantially reducing the background modelling uncertainty.
A technique is detailed for emulating the effects of Monte Carlo event generator differences using multivariate reweighting. The technique is used to estimate the event generator uncertainty on the signal modelling of tHqb events, improving the reliability of the estimated tHqb production cross-section. The same multivariate reweighting technique is then used to estimate the generator modelling uncertainties on background V γγ samples for the first time. The estimated uncertainties were found to be covered by the currently assumed background modelling uncertainty.
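A minimal sketch of the idea behind such reweighting, reduced to one observable with per-bin density ratios standing in for the multivariate BDT-based version described above (the generators, distributions and parameter values below are illustrative stand-ins, not those used in the thesis):

```python
import numpy as np

def histogram_weights(source, target, bins=40):
    """Per-event weights that map the source sample onto the target density,
    estimated bin-by-bin in a single observable."""
    h_s, edges = np.histogram(source, bins=bins, density=True)
    h_t, _ = np.histogram(target, bins=edges, density=True)
    # Ratio of densities per bin; leave the weight at 1 where the source is empty.
    ratio = np.divide(h_t, h_s, out=np.ones_like(h_t), where=h_s > 0)
    idx = np.clip(np.digitize(source, edges) - 1, 0, len(ratio) - 1)
    return ratio[idx]

rng = np.random.default_rng(7)
gen_a = rng.normal(0.0, 1.0, 50_000)   # stand-in for events from generator A
gen_b = rng.normal(0.3, 1.1, 50_000)   # stand-in for events from generator B
weights = histogram_weights(gen_a, gen_b)
# Weighted moments of the A sample now approximate those of the B sample,
# so generator differences can be propagated without regenerating events.
```

The BDT-based variant generalises the same density-ratio idea to many observables at once, where per-bin histograms become unusable.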

    Wildlife trade in Latin America: people, economy and conservation

    Wildlife trade is among the main threats to biodiversity conservation and may pose a risk to human health because of the spread of zoonotic diseases. To avoid the social, economic and environmental consequences of illegal trade, it is crucial to understand the factors influencing the wildlife market and the effectiveness of policies already in place. I aim to unveil the biological and socioeconomic factors driving wildlife trade, the health risks imposed by the activity, and the effectiveness of certified captive-breeding as a strategy to curb the illegal market in Latin America through a multidisciplinary approach. I assess socioeconomic correlates of the emerging international trade in wild cat species from Latin America using a dataset of >1,000 seized cats, showing that high levels of corruption, high Chinese private investment and low income per capita were related to higher numbers of jaguar seizures. I assess the effectiveness of primate captive-breeding programmes as an intervention to curb wildlife trafficking. Illegal sources held >70% of the primate market share; legal primates are more expensive, and production is not sufficiently high to fulfil demand. I assess the scale of the illegal trade and ownership of venomous snakes in Brazil. The venomous snake taxa responsible for higher numbers of snakebites were those most often kept as pets. I uncover how online wildlife pet traders and consumers responded to campaigns associating the origin of the COVID-19 pandemic with the wildlife trade. Of 20,000 posts on Facebook groups, only 0.44% mentioned COVID-19, and several stimulated the trade in wild species during lockdown. Despite the existence of international and national wildlife trade regulations, I conclude that illegal wildlife trade is still an issue that needs further addressing in Latin America. I identify knowledge gaps and candidate interventions to amend the current loopholes and reduce wildlife trafficking.
My aspiration with this thesis is to provide useful information that can inform better strategies to tackle illegal wildlife trade in Latin America.

    A Molecular Approach to the Diagnosis, Assessment, Monitoring and Treatment of Pulmonary Non-Tuberculous Mycobacterial Disease

    Introduction: Non-Tuberculous Mycobacteria (NTM) can cause disease of the lungs and sinuses, lymph nodes, joints and central nervous system, as well as disseminated infections in immunocompromised individuals. Efforts to tackle NTM infections are hampered by a lack of reliable biomarkers for diagnosis, assessment of disease activity, and prognostication.
Aims: The broad aims of this thesis are: 1. to develop molecular assays capable of quantifying the 6 most common pathogenic mycobacteria (M. abscessus, M. avium, M. intracellulare, M. malmoense, M. kansasii, M. xenopi) and to calculate comparative sensitivities and specificities for each assay; 2. to assess patients' clinical course over 12–18 months by applying the developed molecular assays to DNA extracted from the sputum of patients with NTM infection; 3. to assess dynamic bacterial changes of the lung microbiome in patients on treatment for NTM disease and those who are treatment naïve.
Methods: DNA was extracted from a total of 410 sputum samples obtained from 38 patients who were either:
• commencing treatment for either M. abscessus or Mycobacterium avium complex;
• considered colonised with M. abscessus or Mycobacterium avium complex (i.e. cultured NTM but were not deemed to have infection as they did not meet ATS or BTS criteria for disease); or
• diagnosed with cystic fibrosis (CF) or non-CF bronchiectasis but had never cultured NTM.
For the development of quantitative molecular assays, NTM hsp65 gene sequences were aligned and interrogated for areas of variability. These variable regions enabled the creation of species-specific probes. In vitro sensitivity and specificity for each probe were determined by testing each probe against a panel of plasmids containing hsp65 gene inserts from different NTM species. Quantification accuracy was determined by using each assay against a mock community containing serial dilutions of target DNA. Each sample was tested with the probes targeting M. abscessus, M. avium and M. intracellulare, producing a longitudinal assessment of NTM copy number during each patient's clinical course. In addition, a total of 64 samples from 16 patients underwent 16S rRNA gene sequencing to characterise longitudinal changes in the microbiome of both NTM disease and controls.
Results: In vitro sensitivity for the custom assays was 100 % and specificity ranged from 91.6 % to 100 %. In terms of quantification accuracy, there was no significant difference between the measured results of each assay and the expected values when performed in singleplex. The assays were able to accurately determine NTM copy number to a theoretical limit of 10 copies/μl. When used against samples derived from human sputum, with culture results as a gold standard, the sensitivity of the assay was 0.87 for M. abscessus and 0.86 for MAC; the specificity was 0.95 for M. abscessus and 0.62 for MAC; and the negative predictive value was 0.98 for M. abscessus and 0.95 for MAC. This resulted in an AUC of 0.92 for M. abscessus and 0.74 for MAC. Longitudinal analysis of the lung microbiome using 16S rRNA gene sequencing showed that bacterial burden initially decreases after initiation of antibiotic therapy but begins to return to normal levels over several months of antibiotic therapy. This effect is mirrored by changes in alpha diversity. The decrease in bacterial burden and loss of alpha diversity were found to be secondary to significant changes in specific genera such as Veillonella and Streptococcus. The abundance of other Proteobacteria such as Pseudomonas remained relatively constant.
Conclusion: The molecular assays showed high in vitro sensitivity and specificity for the detection and accurate quantification of the 6 most commonly pathogenic NTM species. The assays successfully identified NTM DNA in human sputum samples. A notable association existed between NTM copy number and the cessation of one or more antibiotics (i.e. when one antibiotic was stopped because of patient intolerance, NTM copy number increased, often having been unrecordable prior to this). The qPCR assays developed in this thesis provide an affordable, real-time and rapid measurement of NTM burden, allowing clinicians to act on problematic results sooner than is currently possible. There was no significant difference between the microbiome in bronchiectasis and cystic fibrosis, nor was there a significant difference between the microbiome of patients requiring treatment for NTM and those who did not. Patients receiving treatment experienced an initial decrease in bacterial burden over the first weeks of treatment, followed by a gradual return towards baseline over the following weeks to months. This change was mirrored in measures of alpha diversity. Changes in abundance and diversity were accounted for by decreases in specific bacteria, whilst the abundance of other bacteria increased, occupying the microbial niche created. These bacteria (for example Pseudomonas spp.) are often associated with morbidity.
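The accuracy measures quoted above follow from the standard confusion-matrix definitions against the culture gold standard; a minimal sketch (the counts below are hypothetical, chosen only to illustrate the formulas, since raw counts are not reported here):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and NPV against a gold standard (here, culture)."""
    sensitivity = tp / (tp + fn)   # fraction of culture-positive samples detected
    specificity = tn / (tn + fp)   # fraction of culture-negative samples cleared
    npv = tn / (tn + fn)           # probability a negative assay result is correct
    return sensitivity, specificity, npv

# Hypothetical counts, for illustration only.
sens, spec, npv = diagnostic_metrics(tp=26, fp=18, tn=340, fn=4)
```

Note that NPV, unlike sensitivity and specificity, depends on disease prevalence in the tested cohort, which is why a high NPV can coexist with the lower MAC specificity reported above.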

    Search for third generation vector-like leptons with the ATLAS detector

    The Standard Model of particle physics provides a concise description of the building blocks of our universe in terms of fundamental particles and their interactions. It is an extremely successful theory, providing a plethora of predictions that precisely match experimental observation. In 2012, the Higgs boson was observed at CERN; it was the last particle predicted by the Standard Model that had yet to be discovered. While this added further credibility to the theory, the Standard Model appears incomplete. Notably, it accounts for only 5% of the energy density of the universe (the rest being ``dark matter'' and ``dark energy''), it cannot reconcile the gravitational force with quantum theory, it does not explain the origin of neutrino masses, and it cannot account for the matter/anti-matter asymmetry. The most plausible explanation is that the theory is an approximation and new physics remains to be found. Vector-like leptons are well motivated by a number of theories that seek to provide closure on the Standard Model. They are a simple addition to the Standard Model and can help to resolve a number of discrepancies without disturbing precisely measured observables. This thesis presents a search for vector-like leptons that preferentially couple to tau leptons. The search was performed using proton-proton collision data from the Large Hadron Collider collected by the ATLAS experiment from 2015 to 2018 at a center-of-mass energy of 13 TeV, corresponding to an integrated luminosity of 139 inverse femtobarns. Final states of various lepton multiplicities were considered to isolate the vector-like lepton signal from Standard Model and instrumental backgrounds. The major backgrounds mimicking the signal are WZ, ZZ and tt+Z production, and mis-identified leptons. A number of boosted decision trees were used to improve rejection power against background, and the signal was measured using a binned-likelihood estimator. No excess relative to the Standard Model was observed.
Exclusion limits were placed on vector-like leptons in the mass range of 130 to 898 GeV.

    Optimizing transcriptomics to study the evolutionary effect of FOXP2

    The field of genomics was established with the sequencing of the human genome, a pivotal achievement that has allowed us to address various questions in biology from a unique perspective. One question in particular, that of the evolution of human speech, has gripped philosophers, evolutionary biologists, and now genomicists. However, little is known of the genetic basis that allowed humans to evolve the ability to speak. Of the few genes implicated in human speech, one of the most studied is FOXP2, which encodes the transcription factor Forkhead box protein P2 (FOXP2). FOXP2 is essential for proper speech development, and two mutations in the human lineage are believed to have contributed to the evolution of human speech. To address the effect of FOXP2 and investigate its evolutionary contribution to human speech, one can utilize the power of genomics, more specifically gene expression analysis via ribonucleic acid sequencing (RNA-seq). To this end, I first contributed to developing mcSCRB-seq, a highly sensitive, powerful, and efficient single cell RNA-seq (scRNA-seq) protocol. Having previously emerged as a central method for studying cellular heterogeneity and identifying cellular processes, scRNA-seq was a powerful genomic tool but lacked the sensitivity and cost-efficiency of more established protocols. By systematically evaluating each step of the process, I helped find that the addition of polyethylene glycol increased sensitivity by enhancing the cDNA synthesis reaction. This, along with other optimizations, resulted in a sensitive and flexible protocol that is cost-efficient and ideal in many research settings. A primary motivation driving the extensive optimizations surrounding single cell transcriptomics has been the generation of cellular atlases, which aim to identify and characterize all of the cells in an organism.
As such efforts are carried out by a variety of research groups using a number of different RNA-seq protocols, I contributed to an effort to benchmark and standardize scRNA-seq methods. This not only identified methods that may be ideal for the purpose of cell atlas creation, but also highlighted optimizations that could be integrated into existing protocols. Using mcSCRB-seq as a foundation, as well as the findings from the scRNA-seq benchmarking, I helped develop prime-seq, a sensitive, robust and, most importantly, affordable bulk RNA-seq protocol. Bulk RNA-seq was frequently overlooked during the efforts to optimize and establish single-cell techniques, even though the method is still extensively used in analyzing gene expression. Introducing early barcoding and reducing library generation costs kept prime-seq cost-efficient, while basing it on single-cell methods ensured that it would be a sensitive and powerful technique. I helped verify this by benchmarking it against TruSeq-generated data and then helped test its robustness by generating prime-seq libraries from over seventeen species. These optimizations resulted in a final protocol that is well suited for investigating gene expression in comprehensive and high-throughput studies. Finally, I utilized prime-seq to develop a comprehensive gene expression atlas to study the function of FOXP2 and its role in speech evolution. I used previously generated mouse models: a knockout model containing one non-functional Foxp2 allele and a humanized model, which has a variant Foxp2 allele with two human-specific mutations. To study the effect globally across the mouse, I helped harvest eighteen tissues that had previously been identified to express FOXP2. By then comparing the mouse models to wild-type mice, I helped highlight the importance of FOXP2 within lung development and the importance of the human variant allele in the brain.
Both mcSCRB-seq and prime-seq have already been used and published in numerous studies addressing a variety of biological and biomedical questions. Additionally, my work on FOXP2 not only provides a thorough expression atlas, but also a detailed and cost-efficient plan for undertaking a similar study on other genes of interest. Lastly, the studies on FOXP2 in this work lay the foundation for future studies investigating the role of FOXP2 in modulating learning behavior, and thereby affecting human speech.

    Analysis of spatial point patterns on surfaces

    With the advent of improved data acquisition technologies, more complex spatial datasets can be collected at scale, meaning theoretical and methodological developments in spatial statistics are imperative in order to analyse them and generate meaningful conclusions. Spatial statistics has seen a plethora of applications in the life sciences, with particular emphasis on ecology, epidemiology and cell microscopy. Applications of these techniques provide researchers with insight into how the locations of objects of interest can be influenced by their neighbours and the environment. Examples include understanding the spatial distribution of trees observed within some window, and understanding how neighbouring trees and potentially soil contents can influence this. Whilst the literature for spatial statistics is rich, the common assumption is that point processes are restricted to some d-dimensional Euclidean space, for example cell locations in a rectangular window of 2-dimensional Euclidean space. As such, current theory is not capable of handling patterns which lie on more complex spaces, for example cubes and ellipsoids. Recent efforts have successfully extended methodology from Euclidean space to spheres by using the chordal distance (the shortest distance between any two points on a sphere) in place of the Euclidean distance. In this thesis we build on this work by considering point processes lying on more complex surfaces. Our first significant contribution discusses the construction of functional summary statistics for Poisson processes which lie on compact subsets of R^d of lower dimension. We map the process from its original space to the sphere, where it is possible to take advantage of rotational symmetries which allow for well-defined summary statistics. These in turn can be used to determine whether an observed point pattern exhibits clustered or regular behaviour.
Partnering this work, we also provide a hypothesis testing procedure based on these functional summary statistics to determine whether an observed point pattern is completely spatially random. Two test statistics are proposed, one based on the commonly used L-function for planar processes and the other a standardisation of the K-function. These test statistics are compared in an extensive simulation study across ellipsoids of varying dimensions and processes which display differing levels of aggregation or regularity. Estimates of first order properties of a point process are extremely important. They can provide a graphical illustration of inhomogeneity and are useful in second order analysis. We demonstrate how kernel estimation can be extended from a Euclidean space to a Riemannian manifold, where the Euclidean metric is substituted for a Riemannian one. Many of the desirable properties of Euclidean kernel estimates carry over to the Riemannian setting. The issue of edge correction is also discussed, and two criteria for bandwidth selection are proposed. These two selection criteria are explored through a simulation study. Finally, an important area of research in spatial statistics is exploring the interaction between different processes, for example how different species of plant spatially interact within some window. Under the framework of marked point processes, we show that functional summary statistics for multivariate point patterns can be constructed on the sphere. This is extended to more general convex shapes through an appropriate mapping from the original shape to the sphere. A number of examples highlight that these summary statistics can capture independence, aggregation and repulsion between components of a multivariate process on both the sphere and more general surfaces.
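Under complete spatial randomness on the unit sphere, the chordal-distance K-function has the simple closed form K(r) = πr², the area of a spherical cap whose boundary lies at chordal distance r from its centre. A minimal numpy sketch of the empirical estimator (illustrative only, not the thesis implementation):

```python
import numpy as np

def sphere_uniform(n, rng):
    """n points uniform on the unit sphere (normalised Gaussian vectors)."""
    x = rng.normal(size=(n, 3))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def k_chordal(points, r):
    """Empirical K-function using the chordal distance:
    K(r) = |S^2| / (n (n - 1)) * #(ordered pairs at chordal distance <= r)."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    close = (d[np.triu_indices(n, 1)] <= r).sum() * 2  # ordered pairs
    return 4.0 * np.pi * close / (n * (n - 1))

rng = np.random.default_rng(0)
pts = sphere_uniform(500, rng)
# For a CSR pattern, k_chordal(pts, r) should be close to pi * r**2:
# a cap at chordal radius r has height r**2 / 2, hence area pi * r**2.
k_est = k_chordal(pts, 0.5)
```

Comparing `k_est` against πr² (or its L-function standardisation) across a range of r values is the basic ingredient of the CSR tests described above; no edge correction is needed on the sphere because it has no boundary.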