10,488 research outputs found

    Process intensification of oxidative coupling of methane

    No full text

    Classification of annotation semirings over containment of conjunctive queries

    Funding: This work is supported under SOCIAM: The Theory and Practice of Social Machines, a project funded by the UK Engineering and Physical Sciences Research Council (EPSRC) under grant number EP/J017728/1. This work was also supported by FET-Open Project FoX, grant agreement 233599; EPSRC grants EP/F028288/1, G049165 and J015377; and the Laboratory for Foundations of Computer Science.

    We study the problem of query containment of conjunctive queries over annotated databases. Annotations are typically attached to tuples and represent metadata such as probability, multiplicity, comments, or provenance. It is usually assumed that annotations are drawn from a commutative semiring. Such databases pose new challenges in query optimization, since many related fundamental tasks, such as query containment, have to be reconsidered in the presence of annotation propagation. We axiomatize several classes of semirings, for each of which containment of conjunctive queries is equivalent to the existence of a particular type of homomorphism. For each of these types, we also specify all semirings for which the existence of a corresponding homomorphism is a sufficient (or necessary) condition for containment. We develop new decision procedures for containment for some semirings which are not in any of these classes. This generalizes and systematizes previous approaches. Postprint. Peer reviewed.
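The annotation-propagation semantics described above can be made concrete with a small sketch (a hypothetical example, not code from the paper): a join combines annotations with the semiring product, and projecting a variable away combines its alternative derivations with the semiring sum.

```python
# Hypothetical illustration: evaluate the conjunctive query
# Q(x) :- R(x, y), R(y, x) over a semiring-annotated relation.

def evaluate(R, add, mul, zero):
    """R maps tuples (a, b) -> annotation; returns {x: annotation of Q(x)}."""
    out = {}
    for (a, b), s in R.items():
        t = R.get((b, a))
        if t is not None:
            # join: product of the two tuple annotations;
            # projection onto x: sum over all derivations of the same x
            out[a] = add(out.get(a, zero), mul(s, t))
    return out

# Counting semiring (N, +, *): annotations track multiplicities (bag semantics).
R_count = {("a", "b"): 2, ("b", "a"): 3, ("a", "a"): 1}
print(evaluate(R_count, lambda x, y: x + y, lambda x, y: x * y, 0))

# Boolean semiring ({False, True}, or, and): plain set semantics.
R_bool = {t: True for t in R_count}
print(evaluate(R_bool, lambda x, y: x or y, lambda x, y: x and y, False))
```

Changing only the semiring changes which query results (and which containments) hold, which is exactly why containment must be reconsidered per class of semirings.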

    Underfill reliability and lifetime estimation of microelectronic assemblies

    Abstract: In order to protect the interconnections in flip-chip packages, an underfill material layer is used to fill the volumes and provide mechanical support between the silicon chip and the substrate. Due to the chip corner geometry and the mismatch of coefficient of thermal expansion (CTE), the underfill suffers from a stress concentration at the chip corners when the temperature is lower than the curing temperature. This stress concentration leads to subsequent mechanical failures in flip-chip packages, such as chip-underfill interfacial delamination and underfill cracking. Local stresses and strains are the most important parameters for understanding the mechanism of underfill failures. As a result, the industry currently relies on the finite element method (FEM) to calculate the stress components, but the FEM may not be accurate enough compared to the actual stresses in the underfill. FEM simulations require a careful consideration of important geometrical details and material properties. This thesis proposes a modeling approach that can accurately estimate the underfill delamination areas and crack trajectories, with the following three objectives. The first objective was to develop an experimental technique capable of measuring underfill deformations around the chip corner region. This technique combined confocal microscopy and the digital image correlation (DIC) method to enable three-dimensional strain measurements at different temperatures, and was named the confocal-DIC technique. This technique was first validated by a theoretical analysis on thermal strains. In a test component similar to a flip-chip package, the strain distribution obtained by the FEM model was in good agreement with the results measured by the confocal-DIC technique, with relative errors less than 20% at chip corners. Then, the second objective was to measure the strain near a crack in underfills.
Artificial cracks with lengths of 160 μm and 640 μm were fabricated from the chip corner along the 45° diagonal direction. The confocal-DIC-measured maximum hoop strains and first principal strains were located at the crack front area for both the 160 μm and 640 μm cracks. A crack model was developed using the extended finite element method (XFEM), and the strain distribution in the simulation had the same trend as the experimental results. The distribution of hoop strains was in good agreement with the measured values when the model element size was smaller than 22 μm, small enough to capture the strong strain gradient near the crack tip. The third objective was to propose a modeling approach for underfill delamination and cracking with the effects of manufacturing variables. A deep thermal cycling test was performed on 13 test cells to obtain the reference chip-underfill delamination areas and crack profiles. An artificial neural network (ANN) was trained to relate the effects of manufacturing variables and the number of cycles to first delamination of each cell. The predicted numbers of cycles for all 6 cells in the test dataset were located in the intervals of experimental observations. Delamination growth was modelled in the FEM by evaluating the strain energy amplitude at the interface elements between the chip and underfill. For 5 out of 6 cells in validation, the delamination growth model was consistent with the experimental observations. The cracks in bulk underfill were modelled by XFEM without predefined paths. The directions of edge cracks were in good agreement with the experimental observations, with an error of less than 2.5°.
This approach met the thesis goal of estimating initial underfill delamination, delamination areas, and crack paths in actual industrial flip-chip assemblies.
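The driving load described above is the thermal mismatch below the cure temperature. As a rough illustration (the values below are typical orders of magnitude, not figures from the thesis), the free mismatch strain between underfill and chip is:

```latex
\varepsilon_{\mathrm{th}} \;=\; \bigl(\alpha_{\mathrm{underfill}} - \alpha_{\mathrm{chip}}\bigr)\,\bigl(T - T_{\mathrm{cure}}\bigr), \qquad T < T_{\mathrm{cure}}
```

With, say, α_underfill ≈ 30 ppm/K, α_chip ≈ 3 ppm/K for silicon, and T − T_cure = −125 K, this gives ε_th ≈ −0.34%; the chip-corner geometry then concentrates this mismatch strain locally, which is what the confocal-DIC measurements and FEM models above resolve.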

    Optimizing transcriptomics to study the evolutionary effect of FOXP2

    The field of genomics was established with the sequencing of the human genome, a pivotal achievement that has allowed us to address various questions in biology from a unique perspective. One question in particular, that of the evolution of human speech, has gripped philosophers, evolutionary biologists, and now genomicists. However, little is known of the genetic basis that allowed humans to evolve the ability to speak. Of the few genes implicated in human speech, one of the most studied is FOXP2, which encodes the transcription factor Forkhead box protein P2 (FOXP2). FOXP2 is essential for proper speech development, and two mutations in the human lineage are believed to have contributed to the evolution of human speech. To address the effect of FOXP2 and investigate its evolutionary contribution to human speech, one can utilize the power of genomics, more specifically gene expression analysis via ribonucleic acid sequencing (RNA-seq). To this end, I first contributed to developing mcSCRB-seq, a highly sensitive, powerful, and efficient single cell RNA-seq (scRNA-seq) protocol. Having emerged as a central method for studying cellular heterogeneity and identifying cellular processes, scRNA-seq was a powerful genomic tool, but it lacked the sensitivity and cost-efficiency of more established protocols. By systematically evaluating each step of the process, I helped find that the addition of polyethylene glycol increased sensitivity by enhancing the cDNA synthesis reaction. This, along with other optimizations, resulted in a sensitive and flexible protocol that is cost-efficient and ideal in many research settings. A primary motivation driving the extensive optimizations surrounding single cell transcriptomics has been the generation of cellular atlases, which aim to identify and characterize all of the cells in an organism.
As such efforts are carried out in a variety of research groups using a number of different RNA-seq protocols, I contributed to an effort to benchmark and standardize scRNA-seq methods. This not only identified methods which may be ideal for the purpose of cell atlas creation, but also highlighted optimizations that could be integrated into existing protocols. Using mcSCRB-seq as a foundation, as well as the findings from the scRNA-seq benchmarking, I helped develop prime-seq, a sensitive, robust, and, most importantly, affordable bulk RNA-seq protocol. Bulk RNA-seq was frequently overlooked during the efforts to optimize and establish single-cell techniques, even though the method is still extensively used in analyzing gene expression. Introducing early barcoding and reducing library generation costs kept prime-seq cost-efficient, and basing it on single-cell methods ensured that it would be a sensitive and powerful technique. I helped verify this by benchmarking it against TruSeq-generated data and then helped test its robustness by generating prime-seq libraries from over seventeen species. These optimizations resulted in a final protocol that is well suited for investigating gene expression in comprehensive and high-throughput studies. Finally, I utilized prime-seq in order to develop a comprehensive gene expression atlas to study the function of FOXP2 and its role in speech evolution. I used previously generated mouse models: a knockout model containing one non-functional Foxp2 allele and a humanized model, which has a variant Foxp2 allele with two human-specific mutations. To study the effect globally across the mouse, I helped harvest eighteen tissues which were previously identified to express FOXP2. By then comparing the mouse models to wild-type mice, I helped highlight the importance of FOXP2 within lung development and the importance of the human variant allele in the brain.
Both mcSCRB-seq and prime-seq have already been used and published in numerous studies addressing a variety of biological and biomedical questions. Additionally, my work on FOXP2 not only provides a thorough expression atlas, but also a detailed and cost-efficient plan for undertaking a similar study on other genes of interest. Lastly, the FOXP2 studies in this work lay the foundation for future studies investigating the role of FOXP2 in modulating learning behavior, and thereby affecting human speech.

    Methods for the analysis of oscillatory integrals and Bochner-Riesz operators

    For a smooth surface Γ of arbitrary codimension, one can consider the Lp mapping properties of the Bochner-Riesz multiplier m(ζ) = dist(ζ,Γ)^α φ(ζ), where α > 0 and φ is an appropriate smooth cutoff function. Even for the sphere, the exact Lp boundedness range remains a central open problem in Euclidean harmonic analysis. We consider the Lp integrability of the Bochner-Riesz convolution kernel for a particular class of surfaces (of any codimension). For a subclass of these surfaces the range of Lp integrability of the kernels differs substantially from the Lp boundedness range of the corresponding Bochner-Riesz multiplier operator. Extending work of Mockenhaupt, we then establish a range of operator bounds, which are sharp in the α exponent, under the assumption of an appropriate L2 restriction estimate. Hickman and Wright established sharp oscillatory integral estimates associated with a particular class of surfaces, and derived restriction estimates. We extend this work to certain curves of standard type and corresponding surfaces of revolution. These surfaces are discussed as an explicit class for which we have Lp → Lp boundedness of the corresponding Bochner-Riesz operators. Understanding the structure of the roots of real polynomials is important in obtaining stable bounds for oscillatory integrals with polynomial phases. For real polynomials with exponents in some fixed set, Ψ(t) = x + y_1 t^{k_1} + ... + y_L t^{k_L}, we analyse the different possible root structures that can occur as the coefficients vary. We first establish a stratification of roots into tiers containing roots of comparable sizes. We then show that at most L non-zero roots can cluster about a point. Supposing additional restrictions on the coefficients, we derive structural refinements. These structural results extend work of Kowalski and Wright and provide a characteristic picture of root structure at coarse scales.
As an application, these results are used to recover the sharp oscillatory integral estimates of Hickman and Wright, using bounds for oscillatory integrals of Phong and Stein.
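For readability, the two central objects of the abstract can be written in display form (notation as in the abstract; the operator name T_α is introduced here only for illustration, as the convolution operator associated with the multiplier m):

```latex
m(\zeta) = \operatorname{dist}(\zeta, \Gamma)^{\alpha}\, \varphi(\zeta), \qquad
T_{\alpha} f = \mathcal{F}^{-1}\!\bigl[\, m \, \hat{f} \,\bigr], \qquad
\Psi(t) = x + y_{1} t^{k_{1}} + \cdots + y_{L} t^{k_{L}}.
```

The kernel Lp-integrability results concern the inverse Fourier transform of m itself, while the root-structure analysis of Ψ feeds into stable bounds for the associated oscillatory integrals.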

    Cis-Regulation of Gremlin1 Expression during Mouse Limb Bud Development and its Diversification during Vertebrate Evolution

    Embryonic development and organogenesis rely on tightly controlled gene expression, which is achieved by cis-regulatory modules (CRMs) interacting with distinct transcription factors (TFs) that control spatio-temporal and tissue-specific gene expression. During organogenesis, gene regulatory networks (GRNs) with self-regulatory feedback properties coordinately control growth and patterning and provide systemic robustness against genetic and/or environmental perturbations. During limb bud development, various interlinked GRNs control outgrowth and patterning along all three limb axes. A paradigm network is the epithelial-mesenchymal (e-m) SHH/GREM1/AER-FGF feedback signaling system, which controls limb bud outgrowth and digit patterning. The BMP antagonist GREMLIN1 (GREM1) is central to these e-m interactions, as its antagonism of BMP activity is essential to maintain both AER-Fgf and Shh expression. In turn, SHH signaling upregulates Grem1 expression, which results in the establishment of a self-regulatory signaling network. One previous study provided evidence that several CRMs could regulate Grem1 expression during limb bud development. However, the cis-regulatory logic underlying the spatio-temporal regulation of the Grem1 expression dynamics remained obscure. From an evolutionary point of view, diversification of CRMs can result in diversification of gene regulation, which can drive the establishment of morphological novelties and adaptations. This is evidenced by the observed differences in Grem1 expression in different species, which correlate with the evolutionary plasticity of tetrapod digit patterning. Hence, a better understanding of the spatio-temporal regulation of the Grem1 expression dynamics and the underlying cis-regulatory logic is of interest from both a developmental and an evolutionary perspective. Recently, multiple candidate CRMs have been identified that might be functionally relevant for Grem1 expression during mouse limb bud development.
For my PhD project, I genetically analyzed which of these CRMs are involved in the regulation of the spatio-temporal Grem1 expression dynamics in limb buds. To this end, we generated various single and compound CRM mutant alleles using CRISPR/Cas9. Our CRM allelic series revealed a complex Grem1 cis-regulation among a minimum of six CRMs, where a subset of CRMs regulates Grem1 transcript levels in an additive manner. Surprisingly, phenotypic robustness depends not on threshold transcript levels but on the spatial integrity of the Grem1 expression domain. In particular, interactions among five CRMs control the characteristic asymmetrical and posteriorly biased Grem1 expression in mouse limb buds. Our results provide an example of how multiple seemingly redundant limb-specific CRMs provide phenotypic robustness by cooperative/synergistic regulation of the spatial Grem1 expression dynamics. Three CRMs are conserved along the phylogeny of extant vertebrates with paired appendages. Of those, the activities of two CRMs recapitulate the major spatio-temporal aspects of Grem1 expression in mouse limb buds. In order to study their functions in species-specific regulation of Grem1 expression and their functional diversification in tetrapods, I tested the orthologues of both CRMs from representative species using LacZ reporter assays in transgenic mice, in comparison to the endogenous Grem1 expression in limb buds of the species of origin. Surprisingly, the activities of CRM orthologues display high evolutionary plasticity and correlate better with the Grem1 expression pattern in limb buds of the species of origin than with that of the mouse. This differential responsiveness to the GRNs in mouse suggests that TF binding site alterations in CRMs could underlie the spatial diversification of Grem1 in limb buds during tetrapod evolution. While the fish fin and tetrapod limb share some homologies of proximal bones, the autopod is a neomorphic feature of tetrapods.
The Grem1 requirement for digit patterning and its conserved expression in fin buds prompted us to assess the enhancer activity of fish CRM orthologues in transgenic mice. Surprisingly, all tested fish CRMs are active in the mouse autopod primordia, providing strong evidence that Grem1 CRMs are active in fin buds and that they predate the fin-to-limb transition. Our results corroborate increasing evidence that CRMs governing autopodial gene expression have been co-opted during the emergence of the tetrapod autopod. Furthermore, as part of a collaboration with Dr. S. Jhanwar, I contributed to the study of shared and species-specific epigenomic and genomic variations during mouse and chicken limb bud development. In this analysis, Dr. S. Jhanwar identified putative enhancers that show higher chicken-specific sequence turnover rates in comparison to their mouse orthologues, which defines them as so-called chicken accelerated regions (CARs). Here, I analyzed the CAR activities in comparison to their mouse orthologues by transgenic LacZ reporter assays, complemented by analysis of the endogenous gene expression in limb buds of both species. This analysis indicates that the diversified activity of CARs and their mouse orthologues could be linked to the differential gene expression patterns in limb buds of both species.

    Gradient descent provably escapes saddle points in the training of shallow ReLU networks

    Dynamical systems theory has recently been applied in optimization to prove that gradient descent algorithms avoid so-called strict saddle points of the loss function. However, in many modern machine learning applications, the required regularity conditions are not satisfied. In particular, this is the case for rectified linear unit (ReLU) networks. In this paper, we prove a variant of the relevant dynamical systems result, a center-stable manifold theorem, in which we relax some of the regularity requirements. Then, we verify that shallow ReLU networks fit into the new framework. Building on a classification of critical points of the square integral loss of shallow ReLU networks measured against an affine target function, we deduce that gradient descent avoids most saddle points. We proceed to prove convergence to global minima if the initialization is sufficiently good, which is expressed by an explicit threshold on the limiting loss.
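The center-stable manifold intuition can be illustrated on a toy strict saddle (a deliberately simplified example, not the shallow-ReLU setting of the paper): for f(x, y) = x² − y², only the measure-zero stable manifold {y = 0} converges to the saddle at the origin, so gradient descent from a generic initialization escapes it.

```python
import numpy as np

# Toy illustration (not the paper's loss): gradient descent on
# f(x, y) = x^2 - y^2, whose origin is a strict saddle point.

def gd(p0, lr=0.1, steps=200):
    p = np.asarray(p0, dtype=float)
    for _ in range(steps):
        grad = np.array([2.0 * p[0], -2.0 * p[1]])  # gradient of f
        p = p - lr * grad
    return p

on_manifold = gd([0.5, 0.0])   # on the stable manifold: converges to the saddle
generic = gd([0.5, 1e-6])      # tiny generic perturbation: escapes along y

print(on_manifold, generic)
```

The x-coordinate contracts by a factor 0.8 per step while any nonzero y-coordinate expands by 1.2 per step, so only initializations with y exactly 0 reach the saddle; relaxing the smoothness needed to make this argument rigorous for ReLU losses is the technical contribution of the paper.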

    Developing automated meta-research approaches in the preclinical Alzheimer's disease literature

    Alzheimer’s disease is a devastating neurodegenerative disorder for which there is no cure. A crucial part of the drug development pipeline involves testing therapeutic interventions in animal disease models. However, promising findings in preclinical experiments have not translated into clinical trial success. Reproducibility has often been cited as a major issue affecting biomedical research, where experimental results in one laboratory cannot be replicated in another. By using meta-research (research on research) approaches such as systematic reviews, researchers aim to identify and summarise all available evidence relating to a specific research question. By conducting a meta-analysis, researchers can also combine the results from different experiments statistically to understand the overall effect of an intervention and to explore reasons for variations seen across different publications. Systematic reviews of the preclinical Alzheimer’s disease literature could inform decision making, encourage research improvement, and identify gaps in the literature to guide future research. However, due to the vast amount of potentially useful evidence from animal models of Alzheimer’s disease, it remains difficult to make sense of and utilise these data effectively. Systematic reviews are common practice within evidence based medicine, yet their application to preclinical research is often limited by the time and resources required. In this thesis, I develop, build upon, and implement automated meta-research approaches to collect, curate, and evaluate the preclinical Alzheimer’s literature. I searched several biomedical databases to obtain all research relevant to Alzheimer’s disease. I developed a novel deduplication tool to automatically identify and remove duplicate publications identified across different databases with minimal human effort.
I trained a crowd of reviewers to annotate a subset of the publications identified and used this data to train a machine learning algorithm to screen the remaining publications for relevance. I developed text-mining tools to extract model, intervention, and treatment information from publications, and I improved existing automated tools to extract reported measures to reduce the risk of bias. Using these tools, I created a categorised database of research in transgenic Alzheimer’s disease animal models and created a visual summary of this dataset on an interactive, openly accessible online platform. Using the techniques described, I also identified relevant publications within the categorised dataset to perform systematic reviews of two key outcomes of interest in transgenic Alzheimer’s disease models: (1) synaptic plasticity and transmission in hippocampal slices and (2) motor activity in the open field test. Over 400,000 publications were identified across biomedical research databases, with 230,203 unique publications. In a performance evaluation across different preclinical datasets, the automated deduplication tool I developed could identify over 97% of duplicate citations and had an error rate similar to that of human performance. When evaluated on a test set of publications, the machine learning classifier trained to identify relevant research in transgenic models was highly sensitive (capturing 96.5% of relevant publications) and excluded 87.8% of irrelevant publications. Tools to identify the model(s) and outcome measure(s) within the full text of publications may reduce the burden on reviewers and were found to be more sensitive than searching only the title and abstract of citations. Automated tools to assess risk of bias reporting were highly sensitive and could have the potential to monitor research improvement over time.
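The thesis does not spell out the deduplication algorithm here, so the following is only a minimal sketch of the general idea (hypothetical normalization and threshold, not the thesis tool): flag two citations retrieved from different databases as duplicates when their normalized titles are near-identical.

```python
from difflib import SequenceMatcher

# Hypothetical sketch of fuzzy title deduplication: normalize away case and
# punctuation, then compare titles with a similarity ratio.

def normalize(title):
    kept = "".join(ch for ch in title.lower() if ch.isalnum() or ch == " ")
    return " ".join(kept.split())  # collapse whitespace

def is_duplicate(a, b, threshold=0.95):
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold

recs = [
    "Tau pathology in a transgenic mouse model of Alzheimer's disease",
    "Tau Pathology in a Transgenic Mouse Model of Alzheimer’s Disease.",
    "Open field behaviour in APP/PS1 mice",
]
dupes = [(i, j) for i in range(len(recs)) for j in range(i + 1, len(recs))
         if is_duplicate(recs[i], recs[j])]
print(dupes)  # the first two records collapse to one
```

In practice a production tool would also compare authors, year, and DOI before merging, but title normalization alone already catches the punctuation and casing differences that make exact matching across databases fail.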
The final dataset of categorised Alzheimer’s disease research contained 22,375 publications which were then visualised in the interactive web application. Within the application, users can see how many publications report measures to reduce the risk of bias and how many have been classified as using each transgenic model, testing each intervention, and measuring each outcome. Users can also filter to obtain curated lists of relevant research, allowing them to perform systematic reviews at an accelerated pace with reduced effort required to search across databases, and a reduced number of publications to screen for relevance. Both systematic reviews and meta-analyses highlighted failures to report key methodological information within publications. Poor transparency of reporting limited the statistical power I had to understand the sources of between-study variation. However, some variables were found to explain a significant proportion of the heterogeneity. Transgenic animal model had a significant impact on results in both reviews. For certain open field test outcomes, wall colour of the open field arena and the reporting of measures to reduce the risk of bias were found to impact results. For in vitro electrophysiology experiments measuring synaptic plasticity, several electrophysiology parameters, including magnesium concentration of the recording solution, were found to explain a significant proportion of the heterogeneity. Automated meta-research approaches and curated web platforms summarising preclinical research could have the potential to accelerate the conduct of systematic reviews and maximise the potential of existing evidence to inform translation.
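The between-study heterogeneity discussed above is conventionally quantified with a random-effects meta-analysis. A minimal DerSimonian-Laird sketch (the effect sizes and variances below are made up for illustration, not data from the reviews):

```python
# Illustrative DerSimonian-Laird random-effects pooling: estimates the
# between-study variance tau^2 and the percentage of total variation due
# to heterogeneity (I^2), then re-weights studies to pool the effect.

def dersimonian_laird(effects, variances):
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                         # between-study variance
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0   # % heterogeneity
    w_re = [1.0 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2, i2

pooled, tau2, i2 = dersimonian_laird([0.8, 0.3, 1.2, 0.5], [0.04, 0.09, 0.05, 0.06])
print(round(pooled, 3), round(tau2, 3), round(i2, 1))
```

Moderators such as transgenic model or arena wall colour are then tested by fitting such a model within subgroups (or via meta-regression) and asking how much of tau² they absorb.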

    Development, Validation, and Application of Methods for High Time-Response Measurement of Gaseous Atmospheric Chlorinated Species

    Halogenated compounds that participate in catalytic cycles in the atmosphere can influence the fate of chemicals, including ozone, methane, and volatile organic compounds (VOCs). These halogen radicals, in particular atomic chlorine (Cl), can deplete ozone and react rapidly with VOCs. Reliable, sensitive, and widely available hydrogen chloride (HCl) measurements are important for understanding Cl-initiated oxidation in many regions of the troposphere. We configured a commercial HCl cavity ring-down spectrometer (CRDS) for sampling HCl in the ambient atmosphere and developed validation techniques to characterize the measurement uncertainties. The HCl analyzer was used to make continuous HCl measurements in the polluted marine boundary layer during the Halifax Fog and Air Quality Study (HaliFAQS). Bimodal HCl features on high-irradiance days indicated two photochemical processes: (1) morning-time photolysis of Cl precursors, and (2) midday formation of nitric acid followed by acid displacement onto chloride (Cl⁻)-containing aerosols. A box model used measured HCl to estimate nitryl chloride mixing ratios at sunrise and assessed the contribution of photolabile Cl precursors to radical formation. Total gaseous chlorine (TClg) measurements can illuminate unknown sources of Cl to the atmosphere. Techniques for measuring TClg have been limited to offline analysis of extracted filters and do not provide suitable temporal information on fast atmospheric processes. The utility of this novel TClg measurement technique will be crucial to future estimates and assessments of chlorinated compounds and their impact on air quality, climate, and health.
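As a toy illustration of the box-model idea (first-order photolysis of a Cl precursor such as ClNO2; the photolysis rate and initial mixing ratio below are placeholders, not measured values from the study):

```python
# Toy box model: d[precursor]/dt = -j * [precursor], with each photolysed
# precursor molecule releasing one Cl atom. Forward-Euler integration.

def photolysis(c0_ppt, j_per_s, hours, dt_s=60.0):
    c = c0_ppt          # precursor mixing ratio (ppt), placeholder value
    released = 0.0      # cumulative Cl released (ppt equivalents)
    for _ in range(int(hours * 3600 / dt_s)):
        d = j_per_s * c * dt_s   # precursor photolysed this time step
        c -= d
        released += d
    return c, released

remaining, cl_released = photolysis(c0_ppt=100.0, j_per_s=2e-4, hours=2.0)
print(round(remaining, 1), round(cl_released, 1))
```

A real box model of the kind described above would couple several such production and loss terms (photolysis, acid displacement, deposition) and constrain them with the measured HCl time series.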