
    Investigation of a Bubble Detector based on Active Electrolocation of Weakly Electric Fish

    Weakly electric fish employ active electrolocation for navigation and object detection. They emit an electric signal with the electric organ in their tail and sense the resulting electric field with electroreceptors distributed over their skin. We adopted this principle to design a bubble detector that can detect gas bubbles in a fluid or, in principle, any object whose electric conductivity differs from that of the surrounding fluid. An evaluation of the influence of electrode diameter on detecting a given bubble size showed that the signal increases with electrode diameter. Therefore, this detector appears more appropriate for large-scale applications such as bubble columns than for small-scale applications such as bubble detection in dialysis.

    Detecting genomic indel variants with exact breakpoints in single- and paired-end sequencing data using SplazerS

    Motivation: The reliable detection of genomic variation in resequencing data is still a major challenge, especially for variants larger than a few base pairs. Sequencing reads crossing the boundaries of structural variation carry the potential for their identification, but are difficult to map. Results: Here we present a method for ‘split’ read mapping, where the prefix and suffix match of a read may be interrupted by a longer gap in the read-to-reference alignment. We use this method to accurately detect medium-sized insertions and long deletions with precise breakpoints in genomic resequencing data. Compared with alternative split mapping methods, SplazerS significantly improves sensitivity for detecting large indel events, especially in variant-rich regions. Our method is robust in the presence of sequencing errors as well as alignment errors due to genomic mutations/divergence, and can be used on reads of variable lengths. Our analysis shows that SplazerS is a versatile tool applicable to unanchored or single-end as well as anchored paired-end reads. In addition, application of SplazerS to targeted resequencing data led to the interesting discovery of a complete, possibly functional gene retrocopy variant. Availability: SplazerS is available from http://www.seqan.de/projects/splazers
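The split-read idea described in the abstract can be sketched in a few lines: a read spanning a deletion aligns as a prefix match and a suffix match separated by a gap in the reference. The toy below brute-forces the split position with exact anchors; it only illustrates the principle and is not SplazerS's actual algorithm, which tolerates mismatches, handles insertions, and uses efficient index structures. All sequences and names here are hypothetical.

```python
def split_map(read: str, ref: str, min_anchor: int = 4):
    """Find the first split where read[:split] and read[split:] both match
    ref exactly, with the suffix strictly downstream of the prefix
    (i.e. the read spans a deletion). Returns (prefix_pos, suffix_pos, split)."""
    for split in range(min_anchor, len(read) - min_anchor + 1):
        pre, suf = read[:split], read[split:]
        i = ref.find(pre)
        if i == -1:
            continue
        j = ref.find(suf, i + len(pre) + 1)  # require a gap of >= 1 bp
        if j != -1:
            return i, j, split
    return None

ref = "TTGCAGGACCCCCGATCTACG"   # hypothetical reference; "CCCCC" is deleted in the donor
read = "CAGGAGATC"              # hypothetical read crossing the deletion breakpoint

i, j, split = split_map(read, ref)
print(ref[i + split:j])  # the deleted segment: CCCCC
```

The exact breakpoints follow directly from the alignment: the deletion spans ref[i+split:j]. Real split mappers must additionally resolve the ambiguity that arises when a breakpoint falls inside a repeat.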

    AI-based structure-function correlation in age-related macular degeneration

    Sensitive and robust outcome measures of retinal function are pivotal for clinical trials in age-related macular degeneration (AMD). A recent development is the implementation of artificial intelligence (AI) to infer results of psychophysical examinations based on findings derived from multimodal imaging. We conducted a review of the current literature referenced in PubMed and Web of Science, among others, with the keywords 'artificial intelligence' and 'machine learning' in combination with 'perimetry', 'best-corrected visual acuity (BCVA)', 'retinal function' and 'age-related macular degeneration'. So far, AI-based structure-function correlation has been applied to infer conventional visual field, fundus-controlled perimetry and electroretinography data, as well as BCVA and patient-reported outcome measures (PROM). In neovascular AMD, inference of BCVA (hereafter termed inferred BCVA) can estimate BCVA results with a root mean squared error of ~7-11 letters, which is comparable to the accuracy of actual visual acuity assessment. Further, AI-based structure-function correlation can successfully infer fundus-controlled perimetry (FCP) results for both mesopic and dark-adapted (DA) cyan and red testing (hereafter termed inferred sensitivity). The accuracy of inferred sensitivity can be augmented by adding short FCP examinations and reaches mean absolute errors (MAE) of ~3-5 dB for mesopic, DA cyan and DA red testing. Inferred BCVA and inferred retinal sensitivity, based on multimodal imaging, may be considered as quasi-functional surrogate endpoints for future interventional clinical trials.
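The two error metrics quoted above (root mean squared error in letters, mean absolute error in dB) are standard and easy to compute. The sketch below shows both on hypothetical paired measured/inferred values; the numbers are illustrative only, not data from the reviewed studies.

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error, e.g. measured vs inferred BCVA in letters."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    """Mean absolute error, e.g. measured vs inferred sensitivity in dB."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

measured = [70, 65, 80, 55]   # hypothetical BCVA letter scores
inferred = [75, 60, 78, 62]
print(round(rmse(measured, inferred), 2))  # 5.07
print(round(mae(measured, inferred), 2))   # 4.75
```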

    Transcriptionally active enhancers in human cancer cells

    The growth of human cancer cells is driven by aberrant enhancer and gene transcription activity. Here, we use transient transcriptome sequencing (TT-seq) to map thousands of transcriptionally active putative enhancers in fourteen human cancer cell lines covering seven types of cancer. These enhancers were associated with cell type-specific gene expression, enriched for genetic variants that predispose to cancer, and included functionally verified enhancers. Enhancer-promoter (E-P) pairing by correlation of transcription activity revealed ~40,000 putative E-P pairs, which were depleted for housekeeping genes and enriched for transcription factors, cancer-associated genes, and 3D conformational proximity. The cell type specificity and transcription activity of target genes increased with the number of paired putative enhancers. Our results represent a rich resource for future studies of gene regulation by enhancers and their role in driving cancerous cell growth.
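The E-P pairing step can be illustrated with a minimal sketch: correlate each enhancer's transcription activity with each promoter's activity across samples and keep highly correlated pairs. The activity values, names, and the 0.8 cutoff below are illustrative assumptions, not the study's data or parameters.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# hypothetical transcription activity across four cell lines
enhancers = {"enh1": [1.0, 5.2, 0.3, 4.8], "enh2": [3.1, 0.2, 3.0, 0.1]}
promoters = {"geneA": [0.9, 4.9, 0.5, 5.1], "geneB": [2.8, 0.4, 3.2, 0.3]}

pairs = [(e, p) for e, ev in enhancers.items()
                for p, pv in promoters.items()
                if pearson(ev, pv) > 0.8]   # illustrative cutoff
print(pairs)  # [('enh1', 'geneA'), ('enh2', 'geneB')]
```

In the paper the pairing is additionally constrained (e.g. by genomic distance) and validated against 3D conformation data; the correlation step above is only the core idea.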

    Quasiperiodicity and non-computability in tilings

    We study tilings of the plane that combine strong properties of different nature: combinatorial and algorithmic. We prove the existence of a tile set that accepts only quasiperiodic and non-recursive tilings. Our construction is based on the fixed-point construction; we improve this general technique and make it enforce the property of local regularity of tilings needed for quasiperiodicity. We also prove a stronger result: any effectively closed set can be recursively transformed into a tile set such that the Turing degrees of the resulting tilings consist exactly of the upper cone based on the Turing degrees of the latter. Comment: v3: the version accepted to MFCS 201

    Fingerprints in Compressed Strings

    The Karp-Rabin fingerprint of a string is a type of hash value that, due to its strong properties, has been used in many string algorithms. In this paper we show how to construct a data structure for a string S of size N, compressed by a context-free grammar of size n, that answers fingerprint queries. That is, given indices i and j, the answer to a query is the fingerprint of the substring S[i,j]. We present the first O(n) space data structures that answer fingerprint queries without decompressing any characters. For Straight Line Programs (SLP) we get O(log N) query time, and for Linear SLPs (an SLP derivative that captures LZ78 compression and its variations) we get O(log log N) query time. Hence, our data structures have the same time and space complexity as for random access in SLPs. We utilize the fingerprint data structures to solve the longest common extension problem in query time O(log N log l) and O(log l log log l + log log N) for SLPs and Linear SLPs, respectively. Here, l denotes the length of the LCE.
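The fingerprint in question is the classic Karp-Rabin polynomial hash. The sketch below shows, on an uncompressed string, how prefix fingerprints yield O(1) substring-fingerprint queries after linear preprocessing, which is the algebraic identity the paper's compressed data structures exploit. The base and modulus are assumed choices; the grammar-compressed structures themselves are not reproduced here.

```python
P = (1 << 61) - 1   # modulus: a Mersenne prime (assumed choice)
B = 256             # base (assumed choice)

def prefix_fingerprints(s: str):
    """fp[k] = fingerprint of s[:k], i.e. sum(s[t] * B**(k-1-t)) mod P."""
    fp = [0] * (len(s) + 1)
    for k, ch in enumerate(s):
        fp[k + 1] = (fp[k] * B + ord(ch)) % P
    return fp

def substring_fingerprint(fp, pow_b, i, j):
    """Fingerprint of s[i:j] in O(1): fp[j] - fp[i] * B**(j-i) mod P."""
    return (fp[j] - fp[i] * pow_b[j - i]) % P

s = "abracadabra"
fp = prefix_fingerprints(s)
pow_b = [1] * (len(s) + 1)          # precomputed powers of B mod P
for k in range(1, len(s) + 1):
    pow_b[k] = pow_b[k - 1] * B % P

# equal substrings ("abra" at positions 0 and 7) have equal fingerprints
print(substring_fingerprint(fp, pow_b, 0, 4) ==
      substring_fingerprint(fp, pow_b, 7, 11))  # True
```

The paper's contribution is to answer the same query over a grammar-compressed S in O(n) space, storing partial fingerprints within the grammar instead of a full prefix table.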

    Mukaiyama addition of (trimethylsilyl)acetonitrile to dimethyl acetals mediated by trimethylsilyl trifluoromethanesulfonate

    (Trimethylsilyl)acetonitrile reacts smoothly with dimethyl acetals in the presence of stoichiometric trimethylsilyl trifluoromethanesulfonate (TMSOTf) to yield β-methoxynitriles. The ideal substrates for this reaction are acetals derived from aromatic aldehydes. With electron-rich acetals, elimination dominates and the corresponding α,β-unsaturated nitriles are obtained as the major products. A mechanistic hypothesis that includes isomerization of the silylnitrile to a nucleophilic N-silyl ketene imine is presented.

    Reconciliation of essential process parameters for an enhanced predictability of Arctic stratospheric ozone loss and its climate interactions

    Significant reductions in stratospheric ozone occur inside the polar vortices each spring when chlorine radicals produced by heterogeneous reactions on cold particle surfaces in winter destroy ozone mainly in two catalytic cycles, the ClO dimer cycle and the ClO/BrO cycle. Chlorofluorocarbons (CFCs), which are responsible for most of the chlorine currently present in the stratosphere, have been banned by the Montreal Protocol and its amendments, and the ozone layer is predicted to recover to 1980 levels within the next few decades. During the same period, however, climate change is expected to alter the temperature, circulation patterns and chemical composition in the stratosphere, and possible geo-engineering ventures to mitigate climate change may lead to additional changes. To realistically predict the response of the ozone layer to such influences requires the correct representation of all relevant processes. The European project RECONCILE has comprehensively addressed remaining questions in the context of polar ozone depletion, with the objective to quantify the rates of some of the most relevant, yet still uncertain physical and chemical processes. To this end RECONCILE used a broad approach of laboratory experiments, two field missions in the Arctic winter 2009/10 employing the high altitude research aircraft M55-Geophysica and an extensive match ozone sonde campaign, as well as microphysical and chemical transport modelling and data assimilation. 
Some of the main outcomes of RECONCILE are as follows: (1) Vortex meteorology: the 2009/10 Arctic winter was unusually cold at stratospheric levels during the six-week period from mid-December 2009 until the end of January 2010, with reduced transport and mixing across the polar vortex edge; the stability of the polar vortex, influenced by dynamic processes in the troposphere, led to unprecedented synoptic-scale stratospheric regions with temperatures below the frost point; in these regions stratospheric ice clouds were observed, extending over >10^6 km^2 for more than 3 weeks. (2) Particle microphysics: heterogeneous nucleation of nitric acid trihydrate (NAT) particles in the absence of ice has been unambiguously demonstrated; conversely, the synoptic-scale ice clouds also appear to nucleate heterogeneously; a variety of possible heterogeneous nuclei has been characterised by chemical analysis of the non-volatile fraction of the background aerosol; substantial formation of solid particles and denitrification via their sedimentation has been observed and model parameterizations have been improved. (3) Chemistry: strong evidence has been found for significant chlorine activation not only on polar stratospheric clouds (PSCs) but also on cold binary aerosol; laboratory experiments and field data on the ClOOCl photolysis rate and other kinetic parameters have been shown to be consistent to an adequate degree of certainty; no evidence has been found that would support the existence of yet unknown chemical mechanisms making a significant contribution to polar ozone loss. (4) Global modelling: results from process studies have been implemented in a prognostic chemistry-climate model (CCM); simulations with improved parameterisations of processes relevant for polar ozone depletion are evaluated against satellite data and other long-term records using data assimilation and detrended fluctuation analysis.
Finally, measurements and process studies within RECONCILE were also applied to the winter 2010/11, when special meteorological conditions led to the highest chemical ozone loss ever observed in the Arctic. In addition to quantifying the 2010/11 ozone loss and understanding its causes, including possible connections to climate change, its impacts were addressed, such as changes in surface ultraviolet (UV) radiation in the densely populated northern mid-latitudes.

    Reconciliation of essential process parameters for an enhanced predictability of Arctic stratospheric ozone loss and its climate interactions (RECONCILE): activities and results

    The international research project RECONCILE has addressed central questions regarding polar ozone depletion, with the objective of quantifying some of the most relevant yet still uncertain physical and chemical processes and thereby improving prognostic modelling capabilities to realistically predict the response of the ozone layer to climate change. This overview paper outlines the scope and the general approach of RECONCILE, and it provides a summary of observations and modelling in 2010 and 2011 that have generated a dataset that is in many respects unprecedented for studying processes in the Arctic winter stratosphere. Principally, it summarises important outcomes of RECONCILE, including (i) better constraints and enhanced consistency on the set of parameters governing catalytic ozone destruction cycles, (ii) a better understanding of the role of cold binary aerosols in heterogeneous chlorine activation, (iii) an improved scheme of polar stratospheric cloud (PSC) processes that includes heterogeneous nucleation of nitric acid trihydrate (NAT) and ice on non-volatile background aerosol, leading to better model parameterisations with respect to denitrification, and (iv) long transient simulations with a chemistry-climate model (CCM), updated based on the results of RECONCILE, that better reproduce past ozone trends in Antarctica and are deemed to produce more reliable predictions of future ozone trends. The process studies and the global simulations conducted in RECONCILE show that in the Arctic, uncertainties in the chemical and microphysical processes of ozone depletion are now clearly smaller than the sensitivity to dynamic variability.