
    Enhancement of superoxide evolution by nickel-doped BiOCl for the removal of organic pollutants and cyanobacteria

    Organic pollutants and cyanobacteria are widespread in water and pose significant risks to human health, so appropriate methods are needed for their removal. In this work, Ni-doped bismuth oxychloride (BiOCl) photocatalysts were successfully synthesized by a simple hydrothermal method. The light absorption and charge-carrier separation involved in superoxide (·O2−) generation can be optimized by the introduction of Ni. Photocatalytic degradation experiments showed that 9% Ni-BiOCl enhanced the photodegradation of organic matter (RhB and BPA) as well as the inactivation of M. aeruginosa. The degradation efficiencies of Ni-BiOCl for the removal of RhB and BPA were approximately 34.99% and 57% higher, respectively, than those of pristine BiOCl. Furthermore, algae inactivation was systematically studied by three-dimensional fluorescence spectroscopy. The results showed that ·O2− played an irreplaceable role in photocatalytic algae removal: ·O2− and h+ first destroyed the cell wall of M. aeruginosa, then inactivated the active fluorescent substances in the cell, and finally oxidized intracellular exudates such as chlorophyll and phycocyanin into inorganic substances. This study provides a multifunctional catalyst for controlling water pollution and environmental remediation.
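    To make the reported efficiency numbers concrete, here is a minimal Python sketch of the standard way photocatalytic removal efficiency and an apparent pseudo-first-order rate constant are computed from concentration measurements; the time series below is invented for illustration and is not data from the study.

```python
import numpy as np

# Hypothetical normalized concentration readings over irradiation time (min).
t = np.array([0, 10, 20, 30, 40, 50, 60], dtype=float)
c = np.array([1.00, 0.78, 0.60, 0.47, 0.36, 0.28, 0.22])  # C/C0, illustrative only

# Removal efficiency after the final time point: eta = (C0 - C) / C0 * 100%
efficiency = (c[0] - c[-1]) / c[0] * 100.0

# Pseudo-first-order fit: ln(C0/C) = k * t, slope k via least squares.
k = np.polyfit(t, np.log(c[0] / c), 1)[0]

print(f"removal efficiency: {efficiency:.1f}%")
print(f"apparent rate constant k: {k:.4f} min^-1")
```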

    Transformation Consistency Regularization – A Semi-supervised Paradigm for Image-to-Image Translation

    Scarcity of labeled data has motivated the development of semi-supervised learning methods, which learn from large amounts of unlabeled data alongside a few labeled samples. Consistency regularization, which enforces agreement between a model's predictions under different input perturbations, has in particular been shown to provide state-of-the-art results in semi-supervised frameworks. However, most of these methods have been limited to classification and segmentation applications. We propose Transformation Consistency Regularization, which addresses the more challenging setting of image-to-image translation, a setting that remains unexplored by semi-supervised algorithms. The method introduces a diverse set of geometric transformations and enforces the model's predictions for unlabeled data to be invariant to those transformations. We evaluate the efficacy of our algorithm on three different applications: image colorization, denoising, and super-resolution. Our method is significantly data-efficient, requiring only around 10-20% of labeled samples to achieve image reconstructions similar to its fully supervised counterpart. Furthermore, we show the effectiveness of our method in video processing applications, where knowledge from a few frames can be leveraged to enhance the quality of the rest of the movie.
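    As a rough illustration of the idea (not the authors' released code), the following PyTorch sketch penalizes disagreement between model(T(x)) and T(model(x)) for rotations of unlabeled images; the rotation set, the `model` interface, and the stop-gradient on the reference prediction are all assumptions.

```python
import torch
import torch.nn.functional as F
import torchvision.transforms.functional as TF

def tcr_loss(model, x_unlabeled, angles=(0.0, 90.0, 180.0, 270.0)):
    """Transformation-consistency penalty on unlabeled images (illustrative).

    Enforces model(T(x)) ~ T(model(x)) for a set of geometric
    transformations T (here: rotations), in the spirit of TCR.
    """
    with torch.no_grad():
        y = model(x_unlabeled)                 # prediction on untransformed input
    loss = 0.0
    for a in angles[1:]:                       # skip the identity rotation
        y_t = model(TF.rotate(x_unlabeled, a))          # predict on transformed input
        loss = loss + F.mse_loss(y_t, TF.rotate(y, a))  # compare to transformed prediction
    return loss / (len(angles) - 1)
```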

    Outlier Edge Detection Using Random Graph Generation Models and Applications

    Outliers are samples generated by mechanisms different from those producing normal data samples. Graphs, in particular social network graphs, may contain nodes and edges created by scammers, by malicious programs, or mistakenly by normal users. Detecting outlier nodes and edges is important for data mining and graph analytics. However, previous research in the field has focused mainly on detecting outlier nodes. In this article, we study the properties of edges and propose outlier edge detection algorithms using two random graph generation models. We found that the edge-ego-network, defined as the induced subgraph containing the two end nodes of an edge, their neighboring nodes, and the edges that link these nodes, contains critical information for detecting outlier edges. We evaluated the proposed algorithms by injecting outlier edges into real-world graph data. Experimental results show that the proposed algorithms can effectively detect outlier edges. In particular, the algorithm based on the preferential attachment random graph generation model consistently performs well regardless of the test graph data. Furthermore, the proposed algorithms are not limited to outlier edge detection. We demonstrate three different applications that benefit from them: 1) a preprocessing tool that improves the performance of graph clustering algorithms; 2) an outlier node detection algorithm; and 3) a novel noisy-data clustering algorithm. These applications show the great potential of the proposed outlier edge detection techniques.

    Comment: 14 pages, 5 figures, journal paper
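    A minimal Python sketch of the edge-ego-network idea, assuming NetworkX; the outlier score below is a deliberately simple stand-in based on shared neighbors, not the paper's preferential-attachment formulation.

```python
import networkx as nx

def edge_ego_network(G, u, v):
    """Induced subgraph on u, v, and all of their neighbors (the edge-ego-network)."""
    nodes = {u, v} | set(G.neighbors(u)) | set(G.neighbors(v))
    return G.subgraph(nodes)

def edge_outlier_score(G, u, v):
    """Toy score: an edge whose endpoints share few neighbors relative to
    their degrees looks more anomalous (hypothetical scoring)."""
    common = len(set(G.neighbors(u)) & set(G.neighbors(v)))
    return 1.0 - common / min(G.degree(u), G.degree(v))

G = nx.karate_club_graph()
scores = {(u, v): edge_outlier_score(G, u, v) for u, v in G.edges()}
print(max(scores, key=scores.get))  # most outlier-like edge under this toy score
```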

    The first microbial colonizers of the human gut: composition, activities, and health implications of the infant gut microbiota

    The human gut microbiota is engaged in multiple interactions affecting host health during the host's entire life span. Microbes colonize the neonatal gut immediately following birth. The establishment and interactive development of this early gut microbiota are believed to be (at least partially) driven and modulated by specific compounds present in human milk. It has been shown that certain genomes of infant gut commensals, in particular those of bifidobacterial species, are genetically adapted to utilize specific glycans of this human secretory fluid, thus representing a very intriguing example of host-microbe coevolution, where both partners are believed to benefit. In recent years, various metagenomic studies have tried to dissect the composition and functionality of the infant gut microbiome and to explore how the corresponding microbial consortia, including bacterial and viral ones, are distributed across the different ecological niches of the infant gut, in healthy and ill subjects. Such analyses have linked certain features of the microbiota/microbiome, such as reduced diversity or aberrant composition, to intestinal illnesses in infants or disease states that are manifested at later stages of life, including asthma, inflammatory bowel disease, and metabolic disorders. Thus, a growing number of studies have reported on how the early human gut microbiota composition/development may affect risk factors related to adult health conditions. This concept has fueled the development of strategies to shape the infant microbiota composition based on various functional food products. In this review, we describe the infant microbiota, the mechanisms that drive its establishment and composition, and how microbial consortia may be molded by natural or artificial interventions. Finally, we discuss the relevance of key microbial players of the infant gut microbiota, in particular bifidobacteria, with respect to their role in health and disease.

    Neutrino Oscillations and Collider Test of the R-parity Violating Minimal Supergravity Model

    We study R-parity violating minimal supergravity models accounting for the observed neutrino masses and mixing, which can be tested in future collider experiments. The bi-large mixing can be explained by allowing five dominant tri-linear couplings $\lambda'_{1,2,3}$ and $\lambda_{1,2}$. The desired ratio of the atmospheric and solar neutrino mass-squared differences can be obtained in a very limited parameter space where the tree-level contribution is tuned to be suppressed. In this allowed region, we quantify the correlation between the three neutrino mixing angles and the tri-linear R-parity violating couplings. Qualitatively, the relations $|\lambda'_1| < |\lambda'_2| \sim |\lambda'_3|$ and $|\lambda_1| \sim |\lambda_2|$ are required by the large atmospheric neutrino mixing angle $\theta_{23}$ and the small angle $\theta_{13}$, and by the large solar neutrino mixing angle $\theta_{12}$, respectively. Such a prediction on the couplings can be tested at the next linear colliders by observing the branching ratios of the lightest supersymmetric particle (LSP). For the stau or the neutralino LSP, the ratio $|\lambda_1|^2 : |\lambda_2|^2 : |\lambda_1|^2 + |\lambda_2|^2$ can be measured by establishing $Br(e\nu) : Br(\mu\nu) : Br(\tau\nu)$ or $Br(\nu e^\pm \tau^\mp) : Br(\nu \mu^\pm \tau^\mp) : Br(\nu \tau^\pm \tau^\mp)$, respectively. Information on the couplings $\lambda'_i$ can be obtained by measuring $Br(l_i t \bar{b}) \propto |\lambda'_i|^2$ if the neutralino LSP is heavier than the top quark.

    Comment: RevTeX, 25 pages, 8 eps figures
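    For orientation only, this toy computation shows how assumed coupling magnitudes translate into the normalized branching-ratio pattern the abstract proposes to test; the numerical values of the couplings are invented, not fitted results from the paper.

```python
# Illustrative arithmetic only: normalized stau-LSP branching ratios implied
# by hypothetical tri-linear couplings, per the relation
# Br(e nu) : Br(mu nu) : Br(tau nu) ~ |l1|^2 : |l2|^2 : |l1|^2 + |l2|^2.
lam1, lam2 = 0.05, 0.06  # assumed coupling magnitudes, invented for illustration

weights = [lam1**2, lam2**2, lam1**2 + lam2**2]
total = sum(weights)
for channel, w in zip(["e nu", "mu nu", "tau nu"], weights):
    print(f"Br({channel}) fraction: {w / total:.3f}")
```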

    Digital PCR methods improve detection sensitivity and measurement precision of low abundance mtDNA deletions

    Mitochondrial DNA (mtDNA) mutations are a common cause of primary mitochondrial disorders, and have also been implicated in a broad collection of conditions, including aging, neurodegeneration, and cancer. Prevalent among these pathogenic variants are mtDNA deletions, which show a strong bias for the loss of sequence in the major arc between, but not including, the heavy and light strand origins of replication. Because individual mtDNA deletions can accumulate focally, can occur with multiple mixed breakpoints, and can coexist with normal mtDNA sequences, methods that detect broad-spectrum mutations with enhanced sensitivity and limited costs have both research and clinical applications. In this study, we evaluated semi-quantitative and digital PCR-based methods of mtDNA deletion detection using double-stranded reference templates or biological samples. Our aim was to describe key experimental assay parameters that will enable the analysis of low levels of, or small differences in, mtDNA deletion load during disease progression, with limited false-positive detection. We determined that the digital PCR method significantly improved mtDNA deletion detection sensitivity through absolute quantitation, improved precision, and reduced assay standard error.
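    As context for the quantitation step, here is a short Python sketch of the standard Poisson correction used in digital PCR absolute quantitation, applied to hypothetical partition counts (not data from the study).

```python
import math

def copies_per_partition(n_positive, n_total):
    """Poisson-corrected mean template copies per partition:
    lambda = -ln(1 - p), where p is the fraction of positive partitions."""
    p = n_positive / n_total
    return -math.log(1.0 - p)

# Hypothetical duplexed dPCR counts, invented for illustration.
lam_del = copies_per_partition(180, 20000)    # deletion-specific assay
lam_wt = copies_per_partition(15500, 20000)   # wild-type reference assay

deletion_load = lam_del / (lam_del + lam_wt)  # fraction of deleted genomes
print(f"estimated mtDNA deletion load: {100 * deletion_load:.2f}%")
```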

    Metabonomics and Intensive Care

    This article is one of ten reviews selected from the Annual Update in Intensive Care and Emergency Medicine 2016. Other selected articles can be found online at http://www.biomedcentral.com/collections/annualupdate2016. Further information about the Annual Update in Intensive Care and Emergency Medicine is available from http://www.springer.com/series/8901.

    Taguchi Loss Function for Varus/Valgus Alignment in Total Knee Arthroplasty

    Methods of designing equipment to improve quality have been developed by Taguchi. A key feature of these methods is the development of a loss function, which quantifies the financial cost (loss) resulting from deviations from target dimensions. Total knee arthroplasties can fail due to prosthetic component malalignment. A Taguchi loss function relating varus/valgus alignment of the prosthesis to revision rates was developed. Six studies were identified from a comprehensive literature search. Varus and extreme valgus alignments correlated with an increased percentage of prosthetic failure. A loss function of $L(y) = 326.80y^2$, where $y$ was the deviation from the ideal varus/valgus angle, was determined. The expected loss function was $E[L] = 326.80(\bar{y}^2 + s^2)$, where $\bar{y}$ was the mean deviation from the ideal varus/valgus angle and $s^2$ was the variance in varus/valgus angle. This loss function was used to estimate the cost savings of using computer-assisted surgical navigation in total knee arthroplasty (TKA). The average savings of a navigated TKA versus a conventional TKA, based on the expected loss equation derived from the Taguchi loss function, was $2,304 per knee. The expected loss function derived here can serve as a tool for biomedical engineers seeking to use Taguchi quality engineering methods in designing orthopaedic devices.
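    A worked example of the expected-loss formula in Python; only the coefficient 326.80 comes from the abstract, while the alignment statistics below are invented to show how the per-knee savings comparison works.

```python
# Expected Taguchi loss: E[L] = 326.80 * (ybar^2 + s^2), in dollars,
# with ybar the mean deviation from the ideal varus/valgus angle (degrees)
# and s the standard deviation. The coefficient is from the abstract;
# the alignment statistics are hypothetical.
K = 326.80  # dollars per squared degree of deviation

def expected_loss(ybar_deg, s_deg):
    """Expected loss given mean deviation ybar and standard deviation s."""
    return K * (ybar_deg**2 + s_deg**2)

conventional = expected_loss(ybar_deg=1.0, s_deg=3.0)  # hypothetical conventional TKA
navigated = expected_loss(ybar_deg=0.5, s_deg=1.5)     # hypothetical navigated TKA
print(f"estimated savings per knee: ${conventional - navigated:,.0f}")
```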

    Resource-efficient high-dimensional subspace teleportation with a quantum autoencoder.

    Quantum autoencoders serve as an efficient means of quantum data compression. Here, we propose and demonstrate their use to reduce the resource costs of quantum teleportation of subspaces in high-dimensional systems. We use a quantum autoencoder in a compress-teleport-decompress manner and report the first demonstration with qutrits, using an integrated photonic platform for future scalability. The key strategy is to compress the dimensionality of input states by erasing redundant information and to recover the initial states after chip-to-chip teleportation. Unsupervised machine learning is applied to train the on-chip autoencoder, enabling the compression and teleportation of any state from a high-dimensional subspace. Unknown states are decompressed at high fidelity (~0.971), yielding a total teleportation fidelity of ~0.894. Subspace encodings hold great potential as they support enhanced noise robustness and increased coherence. Laying the groundwork for machine learning techniques in quantum systems, our scheme opens previously unidentified paths toward high-dimensional quantum computing and networking.
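    As a classical toy model of the compress-teleport-decompress pipeline (not the photonic experiment), the following NumPy sketch compresses a qutrit state known to lie in a 2-dimensional subspace down to qubit size and recovers it through an ideal channel; the subspace basis is an assumption for illustration.

```python
import numpy as np

# A qutrit state confined to a known 2-d subspace is compressed to a qubit
# by an isometry V, "teleported" (here: passed through unchanged), then
# decompressed with V^dagger. The subspace basis below is hypothetical.
basis = np.array([[1, 0, 0],
                  [0, 1, 0]], dtype=complex)  # rows span the encoded subspace
V = basis                                     # 2x3 isometry: qutrit -> qubit

rng = np.random.default_rng(0)
amps = rng.normal(size=2) + 1j * rng.normal(size=2)
psi = basis.T @ amps                          # random qutrit state in the subspace
psi /= np.linalg.norm(psi)

compressed = V @ psi                          # qubit-sized representation
recovered = V.conj().T @ compressed           # decompress after ideal teleportation

fidelity = abs(np.vdot(psi, recovered))**2
print(f"reconstruction fidelity: {fidelity:.3f}")  # 1.000 for an ideal channel
```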
