
    In Vivo Cloning of Up to 16 kb Plasmids in E. coli Is as Simple as PCR

    The precise assembly of defined DNA sequences into plasmids is an essential task in bioscience research. While a number of molecular cloning techniques have been developed, many methods require specialized, expensive reagents or laborious experimental procedures. Not surprisingly, conventional cloning techniques based on restriction digestion and ligation are still commonly used in routine DNA cloning. Here, we describe a simple, fast, and economical cloning method based on RecA- and RecET-independent in vivo recombination of DNA fragments with overlapping ends in E. coli. All DNA fragments were prepared by two consecutive PCRs with Q5 DNA polymerase and used directly for transformation, resulting in 95% cloning accuracy and zero background from parental template plasmids. Quantitative relationships were established between cloning efficiency and three factors (the length of the overlapping ends, the number of DNA fragments, and the size of the target plasmid), which can provide general guidance for selecting in vivo cloning parameters. The method may be used to accurately assemble up to 5 DNA fragments with 25 nt overlapping ends into relatively small plasmids, and 3 DNA fragments into plasmids up to 16 kb in size. The whole cloning procedure may be completed within 2 days by a researcher with little training in cloning. The combination of high accuracy and zero background eliminates the need to screen a large number of colonies. The method requires no enzymes other than Q5 DNA polymerase, has no sequence restrictions, is highly reliable, and represents one of the simplest, fastest, and cheapest cloning techniques available. Our method is particularly suitable for common cloning tasks in the lab where the primary goal is to quickly generate a plasmid with a pre-defined sequence at low cost.
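
    The reported limits translate into a simple design check. The sketch below is a hypothetical helper (not part of the published protocol); the function name is invented, and the thresholds are taken only from the numbers reported in the abstract above.

```python
# Hypothetical design check distilled from the abstract's reported limits;
# illustrative only, not part of the published protocol.

def check_in_vivo_cloning_design(n_fragments: int, overlap_nt: int,
                                 plasmid_kb: float) -> list[str]:
    """Return warnings for a planned assembly; an empty list means the
    design falls within the ranges the abstract reports as reliable."""
    warnings = []
    if overlap_nt < 25:
        warnings.append("overlaps shorter than 25 nt may lower efficiency")
    if n_fragments > 5:
        warnings.append("more than 5 fragments exceeds the reported limit")
    if plasmid_kb > 16:
        warnings.append("plasmids over 16 kb were not validated")
    if n_fragments > 3 and plasmid_kb > 10:
        # The abstract validates large (up to 16 kb) plasmids with at most
        # 3 fragments; the 10 kb boundary used here is an assumption.
        warnings.append("large plasmids were assembled from at most 3 fragments")
    return warnings

if __name__ == "__main__":
    for design in [(3, 25, 16.0), (5, 25, 4.0), (6, 20, 18.0)]:
        print(design, check_in_vivo_cloning_design(*design) or "OK")
```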

    The Informativeness of Text, the Deep Learning Approach

    This paper uses a deep learning natural language processing approach (Google's Bidirectional Encoder Representations from Transformers, hereafter BERT) to comprehensively summarize financial texts and examine their informativeness. First, we compare BERT's effectiveness in sentiment classification of financial texts with that of a finance-specific dictionary, naïve Bayes, and Word2Vec, a shallow machine learning approach. We find, first, that BERT outperforms all other approaches and, second, that pre-training BERT on financial texts further improves its performance. Using BERT, we show that conference call texts provide information to investors and that other, less accurate approaches underestimate the economic significance of textual informativeness by at least 25%. Last, textual sentiments summarized by BERT can predict future earnings and capital expenditure, after controlling for financial-statement-based determinants commonly used in finance and accounting research.
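
    For readers who want to try this kind of classification, below is a minimal sketch using the Hugging Face transformers library with the publicly available ProsusAI/finbert checkpoint (a BERT model pre-trained on financial text). It illustrates the general approach only; the paper's own fine-tuned models and conference-call data are not reproduced here.

```python
# Minimal financial sentiment classification with a public FinBERT checkpoint.
# Requires: pip install transformers torch
from transformers import pipeline

# ProsusAI/finbert outputs positive / negative / neutral labels.
classifier = pipeline("text-classification", model="ProsusAI/finbert")

sentences = [
    "The company raised its full-year revenue guidance.",
    "Margins contracted sharply on higher input costs.",
]
for s in sentences:
    result = classifier(s)[0]  # e.g. {'label': 'positive', 'score': 0.95}
    print(f"{result['label']:>8}  {result['score']:.2f}  {s}")
```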

    FinEntity: Entity-level Sentiment Classification for Financial Texts

    In the financial domain, conducting entity-level sentiment analysis is crucial for accurately assessing the sentiment directed toward a specific financial entity. To our knowledge, no publicly available dataset currently exists for this purpose. In this work, we introduce an entity-level sentiment classification dataset, called FinEntity, that annotates financial entity spans and their sentiment (positive, neutral, and negative) in financial news. We document the dataset construction process in the paper. Additionally, we benchmark several pre-trained models (BERT, FinBERT, etc.) and ChatGPT on entity-level sentiment classification. In a case study, we demonstrate the practical utility of FinEntity for monitoring cryptocurrency markets. The data and code of FinEntity are available at https://github.com/yixuantt/FinEntity. (EMNLP'23 Main Conference Short Paper)
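
    Entity-level sentiment of this kind is naturally framed as token classification, with labels that jointly encode span and sentiment (e.g. B-Positive, I-Negative). The sketch below uses a public general-purpose NER model purely as a stand-in; a model fine-tuned on FinEntity would return sentiment-bearing entity groups instead of ORG/PER tags.

```python
# Token-classification sketch for entity-level tagging; dslim/bert-base-NER
# is a public general NER model used here only as a stand-in for a model
# fine-tuned on FinEntity-style labels.
from transformers import pipeline

ner = pipeline("token-classification", model="dslim/bert-base-NER",
               aggregation_strategy="simple")  # merge word pieces into spans

text = "Tesla shares surged after earnings, while Rivian fell on weak guidance."
for ent in ner(text):
    # With a FinEntity-style model, entity_group would carry the sentiment,
    # e.g. Positive for Tesla and Negative for Rivian in this sentence.
    print(ent["entity_group"], ent["word"], round(float(ent["score"]), 2))
```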

    MBA: A market-based approach to data allocation and migration for cloud database

    With the coming shift to cloud computing, cloud databases are emerging to provide database services over the Internet. In a cloud-based environment, data are distributed at Internet scale, and the system needs to handle a huge number of user queries simultaneously without delay. How data are distributed among the servers has a crucial impact on the query load distribution and the system response time. In this paper, we propose a market-based control method, called MBA, to achieve query load balance via reasonable data distribution. In MBA, database nodes are treated as traders in a market, and certain market rules are used to intelligently decide data allocation and migration. We built a prototype system and conducted extensive experiments. Experimental results show that the MBA method significantly improves system performance in terms of average query response time and fairness.
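
    The abstract does not spell out the market rules, so the toy below is only a guess at the general shape of such a scheme: each node prices itself by its current query load, and hot partitions migrate from the most expensive node to the cheapest one while a trade still narrows the load gap. All names and rules here are illustrative assumptions, not MBA's actual algorithm.

```python
# Toy market-based data migration: nodes act as traders, load acts as price.
# A hedged illustration only, not the MBA algorithm from the paper.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    partitions: dict[str, float] = field(default_factory=dict)  # partition -> load

    @property
    def load(self) -> float:
        return sum(self.partitions.values())

def migrate_once(nodes: list[Node]) -> bool:
    """One round of trading: the most loaded node offers its hottest
    partition to the least loaded node if that narrows the imbalance."""
    seller = max(nodes, key=lambda n: n.load)
    buyer = min(nodes, key=lambda n: n.load)
    if not seller.partitions:
        return False
    part, load = max(seller.partitions.items(), key=lambda kv: kv[1])
    if buyer.load + load >= seller.load:
        return False  # the trade would merely move the hotspot
    buyer.partitions[part] = seller.partitions.pop(part)
    return True

if __name__ == "__main__":
    nodes = [Node("A", {"p1": 40, "p2": 30, "p3": 20}),
             Node("B", {"p4": 10}), Node("C")]
    while migrate_once(nodes):
        pass
    for n in nodes:
        print(n.name, n.load, sorted(n.partitions))
```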

    Robust Digital-Twin Localization via An RGBD-based Transformer Network and A Comprehensive Evaluation on a Mobile Dataset

    The potential of digital-twin technology, which involves creating precise digital replicas of physical objects, to reshape AR experiences in 3D object tracking and localization scenarios is significant. However, enabling robust 3D object tracking in dynamic mobile AR environments remains a formidable challenge. These scenarios often require a more robust pose estimator capable of handling the inherent sensor-level measurement noise. In this paper, recognizing the lack of comprehensive solutions in the existing literature, we propose a transformer-based 6DoF pose estimator designed to achieve state-of-the-art accuracy on real-world noisy data. To systematically validate the new solution's performance against the prior art, we also introduce a novel RGBD dataset called Digital Twin Tracking Dataset v2 (DTTD2), which focuses on digital-twin object tracking scenarios. Expanded from the existing DTTD v1 (DTTD1), the new dataset adds digital-twin data captured with a cutting-edge mobile RGBD sensor suite on an Apple iPhone 14 Pro, extending the applicability of our approach to iPhone sensor data. Through extensive experimentation and in-depth analysis, we illustrate the effectiveness of our methods under significant depth data errors, surpassing the performance of existing baselines. Code and dataset are made publicly available at: https://github.com/augcog/DTTD
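
    As a rough illustration of what a transformer-based 6DoF pose regressor can look like, the PyTorch sketch below encodes per-point RGB-D features with a transformer encoder and regresses a translation vector plus a unit quaternion. It is a minimal sketch under stated assumptions, not the paper's architecture, input encoding, or training losses.

```python
# Minimal transformer-based 6DoF pose regressor (illustrative only).
# Requires: pip install torch
import torch
import torch.nn as nn

class PoseTransformer(nn.Module):
    def __init__(self, in_dim: int = 6, d_model: int = 64, n_layers: int = 2):
        super().__init__()
        self.embed = nn.Linear(in_dim, d_model)   # per-point XYZ + RGB features
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 7)         # 3 translation + 4 quaternion

    def forward(self, points: torch.Tensor):
        # points: (batch, n_points, 6) -- XYZ from depth plus per-point RGB
        tokens = self.encoder(self.embed(points))
        pooled = tokens.mean(dim=1)                # simple global average pooling
        out = self.head(pooled)
        t, q = out[:, :3], out[:, 3:]
        q = q / (q.norm(dim=-1, keepdim=True) + 1e-8)  # normalize quaternion
        return t, q

if __name__ == "__main__":
    model = PoseTransformer()
    t, q = model(torch.randn(2, 128, 6))           # 2 objects, 128 points each
    print(t.shape, q.shape)  # torch.Size([2, 3]) torch.Size([2, 4])
```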

    6-Chloro-3-nitro-N-(propan-2-yl)pyridin-2-amine

    There are two molecules in the asymmetric unit of the title compound, C8H10ClN3O2. Intramolecular N—H⋯O hydrogen bonds stabilize the molecular structure. There are no classical intermolecular hydrogen bonds in the crystal structure.

    Graphene-Based Nanostructures in Electrocatalytic Oxygen Reduction

    The application of graphene-type materials in electrocatalysis is a topic of growing scientific and technological interest. A tremendous amount of research has been carried out in the field of oxygen electroreduction, particularly with respect to potential applications in fuel cell research, including the use of graphene-type catalytic components. This work addresses fundamental aspects and potential applications of graphene structures in oxygen reduction electrocatalysis. Special attention is paid to the creation of catalytically active sites by using non-metallic heteroatoms as dopants, the formation of hierarchical nanostructured electrocatalysts, their long-term stability, and their application as supports for dispersed metals (activating interactions).

    Fine intervals are required when using point intercept transects to assess coral reef status

    The Point Intercept Transect (PIT) method has been commonly used in recent decades to estimate the status of coral reef benthic communities. It is a simple method that is efficiently performed underwater, as benthic components are recorded only as presence or absence at specific interval points along transects. PIT is therefore also popular in citizen science activities such as Reef Check programs. Longer intervals are commonly associated with longer transects, yet sampling interval length can significantly influence benthic coverage calculations. Despite this, the relative accuracy of longer versus shorter intervals for a given transect length has not been tested for PIT. In this study, we tested the optimum PIT intervals for several commonly used transect lengths by applying the bootstrap method to empirical data collected on tropical coral reefs and non-reefal coral communities. Our results recommend fine intervals of 10 cm or shorter, depending on the length of the transect, to increase the accuracy of estimating benthic community status on coral reefs. Permanent transects should also be considered in long-term monitoring programs to improve data quality.
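
    The effect of interval length on precision is easy to reproduce in simulation. The sketch below is a hedged illustration of the kind of bootstrap comparison the study describes: it builds a synthetic patchy transect, samples it by point intercept at several intervals, and bootstraps the cover estimate. The patch model and all numbers are illustrative assumptions, not the study's data.

```python
# Bootstrap comparison of point-intercept interval lengths on a synthetic
# 20 m transect; patch sizes and cover are illustrative assumptions.
import random

random.seed(0)

# Build a 2000 cm transect as alternating coral / non-coral patches.
transect = []
while len(transect) < 2000:
    is_coral = random.random() < 0.35                # ~35% target coral cover
    transect += [is_coral] * random.randint(10, 80)  # patch sizes 10-80 cm
transect = transect[:2000]
true_cover = sum(transect) / len(transect)

for interval in (10, 25, 50, 100):               # sampling interval in cm
    points = transect[::interval]                # presence/absence at points
    estimate = sum(points) / len(points)
    # Bootstrap: resample the point records with replacement 1000 times.
    boots = sorted(sum(random.choices(points, k=len(points))) / len(points)
                   for _ in range(1000))
    lo, hi = boots[25], boots[974]               # ~95% percentile interval
    print(f"{interval:>3} cm: estimate={estimate:.2f} "
          f"95% CI=({lo:.2f}, {hi:.2f})  true={true_cover:.2f}")
```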