53 research outputs found

    ESG: Extended Similarity Group method for automated protein function prediction

    Get PDF
    We present here the Extended Similarity Group (ESG) method, which annotates query sequences with Gene Ontology (GO) terms by assigning a probability to each annotation, computed from iterative PSI-BLAST searches. Conventional sequence-homology-based function annotation methods, such as BLAST, retrieve function information from top hits with significant scores (E-values). In contrast, the PFP method, which we presented previously, goes a step further in utilizing a PSI-BLAST result by considering very weak hits, with E-values of up to 100, and by incorporating the functional association between GO terms (FAM matrix) computed from term co-occurrence frequencies in the UniProt database. PFP's success is evidenced by its top rank in the function prediction category of the CASP7 competition. Our new approach, the ESG method, further improves the accuracy of PFP by essentially applying PFP in an iterative fashion. An advantage of ESG is that it is built on a rigorous statistical framework: unlike the PFP method, which assigns a weighted score to each GO term, ESG assigns a probability based on weights computed from the E-value of each hit sequence on the path between the original query sequence and the current hit sequence.
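    The E-value-based weighting idea can be illustrated with a toy scorer. This is a simplified, single-pass sketch, not the actual ESG algorithm (which iterates PSI-BLAST and traces paths of hits); the hit list, the GO identifiers, and the weight formula below are illustrative assumptions.

```python
import math

def annotate_query(hits, epsilon=1e-180):
    """Toy single-level scorer inspired by the weighting idea above:
    each hit contributes its GO terms with a weight derived from its
    E-value, and weights are normalized into per-term probabilities.
    `hits` is a list of (e_value, go_terms) tuples -- a simplified
    stand-in for a PSI-BLAST result, NOT the full iterative ESG method."""
    scores = {}
    total = 0.0
    for e_value, go_terms in hits:
        # Very weak hits still contribute, but weakly; hits at the
        # E-value cutoff of 100 get zero weight and are skipped.
        weight = -math.log10(max(e_value, epsilon)) + 2.0
        if weight <= 0.0:
            continue
        total += weight
        for term in go_terms:
            scores[term] = scores.get(term, 0.0) + weight
    return {t: s / total for t, s in scores.items()} if total else {}

# A strong hit and a weak hit sharing one (hypothetical) GO term:
probs = annotate_query([
    (1e-50, ["GO:0003824", "GO:0008152"]),
    (5.0,   ["GO:0008152"]),
])
```

    A term supported by every hit receives probability 1.0; terms supported only by some hits receive proportionally less.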

    edcc: An R Package for the Economic Design of the Control Chart

    Get PDF
    The basic purpose of the economic design of control charts is to find the optimum control chart parameters that minimize the process cost. In this paper, an R package, edcc (economic design of control charts), which provides a numerical method to find the optimum chart parameters, is presented using the unified approach of economic design. Some examples are also given to illustrate how to use this package. The types of control chart available in the edcc package are X̄, CUSUM (cumulative sum), and EWMA (exponentially weighted moving average) control charts.
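    For readers unfamiliar with economic design, the optimization can be sketched as follows. The decision variables are the standard ones for an X̄ chart — sample size n, sampling interval h, and control-limit width k — but the cost function and its coefficients below are a made-up toy, not the cost model that edcc actually implements, and the sketch is in Python rather than R.

```python
import math

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def hourly_cost(n, h, k, delta=1.0, lam=0.05, a1=1.0, a2=0.1, a3=50.0, a4=100.0):
    """Toy expected cost per hour of running an X-bar chart with sample
    size n, sampling interval h (hours), and control limits at +/- k sigma.
    delta is the assumed mean shift (in sigma units), lam the shift rate
    per hour. Coefficients a1..a4 are illustrative, not a real cost model."""
    alpha = 2.0 * (1.0 - Phi(k))                          # false-alarm prob. per sample
    beta = Phi(k - delta * math.sqrt(n)) - Phi(-k - delta * math.sqrt(n))
    arl1 = 1.0 / (1.0 - beta)                             # avg. samples to detect shift
    return ((a1 + a2 * n) / h          # sampling cost
            + a3 * alpha / h           # cost of investigating false alarms
            + a4 * lam * h * arl1)     # penalty for time spent out of control

def optimize_chart():
    """Brute-force grid search for the cost-minimizing (n, h, k)."""
    grid = ((n, h / 4.0, 2.0 + j / 10.0)
            for n in range(1, 16)
            for h in range(1, 17)
            for j in range(0, 21))
    return min(grid, key=lambda p: hourly_cost(*p))

n_opt, h_opt, k_opt = optimize_chart()
```

    A real economic-design package replaces the grid search with a proper numerical optimizer and a validated cost model; the trade-off structure (sampling cost vs. false alarms vs. detection delay) is the same.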

    Calibration of second-order correlation functions for non-stationary sources with a multi-start multi-stop time-to-digital converter

    Full text link
    A novel high-throughput second-order-correlation measurement system is developed which records and makes use of all the arrival times of photons detected at both the start and stop detectors. This system is particularly suitable for a light source with a high photon flux and a long coherence time, since it is more efficient than conventional methods by an amount equal to the product of the count rate and the correlation time of the light source. We have used this system to carefully investigate the dead-time effects of detectors and photon counters on the second-order correlation function in the two-detector configuration. For a non-stationary light source, distortion of the original signal was observed at high photon flux. A systematic way of calibrating the second-order correlation function has been devised by introducing the concept of an effective dead time of the entire measurement system.
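    The multi-start multi-stop idea — correlating every recorded start time with every recorded stop time instead of only nearest pairs — can be sketched as below. This is a bare all-pairs coincidence counter with Poissonian normalization; the dead-time and non-stationarity calibration that is the paper's actual contribution is not modeled, and the function name and interface are assumptions.

```python
def g2_histogram(start_times, stop_times, tau_max, bin_width):
    """Estimate g2(tau) on [-tau_max, tau_max) from two lists of photon
    arrival times by histogramming ALL start-stop time differences
    (multi-start multi-stop), then normalizing each bin by the count
    expected for two uncorrelated Poissonian streams."""
    nbins = int(round(2.0 * tau_max / bin_width))
    hist = [0] * nbins
    for t1 in start_times:
        for t2 in stop_times:
            dt = t2 - t1
            if -tau_max <= dt < tau_max:
                hist[int((dt + tau_max) / bin_width)] += 1
    # Normalization: expected counts per bin for uncorrelated detections
    # at the measured rates over the measurement span.
    t_all = start_times + stop_times
    span = max(t_all) - min(t_all)
    rate1 = len(start_times) / span
    rate2 = len(stop_times) / span
    expected = rate1 * rate2 * bin_width * span
    return [c / expected for c in hist]

# Perfectly coincident clicks produce a single peak in the central bin:
g2 = g2_histogram([0.0, 1.0, 2.0], [0.0, 1.0, 2.0], tau_max=0.5, bin_width=0.25)
```

    Because every pair within the window is counted, no photon events are discarded — the source of the throughput gain the abstract describes.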

    Supplemental Materials: SEM Approach to the Mediation Analysis of the Two-Condition Within-Participant Design

    No full text
    Identification of indirect effects in the two-condition within-participant design.

    Optical imaging featuring both long working distance and high spatial resolution by correcting the aberration of a large aperture lens

    Get PDF
    High-resolution optical imaging within thick objects has been a challenging task due to the short working distance of conventional high-numerical-aperture (NA) objective lenses. Lenses with a large physical diameter, and thus a large aperture, such as microscope condenser lenses, can feature both a large NA and a long working distance. However, such lenses suffer from strong aberrations. To overcome this problem, we present a method to correct the aberrations of a transmission-mode imaging system composed of two condensers. The proposed method separately identifies and corrects aberrations of illumination and collection lenses of up to 1.2 NA by iteratively optimizing the total intensity of the synthetic aperture images in the forward and phase-conjugation processes. At a source wavelength of 785 nm, we demonstrated a spatial resolution of 372 nm at extremely long working distances of up to 1.6 mm, an order-of-magnitude improvement over conventional objective lenses. Our method of converting microscope condensers to high-quality objectives may facilitate increases in the imaging depths of super-resolution and expansion microscopes.

    Managerial Learning from Analyst Feedback to Voluntary Capex Guidance, Investment Efficiency, and Firm Performance

    Get PDF
    We test predictions that managers issuing voluntary capex guidance learn from analyst feedback and that this learning enhances investment efficiency and firm performance. Our findings are consistent with these predictions. First, we find that managers' capex adjustments and capex guidance revisions relate positively with analyst feedback, measured by differences between post-guidance analyst capex forecasts and managerial capex guidance. Second, changes in investment efficiency relate positively with analyst feedback. Third, subsequent firm financial performance relates positively with the predicted values of both managers' capex adjustments and capex guidance revisions. These findings extend prior evidence regarding sources of managerial learning and investment efficiency, and help to explain the active issuance of voluntary guidance by managers in settings where, as for capex guidance, the potential for managerial learning from related share price effects is limited, as we also explain.

    Development of a Resource Allocation Model Using Competitive Advantage

    No full text
    In general, during decision making or negotiations, the investor and the investee often hold different opinions, which can result in conflict. An objective standard to mitigate such conflicts should therefore be provided, since it is highly important that rational decisions be made when choosing among various investment options. However, the models currently in use suffer from several problems: the arbitrariness of the evaluator, the difficulty of understanding the relationships among the various investment options (that is, investment alternatives), inconsistency in priorities, and the tendency either to provide only selection criteria without detailing the proportion of investment in each option, or to evaluate a single investment option at a time without considering all options. Thus, in this research, we present a project selection model that enables reasonable resource allocation, or determination of return rates, by considering the core competencies of the various investment options. Here, core competency is based on both performance and the ability to create a competitive advantage. To this end, we deduce issue-specific structural power indicators and quantitatively analyze the resource allocation results based on negotiation power. This makes it possible to examine whether a project selection model considers core competencies by comparing it against several models currently in use. Furthermore, the proposed model can be used on its own or in combination with other methods. Consequently, the presented model can serve as a quantitative criterion for determining behavioral tactics, and can also be used to mitigate potential conflicts between an investor and an investee considering idiosyncratic investments, determined by an interplay between power and core competency.