
    Accuracy and stability analysis of path loss exponent measurement for localization in wireless sensor network

    In wireless sensor network localization, a path loss model is often used to convert between distance and received signal strength (RSS). The path loss exponent is one of the main environmental parameters of the path loss model, characterizing the rate of this conversion, so its accuracy directly influences the results of RSS-to-distance conversion. When distance must be estimated from an RSS value, a small error in the measured path loss exponent can lead to a large error in the conversion output. To improve localization results, approaches for measuring accurate parameters in different environments have become important, and different approaches provide different measurement stabilities depending on their performance and robustness. This paper presents four calibration approaches for measuring the path loss exponent, based on measurement arrangement and transmitter/receiver node allocation: one-line measurement, online-update spread-locations measurement, online-update small-to-big rectangular measurement, and online-update big-to-small rectangular measurement. The first two are general approaches; the last two are our newly proposed approaches. Based on our experiments, we compare the four approaches in terms of accuracy and stability. The results show that both online-update rectangular measurements have better measurement stability, and the online-update big-to-small rectangular measurement provides the best accuracy after convergence.
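As a minimal sketch of the RSS-to-distance conversion this abstract refers to, the standard log-distance path loss model can be inverted to estimate distance, which also makes the sensitivity to the path loss exponent easy to see numerically. All values below are illustrative, not from the paper:

```python
# Log-distance path loss model: RSS(d) = RSS(d0) - 10 * n * log10(d / d0),
# where n is the path loss exponent and d0 a reference distance.
# Inverting it gives distance from a measured RSS value.

def rss_to_distance(rss_dbm, rss_d0_dbm, n, d0=1.0):
    """Estimate distance (same unit as d0) from a received signal strength."""
    return d0 * 10 ** ((rss_d0_dbm - rss_dbm) / (10 * n))

# A small error in n produces a noticeably larger distance estimate:
d_true = rss_to_distance(-70, -40, n=3.0)  # assumed true exponent -> 10.0
d_est = rss_to_distance(-70, -40, n=2.8)   # slightly mismeasured exponent
```

With a 30 dB drop from the reference, lowering the exponent from 3.0 to 2.8 inflates the distance estimate from 10.0 to roughly 11.8, illustrating why accurate exponent calibration matters.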

    Towards An Integrated Effort For Managing IT Process Standards Implementation

    Adopting IT process standards has become a trend for IT organizations seeking to meet ad hoc informational needs and provide better business value. Because the environments of IT organizations themselves keep changing, one key to IT success lies not only in establishing professional IT functions but also in sustaining them. IT organizations face many kinds of process standards to implement across various IT functions, and although the implementations may differ across IT domains, from a management point of view they need not exist in isolation. This article highlights the possibility of an integrated effort to effectively manage the implementation of IT standards in an IT organization. Such shared management refers to an integrated institutionalization design, which provides a road map for all IT functions to systematically improve and sustain their implementation results. A case example demonstrates the proposed approach.

    IT Portfolio Investment Evaluation on E-Commerce Solution Alternatives

    Our study examines the group decision-making process and proposes a multi-criteria framework for e-commerce solution investment in information technology (IT) portfolios. First, evaluation criteria that fit the IT evaluation context are constructed. Second, the Fuzzy Analytic Hierarchy Process (FAHP) is employed to determine the weights of the decision criteria and the benefit score to the company. Third, the Fuzzy Multiple Criteria Decision-Making (FMCDM) approach is used to synthesize the team decision. Finally, an empirical case of five proposed portal solutions in a car manufacturing company is used to exemplify the approach.
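For illustration only, one common way to derive FAHP weights is Buckley's geometric-mean method over triangular fuzzy pairwise judgments; the abstract does not specify which FAHP variant the study uses, so this sketch is an assumption with made-up judgment values:

```python
import math

# Each pairwise judgment is a triangular fuzzy number (l, m, u).
# A 2-criteria comparison matrix with illustrative values:
matrix = [
    [(1, 1, 1), (2, 3, 4)],          # criterion 1 vs. 1, 1 vs. 2
    [(1/4, 1/3, 1/2), (1, 1, 1)],    # reciprocal judgments
]

def fuzzy_geometric_means(m):
    """Row-wise fuzzy geometric mean, component by component."""
    n = len(m)
    return [tuple(math.prod(row[j][k] for j in range(n)) ** (1 / n)
                  for k in range(3)) for row in m]

def fahp_weights(m):
    g = fuzzy_geometric_means(m)
    total = tuple(sum(gi[k] for gi in g) for k in range(3))
    # Fuzzy division reverses l and u: w_i = (l_i/U, m_i/M, u_i/L).
    fuzzy_w = [(gi[0] / total[2], gi[1] / total[1], gi[2] / total[0]) for gi in g]
    # Defuzzify by averaging (l, m, u), then normalize to sum to 1.
    crisp = [sum(w) / 3 for w in fuzzy_w]
    s = sum(crisp)
    return [c / s for c in crisp]

weights = fahp_weights(matrix)
```

The resulting crisp weights sum to one and rank criterion 1 above criterion 2, matching the direction of the fuzzy judgments.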

    MiniZero: Comparative Analysis of AlphaZero and MuZero on Go, Othello, and Atari Games

    This paper presents MiniZero, a zero-knowledge learning framework that supports four state-of-the-art algorithms: AlphaZero, MuZero, Gumbel AlphaZero, and Gumbel MuZero. While these algorithms have demonstrated super-human performance in many games, it remains unclear which among them is most suitable or efficient for specific tasks. Through MiniZero, we systematically evaluate the performance of each algorithm in two board games, 9x9 Go and 8x8 Othello, as well as 57 Atari games. For the two board games, using more simulations generally results in higher performance; however, the choice between AlphaZero and MuZero may differ based on game properties. For Atari games, both MuZero and Gumbel MuZero are worth considering. Since each game has unique characteristics, different algorithms and simulation budgets yield varying results. In addition, we introduce an approach, called progressive simulation, which progressively increases the simulation budget during training to allocate computation more efficiently. Our empirical results demonstrate that progressive simulation achieves significantly superior performance in the two board games. By making our framework and trained models publicly available, this paper contributes a benchmark for future research on zero-knowledge learning algorithms, assisting researchers in algorithm selection and comparison against these zero-knowledge learning baselines. Our code and data are available at https://rlg.iis.sinica.edu.tw/papers/minizero.
    Comment: Submitted to IEEE Transactions on Games, under review.
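As a hedged illustration of the progressive simulation idea (growing the MCTS simulation budget as training advances rather than keeping it fixed), a simple linear ramp might look like the following. The schedule shape, parameter names, and budget range are assumptions for illustration, not MiniZero's actual implementation:

```python
# Progressive simulation sketch: the per-move MCTS simulation budget grows
# linearly from a cheap early-training budget to a full late-training budget.
# start/end values here are illustrative, not taken from the paper.

def simulation_budget(step, total_steps, start=16, end=400):
    """Return the simulation count for a given training step."""
    frac = min(step / total_steps, 1.0)
    return round(start + frac * (end - start))

# Budget at the beginning, middle, and end of a 100-step training run:
budgets = [simulation_budget(s, 100) for s in (0, 50, 100)]
```

The intuition is that early self-play games are played by a weak network, so spending a large search budget on them wastes computation; the budget is better spent once the network is stronger.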

    Drastic population fluctuations explain the rapid extinction of the passenger pigeon

    To assess the role of human disturbances in species' extinction requires an understanding of the species' population history before human impact. The passenger pigeon was once the most abundant bird in the world, with a population size estimated at 3-5 billion in the 1800s; its abrupt extinction in 1914 raises the question of how such an abundant bird could have been driven to extinction in mere decades. Although human exploitation is often blamed, the role of natural population dynamics in the passenger pigeon's extinction remains unexplored. Applying high-throughput sequencing technologies to obtain sequences from most of the genome, we calculated that the passenger pigeon's effective population size throughout the last million years was persistently about 1/10,000 of the estimated number of individuals in the 1800s, a ratio 1,000 times lower than typically found. This result suggests that the passenger pigeon was not always super-abundant but experienced dramatic population fluctuations, resembling those of an "outbreak" species. Ecological niche models supported inference of drastic changes in the extent of its breeding range over the last glacial-interglacial cycle. An estimate of acorn-based carrying capacity during the past 21,000 y showed great year-to-year variation. Based on our results, we hypothesize that ecological conditions that dramatically reduced population size under natural conditions could have interacted with human exploitation in causing the passenger pigeon's rapid demise. Our study illustrates that even species as abundant as the passenger pigeon can be vulnerable to human threats if they are subject to dramatic population fluctuations, and provides a new perspective on the greatest human-caused extinction in recorded history.

    Genome-wide identification of specific oligonucleotides using artificial neural network and computational genomic analysis

    Background: Genome-wide identification of specific oligonucleotides (oligos) is a computationally intensive task and a requirement for designing microarray probes, primers, and siRNAs. An artificial neural network (ANN) is a machine learning technique that can effectively process complex, high-noise data. Here, ANNs are applied to process the unique subsequence distribution for prediction of specific oligos.
    Results: We present a novel and efficient algorithm, named the integration of ANN and BLAST (IAB) algorithm, to identify specific oligos. We establish the unique marker database for the human and rat gene index databases using a hash table algorithm. We then create the input vectors, via the unique marker database, to train and test the ANN. The trained ANN predicted the specific oligos with high efficiency, and these oligos were subsequently verified by BLAST. To improve the prediction performance, the ANN over-fitting issue was avoided by early stopping at the best observed error, and k-fold validation was also applied. The IAB algorithm was about 5.2, 7.1, and 6.7 times faster than the BLAST search without ANN for 70-mer, 50-mer, and 25-mer specific oligos, respectively. In addition, polymerase chain reaction results showed that primers predicted by the IAB algorithm could specifically amplify the corresponding genes. The IAB algorithm has been integrated into a previously published comprehensive web server to support microarray analysis and genome-wide iterative enrichment analysis, through which users can identify a group of desired genes and then discover the specific oligos of these genes.
    Conclusion: The IAB algorithm has been developed to construct SpecificDB, a web server that provides a specific and valid oligo database for probe, siRNA, and primer design for the human genome. We also demonstrate the ability of the IAB algorithm to predict specific oligos through polymerase chain reaction experiments. SpecificDB provides comprehensive information and a user-friendly interface.
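The hash-table construction behind a "unique marker database" can be sketched as follows: count every length-k subsequence (k-mer) across all sequences and keep those occurring exactly once. The k-mer length and toy sequences below are illustrative, not the paper's actual gene-index data:

```python
from collections import defaultdict

def unique_kmers(sequences, k):
    """Return the set of k-mers that occur exactly once across all sequences."""
    counts = defaultdict(int)  # hash table: k-mer -> occurrence count
    for seq in sequences:
        for i in range(len(seq) - k + 1):
            counts[seq[i:i + k]] += 1
    return {kmer for kmer, c in counts.items() if c == 1}

# "CGT" appears in both toy sequences, so it is excluded as a marker:
markers = unique_kmers(["ACGTT", "CGTAC"], k=3)
```

Unique k-mers of this kind can then be summarized into the input vectors the abstract describes, with BLAST used only to verify the ANN's positive predictions, which is where the reported speedup comes from.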

    Prevalence of PIK3CA mutations in Taiwanese patients with breast cancer: a retrospective next-generation sequencing database analysis

    Background: Breast cancer is the most common cancer type affecting women. In hormone receptor-positive (HR+), human epidermal growth factor receptor 2-negative (HER2-) advanced breast cancer (ABC), phosphatidylinositol-4,5-bisphosphate 3-kinase catalytic subunit alpha (PIK3CA) is the most frequently mutated gene and is associated with poor prognosis. This study evaluated the frequency of PIK3CA mutations in the Taiwanese breast cancer population.
    Methodology: This is a retrospective study; patient data were collected for 2 years from a next-generation sequencing database linked to electronic health records (EHRs). The primary endpoint was the regional prevalence of PIK3CA mutation. The secondary endpoints were the mutation types across breast cancer subtype and menopausal status, and the time to treatment failure after everolimus (an mTOR inhibitor) or cyclin-dependent kinase 4/6 (CDK4/6) inhibitor treatment.
    Results: PIK3CA mutations were identified in 278 of 728 patients (38%). PIK3CA mutations were reported in 43% of patients with the HR-/HER2+ subtype and 42% of HR+/HER2- postmenopausal patients. A lower prevalence of PIK3CA mutations was observed in triple-negative (27%) and HR+/HER2- premenopausal patients (29%). The most common mutation was at exon 20 (H1047R mutation, 41.6%), followed by exon 9 (E545K mutation, 18.9%; E542K mutation, 10.3%). Among patients treated with CDK4/6 inhibitors, the median time to treatment failure was 12 months (95% CI: 7-21 months) in the PIK3CA mutation cohort and 16 months (95% CI: 11-23 months) in the PIK3CA wild-type cohort, whereas patients receiving an mTOR inhibitor reported a median time to treatment failure of 20.5 months (95% CI: 8-33 months) in the PIK3CA mutation cohort and 6 months (95% CI: 2-9 months) in the PIK3CA wild-type cohort.
    Conclusion: A high frequency of PIK3CA mutations was detected in Taiwanese patients with breast cancer, consistent with previous studies. Early detection of PIK3CA mutations might influence therapeutic decisions, leading to better treatment outcomes.