
    New Statistical Algorithms for the Analysis of Mass Spectrometry Time-Of-Flight Mass Data with Applications in Clinical Diagnostics

    Mass spectrometry (MS) based techniques have emerged as a standard for large-scale protein analysis. The ongoing progress in terms of more sensitive machines and improved data analysis algorithms has led to a constant expansion of its fields of application. Recently, MS was introduced into clinical proteomics with the prospect of early disease detection using proteomic pattern matching. Analyzing biological samples (e.g. blood) by mass spectrometry generates mass spectra that represent the components (molecules) contained in a sample as masses and their respective relative concentrations. In this work, we are interested in those components that are constant within a group of individuals but differ strongly between individuals of two distinct groups. These distinguishing components, which depend on a particular medical condition, are generally called biomarkers. Since not all biomarkers found by the algorithms are of equal (discriminating) quality, we are only interested in a small biomarker subset that, as a combination, can be used as a fingerprint for a disease. Once a fingerprint for a particular disease (or medical condition) is identified, it can be used in clinical diagnostics to classify unknown spectra. In this thesis we have developed new algorithms for the automatic extraction of disease-specific fingerprints from mass spectrometry data. Special emphasis has been put on designing highly sensitive methods with respect to signal detection. Thanks to our statistically based approach, our methods are able to detect signals, such as those of hormones, even below the noise level inherent in data acquired by common MS machines. To give collaborating groups access to these new classes of algorithms, we have created a web-based analysis platform that provides all necessary interfaces for data transfer, data analysis and result inspection. To prove the platform's practical relevance, it has been utilized in several clinical studies, two of which are presented in this thesis. These studies showed that our platform is superior to commercial systems with respect to fingerprint identification. As an outcome of these studies, several fingerprints for different cancer types (bladder, kidney, testicle, pancreas, colon and thyroid) have been detected and validated. The clinical partners in fact emphasize that these results would be impossible with a less sensitive analysis tool (such as the currently available systems). In addition to the issue of reliably finding and handling signals in noise, we faced the problem of handling very large amounts of data, since an average dataset for a single individual is about 2.5 gigabytes in size and we have data from hundreds to thousands of persons. To cope with these large datasets, we developed a new framework for a heterogeneous (quasi) ad-hoc Grid - an infrastructure that allows the integration of thousands of computing resources (e.g. desktop computers, computing clusters or specialized hardware, such as IBM's Cell processor in a PlayStation 3).
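    The abstract does not spell out the statistical machinery, but the core idea - a signal weaker than the per-spectrum noise becomes detectable once evidence from many spectra is combined - can be illustrated with a minimal, hypothetical sketch. The simulated spectra, the peak position and the per-bin t-test below are illustrative assumptions, not the algorithms of the thesis:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n_spectra, n_bins = 500, 1000
        peak_bin, peak_height, noise_sd = 500, 0.3, 1.0  # peak well below noise

        # Simulated spectra: baseline noise, plus a weak peak in the disease group.
        control = rng.normal(0.0, noise_sd, size=(n_spectra, n_bins))
        disease = rng.normal(0.0, noise_sd, size=(n_spectra, n_bins))
        disease[:, peak_bin] += peak_height

        # Per-bin two-sample t-test across groups; pooling many spectra shrinks
        # the noise roughly by sqrt(n) and exposes the sub-noise peak.
        t, p = stats.ttest_ind(disease, control, axis=0)
        candidates = np.where(p < 0.05 / n_bins)[0]  # Bonferroni-corrected
        print("candidate biomarker bins:", candidates)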

    Analysis of Clickstream Data

    This thesis is concerned with providing further statistical development in the area of web usage analysis to explore web browsing behaviour patterns. We received two data sources: web log files and operational data files for the websites, which contained information on online purchases. There are many research questions regarding web browsing behaviour. Specifically, we focused on the depth-of-visit metric and implemented an exploratory analysis of this feature using clickstream data. Due to the large volume of data available in this context, we chose to present effect size measures along with all statistical analyses of the data. We introduced two new robust measures of effect size for two-sample comparison studies in non-normal situations, specifically where the difference between two populations is due to the shape parameter. The proposed effect sizes perform adequately for non-normal data, as well as when two distributions differ in their shape parameters. We then focus on conversion analysis, investigating the causal relationship between general clickstream information and online purchasing using a logistic regression approach. The aim is to find a classifier by assigning the probability of the event of online shopping on an e-commerce website. We also develop the application of a mixture of hidden Markov models (MixHMM) to model web browsing behaviour using sequences of web pages viewed by users of an e-commerce website. The mixture of hidden Markov models is fitted in a Bayesian framework using Gibbs sampling. We address the slow mixing problem of Gibbs sampling in high-dimensional models, and use over-relaxed Gibbs sampling, as well as a forward-backward EM algorithm, to obtain an adequate sample from the posterior distributions of the parameters. The MixHMM offers the advantage of clustering users based on their browsing behaviour, and also gives an automatic classification of web pages based on the probability of a web page being viewed by visitors to the website.
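    As a rough illustration of the conversion-analysis step, the hypothetical sketch below fits a logistic regression mapping simple session features to a purchase probability. The feature names, coefficients and data are invented stand-ins, not quantities from the thesis:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n = 5000
        # Invented session features: pages viewed (depth of visit) and duration.
        depth = rng.poisson(5, n)
        minutes = rng.exponential(3.0, n)
        # Simulated ground truth: deeper, longer sessions convert more often.
        logit = -3.0 + 0.3 * depth + 0.2 * minutes
        purchased = rng.random(n) < 1 / (1 + np.exp(-logit))

        X = np.column_stack([depth, minutes])
        model = LogisticRegression().fit(X, purchased)
        # Estimated purchase probability for a 10-page, 5-minute session.
        print(model.predict_proba([[10, 5.0]])[0, 1])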

    Measuring Consumers' Attachment to Geographical Indications: Implications for Competition Policy

    Geographical Indications (GIs) are considered upmarket products because they are based on tradition and convey information about their geographical origin. On the other hand, the limitation of the geographical areas devoted to GIs and the exclusivity they enjoy over the product lead to suspicions of monopoly power. Quality and market power should, however, be reflected in a stronger attachment, making consumers less price sensitive than for standard goods. This research compares these conjectures with empirical measures for the French cheese market. Price elasticities are computed from a demand model on 21 products, 11 Protected Designation of Origin (PDO) products and 10 non-PDOs. The results are counterintuitive, PDOs being as price elastic as, or more price elastic than, standard products. This finding thus challenges the widespread idea that PDOs systematically correspond to high quality. It also has important implications in terms of competition policy, showing that suppliers of PDO cheeses cannot raise prices without suffering large reductions in demand.
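    The abstract does not detail the demand model; as a reminder of the standard approach, the minimal sketch below reads the own-price elasticity off as the slope of a log-log demand regression. The prices, quantities and the true elasticity of -1.4 are invented for illustration:

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(2)
        n = 300
        price = rng.uniform(5, 25, n)  # euros per kg, simulated
        # Simulated demand with a true own-price elasticity of -1.4.
        quantity = np.exp(6.0 - 1.4 * np.log(price) + rng.normal(0, 0.1, n))

        # In log Q = a + b log P, the slope b is the own-price elasticity:
        # a 1% price increase changes quantity demanded by roughly b%.
        fit = LinearRegression().fit(np.log(price).reshape(-1, 1), np.log(quantity))
        print("estimated elasticity:", fit.coef_[0])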

    Anomaly Detection through Adaptive DASO Optimization Techniques

    An intrusion detection system (IDS) detects and prevents network attacks. Due to the complicated network environment, the IDS must distil a high number of traffic samples into a small number of normal samples, resulting in inadequate samples for identification and training, and a high false detection rate. External malicious attacks also degrade conventional IDSs, which affects network activity. The Adaptive Dolphin Atom Search Optimization proposed here overcomes these problems. The aim of this work is thus to create an adaptive optimization-based network intrusion detection system that tunes the classifier for accurate prediction. The model performs feature selection and intrusion detection. In the feature selection module, mutual information selects the features for further processing. A deep recurrent neural network (deep RNN) detects the intrusions, and the novel adaptive Dolphin Atom Search Optimization (adaptive DASO) technique trains it. Adaptive DASO combines the DASO algorithm with adaptive concepts, where DASO itself is the integration of dolphin echolocation (DE) with atom search optimization (ASO). Intrusions are thus detected using the adaptive DASO-based deep RNN. The developed adaptive DASO approach attains better detection performance in terms of several parameters, such as specificity, accuracy, and sensitivity.
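    The adaptive DASO optimizer is specific to this paper, but the mutual-information feature selection step is standard. A minimal sketch using scikit-learn, with synthetic data standing in for network traffic features and an invented feature budget of 10, might look like:

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import SelectKBest, mutual_info_classif

        # Synthetic stand-in for network traffic features and attack labels.
        X, y = make_classification(n_samples=2000, n_features=40,
                                   n_informative=8, random_state=0)

        # Keep the 10 features sharing the most mutual information with the
        # label; only these would be passed on to the RNN-based detector.
        selector = SelectKBest(mutual_info_classif, k=10).fit(X, y)
        X_selected = selector.transform(X)
        print("selected feature indices:", np.flatnonzero(selector.get_support()))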

    Analysis of job scheduling algorithms for heterogeneous multiprocessor computing systems

    The problem of scheduling independent jobs on heterogeneous multiprocessor models (i.e., those with non-identical or uniform processors) with independent memories has been studied. Specifically, a number of demand-scheduling, nonpreemptive algorithms have been evaluated with respect to their mean flow time and completion time performance criteria. Deterministic analysis has been used to predict the worst-case performance, whereas simulation techniques have been applied to estimate the expected performance of the algorithms. From the deterministic analysis, informative worst-case bounds have been proven, from which the extreme behaviour of the considered algorithms can be well predicted. Moreover, when some or a combination of the system parameters are relaxed, our model reduces to versions which have already been studied (i.e. the classical homogeneous and heterogeneous models, or the homogeneous model with independent memories). For such cases, the bounds proven in this thesis either agree with, or are better and more informative than, the ones found for these simpler models. Finally, the analysis of the worst-case and expected performance results reveals a high degree of correlation between the behaviour of the algorithms as predicted by the deterministic bounds and as estimated by simulation.
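    The abstract does not name the individual algorithms; as a flavour of nonpreemptive list scheduling on uniform (speed-differing) processors, here is a minimal, hypothetical LPT-style scheduler. The job sizes and processor speeds are invented, and memory constraints are omitted:

        # Largest-Processing-Time list scheduling on uniform processors: each
        # job, longest first, goes to the processor that finishes it earliest.
        def lpt_schedule(jobs, speeds):
            finish = [0.0] * len(speeds)       # current finish time per processor
            assignment = []
            for job in sorted(jobs, reverse=True):
                p = min(range(len(speeds)),
                        key=lambda i: finish[i] + job / speeds[i])
                finish[p] += job / speeds[p]
                assignment.append((job, p))
            return assignment, max(finish)     # schedule and makespan

        jobs = [7, 3, 9, 2, 5, 8]              # invented processing requirements
        speeds = [1.0, 2.0]                    # processor 1 is twice as fast
        print(lpt_schedule(jobs, speeds))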

    Trust Management: A Cooperative Approach Using Game Theory

    Trust is defined as the willingness to accept risk and vulnerability based upon positive expectations of the intentions or behaviour of another. The qualities or behaviours of one person that create such positive expectations in another are referred to as trustworthiness. Because of its perceived link to cooperative behaviour, many social scientists regard trust as the backbone of effective social structures. With the advancement of technology, people can explore various products, services and facilities through online social media. Since the end users who want to communicate through these networks are usually physically unknown to each other, the evaluation of their trustworthiness is mandatory. Trust is not easily defined by mathematical methods and computational procedures, and it can be influenced by psychological and sociological factors, leaving end users vulnerable to a variety of risks. The need to define trust is growing as businesses try to establish effective marketing strategies through their social media activities, for which they must obtain consumer trust. Game theory is a theoretical framework for analysing strategic interactions between two or more individuals, called players in the terminology of game theory. Thus, a conceptual framework for trust evaluation can be designed using a game-theoretic approach that indicates the conditions under which trustworthy behaviour can be expected.
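    As an illustration of the kind of game-theoretic model alluded to here, the sketch below encodes the classic two-player trust game. The endowment, multiplier and return fractions are conventional textbook choices, not parameters from the paper:

        # Classic one-shot trust game: the investor sends part of an endowment,
        # the amount is multiplied in transit, and the trustee returns a share.
        def trust_game(endowment, sent, multiplier, return_fraction):
            transferred = sent * multiplier
            returned = transferred * return_fraction
            investor_payoff = endowment - sent + returned
            trustee_payoff = transferred - returned
            return investor_payoff, trustee_payoff

        # Full trust met by an even split leaves both better off than no trust:
        print(trust_game(10, 10, 3, 0.5))   # (15.0, 15.0)
        # With an untrustworthy trustee, the investor does worse than sending 0:
        print(trust_game(10, 10, 3, 0.0))   # (0.0, 30.0) -> rational distrust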

    A systematic review on Drug Re-profiling/Re-Purposing

    The capability of drug repurposing has allowed a growing population of patients with diverse diseases to access medications with known safety profiles. In the current pharmaceutical market, numerous drugs approved by the U.S. Food and Drug Administration have been repurposed. Developing a novel drug molecule and bringing it from the laboratory to the market requires a great deal of investment in terms of money, effort, and time. Repurposing a drug, on the other hand, holds the potential of delivering cures that are safe, readily available, and inexpensive. Sildenafil, chloroquine and metformin are examples of repurposed drugs used in multiple disease models. Despite numerous challenges, drug repurposing has become a core component of any comprehensive drug rediscovery strategy intended to benefit patients suffering from a wide variety of serious ailments. In this review, we discuss various repurposed drugs in numerous types of cancer, the deadly novel coronavirus (SARS-CoV-2) and some orphan diseases. This paper presents various examples of drugs that are still under clinical trial and have a high chance of being approved as repurposed drugs benefitting humankind.

    1993 OURE report, including the 3rd Annual UMR Undergraduate Research Symposium -- Entire Proceedings

    The Opportunities for Undergraduate Research Experience program began in 1990. This volume represents the proceedings of the third annual OURE program. The aims of the program are to enrich the learning process and make it more active, encourage interaction between students and faculty members, raise the level of research on the campus, help recruit superior students to the graduate program, and support the notion that teaching and research are compatible and mutually reinforcing. As the papers herein attest, the OURE program continues to achieve its goals -- UMR students have performed research on an enormous variety of topics, have worked closely with faculty members, and have experienced deeply both the pleasures and frustrations of research. Several of the undergraduates whose papers are included are now graduate students at UMR or elsewhere. The first section of this volume is made up of papers presented at the third annual UMR Undergraduate Research Symposium held on January 29, 1993