970 research outputs found

    Editorial: Ferid Murad, at 80: A legacy of science, medicine, and mentorship


    Opinion mining summarization and automation process: a survey

    In this modern age, the internet is a powerful source of information. Roughly one-third of the world's population spends a significant amount of time and money surfing the internet. People draw vast amounts of information from it in every field of life, such as learning, amusement, communication and shopping. To this end, users visit websites and provide remarks or views on a product, service or event based on their experience, which may be useful to other users. In this manner, a huge amount of feedback in the form of textual data accumulates on these websites, and this data can be explored, evaluated and exploited for decision making. Opinion Mining (OM) is a branch of Natural Language Processing (NLP) that extracts the theme or idea behind users' opinions and classifies them as positive, negative or neutral. Researchers therefore try to present this information in the form of a summary that is useful to different users. The research community has worked on automatic summarization from the 1950s until now, and these automation processes fall into two categories: abstractive and extractive methods. This paper presents an overview of useful methods in OM and explains OM in relation to summarization and its automation process.
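
    To make the survey's two ingredients concrete, below is a minimal sketch pairing a lexicon-based polarity check (positive/negative/neutral) with a frequency-scored extractive summary. The lexicon, the sample review and all function names are illustrative assumptions, not material from the paper.

```python
# Minimal illustrative sketch (not from the paper): lexicon-based polarity
# tagging plus a frequency-scored extractive summary. The lexicon and the
# review text below are made up for demonstration.
from collections import Counter
import re

POSITIVE = {"good", "great", "excellent", "love", "fast"}
NEGATIVE = {"bad", "poor", "slow", "hate", "broken"}

def polarity(text: str) -> str:
    words = re.findall(r"[a-z']+", text.lower())
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def extractive_summary(text: str, k: int = 2) -> str:
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    # Score each sentence by the total frequency of its words; keep the top-k
    # sentences in their original order.
    ranked = sorted(sentences,
                    key=lambda s: -sum(freq[w] for w in re.findall(r"[a-z']+", s.lower())))
    keep = set(ranked[:k])
    return " ".join(s for s in sentences if s in keep)

review = ("The battery life is great and the screen is excellent. "
          "Shipping was slow. Overall I love this phone.")
print(polarity(review), "|", extractive_summary(review, k=2))
```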

    Which Pairs of Stocks should we Trade? Selection of Pairs for Statistical Arbitrage and Pairs Trading in Karachi Stock Exchange

    Pairs trading refers to a statistical arbitrage approach devised to profit from short-term deviations of two stocks from their long-run equilibrium. In this study a technique has been designed for selecting pairs for a pairs trading strategy. The Engle-Granger 2-step cointegration approach has been applied to identify the trading pairs. The data employed in this study comprise daily stock prices from the Commercial Banks and Financial Services sectors. Restricted pairs were formed from the highly liquid log share price series of 22 commercial banks and 19 financial services companies listed on the Karachi Stock Exchange. The sample period extends from November 2, 2009 to June 28, 2013, giving 911 observations for each share price series in the study. Out of 231 pairs of commercial banks, 25 were found to be cointegrated, whereas 40 cointegrated pairs were identified among the 156 pairs formed in the Financial Services sector. The cointegration relationship was estimated by regressing one stock price series on another, with the order of the regression assessed through the Granger causality test. The mean-reverting residual of the cointegration regression is modelled through a Vector Error Correction Model (VECM) in order to assess the speed-of-adjustment coefficient for the statistical arbitrage opportunity. The findings of the study show that cointegrated stocks can be combined linearly into a long/short portfolio with stationary dynamics. Although the profitability of the strategy has not been assessed in this study, the VECM results for the residual series show significant deviations around the mean, which identify the statistical arbitrage opportunity and support the profitability of the pairs trading strategy. JEL classifications: C32, C53, G17. Keywords: Pairs Trading, Statistical Arbitrage, Engle-Granger 2-step Cointegration Approach, VECM
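
    A hedged sketch of the pair-screening step described above, using the Engle-Granger cointegration test from statsmodels on synthetic log-price series; the ticker names, the 5% significance cut-off and the data itself are illustrative assumptions, not the study's Karachi Stock Exchange sample.

```python
# Illustrative pair screening via the Engle-Granger test. The DataFrame of
# "daily log prices" below is synthetic; in practice it would hold the 911
# observations per series described in the abstract.
from itertools import combinations

import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(0)
common = np.cumsum(rng.normal(size=911))            # shared stochastic trend
log_prices = pd.DataFrame({
    "BANK_A": common + rng.normal(scale=0.5, size=911),
    "BANK_B": 0.8 * common + rng.normal(scale=0.5, size=911),
    "BANK_C": np.cumsum(rng.normal(size=911)),      # independent random walk
})

cointegrated_pairs = []
for a, b in combinations(log_prices.columns, 2):
    # Engle-Granger: regress one series on the other and test the residual
    # for stationarity (coint wraps both steps and returns a p-value).
    _, pvalue, _ = coint(log_prices[a], log_prices[b])
    if pvalue < 0.05:
        cointegrated_pairs.append((a, b, round(pvalue, 4)))

print(cointegrated_pairs)
```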

    Information extraction from semi and unstructured data sources: a systematic literature review

    Millions of structured, semi-structured and unstructured documents are produced around the globe on a daily basis. Sources of such documents include individuals as well as research publishers such as IEEE, Elsevier, Springer and Wiley, which publish scientific documents in enormous numbers. These documents are a huge resource of scientific knowledge for research communities and interested users around the world. However, due to their massive volume and varying document formats, search engines face problems in indexing such documents, making information retrieval inefficient, tedious and time consuming. Information extraction from such documents is among the hottest areas of research in data/text mining. As the number of such documents increases tremendously, more sophisticated information extraction techniques are necessary. This research focuses on reviewing and summarizing existing state-of-the-art techniques in information extraction to highlight their limitations. Consequently, the research gap is formulated for researchers in the information extraction domain.

    Oil Price Flux and Macroeconomy of Oil Exporters

    Oil is an important energy source and embodies the largest commodity market in the world. Global economic performance has been highly correlated with oil price changes. The study considered 10 major oil exporters to measure the effect of oil price changes on their economies, using the variables Oil Prices (OP), Inflation (CPI), GDP deflator, Lending Interest Rate (IR), Real Interest Rate (RIR), Official Exchange Rate (EX) and Net Domestic Credit (LDU). By applying the Johansen cointegration technique, the long-run relationship among the variables has been analysed over the period from 1970 to 2019. In order to find the short-run relationship, the Error Correction Model (ECM) technique is used. The study affirmed that there exists a strong relationship between the economic variables and oil price fluctuations; however, the intensity of the relationship varies. Oil prices are associated with the GDP deflator and RIR more significantly than with the other variables. Moreover, it is suggested that each country should shape its own economic policy in response to price changes instead of following other countries' trends.
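
    The Johansen-plus-ECM workflow the abstract describes might be sketched as follows with statsmodels; the synthetic series, the variable subset and the lag settings are assumptions for illustration rather than the study's actual 1970-2019 data.

```python
# Sketch: Johansen trace test for the cointegration rank, then a vector
# error-correction model whose alpha coefficients give the speed of
# adjustment toward the long-run relation. Data below is synthetic.
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, coint_johansen

rng = np.random.default_rng(1)
n = 50                                               # roughly annual data, 1970-2019
trend = np.cumsum(rng.normal(size=n))
data = pd.DataFrame({
    "OP": trend + rng.normal(scale=0.3, size=n),     # oil price
    "GDP_DEFL": 0.6 * trend + rng.normal(scale=0.3, size=n),
    "RIR": np.cumsum(rng.normal(size=n)),            # real interest rate
})

# Johansen trace test: how many long-run (cointegrating) relations exist?
jres = coint_johansen(data, det_order=0, k_ar_diff=1)
rank = int((jres.lr1 > jres.cvt[:, 1]).sum())        # compare trace stats to 5% critical values
print("cointegration rank:", rank)

# Error-correction model: short-run dynamics plus adjustment to the long run.
if rank > 0:
    vecm = VECM(data, k_ar_diff=1, coint_rank=rank, deterministic="ci").fit()
    print(vecm.alpha)                                # speed-of-adjustment coefficients
```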

    Risk factors for prostate cancer: a case-control study investigating selected key exposures and their interactions with predisposition genes

    Prostate cancer is the UK's number one male cancer. Evidence from epidemiological studies suggests only age, race and family history as established risk factors. Other factors, such as low-dose diagnostic radiation and surrogate hormone markers such as baldness, finger length pattern and acne, are hypothesised to have a potential role in the aetiology of prostate cancer. It is evident that genetics plays an important role in prostate cancer aetiology. This thesis focuses on both environmental and genetic factors. The environmental factors include selected surrogate hormone markers, medical diagnostic radiation procedures and family history of prostate cancer. The genetic part explores genetic polymorphisms that could have implications for interactions with the exposures studied. Single nucleotide polymorphisms (SNPs) involved in mechanistic pathways related to DNA repair genes and potential hormone marker genes were the main targets.

    Measuring the BDARX architecture by agent-oriented system: a case study

    Distributed systems are increasingly designed as multi-agent systems, which are helpful in building highly robust, complex industrial software. Recent cooperative distributed applications are openly accessible, dynamic and large scale. Nowadays, it hardly seems necessary to emphasise the potential of decentralised software solutions, because their main benefit lies in the distributed nature of information, resources and action. On the other hand, progress in multi-agent systems creates new challenges for traditional fault-tolerance methodologies, which typically rely on centralised and offline solutions. Research on multi-agent systems has gained attention for designing software that operates in distributed and open environments, such as the Internet. DARX (Dynamic Agent Replication eXtension) is one such architecture; it aims at building reliable software that is both flexible and scalable and provides adaptive fault tolerance through dynamic replication methodologies. The enhancement of DARX known as BDARX provides a dynamic solution to Byzantine faults for agent-based systems that embed DARX. The BDARX architecture improves the long-run fault-tolerance ability of multi-agent systems and strengthens the software against such arbitrary faults. BDARX provides Byzantine fault tolerance in DARX by creating replicas on both sides of the communicating agents, using a BFT protocol for agent systems, instead of replicating only the server end and assuming the client to be failure-free. This paper shows that the dynamic behaviour of agents prevents any discrimination between server and client replicas.
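
    As a toy illustration of the replication idea only (not the DARX/BDARX API), the snippet below masks one Byzantine replica by majority voting over 3f + 1 = 4 replicas; in the BDARX description the same masking is applied on the client side as well, rather than assuming a failure-free client.

```python
# Toy Byzantine fault masking: each replica answers independently and the
# caller accepts the majority value. Replica behaviour here is simulated.
from collections import Counter
import random

def majority(answers):
    """Return the value reported by a strict majority of replicas, if any."""
    value, count = Counter(answers).most_common(1)[0]
    return value if count > len(answers) // 2 else None

def replicated_call(request, replicas):
    # Faulty replicas may return arbitrary (Byzantine) values.
    return majority([replica(request) for replica in replicas])

def honest(request):          # correct replica behaviour
    return request * 2

def byzantine(_request):      # arbitrary fault
    return random.randint(-100, 100)

# f = 1 Byzantine replica tolerated with 3f + 1 = 4 replicas per replication group.
replicas = [honest, honest, honest, byzantine]
print(replicated_call(21, replicas))   # -> 42 despite one faulty replica
```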

    Quantum memory assisted entropic uncertainty and entanglement dynamics: Two qubits coupled with local fields and Ornstein Uhlenbeck noise

    Entropic uncertainty and entanglement are two distinct aspects of quantum mechanical processes. Entropies are used to estimate entropic uncertainty relations: the greater the entropy bound, the less effective the quantum operations and the entanglement are. In this regard, we analyse the entropic uncertainty, the entropic uncertainty lower bound and the concurrence dynamics of two non-interacting qubits. The exposure of the two qubits is studied in two different qubit-noise configurations, namely common qubit-noise and independent qubit-noise interactions. To include the noisy effects of the local external fields, a Gaussian Ornstein-Uhlenbeck process is considered. We show that a rise in entropic uncertainty gives rise to disentanglement in the two-qubit Werner-type state, and the two are directly proportional. Depending on the parameter settings and the number of environments coupled, different classical environments have varying capacities to induce entropic uncertainty and disentanglement in quantum systems. The entanglement is shown to be vulnerable to the current external fields; however, by employing the ideal parameter ranges we provide, prolonged entanglement retention can be achieved while preventing growth of the entropic uncertainty. Besides, we also analyse the intrinsic behaviour of the classical fields towards two-qubit entanglement, without any imperfection, with respect to different parameters.
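
    For reference, the standard textbook forms of the two quantities the abstract tracks (stated here as commonly given in the literature, not reproduced from the paper): the quantum-memory-assisted entropic uncertainty relation and the Wootters concurrence.

```latex
% X and Z are the observables measured on qubit A, B is the quantum memory,
% and c = \max_{i,j} |\langle x_i | z_j \rangle|^2 is the maximal overlap of
% their eigenbases; the right-hand side is the entropic uncertainty lower bound.
\begin{equation}
  S(X|B) + S(Z|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B)
\end{equation}
% Wootters concurrence of a two-qubit state \rho, where \lambda_1 \ge \dots \ge \lambda_4
% are the eigenvalues of \rho\,(\sigma_y\!\otimes\!\sigma_y)\,\rho^{*}\,(\sigma_y\!\otimes\!\sigma_y).
\begin{equation}
  C(\rho) = \max\bigl\{0,\ \sqrt{\lambda_1}-\sqrt{\lambda_2}-\sqrt{\lambda_3}-\sqrt{\lambda_4}\bigr\}
\end{equation}
```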

    The Effect of Median Based Estimators on CUSUM Chart

    The Cumulative Sum (CUSUM) chart has been used extensively to monitor mean shifts. It is highly sought after by practitioners and researchers in many areas of quality control due to its sensitivity in detecting small to moderate shifts. The normality assumption governs its ability to monitor the process mean; when the assumption is violated, the CUSUM chart typically loses its practical use. As normality is hard to achieve in practice, the usual CUSUM chart is often substituted with robust charts to provide more accurate results under slight deviations from normality. Thus, in this paper, we investigate the impact of using robust location estimators, namely the median and the Hodges-Lehmann estimator, on CUSUM performance. By pairing the location estimators with a robust scale estimator known as the median absolute deviation about the median (MADn), a duo of median-based CUSUM charts is attained. The performance of both charts is studied under the normal and contaminated normal distributions and evaluated using the average run length (ARL). While demonstrating average power to detect out-of-control situations, the in-control performance of both charts remains unaffected in the presence of outliers. This could very well be advantageous when the proposed charts are tested on a real data set in the future. A case in point is when the statistical tool is used to monitor changes in clinical variables for health care outcomes. By minimising false positives, a sound judgement can be made for any clinical decision.
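
    A minimal sketch of the median-based idea, assuming the target and scale in the usual CUSUM recursions are simply replaced by the in-control sample median and MADn; the reference value k and decision interval h below are common textbook defaults, not values taken from the paper.

```python
# Median/MADn-based CUSUM sketch: standardise observations with robust
# location (median) and scale (MADn) estimates from an in-control reference
# sample, then run the usual two-sided CUSUM recursions.
import numpy as np

def madn(x):
    """MADn: 1.4826 * median absolute deviation about the median."""
    return 1.4826 * np.median(np.abs(x - np.median(x)))

def median_cusum(x, reference, k=0.5, h=4.0):
    """Return the index of the first out-of-control signal, or None."""
    mu0, sigma0 = np.median(reference), madn(reference)
    z = (np.asarray(x) - mu0) / sigma0
    c_plus = c_minus = 0.0
    for i, zi in enumerate(z):
        c_plus = max(0.0, c_plus + zi - k)    # detects upward shifts
        c_minus = max(0.0, c_minus - zi - k)  # detects downward shifts
        if c_plus > h or c_minus > h:
            return i
    return None

rng = np.random.default_rng(2)
phase1 = rng.normal(0, 1, 200)                # in-control (reference) data
phase2 = rng.normal(0.75, 1, 60)              # small sustained mean shift
print(median_cusum(phase2, reference=phase1))
```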