
    Quantum computing for finance

    Quantum computers are expected to surpass the computational capabilities of classical computers and to have a transformative impact on numerous industry sectors. We present a comprehensive summary of the state of the art of quantum computing for financial applications, with particular emphasis on stochastic modeling, optimization, and machine learning. This Review is aimed at physicists, so it outlines the classical techniques used by the financial industry and discusses the potential advantages and limitations of quantum techniques. Finally, we look at the challenges that physicists could help tackle.

    TOWARDS A HOLISTIC RISK MODEL FOR SAFEGUARDING THE PHARMACEUTICAL SUPPLY CHAIN: CAPTURING THE HUMAN-INDUCED RISK TO DRUG QUALITY

    Counterfeit, adulterated, and misbranded medicines in the pharmaceutical supply chain (PSC) are a critical problem. Regulators charged with safeguarding the supply chain face shrinking resources for inspections while concurrently facing increasing demands posed by new drug products being manufactured at more sites in the US and abroad. To mitigate risk, the University of Kentucky (UK) Central Pharmacy Drug Quality Study (DQS) tests injectable drugs dispensed within the UK hospital. Using FT-NIR spectrometry coupled with machine learning techniques, the team identifies and flags potentially contaminated drugs for further testing and possible removal from the pharmacy. Teams like the DQS are always working with limited equipment, time, and staffing resources. Scanning every vial immediately before use is infeasible, and drugs must be prioritized for analysis. A risk scoring system coupled with batch sampling techniques is currently used in the DQS. However, a risk scoring system only tells the team about the risks to the PSC today; it does not predict what the risks will be in the future. To begin bridging this gap in predictive modeling capabilities, the authors assert that models must incorporate the human element. A sister project to the DQS, the Drug Quality Game (DQG), enables humans, with all of their unpredictability, to be inserted into a virtual PSC. The DQG approach was adopted as a means of capturing human creativity, imagination, and problem-solving skills. Current methods of prioritizing drug scans rely heavily on drug cost, sole-source status, warning letters, and equipment and material specifications. However, humans, not machines, commit fraud. Given that even one defective drug product could have catastrophic consequences, this project will improve risk-based modeling by equipping future models to identify and incorporate human-induced risks, expanding the overall landscape of risk-based modeling. This exploratory study tested the following hypotheses: (1) a useful game system able to simulate real-life humans and their actions in a pharmaceutical manufacturing process can be designed and deployed, (2) there are variables in the game that are predictive of human-induced risks to the PSC, and (3) the game can identify ways in which bad actors can “game the system” (GTS) to produce counterfeit, adulterated, and misbranded drugs. A commercial-off-the-shelf (COTS) game, BigPharma, was used as the basis of a game system able to simulate human subjects and their actions in a pharmaceutical manufacturing process. BigPharma was selected because it provides a low-cost, time-efficient virtual environment that captures the major elements of a pharmaceutical business: research, marketing, and manufacturing/processing. Running BigPharma with a Python shell enables researchers to implement specific GxP-related tasks (Good x Practice, where x = Manufacturing, Clinical, Research, etc.) not provided in the COTS BigPharma game. Results from players' interaction with the Python shell/BigPharma environment suggest that the game can identify both variables predictive of human-induced risks to the PSC and ways in which bad actors may GTS. For example, company profitability emerged as one variable predictive of successful GTS. Players' unethical in-game techniques matched well with observations seen within the DQS.
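
    The abstract does not give the DQS scoring formula, so the following Python sketch is only illustrative: the fields and weights mirror the prioritization factors listed above (drug cost, sole-source status, warning letters), but every name and number is a hypothetical placeholder, not the study's actual method.

```python
# Minimal illustrative sketch of a risk-based scan prioritization queue.
# The real DQS formula is not published in this abstract; the fields and
# weights below are hypothetical placeholders that only mirror the factors
# the abstract lists (cost, sole-source status, warning letters).
from dataclasses import dataclass


@dataclass
class DrugLot:
    name: str
    cost_per_vial: float   # higher cost has historically meant higher priority
    sole_source: bool      # single supplier -> greater supply-disruption risk
    warning_letters: int   # regulatory warning letters tied to the manufacturer


def risk_score(lot: DrugLot) -> float:
    """Toy additive score; a real system would be calibrated against lab findings."""
    score = 0.01 * lot.cost_per_vial
    score += 2.0 if lot.sole_source else 0.0
    score += 1.5 * lot.warning_letters
    return score


def prioritize(lots: list[DrugLot], budget: int) -> list[DrugLot]:
    """Return the `budget` highest-risk lots to scan first (FT-NIR time is limited)."""
    return sorted(lots, key=risk_score, reverse=True)[:budget]


if __name__ == "__main__":
    lots = [
        DrugLot("injectable A", cost_per_vial=8.5, sole_source=True, warning_letters=0),
        DrugLot("injectable B", cost_per_vial=3.2, sole_source=False, warning_letters=2),
        DrugLot("injectable C", cost_per_vial=0.4, sole_source=False, warning_letters=0),
    ]
    for lot in prioritize(lots, budget=2):
        print(f"{lot.name}: score={risk_score(lot):.2f}")
```

    In practice such a static score would be combined with batch sampling and, per the thesis, extended with variables capturing human-induced risk (for example, the profitability signal that emerged as predictive of gaming the system).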

    Resistance to change: a functional analysis of responses to technical change in a Swiss bank

    This thesis demonstrates the signal function and diagnostic value of user resistance in a software development project. Its starting point is a critical analysis of managerial common sense, which negates resistance or sees resistance to change as a 'nuisance' and as the manifestation of an individual or structural 'deficiency'; these notions prevent change agents from appreciating the signal function of resistance to change in organisational processes. The first source of evidence is the literature on impacts, attitudes, and acceptance of information technology internationally and in particular in Switzerland. The second source is the tradition of psychological field theory, which I reconstruct as the 'feeding the reluctant eater' paradigm, a form of social engineering. The third source is an empirical study of the semantics (semantic differential and free associations) of 'resistance to change' among management trainees in the UK, Switzerland and the USA (N=388). The thesis develops and investigates a concept of resistance that is based on a pain analogy and on the notions of self-monitoring and self-active systems. An organisation which is implementing new technology is a self-active system that directs and energizes its activities with the help of internal and external communication. The functional analogy between the organismic pain system and resistance to change is explored. The analogy consists of parallel information processing, filtering and recoding of information, a bimodal pattern of attention over time, and the functions of attention allocation, evaluation, alteration and learning. With this analogy I am able to generate over 50 hypotheses on resistance to change and its effects on organisational processes. The evidence for some of these hypotheses is explored in an empirical study of a Swiss banking group. The implementation of computer services between 1983 and 1991 is reconstructed in the central bank and 24 branches. Data include the analysis of two opinion surveys (1985 n=305; 1991 n=326), documents (n=134), narrative interviews (n=34), job analyses (n=34), field observations and performance data (n=24). A method is developed to describe the varying structure of organisational information processing through time. The content analysis allows me to describe when in relation to the action, how intensely, and in what manner 'resistance' becomes an issue between 1983 and 1991. The fruitfulness of the pain analogy is demonstrated (a) by shifting the analysis of resistance from structure to process, and by treating it as an independent rather than a dependent variable; (b) by shifting the focus from motivation to communication; (c) by eroding the a priori assumption that resistance is a nuisance; and (d) by indicating the diagnostic value of "bad news" in organisational communication: resistance is diagnostic information; it shows us when, where and why things go wrong.

    Bridging the divide: firms and institutional variety in Italy

    The underperformance of Italy’s macroeconomy is common knowledge, yet empirical evidence has shown that a high-quality segment of Italian export-oriented firms has outperformed international competitors even though the country lacks practically all attributes of a coordinated market economy. This thesis shows that the ability of firms to produce high-quality goods in Italy is linked to the practice of "capital-skill asset pooling" within a novel model of production organisation, "disintegrated hierarchy". "Capital-skill asset pooling" follows from the vertical disintegration of production functions across firms and entails the sharing of production assets between firms governed by heterogeneous institutional frameworks. Through comparisons of firm-level case studies across three industries, the thesis shows that two simultaneous conditions are necessary for "capital-skill asset pooling" to develop: 1) the presence of lead firms endowed with patient capital, and 2) the presence of small suppliers endowed with firm-, industry- and product-specific skills. This finding complements the Varieties of Capitalism literature by showing that firms can produce high-quality or diversified-quality goods in the absence of the necessary institutional preconditions by developing functional substitutes to coordinated-market-economy assets through "capital-skill asset pooling".

    Computational Methods for Medical and Cyber Security

    Over the past decade, computational methods, including machine learning (ML) and deep learning (DL), have grown exponentially in their development of solutions in various domains, especially medicine, cybersecurity, finance, and education. While these applications of machine learning algorithms have proven beneficial in various fields, many shortcomings have also been highlighted, such as the lack of benchmark datasets, the inability to learn from small datasets, the cost of architectures, adversarial attacks, and imbalanced datasets. On the other hand, new and emerging algorithms, such as deep learning, one-shot learning, continuous learning, and generative adversarial networks, have successfully solved various tasks in these fields. Therefore, applying these new methods to life-critical missions is crucial, as is measuring the success of these less-traditional algorithms when used in these fields.
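
    As a concrete illustration of one shortcoming named above, imbalanced datasets, here is a minimal scikit-learn sketch using class weighting; the synthetic data and logistic-regression model are placeholders, not material from any of the medical or cybersecurity studies discussed.

```python
# Minimal sketch of one common mitigation for imbalanced datasets:
# reweighting the loss by inverse class frequency. Synthetic data stands in
# for the medical/cybersecurity datasets the abstract refers to.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# 5% positive class, mimicking a rare-event detection task.
X, y = make_classification(n_samples=5000, n_features=20,
                           weights=[0.95, 0.05], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y,
                                                    random_state=0)

# class_weight="balanced" scales each sample's loss inversely to its class
# frequency, so the minority class is not ignored by the optimizer.
clf = LogisticRegression(class_weight="balanced", max_iter=1000)
clf.fit(X_train, y_train)

print(classification_report(y_test, clf.predict(X_test), digits=3))
```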

    A Statistical Approach to the Alignment of fMRI Data

    Multi-subject functional Magnetic Resonance Imaging (fMRI) studies are critical. Anatomical and functional structure varies across subjects, so image alignment is necessary. We define a probabilistic model to describe functional alignment. By imposing a prior distribution, such as the matrix von Mises-Fisher distribution, on the orthogonal transformation parameter, anatomical information is embedded in the estimation of the parameters, i.e., combinations of spatially distant voxels are penalized. Real applications show an improvement in the classification and interpretability of the results compared to various functional alignment methods.
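
    The abstract does not reproduce the estimator, so the sketch below shows only the unregularized orthogonal (Procrustes-style) alignment step that functional alignment methods of this kind build on; the paper's matrix von Mises-Fisher prior, which penalizes combining spatially distant voxels, is deliberately omitted here.

```python
# Minimal sketch of orthogonal (Procrustes-style) functional alignment between
# two subjects' time-by-voxel matrices. The paper's contribution, a matrix
# von Mises-Fisher prior on R that discourages mixing spatially distant voxels,
# is NOT implemented here; this is only the unregularized baseline.
import numpy as np


def orthogonal_align(X: np.ndarray, Y: np.ndarray) -> np.ndarray:
    """Return the orthogonal R (voxels x voxels) minimizing ||X @ R - Y||_F."""
    # Classical orthogonal Procrustes solution via the SVD of X^T Y.
    U, _, Vt = np.linalg.svd(X.T @ Y, full_matrices=False)
    return U @ Vt


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_time, n_vox = 200, 50
    Y = rng.standard_normal((n_time, n_vox))                   # reference subject
    R_true, _ = np.linalg.qr(rng.standard_normal((n_vox, n_vox)))
    X = Y @ R_true.T + 0.05 * rng.standard_normal((n_time, n_vox))  # rotated subject

    R_hat = orthogonal_align(X, Y)
    print("relative alignment error:",
          np.linalg.norm(X @ R_hat - Y) / np.linalg.norm(Y))
```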

    A comparison of the CAR and DAGAR spatial random effects models with an application to diabetics rate estimation in Belgium

    When hierarchically modelling an epidemiological phenomenon on a finite collection of sites in space, one must take a latent spatial effect into account in order to capture the correlation structure that links the phenomenon to the territory. In this work, we compare two autoregressive spatial models that can be used for this purpose: the classical CAR model and the more recent DAGAR model. Unlike the former, the latter has a desirable property: its ρ parameter can be naturally interpreted as the average neighbor pair correlation and, in addition, this parameter can be directly estimated when the effect is modelled using a DAGAR rather than a CAR structure. As an application, we model the diabetics rate in Belgium in 2014 and show the adequacy of these models in predicting the response variable when no covariates are available.
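
    For readers unfamiliar with these priors, the sketch below builds the proper CAR precision matrix from a toy adjacency matrix and checks the average correlation between neighboring sites numerically; under CAR this average generally differs from ρ, which is exactly the interpretability gap the DAGAR model closes (its DAG-based precision is not reproduced here).

```python
# Minimal sketch of the proper CAR prior's precision structure. The latent
# spatial effect phi ~ N(0, Q^{-1}) with Q = tau * (D - rho * W), where W is
# the site adjacency matrix and D the diagonal matrix of neighbor counts.
# Under CAR, rho is NOT the average neighbor pair correlation; that
# interpretation is what the DAGAR parameterization provides.
import numpy as np


def car_precision(W: np.ndarray, rho: float, tau: float = 1.0) -> np.ndarray:
    """Precision matrix of a proper CAR model on the graph with adjacency W."""
    D = np.diag(W.sum(axis=1))
    return tau * (D - rho * W)


if __name__ == "__main__":
    # Toy adjacency for 4 sites on a line: 1-2-3-4.
    W = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    Q = car_precision(W, rho=0.9)
    cov = np.linalg.inv(Q)
    corr = cov / np.sqrt(np.outer(np.diag(cov), np.diag(cov)))
    pairs = np.argwhere(np.triu(W) > 0)        # neighboring site pairs
    print("average neighbor correlation:",
          corr[pairs[:, 0], pairs[:, 1]].mean())
```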