
    Copulas in finance and insurance

    Copulas provide a potentially useful modeling tool to represent the dependence structure among variables and to generate joint distributions by combining given marginal distributions. Simulations play a relevant role in finance and insurance. They are used to replicate efficient frontiers or extremal values, to price options, to estimate joint risks, and so on. Using copulas, it is easy to construct and simulate from multivariate distributions based on almost any choice of marginals and any type of dependence structure. In this paper we outline recent contributions of statistical modeling using copulas in finance and insurance. We review issues related to the notion of copulas, copula families, copula-based dynamic and static dependence structures, copulas and latent factor models, and simulation of copulas. Finally, we outline hot topics in copulas, with a special focus on model selection and goodness-of-fit testing. Keywords: Dependence structure, Extremal values, Copula modeling, Copula review
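    The simulation recipe summarized in the abstract (arbitrary marginals coupled through a chosen dependence structure) can be illustrated with a Gaussian copula. The correlation value and the exponential/gamma marginals below are illustrative choices, not taken from the paper:

```python
# Sketch: simulating from a bivariate Gaussian copula with arbitrary marginals.
# rho = 0.7 and the exponential / gamma marginals are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rho = 0.7
cov = np.array([[1.0, rho], [rho, 1.0]])

# 1. Draw correlated standard normals.
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=10_000)
# 2. Map to uniforms via the normal CDF; these carry the copula's dependence.
u = stats.norm.cdf(z)
# 3. Apply inverse CDFs of the desired marginals.
x = stats.expon.ppf(u[:, 0], scale=2.0)   # exponential marginal
y = stats.gamma.ppf(u[:, 1], a=3.0)       # gamma marginal

# Rank correlation of (x, y) reflects the copula, not the marginal choices.
tau, _ = stats.kendalltau(x, y)
```

    Because step 3 only applies monotone transforms, any marginal pair can be plugged in without changing the dependence structure fixed in steps 1-2.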

    Adjusted empirical likelihood estimation of the Youden index and associated threshold for the bigamma model

    The Youden index is a widely used measure in the framework of medical diagnostics, where the effectiveness of a biomarker (screening marker or predictor) for classifying a disease status is studied. When the biomarker is continuous, it is important to determine the threshold or cut-off point to be used in practice for the discrimination between diseased and healthy populations. We introduce a new method based on adjusted empirical likelihood for quantiles, aimed at estimating the Youden index and its associated threshold. We also include bootstrap-based confidence intervals for both of them. In the simulation study, we compare this method with a recent approach based on the delta method under the bigamma scenario. Finally, a real example of prostatic cancer, well known in the literature, is analyzed to provide the reader with a better understanding of the new method. Keywords: Confidence interval, Empirical likelihood, Optimal cut-off point, ROC curve, Youden index
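    As a minimal illustration of the quantity being estimated, the empirical Youden index J = max over c of {sensitivity(c) + specificity(c) - 1} and its cut-off can be computed by a grid search over observed marker values. The gamma-distributed simulated data below are illustrative; this is a naive empirical estimate, not the paper's adjusted empirical likelihood estimator:

```python
# Sketch: empirical Youden index and optimal cut-off via grid search.
# The two gamma samples are illustrative stand-ins for healthy/diseased markers.
import numpy as np

rng = np.random.default_rng(1)
healthy = rng.gamma(shape=2.0, scale=1.0, size=500)    # marker, healthy group
diseased = rng.gamma(shape=4.0, scale=1.5, size=500)   # marker, diseased group

thresholds = np.unique(np.concatenate([healthy, diseased]))
sens = np.array([(diseased > c).mean() for c in thresholds])  # true positive rate
spec = np.array([(healthy <= c).mean() for c in thresholds])  # true negative rate

j = sens + spec - 1.0
best = j.argmax()
youden_index, cutoff = j[best], thresholds[best]
```

    The confidence intervals in the paper would then be obtained by bootstrapping this kind of estimate (or, in the authors' approach, via adjusted empirical likelihood for the underlying quantiles).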

    Large Panels with Common Factors and Spatial Correlations

    This paper considers the statistical analysis of large panel data sets where, even after conditioning on common observed effects, the cross section units might remain dependently distributed. This could arise when the cross section units are subject to unobserved common effects and/or if there are spillover effects due to spatial or other forms of local dependencies. The paper provides an overview of the literature on cross section dependence, introduces the concepts of time-specific weak and strong cross section dependence, and shows that the commonly used spatial models are examples of weak cross section dependence. It is then established that the Common Correlated Effects (CCE) estimator of panel data models with a multifactor error structure, recently advanced by Pesaran (2006), continues to provide consistent estimates of the slope coefficient, even in the presence of spatial error processes. Small sample properties of the CCE estimator under various patterns of cross section dependence, including spatial forms, are investigated by Monte Carlo experiments. Results show that the CCE approach works well in the presence of weak and/or strong cross sectionally correlated errors. We also explore the role of certain characteristics of spatial processes in determining the performance of CCE estimators, such as the form and intensity of spatial dependence, and the sparseness of the spatial weight matrix. Keywords: panels, Common Correlated Effects, strong and weak cross section dependence
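    The core CCE idea (proxy the unobserved common factors with cross-section averages of the dependent variable and the regressors, then partial them out) can be sketched on simulated one-factor panel data. All parameter values and this simple pooled implementation are illustrative; they are not Pesaran's full estimator or its asymptotic theory:

```python
# Sketch: pooled CCE-style slope estimate on a simulated one-factor panel.
# N units, T periods, common factor f with heterogeneous loadings gy, gx.
import numpy as np

rng = np.random.default_rng(2)
N, T, beta = 50, 100, 1.5
f = rng.normal(size=T)                    # unobserved common factor
gy = rng.normal(loc=1.0, size=N)          # factor loadings in y
gx = rng.normal(loc=1.0, size=N)          # factor loadings in x

x = gx[:, None] * f + rng.normal(size=(N, T))
y = beta * x + gy[:, None] * f + rng.normal(size=(N, T))

# Cross-section averages act as observable proxies for the factor.
H = np.column_stack([np.ones(T), y.mean(axis=0), x.mean(axis=0)])
M = np.eye(T) - H @ np.linalg.pinv(H)     # annihilator of the proxies

# Pooled slope after partialling out the proxies from every unit.
num = sum(x[i] @ M @ y[i] for i in range(N))
den = sum(x[i] @ M @ x[i] for i in range(N))
beta_hat = num / den
```

    Despite the error containing an unobserved factor with unit-specific loadings, the recovered slope is close to the true value, which is the consistency property the paper extends to spatially correlated errors.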

    Governance of Dual-Use Technologies: Theory and Practice

    The term dual-use characterizes technologies that can have both military and civilian applications. What is the state of current efforts to control the spread of these powerful technologies—nuclear, biological, cyber—that can simultaneously advance social and economic well-being and also be harnessed for hostile purposes? What have previous efforts to govern, for example, nuclear and biological weapons taught us about the potential for the control of these dual-use technologies? What are the implications for governance when the range of actors who could cause harm with these technologies includes not just national governments but also non-state actors like terrorists? These are some of the questions addressed by Governance of Dual-Use Technologies: Theory and Practice, the new publication released today by the Global Nuclear Future Initiative of the American Academy of Arts and Sciences. The publication's editor is Elisa D. Harris, Senior Research Scholar, Center for International Security Studies, University of Maryland School of Public Affairs. Governance of Dual-Use Technologies examines the similarities and differences between the strategies used for the control of nuclear technologies and those proposed for biotechnology and information technology. The publication makes clear the challenges concomitant with dual-use governance. For example, general agreement exists internationally on the need to restrict access to technologies enabling the development of nuclear weapons. However, no similar consensus exists in the bio and information technology domains. The publication also explores the limitations of military measures like deterrence, defense, and reprisal in preventing globally available biological and information technologies from being misused. Some of the other questions explored by the publication include: What types of governance measures for these dual-use technologies have already been adopted? What objectives have those measures sought to achieve? How have the technical characteristics of the technology affected governance prospects? What have been the primary obstacles to effective governance, and what gaps exist in the current governance regime? Are further governance measures feasible? In addition to a preface from Global Nuclear Future Initiative Co-Director Robert Rosner (University of Chicago) and an introduction and conclusion from Elisa Harris, Governance of Dual-Use Technologies includes: "On the Regulation of Dual-Use Nuclear Technology" by James M. Acton (Carnegie Endowment for International Peace); "Dual-Use Threats: The Case of Biotechnology" by Elisa D. Harris (University of Maryland); and "Governance of Information Technology and Cyber Weapons" by Herbert Lin (Stanford University).

    Sequential Voting Promotes Collective Discovery in Social Recommendation Systems

    One goal of online social recommendation systems is to harness the wisdom of crowds in order to identify high quality content. Yet the sequential voting mechanisms that are commonly used by these systems are at odds with existing theoretical and empirical literature on optimal aggregation. This literature suggests that sequential voting will promote herding---the tendency for individuals to copy the decisions of others around them---and hence lead to suboptimal content recommendation. Is there a problem with our practice, or a problem with our theory? Previous attempts at answering this question have been limited by a lack of objective measurements of content quality. Quality is typically defined endogenously as the popularity of content in the absence of social influence. The flaw of this metric is its presupposition that the preferences of the crowd are aligned with underlying quality. Domains in which content quality can be defined exogenously and measured objectively are thus needed in order to better assess the design choices of social recommendation systems. In this work, we look to the domain of education, where content quality can be measured via how well students are able to learn from the material presented to them. Through a behavioral experiment involving a simulated massive open online course (MOOC) run on Amazon Mechanical Turk, we show that sequential voting systems can surface better content than systems that elicit independent votes. Comment: To be published in the 10th International AAAI Conference on Web and Social Media (ICWSM) 2016

    Design, modeling and synthesis of an in vitro transcription rate regulatory circuit

    This paper describes the design, modeling and realization of a synthetic in vitro circuit that aims at regulating the rate of mRNA transcription. Two DNA templates are designed to interact through their transcripts, creating negative feedback loops that will equate their transcription rates at steady state. A mathematical model is developed for this circuit, consisting of a set of ODEs derived from mass-action laws and Michaelis-Menten kinetics involving all chemical species present. The DNA strands were accordingly designed, following thermodynamic principles and minimizing unwanted interactions. Preliminary experimental results show that the circuit performs the expected task, matching the transcription rates of the two DNA templates at steady state.
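    A toy version of such a circuit---two transcripts whose production mutually represses the other template, with first-order degradation---can be integrated numerically to find its steady state. The Hill-type repression terms and all rate constants below are illustrative stand-ins for the paper's full mass-action/Michaelis-Menten model:

```python
# Sketch: two-species mutual-repression ODE model integrated to steady state.
# k1, k2: maximal transcription rates; d: RNA degradation; K: inhibition constant.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 2.0, 1.0
d = 0.5
K = 1.0

def rhs(t, r):
    r1, r2 = r
    # Each transcript represses the other template's transcription.
    return [k1 / (1 + r2 / K) - d * r1,
            k2 / (1 + r1 / K) - d * r2]

sol = solve_ivp(rhs, (0.0, 100.0), [0.0, 0.0], rtol=1e-8, atol=1e-10)
r1_ss, r2_ss = sol.y[:, -1]

# At steady state each production rate balances its degradation term.
rate1 = k1 / (1 + r2_ss / K)
rate2 = k2 / (1 + r1_ss / K)
```

    In the paper's actual design, the feedback strengths are engineered so that the two transcription rates are equated at steady state; here the toy model only demonstrates the production-degradation balance.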