165,547 research outputs found

    On Segre's Lemma of Tangents

    Segre's lemma of tangents dates back to the 1950s, when he used it in the proof of his "arc is a conic" theorem. Since then it has been used as a tool to prove results about various objects, including internal nuclei, Kakeya sets, sets with few odd secants, and further results on arcs. Here, we survey some of these results and report on how re-formulations of Segre's lemma of tangents are leading to new results.
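
    For readers who have not seen the statement, a commonly cited formulation of the lemma of tangents is sketched below; it is paraphrased from the standard finite-geometry literature rather than taken from this abstract, and the coordinate normalisation used is an assumption of the sketch.

```latex
% One standard formulation of Segre's lemma of tangents (q odd), paraphrased
% from the finite-geometry literature; the normalisation is an assumption.
Let $\mathcal{A}$ be an arc of $\mathrm{PG}(2,q)$, $q$ odd, containing the points
$e_1=(1,0,0)$, $e_2=(0,1,0)$ and $e_3=(0,0,1)$, and suppose every point of
$\mathcal{A}$ lies on exactly $t$ tangent lines to $\mathcal{A}$. Writing the
tangents at $e_1$, $e_2$ and $e_3$ as
\[
  Y = a_i Z, \qquad Z = b_i X, \qquad X = c_i Y \qquad (i = 1,\dots,t),
\]
the lemma of tangents states that
\[
  \prod_{i=1}^{t} a_i \, b_i \, c_i \;=\; -1 .
\]
```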

    A Complete Market Model for Option Valuation

    This paper is an introduction to and survey of the Black-Scholes model as a complete market model for option valuation. The model is a stochastic process representing diffusive dynamics, a common and well-established modelling assumption for financial systems. Since markets are generally assumed to be frictionless, it becomes necessary to use a convenient and complete method in order to avoid computational errors. We include a review of stochastic differential equations (SDEs) and Itô's lemma, which makes clear the log-normal distribution of a geometric Brownian motion path, and we present the solution of the Black-Scholes model. Keywords: stochastic differential equations, Itô's lemma, tameness and completeness of the Black-Scholes model.
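
    As a concrete illustration of the two ingredients the abstract highlights, the sketch below (my own, with assumed example parameters S0, K, r, sigma and T, not taken from the paper) prices a European call with the Black-Scholes closed-form formula and checks it against a Monte Carlo simulation of the log-normal terminal price implied by geometric Brownian motion.

```python
# Minimal sketch (not from the paper): Black-Scholes closed form vs. Monte Carlo
# under risk-neutral geometric Brownian motion, whose terminal price is
# log-normal by Ito's lemma. All parameter values are illustrative assumptions.
import numpy as np
from scipy.stats import norm

S0, K, r, sigma, T = 100.0, 105.0, 0.05, 0.2, 1.0   # assumed example parameters

# Black-Scholes closed-form price of a European call
d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
d2 = d1 - sigma * np.sqrt(T)
bs_price = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Monte Carlo: S_T = S0 * exp((r - sigma^2/2) T + sigma sqrt(T) Z), Z ~ N(0,1)
rng = np.random.default_rng(1)
Z = rng.standard_normal(1_000_000)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
mc_price = np.exp(-r * T) * np.mean(np.maximum(ST - K, 0.0))

print(f"closed form: {bs_price:.4f}  Monte Carlo: {mc_price:.4f}")  # should agree closely
```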

    Eigenvectors of random matrices: A survey

    Eigenvectors of large matrices (and graphs) play an essential role in combinatorics and theoretical computer science. The goal of this survey is to provide an up-to-date account of the properties of eigenvectors when the matrix (or graph) is random. Comment: 64 pages, 1 figure; added Section 7 on localized eigenvectors.
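
    One eigenvector property studied in this area is delocalization: unit eigenvectors of a Wigner-type random symmetric matrix typically have sup-norm of order sqrt(log n / n) rather than order one. The short sketch below is my own numerical illustration of that phenomenon, not code from the survey.

```python
# Illustrative sketch: eigenvector delocalization for a random symmetric
# (Wigner-type) matrix. Unit eigenvectors spread their mass over many
# coordinates, so their largest entry is small compared to 1.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
A = rng.standard_normal((n, n))
W = (A + A.T) / np.sqrt(2 * n)          # symmetric, off-diagonal variance 1/n

eigvals, eigvecs = np.linalg.eigh(W)    # columns are orthonormal eigenvectors
sup_norms = np.abs(eigvecs).max(axis=0) # largest coordinate of each eigenvector
print("max over eigenvectors of ||v||_inf :", sup_norms.max())
print("reference scale sqrt(log n / n)    :", np.sqrt(np.log(n) / n))
```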

    Survey on the geometric Bogomolov conjecture

    This is a survey paper on the developments around the geometric Bogomolov conjecture. We explain recent results by the author as well as previous works concerning the conjecture. The paper also includes an introduction to the height theory over function fields and a quick review of basic notions of non-archimedean analytic geometry. Comment: 57 pages. This is an expanded lecture note of a talk at "Non-archimedean analytic Geometry: Theory and Practice" (24--28 August, 2015). It has been submitted to the conference proceedings. Appendix added.

    Optimal Data Acquisition for Statistical Estimation

    We consider a data analyst's problem of purchasing data from strategic agents to compute an unbiased estimate of a statistic of interest. Agents incur private costs to reveal their data, and these costs can be arbitrarily correlated with the data. Once revealed, the data are verifiable. This paper focuses on linear unbiased estimators. We design an individually rational and incentive-compatible mechanism that optimizes the worst-case mean-squared error of the estimate, where the worst case is taken over the unknown correlation between costs and data, subject to a budget constraint in expectation. We characterize the optimal mechanism in closed form. We further extend our results to acquiring data for estimating a parameter in regression analysis, where private costs can correlate with the values of the dependent variable but not with the values of the independent variables.
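
    To make "linear unbiased estimator" concrete in this data-purchasing setting, here is a toy sketch of my own. It is not the paper's optimal mechanism and it ignores the incentive and payment design the paper actually studies: data are bought with inverse-cost-weighted probabilities whose expected spend stays within an assumed budget B, and Horvitz-Thompson style reweighting keeps the mean estimate unbiased.

```python
# Toy sketch, not the paper's mechanism: buy agent i's data with probability
# p_i (expected spend sum_i p_i * c_i held at the assumed budget B, paying the
# reported cost when purchased), then reweight by 1/p_i so the estimate of the
# population mean stays unbiased: E[1{purchased_i}/p_i] = 1 for every agent.
import numpy as np

rng = np.random.default_rng(0)
n, B = 1000, 50.0                                  # assumed population size and budget
data = rng.normal(loc=2.0, scale=1.0, size=n)      # agents' private data (simulated)
costs = rng.uniform(0.1, 1.0, size=n)              # agents' reported costs (simulated)

# Simple allocation: purchase probabilities proportional to 1/sqrt(cost),
# scaled so the expected spend equals the budget (then capped at 1).
raw = 1.0 / np.sqrt(costs)
p = np.minimum(1.0, B * raw / np.sum(raw * costs))

purchased = rng.random(n) < p
estimate = np.sum(data[purchased] / p[purchased]) / n   # Horvitz-Thompson estimate
print(f"estimate {estimate:.3f}  vs true mean {data.mean():.3f}")
```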

    Empirical likelihood confidence intervals for complex sampling designs

    We define an empirical likelihood approach that gives consistent design-based confidence intervals which can be calculated without the need for variance estimates, design effects, resampling, joint inclusion probabilities or linearization, even when the point estimator is not linear. It can be used to construct confidence intervals for a large class of sampling designs and for estimators that are solutions of estimating equations. It can be used for means, regression coefficients, quantiles, totals or counts, even when the population size is unknown. It can be used with large sampling fractions and naturally includes calibration constraints. It can be viewed as an extension of the empirical likelihood approach to complex survey data. This approach is computationally simpler than the pseudo-empirical likelihood and bootstrap approaches. Our simulation study shows that the proposed confidence interval can give better coverage than confidence intervals based on linearization, the bootstrap and the pseudo-empirical likelihood, and that, under complex sampling designs, standard confidence intervals based on normality may have poor coverage, because point estimators may not follow a normal sampling distribution and their variance estimators may be biased.
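
    For intuition about how such intervals avoid explicit variance estimation, the sketch below implements the classical i.i.d. empirical likelihood confidence interval for a mean (Owen-style profiling of a Lagrange multiplier). It is only a baseline illustration, not the design-based extension for complex sampling proposed in the paper.

```python
# Minimal sketch of a classical (i.i.d.) empirical likelihood confidence
# interval for a population mean -- a baseline illustration only, NOT the
# design-based approach for complex sampling designs described above.
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_log_ratio(x, mu):
    """-2 log empirical likelihood ratio for the candidate mean mu."""
    z = x - mu
    if z.min() >= 0 or z.max() <= 0:
        return np.inf                      # mu outside the convex hull of the data
    # Profile out the Lagrange multiplier: solve sum z_i / (1 + lam * z_i) = 0
    lo, hi = -1.0 / z.max() + 1e-10, -1.0 / z.min() - 1e-10
    lam = brentq(lambda l: np.sum(z / (1.0 + l * z)), lo, hi)
    return 2.0 * np.sum(np.log1p(lam * z))

def el_confidence_interval(x, level=0.95, grid_size=2000):
    """Keep the mu values whose EL statistic stays below the chi-square(1) cutoff."""
    cutoff = chi2.ppf(level, df=1)
    grid = np.linspace(x.min(), x.max(), grid_size)[1:-1]
    keep = [mu for mu in grid if el_log_ratio(x, mu) <= cutoff]
    return min(keep), max(keep)

rng = np.random.default_rng(0)
sample = rng.lognormal(mean=0.0, sigma=1.0, size=200)
print(el_confidence_interval(sample))      # asymmetric interval, no variance estimate used
```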