    A subsampling method for the computation of multivariate estimators with high breakdown point

    All known robust location and scale estimators with high breakdown point for multivariate samples are very expensive to compute. In practice, this computation has to be carried out using an approximate subsampling procedure. In this work we describe an alternative subsampling scheme, applicable to both the Stahel-Donoho estimator and the estimator based on the Minimum Volume Ellipsoid, with the property that the number of subsamples required is substantially reduced with respect to the standard subsampling procedures used in both cases. We also discuss some bias and variability properties of the estimator obtained from the proposed subsampling process.
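    For illustration, here is a minimal Python sketch of the standard subsampling approximation to the Minimum Volume Ellipsoid estimator (the baseline that the paper's alternative scheme improves on); the number of subsamples and the degeneracy tolerance are illustrative choices, not the paper's.

        import numpy as np

        def mve_subsampling(X, n_subsamples=500, seed=0):
            """Approximate the Minimum Volume Ellipsoid location/scatter
            estimate by drawing random (p+1)-point subsets (standard scheme)."""
            rng = np.random.default_rng(seed)
            n, p = X.shape
            h = (n + p + 1) // 2              # ellipsoid must cover a majority
            best_vol, best_mu, best_shape = np.inf, None, None
            for _ in range(n_subsamples):
                idx = rng.choice(n, size=p + 1, replace=False)
                mu = X[idx].mean(axis=0)
                C = np.cov(X[idx], rowvar=False)
                if np.linalg.det(C) <= 1e-12:  # degenerate subset: skip it
                    continue
                diff = X - mu
                d2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(C), diff)
                m2 = np.partition(d2, h - 1)[h - 1]  # inflate to cover h points
                vol = np.sqrt(np.linalg.det(C)) * m2 ** (p / 2)  # ~ volume
                if vol < best_vol:
                    best_vol, best_mu, best_shape = vol, mu, m2 * C
            return best_mu, best_shape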

    Solution Repair/Recovery in Uncertain Optimization Environment

    Operation management problems (such as Production Planning and Scheduling) are represented and formulated as optimization models. The resolution of such optimization models leads to solutions which have to be operated in an organization. However, the conditions under which the optimal solution is obtained rarely correspond exactly to the conditions under which the solution will be operated in the organization. Therefore, in most practical contexts, the computed optimal solution is no longer optimal under the conditions in which it is operated. Indeed, it can be "far from optimal" or even infeasible. For various reasons, it is often not possible to completely re-optimize the existing solution or plan. As a consequence, it is necessary to look for "repair solutions", i.e., solutions that behave well with respect to possible scenarios, or with respect to uncertainty in the parameters of the model. To tackle the problem, the computed solution should be such that it is possible to "repair" it through a local re-optimization guided by the user, or through a limited change aimed at minimizing the impact of taking the scenarios into consideration.
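    As a sketch of the repair idea, the following re-optimizes a hypothetical linear programme under the realized data while penalizing deviation from the incumbent plan, so the repaired solution is a limited change rather than a full re-optimization; the model, variable names and penalty weight are all illustrative and not taken from the paper.

        import numpy as np
        from scipy.optimize import linprog

        def repair(c_new, A_new, b_new, x_incumbent, penalty=10.0):
            """Minimize the new cost c_new.x subject to A_new x <= b_new, x >= 0,
            plus a penalty on the total deviation |x - x_incumbent|, so the
            repaired plan stays close to the one already in operation."""
            x_inc = np.asarray(x_incumbent, dtype=float)
            A = np.asarray(A_new, dtype=float)
            n = x_inc.size
            # Decision vector [x, d]; d >= |x - x_inc| via two inequalities.
            c = np.concatenate([np.asarray(c_new, float), penalty * np.ones(n)])
            I = np.eye(n)
            A_ub = np.vstack([
                np.hstack([A, np.zeros_like(A)]),  # original constraints
                np.hstack([I, -I]),                #  x - d <=  x_inc
                np.hstack([-I, -I]),               # -x - d <= -x_inc
            ])
            b_ub = np.concatenate([np.asarray(b_new, float), x_inc, -x_inc])
            res = linprog(c, A_ub=A_ub, b_ub=b_ub)  # default bounds: x, d >= 0
            return res.x[:n]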

    "Rotterdam econometrics": publications of the econometric institute 1956-2005

    This paper contains a list of all publications over the period 1956-2005, as reported in the Rotterdam Econometric Institute Reprint series during 1957-2005.

    Rethinking Digital Forensics

    © IAER 2019. In the modern socially-driven, knowledge-based virtual computing environment in which organisations are operating, current digital forensics tools and practices can no longer meet the need for scientific rigour. There has been an exponential increase in the complexity of networks with the rise of the Internet of Things, cloud technologies and fog computing altering business operations and models. Adding to the problem are the increased capacity of storage devices and the increased diversity of devices attached to networks and operating autonomously. We argue that the laws and standards that have been written, and the processes, procedures and tools that are in common use, are increasingly incapable of ensuring the requirement for scientific integrity. This paper looks at a number of issues with current practice and discusses measures that can be taken to improve the potential for achieving scientific rigour in digital forensics in the current and developing landscape.

    Towards Machine Wald

    The past century has seen a steady increase in the need to estimate and predict complex systems and to make (possibly critical) decisions with limited information. Although computers have made possible the numerical evaluation of sophisticated statistical models, these models are still designed \emph{by humans} because there is currently no known recipe or algorithm for dividing the design of a statistical model into a sequence of arithmetic operations. Indeed, enabling computers to \emph{think} as \emph{humans} do when faced with uncertainty is challenging in several major ways: (1) finding optimal statistical models has yet to be formulated as a well-posed problem when information on the system of interest is incomplete and comes in the form of a complex combination of sample data, partial knowledge of constitutive relations and a limited description of the distribution of input random variables; (2) the space of admissible scenarios, along with the space of relevant information, assumptions and/or beliefs, tends to be infinite dimensional, whereas calculus on a computer is necessarily discrete and finite. With this purpose in mind, this paper explores the foundations of a rigorous framework for the scientific computation of optimal statistical estimators/models and reviews their connections with Decision Theory, Machine Learning, Bayesian Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty Quantification and Information Based Complexity.
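    In illustrative notation (not the paper's), the Wald-style criterion the abstract alludes to selects an estimator that is worst-case optimal over the set of scenarios consistent with the available information:

        % Minimax-optimal estimator over the admissible set \mathcal{A} of
        % priors/models consistent with the available information; the
        % notation here is illustrative, not taken from the paper.
        \theta^{\star} \in \arg\min_{\theta \in \Theta}\;
            \sup_{\mu \in \mathcal{A}}\;
            \mathbb{E}_{X \sim \mu}\!\left[ \mathcal{L}\bigl(\theta(X), \mu\bigr) \right]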

    Skorokhod's M1 topology for distribution-valued processes

    Skorokhod's M1 topology is defined for c\`adl\`ag paths taking values in the space of tempered distributions (more generally, in the dual of a countably Hilbertian nuclear space). Compactness and tightness characterisations are derived, which allow us to study a collection of stochastic processes through their projections on the familiar space of real-valued c\`adl\`ag processes. It is shown how this topological space can be used in analysing the convergence of empirical process approximations to distribution-valued evolution equations with Dirichlet boundary conditions.
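    For reference, the classical M1 distance on real-valued c\`adl\`ag paths, which the paper lifts to the distribution-valued setting, can be written as follows, with \Pi(x) the set of parametric representations (u, r) of the completed graph of x; the notation follows the standard real-valued theory rather than the paper:

        d_{M1}(x_1, x_2) \;=\; \inf_{(u_i, r_i) \in \Pi(x_i)}
            \Bigl( \lVert u_1 - u_2 \rVert_{\infty} \,\vee\, \lVert r_1 - r_2 \rVert_{\infty} \Bigr)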

    Efficient option pricing with transaction costs

    A fast numerical algorithm is developed to price European options with proportional transaction costs using the utility-maximization framework of Davis (1997). This approach allows option prices to be computed by solving the investor’s basic portfolio selection problem without insertion of the option payoff into the terminal value function. The properties of the value function can then be used to drastically reduce the number of operations needed to locate the boundaries of the no-transaction region, which leads to very efficient option valuation. The optimization problem is solved numerically for the case of exponential utility, and comparisons with approximately replicating strategies reveal tight bounds for option prices even as transaction costs become large. The computational technique involves a discrete-time Markov chain approximation to a continuous-time singular stochastic optimal control problem. A general definition of an option hedging strategy in this framework is developed. This involves calculating the perturbation to the optimal portfolio strategy when an option trade is executed.
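    As a toy illustration of the dynamic-programming structure (not the paper's algorithm), the sketch below solves the basic exponential-utility portfolio problem with proportional costs on a short binomial tree, exploiting the fact that exponential utility lets the bank account be factored out of the value function; all parameters, the holding grid and the binomial dynamics are invented for illustration.

        import numpy as np

        gamma, lam = 0.1, 0.01        # risk aversion, proportional cost rate
        u, d, p = 1.1, 0.9, 0.5       # binomial stock factors, up-probability
        S0, T = 100.0, 3              # initial price, number of trading dates
        ys = np.linspace(-1.0, 2.0, 121)   # grid of admissible share holdings

        def liquidation(y, s):
            """Cash realized when closing a position of y shares at price s."""
            return np.where(y >= 0, y * s * (1 - lam), y * s * (1 + lam))

        def H(t, s):
            """H(t, y, s) after factoring the bank account out of the value
            function: V(t, b, y, s) = -exp(-gamma * b) * H(t, y, s)."""
            if t == T:
                return np.exp(-gamma * liquidation(ys, s))
            cont = p * H(t + 1, s * u) + (1 - p) * H(t + 1, s * d)
            dy = ys[None, :] - ys[:, None]            # trade from y_i to y_j
            cash_out = dy * s + lam * np.abs(dy) * s  # cost of rebalancing
            return np.min(np.exp(gamma * cash_out) * cont[None, :], axis=1)

        v0 = -H(0, S0)[np.argmin(np.abs(ys))]  # value starting with no shares

    In this toy model the no-transaction region at each node is the set of holdings for which the minimizing target above is the holding itself; the paper's method then extracts option prices from this basic problem's value function without inserting the payoff into the terminal condition.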