93 research outputs found

    Variational Analysis of Constrained M-Estimators

    We propose a unified framework for establishing the existence of nonparametric M-estimators, computing the corresponding estimates, and proving their strong consistency when the class of functions is exceptionally rich. In particular, the framework addresses situations where the class of functions is complex, involving information and assumptions about shape, pointwise bounds, location of modes, height at modes, location of level-sets, values of moments, size of subgradients, continuity, distance to a "prior" function, multivariate total positivity, and any combination of the above. The class might be engineered to perform well in a specific setting even in the presence of little data. The framework views the class of functions as a subset of a particular metric space of upper semicontinuous functions under the Attouch-Wets distance. In addition to allowing a systematic treatment of numerous M-estimators, the framework yields consistency of plug-in estimators of modes of densities, maximizers of regression functions, level-sets of classifiers, and related quantities, and also enables computation by means of approximating parametric classes. We establish consistency through a one-sided law of large numbers, here extended to sieves, that relaxes assumptions of uniform laws while ensuring global approximations even under model misspecification.

    Fusion of Hard and Soft Information in Nonparametric Density Estimation

    This article discusses univariate density estimation in situations when the sample (hard information) is supplemented by “soft” information about the random phenomenon. These situations arise broadly in operations research and management science where practical and computational reasons severely limit the sample size, but problem structure and past experiences could be brought in. In particular, density estimation is needed for generation of input densities to simulation and stochastic optimization models, in analysis of simulation output, and when instantiating probability models. We adopt a constrained maximum likelihood estimator that incorporates any, possibly random, soft information through an arbitrary collection of constraints. We illustrate the breadth of possibilities by discussing soft information about shape, support, continuity, smoothness, slope, location of modes, symmetry, density values, neighborhood of known density, moments, and distribution functions. The maximization takes place over spaces of extended real-valued semicontinuous functions and therefore allows us to consider essentially any conceivable density as well as convenient exponential transformations. The infinite dimensionality of the optimization problem is overcome by approximating splines tailored to these spaces. To facilitate the treatment of small samples, the construction of these splines is decoupled from the sample. We discuss existence and uniqueness of the estimator, examine consistency under increasing hard and soft information, and give rates of convergence. Numerical examples illustrate the value of soft information, the ability to generate a family of diverse densities, and the effect of misspecification of soft information.
    Funded by the U.S. Army Research Laboratory and the U.S. Army Research Office under grants 00101-80683, W911NF-10-1-0246, and W911NF-12-1-0273.
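    As a rough illustration of the idea behind the abstract above (not the authors' epi-spline machinery), a constrained maximum likelihood density estimate can be sketched by discretizing the density on a grid and letting the "soft" information enter as constraints on the optimization; the grid, sample, and shape constraint below are hypothetical choices.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical sketch, not the paper's epi-spline implementation:
# estimate a density on a fixed grid by maximum likelihood, with the
# "soft" information (known support, nonincreasing shape) as constraints.
rng = np.random.default_rng(0)
sample = rng.exponential(scale=1.0, size=30)        # small "hard" sample
grid = np.linspace(0.0, 6.0, 31)                    # support assumed known
h = grid[1] - grid[0]

def neg_log_lik(f):
    # look up the density value at the grid cell nearest each observation
    idx = np.clip(np.searchsorted(grid, sample), 0, len(grid) - 1)
    return -np.sum(np.log(np.maximum(f[idx], 1e-12)))

constraints = [
    {"type": "eq", "fun": lambda f: h * np.sum(f) - 1.0},   # integrates to one
    {"type": "ineq", "fun": lambda f: -np.diff(f)},         # soft info: nonincreasing
]
res = minimize(neg_log_lik,
               x0=np.full(len(grid), 1.0 / (grid[-1] - grid[0])),
               bounds=[(0.0, None)] * len(grid),
               constraints=constraints, method="SLSQP",
               options={"maxiter": 200})
fhat = res.x   # the constrained density estimate on the grid
```

    Dropping or adding entries in `constraints` is how different pieces of soft information would be mixed in; the paper's framework does this over function spaces rather than a finite grid.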

    Specifying and Validating Probabilistic Inputs for Prescriptive Models of Decision Making over Time

    Optimization models for making decisions over time in uncertain environments rely on probabilistic inputs, such as scenario trees for stochastic mathematical programs. The quality of model outputs, i.e., the solutions obtained, depends on the quality of these inputs. However, solution quality is rarely assessed in a rigorous way, and the connection between validation of model inputs and quality of the resulting solution is not immediate. This chapter discusses some efforts to formulate realistic probabilistic inputs and subsequently validate them in terms of the quality of solutions they produce. These include formulating probabilistic models based on statistical descriptions understandable to decision makers; conducting statistical tests to assess the validity of stochastic process models and their discretization; and conducting re-enactments to assess the quality of the formulation in terms of solution performance against observational data. Studies of long-term capacity expansion in service industries, including electric power, and short-term scheduling of thermal electricity generating units provide motivation and illustrations. The chapter concludes with directions for future research.

    Effects of Anacetrapib in Patients with Atherosclerotic Vascular Disease

    BACKGROUND: Patients with atherosclerotic vascular disease remain at high risk for cardiovascular events despite effective statin-based treatment of low-density lipoprotein (LDL) cholesterol levels. The inhibition of cholesteryl ester transfer protein (CETP) by anacetrapib reduces LDL cholesterol levels and increases high-density lipoprotein (HDL) cholesterol levels. However, trials of other CETP inhibitors have shown neutral or adverse effects on cardiovascular outcomes. METHODS: We conducted a randomized, double-blind, placebo-controlled trial involving 30,449 adults with atherosclerotic vascular disease who were receiving intensive atorvastatin therapy and who had a mean LDL cholesterol level of 61 mg per deciliter (1.58 mmol per liter), a mean non-HDL cholesterol level of 92 mg per deciliter (2.38 mmol per liter), and a mean HDL cholesterol level of 40 mg per deciliter (1.03 mmol per liter). The patients were assigned to receive either 100 mg of anacetrapib once daily (15,225 patients) or matching placebo (15,224 patients). The primary outcome was the first major coronary event, a composite of coronary death, myocardial infarction, or coronary revascularization. RESULTS: During the median follow-up period of 4.1 years, the primary outcome occurred in significantly fewer patients in the anacetrapib group than in the placebo group (1640 of 15,225 patients [10.8%] vs. 1803 of 15,224 patients [11.8%]; rate ratio, 0.91; 95% confidence interval, 0.85 to 0.97; P=0.004). The relative difference in risk was similar across multiple prespecified subgroups. At the trial midpoint, the mean level of HDL cholesterol was higher by 43 mg per deciliter (1.12 mmol per liter) in the anacetrapib group than in the placebo group (a relative difference of 104%), and the mean level of non-HDL cholesterol was lower by 17 mg per deciliter (0.44 mmol per liter), a relative difference of -18%. 
There were no significant between-group differences in the risk of death, cancer, or other serious adverse events. CONCLUSIONS: Among patients with atherosclerotic vascular disease who were receiving intensive statin therapy, the use of anacetrapib resulted in a lower incidence of major coronary events than the use of placebo. (Funded by Merck and others; Current Controlled Trials number, ISRCTN48678192; ClinicalTrials.gov number, NCT01252953; and EudraCT number, 2010-023467-18.)
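    The headline event rates can be reproduced from the reported counts. Note that this yields the crude event-rate ratio; the trial's reported 0.91 is a model-based rate ratio, so the two agree only approximately.

```python
# Reproducing the crude event rates from the counts reported in the abstract.
anacetrapib_events, anacetrapib_n = 1640, 15225
placebo_events, placebo_n = 1803, 15224

rate_a = anacetrapib_events / anacetrapib_n    # ~0.108, i.e., 10.8%
rate_p = placebo_events / placebo_n            # ~0.118, i.e., 11.8%
crude_ratio = rate_a / rate_p                  # ~0.91, close to the reported rate ratio
```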

    Scandium In Aluminium Alloys: Physical Metallurgy, Properties And Applications

    The use of scandium as an alloying element in aluminium has gained increasing interest even though scandium is difficult to extract, which makes the metal very expensive. The three principal effects that can be obtained by adding scandium to aluminium alloys are (i) grain refinement during casting or welding, (ii) precipitation hardening from Al3Sc particles, and (iii) grain structure control from Al3Sc dispersoids. Addition of scandium in combination with zirconium is particularly effective, and the reason for this is linked to the recently discovered core/shell structure of the Al3(Sc,Zr) dispersoids. The effects that can be obtained from a Sc addition are to a large extent dependent on the alloy system to which it is added, and an overview of some effects in all the major classes of wrought aluminium alloys is given. Industrial use of Sc-containing aluminium alloys is so far limited to a few aerospace applications and to sporting equipment; further use is dependent on a price reduction of scandium.

    Variational Theory for Optimization under Stochastic Ambiguity

    This paper is in review. Stochastic ambiguity provides a rich class of uncertainty models that includes those in stochastic, robust, risk-based, and semi-infinite optimization, and that accounts for both uncertainty about parameter values as well as incompleteness of the description of uncertainty. We provide a novel, unifying perspective on optimization under stochastic ambiguity that rests on two pillars. First, the paper models ambiguity by decision-dependent collections of cumulative distribution functions viewed as subsets of a metric space of upper semicontinuous functions. We derive a series of results for this setting, including estimates of the metric, the hypo-distance, and a new proof of the equivalence with weak convergence. Second, we utilize the theory of lopsided convergence to establish existence, convergence, and approximation of solutions of optimization problems with stochastic ambiguity. For the first time, we estimate the lop-distance between bifunctions and show that this leads to bounds on the solution quality for problems with stochastic ambiguity. Among other consequences, these results facilitate the study of the "price of robustness" and related quantities.
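    The hypo-distance on upper semicontinuous functions used in the paper is more delicate than anything sketched here; as a loose stand-in, the familiar Kolmogorov (sup) distance between two empirical CDFs illustrates the kind of object the metric-space view treats as a single point. The sample values below are made up.

```python
import numpy as np

# Stand-in illustration: the sup distance between two empirical CDFs,
# a much simpler metric than the paper's hypo-distance, but it conveys
# the idea of measuring distance between distribution functions directly.
def ecdf(sample, t):
    # fraction of sample points <= each value in t
    sample = np.sort(sample)
    return np.searchsorted(sample, t, side="right") / len(sample)

x = np.array([0.1, 0.4, 0.7, 0.9])   # hypothetical samples
y = np.array([0.2, 0.5, 0.6, 1.1])
t = np.sort(np.concatenate([x, y]))  # the sup is attained at sample points
d = np.max(np.abs(ecdf(x, t) - ecdf(y, t)))   # Kolmogorov distance
```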

    Multivariate Epi-Splines and Evolving Function Identification Problems

    Includes erratum. The broad class of extended real-valued lower semicontinuous (lsc) functions on R^n captures nearly all functions of practical importance in equation solving, variational problems, fitting, and estimation. The paper develops piecewise polynomial functions, called epi-splines, that approximate any lsc function to an arbitrary level of accuracy. Epi-splines provide the foundation for the solution of a rich class of function identification problems that incorporate general constraints on the function to be identified, including those derived from information about smoothness, shape, proximity to other functions, and so on. As such extrinsic information as well as observed function and subgradient values often evolve in applications, we establish conditions under which the computed epi-splines converge to the function we seek to identify. Numerical examples in response surface building and probability density estimation illustrate the framework.
    Funded by the U.S. Army Research Laboratory and the U.S. Army Research Office under grants 00101-80683, W911NF-10-1-0246, and W911NF-12-1-0273.

    Optimality functions and lopsided convergence

    The article of record as published may be found at http://dx.doi.org/10.1007/s10957-015-0839-0. Optimality functions pioneered by E. Polak characterize stationary points, quantify the degree to which a point fails to be stationary, and play central roles in algorithm development. For optimization problems requiring approximations, optimality functions can be used to ensure consistency in approximations, with the consequence that optimal and stationary points of the approximate problems indeed are approximately optimal and stationary for an original problem. In this paper, we review the framework and illustrate its application to nonlinear programming and other areas. Moreover, we introduce lopsided convergence of bifunctions on metric spaces and show that this notion of convergence is instrumental in establishing consistency of approximations. Lopsided convergence also leads to further characterizations of stationary points under perturbations and approximations.
    This material is based upon work supported in part by the US Army Research Laboratory and the US Army Research Office under Grant Numbers 00101-80683, W911NF-10-1-0246 and W911NF-12-1-0273.
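    For smooth unconstrained minimization, a classical optimality function in Polak's sense is theta(x) = min over h of {<grad f(x), h> + 0.5*||h||^2} = -0.5*||grad f(x)||^2, which is zero exactly at stationary points and strictly negative elsewhere; the example function below is hypothetical.

```python
import numpy as np

# Minimal sketch for smooth unconstrained minimization (f is a made-up
# example): theta(x) = -0.5 * ||grad f(x)||^2 vanishes exactly at
# stationary points and quantifies the failure of stationarity elsewhere.
def f(x):
    return (x[0] - 1.0) ** 2 + 2.0 * x[1] ** 2

def grad_f(x):
    return np.array([2.0 * (x[0] - 1.0), 4.0 * x[1]])

def theta(x):
    g = grad_f(x)
    return -0.5 * float(g @ g)

# theta is 0 at the unique minimizer (1, 0) and negative elsewhere,
# so |theta(x)| serves as a stopping criterion in algorithms.
```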

    Uncertainty Quantification using Exponential Epi-Splines

    Proceedings of the International Conference on Structural Safety and Reliability, New York, NY. We quantify uncertainty in complex systems by a flexible, nonparametric framework for estimating probability density functions of output quantities of interest. The framework systematically incorporates soft information about the system from engineering judgement and experience to improve the estimates and ensure that they are consistent with prior knowledge. The framework is based on a maximum likelihood criterion, with epi-splines facilitating rapid solution of the resulting optimization problems. In four numerical examples with few realizations of the system output, we identify the main features of output densities even for nonsmooth and discontinuous system functions and high-dimensional inputs.