Computing the Set of Approximate Solutions of an MOP with Stochastic Search Algorithms
In this work we develop a framework for approximating the entire set of ε-efficient solutions of a multi-objective optimization problem with stochastic search algorithms. For this, we propose the set of interest, investigate its topology, and state a convergence result for a generic stochastic search algorithm toward this set. Finally, we present numerical results indicating the practicability of the novel approach.
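The notion of ε-efficiency used throughout these works can be made concrete with a small sketch. The following Python snippet is a minimal illustration, assuming additive ε-dominance for minimization problems; the function names and the simplified archiver are our own constructions, not code from the paper.

```python
import numpy as np

def epsilon_dominates(fa, fb, eps):
    """True if objective vector fa (additively) epsilon-dominates fb for a
    minimization problem: fa - eps is component-wise <= fb, with at least
    one strict inequality."""
    shifted = np.asarray(fa) - eps
    fb = np.asarray(fb)
    return bool(np.all(shifted <= fb) and np.any(shifted < fb))

def update_archive(archive, x, fx, eps):
    """Simplified archiver: reject x if an archived point eps-dominates it,
    and drop archived points that x eps-dominates. The paper's archivers
    refine this idea to obtain their convergence guarantees."""
    if any(epsilon_dominates(f, fx, eps) for _, f in archive):
        return archive
    kept = [(y, f) for y, f in archive if not epsilon_dominates(fx, f, eps)]
    kept.append((x, fx))
    return kept
```

Feeding the candidates produced by any generic stochastic generator through such an archiver retains exactly the points that are not ε-dominated by anything seen so far.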
Computing a Finite Size Representation of the Set of Approximate Solutions of an MOP
Recently, a framework for the approximation of the entire set of ε-efficient solutions of a multi-objective optimization problem with stochastic search algorithms has been proposed. It was proven that such an algorithm produces, under mild assumptions on the process that generates new candidate solutions, a sequence of archives which converges to this set in the limit and in the probabilistic sense. The result, though satisfactory for most discrete MOPs, is not sufficient for continuous models, at least from the practical viewpoint: in this case, the set of approximate solutions typically forms an n-dimensional object, where n denotes the dimension of the parameter space, and thus performance problems may arise since in practice one has to cope with a finite archive. Here we focus on obtaining finite and tight approximations of this set, with tightness measured by the Hausdorff distance. We propose a novel archiving strategy and investigate it theoretically and empirically. For this, we analyze the convergence behavior of the algorithm, yielding bounds both on the approximation quality obtained and on the cardinality of the resulting approximation, and we present numerical results.
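Since tightness here is measured by the Hausdorff distance, a short sketch of that metric for finite point sets may be useful. The implementation below is the standard construction, not code from the paper.

```python
import numpy as np

def hausdorff(A, B):
    """Hausdorff distance between two finite point sets (rows are points):
    d_H(A, B) = max( max_a min_b ||a - b||, max_b min_a ||a - b|| )."""
    D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)  # pairwise distances
    return max(D.min(axis=1).max(), D.min(axis=0).max())

# Example: a coarse archive A versus a denser reference sampling B.
A = np.array([[0.0, 1.0], [1.0, 0.0]])
B = np.array([[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]])
print(hausdorff(A, B))  # ~0.707, driven by the unmatched point (0.5, 0.5)
```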
Convergence of Stochastic Search Algorithms to Finite Size Pareto Set Approximations
In this work we study the convergence of generic stochastic search algorithms toward the Pareto set of continuous multi-objective optimization problems. The focus is on obtaining a finite approximation that captures the entire solution set in a suitable sense, which will be defined using the concept of ε-dominance. Under mild assumptions about the process that generates new candidate solutions, the limit approximation set is determined entirely by the archiving strategy. We investigate two archiving strategies which lead to different limit behaviors of the algorithm, yielding bounds on the approximation quality obtained as well as on the cardinality of the resulting Pareto set approximation. Finally, we demonstrate the potential for a hybridization of a given stochastic search algorithm with a particular local search strategy -- multi-objective continuation methods -- by showing that the concept of ε-dominance can be integrated into this approach in a suitable way.
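A classical way an ε-dominance-based archiver bounds its cardinality is to lay a grid of ε-boxes over the objective space and keep at most one representative per box. The sketch below illustrates that idea in the spirit of ε-Pareto archiving; it is our simplification and not necessarily either of the two strategies investigated in this work.

```python
import numpy as np

def box_index(fx, eps):
    """Assign an objective vector to its eps-box on a grid of width eps."""
    return tuple(np.floor(np.asarray(fx) / eps).astype(int))

def update_box_archive(archive, x, fx, eps):
    """Keep at most one representative per eps-box (minimization assumed).
    The grid caps the archive cardinality, which is what yields finite
    limit approximations with provable size bounds.
    archive: dict mapping box index -> (x, fx)."""
    key = box_index(fx, eps)
    incumbent = archive.get(key)
    if incumbent is None or np.all(np.asarray(fx) <= np.asarray(incumbent[1])):
        archive[key] = (x, fx)  # newcomer replaces a (weakly) worse incumbent
    return archive
```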
A new memetic strategy for the numerical treatment of multi-objective optimization problems
In this paper we propose a novel iterative search procedure for multi-objective optimization problems. The iteration process, though derivative-free, utilizes the geometry of the directional cones of such optimization problems, and is capable of moving both toward and along the (local) Pareto set, depending on the distance of the current iterate from this set. Next, we give one possible way of integrating this local search procedure into a given EMO algorithm, resulting in a novel memetic strategy. Finally, we present numerical results on some well-known benchmark problems indicating the strength of both the local search strategy and the new hybrid approach.
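To illustrate the interplay of global and local search in such a memetic scheme, here is a deliberately crude, derivative-free local step in Python: it accepts a sampled trial point only if it Pareto-dominates the current iterate. The directional-cone machinery of the paper replaces this blind sampling with geometry-informed directions; all names and the step-size policy below are our assumptions.

```python
import numpy as np

def local_step(x, f, step=0.05, trials=10, rng=None):
    """Derivative-free local step: sample random directions and accept a
    trial point whose objective vector Pareto-dominates the current one.
    A crude stand-in for the directional-cone step described above."""
    rng = rng or np.random.default_rng()
    fx = np.asarray(f(x))
    for _ in range(trials):
        d = rng.normal(size=np.shape(x))
        y = x + step * d / np.linalg.norm(d)
        fy = np.asarray(f(y))
        if np.all(fy <= fx) and np.any(fy < fx):  # Pareto improvement
            return y, fy
    return x, fx  # no improving direction found; likely near the (local) Pareto set
```

Embedded in an EMO loop, such a step is typically applied to a few selected individuals per generation, trading extra function evaluations for faster convergence toward the Pareto set.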
Brain reserve contributes to distinguishing preclinical Alzheimer's stages 1 and 2
Background: In preclinical Alzheimer's disease, it is unclear why some individuals with amyloid pathologic change are asymptomatic (stage 1), whereas others experience subjective cognitive decline (SCD, stage 2). Here, we examined the association of stage 1 vs. stage 2 with structural brain reserve in memory-related brain regions. Methods: We tested whether the volumes of hippocampal subfields and parahippocampal regions were larger in individuals at stage 1 compared to asymptomatic amyloid-negative older adults (healthy controls, HCs). We also tested whether individuals at stage 2 would show the opposite pattern, namely smaller brain volumes than amyloid-negative individuals with SCD. Participants with cerebrospinal fluid (CSF) biomarker data and bilateral volumetric MRI data from the observational, multi-centric DZNE-Longitudinal Cognitive Impairment and Dementia Study (DELCODE) were included. The sample comprised 95 amyloid-negative and 26 amyloid-positive asymptomatic participants as well as 104 amyloid-negative and 47 amyloid-positive individuals with SCD. Volumes were based on high-resolution T2-weighted images and automatic segmentation with manual correction according to a recently established high-resolution segmentation protocol. Results: In asymptomatic individuals, brain volumes of hippocampal subfields and of the parahippocampal cortex were numerically larger in stage 1 compared to HCs, whereas the opposite was the case in individuals with SCD. MANOVAs with volumes as dependent variables and age, sex, years of education, and DELCODE site as covariates showed a significant interaction between diagnosis (asymptomatic versus SCD) and amyloid status (Aβ42/40 negative versus positive) for hippocampal subfields. Post hoc paired comparisons taking into account the same covariates showed that dentate gyrus and CA1 volumes in SCD were significantly smaller in amyloid-positive than in amyloid-negative individuals. In contrast, CA1 volumes were significantly (p = 0.014) larger in stage 1 compared with HCs. Conclusions: These data indicate that HCs and stages 1 and 2 do not correspond to a linear brain volume reduction. Instead, stage 1 is associated with larger than expected volumes of hippocampal subfields in the face of amyloid pathology. This indicates a brain reserve mechanism in stage 1 that enables individuals with amyloid pathologic change to remain cognitively normal and asymptomatic, without subjective cognitive decline.
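For readers who want to reproduce the style of analysis described above, the following sketch shows how a MANOVA with a diagnosis-by-amyloid interaction and the listed covariates could be set up with statsmodels. All column names and the file name are hypothetical stand-ins; this is not the study's analysis code.

```python
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# One row per participant; column names are hypothetical stand-ins for the
# DELCODE variables described in the abstract.
df = pd.read_csv("delcode_volumes.csv")  # hypothetical file

manova = MANOVA.from_formula(
    "dg_vol + ca1_vol + phc_vol ~ diagnosis * amyloid + age + sex + education + site",
    data=df,
)
print(manova.mv_test())  # multivariate tests incl. the diagnosis x amyloid interaction
```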
Creative destruction in science
Drawing on the concept of a gale of creative destruction in a capitalistic economy, we argue that initiatives to assess the robustness of findings in the organizational literature should aim to simultaneously test competing ideas operating in the same theoretical space. In other words, replication efforts should seek not just to support or question the original findings, but also to replace them with revised, stronger theories with greater explanatory power. Achieving this will typically require adding new measures, conditions, and subject populations to research designs, in order to carry out conceptual tests of multiple theories in addition to directly replicating the original findings. To illustrate the value of the creative destruction approach for theory pruning in organizational scholarship, we describe recent replication initiatives re-examining culture and work morality, working parents' reasoning about day care options, and gender discrimination in hiring decisions.
Significance statement
It is becoming increasingly clear that many, if not most, published research findings across scientific fields are not readily replicable when the same method is repeated. Although extremely valuable, failed replications risk leaving a theoretical void, reducing confidence that the original theoretical prediction is true but not replacing it with positive evidence in favor of an alternative theory. We introduce the creative destruction approach to replication, which combines theory pruning methods from the field of management with emerging best practices from the open science movement, with the aim of making replications as generative as possible. In effect, we advocate for a Replication 2.0 movement in which the goal shifts from checking on the reliability of past findings to actively engaging in competitive theory testing and theory building.
Scientific transparency statement
The materials, code, and data for this article are posted publicly on the Open Science Framework, with links provided in the article.
The CMS Phase-1 pixel detector upgrade
The CMS detector at the CERN LHC features a silicon pixel detector as its innermost subdetector. The original CMS pixel detector has been replaced with an upgraded pixel system (CMS Phase-1 pixel detector) in the extended year-end technical stop of the LHC in 2016/2017. The upgraded CMS pixel detector is designed to cope with the higher instantaneous luminosities that have been achieved by the LHC after the upgrades to the accelerator during the first long shutdown in 2013–2014. Compared to the original pixel detector, the upgraded detector has better tracking performance and lower mass, with four barrel layers and three endcap disks on each side to provide hit coverage up to an absolute value of pseudorapidity of 2.5. This paper describes the design and construction of the CMS Phase-1 pixel detector as well as its performance from commissioning to early operation in collision data-taking.
Results of the Numerical and Evolutionary Optimization Workshop NEO 2015, held on September 23-25, 2015, in Tijuana, Mexico
This volume comprises a selection of works presented at the Numerical and Evolutionary Optimization (NEO) workshop held in September 2015 in Tijuana, Mexico. The development of powerful search and optimization techniques is of great importance in today’s world, which requires researchers and practitioners to tackle a growing number of challenging real-world problems. In particular, there are two well-established and widely known fields that are commonly applied in this area: (i) traditional numerical optimization techniques and (ii) comparatively recent bio-inspired heuristics. Both paradigms have their unique strengths and weaknesses, allowing them to solve some challenging problems while still failing in others. The goal of the NEO workshop series is to bring together people from these and related fields to discuss, compare, and merge their complementary perspectives in order to develop fast and reliable hybrid methods that maximize the strengths and minimize the weaknesses of the underlying paradigms. Through this effort, we believe that NEO can promote the development of new techniques that are applicable to a broader class of problems. Moreover, NEO fosters the understanding and adequate treatment of real-world problems, particularly in emerging fields that affect us all, such as health care, smart cities, and big data, among many others. The extended papers from NEO 2015 that comprise this book contribute to this goal.
Analysis and classification of mental states of vigilance with evolutionary computation
Effects of local search in genetic programming