An Efficient Algorithm for Automatic Structure Optimization in X-ray Standing-Wave Experiments
X-ray standing-wave photoemission experiments involving multilayered samples
are emerging as unique probes of the buried interfaces that are ubiquitous in
current device and materials research. Analyzing such data requires a
nontrivial structure-optimization process that compares experiment to theory.
In this work, we present a new computer program for optimizing
the analysis of standing-wave data, called SWOPT, that automates this
trial-and-error optimization process. The program includes an algorithm that
has been developed for computationally expensive problems: so-called black-box
simulation optimizations. It also includes a more efficient version of the Yang
X-ray Optics Program (YXRO) [Yang, S.-H., Gray, A.X., Kaiser, A.M., Mun, B.S.,
Sell, B.C., Kortright, J.B., Fadley, C.S., J. Appl. Phys. 113, 1 (2013)] which
is about an order of magnitude faster than the original version. Human
interaction is not required during optimization. We tested our optimization
algorithm on real and hypothetical problems and show that it finds better
solutions significantly faster than a random search approach. The total
optimization time ranges, depending on the sample structure, from minutes to a
few hours on a modern laptop computer, and can be up to 100x faster than a
corresponding manual optimization. These speeds make the SWOPT program a
valuable tool for real-time analysis of data during synchrotron experiments.
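The comparison the abstract reports, a derivative-free optimizer beating random search on an expensive black-box objective, can be illustrated with a toy sketch. This is not SWOPT's actual algorithm; the objective function and the simple pattern-search scheme below are invented stand-ins for a costly standing-wave simulation:

```python
import random

def expensive_objective(x):
    # Stand-in for a costly standing-wave simulation: a smooth
    # function whose minimum (0) lies at x = (1, 2).
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def random_search(f, bounds, budget, rng):
    # Baseline: sample uniformly inside the box and keep the best point.
    best_x, best_f = None, float("inf")
    for _ in range(budget):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

def pattern_search(f, bounds, budget, rng):
    # Simple black-box pattern search: probe each coordinate
    # direction and halve the step when no probe improves.
    x = [(lo + hi) / 2 for lo, hi in bounds]
    fx = f(x)
    step = max(hi - lo for lo, hi in bounds) / 4
    evals = 1
    while evals < budget and step > 1e-9:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                if evals >= budget:
                    break
                trial = list(x)
                trial[i] = min(max(trial[i] + delta, bounds[i][0]), bounds[i][1])
                ft = f(trial)
                evals += 1
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step /= 2
    return x, fx

rng = random.Random(0)
bounds = [(-5.0, 5.0), (-5.0, 5.0)]
_, f_rand = random_search(expensive_objective, bounds, 200, rng)
_, f_patt = pattern_search(expensive_objective, bounds, 200, rng)
print(f_patt, f_rand)
```

At an equal evaluation budget the structured search converges geometrically while random search only improves with the density of its samples, which mirrors the speedup the abstract claims over a random-search baseline.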
Computer program for parameter optimization
A flexible, large-scale digital computer program was designed for the solution of a wide range of multivariable parameter optimization problems. The program can solve constrained optimization problems involving up to one hundred parameters.
The ADS general-purpose optimization program
The mathematical statement of the general nonlinear optimization problem is given as follows: find the vector of design variables, X, that will minimize F(X) subject to g_j(X) <= 0, j = 1, ..., m; h_k(X) = 0, k = 1, ..., l; and X_i^L <= X_i <= X_i^U, i = 1, ..., N. The vector of design variables, X, includes all those variables that may be changed by the ADS program in order to arrive at the optimum design. The objective function F(X) to be minimized may be weight, cost, or some other performance measure. If the objective is to be maximized, this is accomplished by minimizing -F(X). The inequality constraints g_j(X) include limits on stress, deformation, aeroelastic response, or controllability, as examples, and may be nonlinear implicit functions of the design variables, X. The equality constraints h_k(X) represent conditions that must be satisfied precisely for the design to be acceptable. Equality constraints are not fully operational in version 1.0 of the ADS program, although they are available in the Augmented Lagrange Multiplier method. The side constraints given by the last inequality are used to directly limit the region of search for the optimum; the ADS program will never consider a design that is not within these limits.
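The ingredients of this formulation, an objective, an inequality constraint, and side constraints, can be sketched with a toy exterior-penalty method. This is not the ADS code itself; the problem data (F(X) = x0^2 + x1^2, the constraint x0 + x1 >= 1, and the bounds [0, 2]) are invented for illustration, and the known optimum is (0.5, 0.5):

```python
def minimize_penalized(mu, x, steps=2000):
    # Step size kept below the curvature 2 + 4*mu of the penalized
    # objective so that plain gradient descent stays stable.
    lr = 0.5 / (2.0 + 4.0 * mu)
    for _ in range(steps):
        g = 1.0 - x[0] - x[1]          # inequality constraint g(X) <= 0
        p = 2.0 * mu * max(0.0, g)     # exterior-penalty pull when violated
        grad = [2.0 * x[0] - p, 2.0 * x[1] - p]
        x = [xi - lr * gi for xi, gi in zip(x, grad)]
        # Side constraints X_i^L <= X_i <= X_i^U enforced by clipping,
        # mirroring the rule that designs never leave the bounds.
        x = [min(max(xi, 0.0), 2.0) for xi in x]
    return x

# Invented toy problem: minimize F(X) = x0^2 + x1^2
# subject to x0 + x1 >= 1 and 0 <= x_i <= 2.
x = [2.0, 2.0]
for mu in (1.0, 10.0, 100.0, 1000.0):   # tighten the penalty gradually
    x = minimize_penalized(mu, x)
print(round(x[0], 2), round(x[1], 2))   # prints 0.5 0.5
```

The maximize-by-minimizing -F(X) trick from the abstract applies unchanged here: negating the objective and its gradient turns the same loop into a maximizer.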
Convex Optimization for Linear Query Processing under Approximate Differential Privacy
Differential privacy enables organizations to collect accurate aggregates
over sensitive data with strong, rigorous guarantees on individuals' privacy.
Previous work has found that under differential privacy, computing multiple
correlated aggregates as a batch, using an appropriate strategy, may
yield higher accuracy than computing each of them independently. However,
finding the best strategy that maximizes result accuracy is non-trivial, as it
involves solving a complex constrained optimization program that appears to be
non-linear and non-convex. Hence, in the past much effort has been devoted to
solving this non-convex optimization program. Existing approaches include
various sophisticated heuristics and expensive numerical solutions. None of
them, however, guarantees to find the optimal solution of this optimization
problem.
This paper points out that under (ε, δ)-differential privacy,
the optimal solution of the above constrained optimization problem in search of
a suitable strategy can be found, rather surprisingly, by solving a simple and
elegant convex optimization program. Then, we propose an efficient algorithm
based on Newton's method, which we prove to always converge to the optimal
solution with linear global convergence rate and quadratic local convergence
rate. Empirical evaluations demonstrate the accuracy and efficiency of the
proposed solution. Comment: to appear in ACM SIGKDD 201
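The Newton iteration the abstract builds on can be sketched in one dimension. This is not the paper's actual strategy-optimization program; the strictly convex objective below is invented, and it only illustrates why Newton steps on a smooth convex function converge quadratically near the optimum:

```python
import math

def newton_minimize(f1, f2, x0, tol=1e-12, max_iter=50):
    # Newton's method for a smooth 1-D convex function:
    # iterate x <- x - f'(x) / f''(x) until the gradient vanishes.
    x = x0
    for _ in range(max_iter):
        g = f1(x)
        if abs(g) < tol:
            break
        x -= g / f2(x)
    return x

# Toy strictly convex objective f(x) = exp(x) + exp(-x) + x^2,
# whose unique minimizer is x = 0 (the gradient is odd in x).
f1 = lambda x: math.exp(x) - math.exp(-x) + 2.0 * x   # f'(x)
f2 = lambda x: math.exp(x) + math.exp(-x) + 2.0       # f''(x) > 0 everywhere
x_star = newton_minimize(f1, f2, x0=2.0)
print(x_star)
```

Because f'' is bounded away from zero, every step is well defined; far from the optimum progress is roughly linear, and once close, the error squares at each step, matching the linear-global, quadratic-local convergence pattern the abstract proves for its own program.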
An Integrated Programming and Development Environment for Adiabatic Quantum Optimization
Adiabatic quantum computing is a promising route to the computational power
afforded by quantum information processing. The recent availability of
adiabatic hardware has raised challenging questions about how to evaluate
adiabatic quantum optimization programs. Processor behavior depends on multiple
steps to synthesize an adiabatic quantum program, which are each highly
tunable. We present an integrated programming and development environment for
adiabatic quantum optimization called JADE that provides control over all the
steps taken during program synthesis. JADE captures the workflow needed to
rigorously specify the adiabatic quantum optimization algorithm while allowing
a variety of problem types, programming techniques, and processor
configurations. We have also integrated JADE with a quantum simulation engine
that enables program profiling using numerical calculation. The computational
engine supports plug-ins for simulation methodologies tailored to various
metrics and computing resources. We present the design, integration, and
deployment of JADE and discuss its potential use for benchmarking adiabatic
quantum optimization programs by the quantum computer science community.
Comment: 28 pages, 17 figures, feedback welcomed, even if it's criticism; v2
manuscript updated based on reviewer feedback; v3 manuscript updated based on
reviewer feedback, title modified
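The kind of numerical profiling the abstract describes can be sketched on the smallest possible instance. This is not JADE's API; below is an invented single-qubit anneal H(s) = (1 - s) H0 + s H1 with driver H0 = -X and problem Hamiltonian H1 = -Z, where the 2x2 spectrum is available in closed form:

```python
import math

# For a 2x2 Hamiltonian a*X + b*Z the eigenvalues are +/- sqrt(a^2 + b^2),
# so the spectral gap along the anneal schedule has a closed form.
def gap(s):
    a = -(1.0 - s)   # coefficient of X (driver term)
    b = -s           # coefficient of Z (problem term)
    return 2.0 * math.sqrt(a * a + b * b)

# The minimum gap over the schedule governs how slowly an adiabatic
# program must run; scan the schedule numerically to locate it.
s_min = min((i / 1000 for i in range(1001)), key=gap)
print(round(s_min, 3), round(gap(s_min), 4))   # prints 0.5 1.4142
```

On this toy instance the gap is smallest at the midpoint of the schedule, where it equals sqrt(2); for multi-qubit problem Hamiltonians the same scan requires numerical diagonalization, which is the role of the simulation engine described above.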
Computer optimization program finds values for several independent variables that minimize a dependent variable
Computer program finds values of independent variables that minimize the dependent variable. This optimization program has been used on the F-1 and J-2 engine programs to establish minimum film coolant requirements.
