
    C++ lambda expressions and closures

    Get PDF
    A style of programming that uses higher-order functions has become common in C++, following the introduction of the Standard Template Library (STL) into the standard library. In addition to their utility as arguments to STL algorithms, function parameters are useful as callbacks on GUI events, as definitions of tasks to be executed in a thread, and so forth. C++’s mechanisms for defining functions or function objects are, however, rather verbose, and they often force the function’s definition to be placed far from its use. As a result, C++ frustrates programmers trying to take full advantage of its own standard libraries. The effective use of modern C++ libraries calls for a concise mechanism for defining small one-off functions in the language, a need that can be fulfilled with lambda expressions.

    This paper describes a design and implementation of language support for lambda expressions in C++. C++’s compilation model, where activation records are maintained on a stack, and its lack of automatic object lifetime management make safe lambda functions and closures challenging: if a closure outlives its scope of definition, references stored in the closure dangle. Our design carefully balances conciseness of syntax against the explicit annotations needed to guarantee safety. The presented design is included in the draft specification of the forthcoming major revision of the ISO C++ standard, dubbed C++0x. In rewriting typical C++ programs to take advantage of lambda functions, we observed clear benefits, such as reduced code size and improved clarity.
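
    The dangling-closure hazard described in the abstract is easy to demonstrate in the C++11 syntax that eventually standardized this design (a minimal sketch; the function names are illustrative):

```cpp
#include <functional>
#include <iostream>

// Capturing `n` by value copies it into the closure: safe to use after
// make_adder_by_value returns.
std::function<int(int)> make_adder_by_value(int n) {
    return [n](int x) { return x + n; };
}

// Capturing `n` by reference stores a reference to a stack variable that
// dies when the function returns: the closure outlives its scope of
// definition, and the stored reference dangles.
std::function<int(int)> make_adder_by_reference(int n) {
    return [&n](int x) { return x + n; };
}

int main() {
    auto safe = make_adder_by_value(10);
    std::cout << safe(5) << '\n';      // prints 15

    auto unsafe = make_adder_by_reference(10);
    // unsafe(5) would read through a dangling reference: undefined behavior.
    (void)unsafe;
}
```

    The explicit `[n]` versus `[&n]` capture annotations are exactly the kind of conciseness-versus-safety trade-off the paper's design weighs.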

    Optimal compromise between incompatible conditional probability distributions, with application to Objective Bayesian Kriging

    Full text link
    Models are often defined through conditional rather than joint distributions, but it can be difficult to check whether the conditional distributions are compatible, i.e., whether there exists a joint probability distribution which generates them. When they are compatible, a Gibbs sampler can be used to sample from this joint distribution. When they are not, the Gibbs sampling algorithm may still be applied, resulting in a "pseudo-Gibbs sampler". We show its stationary probability distribution to be the optimal compromise between the conditional distributions, in the sense that it minimizes a mean squared misfit between them and its own conditional distributions. This allows us to perform Objective Bayesian analysis of correlation parameters in Kriging models by using univariate conditional Jeffreys-rule posterior distributions instead of the widely used multivariate Jeffreys-rule posterior. This strategy makes the full-Bayesian procedure tractable. Numerical examples show it has near-optimal frequentist performance in terms of prediction interval coverage.
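
    The pseudo-Gibbs sampler is mechanically the ordinary Gibbs sweep, applied even though no joint distribution generates the conditionals. A minimal sketch with two deliberately incompatible Gaussian conditionals (illustrative stand-ins, not the Kriging posteriors from the paper):

```cpp
#include <iostream>
#include <random>

// X | Y=y ~ N(0.5*y, 1) and Y | X=x ~ N(0.9*x, 1): no bivariate normal has
// both as its conditionals (with equal conditional variances the two
// regression coefficients would have to be equal), yet alternating draws
// still settle into a stationary distribution -- the "optimal compromise"
// the paper characterizes.
int main() {
    std::mt19937 rng(42);
    std::normal_distribution<double> noise(0.0, 1.0);

    double x = 0.0, y = 0.0;
    double sum_x = 0.0, sum_xx = 0.0;
    const int burn_in = 1000, n_samples = 100000;

    for (int i = 0; i < burn_in + n_samples; ++i) {
        x = 0.5 * y + noise(rng);  // draw from the conditional X | Y
        y = 0.9 * x + noise(rng);  // draw from the conditional Y | X
        if (i >= burn_in) { sum_x += x; sum_xx += x * x; }
    }
    const double mean = sum_x / n_samples;
    const double var = sum_xx / n_samples - mean * mean;
    std::cout << "stationary mean(X) = " << mean
              << ", var(X) = " << var << '\n';
}
```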

    Revisiting Language Support for Generic Programming: When Genericity Is a Core Design Goal

    Get PDF
    Context: Generic programming, as defined by Stepanov, is a methodology for writing efficient and reusable algorithms by considering only the required properties of their underlying data types and operations. Generic programming has proven to be an effective means of constructing libraries of reusable software components in languages that support it. Generics-related language design choices play a major role in how conducive a language is to generic programming in practice.

    Inquiry: Several mainstream programming languages (e.g. Java and C++) were first created without generics; features to support generic programming were added later, gradually. Much of the existing literature on supporting generic programming thus focuses on retrofitting generic programming into existing languages and identifying the related implementation challenges. Is the programming experience significantly better, or different, in a language designed for generic programming, free of limitations imposed by prior design choices?

    Approach: We examine Magnolia, a language designed to embody generic programming. Magnolia is representative of an approach to language design rooted in algebraic specifications. We repeat a well-known experiment, where we put Magnolia's generic programming facilities under scrutiny by implementing a subset of the Boost Graph Library, and reflect on our development experience.

    Knowledge: We discover that the idioms identified as key features for supporting Stepanov-style generic programming in previous studies of the topic do not tell the full story. We clarify which of them are more a means to an end than fundamental features for supporting generic programming. Based on the development experience with Magnolia, we identify variadics as an additional key feature for generic programming, and we point out limitations and challenges of genericity by property.

    Grounding: Our work uses a well-known framework from the literature for evaluating the generic programming facilities of a language to evaluate the algebraic approach through Magnolia, and we draw comparisons with well-known programming languages.

    Importance: This work gives a fresh perspective on generic programming and clarifies which language properties are fundamental, and what their trade-offs are, when supporting Stepanov-style generic programming. Understanding how to lay the groundwork for generic programming will inform future language design.
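
    Magnolia expresses requirements through algebraic specifications; for comparison, the closest mainstream rendering of Stepanov's "only the required properties" discipline is a constrained template, sketched here in C++20 concepts (not Magnolia syntax):

```cpp
#include <concepts>
#include <iostream>
#include <list>
#include <ranges>
#include <vector>

// Stepanov-style genericity: the algorithm states only the minimal
// properties it needs from its input (a non-empty forward range of totally
// ordered values), not any concrete container type.
template <std::ranges::forward_range R>
    requires std::totally_ordered<std::ranges::range_value_t<R>>
auto smallest(const R& range) {
    auto it = std::ranges::begin(range);
    auto best = *it;
    for (++it; it != std::ranges::end(range); ++it)
        if (*it < best) best = *it;
    return best;
}

int main() {
    std::vector<int> v{3, 1, 4, 1, 5};
    std::list<double> l{2.7, 1.8, 2.8};
    std::cout << smallest(v) << ' ' << smallest(l) << '\n';  // 1 1.8
}
```

    Genericity by property goes further than such syntactic constraints: it also demands the axioms hold (e.g. that `<` is a strict weak ordering), which C++20 concepts cannot check.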

    Building Qutrit Diagonal Gates from Phase Gadgets

    Full text link
    Phase gadgets have proved to be an indispensable tool for reasoning about ZX-diagrams, being used in the optimisation and simulation of quantum circuits and in the theory of measurement-based quantum computation. In this paper we study phase gadgets for qutrits. We present the flexsymmetric variant of the original qutrit ZX-calculus, which allows for rewriting that is closer in spirit to the original (qubit) ZX-calculus. In this calculus phase gadgets look as you would expect, but there are non-trivial differences in their properties. We devise new qutrit-specific tricks to extend the graphical Fourier theory of qubits, resulting in a translation between the 'additive' phase gadgets and a 'multiplicative' counterpart we dub phase multipliers. This enables us to generalise the qubit notion of multiple-control to qutrits in two ways. The first type controls on a single tritstring, while the second type applies the gate a number of times equal to the tritwise multiplication modulo 3 of the control qutrits. We show how both types of control can be implemented for any qutrit Z or X phase gate, ancilla-free, and using only Clifford and phase gates. The first requires a polynomial number of gates and exponentially small phases, while the second requires an exponential number of gates but constant-sized phases. This is interesting, because such a construction is not possible in the qubit setting. As an application of these results we find a construction for emulating arbitrary qubit diagonal unitaries, and specifically find an ancilla-free emulation of the qubit CCZ gate that requires only three single-qutrit non-Clifford gates, provably fewer than the four T gates needed for qubits with ancilla.

    Comment: In Proceedings QPL 2022, arXiv:2311.0837
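
    For orientation: the qubit Z-phase gadget acts diagonally, with the phase determined by the parity of its inputs; the qutrit analogue replaces the Boolean sum with addition modulo 3 and therefore carries two angles, one for each nonzero residue, matching the qutrit phase gate Z(α, β) = diag(1, e^{iα}, e^{iβ}). A sketch of both actions (the qubit form is standard; the qutrit form follows the paper's description):

```latex
% Qubit Z-phase gadget: phase on the parity (XOR) of the bits.
\[
  |x_1,\dots,x_n\rangle \;\longmapsto\;
  e^{\,i\alpha\,(x_1 \oplus \cdots \oplus x_n)}\;|x_1,\dots,x_n\rangle,
  \qquad x_k \in \{0,1\}.
\]

% Qutrit phase gadget (sketch): one angle per nonzero residue of the sum
% of the trits modulo 3.
\[
  |x_1,\dots,x_n\rangle \;\longmapsto\;
  e^{\,i\,\theta(s)}\;|x_1,\dots,x_n\rangle,
  \qquad s = x_1 + \cdots + x_n \bmod 3,
  \quad \theta(0)=0,\ \theta(1)=\alpha,\ \theta(2)=\beta.
\]
```

    The paper's phase multipliers replace this additive combination with a multiplicative one, which is what underpins the second type of control described above.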

    An advanced specular and diffuse Bidirectional Reflectance Distribution Function target model for a synthetic aperture ground penetrating radar

    Get PDF
    Specific radars that are designed to radiate electromagnetic (EM) energy into the ground for the purpose of detecting and identifying underground targets are called ground penetrating radars (GPR). High-resolution three-dimensional images of the underground environment can be produced using a bistatic, synthetic aperture radar (SAR) processing technique, and information about the underground scenario can be extracted from those images through methodical post-processing analysis. Theoretical models and proof-of-concept designs are used to validate and advance deep GPR research and development efforts. Theoretical modeling of a deep GPR system is a cost-effective way to obtain realistic deep GPR results and analysis, but a realistic model must correctly represent both the GPR system itself and the effects of energy interactions on the deep GPR data; such a model ultimately aids the effort to build field-deployable and airborne deep GPRs. Complex energy interactions take place when EM energy propagates through a high-dielectric medium, creating adverse effects on GPR data. These interactions include specular and diffuse reflections, attenuation, and dispersion. A valid theoretical model must be capable of producing realistic joint specular and diffuse reflection at different dielectric boundaries. This thesis proposes and analyzes the Bidirectional Reflectance Distribution Function (BRDF) as a valid specular and diffuse reflectance model for the generation of realistic GPR data. The thesis also introduces and analyzes the direct path signal and the air-soil interface commonly found in bistatic GPR systems, providing a realistic GPR model that can be used in the development of more advanced systems. To assess the validity of the proposed model, comprehensive testing and analysis were completed, including variations in the target's spatial orientation, size, and position, an examination of the BRDF as a reflectance model, and the modeling of the direct path signal. The model was then compared against known real GPR data. The realistic energy interaction created through the use of the BRDF, together with the incorporation of path attenuation, dispersion, the direct path signal, and the air-soil interface, has proven to produce acceptable results.
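
    The thesis's exact BRDF parameters are not reproduced in the abstract, but the joint specular-diffuse idea can be sketched as a Lambertian term plus a specular lobe around the mirror direction. Below is a normalized Phong-style lobe, chosen purely for illustration, not the thesis's actual model:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { double x, y, z; };

double dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Mirror direction of `in` about the surface normal `n` (both unit length).
Vec3 reflect(const Vec3& in, const Vec3& n) {
    const double d = 2.0 * dot(in, n);
    return {d * n.x - in.x, d * n.y - in.y, d * n.z - in.z};
}

// Joint specular-diffuse BRDF (illustrative): a Lambertian term k_d/pi plus
// a specular lobe concentrated around the mirror direction, with shininess
// exponent m. Returns the radiance ratio (units 1/sr) for incident
// direction wi and outgoing direction wo.
double brdf(const Vec3& wi, const Vec3& wo, const Vec3& n,
            double k_d, double k_s, double m) {
    const double pi = 3.14159265358979323846;
    const double diffuse = k_d / pi;
    const Vec3 r = reflect(wi, n);
    const double spec_cos = std::max(0.0, dot(r, wo));
    const double specular = k_s * (m + 2.0) / (2.0 * pi) * std::pow(spec_cos, m);
    return diffuse + specular;
}
```

    In a GPR simulation, such a function weights the energy scattered from a dielectric boundary toward the receiver for each incident/outgoing direction pair.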

    Control of Electrodialysis Desalination Systems as Smart Loads in Microgrids with High Penetration of Renewable Generation

    Get PDF
    Water desalination systems connected to microgrids with a high penetration of renewable energy generation are frequently used to promote the development of remote areas. These microgrids often have power quality and even stability problems. This work shows that electrodialysis desalination systems can be managed as smart loads; that is, they can contribute to the power balance and voltage regulation of the microgrid without neglecting their main function of water desalination. To this end, a multiple-input multiple-output model of the desalination system is proposed, in which the controlled variables are the treated-water salt concentration and the active and reactive power demanded by the desalination system. Based on this model, a control law is proposed that handles the complexity of the nonlinear system in a simple and precise way. The proposed control guarantees a low salt concentration in the drinking water and supports the energy balance of the microgrid, allowing better control of power quality and greater penetration of renewable generation.
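
    The abstract gives no equations, so the following is only an illustrative sketch of the smart-load idea: three setpoints (product-water concentration plus the active and reactive power requested by the microgrid) tracked by independent discrete-time PI loops against a toy first-order plant. The real system is nonlinear and coupled, which is precisely what the paper's model-based control law addresses; all names, gains, and dynamics below are hypothetical.

```cpp
#include <array>
#include <iostream>

// Simple discrete-time PI regulator (illustrative stand-in for the paper's
// model-based MIMO control law).
struct PI {
    double kp, ki;
    double integral = 0.0;
    double step(double error, double dt) {
        integral += error * dt;
        return kp * error + ki * integral;
    }
};

int main() {
    const double dt = 0.1;
    // Loops: y[0] = salt concentration, y[1] = active power P,
    // y[2] = reactive power Q (toy units throughout).
    std::array<PI, 3> loops{{{2.0, 0.5}, {1.0, 0.2}, {1.0, 0.2}}};
    std::array<double, 3> y{8.0, 0.0, 0.0};      // initial plant outputs
    std::array<double, 3> ref{0.5, 50.0, 10.0};  // setpoints from the microgrid

    for (int k = 0; k < 200; ++k) {
        for (int i = 0; i < 3; ++i) {
            const double u = loops[i].step(ref[i] - y[i], dt);
            y[i] += dt * (-y[i] + u);  // toy first-order plant response
        }
    }
    std::cout << y[0] << ' ' << y[1] << ' ' << y[2] << '\n';
}
```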

    Obtaining Real-World Benchmark Programs from Open-Source Repositories Through Abstract-Semantics Preserving Transformations

    Get PDF
    Benchmark programs are an integral part of program analysis research. Researchers use benchmark programs to evaluate existing techniques and test the feasibility of new approaches. The larger and more realistic the set of benchmarks, the more confident a researcher can be about the correctness and reproducibility of their results. However, obtaining an adequate set of benchmark programs has been a long-standing challenge in the program analysis community. In this thesis, we present the APT tool, a framework we designed and implemented to automate the generation of realistic benchmark programs suitable for program analysis evaluations. Our tool targets intra-procedural analyses that operate on an integer domain, specifically symbolic execution. The framework is composed of three main stages. In the first stage, the tool extracts potential benchmark programs suitable for symbolic execution from open-source repositories. In the second stage, it transforms the extracted programs into compilable, stand-alone benchmarks by removing external dependencies and nonlinear expressions. In the third stage, the benchmarks are verified and made available to the user. We have designed our transformation algorithms to remove program dependencies and nonlinear expressions while preserving semantic equivalence under the abstraction of symbolic analysis: the information the analysis computes on the original program and on its transformed version should be equivalent. Our work provides static analysis researchers with concise, compilable benchmark programs that are relevant to symbolic execution, allowing them to focus their efforts on advancing analysis techniques. Furthermore, it benefits the software engineering community by enabling static analysis researchers to benchmark with a large, realistic set of programs, thus strengthening the empirical evidence for advancements in static program analysis.
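
    The thesis's exact transformation rules are not reproduced in the abstract, but the flavor of an abstract-semantics-preserving rewrite can be sketched: a nonlinear subexpression, which a linear-arithmetic symbolic executor treats as opaque, is replaced by a fresh unconstrained input, leaving the path conditions the analysis computes equivalent at that abstraction (a hypothetical before/after, not APT's actual output):

```cpp
// Before: the product x*y is nonlinear, so a symbolic executor backed by a
// linear-arithmetic solver cannot reason about it precisely.
int before(int x, int y) {
    int p = x * y;
    if (p > 100) return 1;
    return 0;
}

// After (hypothetical APT-style rewrite): the nonlinear expression becomes
// a fresh unconstrained input. Both branches remain reachable and the path
// conditions over `p` are unchanged, so the abstract (symbolic) semantics
// is preserved even though the concrete semantics differs.
int after(int p /* replaces x * y */) {
    if (p > 100) return 1;
    return 0;
}
```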