
    Index handling and assign optimization for Algorithmic Differentiation reuse index managers

    For operator-overloading Algorithmic Differentiation tools, primal and adjoint variables are usually identified via indices, and two common schemes exist for managing and distributing them. The linear approach is easy to implement and supports memory optimization with respect to copy statements. The reuse approach, on the other hand, requires more implementation effort but results in much smaller adjoint vectors, which are better suited to the vector mode of Algorithmic Differentiation. In this paper, we present both approaches, describe how to implement them, and discuss their advantages, their disadvantages, and the properties of the resulting Algorithmic Differentiation type. In addition, we present a new management scheme that supports both copy optimizations and the reuse of indices, thus combining the advantages of the other two. The implementations of all three schemes are compared on a simple synthetic example and on a real-world example using the computational fluid dynamics solver SU2. Comment: 20 pages, 14 figures, 4 tables.
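
    As a minimal sketch of the two classical schemes (with hypothetical class names, not the paper's implementation): a linear manager hands out strictly increasing indices, while a reuse manager recycles the indices of released values through a free list, keeping the adjoint vector small.

        #include <cstddef>
        #include <vector>

        // Hypothetical linear index manager: every assignment receives a
        // fresh index, so the adjoint vector grows with the recorded tape.
        class LinearIndexManager {
          std::size_t next = 1;  // index 0 is reserved for passive values
        public:
          std::size_t assign() { return next++; }
          void release(std::size_t) {}  // indices are never reissued
          std::size_t adjointSize() const { return next; }
        };

        // Hypothetical reuse index manager: released indices go onto a
        // free list and are handed out again before new ones are created.
        class ReuseIndexManager {
          std::size_t next = 1;
          std::vector<std::size_t> freeList;
        public:
          std::size_t assign() {
            if (!freeList.empty()) {
              std::size_t i = freeList.back();
              freeList.pop_back();
              return i;
            }
            return next++;
          }
          void release(std::size_t i) { freeList.push_back(i); }
          std::size_t adjointSize() const { return next; }
        };

    Under the reuse scheme, the adjoint vector only needs to grow to the peak number of simultaneously live indices, which is what makes it attractive for vector-mode Algorithmic Differentiation.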

    Reverse-Mode Automatic Differentiation of Compiled Programs

    Tools for algorithmic differentiation (AD) provide accurate derivatives of computer-implemented functions for use in, e.g., optimization and machine learning (ML). However, they often require the source code of the function to be available in a restricted set of programming languages. As a step towards making AD accessible for code bases with cross-language or closed-source components, we recently presented the forward-mode AD tool Derivgrind, which inserts forward-mode AD logic into the machine code of a compiled program using the Valgrind dynamic binary instrumentation framework. This work extends Derivgrind with the capability to record the real-arithmetic evaluation tree, thus enabling operator-overloading-style reverse-mode AD for compiled programs. We maintain the high level of correctness reported for Derivgrind's forward mode, failing the same few test cases in an extensive test suite for the same well-understood reasons. Runtime-wise, the recording slows down the execution of a compiled 64-bit benchmark program by a factor of about 180. Comment: 17 pages, 5 figures, 1 listing.
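
    To illustrate the operator-overloading style of reverse-mode AD that such recording enables, here is a generic textbook sketch of a tape and its reverse sweep; it works at the source level and is not Derivgrind's machine-code-level mechanism.

        #include <cmath>
        #include <cstddef>
        #include <iostream>
        #include <vector>

        // One tape entry per recorded operation.
        struct TapeEntry {
          std::size_t lhs, arg1, arg2;  // indices into the adjoint vector
          double d1, d2;                // partials w.r.t. the arguments
        };

        std::vector<TapeEntry> tape;
        std::size_t nextIndex = 0;

        struct Active {
          double value;
          std::size_t index;
          explicit Active(double v) : value(v), index(nextIndex++) {}
        };

        Active operator*(const Active& a, const Active& b) {
          Active r(a.value * b.value);
          tape.push_back({r.index, a.index, b.index, b.value, a.value});
          return r;
        }

        Active sin(const Active& a) {
          Active r(std::sin(a.value));
          // Unary operation: the second slot carries a zero partial.
          tape.push_back({r.index, a.index, a.index, std::cos(a.value), 0.0});
          return r;
        }

        // Reverse sweep: propagate adjoints from the output to the inputs.
        std::vector<double> reverseSweep(std::size_t outputIndex) {
          std::vector<double> adj(nextIndex, 0.0);
          adj[outputIndex] = 1.0;  // seed dy/dy = 1
          for (auto it = tape.rbegin(); it != tape.rend(); ++it) {
            adj[it->arg1] += it->d1 * adj[it->lhs];
            adj[it->arg2] += it->d2 * adj[it->lhs];
          }
          return adj;
        }

        int main() {
          Active x(2.0), y(3.0);
          Active z = sin(x * y);            // z = sin(x*y)
          std::vector<double> adj = reverseSweep(z.index);
          std::cout << adj[x.index] << " "  // dz/dx = y*cos(x*y)
                    << adj[y.index] << "\n";  // dz/dy = x*cos(x*y)
        }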

    Forward-Mode Automatic Differentiation of Compiled Programs

    Algorithmic differentiation (AD) is a set of techniques that provide partial derivatives of computer-implemented functions. Such a function can be supplied to state-of-the-art AD tools via its source code, or via an intermediate representation produced while compiling its source code. We present the novel AD tool Derivgrind, which augments the machine code of compiled programs with forward-mode AD logic. Derivgrind leverages the Valgrind instrumentation framework for structured access to the machine code, and a shadow memory tool to store the dot values. Access to the source code is required at most for the files in which input and output variables are defined. Derivgrind's versatility comes at the price of scaling the run time by a factor between 30 and 75, measured on a benchmark based on a numerical solver for a partial differential equation. Results of our extensive regression test suite indicate that Derivgrind produces correct results on GCC- and Clang-compiled programs, including a Python interpreter, with a small number of exceptions. While we provide a list of scenarios that Derivgrind does not handle correctly, nearly all of them are academic counterexamples or originate from highly optimized math libraries. As long as differentiating those is avoided, Derivgrind can be applied to an unprecedentedly wide range of cross-language or partially closed-source software with little integration effort. Comment: 21 pages, 3 figures, 3 tables, 5 listings.
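
    The "dot values" kept in shadow memory correspond to the derivative component of dual numbers. A minimal source-level sketch of forward-mode AD in this style (a textbook illustration, not Derivgrind's machine-code mechanism):

        #include <cmath>
        #include <iostream>

        // Each value carries a "dot value", much like Derivgrind keeps a
        // shadow-memory dot value alongside each machine word.
        struct Dual {
          double val, dot;
        };

        Dual operator*(Dual a, Dual b) {
          return {a.val * b.val, a.dot * b.val + a.val * b.dot};  // product rule
        }

        Dual sin(Dual a) {
          return {std::sin(a.val), std::cos(a.val) * a.dot};  // chain rule
        }

        int main() {
          Dual x{2.0, 1.0};            // seed dx/dx = 1
          Dual y{3.0, 0.0};
          Dual z = sin(x * y);         // z = sin(x*y)
          std::cout << z.dot << "\n";  // dz/dx = y*cos(x*y)
        }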

    La incuestionabilidad del riesgo

    Before the 1980s, the specialized literature on risk analysis and management was dominated by the so-called dominant or technocratic view. This view held that natural disasters are extreme physical events produced by a capricious nature, external to society, and requiring technological and management solutions from experts. This article develops a new explanation for the hegemonic persistence of the technocratic view, based on the concept of the unquestionability of risk. This concept refers to the incapacity and unwillingness of experts, scientists, and decision makers in general (claim makers) to identify and act upon the deep causes of risk production, since doing so would require questioning the normative imperatives, the needs of the elites, and the lifestyles of the current globalized socio-economic system.

    Mind the Costs: Rescaling and Multi-Level Environmental Governance in Venice Lagoon

    Competences over environmental matters are distributed across agencies at different scales on a national-to-local continuum. This article adopts a transaction-cost economics perspective to explore whether, in the light of a particular problem, the scale at which a certain competence is attributed can be reconsidered. Specifically, it tests whether a presumption of least-cost operation for an agency at a given scale can hold, investigating whether rescaling certain tasks to solve a scale-related problem is likely to increase the cost of day-to-day agency operations compared with the status quo. The article explores this perspective for the case of Venice Lagoon, where the negative aspects of the present arrangement for fishery management and morphological remediation are directly linked to the scale of the agencies involved. The analysis suggests that the scales have been chosen correctly, at least from the point of view of the costs incurred by the agencies involved; consequently, rescaling those agencies is not a viable option.

    Eco-politics beyond the paradigm of sustainability: A conceptual framework and research agenda

    This contribution sketches a conceptual framework for the analysis of the post-ecologist era and outlines a research agenda for investigating its politics of unsustainability. The article suggests that this new era and its particular mode of eco-politics necessitate a new environmental sociology. Following a review of some achievements and limitations of the paradigm of sustainability, the concept of post-ecologism is related to existing discourses of the ‘end of nature’, the ‘green backlash’ and the ‘death of environmentalism’. The shifting terrain of eco-politics in the late-modern condition is mapped, and an eco-sociological research programme is outlined, centring on the post-ecologist question: how do advanced modern capitalist consumer democracies try, and manage, to sustain what is known to be unsustainable?

    Desastres naturais: convivência com o risco

    Studies on natural disaster risk have progressed from a physicalist approach to a socio-environmental perspective. However, planning and management still follow the anthropocentric paradigm of human superiority and the unlimited power of science and technology, revealing a cognitive, cultural, and practical inability on the part of experts, scientists, and decision makers (claim makers) to identify and act upon the social causes of risk production. In contrast to a Cartesian, positivist science of problem solving premised on security and control over the natural world, a post-normal science is proposed that acknowledges the risks and uncertainties of scientific knowledge and of environmental problems. This approach also relies on participation and dialogue among stakeholders as a way to improve the quality of scientific knowledge and the understanding of the complexity of environmental issues. This article discusses the need for an epistemological leap in how we think about and produce knowledge, as well as in how disaster risk management is implemented, taking as its objects of study communication and education processes for disaster prevention.

    Environmentalism in the EU-28 context: the impact of governance quality on environmental energy efficiency

    Environmental policies are a significant cornerstone of a developed economy, but the question that arises is whether such policies lead to a sustainable growth path. The energy sector clearly plays a pivotal role in environmental policy, and although the literature has examined the link between energy consumption and economic growth in an abundance of studies, it does not explicitly consider the role of institutional or governance-quality variables in the process. Both globalization and democracy are important drivers of sustainability, while environmentalism is essential to the objective of a “better world.” Governance quality is expected to be key, not only for economic purposes but also for the efficiency of environmental policies. To that end, this paper explores the link between governance quality and energy efficiency for the EU-28 countries over the period 1995 to 2014. The findings document a nexus between energy efficiency and income: the two move together, and the most energy-efficient countries are in the group with higher GDP per capita. Furthermore, the results show that governance quality is an important driver of energy efficiency and, hence, of environmental policies.

    Exploration of differentiability in a proton computed tomography simulation framework

    Objective. Gradient-based optimization using algorithmic derivatives can be a useful technique to improve engineering designs with respect to a computer-implemented objective function. Likewise, uncertainty quantification through computer simulations can be carried out by means of derivatives of the computer simulation. However, the effectiveness of these techniques depends on how ‘well-linearizable’ the software is. In this study, we assess how promising derivative information of a typical proton computed tomography (pCT) scan computer simulation is for the aforementioned applications. Approach. This study is mainly based on numerical experiments, in which we repeatedly evaluate three representative computational steps with perturbed input values. We support our observations with a review of the algorithmic steps and arithmetic operations performed by the software, using debugging techniques. Main results. The model-based iterative reconstruction (MBIR) subprocedure (at the end of the software pipeline) and the Monte Carlo (MC) simulation (at the beginning) were piecewise differentiable. However, the observed high density and magnitude of the jumps were likely to preclude most meaningful uses of the derivatives. Jumps in the MBIR function arose from the discrete computation of the set of voxels intersected by a proton path, and could be reduced in magnitude by a ‘fuzzy voxels’ approach. The investigated jumps in the MC function arose from local changes in the control flow that affected the number of random numbers consumed. The tracking algorithm solves an inherently non-differentiable problem. Significance. Besides the technical challenges of merely applying AD to existing software projects, the MC and MBIR codes must be adapted to compute smoother functions. For the MBIR code, we present one possible approach; for the MC code, this will be the subject of further research. For the tracking subprocedure, further research on surrogate models is necessary.
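
    To make the ‘fuzzy voxels’ idea concrete, here is a hypothetical illustration of the difference between a hard intersection test and a smooth weighting; the kernel and the width parameter are assumptions, not the paper's exact formulation.

        #include <cmath>

        // Hard 0/1 test: the voxel weight jumps as the proton path moves
        // across the voxel boundary, making the reconstruction operator
        // piecewise constant in the path parameters.
        double hardWeight(double distance, double voxelHalfWidth) {
          return distance <= voxelHalfWidth ? 1.0 : 0.0;
        }

        // Fuzzy weight: a smooth ramp from 1 inside the voxel to 0 outside,
        // differentiable in the path parameters through 'distance', so small
        // path perturbations change the system matrix continuously.
        double fuzzyWeight(double distance, double voxelHalfWidth, double sigma) {
          double t = (distance - voxelHalfWidth) / sigma;  // sigma: assumed width
          return 1.0 / (1.0 + std::exp(4.0 * t));          // logistic falloff
        }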