
    Superconvergence for Neumann boundary control problems governed by semilinear elliptic equations

    This paper is concerned with the discretization error analysis of semilinear Neumann boundary control problems in polygonal domains with pointwise inequality constraints on the control. The approximations of the control are piecewise constant functions. The state and adjoint state are discretized by piecewise linear finite elements. In a postprocessing step, approximations of locally optimal controls of the continuous optimal control problem are constructed by projection of the respective discrete adjoint state. Although the quality of the approximations is in general affected by corner singularities, a convergence order of $h^2|\ln h|^{3/2}$ is proven for domains with interior angles smaller than $2\pi/3$ using quasi-uniform meshes. For larger interior angles, mesh grading techniques are used to obtain the same order of convergence.
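    The postprocessing step described here follows the usual pattern for control-constrained problems with a Tikhonov regularization term. The block below is a generic sketch under that assumption; the regularization parameter $\nu$, the control bounds $u_a \le u \le u_b$, and the discrete adjoint $\varphi_h$ are notational assumptions, not taken from the paper.

```latex
% Hypothetical sketch of the postprocessing projection step, assuming a
% Tikhonov term (nu/2)||u||^2 and box constraints u_a <= u <= u_b on the control.
\[
  \tilde u_h(x) \;=\; \Pi_{[u_a,\,u_b]}\!\Bigl(-\tfrac{1}{\nu}\,\varphi_h(x)\Bigr),
  \qquad
  \Pi_{[u_a,u_b]}(v) \;=\; \min\bigl(u_b,\;\max(u_a,\,v)\bigr),
\]
% so that the projected control inherits the accuracy of the discrete adjoint:
\[
  \|\bar u - \tilde u_h\|_{L^2(\Gamma)} \;\le\; C\,h^{2}\,|\ln h|^{3/2}
  \quad\text{for interior angles smaller than } 2\pi/3 .
\]
```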

    Backdoors to Normality for Disjunctive Logic Programs

    Over the last two decades, propositional satisfiability (SAT) has become one of the most successful and widely applied techniques for the solution of NP-complete problems. The aim of this paper is to investigate theoretically how SAT can be utilized for the efficient solution of problems that are harder than NP or co-NP. In particular, we consider the fundamental reasoning problems in propositional disjunctive answer set programming (ASP), Brave Reasoning and Skeptical Reasoning, which ask whether a given atom is contained in at least one or in all answer sets, respectively. Both problems are located at the second level of the Polynomial Hierarchy and are thus assumed to be harder than NP or co-NP. One cannot transform these two reasoning problems into SAT in polynomial time, unless the Polynomial Hierarchy collapses. We show that certain structural aspects of disjunctive logic programs can be utilized to break through this complexity barrier, using new techniques from Parameterized Complexity. In particular, we exhibit transformations from Brave and Skeptical Reasoning to SAT that run in time O(2^k n^2), where k is a structural parameter of the instance and n the input size. In other words, the reduction is fixed-parameter tractable for parameter k. As the parameter k we take the size of a smallest backdoor with respect to the class of normal (i.e., disjunction-free) programs. Such a backdoor is a set of atoms that, when deleted, makes the program normal. As a consequence, the combinatorial explosion, which is expected when transforming a problem from the second level of the Polynomial Hierarchy to the first level, can now be confined to the parameter k, while the running time of the reduction is polynomial in the input size n, where the order of the polynomial is independent of k. Comment: A short version will appear in the Proceedings of the 27th AAAI Conference on Artificial Intelligence (AAAI'13). A preliminary version of the paper was presented at the workshop Answer Set Programming and Other Computing Paradigms (ASPOCP 2012), 5th International Workshop, September 4, 2012, Budapest, Hungary.
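    As one simplified reading of the backdoor definition, the sketch below checks whether deleting a candidate set of atoms leaves every rule with at most one head atom (i.e., a disjunction-free program), and brute-forces a smallest such set. The rule encoding and the exhaustive search are illustrative assumptions, not the paper's fixed-parameter algorithm.

```python
from itertools import combinations

# A rule is represented as (head_atoms, body_atoms); illustrative encoding only.

def is_normal_after_deletion(rules, backdoor):
    """A program is normal (disjunction-free) if every rule head contains at
    most one atom once the backdoor atoms have been deleted from the heads."""
    return all(len(head - backdoor) <= 1 for head, _body in rules)

def smallest_normality_backdoor(rules, max_size):
    """Brute-force search for a smallest deletion backdoor of size <= max_size.
    Only atoms occurring in disjunctive heads need to be considered."""
    candidates = set()
    for head, _body in rules:
        if len(head) > 1:
            candidates |= head
    for k in range(max_size + 1):
        for subset in combinations(sorted(candidates), k):
            if is_normal_after_deletion(rules, frozenset(subset)):
                return frozenset(subset)
    return None

# Example: the single rule {a ; b :- c.} becomes normal after deleting a or b.
rules = [(frozenset({"a", "b"}), frozenset({"c"}))]
print(smallest_normality_backdoor(rules, max_size=2))  # e.g. frozenset({'a'})
```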

    Exposures and exposure hedging in exchange rate risk management

    Corporations are affected by increasing volatilities on foreign exchange markets. A response to this development was the creation of financial instruments, so-called derivatives, in order to protect corporations from the effects of flexible exchange rates. To understand the risks involved and to make sound decisions, it is necessary to gain a fundamental insight into exchange rate risk management. The first aim of this paper is to systematize the possibilities of determining exchange rate risk as well as the objectives of exchange rate risk management. In the second part of the paper, a model to determine the optimal hedge ratio in the case of hedging transaction risks with forwards is described. Keywords: Currency Risk, Transaction Risk, Currency Forwards, Optimal Hedging.
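    The abstract does not reproduce the paper's specific hedge-ratio model. As a point of reference, a common textbook formulation is the minimum-variance hedge ratio h* = Cov(dS, dF) / Var(dF), which the sketch below estimates from historical spot and forward price changes; the function name and the sample data are assumptions for illustration only.

```python
import numpy as np

def minimum_variance_hedge_ratio(spot_changes, forward_changes):
    """Estimate h* = Cov(dS, dF) / Var(dF) from historical price changes.
    Standard minimum-variance hedge ratio; not necessarily the paper's model."""
    spot_changes = np.asarray(spot_changes, dtype=float)
    forward_changes = np.asarray(forward_changes, dtype=float)
    cov = np.cov(spot_changes, forward_changes)   # 2x2 sample covariance matrix
    return cov[0, 1] / cov[1, 1]

# Illustrative (made-up) weekly changes in spot and forward exchange rates.
spot = [0.004, -0.002, 0.001, 0.003, -0.005, 0.002]
fwd  = [0.005, -0.001, 0.002, 0.004, -0.006, 0.001]
print(round(minimum_variance_hedge_ratio(spot, fwd), 3))
```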

    Entropy of unimodular Lattice Triangulations

    Triangulations are important objects of study in combinatorics, finite element simulations and quantum gravity, where their entropy is crucial for many physical properties. Due to their inherently complex topological structure, even the number of possible triangulations is unknown for large systems. We present a novel algorithm for an approximate enumeration which is based on calculations of the density of states using Wang-Landau flat histogram sampling. For triangulations on two-dimensional integer lattices we achieve excellent agreement with known exact numbers of small triangulations as well as an improvement on analytically calculated asymptotics. The entropy density is $C = 2.196(3)$, consistent with rigorous upper and lower bounds. The presented numerical scheme can easily be applied to other counting and optimization problems. Comment: 6 pages, 7 figures.
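    Wang-Landau flat-histogram sampling, which the authors use to estimate the density of states, follows a generic update scheme. The sketch below shows that generic scheme for an abstract state space; the energy function, the move proposal, the flatness threshold and the sweep length are placeholders, not the triangulation-specific implementation from the paper.

```python
import math
import random

def wang_landau(initial_state, energy, propose_move, energies,
                f_final=1e-8, flatness=0.8, sweep=10000):
    """Generic Wang-Landau estimate of ln g(E) over a discrete set of energies.
    `energy(state)` and `propose_move(state)` are problem-specific placeholders."""
    ln_g = {e: 0.0 for e in energies}   # running estimate of ln(density of states)
    hist = {e: 0 for e in energies}     # visit histogram for the flatness check
    ln_f = 1.0                          # modification factor, reduced over time
    state, e_old = initial_state, energy(initial_state)

    while ln_f > f_final:
        for _ in range(sweep):
            candidate = propose_move(state)
            e_new = energy(candidate)
            # Accept with probability min(1, g(E_old)/g(E_new)) to flatten the walk.
            if random.random() < math.exp(min(0.0, ln_g[e_old] - ln_g[e_new])):
                state, e_old = candidate, e_new
            ln_g[e_old] += ln_f
            hist[e_old] += 1
        counts = list(hist.values())
        if min(counts) > flatness * (sum(counts) / len(counts)):
            hist = {e: 0 for e in energies}   # histogram is flat: reset, refine f
            ln_f /= 2.0
    return ln_g
```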

    Stochastic efficiency measurement: The curse of theoretical consistency

    The availability of efficiency estimation software, freely distributed via the internet and relatively easy to use, has recently inflated the number of corresponding applications. The resulting efficiency estimates are used without a critical assessment with respect to the literature on theoretical consistency, flexibility and the choice of the appropriate functional form. The robustness of policy suggestions based on inferences from efficiency measures nevertheless crucially depends on theoretically well-founded estimates. This paper addresses stochastic efficiency measurement by critically reviewing the theoretical consistency of recently published technical efficiency estimates. The results confirm the need for the researcher to check the regularity of the estimated frontier a posteriori and, if necessary, to impose the theoretical requirements a priori. Keywords: functional form, stochastic efficiency analysis, theoretical consistency.
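    A posteriori regularity checking of a stochastic frontier typically means verifying, at the observed data points, conditions such as monotonicity (non-negative input elasticities). The sketch below performs that check for a translog production frontier; the coefficient values, variable names and sample data are assumptions for illustration, not results from the reviewed studies.

```python
import numpy as np

def translog_elasticities(beta, Beta, log_x):
    """Input elasticities of a translog frontier at one observation:
    eps_i = beta_i + sum_j Beta_ij * ln(x_j).  Monotonicity needs eps_i >= 0."""
    return beta + Beta @ log_x

def monotonicity_share(beta, Beta, log_X):
    """Share of observations at which all input elasticities are non-negative."""
    ok = [np.all(translog_elasticities(beta, Beta, lx) >= 0) for lx in log_X]
    return float(np.mean(ok))

# Illustrative (made-up) two-input translog coefficients and log-input data.
beta = np.array([0.45, 0.40])                    # first-order coefficients
Beta = np.array([[0.05, -0.02], [-0.02, 0.03]])  # symmetric second-order matrix
log_X = np.log(np.random.default_rng(0).uniform(0.5, 5.0, size=(100, 2)))
print(monotonicity_share(beta, Beta, log_X))
```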

    Bringing BCI into everyday life: Motor imagery in a pseudo realistic environment

    Bringing Brain-Computer Interfaces (BCIs) into everyday life is a challenge because an out-of-lab environment implies the presence of variables that are largely beyond the control of the user and the software application. This can severely corrupt signal quality as well as the reliability of BCI control. Current BCI technology may fail in this application scenario because of the large amounts of noise, nonstationarity and movement artifacts. In this paper, we systematically investigate the performance of motor imagery BCI in a pseudo-realistic environment. In our study, 16 participants were asked to perform motor imagery tasks while dealing with different types of distractions such as vibratory stimulations or listening tasks. Our experiments demonstrate that standard BCI procedures are not robust to these additional sources of noise, implying that methods which work well in a lab environment may perform poorly in realistic application scenarios. We discuss several promising research directions to tackle this important problem. Funding: BMBF, 01GQ1115, Adaptive Gehirn-Computer-Schnittstellen (BCI) in nichtstationären Umgebungen.
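    The "standard BCI procedures" referred to here commonly consist of Common Spatial Patterns (CSP) feature extraction followed by a linear classifier. The sketch below shows that generic pipeline on band-pass-filtered epochs; the array shapes, the number of filter pairs and the classifier choice are assumptions, not the study's exact processing chain.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def csp_filters(epochs_a, epochs_b, n_pairs=3):
    """Common Spatial Patterns: epochs_* have shape (trials, channels, samples)
    and are assumed to be band-pass filtered (e.g. 8-30 Hz)."""
    cov_a = np.mean([np.cov(e) for e in epochs_a], axis=0)
    cov_b = np.mean([np.cov(e) for e in epochs_b], axis=0)
    # Generalized eigenvalue problem: cov_a w = lambda (cov_a + cov_b) w.
    eigvals, eigvecs = eigh(cov_a, cov_a + cov_b)
    order = np.argsort(eigvals)
    sel = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return eigvecs[:, sel].T                      # (2 * n_pairs, channels)

def log_var_features(filters, epochs):
    """Log of normalized variance of spatially filtered epochs (usual CSP feature)."""
    proj = np.einsum('fc,tcs->tfs', filters, epochs)
    var = proj.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))

# Illustrative use with made-up data: 40 trials, 22 channels, 500 samples per class.
rng = np.random.default_rng(0)
left, right = rng.standard_normal((2, 40, 22, 500))
W = csp_filters(left, right)
X = np.vstack([log_var_features(W, left), log_var_features(W, right)])
y = np.array([0] * 40 + [1] * 40)
clf = LinearDiscriminantAnalysis().fit(X, y)
print(clf.score(X, y))
```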