2,526 research outputs found

    A Regularized Boundary Element Formulation for Contactless SAR Evaluations within Homogeneous and Inhomogeneous Head Phantoms

    This work presents a Boundary Element Method (BEM) formulation for contactless electromagnetic field assessments. The new scheme is based on a regularized BEM approach that requires electric measurements only. The regularization is obtained by leveraging an extension of Calderon techniques to rectangular systems, leading to well-conditioned problems independent of the discretization density. This enables the use of highly discretized Huygens surfaces, which can consequently be placed very near the radiating source. In addition, the new regularized scheme is hybridized with both surfacic homogeneous and volumetric inhomogeneous forward BEM solvers accelerated with fast matrix-vector multiplication schemes. This allows for rapid and effective dosimetric assessments and permits the use of inhomogeneous and realistic head phantoms. Numerical results corroborate the theory and confirm the practical effectiveness of all newly proposed formulations.
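
    Schematically (the notation below is ours, an illustration rather than the paper's formulation): if $\mathbf{e}$ collects the contactless electric-field measurements, $\mathbf{j}$ the equivalent currents discretized on the Huygens surface, and $\mathbf{G}$ the rectangular radiation operator mapping those currents to the measurement points, a Calderon-type regularizer $\mathbf{R}$ is applied so that the preconditioned system stays well conditioned as the mesh size $h$ is refined:

        \[
            \mathbf{G}\,\mathbf{j} = \mathbf{e}, \qquad
            \mathbf{R}\,\mathbf{G}\,\mathbf{j} = \mathbf{R}\,\mathbf{e}, \qquad
            \operatorname{cond}(\mathbf{R}\,\mathbf{G}) = \mathcal{O}(1) \ \text{as } h \to 0 .
        \]

    This density-independent conditioning is what makes very fine Huygens-surface discretizations, and hence measurement surfaces close to the source, practical.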

    Economic Efficiency, Distributive Justice and Liability Rules

    The main purpose of this paper is to show that the conflict between considerations of economic efficiency and those of distributive justice, in the context of assigning liability, is not as sharp as is generally believed. The condition of negligence liability which characterizes efficiency in the context of liability rules has an all-or-none character: if one party is negligent and the other is not, then liability for the entire accident loss must fall on the negligent party. Thus, within the framework of standard liability rules, efficiency requirements preclude any non-efficiency considerations in cases where one party is negligent and the other is not. In this paper it is shown that a portion of the accident loss plays no role in providing appropriate incentives to the parties for taking due care and can therefore be apportioned on non-efficiency considerations. For a systematic analysis of efficiency requirements, a notion more general than that of a liability rule, namely that of a decomposed liability rule, is introduced. A complete characterization of efficient decomposed liability rules is provided in the paper. One important implication of the characterization theorems is that, by decomposing the accident loss into two parts, the scope for distributive considerations can be significantly broadened without sacrificing economic efficiency.
    Keywords: Tort Law, Liability Rules, Decomposed Liability Rules, Efficient Rules, Nash Equilibria, Negligence Liability, Distributive Justice
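
    As a purely illustrative sketch of the idea (a toy model of our own, not the paper's formal framework; the care costs, loss figures, the 60/40 decomposition, and the 50/50 distributive split are all invented), the snippet below decomposes the accident loss into an incentive part, assigned all-or-none to a solely negligent party, and a residual part apportioned on distributive grounds, and then checks which care profiles are Nash equilibria:

        from itertools import product

        # Toy accident model (illustrative numbers only).
        CARE_COST = {"care": 3.0, "no_care": 0.0}          # cost of taking due care
        EXPECTED_LOSS = {                                   # expected accident loss
            ("care", "care"): 10.0,
            ("care", "no_care"): 25.0,
            ("no_care", "care"): 25.0,
            ("no_care", "no_care"): 40.0,
        }

        def shares(injurer, victim, residual_split=0.5):
            """Hypothetical decomposed liability rule: the incentive part of the
            loss falls entirely on a solely negligent party; the residual part
            is apportioned on distributive grounds."""
            incentive_part, residual_part = 0.6, 0.4        # decomposition of the loss
            if injurer == "no_care" and victim == "care":
                inc_injurer = 1.0
            elif injurer == "care" and victim == "no_care":
                inc_injurer = 0.0
            else:                                           # both careful or both negligent
                inc_injurer = 0.5
            injurer_share = incentive_part * inc_injurer + residual_part * residual_split
            return injurer_share, 1.0 - injurer_share

        def total_cost(injurer, victim):
            loss = EXPECTED_LOSS[(injurer, victim)]
            s_i, s_v = shares(injurer, victim)
            return (CARE_COST[injurer] + s_i * loss,        # injurer's expected cost
                    CARE_COST[victim] + s_v * loss)         # victim's expected cost

        def is_nash(injurer, victim):
            ci, cv = total_cost(injurer, victim)
            no_better_i = all(total_cost(d, victim)[0] >= ci for d in CARE_COST)
            no_better_v = all(total_cost(injurer, d)[1] >= cv for d in CARE_COST)
            return no_better_i and no_better_v

        for profile in product(CARE_COST, CARE_COST):
            print(profile, "Nash equilibrium" if is_nash(*profile) else "")

    With these numbers, only the profile in which both parties take due care survives the best-response check, even though 40% of the loss is shared on purely distributive grounds.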

    The Belief-Function Approach to Aggregating Audit Evidence

    This is the peer-reviewed version of the following article: Srivastava, R. P., "The Belief-Function Approach to Aggregating Audit Evidence," International Journal of Intelligent Systems, Vol. 10, No. 3, March 1995, pp. 329-356, which has been published in final form at http://doi.org/10.1002/int.4550100304. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving. In this article, we present the belief-function approach to aggregating audit evidence. The approach uses an evidential network to represent the structure of audit evidence. In turn, it allows us to treat all types of dependencies and relationships among accounts and items of evidence, and thus the approach should help the auditor conduct an efficient and effective audit. Aggregation of evidence is equivalent to propagation of beliefs in an evidential network. The paper describes in detail the three major steps involved in the propagation process. The first step deals with drawing the evidential network representing the connections among variables and items of evidence, based on the experience and judgment of the auditor. We then use the evidential network to determine the clusters of variables over which we have belief functions. The second step deals with constructing a Markov tree from the clusters of variables determined in step one. The third step deals with the propagation of belief functions in the Markov tree. We use a moderately complex example to illustrate the details of the aggregation process.
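
    As a minimal sketch of the basic operation that such propagation repeats (this is plain Dempster-rule combination on a single variable, not the paper's Markov-tree scheme, and the evidence masses are invented), two items of audit evidence bearing on whether an account balance is fairly stated are combined as follows:

        from itertools import product

        def dempster_combine(m1, m2):
            """Combine two basic mass assignments (dicts mapping frozenset -> mass)
            on the same frame with Dempster's rule of combination."""
            combined, conflict = {}, 0.0
            for (a, x), (b, y) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + x * y
                else:
                    conflict += x * y                      # mass assigned to the empty set
            if conflict >= 1.0:
                raise ValueError("Totally conflicting evidence")
            return {s: v / (1.0 - conflict) for s, v in combined.items()}

        FAIR, NOT_FAIR = frozenset({"fair"}), frozenset({"not_fair"})
        THETA = FAIR | NOT_FAIR                            # the whole frame (ignorance)

        # Hypothetical masses from two items of evidence about an account balance.
        analytical_procedure = {FAIR: 0.6, THETA: 0.4}     # 0.6 support for 'fairly stated'
        detail_test          = {FAIR: 0.7, THETA: 0.3}

        m = dempster_combine(analytical_procedure, detail_test)
        belief_fair = sum(v for s, v in m.items() if s <= FAIR)
        print(f"Belief(fairly stated) = {belief_fair:.2f}")   # 1 - 0.4*0.3 = 0.88

    Here the combined belief that the balance is fairly stated is 0.88; in the paper's approach, combinations of this kind are carried out while propagating over the Markov tree built in step two.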

    Image processing using a two-dimensional digital convolution filter.

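    No abstract accompanies this record. As a generic, hedged illustration of the technique named in the title (our own example; the array values and kernel are invented), a two-dimensional digital convolution filter slides a small kernel over an image and replaces each pixel with a weighted sum of its neighbourhood:

        import numpy as np

        def convolve2d(image, kernel):
            """Direct 2-D convolution with zero padding ('same' output size)."""
            kh, kw = kernel.shape
            ph, pw = kh // 2, kw // 2
            padded = np.pad(image, ((ph, ph), (pw, pw)), mode="constant")
            flipped = kernel[::-1, ::-1]                   # convolution flips the kernel
            out = np.zeros_like(image, dtype=float)
            for i in range(image.shape[0]):
                for j in range(image.shape[1]):
                    out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * flipped)
            return out

        # Hypothetical example: smooth a small test image with a 3x3 mean filter.
        image = np.arange(25, dtype=float).reshape(5, 5)
        mean_kernel = np.full((3, 3), 1.0 / 9.0)
        print(convolve2d(image, mean_kernel))

    The 3x3 mean kernel acts as a simple smoothing filter; sharpening or edge-detection filters differ only in the kernel weights.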

    Decision Making Under Ambiguity: A Belief-function Perspective

    This is the publisher's version, which is being shared with permission, and which is also available electronically from: http://acs.polsl.pl/. In this article, we discuss problems with probability theory in representing uncertainties encountered in the "real world" and show how belief functions can overcome these difficulties. Also, we discuss an expected-utility approach to decision making under ambiguity using the belief-function framework. In particular, we develop a proposition for decision making under ambiguity using expected utility theory. This proposition is based on Strat's approach of resolving ambiguity in the problem using belief functions. We use the proposition to explain the Ellsberg paradox and to model decision-making behavior under ambiguity. We use the empirical data of Einhorn and Hogarth to validate the proposition. Also, we use the proposition to predict several decision-making behaviors under ambiguity for special conditions. Furthermore, we discuss the general condition under which the "switching" behavior, as observed by Einhorn and Hogarth, will occur, using the concept of "precision measure" in the expected utility theory.
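
    As a minimal sketch of the kind of calculation underlying Strat's approach (our own rendering; the Ellsberg-style payoffs and the frame are invented, and rho is a parameter the decision maker supplies), the expected utility of a belief function lies in an interval obtained by sending each ambiguous mass to its worst or best outcome, and a single value is obtained by interpolating with rho, the assumed probability that the ambiguity is resolved favourably:

        def expected_utility_interval(masses, utility):
            """Lower/upper expected utility of a belief function.

            `masses` maps a frozenset of outcomes to its basic probability mass;
            the lower bound sends each mass to its worst member, the upper to its best."""
            lower = sum(m * min(utility[o] for o in s) for s, m in masses.items())
            upper = sum(m * max(utility[o] for o in s) for s, m in masses.items())
            return lower, upper

        def strat_expected_utility(masses, utility, rho):
            """Resolve the interval with a parameter rho in [0, 1]: the assumed
            probability that the ambiguity is resolved favourably."""
            lo, hi = expected_utility_interval(masses, utility)
            return lo + rho * (hi - lo)

        # Ellsberg-style urn: 30 red balls, 60 black-or-yellow in unknown proportion.
        utility_bet_red   = {"red": 100.0, "black": 0.0, "yellow": 0.0}
        utility_bet_black = {"red": 0.0, "black": 100.0, "yellow": 0.0}
        masses = {
            frozenset({"red"}): 1 / 3,                  # known proportion of red
            frozenset({"black", "yellow"}): 2 / 3,      # ambiguous remainder
        }

        for rho in (0.0, 0.5, 1.0):
            print(rho,
                  strat_expected_utility(masses, utility_bet_red, rho),
                  strat_expected_utility(masses, utility_bet_black, rho))

    With rho below 0.5, the bet on red is preferred to the bet on black even though the two bets have the same expectation under a symmetric prior, reproducing the ambiguity-averse choice observed in the Ellsberg paradox.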