
    Green's functions for multiply connected domains via conformal mapping

    A method is described for the computation of the Green's function in the complex plane corresponding to a set of K symmetrically placed polygons along the real axis. An important special case is a set of K real intervals. The method is based on a Schwarz-Christoffel conformal map of the part of the upper half-plane exterior to the problem domain onto a semi-infinite strip whose end contains K-1 slits. From the Green's function one can obtain a great deal of information about polynomial approximations, with applications in digital filters and matrix iteration. By making the end of the strip jagged, the method can be generalised to weighted Green's functions and weighted approximations.
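    As a point of reference (a standard fact of potential theory, not taken from the abstract itself): in the simplest case K = 1, with the problem domain the single interval E = [-1, 1], the Green's function with pole at infinity has a classical closed form, and it controls polynomial growth off E via the Bernstein-Walsh inequality:

```latex
g(z) = \log\left|\, z + \sqrt{z^{2}-1} \,\right|, \qquad
|p_n(z)| \;\le\; \|p_n\|_{E}\, e^{\,n\,g(z)} \quad \text{for } \deg p_n \le n .
```

    This is the kind of approximation-theoretic information the abstract refers to; the paper's contribution is computing g numerically for K > 1 components, where closed forms are generally unavailable.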

    Moving Beyond Human and Organizational Incongruence

    Purpose – The purpose of this paper is to offer an understanding of how value creation, on both a human and an organizational level, can be found and constructed through a shift away from … Design/methodology/approach – The paper describes the theoretical foundations of the concept of congruence and purports to demonstrate the correlation between incongruence and dysfunction in both organizations and individuals. Findings – Congruence theory, originally developed by Williams, a co-author of this paper, refers to the capacity of individuals to align their individual stances with organizational ones, thus leading to a system of balance among elements that are conceptually paradoxical among themselves. The paper demonstrates that performance needs to be sought in systems of reference other than fiscal or financial diligence, and more within the human dimension. Originality/value – This paper explores the factors that block the creation of congruence in people and organizations, and explores strategies that can simultaneously and congruently move people and organizations onto a path of sustainability. Keywords: human sustainability; congruence; systems thinking and theory.

    Feature subset selection: a correlation based filter approach

    Recent work has shown that feature subset selection can have a positive effect on the performance of machine learning algorithms. Some algorithms can be slowed, or their performance adversely affected, by too much data, some of which may be irrelevant or redundant to the learning task. Feature subset selection, then, is a method of enhancing the performance of learning algorithms, reducing the hypothesis search space, and, in some cases, reducing the storage requirement. This paper describes a feature subset selector that uses a correlation based heuristic to determine the goodness of feature subsets, and evaluates its effectiveness with three common ML algorithms: a decision tree inducer (C4.5), a naive Bayes classifier, and an instance based learner (IB1). Experiments using a number of standard data sets drawn from real and artificial domains are presented. Feature subset selection gave significant improvement for all three algorithms; C4.5 generated smaller decision trees.
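    A correlation based subset heuristic of the kind the abstract describes is commonly written as a "merit" score that rewards high average feature-class correlation and penalizes inter-feature redundancy. The sketch below is illustrative (the function name, signature, and exact formulation are assumptions, not taken from the paper):

```python
import math

def cfs_merit(k, avg_feature_class_corr, avg_feature_feature_corr):
    """Correlation-based 'goodness' of a k-feature subset.

    Merit grows with the average correlation between the features and the
    class, and shrinks as the features become correlated with each other
    (i.e. redundant). Names and formulation are illustrative only.
    """
    return (k * avg_feature_class_corr) / math.sqrt(
        k + k * (k - 1) * avg_feature_feature_corr
    )

# A relevant, non-redundant subset scores higher than an equally relevant
# but highly redundant one of the same size.
relevant = cfs_merit(3, avg_feature_class_corr=0.6, avg_feature_feature_corr=0.1)
redundant = cfs_merit(3, avg_feature_class_corr=0.6, avg_feature_feature_corr=0.8)
print(relevant > redundant)  # True
```

    A subset search (e.g. greedy forward selection) would then evaluate candidate subsets with this score rather than training a learner on each one, which is what makes it a filter approach.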

    Practical feature subset selection for machine learning

    Machine learning algorithms automatically extract knowledge from machine readable information. Unfortunately, their success is usually dependent on the quality of the data that they operate on. If the data is inadequate, or contains extraneous and irrelevant information, machine learning algorithms may produce less accurate and less understandable results, or may fail to discover anything of use at all. Feature subset selection can result in enhanced performance, a reduced hypothesis search space, and, in some cases, reduced storage requirements. This paper describes a new feature selection algorithm that uses a correlation based heuristic to determine the “goodness” of feature subsets, and evaluates its effectiveness with three common machine learning algorithms. Experiments using a number of standard machine learning data sets are presented. Feature subset selection gave significant improvement for all three algorithms.

    Quantum data hiding in the presence of noise

    When classical or quantum information is broadcast to separate receivers, there exist codes that encrypt the encoded data such that the receivers cannot recover it when performing local operations and classical communication, but they can decode reliably if they bring their systems together and perform a collective measurement. This phenomenon is known as quantum data hiding and has hitherto been studied under the assumption that noise does not affect the encoded systems. With the aim of applying the quantum data hiding effect in practical scenarios, here we define the data-hiding capacity for hiding classical information using a quantum channel. Using this notion, we establish a regularized upper bound on the data-hiding capacity of any quantum broadcast channel, and we prove that coherent-state encodings have a strong limitation on their data-hiding rates. We then prove a lower bound on the data-hiding capacity of channels that map the maximally mixed state to the maximally mixed state (we call these channels "mictodiactic"; they can be seen as a generalization of unital channels when the input and output spaces are not necessarily isomorphic) and argue how to extend this bound to generic channels and to more than two receivers. Comment: 12 pages, accepted for publication in IEEE Transactions on Information Theory.

    Robust quantum data locking from phase modulation

    Quantum data locking is a unique quantum phenomenon that allows a relatively short key to (un)lock an arbitrarily long message encoded in a quantum state, in such a way that an eavesdropper who measures the state but does not know the key has essentially no information about the encrypted message. The application of quantum data locking in cryptography would allow one to overcome the limitations of the one-time pad encryption, which requires the key to have the same length as the message. However, it is known that the strength of quantum data locking is also its Achilles heel, as the leakage of a few bits of the key or the message may in principle allow the eavesdropper to unlock a disproportionate amount of information. In this paper we show that there exist quantum data locking schemes that can be made robust against information leakage by increasing the length of the shared key by a proportionate amount. This implies that a constant size key can still encrypt an arbitrarily long message as long as a fraction of it remains secret to the eavesdropper. Moreover, we greatly simplify the structure of the protocol by proving that phase modulation suffices to generate strong locking schemes, paving the way to optical experimental realizations. Also, we show that successful data locking protocols can be constructed using random codewords, which could well be helpful in discovering random codes for data locking over noisy quantum channels. Comment: A new result on the robustness of quantum data locking has been added.

    100 Years: A Centennial History of the School of Social Policy & Practice

    This book is dedicated to the generations of Penn Social Work faculty, staff, and students who advanced the field of Social Work and made Philadelphia, the nation, and the world a better place to live. We are thankful to Susan Molofsky Todres, an Overseer of the School, who generously supported the production of this book. Her commitment to the School and the University is greatly appreciated. And finally, our gratitude goes to Ram A. Cnaan, Professor and Associate Dean for Research and Doctoral Education at our School, for his patience and persistence in planning, supervising, and coordinating the details of this volume of institutional history.

    Effects of Reamer-Femoral Component Offset on Cement Mantle Penetration in Hip Resurfacing Arthroplasty

    Hip resurfacing arthroplasty has changed the treatment of end-stage arthritis without severe deformity for young, active adults. Presently, there are varying clinical approaches to implant design selection and cementation techniques. The purpose of this project is to determine what amount of reamer-femoral component offset allows for the best cement penetration into the femoral head. Rapid prototyped femoral component models were produced with reamer-femoral component offsets of 0.0 mm, 0.5 mm, and 1.0 mm. After implantation onto models of reamed femoral heads made from high-density open-cell reticulated carbon foam, cement penetration was assessed from cross-sections of the foam-implant unit. Increased offset was found to decrease the extent of cement over-penetration from the dome and chamfer. Increased offset also yielded optimal cement penetration as measured from the walls. Finally, increased offset was found to increase the height of cement mantle formation while maintaining complete seating of all implants.

    Extensions of bialgebras and their cohomological description

    This paper develops the theory of crossed product Hopf algebras of pairs of arbitrary Hopf algebras. The theory generalizes the crossed products of (Maj90), the Abelian crossed products of (Hof94), and the crossed product algebras of (BCM86). First, conditions are given on the structures involved that are shown to be equivalent to the existence of the crossed product. Next, a bisimplicial object is found that gives a cohomological description of the conditions. Cleft extensions of pairs of arbitrary Hopf algebras are then defined. These generalize the cleft extension algebras of (Swe68) and the Abelian cleft extensions of (By93); they are equivalent to the extensions of (Hof94), while giving an internal definition of extensions. Finally, the equivalence of crossed products and extensions is proved. Throughout this paper extensive use is made of the relatively new technique of tensor diagrams, without which many of the calculations would be intractable.