
    The 2007-09 financial crisis and bank opaqueness

    Doubts about the accuracy with which outside investors can assess a banking firm's value motivate many government interventions in the banking market. The recent financial crisis has reinforced concerns about the possibility that banks are unusually opaque. Yet the empirical evidence, thus far, is mixed. This paper examines the trading characteristics of bank shares over the period from January 1990 through September 2009. We find that bank share trading exhibits sharply different features before vs. during the crisis. Until mid-2007, large (NYSE-traded) banking firms appear to be no more opaque than a set of control firms, and smaller (NASD-traded) banks are, at most, slightly more opaque. During the crisis, however, both large and small banking firms exhibit a sharp increase in opacity, consistent with the policy interventions implemented at the time. Although portfolio composition is significantly related to market microstructure variables, no specific asset category stands out as particularly important in determining bank opacity.
    Keywords: Banks and banking; Stock market; Financial crises

    spChains: A Declarative Framework for Data Stream Processing in Pervasive Applications

    Pervasive applications rely on increasingly complex streams of sensor data continuously captured from the physical world. Such data is crucial to enable applications to "understand" the current context and to infer the right actions to perform, be they fully automatic or involving some user decisions. However, the continuous nature of such streams, the relatively high throughput at which data is generated, and the number of sensors usually deployed in the environment make direct data handling practically unfeasible. Data not only needs to be cleaned, but it must also be filtered and aggregated to relieve higher-level algorithms from near real-time handling of such massive data flows. We propose here a stream-processing framework (spChains), based upon state-of-the-art stream processing engines, which enables declarative and modular composition of stream processing chains built atop a set of extensible stream processing blocks. While stream processing blocks are delivered as a standard, yet extensible, library of application-independent processing elements, chains can be defined by the pervasive application engineering team. We demonstrate the flexibility and effectiveness of the spChains framework on two real-world applications in the energy management and industrial plant management domains, by evaluating them on a prototype implementation based on the Esper stream processor.
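    The abstract contains no code; as a rough, hypothetical sketch of the block-and-chain idea it describes (the types and names below are invented for illustration and are not the spChains or Esper API), application-independent processing blocks can be composed into an ordered chain that cleans and then aggregates a raw sensor stream:

```java
// Illustrative sketch only: hypothetical types, not the actual spChains or Esper API.
import java.util.List;
import java.util.function.Function;

public class ChainSketch {

    /** A sensor sample: timestamp (ms) and value. */
    record Sample(long timestamp, double value) {}

    /** A processing block maps a batch of samples to a (possibly smaller) batch. */
    interface Block extends Function<List<Sample>, List<Sample>> {}

    /** A chain is an ordered composition of blocks, applied in sequence. */
    static List<Sample> runChain(List<Block> chain, List<Sample> input) {
        List<Sample> current = input;
        for (Block block : chain) {
            current = block.apply(current);
        }
        return current;
    }

    public static void main(String[] args) {
        // Block 1: drop obviously invalid readings (simple cleaning).
        Block clean = samples -> samples.stream()
                .filter(s -> s.value() >= 0 && s.value() < 1000)
                .toList();

        // Block 2: collapse the batch to a single average sample (simple aggregation).
        Block average = samples -> {
            if (samples.isEmpty()) return List.of();
            double avg = samples.stream().mapToDouble(Sample::value).average().orElse(0);
            long ts = samples.get(samples.size() - 1).timestamp();
            return List.of(new Sample(ts, avg));
        };

        List<Sample> raw = List.of(
                new Sample(0, 21.5), new Sample(1000, -99.0), new Sample(2000, 22.1));
        System.out.println(runChain(List.of(clean, average), raw));
    }
}
```

    In spChains the analogous composition is declared rather than hand-coded, with the chain definition supplied by the application engineering team on top of the standard library of blocks.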

    Artificial Intelligence and Patient-Centered Decision-Making

    Advanced AI systems are rapidly making their way into medical research and practice, and, arguably, it is only a matter of time before they will surpass human practitioners in terms of accuracy, reliability, and knowledge. If this is true, practitioners will have a prima facie epistemic and professional obligation to align their medical verdicts with those of advanced AI systems. However, in light of their complexity, these AI systems will often function as black boxes: the details of their contents, calculations, and procedures cannot be meaningfully understood by human practitioners. When AI systems reach this level of complexity, we can also speak of black-box medicine. In this paper, we argue that black-box medicine conflicts with core ideals of patient-centered medicine. In particular, we claim that black-box medicine is not conducive to supporting informed decision-making based on shared information, shared deliberation, and shared mind between practitioner and patient.

    H-alpha features with hot onsets III. Fibrils in Lyman-alpha and with ALMA

    In H-alpha most of the solar surface is covered by dense canopies of long opaque fibrils, but predictions for quiet-Sun observations with ALMA have ignored this fact. Comparison with Ly-alpha suggests that the large opacity of H-alpha fibrils is caused by hot precursor events. Application of a recipe that assumes momentary Saha-Boltzmann extinction during their hot onset to millimeter wavelengths suggests that ALMA will observe H-alpha-like fibril canopies, not acoustic shocks underneath, and will yield more interesting data than if these canopies were transparent.
    Comment: Accepted for Astronomy & Astrophysics; Figure 1 corrected.
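    As background to the recipe named in the abstract (the paper's exact formulation may differ), the standard Boltzmann excitation and Saha ionization relations that a Saha-Boltzmann population estimate combines are:

```latex
% Boltzmann: relative populations of a lower level l and an upper level u
\[
  \frac{n_u}{n_l} = \frac{g_u}{g_l}\,
    \exp\!\left(-\frac{E_u - E_l}{k_B T}\right)
\]

% Saha: relative populations of adjacent ionization stages r and r+1
\[
  \frac{n_{r+1}}{n_r} = \frac{1}{n_e}\,
    \frac{2\,U_{r+1}(T)}{U_r(T)}
    \left(\frac{2\pi m_e k_B T}{h^2}\right)^{3/2}
    \exp\!\left(-\frac{\chi_r}{k_B T}\right)
\]
```

    Here n denotes number densities, g statistical weights, U partition functions, n_e the electron density, and chi_r the ionization energy of stage r; evaluating the level populations this way at the hot-onset temperature is, roughly, what the "momentary Saha-Boltzmann extinction" assumption amounts to.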

    Solid state thermal control polymer coating (Patent)

    Development of a solid-state polymer coating for obtaining thermal balance in spacecraft components.

    The Interaction of Yer Deletion and Nasal Assimilation in Optimality Theory

    The problem of opacity presents a challenge for generative phonology. This paper examines the process of Nasal Assimilation in Polish, rendered opaque by the process of Vowel Deletion, within Optimality Theory (Prince & Smolensky, 1993), which is currently a dominant model of phonological analysis. The opaque interaction of the two processes exposes the inadequacy of standard Optimality Theory, arising from the fact that standard OT is a non-derivational theory. It is argued that only by introducing intermediate levels can Optimality Theory deal with complex cases of opaque interactions.

    Asynchronous processing of Coq documents: from the kernel up to the user interface

    The work described in this paper improves the reactivity of the Coq system by completely redesigning the way it processes a formal document. By subdividing such work into independent tasks, the system can give precedence to the ones of immediate interest to the user and postpone the others. On the user side, a modern interface based on the PIDE middleware aggregates and presents the output of the prover in a consistent way. Finally, postponed tasks are processed by exploiting modern parallel hardware to offer better scalability.
    Comment: in Proceedings of ITP, Aug 2015, Nanjing, China.
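    As a loose, purely illustrative analogy for "give precedence to the tasks of immediate interest and postpone the others" (this is not Coq's or PIDE's actual code; all names below are invented), a checker could drain a priority queue of independent tasks most-urgent-first:

```java
// Illustrative analogy only, not the Coq/PIDE implementation.
import java.util.Comparator;
import java.util.concurrent.PriorityBlockingQueue;

public class PrioritySketch {

    /** A unit of checking work with a priority (lower value = more urgent). */
    record Task(int priority, String name, Runnable work) {}

    public static void main(String[] args) throws InterruptedException {
        PriorityBlockingQueue<Task> queue =
                new PriorityBlockingQueue<>(16, Comparator.comparingInt(Task::priority));

        // Tasks near the user's focus get priority 0; the rest are postponed (priority 1).
        queue.put(new Task(1, "check lemma far below cursor", () -> {}));
        queue.put(new Task(0, "check definition at cursor", () -> {}));
        queue.put(new Task(1, "check closing section", () -> {}));

        // A single worker drains the queue in priority order; independent tasks
        // could equally be handed to several workers running in parallel.
        while (!queue.isEmpty()) {
            Task t = queue.take();
            System.out.println("processing: " + t.name());
            t.work().run();
        }
    }
}
```

    Running the postponed tasks on parallel hardware, as the abstract describes, is where the scalability improvement comes from.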

    Transparency and Ontology of Love (Chapter 14 of To Know as I Am Known: The Communion of the Saints and the Ontology of Love)

    Excerpt: In his book The Path of Perfect Love, Diogenes Allen suggests that it is because of our inability to perceive the reality of other people and things that we don't grasp what brings out the fundamental feature of love, viz. the recognition or perception of things beside one's self. The reader may recall that both Badhwar and Royce made reference, the latter extensively, to the importance of recognizing the reality of the other person if one is to love. Allen focuses deeply on this theme. I will briefly present Allen's position in section I and turn to a sermon given by Meister Eckhart in section II. These reflections allow us to pursue, in section III, the notion of transparency. Section IV summarizes the various things we have learned along the way, putting them into the broader context of transparency. Section V proposes an account of the human individual that links the individual through transparency to the community. Providing an ontology of the heavenly individual in turn provides for the ontology of love for, as it turns out, love is an enduring component of the individual properly understood.