
    Chapter 18 of the General Theory “Further Analysed”: The Theory of Economics as A Method

    In 1987, Greenwald and Stiglitz accused Keynes's summary of the General Theory in chapter 18 of relying upon "neoclassical and Marshallian tools". A number of contributions have, on the contrary, emphasized the methodological importance of this chapter, which this paper revisits in the light of A Treatise on Probability. It thereby shows that the notions of cause and dependence used to discuss the relationships between the independent and dependent variables of the General Theory are related to the concept of "independence for knowledge", which concerns logical connections between arguments rather than material connections between events. We demonstrate that the logical connections established in chapter 18 are re-examined in chapters 19-21, where Keynes allows for probable repercussions between the factors and removes the simplifying assumptions previously introduced. After stressing the methodological continuity this method provides with the analysis of credit cycles in A Treatise on Money, we argue that chapter 18 is an indispensable tool for decoding the internal structure of the General Theory. We thus characterize the latter as a vade mecum to the complex economic world, the author providing an analytical method allowing, and requiring, the readers to emulate his efforts to grasp the complexity and interdependence of the economic material.
    Keywords: John Maynard Keynes, The General Theory, complexity, economic methodology

    High-Integrity Performance Monitoring Units in Automotive Chips for Reliable Timing V&V

    As software continues to control more system-critical functions in cars, its timing is becoming an integral element of functional safety. Timing validation and verification (V&V) assesses the software's end-to-end timing measurements against given budgets. The advent of multicore processors with massive resource sharing reduces the significance of end-to-end execution times for timing V&V and requires reasoning on (worst-case) access delays on contention-prone hardware resources. While Performance Monitoring Units (PMUs) support this finer-grained reasoning, their design has never been a prime consideration in high-performance processors, from which automotive-chip PMU implementations descend, since the PMU does not directly affect performance or reliability. Given the PMU's instrumental importance for timing V&V, we advocate for PMUs in automotive chips that explicitly track activities related to worst-case (rather than average) software behavior, are recognized as an ISO-26262 mandatory high-integrity hardware service, and are accompanied by detailed documentation that enables their effective use to derive reliable timing estimates.
    This work has also been partially supported by the Spanish Ministry of Economy and Competitiveness (MINECO) under grant TIN2015-65316-P and the HiPEAC Network of Excellence. Jaume Abella has been partially supported by the MINECO under Ramón y Cajal postdoctoral fellowship number RYC-2013-14717. Enrico Mezzetti has been partially supported by the Spanish Ministry of Economy and Competitiveness under Juan de la Cierva-Incorporación postdoctoral fellowship number IJCI-2016-27396.
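
    As a minimal sketch of the finer-grained reasoning advocated above, assuming a hypothetical PMU counter interface (PmuCounter and every name below are illustrative, not a real automotive PMU API): a worst-case contention delay can be bounded by combining a PMU count of accesses to a shared resource with a per-access worst-case latency, the kind of per-resource bound that end-to-end execution times alone cannot provide on a multicore.

```java
// Illustrative sketch only: derives a worst-case contention bound from a
// PMU event count. PmuCounter is hypothetical; real automotive PMUs
// expose memory-mapped event registers, not a Java API.
public final class ContentionBound {

    /** Hypothetical handle to one PMU event counter (e.g. shared-bus stalls). */
    interface PmuCounter {
        long read();   // current event count for this core
        void reset();  // clear the counter before a measurement run
    }

    /**
     * Bounds the contention delay of one task run as
     * (accesses to the shared resource) x (worst-case latency per access).
     */
    static long worstCaseContentionNs(PmuCounter sharedResourceAccesses,
                                      Runnable task,
                                      long worstCaseLatencyPerAccessNs) {
        sharedResourceAccesses.reset();
        task.run();                               // measured run of the software under V&V
        return sharedResourceAccesses.read() * worstCaseLatencyPerAccessNs;
    }
}
```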

    Medical Errors

    Overview: Time of death: 5:07 p.m. On the solemn afternoon of February 22nd, 2003, the Santillian family listened on as doctors told them that their cherished loved one was officially pronounced brain dead and would soon have to be taken off life support. Two weeks prior to this, seventeen-year-old Jesica Santillian had received the thrilling news that she had finally been matched with a heart-lung donor and would be admitted to Duke University Medical Center in early February for a double-organ transplant. After years of living in pain brought on by her failing organs, Jesica was supposed to be one of the lucky ones, that is, until an ill-fated call, received an hour after the new organs had been put in, turned her luck upside down. The call was from a technician in the immunology lab saying that something had gone terribly wrong: Jesica's blood type, type O, did not match the blood type of her new organs, which were type A. That meant Jesica's life was in serious danger, because the antibodies in her blood would shortly start attacking and destroying her new organs. Two weeks and an odds-shattering second set of donated organs later, the near-death teenager's family said their last goodbyes as the medication that kept her heart going was discontinued; her heart took its last untimely beat seven minutes later (Kopp 1).

    The Role of Correlated Noise in Quantum Computing

    This paper aims to give an overview of the current state of fault-tolerant quantum computing by surveying a number of results in the field. We show that thresholds can be obtained for a simple noise model, as first proved in [AB97, Kit97, KLZ98], by presenting a proof for statistically independent noise following the presentation of Aliferis, Gottesman and Preskill [AGP06]. We also present a result by Terhal and Burkard [TB05], later improved upon by Aliferis, Gottesman and Preskill [AGP06], showing that a threshold can still be obtained for local non-Markovian noise, where the noise is allowed to be weakly correlated in space and time. We then turn to negative results, presenting work by Ben-Aroya and Ta-Shma [BT11], who showed that conditional errors cannot be perfectly corrected. We end our survey by briefly mentioning some more speculative objections, as put forth by Kalai [Kal08, Kal09, Kal11].
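
    To make the threshold statement concrete, here is the standard concatenation form such results take (an illustrative textbook shape, not necessarily the exact constants or bounds proved in [AB97, Kit97, KLZ98, AGP06]): if one level of fault-tolerant encoding maps a physical error rate p to at most c·p², where c counts the malignant fault combinations in a gadget, then iterating the construction k times suppresses the logical error rate doubly exponentially whenever p lies below the threshold p_th = 1/c.

```latex
% Illustrative concatenation recursion (standard form; the constants
% depend on the code and gadget constructions surveyed in the text).
\[
  p^{(1)} \le c\,p^{2}
          = p_{\mathrm{th}}\!\left(\frac{p}{p_{\mathrm{th}}}\right)^{2},
  \qquad p_{\mathrm{th}} := \frac{1}{c},
\]
\[
  p^{(k)} \le p_{\mathrm{th}}\!\left(\frac{p}{p_{\mathrm{th}}}\right)^{2^{k}}
  \;\longrightarrow\; 0
  \quad \text{as } k \to \infty, \text{ whenever } p < p_{\mathrm{th}}.
\]
```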

    Understanding Refugee Law in an Enlarged European Union

    The present article seeks to explore how asylum law is formed, transformed and reformed in Europe, what its effects are on state practice and refugee protection in the Baltic and Central European candidate countries, and what this process reveals about the framework used by scholars to understand the dynamics of international refugee law. Arguably, an exclusive focus on EU institutions and their dissemination of regional and international norms among candidate countries through the acquis communautaire is misleading. Looking at the subregional interplay between Vienna and Budapest, Berlin and Warsaw, Copenhagen and Vilnius provides a richer understanding of the emergence of norms than the standard narrative of a Brussels dictate. Hence, to capture these dynamics, we will attempt to expand the framework of analysis by incorporating sub-regional settings, cutting across the divide between old and new Members, and by analysing the repercussions sent out by domestic legislation within these settings. While acknowledging that bilateral and multilateral relations are continuously interwoven, we conclude that bilateralism accounts for a greater degree of normative development and proliferation than multilateralism at EU level, and that domestic legislation as formed by sub-regional dynamics will remain the ultimate object of study for scholars of international refugee law.

    Division of Property Upon Dissolution of Marriage


    A study into the feasibility of generic programming for the construction of complex software

    A high degree of abstraction and capacity for reuse can be obtained in software design through the use of Generic Programming (GP) concepts. Despite the widespread use of GP in computing, some areas, such as the construction of generic component libraries as the skeleton for complex computing systems with extensive domains, have been neglected. Here we consider the design of a library of generic components based on the GP paradigm and implemented in Java. Our aim is to investigate the feasibility of using the GP paradigm in the construction of complex computer systems where the management of users interacting with the system and the optimisation of the system's resources are required.
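
    As a minimal sketch of the GP style under study, in plain Java generics (the class names and the Resource bound are illustrative; the paper's actual component library is not described in the abstract): a single generic component, here a bounded resource pool, is written once and reused unchanged for any resource type a complex system manages.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Optional;

// Illustrative generic component only; not the library from the study.
// One implementation serves every resource type the system manages.
interface Resource { void dispose(); }

final class Pool<T extends Resource> {
    private final Deque<T> idle = new ArrayDeque<>();
    private final int capacity;

    Pool(int capacity) { this.capacity = capacity; }

    /** Hands out an idle resource if one is available. */
    synchronized Optional<T> acquire() {
        return Optional.ofNullable(idle.pollFirst());
    }

    /** Returns a resource to the pool, disposing of it when the pool is full. */
    synchronized void release(T resource) {
        if (idle.size() < capacity) idle.addFirst(resource);
        else resource.dispose();
    }
}
```

    Because Pool<T> is written against the Resource bound alone, the same component could manage, say, connections for interacting users or buffers for resource optimisation, which is the kind of reuse whose feasibility the study investigates.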

    Efficient interaction analysis for an effective provision of knowledge about the discussion process to CSCL practices

    The discussion process plays an important social role in Computer-Supported Collaborative Learning (CSCL): participants can discuss the activity being performed, collaborate with each other through the exchange of ideas that may arise, propose new resolution mechanisms, and justify and refine their own contributions, thus acquiring new knowledge. Indeed, learning by discussion, when applied to collaborative learning scenarios, can provide significant benefits for students in collaborative learning and in education in general. However, the discussion process in the context of distance education shows a high drop-out rate in comparison to traditional programs, due chiefly to a sense of isolation among participants, who have no knowledge about others nor can compare their own progress and performance to the group's. To alleviate this problem, the provision of appropriate knowledge from the analysis of on-line interaction is rapidly gaining popularity due to its great impact on discussion performance and outcomes. This implies a need to capture and structure all types of information generated by group activity and then to extract the relevant knowledge in order to provide participants with efficient awareness and feedback as regards group performance and collaboration. As a result, it is necessary to process and analyze complex event log files from group activity continuously, which may require computational capacity beyond that of a single computer. To this end, in this paper we show how a Grid approach can considerably increase the overall efficiency of processing group activity log files and thus allow discussion participants to receive effective knowledge even in real time. The context of this study is a real discussion experience that took place at the Open University of Catalonia (UOC).
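
    As a minimal sketch of the split-process-merge structure such a Grid approach implies, using a local thread pool as a stand-in for Grid nodes (the directory layout, the "POST" tag and the statistic computed are all illustrative; the paper's actual middleware and log schema are not given in the abstract):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Illustrative only: partitions group-activity log files across workers
// and merges a simple per-file statistic (number of contributions).
public final class LogAnalysis {
    public static void main(String[] args) throws Exception {
        List<Path> logFiles;
        try (var files = Files.list(Path.of("logs"))) { // one log file per discussion group
            logFiles = files.toList();
        }

        ExecutorService grid = Executors.newFixedThreadPool(8); // stand-in for Grid nodes
        List<Future<Long>> partials = logFiles.stream()
                .map(f -> grid.submit(() -> countContributions(f)))
                .toList();

        long total = 0;
        for (Future<Long> p : partials) total += p.get();       // merge partial results
        grid.shutdown();
        System.out.println("Contributions across all groups: " + total);
    }

    /** Counts lines tagged as contributions; the "POST" tag is hypothetical. */
    static long countContributions(Path logFile) throws IOException {
        try (var lines = Files.lines(logFile)) {
            return lines.filter(line -> line.contains("POST")).count();
        }
    }
}
```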