14 research outputs found

    Standard setting organizations and open source communities: Partners or competitors?

    No full text
    Standardization serves as a means to improve overall quality of life through the economies of scale gained from the pervasive adoption of technical solutions. It enables competition by facilitating interoperability between products of different vendors. The wider open source community develops free and open source software (FOSS) in a global upstream/downstream model that similarly benefits society as a public good. FOSS and standards setting organizations (SSOs) are both instruments with standardizing effects. Innovators and policy-makers assume that a mutually beneficial collaboration between them is desirable. However, their exact relationship is not fully understood, especially when and how FOSS and SSOs complement each other or displace each other as competitors. To compare FOSS and SSOs, our study develops a phase model of standardization that is applicable to both approaches, and applies this model to compare the strengths and weaknesses of FOSS and SSOs against common opportunities and threats in the ICT sector. Based on qualitative expert interviews with FOSS and SSO representatives, the synthesis of the separate results supports conclusions from a product, a process and a societal perspective. The study identifies cost of change as a key determinant of the efficacy of each approach. It concludes that FOSS and SSOs create complementary products, compete on the efficiency of the standardization process, and are both independent and complementary standardization instruments available to industry and influenceable by policy-makers. The paper closes with a discussion of possible implications relevant to businesses, the wider open source community, SSOs and policy-makers.

    Module Intersection for the Integration-by-Parts Reduction of Multi-Loop Feynman Integrals

    No full text
    In this manuscript, which is to appear in the proceedings of the conference “MathemAmplitude 2019” in Padova, Italy, we provide an overview of the module intersection method for the integration-by-parts (IBP) reduction of multi-loop Feynman integrals. The module intersection method, based on computational algebraic geometry, is a highly efficient way of obtaining IBP relations without doubled propagators, or with a bound on the highest propagator degree. In this manner, trimmed IBP systems which are much shorter than the traditional ones can be obtained. We apply the modern, Petri-net-based workflow management system GPI-Space in combination with the computer algebra system Singular to solve the trimmed IBP system via interpolation and efficient parallelization. We show, in particular, how to use the new plugin feature of GPI-Space to manage a global state of the computation and to efficiently handle mutable data. Moreover, a Mathematica interface to generate IBPs with restricted propagator degree, based on module intersection, is presented in this review.
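    For orientation, the IBP relations trimmed here arise because, in dimensional regularization, the integral of a total derivative with respect to a loop momentum vanishes. The schematic identity below is the standard form from the literature, not quoted from this manuscript; the vector v^mu and the inverse propagators D_j with powers a_j are generic placeholders:

        \int \frac{\mathrm{d}^D \ell}{i\pi^{D/2}}\,
          \frac{\partial}{\partial \ell^{\mu}}
          \left( \frac{v^{\mu}}{D_1^{a_1} D_2^{a_2} \cdots D_k^{a_k}} \right) = 0

    Expanding the derivative yields linear relations among integrals with shifted powers a_j; restricting the admissible vectors v^{\mu} (the role of the module intersection) avoids raising any propagator power, which is what keeps the resulting IBP system short.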

    Towards Massively Parallel Computations in Algebraic Geometry

    No full text
    Introducing parallelism and exploring its use is still a fundamental challenge for the computer algebra community. In high-performance numerical simulation, on the other hand, transparent environments for distributed computing which follow the principle of separating coordination and computation have been a success story for many years. In this paper, we explore the potential of using this principle in the context of computer algebra. More precisely, we combine two well-established systems: the mathematics we are interested in is implemented in the computer algebra system Singular, whose focus is on polynomial computations, while the coordination is left to the workflow management system GPI-Space, which relies on Petri nets as its mathematical modeling language and has been successfully used for coordinating the parallel execution (autoparallelization) of academic codes as well as commercial software in application areas such as seismic data processing. The result of our efforts is a major step towards a framework for massively parallel computations in the application areas of Singular, specifically in commutative algebra and algebraic geometry. As a first test case for this framework, we have modeled and implemented a hybrid smoothness test for algebraic varieties which combines ideas from Hironaka's celebrated desingularization proof with the classical Jacobian criterion. Applying our implementation to two examples originating from current research in algebraic geometry, one of which cannot be handled by other means, we illustrate the behavior of the smoothness test within our framework, and investigate how the computations scale up to 256 cores.
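    To make the classical Jacobian criterion mentioned above concrete, the following minimal Python/SymPy sketch checks smoothness of a hypersurface; it is purely illustrative (the function name and examples are ours), and the paper's actual implementation works in Singular, handles general varieties, and is coordinated in parallel by GPI-Space:

        # Illustrative only: Jacobian criterion for a hypersurface V(f).
        # V(f) is smooth iff f together with its partial derivatives has no
        # common zero, i.e. generates the unit ideal (checked here via a
        # Groebner basis over QQ as a stand-in for an algebraically closed field).
        import sympy as sp

        x, y, z = sp.symbols("x y z")

        def hypersurface_is_smooth(f, xs):
            gens = [f] + [sp.diff(f, v) for v in xs]
            gb = sp.groebner(gens, *xs, order="grevlex")
            # The reduced Groebner basis of the unit ideal is exactly [1].
            return len(gb.exprs) == 1 and gb.exprs[0] == 1

        # Smooth example: the sphere x^2 + y^2 + z^2 - 1
        print(hypersurface_is_smooth(x**2 + y**2 + z**2 - 1, (x, y, z)))  # True

        # Singular example: the cone x^2 + y^2 - z^2 (singular at the origin)
        print(hypersurface_is_smooth(x**2 + y**2 - z**2, (x, y, z)))      # False

    The hybrid test in the paper goes beyond this naive check by combining it with descent ideas from Hironaka's desingularization proof, precisely because the plain Jacobian criterion becomes infeasible for large examples.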

    Empirische Befunde zum Verhältnis von Know-how-Transfer und Kompetenzentwicklung (Empirical findings on the relationship between know-how transfer and competence development)

    No full text
    The BMBF programme 'Lernkultur Kompetenzentwicklung' (learning culture and competence development) pursues the goal of introducing a new, competence-based learning culture by way of model projects, since there is broad agreement that the learning and innovation capability of companies in the Federal Republic of Germany is currently insufficient to meet the competitive demands of a globalized market and the conditions of the knowledge society. The contribution of the programme area 'Lernen im Prozess der Arbeit' (learning in the process of work) to this programme is the scientific monitoring of company-level design projects for introducing a new learning culture. Against this background, the authors' task is to analyse the concrete forms and developments of a competence-based learning culture. To this end, a uniform set of instruments is developed that makes it possible to place the competence-development results of individual companies in a larger context and to analyse them comparatively. The first instrument, the 'Kompetenzbilanz' (competence balance sheet), aims at evaluating the results of a company competence-development project. The second instrument, 'Know-how-Transfer', serves to identify the sources of knowledge and the instruments of know-how transfer, so that these instruments can then be assessed in a differentiated way with regard to their suitability and effect. The instruments are developed in order to examine more closely the influence of different methods of know-how transfer on the competence development of individuals, teams and the organization. The aim is thus to develop a hypothesis on the relationship between knowledge transfer and competence development. The research report then presents the results of testing the two instruments. In analysing the relationship between know-how transfer and competence, the central question is: what influence does know-how transfer have on competence development? In a first section, the authors explain their assessment of the discussion on the topic of learning culture and, derived from this, present the central results of the exploration. The basis is a cross-company comparison as well as two case analyses from the areas of production and administration. The results are used to develop a hypothesis on the influence of knowledge transfer on competence development. In a summarizing outlook, the authors note that the testing of the evaluation instruments produced an ambivalent result. On the one hand, the current state of knowledge in the labour sciences readily suffices to draw a conclusive and reliable balance of the results of competence-development projects. On the other hand, assessing the quality of know-how transfer in terms of how it supports competence-development processes still proves to be extremely difficult. (SIGLE record, in German; available from Arbeitsgemeinschaft Betriebliche Weiterbildungsforschung e.V. (ABWF), Projekt Qualifikations-Entwicklungs-Management (QUEM), Berlin / FIZ Fachinformationszentrum Karlsruhe / TIB Technische Informationsbibliothek.)

    Cardiopulmonary resuscitation in commercial aircraft - evidence-based recommendations of the DGLRM guideline

    No full text
    Approximately 3 billion people worldwide travel by commercial air transport annually. A calculation based on the number of passengers transported shows that between 1 in 14,000 and 1 in 50,000 transported passengers will experience an acute medical problem during a flight. Cardiac arrest accounts for 0.3% of all in-flight medical emergencies; however, it is responsible for 86% of in-flight events resulting in death. So far, no guideline exists for in-flight cardiac arrest (IFCA) that provides specific treatment recommendations. A task force was created to develop a guideline for the treatment of in-flight cardiac arrest based on clinical and investigational expertise in this area. Using a systematic literature search together with GRADE, RAND and Delphi methods, specific recommendations for the treatment of IFCA were created. Several main recommendations have been developed: the location and contents of the emergency equipment should be mentioned in the pre-flight safety announcement; an ECG should be available for patients with cardiac arrest; it is very important to request help by an on-board announcement after identification of a patient with cardiac arrest; two-person CPR is considered optimal and should be performed if possible; the crew should be trained regularly in basic life support, ideally with a focus on CPR in aircraft; and a diversion should be performed immediately if the patient has a return of spontaneous circulation.
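    Combining the figures quoted above gives a rough sense of scale; the short Python sketch below is our back-of-the-envelope illustration, not numbers reported by the guideline itself:

        # Rough combination of the quoted figures: 3 billion passengers/year,
        # incidence 1 in 14,000 to 1 in 50,000, cardiac arrest = 0.3% of events.
        passengers_per_year = 3e9
        incidence_low, incidence_high = 1 / 50_000, 1 / 14_000
        cardiac_arrest_share = 0.003

        events_low = passengers_per_year * incidence_low     # ~60,000 events/year
        events_high = passengers_per_year * incidence_high   # ~214,000 events/year
        print(f"In-flight medical events per year: {events_low:,.0f} to {events_high:,.0f}")
        print(f"Implied in-flight cardiac arrests per year: "
              f"{events_low * cardiac_arrest_share:,.0f} to {events_high * cardiac_arrest_share:,.0f}")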