
    Causality re-established

    Causality never gained the status of a "law" or "principle" in physics. Some recent literature has even popularized the false idea that causality is a notion that should be banned from theory. Such a misconception relies on an alleged universality of the reversibility of physical laws, based either on the determinism of classical theory or on the multiverse interpretation of quantum theory, in both cases motivated by mere interpretational requirements for realism of the theory. Here I show that a properly defined, unambiguous notion of causality is a theorem of quantum theory, and also a falsifiable proposition of the theory. This notion of causality appeared in the literature within the framework of operational probabilistic theories. It is a genuinely theoretical notion, corresponding to establishing a definite partial order among events, in the same way as we do by using the future causal cone in Minkowski space. The causality notion is logically independent of the misidentified concept of "determinism" and, being a consequence of quantum theory, is ubiquitous in physics. In addition, since classical theory can be regarded as a restriction of quantum theory, causality also holds in the classical case, although the determinism of the theory trivializes it. I conclude by arguing that causality naturally establishes an arrow of time. This implies that the scenario of the "Block Universe" and the connected "Past Hypothesis" are incompatible with causality, and thus with quantum theory: both are doomed to remain mere interpretations and, as such, not falsifiable, similar to the hypothesis of "super-determinism". This article is part of the discussion meeting issue "Foundations of quantum mechanics and their impact on contemporary society".
    Comment: Presented at the Royal Society of London on 11/12/2017, at the conference "Foundations of quantum mechanics and their impact on contemporary society". To appear in Philosophical Transactions of the Royal Society
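    For concreteness, the partial order the author appeals to can be written in standard special-relativity notation (this formula is my illustration, not taken from the paper): for two events x = (t_x, \mathbf{r}_x) and y = (t_y, \mathbf{r}_y) in Minkowski space,

        \[
          x \preceq y \;\Longleftrightarrow\; c^2 (t_y - t_x)^2 - \lVert \mathbf{r}_y - \mathbf{r}_x \rVert^2 \ge 0 \quad\text{and}\quad t_y \ge t_x ,
        \]

    i.e. y lies in the closed future light cone of x. This relation is reflexive, antisymmetric, and transitive, hence a genuine partial order on events; the operational-probabilistic notion of causality discussed above is meant to play the same ordering role for the events of the theory.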

    Coordination Implications of Software Coupling in Open Source Projects

    The effect of software coupling on the quality of software has been studied quite widely since the seminal paper on software modularity by Parnas [1]. However, the effect of increases in software coupling on the coordination of developers has not been researched as much. In commercial software development environments there are normally coordination mechanisms in place to manage the coordination requirements arising from software dependencies. In the case of Open Source software, however, such coordination mechanisms are harder to implement, as the developers tend to rely solely on electronic means of communication. Hence, an understanding of changing coordination requirements is essential to the management of an Open Source project. In this paper we study the effect of changes in software coupling on coordination requirements through a case study of a popular Open Source project, JBoss.
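    As a rough illustration of how coordination requirements can be derived from coupling data, the sketch below uses a generic socio-technical-congruence-style matrix calculation; it is not necessarily the measure used in this study, and the developers, files, and matrices are made up.

        # Hypothetical sketch: deriving developer coordination requirements from
        # file-level coupling. All data below are illustrative.
        import numpy as np

        # assignment[i, j] = 1 if developer i works on file j
        assignment = np.array([
            [1, 1, 0],   # dev A touches files f0, f1
            [0, 1, 1],   # dev B touches files f1, f2
        ])

        # coupling[j, k] = 1 if files j and k are coupled
        # (e.g. statically dependent or frequently co-changed)
        coupling = np.array([
            [0, 1, 0],
            [1, 0, 1],
            [0, 1, 0],
        ])

        # coordination_req[i, k] > 0 means developers i and k should coordinate,
        # because one works on files coupled to files the other works on.
        coordination_req = assignment @ coupling @ assignment.T
        np.fill_diagonal(coordination_req, 0)
        print(coordination_req)   # here: [[0, 2], [2, 0]]

    As coupling between the files grows, the off-diagonal entries grow with it, which is the kind of rising coordination load the paper examines.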

    Strong mass effect on ion beam mixing in metal bilayers

    Molecular dynamics simulations have been used to study the mechanism of ion beam mixing in metal bilayers. We are able to explain the ion-induced low-temperature phase stability and melting behavior of bilayers using only a simple ballistic picture up to ion energies of 10 keV. The atomic mass ratio of the overlayer and substrate constituents appears to be a key quantity for understanding atomic mixing. A critical bilayer mass ratio of δ < 0.33 is required for the occurrence of a thermal spike (local melting) with a lifetime of τ > 0.3 ps under low-energy (1 keV) ion irradiation due to a ballistic mechanism. The existing experimental data follow the same trend as the simulated values.
    Comment: 4 pages, 4 figures, preprint
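    To make the criterion concrete, here is a minimal check of how it might be applied, assuming δ denotes the lighter-to-heavier atomic mass ratio of the two constituents; the element pairs and this exact definition of δ are illustrative assumptions, not taken from the paper.

        # Illustrative check of the reported delta < 0.33 criterion.
        # Masses are standard atomic weights in atomic mass units.
        masses = {"Al": 26.98, "Ti": 47.87, "Cu": 63.55, "W": 183.84, "Pt": 195.08}

        def mass_ratio(overlayer, substrate):
            """Assumed definition: lighter mass divided by heavier mass."""
            m1, m2 = masses[overlayer], masses[substrate]
            return min(m1, m2) / max(m1, m2)

        for pair in [("Al", "Pt"), ("Cu", "W"), ("Ti", "Cu")]:
            delta = mass_ratio(*pair)
            verdict = "thermal spike expected" if delta < 0.33 else "no thermal spike"
            print(pair, round(delta, 2), verdict)
        # ('Al', 'Pt') 0.14 thermal spike expected
        # ('Cu', 'W') 0.35 no thermal spike
        # ('Ti', 'Cu') 0.75 no thermal spike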

    Feasibility, drug safety, and effectiveness of etiological treatment programs for Chagas disease in Honduras, Guatemala, and Bolivia: 10-year experience of Médecins Sans Frontières

    BACKGROUND: Chagas disease (American trypanosomiasis) is a zoonotic or anthropozoonotic disease caused by the parasite Trypanosoma cruzi. It predominantly affects populations in poor areas of Latin America, where medical care for this neglected disease is often lacking. Médecins Sans Frontières/Doctors Without Borders (MSF) has provided diagnostic and treatment services for Chagas disease since 1999. This report describes 10 years of field experience in four MSF programs in Honduras, Guatemala, and Bolivia, focusing on feasibility protocols, safety of drug therapy, and treatment effectiveness.
    METHODOLOGY: From 1999 to 2008, MSF provided free diagnosis, etiological treatment, and follow-up care for patients <18 years of age seropositive for T. cruzi in Yoro, Honduras (1999-2002); Olopa, Guatemala (2003-2006); Entre Ríos, Bolivia (2002-2006); and Sucre, Bolivia (2005-2008). Essential program components guaranteeing feasibility of implementation were information, education, and communication (IEC) at the community and family level; vector control; health staff training; screening and diagnosis; treatment and compliance, including family-based strategies for early detection of adverse events; and logistics. Chagas disease diagnosis was confirmed by testing blood samples with two different diagnostic tests. T. cruzi-positive patients were treated with benznidazole as first-line treatment, with appropriate counseling, consent, and active participation from parents or guardians for daily administration of the drug, early detection of adverse events, and treatment withdrawal when necessary. Weekly follow-up was conducted, with adverse events recorded to assess drug safety. Evaluations of serological conversion were carried out to measure treatment effectiveness. Vector control, entomological surveillance, and health education activities were carried out in all projects in close interaction with national and regional programs.
    RESULTS: The total numbers of children and adolescents tested for T. cruzi in Yoro, Olopa, Entre Ríos, and Sucre were 24,471, 8,927, 7,613, and 19,400, respectively. Of these, 232 (0.9%), 124 (1.4%), 1,475 (19.4%), and 1,145 (5.9%) patients, respectively, were diagnosed as seropositive. Patients were treated with benznidazole, and early findings of seroconversion varied widely between the Central and South American programs: 87.1% and 58.1% at 18 months post-treatment in Yoro and Olopa, respectively; 5.4% by up to 60 months in Entre Ríos; and 0% at an average of 18 months in Sucre. Benznidazole-related adverse events were observed in 50.2% and 50.8% of all patients treated in Yoro and Olopa, respectively, and in 25.6% and 37.9% of patients in Entre Ríos and Sucre, respectively. Most adverse events were mild and manageable. No deaths occurred in the treatment population.
    CONCLUSIONS: These results demonstrate the feasibility of implementing Chagas disease diagnosis and treatment programs in resource-limited settings, including remote rural areas, while addressing the limitations associated with drug-related adverse events. The variability in apparent treatment effectiveness may reflect differences in patient and parasite populations, and it illustrates the limitations of current treatments and measures of efficacy. New treatments with improved safety profiles, pediatric formulations of existing and new drugs, and a faster, reliable test of cure are all urgently needed.

    Functional and Structural Biological Methods for Palytoxin Detection

    Palytoxin (PLTX) and its analogues are marine polyethers identified in Palythoa and Zoanthus corals, Ostreopsis dinoflagellates, and Trichodesmium cyanobacteria. Humans can be exposed to these toxins by different routes, with a series of adverse effects, but the most severe risk is associated with poisonings from the consumption of edible marine organisms that accumulate these toxins, as occurs in (sub)tropical areas. In temperate areas, adverse effects ascribed to PLTXs have been recorded after inhalation of marine aerosols and/or cutaneous contact with seawater during Ostreopsis blooms, as well as during cleaning procedures of Palythoa-containing home aquaria. Besides instrumental analytical methods, a series of alternative or complementary methods based on biological/biochemical tools has been developed in recent years for the rapid and specific PLTX detection required for risk assessment. These methods are usually sensitive, cost- and time-effective, and do not require highly specialized operators. Among them, structural immunoassays and functional cell-based assays are reviewed. The availability of specific anti-PLTX antibodies has allowed the development of different sensitive structural assays, suitable for PLTX detection even in complex matrices such as mussels. In addition, knowledge of the mechanism of PLTX action has allowed a series of functional identification methods to be developed. Although some of them are limited by matrix effects and specificity issues, biological methods for PLTX detection represent a feasible tool, suitable for rapid screening.

    Depth migration of land seismic data: modeling of the equivalent reference surface

    This work states, substantiates, and empirically demonstrates an alternative modeling methodology for solving the problem of the shallow zone of the initial interval-velocity model in the depth-migration workflow for land seismic data. To illustrate the proposed methodology, it is enough to think of an elementary optical system in which a lens is interposed between an object and its image. To optimize or focus that image, we can either change the physical properties of the lens or, perhaps more simply, just manipulate the relative positions of the components of the system. In keeping with this illustration, the shallow zone of the interval-velocity model can be regarded as the main lens of the system for the purpose of focusing depth seismic images of the subsurface. The methods traditionally employed to solve this problem revolve around attempting to model the properties of this lens. What the methodology presented in this work proposes, as an alternative path, is to redefine an equivalent reference surface, suitably relocating sources and receivers when building the initial velocity model. This surface is then used as the reference for, first, the depth-migration process; second, the reflection-tomography workflow needed to refine the model; and, finally, the return to the time domain, and vice versa, whenever required for any subsequent processing and/or analysis. The method is substantiated using simple synthetic velocity models, on which the degree of focusing achievable with the proposed method is compared against that of a more conventional technique. Additionally, to demonstrate its applicability, examples are analyzed on real seismic data sets that differ significantly in quality and in the characteristics of their geological setting. Taken together, the elements presented in this work support the usefulness of the method, its advantages, and its contribution to the consistency and robustness of the overall depth-migration workflow for land seismic data.
    Track: Tercer Simposio sobre Inversión y Procesamiento de Señales en Exploración Sísmica. Facultad de Ciencias Astronómicas y Geofísicas
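    A minimal sketch of what relocating sources and receivers onto an equivalent reference surface could look like in practice follows; the moving-average smoothing, the replacement velocity, and the static-shift formulation are my own assumptions for illustration, not the workflow described in the paper.

        # Hypothetical sketch: build a smooth datum ("equivalent reference surface")
        # from the acquisition topography and move each station onto it with a
        # vertical static shift, before building the initial interval-velocity model.
        import numpy as np

        def equivalent_reference_surface(elevations, window=51):
            """Smooth the topography with a moving average to obtain the datum (m)."""
            kernel = np.ones(window) / window
            return np.convolve(elevations, kernel, mode="same")

        def datum_statics(elevations, datum, v_replacement=2000.0):
            """Vertical time shifts (s) that relocate stations onto the datum,
            using an assumed near-surface replacement velocity (m/s)."""
            return (elevations - datum) / v_replacement

        x = np.linspace(0.0, 10_000.0, 501)            # station positions (m)
        elev = 350.0 + 40.0 * np.sin(x / 800.0)        # synthetic topography (m)
        datum = equivalent_reference_surface(elev)
        shifts = datum_statics(elev, datum)            # applied before depth migration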