3,783 research outputs found

    Non-Vanishing Cosmological Constant Λ, Phase Transitions, And Λ-Dependence Of High Energy Processes

    Full text link
    It is pointed out that a collider experiment involves a local contribution to the energy-momentum tensor, a circumstance which is not a common feature of the current state of the Universe at large, characterized by the cosmological constant Λ_0. This contribution may be viewed as a change in the structure of space-time from its large-scale form governed by Λ_0 to one governed by a Λ peculiar to the scale of the experiment. Possible consequences of this effect are explored by exploiting the asymptotic symmetry of space-time for non-vanishing Λ and its relation to vacuum energy. Comment: 11 pages; UCTP101.02; last section revised; the version to appear in Physics Letters.
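
    A short aside, added for context rather than taken from the abstract: the relation between Λ and vacuum energy invoked above is conventionally expressed by treating a vacuum energy density as a cosmological-constant term in Einstein's equations,

        \Lambda \;=\; \frac{8\pi G}{c^2}\,\rho_{\mathrm{vac}}, \qquad\text{equivalently}\qquad \varepsilon_{\mathrm{vac}} \;=\; \rho_{\mathrm{vac}}\,c^2 \;=\; \frac{\Lambda c^4}{8\pi G}.

    In this reading, a local contribution to the energy-momentum tensor at the scale of a collider experiment plays the role of a locally modified effective Λ, which is the picture the abstract explores.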

    Editorial special issue: PHM for railway systems and mass transportation

    Get PDF
    The railway and mass transportation system is composed of industrial goods with substantial capital investments and long life cycles. This applies to rolling stock such as trains, locomotives, and wagons, and even more to infrastructure such as signaling, catenary, tracks, bridges, and tunnels. The lifespan of rolling stock is 30 to 40 years, while the infrastructure is used for 30 to 60 years, or even more than 100 years in the case of tunnels and bridges. As with other industrial goods, the cost drivers are determined in the early design phases but are incurred mainly over the long period of operation. Maintenance is one of the main cost drivers, but it is essential to reliable, capable, and – above all – safe operation.

    The provenance of late Cenozoic East Asian Red Clay : Tectonic-metamorphic history of potential source regions and a novel combined zircon-rutile approach

    Get PDF
    Constraining the provenance of aeolian mineral dust is critical to understanding past climate changes, atmospheric dust activity, circulation, and sediment generation. On the Chinese Loess Plateau (CLP), the use of detrital zircon U-Pb age data as a source tracer for the dust has grown enormously and led to breakthroughs in understanding dust provenance. However, significant ambiguities remain, especially regarding the provenance of the aeolian Neogene Red Clay (RC). To address this, we review here the current state of understanding of Neogene RC provenance, with a focus on single-grain analyses, and introduce detrital rutile geochemistry as a tool to complement zircon U-Pb dating. Furthermore, to better utilise the link between the detrital minerals and their primary origin, we compile the geologic background of the primary source regions and the single-grain data relevant to geochronological and metamorphic provenance proxy minerals. We discuss four major tectonic divisions in northern China and southern Mongolia: the North China Craton (NCC), the Tarim Craton (TC), the Central China Orogen (CCO), and parts of the Central Asian Orogenic Belt (CAOB), and briefly summarize the Tibetan-Himalayan orogen. Many of these regions have been tectonically active during the same periods of Earth's history, and our analysis demonstrates how the use of zircon age data alone has limitations in differentiating between a number of key potential dust sources to the CLP. Adding a metamorphic source tracer such as rutile allows some of these possible source areas to be distinguished. For example, the proximal northern NCC regions that record high-/ultrahigh-temperature metamorphic conditions can potentially be diagnostic of a northerly source component in CLP dust. Our combined zircon-rutile analysis of the ca. 4 Ma Nihewan RC in the northern CLP verifies the utility of the novel rutile provenance proxy in sourcing CLP sediments. The zircon and rutile data suggest a similar dust provenance: the dominant sources are proximal areas on the NCC, while contributions from the dry areas in parts of the CAOB, the central deserts, and the Yellow River are also likely. Our results also hint at a minor source component deriving from distal western source regions in the TC and/or the central parts of the CCO, but rutile data from potential secondary source areas are needed to verify this possibility. We also conclude that multi-proxy single-grain analyses are needed for more reliable provenance interpretations.
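
    To make the multi-proxy idea concrete, the following is a minimal, purely illustrative Python sketch, not the authors' actual workflow: the ages, source regions, and rutile categories are invented for demonstration. It compares a sample's detrital zircon U-Pb age spectrum with candidate source regions using a two-sample Kolmogorov-Smirnov statistic, and tallies rutile grains by metamorphic grade as an independent second tracer.

        import numpy as np
        from collections import Counter
        from scipy.stats import ks_2samp

        # Hypothetical detrital zircon U-Pb ages (Ma) measured in an aeolian sample.
        sample_ages = np.array([250, 260, 280, 430, 440, 1800, 1850, 1900, 2450, 2500])

        # Hypothetical age populations of candidate source regions (values invented).
        sources = {
            "NCC":  np.array([1800, 1850, 1900, 2450, 2500, 2550]),
            "CAOB": np.array([250, 270, 290, 300, 430, 450]),
            "CCO":  np.array([400, 420, 440, 750, 800, 950]),
        }

        # A smaller KS statistic means the two age distributions are more alike.
        for name, ages in sources.items():
            stat, p = ks_2samp(sample_ages, ages)
            print(f"{name}: KS statistic = {stat:.2f}, p-value = {p:.3f}")

        # Rutile as an independent tracer: tally grains by (assumed) metamorphic grade,
        # e.g. broad categories derived from trace-element thermometry.
        rutile_grades = ["granulite", "amphibolite", "granulite", "eclogite", "amphibolite"]
        print(Counter(rutile_grades))

    In a real study one would work with full measured age spectra and calibrated thermometry; the point here is only that the two proxies constrain the source independently, which is what lets rutile separate sources that overlap in zircon age alone.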

    Bohm's interpretation and maximally entangled states

    Get PDF
    Several no-go theorems have shown the incompatibility between the locality assumption and the quantum correlations obtained from maximally entangled spin states. We analyze these no-go theorems in the framework of Bohm's interpretation. The mechanism by which non-local correlations appear in the results of measurements performed on distant parts of entangled systems is explicitly exhibited in terms of Bohmian trajectories. It is shown that a GHZ-like contradiction of the type +1 = -1 occurs for well-chosen initial positions of the Bohmian trajectories, and that it is this essential non-classical feature that makes it possible to violate the locality condition. Comment: 18 pages.
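
    For readers unfamiliar with the "+1 = -1" shorthand, the following is the standard GHZ argument in outline, added here for context rather than reproducing the paper's Bohmian derivation. For the state |GHZ⟩ = (|000⟩ + |111⟩)/√2,

        \sigma_x^{(1)}\sigma_x^{(2)}\sigma_x^{(3)}\,|\mathrm{GHZ}\rangle = +\,|\mathrm{GHZ}\rangle, \qquad
        \sigma_x^{(1)}\sigma_y^{(2)}\sigma_y^{(3)}\,|\mathrm{GHZ}\rangle =
        \sigma_y^{(1)}\sigma_x^{(2)}\sigma_y^{(3)}\,|\mathrm{GHZ}\rangle =
        \sigma_y^{(1)}\sigma_y^{(2)}\sigma_x^{(3)}\,|\mathrm{GHZ}\rangle = -\,|\mathrm{GHZ}\rangle .

    If each spin carried pre-assigned local values m_x^(i), m_y^(i) = ±1, the product of the last three observables would equal m_x^(1) m_x^(2) m_x^(3), since every m_y appears squared, i.e. it would equal the value +1 of the first observable. Quantum mechanically that product is (-1)^3 = -1, hence the contradiction +1 = -1 that the abstract refers to.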

    Ökonomische Analyse europäischer Bankenregulierung: Verbriefung und Interbankenmarkt im Fokus

    Get PDF
    The European Commission's new banking regulation restricts lending on the interbank market to 25 % of a bank's equity and requires the originator to retain 5 % of the entire securitised receivables portfolio. As this paper shows in a theoretical model, such a rigid regulation does not necessarily provide lasting crisis prevention. A rigid lending limit does enforce a minimum level of diversification and an increase in equity in the banking sector, which lowers systemic risk, but this comes at the cost of rising transaction costs. Using a model by Fender and Mitchell, the paper examines the effect on screening effort of securitisations in which the originator retains the complete portfolio, the equity tranche, or a vertical slice. The model shows that a vertical retention of less than 100 % of the receivables pool, as proposed by the European Commission, does not lead to an optimal level of screening in any situation and can even be worse than retaining the equity tranche. Given the complexity of the financial system, a blanket regulation should therefore be rejected in favour of a qualitative, dynamic regulation that creates more transparency. Keywords: Bankenregulierung, Verbriefung, Selbstbehalt, Interbankenmarkt
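
    The incentive logic can be illustrated with a deliberately simplified numerical sketch in Python. This is not the Fender-Mitchell model itself; the functional forms, parameter values, and the benign single-state loss assumption below are invented for illustration. The originator chooses a screening effort that lowers expected portfolio losses but is costly, and the retention scheme determines how much of the marginal benefit of screening it captures.

        import numpy as np

        FACE = 100.0             # portfolio face value (hypothetical units)
        P0, DELTA = 0.04, 0.02   # default rate p(e) = P0 - DELTA * e
        COST = 8.0               # screening cost c(e) = COST * e**2 / 2
        EQUITY_TRANCHE = 6.0     # thickness of a retained first-loss piece
        VERTICAL = 0.05          # 5 % vertical slice, as in the Commission proposal

        efforts = np.linspace(0.0, 1.0, 1001)

        def expected_loss(e):
            """Expected portfolio loss given screening effort e."""
            return (P0 - DELTA * e) * FACE

        def retained_loss(e, scheme):
            """Expected loss borne by the originator under each retention scheme."""
            loss = expected_loss(e)
            if scheme == "full":
                return loss
            if scheme == "vertical":
                return VERTICAL * loss
            if scheme == "equity":
                return min(loss, EQUITY_TRANCHE)  # first-loss piece absorbs losses up to its size
            raise ValueError(scheme)

        for scheme in ("full", "vertical", "equity"):
            payoff = [-retained_loss(e, scheme) - COST * e**2 / 2 for e in efforts]
            best = efforts[int(np.argmax(payoff))]
            print(f"{scheme:8s}: privately optimal screening effort ≈ {best:.2f}")

    With these invented numbers, full retention and an unexhausted equity tranche induce the same screening effort, while a 5 % vertical slice induces almost none, mirroring the abstract's point that a small vertical retention can be worse for screening than retaining the equity tranche. In a downturn severe enough to wipe out the first-loss piece regardless of effort, the equity tranche's incentive effect collapses as well, which is why the comparison between schemes is state-dependent.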

    Excited Heavy Baryons and Their Symmetries I: Formalism

    Full text link
    This is the first of two papers to study a new emergent symmetry which connects orbitally excited heavy baryons to the ground states in the combined heavy quark and large N_c limit. The existence of this symmetry is shown in a model-independent way, and different possible realizations of the symmetry are discussed. It is also proved that this emergent symmetry commutes with the large N_c spin-flavor symmetry. Comment: 20 pages in REVTeX.

    Geometric phases for non-degenerate and degenerate mixed states

    Full text link
    This paper focuses on the geometric phase of general mixed states under unitary evolution. Here we analyze both non-degenerate and degenerate states. Starting with the non-degenerate case, we show that the usual procedure of subtracting the dynamical phase from the total phase to yield the geometric phase for pure states does not hold for mixed states. Instead, we furnish an expression for the geometric phase that is gauge invariant. The parallelity conditions are shown to be easily derivable from this expression. We also extend our formalism to states that exhibit degeneracies. Here, with the holonomy taking on a non-abelian character, we provide an expression for the geometric phase that is manifestly gauge invariant. As in the non-degenerate case, this form also displays the parallelity conditions clearly. Finally, we furnish explicit examples of the geometric phases for both non-degenerate and degenerate mixed states. Comment: 23 pages.
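
    For orientation, and as added context rather than material from the paper: the pure-state procedure the abstract refers to is the familiar decomposition, for a state |ψ(t)⟩ evolving unitarily under a Hamiltonian H(t) over [0, τ],

        \gamma_{\mathrm{geo}} \;=\; \phi_{\mathrm{tot}} - \phi_{\mathrm{dyn}}, \qquad
        \phi_{\mathrm{tot}} \;=\; \arg\langle\psi(0)|\psi(\tau)\rangle, \qquad
        \phi_{\mathrm{dyn}} \;=\; -\frac{1}{\hbar}\int_0^{\tau}\langle\psi(t)|H(t)|\psi(t)\rangle\,dt .

    The abstract's point is that carrying this subtraction over naively to a mixed state ρ(t) = Σ_k w_k |k(t)⟩⟨k(t)| does not yield a gauge-invariant quantity, which is why separate expressions are constructed for the non-degenerate and degenerate mixed cases.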

    The legal rule that computers are presumed to be operating correctly – unforeseen and unjust consequences

    Get PDF
    In England and Wales, courts consider computers, as a matter of law, to have been working correctly unless there is evidence to the contrary. Therefore, evidence produced by computers is treated as reliable unless other evidence suggests otherwise. This way of handling evidence is known as a ‘rebuttable presumption’. A court will treat a computer as if it is working perfectly unless someone can show why that is not the case. This presumption poses a challenge to those who dispute evidence produced by a computer system. Frequently the challenge is insurmountable, particularly where a substantial institution operates the system. The Post Office Horizon scandal clearly exposes the problem and the harm that may result. From 1999, the Post Office prosecuted hundreds of postmasters and Post Office employees for theft and fraud based on evidence produced by the Horizon computer system showing shortfalls in their branch accounts. In those prosecutions, the Post Office relied on the presumption that computers were operating correctly. Hundreds of postmasters and others were convicted, sentenced to terms of imprisonment, fined, or had their property confiscated. This clearly demonstrated that the Law Commission’s assertion that ‘such a regime would work fairly’ was flawed. In the December 2019 judgment in the group litigation Bates v The Post Office Ltd (No 6: Horizon Issues) Rev 1, Mr Justice Fraser concluded that it was possible that software errors in Horizon could have caused apparent shortfalls in branch accounts, rather than these being due to theft or fraud. Following this judgment, the Criminal Cases Review Commission referred an unprecedented number of convictions, based upon the supposed shortfalls in the Horizon accounts, to the Court of Appeal. Appeal courts have quashed more than 70 convictions at the time of writing. There will be many more appeals and many more convictions quashed in what is likely the largest miscarriage of justice in British history. Were it not for the group litigation, the fundamental unreliability of the software in the Post Office’s Horizon computer system would not have been revealed, as previous challenges to Horizon’s correctness were unable to rebut the presumption of reliability for computer evidence. The financial risk of bringing legal action deterred other challenges. Similar issues apply in other situations where the reliability of computer evidence is questioned, such as in payment disputes. The legal presumption, as applied in practice, has exposed widespread misunderstanding about the nature of computer failures – in particular, the fact that these are almost invariably failures of software. The presumption has been the cause of widespread injustice, and there is a pressing need for it to be re-evaluated to avoid the risk of further or continuing injustice. We propose that the presumption that computer evidence is reliable be replaced with a process in which, if computer evidence is challenged, the party relying on it must justify the correctness of that evidence. The proposed process, summarised below, requires the disclosure of documents that would already exist in any well-managed computer system. Procedural and evidential safeguards of the kind we propose would probably have avoided the disastrous repeated miscarriages of justice of the past 20 years.