
    Constraint on Neutrino Decay with Medium-Baseline Reactor Neutrino Oscillation Experiments

    The experimental bound on the lifetime of nu_3, the neutrino mass eigenstate with the smallest nu_e component, is weaker by many orders of magnitude than those on nu_1 and nu_2, to which the astrophysical constraints apply. We argue that future reactor neutrino oscillation experiments with a medium baseline (~ 50 km), such as JUNO or RENO-50, have the best chance of placing the most stringent constraint on the nu_3 lifetime among all neutrino experiments that use artificial-source neutrinos. Assuming decay into invisible states, we show by a detailed chi^2 analysis that the nu_3 lifetime divided by its mass, tau_3/m_3, can be constrained to tau_3/m_3 > 7.5 (5.5) x 10^{-11} s/eV at 95% (99%) C.L. with a 100 kt.years exposure of JUNO. It may be further improved to a level comparable to the atmospheric neutrino bound by a longer run. We also discuss to what extent nu_3 decay affects the mass-ordering determination and precision measurements of the mixing parameters.
    Comment: 23 pages, 6 figures, clarification of some discussions, added some references, no change in results and conclusions, version accepted for publication in JHE
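In the standard treatment of invisible neutrino decay, the nu_3 component of the propagating state is damped by a factor exp(-(m_3/tau_3) L/(2E)) in natural units, which is what makes tau_3/m_3 the constrained combination. A minimal sketch of the resulting electron-antineutrino survival probability, for illustration only (not the paper's chi^2 analysis; the mixing angles and mass splittings below are assumed, representative values):

```python
import numpy as np

# Toy sketch: reactor nu_e-bar survival probability with invisible nu_3 decay.
# The nu_3 amplitude is damped by exp(-(m_3/tau_3) * L / (2E)) in natural units.
# Mixing parameters are illustrative assumptions, not values from the paper.

HBAR_EV_S = 6.582e-16      # hbar in eV*s (converts tau/m from s/eV to natural units)
HBARC_EV_M = 1.973e-7      # hbar*c in eV*m (converts baseline to 1/eV)

S12, S13 = np.sin(0.584)**2, np.sin(0.15)**2   # assumed sin^2(theta12), sin^2(theta13)
DM21, DM31 = 7.5e-5, 2.5e-3                    # assumed mass splittings, eV^2

def survival_prob(L_km, E_MeV, tau_over_m_s_per_eV):
    """P(nu_e -> nu_e) with nu_3 decaying invisibly; tau/m given in s/eV."""
    L = L_km * 1e3 / HBARC_EV_M      # baseline in 1/eV
    E = E_MeV * 1e6                  # energy in eV
    # amplitude damping of the nu_3 component
    d3 = np.exp(-L / (2.0 * E) * HBAR_EV_S / tau_over_m_s_per_eV)
    Ue2 = np.array([(1 - S13) * (1 - S12),       # |U_e1|^2
                    (1 - S13) * S12,             # |U_e2|^2
                    S13])                        # |U_e3|^2
    phases = np.exp(-1j * np.array([0.0, DM21, DM31]) * L / (2.0 * E))
    amp = (Ue2 * phases * np.array([1.0, 1.0, d3])).sum()
    return abs(amp)**2
```

At the JUNO-like point L = 50 km, E = 4 MeV, a lifetime at the quoted bound (tau/m ~ 7.5e-11 s/eV) damps the nu_3 amplitude by roughly 25%, which is why a medium baseline is sensitive to this range.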

    O Futuro ... a quem pertence? (The Future ... To Whom Does It Belong?)

    Univ Fed Sao Paulo UNIFESP, Escola Paulista Med, Dept Otorhinolaryngol & Head & Neck Surg, Sao Paulo, SP, Brazil
    Web of Science

    Politics Around the Dining Table: Brazil, 1881 to 1928

    Group eating has long been associated with ritual ceremonies of the life cycle. ‘Food and Disruption: What shall we eat tomorrow?’ is an interesting theme when we look back at forms of dining, specifically at the use of the dining table as a space and instrument for constructing the idea of progress and national identity. Self-definition occurs in a specific space and time; in this case, Brazil between 1889 and 1930

    The Structure and Evolution of Online Rating Biases in the Sharing Economy

    A wave of sharing economy companies is profoundly changing the market landscape, disrupting traditional businesses along with the social fabric of exchange. A critical challenge to their growth, however, is how to generate trust from online to offline transactions. Users of many online platforms rely on reputational systems such as ratings to infer quality and make decisions. However, ratings are biased by behavioral tendencies such as homophily and power dependence. Our project examines the structure and evolution of rating biases by analyzing massive amounts of platform data. Using big data techniques on leading sharing economy platforms, we identify the structure and evolution of biases, attempting to correct for these tendencies in system design. We examine rating biases and their relationships to social distance among heterogeneous user populations. The coevolution of reputational systems and trust further implies long-term behavioral trends, which are critical to investigate for business growth

    On the Internet Delay Space Dimensionality

    We investigate the dimensionality properties of the Internet delay space, i.e., the matrix of measured round-trip latencies between Internet hosts. Previous work on network coordinates has indicated that this matrix can be embedded, with reasonably low distortion, in a low-dimensional Euclidean space. Our work addresses the question: to what extent is the dimensionality an intrinsic property of the distance matrix, defined without reference to a host metric such as Euclidean space? Does the intrinsic dimensionality of the Internet delay space match the dimension determined using embedding techniques? If not, what explains the discrepancy? What properties of the network contribute to its overall dimensionality? Using a dataset obtained via the King method, we compare three intrinsically-defined measures of dimensionality with the dimension obtained using network embedding techniques to establish the following conclusions. First, the structure of the delay space is best described by fractal measures of dimension rather than by integer-valued parameters such as the embedding dimension. Second, the intrinsic dimension is inherently lower than the embedding dimension; in fact, by some measures it is less than 2. Third, the Internet dimensionality can be reduced by decomposing its delay space into pieces consisting of hosts that share a common upstream Tier-1 autonomous system. Finally, we argue that fractal dimensionality measures and non-linear embedding algorithms are capable of detecting subtle features of the delay space geometry which are not detected by other embedding techniques
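A standard intrinsic, fractal measure of dimension of the kind discussed above is the correlation dimension: the slope of log C(r) versus log r, where C(r) is the fraction of host pairs within delay r, computed directly from the distance matrix with no embedding. A minimal sketch, for illustration only (the synthetic "delay" matrix below is an assumption standing in for measured latencies, not the paper's King dataset):

```python
import numpy as np

# Correlation-dimension sketch: estimate the slope of log C(r) vs log r,
# where C(r) is the fraction of point pairs within distance r.
# The matrix here is synthetic (hosts scattered on a 2-D plane),
# standing in for a measured round-trip-latency matrix.

rng = np.random.default_rng(0)
pts = rng.uniform(0, 100, size=(400, 2))          # hosts on a 2-D "map"
diff = pts[:, None, :] - pts[None, :, :]
delays = np.sqrt((diff**2).sum(-1))               # symmetric distance matrix

def correlation_dimension(d, r_lo, r_hi, n_r=20):
    """Slope of log C(r) vs log r over the scale range [r_lo, r_hi]."""
    iu = np.triu_indices_from(d, k=1)             # distinct pairs only
    pair_d = d[iu]
    rs = np.logspace(np.log10(r_lo), np.log10(r_hi), n_r)
    C = np.array([(pair_d < r).mean() for r in rs])
    slope, _ = np.polyfit(np.log(rs), np.log(C), 1)
    return slope

dim = correlation_dimension(delays, r_lo=5.0, r_hi=30.0)
```

For points filling a plane the estimate comes out close to 2; a fractional (non-integer) slope on real delay data is what motivates describing the delay space with fractal rather than integer dimensions.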

    End-to-End Automation in Cloud Infrastructure Provisioning

    Infrastructure provisioning in the cloud can be time-consuming and error-prone due to the manual process of writing scripts. Configuration Management Tools (CMTs) such as Ansible, Puppet, or Chef use scripts to orchestrate infrastructure provisioning and configuration in the cloud. Although CMTs provide a high level of automation in infrastructure provisioning, it still remains a challenge to automate the iterative development process in the cloud. Infrastructure as Code is a process whereby the infrastructure is automatically built, managed, and provisioned by scripts. However, there are several infrastructure provisioning tools and scripting languages that need to be used coherently. In previous work, we introduced the ARGON modelling tool, whose purpose is to abstract the complexity of working with different DevOps tools through a DSL. In this work, we present an end-to-end automation toolchain for infrastructure provisioning in the cloud based on DevOps community tools and ARGON

    Boson Sampling with efficient scaling and efficient verification

    A universal quantum computer of moderate scale is not yet available; however, intermediate models of quantum computation would still permit demonstrations of a quantum computational advantage over classical computing and could challenge the Extended Church-Turing Thesis. One of these models, based on single photons interacting via linear optics, is called Boson Sampling. Proof-of-principle Boson Sampling has been demonstrated, but the number of photons used in these demonstrations is below the level required to claim quantum computational advantage. To make progress on this problem, we conclude that the most practically achievable pathway to scaling Boson Sampling experiments with current technologies is to combine continuous-variable quantum information and temporal encoding. We propose the use of switchable dual-homodyne and single-photon detections, the temporal loop technique, and scattershot-based Boson Sampling. This proposal details the required assumptions and a pathway to a quantum optical demonstration of quantum computational advantage. Furthermore, this particular combination of techniques permits an efficient implementation of Boson Sampling and efficient verification in a single experimental setup
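The classical hardness underlying Boson Sampling comes from matrix permanents: output probabilities are proportional to |Perm(A)|^2 for submatrices A of the interferometer's unitary, and the best known exact classical algorithms, such as Ryser's formula, take time exponential in the photon number. A short sketch of Ryser's formula, for illustration only (not part of the proposal above):

```python
from itertools import combinations

def permanent(A):
    """Ryser's formula: exact permanent of an n x n matrix in O(2^n * n^2) time.

    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^|S| * prod_i sum_{j in S} A[i][j]
    """
    n = len(A)
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1.0
            for row in A:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total
```

The same code works for complex entries, as needed for interferometer unitaries; the 2^n subset sum is exactly why even modest photon numbers put exact classical simulation out of reach.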

    An Infrastructure Modeling Approach for Multi-Cloud Provisioning

    Cloud Computing has become the primary pay-per-use model for practitioners and researchers to obtain an infrastructure in a short time. DevOps uses the Infrastructure as Code approach to automate infrastructure based on software development practices. Moreover, the DevOps community provides different tools to orchestrate infrastructure provisioning on a particular cloud provider. However, the traditional method of using a single cloud provider has several limitations regarding privacy, security, performance, geographic reach, and vendor lock-in. To mitigate these issues, industry and academia are adopting multiple clouds (i.e., multi-cloud). In previous work, we introduced ARGON, an infrastructure modeling tool for cloud provisioning that leverages model-driven engineering (MDE) to provide a uniform, cohesive, and seamless process supporting the DevOps concept. In this paper, we present an extension of ARGON to support multi-cloud infrastructure provisioning and propose a flexible migration process among clouds

    Resources Package Modelling Supporting Border Surveillance Operations

    The purpose of this work is to propose a military planning tool capable of providing logistical bases and patrol packages that most effectively support border surveillance. Presently, military patrols are employed along geographical borders to combat transnational crimes such as drug trafficking, smuggling of goods, and illegal exploitation of natural resources. The patrols make temporary stops, within specific time windows, at places characterised by a high incidence of crime (hotspots). These hotspots have different criticalities within given time windows. To optimise the results, the proposed model allows additional stops at more critical hotspots, achieved using a mathematical optimisation model. Given that adequate logistical-military capacities (logistical bases and patrols) are not available at all needed locations, developing a border surveillance plan that optimises resource use is imperative. The model was run using black hole-based optimisation and a real patrol mission’s database to ensure timely solutions. The solutions were then evaluated in terms of quality (number of bases and patrols, coverage efforts, and travel time) and computational processing time, and compared with solutions from the traditional method, demonstrating the model’s robustness in providing timely surveillance schemes that ensure high coverage with minimum resources
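Black hole-based optimisation, the metaheuristic used above, is population-based: the best candidate (the "black hole") attracts the remaining candidates (the "stars"), and any star that crosses the black hole's event horizon is swallowed and replaced by a fresh random one, preserving diversity. A minimal continuous-domain sketch on a toy objective (the objective, bounds, and parameters are illustrative assumptions, not the paper's patrol model):

```python
import numpy as np

def black_hole_optimize(f, dim, bounds, n_stars=30, n_iter=200, seed=1):
    """Minimise f over [bounds[0], bounds[1]]^dim with the black hole heuristic."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    stars = rng.uniform(lo, hi, size=(n_stars, dim))
    fit = np.apply_along_axis(f, 1, stars)
    bh = stars[fit.argmin()].copy()               # best star becomes the black hole
    bh_fit = fit.min()
    for _ in range(n_iter):
        # stars drift a random fraction of the way toward the black hole
        stars += rng.random((n_stars, 1)) * (bh - stars)
        np.clip(stars, lo, hi, out=stars)
        fit = np.apply_along_axis(f, 1, stars)
        if fit.min() < bh_fit:                    # a star overtakes the black hole
            bh, bh_fit = stars[fit.argmin()].copy(), fit.min()
        # stars inside the event horizon are swallowed and re-spawned at random
        radius = bh_fit / (fit.sum() + 1e-12)
        absorbed = np.linalg.norm(stars - bh, axis=1) < radius
        stars[absorbed] = rng.uniform(lo, hi, size=(absorbed.sum(), dim))
    return bh, bh_fit

# toy usage: minimise the 2-D sphere function, optimum 0 at the origin
best, best_val = black_hole_optimize(lambda x: float((x**2).sum()),
                                     dim=2, bounds=(-5, 5))
```

In the patrol-planning setting the decision variables would encode base and stop assignments rather than continuous coordinates, and the objective would score coverage and travel time, but the attract/absorb loop is the same.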