
    Neutrino reactions on $^{138}$La and $^{180}$Ta via charged and neutral currents by the Quasi-particle Random Phase Approximation (QRPA)

    Cosmological origins of the two heaviest odd-odd nuclei, $^{138}$La and $^{180}$Ta, are believed to be closely related to the neutrino-process. We investigate in detail neutrino-induced reactions on these nuclei. Charged current (CC) reactions, $^{138}$Ba$(\nu_e, e^{-})^{138}$La and $^{180}$Hf$(\nu_e, e^{-})^{180}$Ta, are calculated by the standard Quasi-particle Random Phase Approximation (QRPA) with neutron-proton pairing as well as neutron-neutron and proton-proton pairing correlations. For neutral current (NC) reactions, $^{139}$La$(\nu, \nu^{'})^{139}$La$^{*}$ and $^{181}$Ta$(\nu, \nu^{'})^{181}$Ta$^{*}$, we generate ground and excited states of the odd-even target nuclei, $^{139}$La and $^{181}$Ta, by applying a one-quasi-particle operator to the even-even nuclei $^{138}$Ba and $^{180}$Hf, which are taken as BCS ground states. Numerical results for CC reactions are shown to be consistent with recent semi-empirical data deduced from the Gamow-Teller strength distributions measured in the ($^{3}$He, $t$) reaction. Cross sections for NC reactions are estimated to be smaller than those for CC reactions by a factor of about $4 \sim 5$. Finally, cross sections weighted by the incident neutrino flux in a core-collapse supernova are presented for further application to network calculations of the relevant nuclear abundances.
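    The flux-weighted cross sections mentioned at the end are obtained by folding $\sigma(E)$ with the incident neutrino spectrum. A minimal Python sketch of that folding, assuming a Fermi-Dirac spectrum with zero chemical potential; the temperature, threshold, and normalization below are illustrative placeholders, not values from the paper:

        import numpy as np
        from scipy.integrate import quad

        def fermi_dirac_flux(E, T=4.0):
            # Unnormalized Fermi-Dirac spectrum (zero chemical potential);
            # E and T in MeV -- a common supernova-neutrino parametrization.
            return E**2 / (np.exp(E / T) + 1.0)

        def sigma_cc(E, E_thr=5.0, a=1e-42):
            # Placeholder cross section in cm^2; a real calculation would use
            # the QRPA strength distributions computed in the paper.
            return a * max(E - E_thr, 0.0)**2

        def flux_averaged(sigma, T=4.0, E_max=100.0):
            # <sigma> = int sigma(E) phi(E) dE / int phi(E) dE
            num, _ = quad(lambda E: sigma(E) * fermi_dirac_flux(E, T), 0.0, E_max)
            den, _ = quad(lambda E: fermi_dirac_flux(E, T), 0.0, E_max)
            return num / den

        print(f"<sigma> = {flux_averaged(sigma_cc):.3e} cm^2")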

    Comparing Computing Platforms for Deep Learning on a Humanoid Robot

    The goal of this study is to test two different computing platforms with respect to their suitability for running deep networks as part of a humanoid robot software system. One of the platforms is the CPU-centered Intel NUC7i7BNH and the other is an NVIDIA Jetson TX2 system that puts more emphasis on GPU processing. The experiments addressed a number of benchmarking tasks, including pedestrian detection using deep neural networks. Some of the results were unexpected but demonstrate that both platforms exhibit advantages and disadvantages when the computational performance and electrical power requirements of such a system are taken into account. Comment: 12 pages, 5 figures
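    The abstract does not spell out the measurement protocol; a minimal Python sketch of the kind of latency/throughput benchmark such a comparison involves (the infer callable, input shape, and run counts are illustrative assumptions, not the paper's setup):

        import time
        import numpy as np

        def benchmark(infer, input_shape=(1, 3, 224, 224), warmup=10, runs=100):
            # Time repeated forward passes of an inference callable and report
            # mean latency and throughput.
            x = np.random.rand(*input_shape).astype(np.float32)
            for _ in range(warmup):      # warm-up runs stabilize caches/clocks
                infer(x)
            start = time.perf_counter()
            for _ in range(runs):
                infer(x)
            elapsed = time.perf_counter() - start
            print(f"mean latency: {1000 * elapsed / runs:.2f} ms, "
                  f"throughput: {runs / elapsed:.1f} inferences/s")

        # usage: benchmark(model.predict) with any framework-specific callable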

    Healthcare Price Transparency: Policy Approaches and Estimated Impacts on Spending

    Healthcare price transparency discussions typically focus on increasing patients' access to information about their out-of-pocket costs, but that focus is too narrow and should include other audiences -- physicians, employers, health plans and policymakers -- each with distinct needs and uses for healthcare price information. Greater price transparency can reduce U.S. healthcare spending. For example, an estimated $100 billion could be saved over the next 10 years if three select interventions were undertaken. However, most of the projected savings come from making price information available to employers and physicians, according to an analysis by researchers at the former Center for Studying Health System Change (HSC). Based on the current availability and modest impact of plan-based transparency tools, requiring all private plans to provide personalized out-of-pocket price data to enrollees would reduce total health spending by an estimated $18 billion over the next decade. While $18 billion is a substantial dollar amount, it is less than a tenth of a percent of the $40 trillion in total projected health spending over the same period. In contrast, using state all-payer claims databases to gather and report hospital-specific prices might reduce spending by an estimated $61 billion over 10 years. The effects of price transparency depend critically on the intended audience, the decision-making context and how prices are presented. And the impact of price transparency can be greatly amplified if target audiences are able and motivated to act on the information. Simply providing prices is insufficient to control spending without other shifts in healthcare financing, including changes in benefit design to make patients more sensitive to price differences among providers and alternative treatments. Other reforms that can amplify the impact of price transparency include shifting from fee-for-service payments that reward providers for volume to payment methods that put providers at risk for spending for episodes of care or defined patient populations. While price transparency alone seems unlikely to transform the healthcare system, it can play a needed role in enabling effective reforms in value-based benefit design and provider payment.
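    The "less than a tenth of a percent" comparison is easy to verify:

        \frac{\$18\ \text{billion}}{\$40\ \text{trillion}}
          = \frac{18 \times 10^{9}}{40 \times 10^{12}}
          = 4.5 \times 10^{-4}
          = 0.045\% \; (< 0.1\%)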

    The Sharing Economy and Collaborative Finance: The Case of P2P Lending in Vietnam

    Peer-to-peer Online Lending (P2PO) has received increasing attention over the last few years, not only because of its disruptive nature and its disintermediation of nearly all major banking functions, but also because of its rapid growth and expanding breadth of services. This model offers a new way of investing alongside traditional channels such as banks or finance companies. Transactions are conducted entirely online, and personal information and the terms of fund mobilization are kept transparent and secure. The strong development of P2PO also raises a number of issues that require careful attention in order to promote its positive aspects and limit its negative ones. This research aims to highlight particular aspects of this new business model and to analyze the opportunities and risks for lenders and borrowers in Viet Nam. It combines qualitative analysis with survey data to produce descriptive statistics about P2PO in Viet Nam. The research shows that the potential of online peer lending is enormous, but that regulators will restrict the sharing economy model in general and P2PO lending in particular.

    Total flow of N and P in Vietnam urban wastes

    The amounts of organic matter, N, and P in urban wastes are quite significant, especially in wastewater and solid wastes. This study found that production was about 302,241 tons of TN/day and 54,682 tons of TP/day. With ongoing urbanization and industrialization, these numbers continue to increase. These nutrients can be used in agriculture as well as in other practices. Nevertheless, they become pollutants when discharged into the surrounding environment (rivers, lakes, etc.), as they cause water eutrophication and increase risks for water supply.

    Exclusion Statistics in a trapped two-dimensional Bose gas

    We study the statistical mechanics of a two-dimensional gas with a repulsive delta-function interaction, using a mean-field approximation. By a direct counting of states we establish that this model obeys exclusion statistics and is equivalent to an ideal exclusion-statistics gas. Comment: 3 pages; minor changes in notation; typos corrected
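    For context, "exclusion statistics" here refers to Haldane's generalized state counting: with statistics parameter $g$, the number of ways to place $N_i$ particles in $d_i$ states of species $i$ is

        W = \prod_{i} \frac{\bigl[d_i + (N_i - 1)(1 - g)\bigr]!}{N_i!\,\bigl[d_i - g N_i - (1 - g)\bigr]!}

    which reduces to Bose counting for $g=0$ and Fermi counting for $g=1$.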

    Rodrigues Formula for the Nonsymmetric Multivariable Laguerre Polynomial

    Extending a method developed by Takamura and Takano, we present the Rodrigues formula for the nonsymmetric multivariable Laguerre polynomials which form the orthogonal basis for the $B_{N}$-type Calogero model with distinguishable particles. Our construction makes it possible for the first time to algebraically generate all the nonsymmetric multivariable Laguerre polynomials with different parities for each variable. Comment: 6 pages, LaTeX
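    For orientation, the classical single-variable Rodrigues formula that such multivariable constructions generalize reads

        L_{n}^{(\alpha)}(x) = \frac{x^{-\alpha} e^{x}}{n!}\,
          \frac{d^{n}}{dx^{n}}\!\left(e^{-x} x^{\,n+\alpha}\right),
        \qquad n = 0, 1, 2, \dots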

    Linear Query Approximation Algorithms for Non-monotone Submodular Maximization under Knapsack Constraint

    This work, for the first time, introduces two constant-factor approximation algorithms with linear query complexity for non-monotone submodular maximization over a ground set of size $n$ subject to a knapsack constraint: $\mathsf{DLA}$ and $\mathsf{RLA}$. $\mathsf{DLA}$ is a deterministic algorithm that provides an approximation factor of $6+\epsilon$, while $\mathsf{RLA}$ is a randomized algorithm with an approximation factor of $4+\epsilon$. Both run in $O(n \log(1/\epsilon)/\epsilon)$ query complexity. The key to obtaining a constant approximation ratio with linear queries lies in: (1) dividing the ground set into two appropriate subsets to find near-optimal solutions over these subsets with linear queries, and (2) combining a threshold greedy with properties of two disjoint sets, or a random selection process, to improve solution quality. In addition to the theoretical analysis, we have evaluated our proposed solutions on three applications: Revenue Maximization, Image Summarization, and Maximum Weighted Cut, showing that our algorithms not only return results comparable to state-of-the-art algorithms but also require significantly fewer queries.
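    A minimal Python sketch of the density-threshold greedy idea mentioned in point (2), under a knapsack constraint; this is a simplified illustration with a generic value oracle f and cost map c, not the paper's actual $\mathsf{DLA}$/$\mathsf{RLA}$:

        def threshold_greedy(V, f, c, B, eps=0.1):
            # Density-threshold greedy for a submodular value oracle f under a
            # knapsack constraint sum of costs <= B. Simplified sketch only.
            d_max = max(f([e]) / c[e] for e in V)   # best single-element density
            S, spent, fS = [], 0.0, 0.0
            tau = d_max
            while tau > eps * d_max / len(V):       # geometric sweep of thresholds
                for e in V:
                    if e in S or spent + c[e] > B:
                        continue
                    gain = f(S + [e]) - fS          # marginal gain of e w.r.t. S
                    if gain >= tau * c[e]:          # keep e if density clears tau
                        S.append(e)
                        spent += c[e]
                        fS += gain
                tau *= 1.0 - eps
            return S, fS

        # toy usage: a modular (hence submodular) objective with unit costs
        w = {0: 3.0, 1: 1.0, 2: 2.0}
        S, val = threshold_greedy(list(w), lambda T: sum(w[e] for e in set(T)),
                                  {e: 1.0 for e in w}, B=2.0)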