
    Improved Convergence Rates for Distributed Resource Allocation

    In this paper, we develop a class of decentralized algorithms for solving a convex resource allocation problem in a network of n agents, where the agent objectives are decoupled while the resource constraints are coupled. The agents communicate over a connected undirected graph, and they want to collaboratively determine a solution to the overall network problem while each agent communicates only with its neighbors. We first study the connection between the decentralized resource allocation problem and the decentralized consensus optimization problem. Then, using a class of algorithms for solving consensus optimization problems, we propose a novel class of decentralized schemes for solving resource allocation problems in a distributed manner. Specifically, we first propose an algorithm with an o(1/k) convergence rate guarantee when the agents' objective functions are generally convex (possibly nondifferentiable) and per-agent local convex constraints are allowed. We then propose a gradient-based algorithm for the case without per-agent local constraints and show that it achieves a geometric convergence rate when the objective functions are strongly convex and have Lipschitz continuous gradients. We also provide a scalability/network-dependency analysis. Based on these two algorithms, we further propose a gradient projection-based algorithm that handles smooth objectives and simple constraints more efficiently. Numerical experiments demonstrate the viability and performance of all the proposed algorithms.
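    The problem described above has a standard form: minimize the sum of decoupled local objectives f_i(x_i) subject to a coupled resource constraint such as sum_i x_i = b, using only neighbor-to-neighbor communication over the graph. As a minimal sketch of that setup (not the paper's actual algorithms), the toy below runs a simple update in which each agent adjusts its allocation using differences of neighboring gradients, which preserves the total budget at every step; the ring topology, quadratic objectives, and step size are illustrative assumptions.

```python
# Toy decentralized resource allocation in the spirit of the problem above:
# minimize sum_i f_i(x_i) subject to sum_i x_i = b, with agents exchanging
# gradient information only with graph neighbors. The ring topology, quadratic
# objectives, and step size are illustrative assumptions, not the paper's method.
import numpy as np

n, b = 5, 10.0                           # number of agents, total resource budget
a = np.array([1.0, 2.0, 1.5, 3.0, 2.5])  # curvature of each local quadratic objective
c = np.array([4.0, 1.0, 2.0, 0.5, 3.0])  # each agent's unconstrained minimizer

def grad(x):                             # gradient of f_i(x_i) = 0.5*a_i*(x_i - c_i)^2
    return a * (x - c)

# Ring graph: each agent talks to two neighbors; L is the graph Laplacian.
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

x = np.full(n, b / n)                    # feasible start: split the budget evenly
alpha = 0.1
for _ in range(2000):
    # Each agent moves along differences of neighboring gradients; since 1^T L = 0,
    # the total allocation sum(x) = b is preserved at every iteration.
    x = x - alpha * L @ grad(x)

lam = (b - c.sum()) / (1.0 / a).sum()    # KKT multiplier of the centralized solution
x_star = c + lam / a
print(np.round(x, 4), np.round(x_star, 4), x.sum())  # x should approach x_star
```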

    Feedback Effects and the Limits to Arbitrage

    This paper identifies a limit to arbitrage that arises from the fact that a firm's fundamental value is endogenous to the act of exploiting the arbitrage. Trading on private information reveals this information to managers and helps them improve their real decisions, in turn enhancing fundamental value. While this increases the profitability of a long position, it reduces the profitability of a short position -- selling on negative information reveals that firm prospects are poor, causing the manager to cancel investment. Optimal abandonment increases firm value and may cause the speculator to realize a loss on her initial sale. Thus, investors may strategically refrain from trading on negative information, and so bad news is incorporated more slowly into prices than good news. The effect has potentially important real consequences -- if negative information is not incorporated into stock prices, negative-NPV projects may not be abandoned, leading to overinvestment.
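    The mechanism lends itself to a back-of-the-envelope illustration. In the toy calculation below (all numbers are invented, not from the paper), a speculator who knows a project is bad profits from a short sale only if the manager ignores the price signal; once the sale itself reveals the bad news and the manager optimally abandons the project, firm value rises and the same short position loses money.

```python
# Toy numerical illustration of the feedback-effect mechanism described above.
# All numbers are invented for illustration; they are not taken from the paper.
assets_in_place = 100.0
npv_good, npv_bad = 20.0, -20.0

# Prior price if the market believes the project is equally likely to be good
# or bad and the manager invests regardless:
price_no_trade = assets_in_place + 0.5 * npv_good + 0.5 * npv_bad   # = 100

# A speculator privately knows the project is bad. Suppose her short sale moves
# the price partway toward the bad-news value, so she sells at:
sale_price = 90.0

# Case 1: the manager ignores prices and invests anyway; the firm ends up worth
# its assets plus the bad project's NPV, and the short is profitable.
value_if_invest = assets_in_place + npv_bad
profit_if_ignored = sale_price - value_if_invest           # 90 - 80 = +10

# Case 2 (feedback effect): the sale reveals the bad news, the manager abandons
# the negative-NPV project, and the firm is worth its assets in place alone.
value_if_abandon = assets_in_place
profit_with_feedback = sale_price - value_if_abandon        # 90 - 100 = -10

print(price_no_trade, profit_if_ignored, profit_with_feedback)
# Abandonment raises firm value (100 > 80) but turns the short's gain into a loss,
# which is why the speculator may refrain from selling on bad news at all.
```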

    Relay: A New IR for Machine Learning Frameworks

    Machine learning powers diverse services in industry, including search, translation, recommendation systems, and security. The scale and importance of these models require that they be efficient, expressive, and portable across an array of heterogeneous hardware devices. These constraints are often at odds; in order to better accommodate them, we propose a new high-level intermediate representation (IR) called Relay. Relay is being designed as a purely functional, statically typed language with the goal of balancing efficient compilation, expressiveness, and portability. We discuss the goals of Relay and highlight its important design constraints. Our prototype is part of the open-source NNVM compiler framework, which powers Amazon's deep learning framework MxNet.
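    To make the idea of a purely functional, statically typed IR concrete, the sketch below models a tiny expression-based tensor IR with a toy shape checker. The node names, operators, and typing rules are invented for illustration and are not Relay's actual syntax or API.

```python
# A minimal toy sketch of a purely functional, statically typed tensor IR.
# Node names and types here are invented and are NOT Relay's actual syntax or API.
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class TensorType:
    shape: Tuple[int, ...]
    dtype: str = "float32"

@dataclass(frozen=True)
class Var:            # a typed variable; no mutation, values are bound once
    name: str
    type: TensorType

@dataclass(frozen=True)
class Call:           # application of a primitive operator to argument expressions
    op: str
    args: tuple

@dataclass(frozen=True)
class Function:       # a function expression: params -> body (itself an expression)
    params: tuple
    body: object
    ret_type: TensorType

def infer_type(expr):
    """Toy shape/type checker for the two operators used below."""
    if isinstance(expr, Var):
        return expr.type
    if isinstance(expr, Call):
        arg_types = [infer_type(a) for a in expr.args]
        if expr.op == "dense":          # (n, k) x (k, m) -> (n, m)
            (n, k), (k2, m) = arg_types[0].shape, arg_types[1].shape
            assert k == k2, "inner dimensions must agree"
            return TensorType((n, m), arg_types[0].dtype)
        if expr.op == "relu":           # elementwise, shape-preserving
            return arg_types[0]
    raise TypeError(f"cannot type {expr!r}")

# A tiny "model": y = relu(dense(x, w)), expressed as a function over typed inputs.
x = Var("x", TensorType((1, 784)))
w = Var("w", TensorType((784, 10)))
body = Call("relu", (Call("dense", (x, w)),))
mlp_layer = Function((x, w), body, infer_type(body))
print(mlp_layer.ret_type)               # TensorType(shape=(1, 10), dtype='float32')
```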

    Urban characteristics attributable to density-driven tie formation

    Motivated by empirical evidence on the interplay between geography, population density, and societal interaction, we propose a generative process for the evolution of social structure in cities. Our analytical and simulation results predict super-linear scaling of both social tie density and information flow as a function of the population. We demonstrate that our model provides a robust and accurate fit for the dependence of city characteristics on city size, ranging from individual-level dyadic interactions (number of acquaintances, volume of communication) to population-level variables (contagious disease rates, patenting activity, economic productivity, and crime) without the need to appeal to modularity, specialization, or hierarchy.
    Comment: An early version of this paper was presented as a contributed talk at NetSci 2012 in June 2012. An improved version is published in Nature Communications in June 2013. 14 pages, 5 figures.
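    The central quantitative claim is super-linear scaling: a city-level quantity Y grows with population N roughly as Y = a * N^beta with beta > 1, so per-capita values increase with city size. A common way to check this is a power-law fit on log-log axes, as in the small sketch below; the synthetic data and the exponent are illustrative, not the paper's estimates.

```python
# Illustrative check of super-linear scaling Y = a * N**beta (beta > 1) by a
# log-log least-squares fit. The synthetic data and the exponent 1.15 are
# invented for illustration; they are not the paper's measurements.
import numpy as np

rng = np.random.default_rng(0)
N = np.logspace(4, 7, 50)                      # city populations from 1e4 to 1e7
beta_true, a_true = 1.15, 3e-4
Y = a_true * N**beta_true * rng.lognormal(sigma=0.2, size=N.size)  # noisy city-level quantity

# Fit log Y = log a + beta * log N
beta_hat, log_a_hat = np.polyfit(np.log(N), np.log(Y), 1)
print(f"estimated exponent beta ~= {beta_hat:.3f}")     # > 1 indicates super-linear scaling
print(f"per-capita value Y/N grows like N**{beta_hat - 1:.3f}")
```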

    X-ray luminescence computed tomography using a focused X-ray beam

    Due to the low X-ray photon utilization efficiency and low measurement sensitivity of the electron multiplying charge coupled device (EMCCD) camera setup, collimator-based narrow-beam X-ray luminescence computed tomography (XLCT) usually requires a long measurement time. In this paper, we, for the first time, report a focused X-ray beam based XLCT imaging system in which measurements are taken with a single optical fiber bundle and a photomultiplier tube (PMT). An X-ray tube with a polycapillary lens was used to generate a focused X-ray beam whose X-ray photon density is 1200 times higher than that of a collimated X-ray beam. An optical fiber bundle was employed to collect the emitted photons on the phantom surface and deliver them to the PMT. The total measurement time was reduced to 12.5 minutes. In numerical simulations of both the single and the six fiber bundle cases, we were able to reconstruct six targets successfully. In the phantom experiment, two targets with an edge-to-edge distance of 0.4 mm and a center-to-center distance of 0.8 mm were successfully reconstructed by the measurement setup with a single fiber bundle and a PMT.
    Comment: 39 pages, 12 figures, 2 tables. In submission (under review) to JB
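    Conceptually, reconstruction in a selectively excited geometry like this can be viewed as a linear inverse problem: each focused-beam position excites a known narrow region, and the single-detector reading is a weighted sum of the nanophosphor concentration it illuminates. The sketch below illustrates that general idea with a one-dimensional toy and nonnegative least squares; it is not the authors' forward model or reconstruction algorithm, and all quantities are invented.

```python
# Toy 1-D illustration of reconstruction from focused-beam scan measurements:
# each beam position excites a narrow (Gaussian) region, a single detector records
# the total emission, and concentrations are recovered by nonnegative least squares.
# This is NOT the paper's forward model or algorithm; all values are invented.
import numpy as np
from scipy.optimize import nnls

n_pix, n_scan = 60, 60
x = np.linspace(0.0, 6.0, n_pix)              # pixel centers (mm)
c_true = np.zeros(n_pix)
c_true[(x > 2.0) & (x < 2.4)] = 1.0           # two small targets with a 0.4 mm gap
c_true[(x > 2.8) & (x < 3.2)] = 1.0

beam_pos = np.linspace(0.0, 6.0, n_scan)
sigma = 0.08                                  # assumed focused-beam width (mm)
A = np.exp(-0.5 * ((beam_pos[:, None] - x[None, :]) / sigma) ** 2)  # sensitivity matrix

rng = np.random.default_rng(1)
y = A @ c_true + 0.01 * rng.standard_normal(n_scan)   # noisy detector readings per beam position

c_rec, _ = nnls(A, y)                          # nonnegative least-squares reconstruction
print(np.round(c_rec[(x > 1.8) & (x < 3.4)], 2))
```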

    Comparative approaches for assessing access to alcohol outlets: exploring the utility of a gravity potential approach.

    Background: A growing body of research recommends controlling alcohol availability to reduce harm. Various common approaches, however, provide dramatically different pictures of the physical availability of alcohol. This limits our understanding of the distribution of alcohol access, the causes and consequences of this distribution, and how best to reduce harm. The aim of this study is to introduce both a gravity potential measure of access to alcohol outlets, comparing its strengths and weaknesses to other popular approaches, and an empirically-derived taxonomy of neighborhoods based on the type of alcohol access they exhibit.
    Methods: We obtained geospatial data on Seattle, including the location of 2402 alcohol outlets, United States Census Bureau estimates on 567 block groups, and a comprehensive street network. We used exploratory spatial data analysis and employed a measure of inter-rater agreement to capture differences in our taxonomy of alcohol availability measures.
    Results: Significant statistical and spatial variability exists between measures of alcohol access, and these differences have meaningful practical implications. In particular, standard measures of outlet density (e.g., spatial, per capita, roadway miles) can lead to biased estimates of physical availability that over-emphasize the influence of the control variables. Employing a gravity potential approach provides a more balanced, geographically-sensitive measure of access to alcohol outlets.
    Conclusions: Accurately measuring the physical availability of alcohol is critical for understanding the causes and consequences of its distribution and for developing effective evidence-based policy to manage the alcohol outlet licensing process. A gravity potential model provides a superior measure of alcohol access, and the alcohol access-based taxonomy a helpful evidence-based heuristic for scholars and local policymakers.
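    The gravity potential idea can be stated compactly: a location's access score sums, over all outlets, an attraction weight discounted by distance, e.g. A_i = sum_j S_j / d_ij^beta. The sketch below computes that score for a few made-up block-group centroids; the coordinates, weights, and decay exponent are illustrative assumptions, not the study's data or calibration.

```python
# Toy gravity-potential accessibility score: A_i = sum_j S_j / d_ij**beta,
# where S_j is an outlet "size"/attraction weight and d_ij the distance from
# block group i to outlet j. All coordinates, weights, and the decay exponent
# are invented for illustration; they are not the study's data or calibration.
import numpy as np

block_groups = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 1.0]])   # centroid coords (km)
outlets      = np.array([[0.5, 0.5], [1.0, 1.5], [2.5, 0.5]])   # outlet coords (km)
size         = np.array([1.0, 2.0, 0.5])                        # attraction weights
beta         = 2.0                                               # distance-decay exponent

# Pairwise distances between every block group and every outlet.
d = np.linalg.norm(block_groups[:, None, :] - outlets[None, :, :], axis=2)
d = np.maximum(d, 0.1)            # floor to avoid blow-up when an outlet sits on a centroid

access = (size / d**beta).sum(axis=1)     # gravity potential score per block group
print(np.round(access, 3))

# Unlike simple counts or per-capita densities, every outlet contributes to every
# block group's score, with closer and "larger" outlets weighted more heavily.
```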