
    Spitzer microlens measurement of a massive remnant in a well-separated binary

    We report the detection and mass measurement of a binary lens OGLE-2015-BLG-1285La,b, with the more massive component having M1 > 1.35 M⊙ (80% probability). A main-sequence star in this mass range is ruled out by limits on blue light, meaning that a primary in this mass range must be a neutron star (NS) or black hole (BH). The system has a projected separation r⊥ = 6.1 ± 0.4 AU and lies in the Galactic bulge. These measurements are based on the "microlens parallax" effect, i.e., comparing the microlensing light curve as seen from Spitzer, which lay at 1.25 AU projected from Earth, to the light curves from four ground-based surveys, three in the optical and one in the near-infrared. Future adaptive optics imaging of the companion by 30 m class telescopes will yield a much more accurate measurement of the primary mass. This discovery both opens the path to and defines the challenges of detecting and characterizing BHs and NSs in wide binaries, with either dark or luminous companions. In particular, we discuss lessons that can be applied to future Spitzer and Kepler K2 microlensing parallax observations.
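
    As a rough illustration of the "microlens parallax" effect described above, the sketch below (not the authors' pipeline; all numbers and parameter values are invented) compares the event time and impact parameter seen from Earth and from a satellite a projected distance D⊥ away, forms the parallax amplitude πE, and converts an assumed angular Einstein radius into a lens mass via the standard relation M = θE/(κ πE).

```python
import numpy as np

KAPPA = 8.144  # mas per solar mass, standard microlensing constant

def microlens_parallax(t0_ground, t0_sat, u0_ground, u0_sat, tE, d_perp_au):
    """Parallax amplitude from the offset of the event as seen from Earth
    and from a satellite projected d_perp_au astronomical units away."""
    dtau = (t0_sat - t0_ground) / tE      # shift along the source trajectory
    dbeta = u0_sat - u0_ground            # shift in impact parameter
    return (1.0 / d_perp_au) * np.hypot(dtau, dbeta)

def lens_mass(theta_E_mas, pi_E):
    """Lens mass in solar masses from the angular Einstein radius (mas)."""
    return theta_E_mas / (KAPPA * pi_E)

# Hypothetical values for illustration only (days and Einstein-radius units)
pi_E = microlens_parallax(t0_ground=7200.0, t0_sat=7201.5,
                          u0_ground=0.10, u0_sat=0.14,
                          tE=30.0, d_perp_au=1.25)
print(f"pi_E ~ {pi_E:.3f}, M ~ {lens_mass(0.5, pi_E):.2f} M_sun")
```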

    A cell outage management framework for dense heterogeneous networks

    In this paper, we present a novel cell outage management (COM) framework for heterogeneous networks with split control and data planes, a candidate architecture for meeting future capacity, quality-of-service, and energy efficiency demands. In such an architecture, the control and data functionalities are not necessarily handled by the same node. The control base stations (BSs) manage the transmission of control information and user equipment (UE) mobility, whereas the data BSs handle UE data. An implication of this split architecture is that an outage to a BS in one plane has to be compensated by other BSs in the same plane. Our COM framework addresses this challenge by incorporating two distinct cell outage detection (COD) algorithms to cope with the idiosyncrasies of the data and control planes. The COD algorithm for control cells leverages the relatively larger number of UEs in the control cell to gather large-scale minimization-of-drive-test report data and detects an outage by applying machine learning and anomaly detection techniques. To improve outage detection accuracy, we also investigate and compare the performance of two anomaly detection algorithms, i.e., k-nearest-neighbor-based and local-outlier-factor-based anomaly detectors, within the control COD. For data cell COD, on the other hand, we propose a heuristic Grey-prediction-based approach, which can work with the small number of UEs in the data cell, by exploiting the fact that the control BS manages UE-data BS connectivity and receives periodic updates of the reference signal received power (RSRP) statistics between the UEs and the data BSs in its coverage. The detection accuracy of the heuristic data COD algorithm is further improved by exploiting the Fourier series of the residual error that is inherent to a Grey prediction model. Our COM framework integrates these two COD algorithms with a cell outage compensation (COC) algorithm that can be applied to both planes. Our COC solution utilizes an actor-critic-based reinforcement learning algorithm, which optimizes the capacity and coverage of the identified outage zone in a plane by adjusting the antenna gain and transmission power of the surrounding BSs in that plane. The simulation results show that the proposed framework can detect both data and control cell outages and compensate for the detected outage in a reliable manner.
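
    A minimal sketch of the control-plane outage detection idea, using scikit-learn's LocalOutlierFactor on synthetic MDT-style report data; the feature choice, contamination level, and decision threshold below are assumptions for illustration, not the paper's tuned parameters.

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)

# Synthetic MDT-style reports: each row is (serving RSRP [dBm], strongest
# neighbour RSRP [dBm]) gathered from UEs attached to a control cell.
normal = rng.normal(loc=[-85.0, -95.0], scale=[4.0, 5.0], size=(500, 2))
# Reports collected while the serving cell is in outage: serving power collapses.
outage = rng.normal(loc=[-125.0, -92.0], scale=[3.0, 5.0], size=(25, 2))
reports = np.vstack([normal, outage])

# LOF flags reports whose local density is much lower than their neighbours'.
lof = LocalOutlierFactor(n_neighbors=20, contamination=0.05)
labels = lof.fit_predict(reports)   # -1 = anomalous report, +1 = normal report

anomaly_ratio = np.mean(labels == -1)
# Hypothetical decision rule: declare an outage when the share of anomalous
# reports exceeds a threshold calibrated during fault-free operation.
print(f"anomalous report ratio = {anomaly_ratio:.2%}",
      "-> outage suspected" if anomaly_ratio > 0.03 else "-> cell healthy")
```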

    An Atomic Gravitational Wave Interferometric Sensor in Low Earth Orbit (AGIS-LEO)

    We propose an atom interferometer gravitational wave detector in low Earth orbit (AGIS-LEO). Gravitational waves can be observed by comparing a pair of atom interferometers separated over a ~30 km baseline. In the proposed configuration, one or three of these interferometer pairs are simultaneously operated through the use of two or three satellites in formation flight. The three-satellite configuration allows for the increased suppression of multiple noise sources and for the detection of stochastic gravitational wave signals. The mission will offer a strain sensitivity of < 10^(-18) / Hz^(1/2) in the 50 mHz - 10 Hz frequency range, providing access to a rich scientific region with substantial discovery potential. This band is not currently addressed by the LIGO or LISA instruments. We analyze systematic backgrounds that are relevant to the mission and discuss how they can be mitigated at the required levels. Some of these effects do not appear to have been considered previously in the context of atom interferometry, and we therefore expect that our analysis will be broadly relevant to atom interferometric precision measurements. Finally, we present a brief conceptual overview of shorter-baseline (< 100 m) atom interferometer configurations that could be deployed as proof-of-principle instruments on the International Space Station (AGIS-ISS) or an independent satellite. Comment: 37 pages, 21 figures.
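
    For a sense of scale (a back-of-the-envelope check, not a result from the paper), a passing gravitational wave of strain h changes the ~30 km baseline between the two interferometers by roughly dL ~ h * L, which is the size of the differential signal the instrument must resolve:

```python
L = 30e3                      # baseline in metres (~30 km, per the abstract)
for h in (1e-18, 1e-17):      # strains around the quoted sensitivity floor
    print(f"h = {h:.0e}  ->  dL ~ h * L = {h * L:.1e} m")
```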

    US Cosmic Visions: New Ideas in Dark Matter 2017: Community Report

    This white paper summarizes the workshop "U.S. Cosmic Visions: New Ideas in Dark Matter" held at the University of Maryland on March 23-25, 2017. Comment: 102 pages + references.

    A survey of machine learning techniques applied to self organizing cellular networks

    In this paper, a survey of the literature of the past fifteen years on Machine Learning (ML) algorithms applied to self-organizing cellular networks is performed. For future networks to overcome the limitations of current cellular systems, it is clear that more intelligence needs to be deployed, so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self Organizing Network (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks but also a classification of each surveyed paper in terms of its learning solution, together with illustrative examples. The authors also classify each paper in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, a comparison of the most commonly found ML algorithms in terms of certain SON metrics is performed, and general guidelines on when to choose each ML algorithm for each SON function are proposed. Lastly, this work also outlines future research directions and the new paradigms that the use of more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, fully enabling the concept of SON in the near future.

    Architecting Data Centers for High Efficiency and Low Latency

    Modern data centers, housing remarkably powerful computational capacity, are built at massive scale and consume a huge amount of energy. The energy consumption of data centers has mushroomed from virtually nothing to about three percent of the global electricity supply in the last decade, and it will continue to grow. Unfortunately, a significant fraction of this energy consumption is wasted due to the inefficiency of current data center architectures, and one of the key reasons behind this inefficiency is the stringent response latency requirements of the user-facing services hosted in these data centers, such as web search and social networks. To deliver such low response latency, data center operators often have to overprovision resources to handle high peaks in user load and unexpected load spikes, resulting in low efficiency. This dissertation investigates data center architecture designs that reconcile high system efficiency and low response latency. To increase efficiency, we propose techniques that understand both microarchitectural-level resource sharing and system-level resource usage dynamics to enable highly efficient co-locations of latency-critical services and low-priority batch workloads. We investigate resource sharing on real-system simultaneous multithreading (SMT) processors to enable SMT co-locations by precisely predicting the performance interference. We then leverage historical resource usage patterns to further optimize the task scheduling algorithm and data placement policy to improve the efficiency of workload co-locations. Moreover, we introduce methodologies to better manage the response latency by automatically attributing the source of tail latency to low-level architectural and system configurations in both offline load testing environments and online production environments. We design and develop a response latency evaluation framework with microsecond-level precision for data center applications, with which we construct statistical inference procedures to attribute the source of tail latency. Finally, we present an approach that proactively enacts carefully designed causal inference micro-experiments to diagnose the root causes of response latency anomalies and automatically correct them to reduce the response latency.
    PhD dissertation, Computer Science & Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. https://deepblue.lib.umich.edu/bitstream/2027.42/144144/1/yunqi_1.pd
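
    The toy example below (invented distributions, not the dissertation's evaluation framework) illustrates why tail latency rather than mean latency drives the overprovisioning described above: a co-located batch workload that stalls only a small fraction of requests barely moves the average but inflates the 99th percentile.

```python
import numpy as np

rng = np.random.default_rng(1)

def service_latencies_us(n, interference=False):
    """Synthetic request latencies (microseconds) for a latency-critical service."""
    lat = rng.lognormal(mean=np.log(200), sigma=0.3, size=n)   # ~200 us baseline
    if interference:
        # A small fraction of requests stall behind co-located batch work.
        slow = rng.random(n) < 0.02
        lat[slow] += rng.exponential(scale=2000, size=slow.sum())
    return lat

for label, lat in (("isolated", service_latencies_us(100_000)),
                   ("co-located", service_latencies_us(100_000, interference=True))):
    print(f"{label:10s} mean = {lat.mean():7.1f} us   p99 = {np.percentile(lat, 99):8.1f} us")
```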

    Investigating Key Techniques to Leverage the Functionality of Ground/Wall Penetrating Radar

    Ground penetrating radar (GPR) has been extensively utilized as a highly efficient and non-destructive testing method for infrastructure evaluation, such as highway rebar detection, bridge deck inspection, asphalt pavement monitoring, underground pipe leakage detection, railroad ballast assessment, etc. The focus of this dissertation is to investigate the key techniques to tackle GPR signal processing from three perspectives: (1) removing or suppressing the radar clutter signal; (2) detecting the underground target or the region of interest (RoI) in the GPR image; (3) imaging the underground target to eliminate or alleviate the feature distortion and reconstructing the shape of the target with good fidelity. In the first part of this dissertation, a low-rank and sparse representation based approach is designed to remove the clutter produced by rough ground surface reflection for impulse radar. In the second part, Hilbert transform and 2-D Rényi entropy based statistical analysis is explored to improve RoI detection efficiency and to reduce the computational cost for more sophisticated data post-processing. In the third part, a back-projection imaging algorithm is designed for both ground-coupled and air-coupled multistatic GPR configurations. Since the refraction phenomenon at the air-ground interface is considered and the spatial offsets between the transceiver antennas are compensated in this algorithm, the data points collected by receiver antennas in the time domain can be accurately mapped back to the spatial domain and the targets can be imaged in the scene space under testing. Experimental results validate that the proposed three-stage cascade signal processing methodology can improve the performance of GPR systems.
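
    As a highly simplified stand-in for the low-rank and sparse clutter-removal idea in the first part (the paper's actual decomposition is more sophisticated), the sketch below treats a synthetic B-scan as a matrix, captures the ground-surface reflection that repeats across traces with a truncated SVD, and keeps the residual containing the target hyperbola; the synthetic data and the chosen rank are assumptions for illustration.

```python
import numpy as np

def remove_clutter_svd(bscan, rank=2):
    """Subtract a rank-`rank` approximation of the B-scan: the nearly identical
    ground reflection across traces lives in the leading singular components,
    while the sparse target response remains in the residual."""
    U, s, Vt = np.linalg.svd(bscan, full_matrices=False)
    clutter = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
    return bscan - clutter, clutter

# Synthetic B-scan: rows = time samples, columns = antenna positions.
t = np.linspace(0.0, 1.0, 256)[:, None]
x = np.linspace(-1.0, 1.0, 64)[None, :]
ground = np.exp(-((t - 0.2) / 0.02) ** 2) * np.ones_like(x)        # flat ground return
target = 0.3 * np.exp(-((t - (0.5 + 0.2 * x**2)) / 0.02) ** 2)     # hyperbola-like target
noise = 0.01 * np.random.default_rng(2).standard_normal((256, 64))
bscan = ground + target + noise

cleaned, clutter = remove_clutter_svd(bscan, rank=2)
print(f"B-scan energy: {np.linalg.norm(bscan):.2f}, "
      f"after clutter removal: {np.linalg.norm(cleaned):.2f}")
```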