
    Foggy clouds and cloudy fogs: a real need for coordinated management of fog-to-cloud computing systems

    The recent advances in cloud services technology are fueling a plethora of information technology innovation, including networking, storage, and computing. Today, various flavors of IoT, cloud computing, and so-called fog computing have evolved, the latter referring to the capability of edge devices and users' clients to compute, store, and exchange data among each other and with the cloud. Although the rapid pace of this evolution was not easily foreseeable, today each piece of it facilitates and enables the deployment of what we commonly refer to as a smart scenario, including smart cities, smart transportation, and smart homes. As most current cloud, fog, and network services run simultaneously in each scenario, we observe that we are at the dawn of what may be the next big step in the cloud computing and networking evolution, whereby services might be executed at the network edge, both in parallel and in a coordinated fashion, and supported by the unstoppable evolution of technology. As edge devices become richer in functionality and smarter, embedding capacities such as storage or processing, as well as new functionalities, such as decision making, data collection, forwarding, and sharing, a real need is emerging for coordinated management of fog-to-cloud (F2C) computing systems. This article introduces a layered F2C architecture, its benefits and strengths, as well as the open research challenges that arise, making the case for the real need for coordinated management. Our architecture, the illustrative use case presented, and a comparative performance analysis, albeit conceptual, all clearly show the way forward toward a new IoT scenario with a set of existing and unforeseen services provided on highly distributed and dynamic compute, storage, and networking resources, bringing together heterogeneous and commodity edge devices, emerging fogs, as well as conventional clouds. Peer reviewed. Postprint (author's final draft).
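
    The abstract describes layered F2C coordination only conceptually; the sketch below is a hypothetical illustration, not taken from the article, of the kind of placement decision such a coordinator would make: choosing the closest layer (edge, fog, or cloud) that satisfies a task's latency and capacity requirements. All class names, numbers, and the selection rule are assumptions for illustration only.

    # Hypothetical sketch of layered F2C task placement (not from the article):
    # a coordinator walks the layers from lowest latency (edge) to highest (cloud)
    # and picks the first one that meets the task's latency bound and has capacity.
    from dataclasses import dataclass

    @dataclass
    class Layer:
        name: str
        latency_ms: float     # assumed round-trip latency from the task's origin
        free_cpu_cores: int   # currently available compute capacity

    @dataclass
    class Task:
        name: str
        max_latency_ms: float
        cpu_cores: int

    def place(task, layers):
        """Return the name of the closest layer that can host the task, or 'rejected'."""
        for layer in sorted(layers, key=lambda l: l.latency_ms):
            if layer.latency_ms <= task.max_latency_ms and layer.free_cpu_cores >= task.cpu_cores:
                layer.free_cpu_cores -= task.cpu_cores
                return layer.name
        return "rejected"

    layers = [Layer("edge", 2, 2), Layer("fog", 10, 8), Layer("cloud", 80, 1000)]
    print(place(Task("sensor-fusion", 5, 1), layers))      # -> edge
    print(place(Task("video-analytics", 50, 6), layers))   # -> fog
    print(place(Task("batch-training", 1000, 64), layers)) # -> cloud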

    Word Limited: An Empirical Analysis of the Relationship Between the Length, Resiliency, and Impact of Federal Regulations

    Since the rise of the modern administrative state, we have seen a demonstrable trend towards lengthier regulations. However, popular critiques of the administrative state that focus on the overall size of the Federal Register are misguided. They rest on the premise that more, and longer, regulations unduly burden industry and the economy in general, whereas the movement towards lengthier and more detailed regulations could be rational and largely unproblematic. This study tests two potential rational explanations for the trend towards longer regulations, dubbed (1) “the insulation hypothesis” and (2) “the socially beneficial hypothesis.” Each of these explanations embodies a theoretically rational decision. First, the insulation hypothesis rests on the idea that it would make sense for policy-makers to include more detailed legal and scientific support in new regulations, and thereby increase their length relative to previous regulations, if the additional detail provided more insulation from judicial review. Second, the socially beneficial hypothesis rests on the idea that devoting relatively more time and resources to each new rule would be appropriate if longer, newer regulations produced more net social benefits than older, shorter ones. The empirical analysis set forth in this article combines data from a number of publicly available sources to test these hypotheses. The results, confirming “the socially beneficial hypothesis,” add to the canon of empirical analysis of administrative law, building on the work of Cass Sunstein, Cary Coglianese, and others. Recognizing an overly burdensome regulatory state, an undoubtedly worthwhile and vital check in a democratic society, requires more than simply counting the pages of regulations. The results of this study should put some minds at ease, at least with respect to EPA regulations; they should also help better direct our scrutiny in the future.

    Microlensing optical depth towards the Galactic bulge from MOA observations during 2000 with Difference Image Analysis

    We analyze the data of the gravitational microlensing survey carried out by the MOA group during 2000 towards the Galactic Bulge (GB). Our observations are designed to detect efficiently high magnification events with faint source stars and short timescale events, by increasing the sampling rate up to 6 times per night and using Difference Image Analysis (DIA). We detect 28 microlensing candidates in 12 GB fields corresponding to 16 deg^2. We use Monte Carlo simulations to estimate our microlensing event detection efficiency, where we construct the I-band extinction map of our GB fields in order to find dereddened magnitudes. We find a systematic bias and large uncertainty in the measured value of the timescale t_{\rm Eout} in our simulations. They are associated with blending and unresolved sources, and are allowed for in our measurements. We compute an optical depth tau = 2.59_{-0.64}^{+0.84} \times 10^{-6} towards the GB for events with timescales 0.3<t_E<200 days. We consider disk-disk lensing, and obtain an optical depth tau_{bulge} = 3.36_{-0.81}^{+1.11} \times 10^{-6} [0.77/(1-f_{disk})] for the bulge component assuming a 23% stellar contribution from disk stars. These observed optical depths are consistent with previous measurements by the MACHO and OGLE groups, and still higher than those predicted by existing Galactic models. We present the timescale distribution of the observed events, and find no significant short events of a few days' duration, in spite of our high detection efficiency for short timescale events down to t_E = 0.3 days. We find that half of all our detected events have high magnification (>10). These events are useful for studies of extra-solar planets. Comment: 65 pages and 30 figures, accepted for publication in ApJ. A systematic bias and uncertainty in the optical depth measurement have been quantified by simulation.
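
    The bracketed disk-fraction term can be read as a simple rescaling of the measured optical depth by the assumed fraction of bulge source stars. The arithmetic sketch below is an interpretation of the abstract's formula, assuming the quoted 23% disk contribution, not a calculation reproduced from the paper.

    # Sketch of the disk-fraction scaling quoted above (interpretation, not from the paper).
    tau_total = 2.59e-6   # measured optical depth for 0.3 < t_E < 200 days
    f_disk = 0.23         # assumed fractional contribution of disk source stars

    # Attributing only the (1 - f_disk) bulge-source fraction to the bulge component:
    tau_bulge = tau_total / (1.0 - f_disk)
    print(f"tau_bulge = {tau_bulge:.2e}")   # ~3.36e-6, i.e. 3.36e-6 * [0.77 / (1 - f_disk)]

    # The bracketed factor shows how tau_bulge rescales for a different assumed f_disk:
    print(f"{3.36e-6 * 0.77 / (1.0 - 0.30):.2e}")   # value if f_disk were 0.30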

    CARMA observations of massive Planck-discovered cluster candidates at z>0.5 associated with WISE overdensities: strategy, observations and validation

    We present 1-2 arcmin spatial resolution CARMA-8 31-GHz observations towards 19 unconfirmed Planck cluster candidates, selected to have significant galaxy overdensities in the WISE early data release and thought to be at z>1 from the WISE colors of the putative brightest cluster galaxy (BCG). We find a Sunyaev-Zeldovich (SZ) detection in the CARMA-8 data towards 9 candidate clusters, where one detection is considered tentative. For each cluster candidate we present CARMA-8 maps, a study of their radio-source environment, and an assessment of the reliability of the SZ detection. The CARMA SZ detections appear to be SZ-bright, with a mean primary-beam-corrected peak flux density of the decrement of -2.9 mJy/beam and a standard deviation of 0.8 mJy/beam, and are typically offset from the Planck position by approximately 80 arcsec. Using archival imaging data in the vicinity of the CARMA SZ centroids, we present evidence that one cluster matches Abell 586, a known z~0.2 cluster; four candidate clusters are likely to have 0.3<z<0.7; and, for the remaining four, the redshift information is inconclusive. We also argue that the sensitivity limits resulting from the cross-correlation between Planck and WISE make it challenging to use our selection criterion to identify clusters at z > 1. Comment: 29 pages, MNRAS, in press.

    The single currency and European citizenship

    One might expect that the introduction of the single currency would have been accompanied by a significant body of studies and research on the implications and impact of such a watershed event for European citizenship. On the contrary, we soon discover that we are facing a paradox, which could be phrased as follows: while building European citizenship is the very rationale for the project of the single currency, scholars, but also the policy community, have mostly underestimated if not neglected this relation, both in terms of public policy making and discourse and in terms of interpretation and forecasting. As a consequence, relevant features of the single currency have remained hidden, poorly considered, and almost never thematized. In order to fill this gap, the first part of this article presents the main findings that emerged from documentary research conducted by FONDACA between 2010 and 2011, aimed at mapping the existing academic and policy thematizations of the hidden dimensions of the euro. The second part is devoted to defining “the other side of the coin” as an empirical phenomenon.

    The DELPHI Silicon Tracker in the global pattern recognition

    ALEPH and DELPHI were the first experiments to operate a silicon vertex detector at LEP. During the past 10 years of data taking, the DELPHI Silicon Tracker was upgraded three times to follow the different tracking requirements of LEP 1 and LEP 2 as well as to improve the tracking performance. Several steps in the development of the pattern recognition software were taken in order to understand and fully exploit the Silicon Tracker information. This article gives an overview of the final algorithms and concepts of the track reconstruction using the Silicon Tracker in DELPHI. Comment: Talk given at the 8th International Workshop on Vertex Detectors, Vertex'99, Texel, the Netherlands.
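
    The overview above does not spell out the reconstruction algorithms themselves. Purely as a hypothetical illustration of one generic ingredient of silicon-based pattern recognition, and not the DELPHI implementation, the sketch below extrapolates a simplified straight-line track to each silicon layer and attaches the nearest hit inside a fixed search window; the layer positions, window size, and track model are invented for the example.

    # Hypothetical sketch of generic hit-to-track association (not DELPHI's algorithm).
    def extrapolate(track, x_layer):
        """Evaluate a straight-line track y = y0 + slope * x at a layer position."""
        y0, slope = track
        return y0 + slope * x_layer

    def associate_hits(track, layers, window=0.5):
        """For each layer, attach the hit closest to the extrapolated track within `window` (mm)."""
        attached = []
        for x_layer, hits in layers:
            y_pred = extrapolate(track, x_layer)
            best = min(hits, key=lambda y: abs(y - y_pred), default=None)
            if best is not None and abs(best - y_pred) < window:
                attached.append((x_layer, best))
        return attached

    # Dummy track (intercept, slope) and three layers with measured hit coordinates in mm.
    track = (0.1, 0.02)
    layers = [(63.0, [1.2, 1.39]), (90.0, [1.85, 3.0]), (109.0, [2.31])]
    print(associate_hits(track, layers))   # hits compatible with the extrapolated track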
