
    A methodological approach to BISDN signalling performance

    Sophisticated signalling protocols are required to properly handle the complex multimedia, multiparty services supported by the forthcoming BISDN. The implementation feasibility of these protocols should be evaluated during their design phase, so that possible performance bottlenecks are identified and removed. In this paper we present a methodology for evaluating the performance of BISDN signalling systems under design. New performance parameters are introduced, and their network-dependent values are extracted through a message flow model that can describe the impact of separating call and bearer control on signalling performance. Signalling protocols are modelled through a modular decomposition of the seven OSI layers, including the service user, into three submodels. The workload model is user-descriptive in the sense that it does not approximate the direct input traffic required for evaluating the performance of a layer protocol; instead, through a multi-level approach, it describes the actual implications of user signalling activity for the overall signalling traffic. The signalling protocol model is derived from the global functional model of the signalling protocols and information flows, using a network of queues incorporating synchronization and dependency functions. The same queueing approach is followed for the signalling transfer network and is used to define processing speed and signalling bandwidth requirements and to identify possible performance bottlenecks stemming from the realization of the related protocols.
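    The queueing approach summarised above can be illustrated with a minimal sketch. The Python snippet below is hypothetical and not taken from the paper: it treats the signalling path as an open tandem of M/M/1 queues standing in for call control, bearer control and the signalling transfer network, and reports per-stage utilisation and mean sojourn time so the likely bottleneck stage can be spotted. All arrival and service rates are assumed values.

```python
# Minimal sketch (not from the paper): an open tandem of M/M/1 queues as a
# stand-in for the signalling sub-models, used to spot the stage whose
# utilisation makes it the performance bottleneck. Rates are illustrative.

def tandem_mm1(arrival_rate, service_rates):
    """Return per-stage (utilisation, mean sojourn time) for a Jackson tandem."""
    stages = []
    for mu in service_rates:
        if arrival_rate >= mu:
            raise ValueError(f"unstable stage: lambda={arrival_rate} >= mu={mu}")
        rho = arrival_rate / mu              # utilisation
        sojourn = 1.0 / (mu - arrival_rate)  # mean time in stage (queue + service)
        stages.append((rho, sojourn))
    return stages

if __name__ == "__main__":
    lam = 120.0                      # signalling messages per second (assumed)
    mus = [200.0, 150.0, 400.0]      # call control, bearer control, transfer network
    for i, (rho, w) in enumerate(tandem_mm1(lam, mus), start=1):
        print(f"stage {i}: utilisation {rho:.2f}, mean sojourn {1e3 * w:.1f} ms")
    # The stage with the highest utilisation (here bearer control) is the
    # candidate bottleneck whose processing-speed requirement must be raised.
```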

    Cyberscience and the Knowledge-Based Economy, Open Access and Trade Publishing: From Contradiction to Compatibility with Nonexclusive Copyright Licensing

    Open source, open content and open access are set to fundamentally alter the conditions of knowledge production and distribution. They are also the most tangible result of the shift towards e-Science and digital networking. Yet widespread misperceptions exist about the impact of this shift on knowledge distribution and scientific publishing. It is argued, on the one hand, that for the academy there is in principle no digital dilemma surrounding copyright, and no contradiction between open science and the knowledge-based economy if profits are made from nonexclusive rights. On the other hand, pressure for the ‘digital doubling’ of research articles in Open Access repositories (the ‘green road’) is misguided, and the current model of Open Access publishing (the ‘gold road’) has little future outside biomedicine. Commercial publishers must understand that business models based on the transfer of copyright have little future either. Digital technology and its economics favour the severance of distribution from certification. What is required of universities and governments, scholars and publishers, is to clear the way for digital innovations in knowledge distribution and scholarly publishing by enabling the emergence of a competitive market based on nonexclusive rights. This requires no change in the law but merely an end to the praxis of copyright transfer and exclusive licensing. The best way forward for research organisations, universities and scientists is the adoption of standard copyright licenses that reserve some rights, namely Attribution and No Derivative Works, but otherwise allow for the unlimited reproduction, dissemination and re-use of the research article, commercial uses included.

    Molecfit: A general tool for telluric absorption correction. I. Method and application to ESO instruments

    Context: The interaction of light from astronomical objects with the constituents of the Earth's atmosphere leads to the formation of telluric absorption lines in ground-based spectra. Correcting for these lines, which mostly affect the red and infrared regions of the spectrum, usually relies on observations of specific stars obtained close in time and airmass to the science targets, and therefore uses precious observing time. Aims: We present molecfit, a tool for correcting telluric absorption lines based on synthetic modelling of the Earth's atmospheric transmission. Molecfit is versatile and can be used with data obtained with various ground-based telescopes and instruments. Methods: Molecfit combines a publicly available radiative transfer code, a molecular line database, atmospheric profiles, and various kernels to model the instrument line spread function. The atmospheric profiles are created by merging a standard atmospheric profile representative of a given observatory's climate with local meteorological data and dynamically retrieved altitude profiles for temperature, pressure, and humidity. We discuss the various ingredients of the method, its applicability, and its limitations. We also show examples of telluric line correction on spectra obtained with a suite of ESO Very Large Telescope (VLT) instruments. Results: Compared to previous similar tools, molecfit takes the best available data on temperature, pressure, and humidity in the atmosphere above the observatory into account. As a result, the standard deviation of the residuals after correction of unsaturated telluric lines is frequently better than 2% of the continuum. Conclusion: Molecfit is able to accurately model and correct for telluric lines over a broad range of wavelengths and spectral resolutions. (Abridged) Comment: 18 pages, 13 figures, 5 tables, accepted for publication in Astronomy and Astrophysics.
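    As a rough illustration of the correction principle described above (a sketch only, not molecfit's actual implementation), the snippet below builds a toy transmission spectrum, convolves it with a Gaussian stand-in for the instrument line spread function, and divides a simulated observation by that model. All line parameters, the kernel width and the noise level are assumed values.

```python
import numpy as np

# Sketch of the telluric-correction principle (not the molecfit code itself):
# divide the observed spectrum by a modelled atmospheric transmission that has
# been convolved with the instrument line spread function. Numbers are assumed.

wave = np.linspace(930.0, 940.0, 2000)          # wavelength grid [nm]

def toy_transmission(w):
    """Toy telluric transmission: a few Gaussian absorption lines near 935 nm."""
    t = np.ones_like(w)
    for centre, depth, width in [(933.5, 0.4, 0.05), (935.0, 0.7, 0.04), (936.8, 0.3, 0.06)]:
        t *= 1.0 - depth * np.exp(-0.5 * ((w - centre) / width) ** 2)
    return t

def gaussian_kernel(sigma_pix, half_width=50):
    """Gaussian kernel standing in for the instrument line spread function."""
    x = np.arange(-half_width, half_width + 1)
    k = np.exp(-0.5 * (x / sigma_pix) ** 2)
    return k / k.sum()

lsf = gaussian_kernel(sigma_pix=4.0)
model_transmission = np.convolve(toy_transmission(wave), lsf, mode="same")

# Simulated observation: flat continuum times transmission plus noise.
rng = np.random.default_rng(0)
observed = model_transmission + rng.normal(0.0, 0.005, wave.size)

corrected = observed / np.clip(model_transmission, 0.02, None)  # avoid deep line cores
print(f"std of residuals after correction: {100 * np.std(corrected - 1.0):.2f}% of continuum")
```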

    Analysis domain model for shared virtual environments

    The field of shared virtual environments, which also encompasses online games and social 3D environments, has a system landscape consisting of multiple solutions with substantial functional overlap. However, there is little interoperability between the different systems. A shared virtual environment has an associated problem domain that is highly complex, raising difficult challenges for the development process, starting with the architectural design of the underlying system. This paper makes two main contributions. The first is a broad domain analysis of shared virtual environments, which enables developers to understand the whole rather than only its parts. The second is a reference domain model for discussing and describing solutions: the Analysis Domain Model.
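    To make the idea of a reference domain model more concrete, here is a small, hypothetical sketch of concepts most shared virtual environments have in common. The class names and relations are illustrative assumptions and are not taken from the paper's Analysis Domain Model.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

# Hypothetical mini domain model for a shared virtual environment; names and
# relations are illustrative only, not the paper's Analysis Domain Model.

@dataclass
class Entity:
    entity_id: str
    position: Tuple[float, float, float] = (0.0, 0.0, 0.0)

@dataclass
class Avatar(Entity):
    display_name: str = ""

@dataclass
class World:
    name: str
    entities: Dict[str, Entity] = field(default_factory=dict)

    def spawn(self, entity: Entity) -> None:
        self.entities[entity.entity_id] = entity

@dataclass
class Session:
    """A user's connection to a world; interoperability between systems would
    require a shared protocol for exchanging these state updates."""
    user_id: str
    world: World
    event_log: List[str] = field(default_factory=list)

    def move(self, entity_id: str, new_position: Tuple[float, float, float]) -> None:
        self.world.entities[entity_id].position = new_position
        self.event_log.append(f"{self.user_id} moved {entity_id} to {new_position}")

world = World("plaza")
world.spawn(Avatar(entity_id="a1", display_name="Alice"))
session = Session("alice", world)
session.move("a1", (1.0, 0.0, 2.0))
print(session.event_log[-1])
```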

    Models for the modern power grid

    This article reviews different kinds of models for the electric power grid that can be used to understand the modern power system, the smart grid. From the physical network to abstract energy markets, we identify in the literature different aspects that co-determine the spatio-temporal multilayer dynamics of the power system. We start our review by showing how the generation, transmission and distribution characteristics of traditional power grids already exhibit complex behaviour arising from the interplay between node dynamics and topology, namely synchronisation and cascade effects. When dealing with smart grids, the system complexity increases even further: on top of the physical network of power lines and controllable sources of electricity, modernisation brings information networks, intermittent renewable generation, market liberalisation and prosumers, among other aspects. In this case, we forecast a dynamical co-evolution of the smart grid with other kinds of networked systems that cannot be understood in isolation. This review compiles recent results that model electric power grids as complex systems, going beyond purely technological aspects. From this perspective, we then indicate possible ways to incorporate the diverse co-evolving systems into the smart grid model using, for example, network theory and multi-agent simulation. Comment: Submitted to EPJ-ST Power Grids, May 201
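    Synchronisation of generators and loads of the kind mentioned above is commonly studied with second-order Kuramoto-like (swing-equation) oscillator models. The sketch below integrates such a model on a small ring network; the topology, inertia-free formulation, damping, coupling and power injections are assumed values for illustration and are not taken from the review.

```python
import numpy as np

# Sketch of a second-order Kuramoto / swing-equation model often used to study
# synchronisation in power grids. The ring topology and all parameters are
# illustrative assumptions.

N = 6
alpha, K = 0.1, 1.2                                 # damping, coupling strength
P = np.array([1.0, -0.5, 1.0, -0.5, -0.5, -0.5])    # net power injections (sum to 0)
A = np.zeros((N, N))
for i in range(N):                                  # ring network adjacency
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1.0

def derivs(theta, omega):
    """d theta/dt = omega;  d omega/dt = P - alpha*omega + K * sum_j A_ij sin(theta_j - theta_i)."""
    coupling = K * np.sum(A * np.sin(theta[None, :] - theta[:, None]), axis=1)
    return omega, P - alpha * omega + coupling

theta = np.zeros(N)
omega = np.zeros(N)
dt, steps = 0.01, 20000
for _ in range(steps):                              # simple Euler integration
    dtheta, domega = derivs(theta, omega)
    theta = theta + dt * dtheta
    omega = omega + dt * domega

order_parameter = np.abs(np.mean(np.exp(1j * theta)))   # phase coherence r in [0, 1]
print(f"phase coherence r = {order_parameter:.3f}, max |frequency deviation| = {np.max(np.abs(omega)):.3f}")
```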

    A novel satellite mission concept for upper air water vapour, aerosol and cloud observations using integrated path differential absorption LiDAR limb sounding

    We propose a new satellite mission to deliver high-quality measurements of upper air water vapour. The concept centres on a LiDAR operating in a limb-sounding occultation geometry, designed as a very long path system for differential absorption measurements. We present a preliminary performance analysis with a system sized to send 75 mJ pulses at 25 Hz at four wavelengths close to 935 nm to up to 5 microsatellites in a counter-rotating orbit, carrying retroreflectors characterized by a reflected beam divergence of roughly twice the emitted laser beam divergence of 15 µrad. This provides water vapour profiles with a vertical sampling of 110 m; preliminary calculations suggest that the system could detect concentrations of less than 5 ppm. A secondary payload of a fairly conventional medium resolution multispectral radiometer allows wide-swath cloud and aerosol imaging. The total weight and power of the system are estimated at 3 tons and 2,700 W, respectively. This novel concept presents significant challenges, including the performance of the lasers in space, the tracking between the main spacecraft and the retroreflectors, the refractive effects of turbulence, and the design of the telescopes needed to achieve a high signal-to-noise ratio for the high-precision measurements. The mission concept was conceived at the Alpbach Summer School 2010.
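    The differential-absorption principle behind such a measurement can be sketched in a few lines: the ratio of the signals received at an on-line and a nearby off-line wavelength yields the mean absorber density along the path via the Beer-Lambert law. The cross-sections, path length, signal values and ambient air density below are illustrative assumptions, not the mission's actual retrieval.

```python
import math

# Sketch of the integrated-path differential-absorption principle: the
# on-/off-line signal ratio gives the mean water-vapour number density along
# the path. All numerical values are assumed for illustration.

sigma_on = 3.0e-26    # absorption cross-section at the on-line wavelength [m^2]
sigma_off = 1.0e-27   # cross-section at the nearby off-line wavelength [m^2]
path_length = 300e3   # effective absorption path through the layer of interest [m]

# Hypothetical received pulse energies (after range/geometry normalisation).
signal_on, signal_off = 0.62, 0.95

# Beer-Lambert: S_on / S_off = exp(-(sigma_on - sigma_off) * n * L)
number_density = -math.log(signal_on / signal_off) / ((sigma_on - sigma_off) * path_length)

# Convert to a volume mixing ratio against an assumed ambient air number density.
air_density = 5.0e24  # air molecules per m^3 at the probed altitude (assumed)
print(f"water vapour number density ~ {number_density:.3e} m^-3")
print(f"volume mixing ratio ~ {1e6 * number_density / air_density:.1f} ppm")
```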
