6 research outputs found

    Web Workload Generation According to the UniLoG Approach

    Generating synthetic loads which are sufficiently close to reality represents an important and challenging task in performance and quality-of-service (QoS) evaluations of computer networks and distributed systems. Here, the load to be generated represents sequences of requests at a well-defined service interface within a network node. The paper presents a tool (UniLoG.HTTP) which can be used in a flexible manner to generate realistic and representative server and network loads, in terms of access requests to Web servers as well as the creation of typical Web traffic within a communication network. The paper describes the architecture of this load generator, the critical design decisions and the solution approaches that allowed us to obtain the desired flexibility.
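    The abstract stops short of implementation detail, but the core idea of emitting a sequence of requests at a well-defined service interface can be illustrated with a minimal Python sketch; the target URLs, the exponential think-time model and the request count below are illustrative assumptions and not part of UniLoG.HTTP.

        import random
        import time
        import urllib.request

        # Hypothetical target URLs; a real load model would drive this choice.
        URLS = [
            "http://localhost:8080/index.html",
            "http://localhost:8080/search?q=example",
        ]

        def generate_load(n_requests, mean_think_time=1.0):
            """Issue HTTP GET requests separated by exponentially distributed
            think times, a common simplification of user behaviour in web
            workload models."""
            for _ in range(n_requests):
                url = random.choice(URLS)
                try:
                    with urllib.request.urlopen(url, timeout=5) as resp:
                        print(url, resp.status)
                except OSError as exc:
                    print(url, "failed:", exc)
                time.sleep(random.expovariate(1.0 / mean_think_time))

        if __name__ == "__main__":
            generate_load(10)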

    AI for IT Operations (AIOps) on Cloud Platforms: Reviews, Opportunities and Challenges

    Artificial Intelligence for IT operations (AIOps) aims to combine the power of AI with the big data generated by IT Operations processes, particularly in cloud infrastructures, to provide actionable insights with the primary goal of maximizing availability. There are a wide variety of problems to address, and multiple use cases where AI capabilities can be leveraged to enhance operational efficiency. Here we provide a review of the AIOps vision, its trends, challenges and opportunities, specifically focusing on the underlying AI techniques. We discuss in depth the key types of data emitted by IT Operations activities, the scale and challenges in analyzing them, and where they can be helpful. We categorize the key AIOps tasks as incident detection, failure prediction, root cause analysis and automated actions. We discuss the problem formulation for each task, and then present a taxonomy of techniques to solve these problems. We also identify relatively underexplored topics, especially those that could significantly benefit from advances in the AI literature. We also provide insights into the trends in this field and the key investment opportunities.
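    As a point of reference for the incident-detection task named in the abstract, the sketch below flags anomalies in a univariate metric stream with a rolling z-score; the window size and threshold are illustrative assumptions, and real AIOps pipelines use considerably richer models.

        from collections import deque
        from statistics import mean, stdev

        def detect_incidents(samples, window=30, threshold=3.0):
            """Return (index, value) pairs whose rolling z-score exceeds the
            threshold; a deliberately simplified stand-in for incident
            detection on a single monitored metric."""
            history = deque(maxlen=window)
            incidents = []
            for i, value in enumerate(samples):
                if len(history) >= 2:
                    mu, sigma = mean(history), stdev(history)
                    if sigma > 0 and abs(value - mu) / sigma > threshold:
                        incidents.append((i, value))
                history.append(value)
            return incidents

        # Example: a latency series with one obvious spike.
        print(detect_incidents([10, 11, 9, 10, 12, 11, 10, 95, 10, 11], window=5))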

    Journal of Telecommunications and Information Technology, 2006, no. 2

    Quarterly.

    Weak Gravitational Lensing by Large-Scale Structures: A Tool for Constraining Cosmology

    There is now very strong evidence that our Universe is undergoing an accelerated expansion period as if it were under the influence of a gravitationally repulsive “dark energy” component. Furthermore, most of the mass of the Universe seems to be in the form of non-luminous matter, the so-called “dark matter”. Together, these “dark” components, whose nature remains unknown today, represent around 96% of the matter-energy budget of the Universe. Unraveling the true nature of the dark energy and dark matter has thus, obviously, become one of the primary goals of present-day cosmology. Weak gravitational lensing, or weak lensing for short, is the effect whereby light emitted by distant galaxies is slightly deflected by the tidal gravitational fields of intervening foreground structures. Because it only relies on the physics of gravity, weak lensing has the unique ability to probe the distribution of mass in a direct and unbiased way. This technique is at present routinely used to study the dark matter, typical applications being the mass reconstruction of galaxy clusters and the study of the properties of dark halos surrounding galaxies. Another and more recent application of weak lensing, on which we focus in this thesis, is the analysis of the cosmological lensing signal induced by large-scale structures, the so-called “cosmic shear”. This signal can be used to measure the growth of structures and the expansion history of the Universe, which makes it particularly relevant to the study of dark energy. Of all weak lensing effects, the cosmic shear is the most subtle and its detection requires the accurate analysis of the shapes of millions of distant, faint galaxies in the near infrared. So far, the main factor limiting cosmic shear measurement accuracy has been the relatively small sky areas covered. Next-generation wide-field, multicolor surveys will, however, overcome this hurdle by covering a much larger portion of the sky with improved image quality. The resulting statistical errors will then become subdominant compared to systematic errors, the latter becoming instead the main source of uncertainty. In fact, uncovering key properties of dark energy will only be achievable if these systematics are well understood and reduced to the required level. The major sources of uncertainty reside in the shape measurement algorithm used, the convolution of the original image by the instrumental and possibly atmospheric point spread function (PSF), the pixelation effect caused by the integration of light falling on the detector pixels, and the degradation caused by various sources of noise. Measuring the cosmic shear thus entails solving the difficult inverse problem of recovering the shear signal from blurred, pixelated and noisy galaxy images while keeping errors within the limits demanded by future weak lensing surveys. Reaching this goal is not without challenges. In fact, the best available shear measurement methods would need a tenfold improvement in accuracy to match the requirements of a space mission like Euclid from ESA, scheduled at the end of this decade. Significant progress has nevertheless been made in the last few years, with substantial contributions from initiatives such as the GREAT (GRavitational lEnsing Accuracy Testing) challenges. The main objective of these open competitions is to foster the development of new and more accurate shear measurement methods.
    We start this work with a quick overview of modern cosmology: its fundamental tenets, achievements and the challenges it faces today. We then review the theory of weak gravitational lensing and explain how it can make use of cosmic shear observations to place constraints on cosmology. The last part of this thesis focuses on the practical challenges associated with the accurate measurement of the cosmic shear. After a review of the subject, we present the main contributions we have brought in this area: the development of the gfit shear measurement method, and new algorithms for point spread function (PSF) interpolation and image denoising. The gfit method emerged as one of the top performers in the GREAT10 Galaxy Challenge. It essentially consists in fitting two-dimensional elliptical Sérsic light profiles to observed galaxy images in order to produce estimates for the shear power spectrum. PSF correction is automatic and an efficient shape-preserving denoising algorithm can optionally be applied prior to fitting the data. PSF interpolation is also an important issue in shear measurement because the PSF is only known at star positions while PSF correction has to be performed at any position on the sky. We developed innovative PSF interpolation algorithms on the occasion of the GREAT10 Star Challenge, a competition dedicated to the PSF interpolation problem. Our participation was very successful since one of our interpolation methods won the Star Challenge while the remaining four achieved the next-highest scores of the competition. Finally, we participated in the development of a wavelet-based, shape-preserving denoising method particularly well suited to weak lensing analysis.
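    The gfit method itself is not spelled out in the abstract; as background, the radial Sérsic profile that such model-fitting approaches rely on is sketched below, using the common approximation b_n ≈ 2n - 1/3 and purely illustrative parameter values.

        import numpy as np

        def sersic_profile(r, i_e, r_e, n):
            """Radial Sersic surface-brightness profile
            I(r) = I_e * exp(-b_n * ((r / r_e)**(1/n) - 1)),
            where r_e is the half-light radius and n the Sersic index
            (n=1 exponential disc, n=4 de Vaucouleurs). b_n uses the rough
            approximation 2n - 1/3, adequate for moderate n."""
            b_n = 2.0 * n - 1.0 / 3.0
            return i_e * np.exp(-b_n * ((np.asarray(r) / r_e) ** (1.0 / n) - 1.0))

        # Illustrative evaluation of an n=4 (de Vaucouleurs-like) profile.
        radii = np.linspace(0.1, 5.0, 50)
        profile = sersic_profile(radii, i_e=1.0, r_e=1.0, n=4.0)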

    Information systems failure, politics and the sociology of translation: the problematic introduction of an American computerised reservation system and yield management at French Railways

    This in-depth case study examines the troubled introduction of a new computerised reservation system at French Railways. Socrate, based on the American Airlines Sabre system, had a disastrous beginning. It was badly received by the French public, led to strikes and government inquiries, and had to be modified substantially. The literature on information systems failure is reviewed from functionalist to social constructivist and critical perspectives, and the thesis aims to challenge beliefs and assumptions about technological success and failure. The notion of 'symmetry' from the sociology of technology emphasises that failures express the same dynamics as successes, showing how technological choices are not obvious or unproblematic.
    Differences between air and rail transport, between American and European transport deregulation, and between the needs of national identity, regional development and public access to transport are all reflected in the question of yield management. Yield management is a crucial component of computerised reservation systems and was first adopted during the deregulation of the US air transport industry in the early 80s. It requires complex optimisation software designed to manage passenger revenues and control demand, by manipulating the availability of full and discounted fares according to monitored demand and statistical analysis.
    Latour and Callon's sociology of 'translation' helps analyse how the Socrate project was undertaken and interpreted as: borrowing from airline pricing, aiming to gain competitive advantage, associating Socrate to the success of high-speed trains, attempting to change passengers' buying and travelling behaviour, transforming the organisation and helping identify profitable market segments. A non-essentialist stance helps understand how social and technical distinctions are socially constructed and how the differentiation between what is technical and what is social, for instance in the conception and application of yield management, is a matter of power and politics. Clegg's circuits of power are used to complement the sociology of translation in examining how power and political factors contribute to information systems becoming (or not) obligatory passage points.
    Politically controversial changes in French rail transport are associated with the role of computer technology in deregulated European and global electronic markets and its effects on the concept of national identity and sovereignty in transport policy-making.
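    The abstract describes yield management only in outline; the classic two-fare-class rule below (Littlewood's rule) illustrates how fare availability can be tied to monitored demand. The normal demand model and all prices are chosen purely for illustration, with no claim that Socrate used this particular rule.

        from statistics import NormalDist

        def protection_level(full_fare, discount_fare, mu_full, sigma_full):
            """Littlewood's rule for two fare classes: protect y* seats for the
            full fare, where P(full-fare demand > y*) = discount_fare / full_fare.
            Full-fare demand is modelled as Normal(mu_full, sigma_full), an
            illustrative assumption."""
            ratio = discount_fare / full_fare
            return NormalDist(mu_full, sigma_full).inv_cdf(1.0 - ratio)

        # Example: full fare 100, discount fare 60, full-fare demand ~ N(40, 10);
        # discounted tickets stop selling once only y* seats remain unsold.
        y_star = protection_level(100.0, 60.0, mu_full=40.0, sigma_full=10.0)
        print(round(y_star, 1))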