
    A Survey of Distributed Network and Systems Management Paradigms

    Since the mid-1990s, network and systems management has steadily evolved from a centralized paradigm, where all management processing takes place in a single management station, to distributed paradigms, where management is distributed over a potentially large number of nodes. Some of these paradigms, epitomized by the SNMPv2 and CMIP protocols, have been around for several years, whereas a flurry of new ones, based on mobile code, distributed objects, or intelligent agents, has only recently emerged. The goal of this survey is to classify all major network and systems management paradigms known to date, in order to help network and systems administrators design a management application. In the first part of the survey, we present a simple typology based on a single criterion: the organizational model. In this typology, all paradigms are grouped into four types: centralized paradigms, weakly distributed hierarchical paradigms, strongly distributed hierarchical paradigms, and cooperative paradigms. In the second part of the survey, we gradually build an enhanced typology based on four criteria: delegation granularity, semantic richness of the information model, degree of specification of a task, and degree of automation of management. Finally, we show how to use our typologies to select a management paradigm in a given context. Keywords: Distributed Network Management, Distributed Systems Management, Integrated Management, Mobile Code, Distributed Objects, Intelligent Agents, Typology
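The organizational distinction the survey draws can be sketched in code. The following is an illustrative toy, not taken from the survey: `Node`, `poll`, and `delegate` are hypothetical names, and the "delegated task" stands in for mobile code or a management script pushed to the node.

```java
import java.util.*;
import java.util.function.Function;

// Contrast of two paradigm types from the typology:
// centralized polling vs. strongly distributed management by delegation.
class ManagementParadigms {
    static class Node {
        final Map<String, Integer> mib = new HashMap<>(); // toy MIB variables
        Node(int load) { mib.put("cpuLoad", load); }

        // Centralized paradigm: the station pulls raw data from each node.
        int poll(String var) { return mib.get(var); }

        // Strongly distributed paradigm: the station pushes a task
        // (standing in for mobile code) and only the result travels back.
        int delegate(Function<Map<String, Integer>, Integer> task) {
            return task.apply(mib);
        }
    }

    // Centralized: N raw values cross the network; processing at the station.
    static int maxLoadCentralized(List<Node> nodes) {
        int max = 0;
        for (Node n : nodes) max = Math.max(max, n.poll("cpuLoad"));
        return max;
    }

    // Delegated: each node evaluates the threshold locally; only a
    // 0/1 result per node returns to the station.
    static int overloadedDelegated(List<Node> nodes, int threshold) {
        return nodes.stream()
                .mapToInt(n -> n.delegate(m -> m.get("cpuLoad") > threshold ? 1 : 0))
                .sum();
    }
}
```

The delegation granularity criterion from the enhanced typology corresponds, in this sketch, to how coarse a task the `delegate` call can carry.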

    Dynamic instrumentation for the management of EJB-based applications

    With the growing number of distributed component-based applications for enterprises, efficient management must be applied to these applications to evaluate their performance, to provide them with a good quality of service (QoS), and to maintain them during their life cycle. In this report we focus on Enterprise JavaBeans distributed applications. We define a unified and automatic instrumentation mechanism to enable the management of EJB-based applications. The mechanism is based on an instrumentation interface extracted automatically from the home and remote interfaces of EJB beans. The interface is then used by the management system of the application server hosting the EJB applications to generate a Managed EJB corresponding to each initial EJB bean of the application. The generated Managed EJBs form the basic elements of a management system over EJB-based applications. They are manipulated by a management application through an agent that provides access to a number of managed objects, including EJBs. We propose implementing the unified instrumentation for EJB-based applications automatically by integrating an instrumentation service into the application servers hosting these applications. The instrumentation service generates a unified instrumentation interface for all EJBs deployed or to be deployed in the application server and then generates the corresponding Managed EJBs according to the requirements of the management system used by the application server.
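A minimal sketch of the idea behind a Managed EJB, not the report's actual service: a bean's business interface is wrapped in a dynamic proxy that records management data (here, per-method invocation counts) transparently. `Account` and `AccountBean` are hypothetical examples standing in for an EJB's remote interface and implementation.

```java
import java.lang.reflect.*;
import java.util.*;

// Wrap a bean behind its interface so every call is instrumented before
// being delegated, analogous to deriving an instrumentation interface
// from the home/remote interfaces of an EJB.
class ManagedBeanFactory {
    interface Account { int deposit(int amount); }

    static class AccountBean implements Account {
        int balance = 0;
        public int deposit(int amount) { return balance += amount; }
    }

    final Map<String, Integer> callCounts = new HashMap<>();

    // Generate a "managed" view of the bean from its business interface.
    <T> T manage(Class<T> iface, T bean) {
        return iface.cast(Proxy.newProxyInstance(
                iface.getClassLoader(), new Class<?>[]{iface},
                (proxy, method, args) -> {
                    callCounts.merge(method.getName(), 1, Integer::sum); // record
                    return method.invoke(bean, args);                    // delegate
                }));
    }
}
```

Clients keep calling the same interface; the management data accumulates in the factory without the bean being modified, which is the transparency the unified instrumentation aims for.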

    MAKING BOUNDARIES AND LINKING GLOBALLY: “MATERIAL POLITICS” OF PHYTOSANITARY REGULATION ON MEXICAN MANGOS

    This dissertation illuminates how phytosanitary (PS) regulations enable mango exportation from Mexico to the United States. PS regulations are technical and legal measures to prevent plant pests from proliferating or being transported to other places, and they are important regulatory mechanisms enabling the globalization of agriculture. My case study investigates how PS regulations enable Mexican mango exportation as an aspect of the globalization of agriculture, illustrating the consequences of PS regulations for humans and non-humans. More specifically, three research questions are posed: (1) How does the PS regulation network operate to draw distinctions between pest and non-pest, thereby enabling the export of Mexican mangos to the United States? (2) What values are associated with the PS regulation network, and what are the normative, moral, or ethical implications of the regulations? And (3) How are the PS regulations in transition in the state of Sinaloa changing economic prospects for mango growers and packers to tap into global mango markets? Theoretically, the analysis draws on the concept of "material politics," which claims that politics is enacted not only through discursive measures, such as statutes, but also through physical embodiment by material beings. Thus, PS regulations are conceptualized as a materially heterogeneous network that establishes boundaries between pest and non-pest, thereby connecting distinct places, such as mango orchards and consumers. The material politics concept also suggests the emergence of socio-material "ordering" effects of regulations, such as values, morals, and norms, as well as unequal economic opportunities.
Nine months of ethnographic fieldwork in Mexico, which employed in-depth interviews, (participant) observation, and documentary research, yielded the following findings: (1) PS regulations as a network of governance (re)configured the production of the commodity, "disciplining" humans and non-humans to conform to the global regulatory order; (2) in this network, non-governmental entities played critical roles, fitting squarely with the recent neoliberal political-economic orientation in Mexico; and (3) although the government's pest eradication program could improve market chances for growers, local political-economic circumstances, including small-scale growers' dependence on packers for marketing, still left substantial challenges for such economic prospects to materialize.

    Development of a Proactive Fault Diagnosis for Critical System

    Large-scale network environments, such as the Internet, are characterized by numerous devices connected at remote locations. A common scenario is a main office connected to branch offices in other towns and cities, with a central administrative system at the main office. Problems at the branches are reported to the main office, where sufficient resources are available. However, few support tools have been developed to allow administrators at the central office to remotely control and monitor the computers at the branches. Even in a local area network environment, diagnosing the computers on the network is a major burden for the administrator, who must move from one computer to another, running the diagnostic program and collecting a report for each machine tested. This is strenuous and time-consuming. To address these problems, I employed the concept of mobile agents to design an architecture that remotely performs various checks and tests on networked computers and reports its findings to the administrator at a central location. The architecture was implemented in Java, using the Jini lookup service to establish communication between the computers; the agent tasks were implemented in the C programming language. The results of this work show that using mobile agents for the remote maintenance of networked computers provides an improved, efficient, and dynamic diagnostic management system, and the approach has proven to be a substantive contribution to efficient network management.
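The agent pattern described above can be sketched as follows. This is a hedged illustration, not the thesis's implementation: `Check`, `DISK_CHECK`, the 500 MB threshold, and the in-memory "machines" map are invented stand-ins (the actual system used Jini lookup for discovery and C-implemented tasks).

```java
import java.util.*;

// A diagnostic task travels to each machine, runs against local state,
// and only the per-host report returns to the central administrator,
// replacing the administrator's walk from computer to computer.
class DiagnosticAgent {
    interface Check { String run(Map<String, Long> machineState); }

    // Example task: flag machines with low free disk space.
    static final Check DISK_CHECK = state ->
            state.getOrDefault("freeDiskMb", 0L) < 500
                    ? "FAIL: low disk" : "OK";

    // Visit every machine and gather one report line per host.
    static Map<String, String> visit(Map<String, Map<String, Long>> machines,
                                     Check check) {
        Map<String, String> report = new TreeMap<>(); // sorted by hostname
        machines.forEach((host, state) -> report.put(host, check.run(state)));
        return report;
    }
}
```

Because only the verdicts cross the network, the central server receives a compact summary rather than raw data from every branch machine.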

    Tiedonlouhinta televerkkojen lokien analysoinnin tukena (Data Mining in Support of Telecommunications Network Log Analysis)

    Telecommunications network management is based on huge amounts of data that are continuously collected from elements and devices all around the network. The data is monitored and analysed to provide information for decision making in all operational functions. Knowledge discovery and data mining methods can support fast-paced decision making in network operations. In this thesis, I analyse decision making at different levels of network operations. I identify the requirements that decision making sets for knowledge discovery and data mining tools and methods, and I survey the resources available to them. I then propose two methods for augmenting and applying frequent sets to support everyday decision making: Comprehensive Log Compression (CLC) for log data summarisation and Queryable Log Compression (QLC) for semantic compression of log data. Finally, I suggest a model for a continuous knowledge discovery process and outline how it can be implemented and integrated into the existing network operations infrastructure.
Data mining methods are used to analyse large volumes of data collected, for example, from retail customers, telecommunications network devices, or process-industry production plants, or extracted from genes and other objects of study. The methods efficiently detect relationships between things, such as behavioural and operational patterns and deviations from them, and the information they produce is used in business and industry to streamline operations and in science to search for new research results. The problem with data mining methods is their complexity and difficulty of use: to apply them, one must master their theoretical foundations and be able to set dozens of input parameters that affect the results. This is difficult in practical tasks, such as telecommunications network monitoring, where the volumes of monitored data are enormous and the time available for decision making is short: minutes rather than hours.
What should data mining methods be like so that they could be integrated, for example, into the toolset of a telecommunications network operator? To identify the requirements placed on data mining methods, I examine in this dissertation decision making at different phases and levels of network operation and maintenance. I build a model of decision making and examine the data mining tasks that support it and the input data they require. I describe the expertise, resources, and tools available in an industrial environment for applying data mining methods, and I derive a list of requirements for data mining methods used to support decision making. In the research I present two methods for analysing large event-log databases. The CLC method, without prior learning or definitions, summarises a given large set of events by detecting and describing frequently recurring similar events and event chains, leaving individual and rarely occurring events in the log for the expert to inspect. The QLC method, in turn, is used for compact storage of logs: in some cases logs can be stored in a third less space than with commonly used compression methods, and queries can be run against QLC-compressed log files without first decompressing them. Both the CLC and QLC methods satisfy the identified requirements well. At the end of the study I present a process model describing continuous data mining in support of industrial decision making and outline how the data mining methods and process can be integrated into a company's information systems.
I have used telecommunications network maintenance as the research environment, but both the identified requirements for data mining methods and the methods I developed are applicable to other similar environments in which a continuous stream of log events is monitored and analysed. These environments share the need for continuous decisions that cannot be automated because of the rarity or ambiguity of the events and process states involved. Examples include security logs, monitoring of network service usage, maintenance of industrial processes, and monitoring of large-scale logistics services.
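The core idea behind the summarisation method can be illustrated with a toy sketch. This is not the thesis's CLC algorithm: the number-masking `template` heuristic and the `minSupport` cutoff are assumptions chosen for brevity. What it shares with CLC is the goal of reporting frequently repeating patterns once with a count while keeping rare events verbatim for the expert.

```java
import java.util.*;
import java.util.stream.*;

// Summarise a log: frequent line patterns are compressed to "<count>x <pattern>",
// rare lines survive unchanged so unusual events remain visible.
class LogSummary {
    // Mask volatile fields (numbers) so repeated events share one template.
    static String template(String line) {
        return line.replaceAll("\\d+", "<N>");
    }

    static List<String> summarise(List<String> log, int minSupport) {
        Map<String, Long> counts = log.stream().collect(
                Collectors.groupingBy(LogSummary::template,
                        LinkedHashMap::new, Collectors.counting()));
        List<String> out = new ArrayList<>();
        counts.forEach((tpl, c) -> {
            if (c >= minSupport) out.add(c + "x " + tpl); // frequent: compressed
        });
        for (String line : log)
            if (counts.get(template(line)) < minSupport) out.add(line); // rare: kept
        return out;
    }
}
```

On a stream of thousands of near-identical alarms, a summary of this shape is what lets an operator spot the single unusual event within the minutes, not hours, that the abstract describes.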

    Topic Maps : a bibliometric study

    Topic Maps is an international standard (ISO/IEC 13250) for describing and encoding knowledge structures and associating them with relevant information resources. This thesis investigates what was written about Topic Maps from 2000 to 2011 and identifies the research and publication trends in the field. The study is based on a quantitative methodology, namely bibliometric analysis. Data were collected from the Scopus and Web of Knowledge databases using the search keywords "topic map", "topic maps", and "ISO/IEC 13250". A total of 356 publications (265 conference papers, 91 journal articles) from 2001 to 2011 were included in the analysis. The findings revealed that Topic Maps researchers preferred to present their findings at conferences rather than in journals. The authorship pattern leaned towards coauthorship, mostly among local collaborators, as international collaboration was very low. Computer science and library and information science journals were the favourite publishing venues, and the majority of the conferences were related to computer science and education. The focus of Topic Maps research was on data integration and interoperability (2001-2004), information theory (2005-2008), and knowledge-based and intelligent systems (2009-2011). The themes identified were content management, repository, ontology, information architecture, retrieval and navigation, and semantic web. Future research areas will possibly include collaborative e-learning systems, knowledge visualization systems, visualization construction, semantic metadata creation from relational databases, improvements to knowledge navigation and retrieval, intelligent topic maps, distributed knowledge management based on extended topic maps, knowledge service systems, knowledge representation modelling, and multi-granularity, multi-level knowledge.
    Joint Master Degree in Digital Library Learning (DILL)
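The kind of tally underlying such a bibliometric analysis is simple to sketch. The records below are invented for illustration, not the 356 publications retrieved from Scopus and Web of Knowledge.

```java
import java.util.*;
import java.util.stream.*;

// Count publications by type (conference vs. journal), the split on which
// the study's "conference preference" finding rests.
class Bibliometrics {
    record Pub(int year, String type) {}

    static Map<String, Long> countByType(List<Pub> pubs) {
        return pubs.stream().collect(
                Collectors.groupingBy(Pub::type, TreeMap::new,
                        Collectors.counting()));
    }
}
```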

    Annual Report 2017-2018 of the Institute for Nuclear and Energy Technologies (KIT Scientific Reports ; 7756)

    The annual report of the Institute for Nuclear and Energy Technologies of KIT summarizes its research activities and provides highlights from each working group, such as thermal-hydraulic analyses for nuclear fusion reactors, accident analyses for light water reactors, and research on innovative energy technologies: liquid metal technologies for energy conversion, hydrogen technologies, and geothermal power plants. The institute is also engaged in education and training in energy technologies.