61,657 research outputs found

    Knowledge Representation Concepts for Automated SLA Management

    Full text link
    Outsourcing of complex IT infrastructure to IT service providers has increased substantially during the past years. IT service providers must be able to fulfil their service-quality commitments based upon predefined Service Level Agreements (SLAs) with the service customer. They need to manage, execute and maintain thousands of SLAs for different customers and different types of services, which requires new levels of flexibility and automation not available with current technology. The complexity of contractual logic in SLAs requires new forms of knowledge representation to automatically draw inferences and execute contractual agreements. A logic-based approach provides several advantages, including automated rule chaining, which allows for compact knowledge representation, as well as the flexibility to adapt to rapidly changing business requirements. We suggest adequate logical formalisms for the representation and enforcement of SLA rules and describe a proof-of-concept implementation. The article describes selected formalisms of the ContractLog KR and their adequacy for automated SLA management and presents results of experiments to demonstrate the flexibility and scalability of the approach.
    Comment: Paschke, A. and Bichler, M.: Knowledge Representation Concepts for Automated SLA Management, Int. Journal of Decision Support Systems (DSS), submitted 19th March 200
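    The abstract does not reproduce the ContractLog syntax itself, but the rule-chaining idea it describes can be illustrated with a minimal, hypothetical sketch. ContractLog is a logic-programming formalism (Prolog/RuleML style); Python is used below only to make the chaining behaviour concrete, and all predicates, the 800 ms threshold and the penalty action are invented for illustration:

```python
# Hypothetical sketch of forward rule chaining over SLA facts.
# Predicates, threshold and penalty are invented; ContractLog itself
# uses a logic-programming formalism rather than Python.

def slow_service(facts):
    """Rule 1: derive slow_service(C) when avg_response_ms(C, T) exceeds 800 ms."""
    return {("slow_service", f[1]) for f in facts
            if f[0] == "avg_response_ms" and f[2] > 800}

def sla_violation(facts):
    """Rule 2 (chained on Rule 1): a slow_service fact implies a penalty obligation."""
    return {("sla_violation", f[1], "pay_penalty") for f in facts
            if f[0] == "slow_service"}

def forward_chain(facts, rules):
    """Apply the rules repeatedly until no new facts are derived (a fixed point)."""
    while True:
        derived = set().union(*(rule(facts) for rule in rules)) - facts
        if not derived:
            return facts
        facts |= derived

facts = {("avg_response_ms", "customer_A", 950), ("avg_response_ms", "customer_B", 300)}
print(forward_chain(set(facts), [slow_service, sla_violation]))
```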

    Resilience of New Zealand indigenous forest fragments to impacts of livestock and pest mammals

    Get PDF
    A number of factors have combined to diminish ecosystem integrity in New Zealand indigenous lowland forest fragments surrounded by intensively grazed pasture. Livestock grazing, mammalian pests, adventive weeds and altered nutrient input regimes are important drivers compounding the changes in fragment structure and function due to historical deforestation and fragmentation. We used qualitative systems modelling and empirical data from Beilschmiedia tawa dominated lowland forest fragments in the Waikato Region to explore the relevance of two common resilience paradigms – engineering resilience and ecological resilience – for addressing the conservation management of forest fragments into the future. Grazing by livestock and foraging/predation by introduced mammalian pests both have direct detrimental impacts on key structural and functional attributes of forest fragments. Release from these perturbations through fencing and pest control leads to partial or full recovery of some key indicators (i.e. increased indigenous plant regeneration and cover, increased invertebrate populations and litter mass, decreased soil fertility and increased nesting success) relative to levels seen in larger forest systems over a range of timescales. These changes indicate that forest fragments do show resilience consistent with adopting an engineering resilience paradigm for conservation management, in the landscape context studied. The relevance of the ecological resilience paradigm in these ecosystems is obscured by limited data. We characterise forest fragment dynamics in terms of changes in indigenous species occupancy and functional dominance, and present a conceptual model for the management of forest fragment ecosystems

    Geomatics for structural assessment and surface diagnostic of CH

    Get PDF
    The capacity to rapidly acquire large quantities of spatial data, to geo-reference information on them, and to obtain detailed models that allow increasingly accurate analyses and simulations places Geoinformatics at the center of attention in many research areas. Among these, the use of such techniques for the study of existing structures is particularly interesting. Assessing the current stability of a building, monitoring the evolution of a failure over time, preventing the potential causes of damage, and simulating the behavior of a building under seismic actions are just some of the ways in which the geometric properties of a structure, acquired with the most up-to-date automated surveying systems, are used to help validate structural integrity analyses

    Climate Change and Biosphere Response: Unlocking the Collections Vault

    No full text
    Natural history collections (NHCs) are an important source of the long-term data needed to understand how biota respond to ongoing anthropogenic climate change. These include taxon occurrence data for ecological modeling, as well as information that can be used to reconstruct the mechanisms through which biota respond to changing climates. The potential of NHCs for climate change research cannot be fully realized until high-quality data sets are conveniently accessible for research, but this requires that higher priority be placed on digitizing the holdings most useful for climate change research (e.g., whole-biota studies, time series, records of intensively sampled common taxa). Natural history collections must also not neglect the proliferation of new information from efforts to understand how present-day ecosystems are responding to environmental change. These new directions require a strategic realignment for many NHC holders to complement their existing focus on taxonomy and systematics. To set these new priorities, we need strong partnerships between NHC holders and global change biologists

    A Review of the Enviro-Net Project

    Get PDF
    Monitoring ecosystems is essential to properly understand their development and the effects of events, both climatological and anthropogenic in nature. The amount of data used in these assessments is increasing at very high rates, owing to the increasing availability of sensing systems and the development of new techniques to analyze sensor data. The Enviro-Net Project encompasses several such sensor system deployments across five countries in the Americas. These deployments use a few different ground-based sensor systems, installed at different heights, to monitor conditions in tropical dry forests over long periods of time. This paper presents our experience in deploying and maintaining these systems and in retrieving and pre-processing the data, and describes the Web portal developed to help with data management, visualization and analysis.
    Comment: v2: 29 pages, 5 figures, reflects changes addressing reviewers' comments; v1: 38 pages, 8 figures

    Industrial implementation of intelligent system techniques for nuclear power plant condition monitoring

    Get PDF
    As the nuclear power plants within the UK age, there is an increased requirement for condition monitoring to ensure that the plants are still able to operate safely. This paper describes the novel application of Intelligent Systems (IS) techniques to provide decision support for the condition monitoring of Nuclear Power Plant (NPP) reactor cores within the UK. The resulting system, BETA (British Energy Trace Analysis), is deployed within the UK’s nuclear operator and provides automated decision support for the analysis of refuelling data, a lead indicator of the health of AGR (Advanced Gas-cooled Reactor) nuclear power plant cores. The key contribution of this work is the improvement of existing manual, labour-intensive analysis through the application of IS techniques to provide decision support for NPP reactor core condition monitoring. This enables an existing source of condition monitoring data to be analysed in a rapid and repeatable manner, providing additional information relating to core health on a more regular basis than routine inspection data allows. The application of IS techniques addresses two issues with the existing manual interpretation of the data, namely the limited availability of expertise and the variability of assessment between different experts. Decision support is provided by four applications of intelligent systems techniques. Two instances of a rule-based expert system are deployed: the first automatically identifies key features within the refuelling data and the second classifies specific types of anomaly. Clustering techniques are applied to support the definition of benchmark behaviour, which is used to detect the presence of anomalies within the refuelling data. Finally, data mining techniques are used to track the evolution of the normal benchmark behaviour over time. This results in a system that not only provides support for analysing new refuelling events but also provides the platform to allow future events to be analysed. The BETA system has been deployed within the nuclear operator in the UK and is used at both the engineering offices and on station to support the analysis of refuelling events from two AGR stations, with a view to expanding it to the rest of the fleet in the near future
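    The abstract describes the techniques only at a high level; as a rough, hypothetical illustration of the benchmark-and-detect idea (cluster historical refuelling traces to define normal behaviour, then flag events far from every benchmark centre), a k-means based sketch might look as follows. The feature choice, cluster count and distance threshold are assumptions, not the BETA design:

```python
# Hypothetical sketch: define benchmark refuelling behaviour by clustering
# historical feature vectors, then flag new events lying far from every
# benchmark centre. Features, cluster count and threshold are invented.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# e.g. two summary features per refuelling event: peak grab load, friction index
history = rng.normal(loc=[50.0, 1.2], scale=[2.0, 0.05], size=(200, 2))

benchmark = KMeans(n_clusters=3, n_init=10, random_state=0).fit(history)

def is_anomalous(event, model, threshold=3.0):
    """Anomalous if the distance to the nearest benchmark centre exceeds the threshold."""
    distances = np.linalg.norm(model.cluster_centers_ - event, axis=1)
    return bool(distances.min() > threshold)

print(is_anomalous(np.array([51.0, 1.25]), benchmark))   # close to benchmark -> False
print(is_anomalous(np.array([70.0, 2.00]), benchmark))   # well outside it    -> True
```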

    An objective framework to test the quality of candidate indicators of good environmental status

    Get PDF
    Large efforts are on-going within the EU to prepare the Marine Strategy Framework Directive's (MSFD) assessment of the environmental status of the European seas. This assessment will only be as good as the indicators chosen to monitor the 11 descriptors of good environmental status (GEnS). An objective and transparent framework to determine whether chosen indicators actually support the aims of this policy is, however, not yet in place. Such frameworks are needed to ensure that the limited resources available to this assessment optimize the likelihood of achieving GEnS within collaborating states. Here, we developed a hypothesis-based protocol to evaluate whether candidate indicators meet quality criteria explicit to the MSFD, which the assessment community aspires to. Eight quality criteria are distilled from existing initiatives, and a testing and scoring protocol for each of them is presented. We exemplify its application in three worked examples, covering indicators for three GEnS descriptors (1, 5, and 6), various habitat components (seaweeds, seagrasses, benthic macrofauna, and plankton), and assessment regions (Danish, Lithuanian, and UK waters). We argue that this framework provides a necessary, transparent and standardized structure to support the comparison of candidate indicators, and the decision-making process leading to indicator selection. Its application could help identify potential limitations in currently available candidate metrics and, in such cases, help focus the development of more adequate indicators. Use of such standardized approaches will facilitate the sharing of knowledge gained across the MSFD parties despite context-specificity across assessment regions, and support the evidence-based management of European seas
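    To make the scoring idea concrete, a toy sketch of tallying criterion scores for candidate indicators and ranking them is given below; the criteria, indicator names and scores are placeholders, not the eight criteria or the testing protocol from the paper:

```python
# Toy sketch of ranking candidate indicators by scores against quality criteria.
# Criteria, indicators and scores below are placeholders, not the paper's protocol.
from dataclasses import dataclass, field

@dataclass
class CandidateIndicator:
    name: str
    scores: dict = field(default_factory=dict)  # criterion -> score in [0, 1]

    def total(self) -> float:
        """Unweighted mean over the criteria actually scored."""
        return sum(self.scores.values()) / len(self.scores) if self.scores else 0.0

candidates = [
    CandidateIndicator("seagrass depth limit", {"sensitivity": 0.8, "precision": 0.6}),
    CandidateIndicator("benthic macrofauna index", {"sensitivity": 0.7, "precision": 0.9}),
]

for c in sorted(candidates, key=CandidateIndicator.total, reverse=True):
    print(f"{c.name}: {c.total():.2f}")
```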

    Ontology-based data semantic management and application in IoT- and cloud-enabled smart homes

    Get PDF
    The application of emerging Internet of Things (IoT) and cloud computing technologies has increased the popularity of smart homes, and with it, large volumes of heterogeneous data are being generated by home entities. The representation, management and application of these continuously increasing amounts of heterogeneous data in the smart home data space are critical challenges to the further development of the smart home industry. To address this, a scheme for ontology-based data semantic management and application is proposed in this paper. Based on a smart home system model abstracted from the perspective of implementing users’ household operations, a general domain ontology model is designed by defining the relevant concepts, and a logical data semantic fusion model is designed accordingly. Subsequently, to achieve efficient ontology data query and update in the implementation of the data semantic fusion model, a relational-database-based ontology data decomposition storage method is developed by thoroughly investigating existing storage modes, and its performance is demonstrated using a set of carefully designed ontology data query and update operations. Building on these components, ontology-based semantic reasoning with a specially designed semantic matching rule is also studied in an attempt to provide accurate and personalized home services, and its efficiency is demonstrated through experiments conducted on the developed test system for user behavior reasoning
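    The decomposition scheme itself is not spelled out in the abstract; purely as an illustration of the general idea of keeping ontology statements in relational tables and serving queries and updates over them, a minimal sketch (a single triple table with invented smart-home terms) might look like this:

```python
# Minimal sketch of relational storage for ontology statements: a single triple
# table with invented smart-home terms. The paper's decomposition scheme splits
# the data differently; this only illustrates SQL-backed query and update.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE triples (subject TEXT, predicate TEXT, object TEXT)")
db.executemany(
    "INSERT INTO triples VALUES (?, ?, ?)",
    [
        ("LivingRoomLamp", "rdf:type", "sh:Appliance"),
        ("LivingRoomLamp", "sh:locatedIn", "LivingRoom"),
        ("LivingRoomLamp", "sh:hasState", "off"),
    ],
)

# Query: which appliances are located in the living room?
rows = db.execute(
    """SELECT t1.subject FROM triples t1
       JOIN triples t2 ON t1.subject = t2.subject
       WHERE t1.predicate = 'rdf:type' AND t1.object = 'sh:Appliance'
         AND t2.predicate = 'sh:locatedIn' AND t2.object = 'LivingRoom'"""
).fetchall()
print(rows)  # [('LivingRoomLamp',)]

# Update: switch the lamp on.
db.execute("UPDATE triples SET object = 'on' "
           "WHERE subject = 'LivingRoomLamp' AND predicate = 'sh:hasState'")
```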

    Architecture and Information Requirements to Assess and Predict Flight Safety Risks During Highly Autonomous Urban Flight Operations

    Get PDF
    As aviation adopts new and increasingly complex operational paradigms, vehicle types, and technologies to broaden airspace capability and efficiency, maintaining a safe system will require recognition and timely mitigation of new safety issues as they emerge and before significant consequences occur. A shift toward a more predictive risk mitigation capability becomes critical to meet this challenge. In-time safety assurance comprises monitoring, assessment, and mitigation functions that proactively reduce risk in complex operational environments where the interplay of hazards may not be known (and therefore not accounted for) during design. These functions can also help to understand and predict emergent effects caused by the increased use of automation or autonomous functions that may exhibit unexpected non-deterministic behaviors. The envisioned monitoring and assessment functions can look for precursors, anomalies, and trends (PATs) by applying model-based and data-driven methods. Outputs would then drive downstream mitigation(s) if needed to reduce risk. These mitigations may be accomplished using traditional design revision processes or via operational (and sometimes automated) mechanisms. The latter refers to the in-time aspect of the system concept. This report comprises architecture and information requirements and considerations toward enabling such a capability within the domain of low altitude highly autonomous urban flight operations. This domain may span, for example, public-use surveillance missions flown by small unmanned aircraft (e.g., infrastructure inspection, facility management, emergency response, law enforcement, and/or security) to transportation missions flown by larger aircraft that may carry passengers or deliver products. Caveat: Any stated requirements in this report should be considered initial requirements that are intended to drive research and development (R&D). These initial requirements are likely to evolve based on R&D findings, refinement of operational concepts, industry advances, and new industry or regulatory policies or standards related to safety assurance
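    As a purely illustrative sketch of the in-time monitor-assess-mitigate loop (not a requirement stated in the report), consider a single small-UAS energy-margin signal checked for a precursor threshold and an adverse trend; the signal, limits and mitigation actions below are invented:

```python
# Invented illustration of an in-time monitor: watch one telemetry signal for a
# precursor (hard floor) and an adverse trend (sustained low margin), and emit a
# mitigation request. Signal, limits and actions are not from the report.
from collections import deque
from statistics import mean

WINDOW = 5
margins = deque(maxlen=WINDOW)  # remaining energy margin, minutes of flight

def assess(sample: float):
    """Return a mitigation request if a precursor or adverse trend is detected."""
    margins.append(sample)
    if sample < 4.0:                                    # precursor: below hard floor
        return "land_immediately"
    if len(margins) == WINDOW and mean(margins) < 6.0:  # trend: sustained low margin
        return "replan_route"
    return None

for m in [9.0, 7.0, 6.0, 5.5, 5.0, 4.5, 3.5]:
    action = assess(m)
    if action:
        print(f"margin={m} min -> mitigation: {action}")
```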

    Evaluation of Modification of the Upper Batavia Dam on the Fox River, Illinois

    Get PDF
    Progress Report, Federal Aid Project F-136-R, Segment 6. Report issued August 2004. Submitted to the Office of Water Resources, Illinois Department of Natural Resources