
    Active Ontology: An Information Integration Approach for Dynamic Information Sources

    In this paper we describe an ontology-based information integration approach that is suitable for highly dynamic distributed information sources, such as those available in Grid systems. The main challenges addressed are: 1) information changes frequently and information requests have to be answered quickly in order to provide up-to-date information; and 2) the most suitable information sources have to be selected from a set of different distributed ones that can provide the information needed. To deal with the first challenge, we use an information cache that works with an update-on-demand policy. To deal with the second, we add an information source selection step to the usual architecture used for ontology-based information integration. To illustrate our approach, we have developed an information service that aggregates metadata available in hundreds of information services of the EGEE Grid infrastructure.
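    The update-on-demand cache policy described above lends itself to a short sketch. The Python below is only an illustration under assumed names (UpdateOnDemandCache, the fetch callable); it is not the paper's implementation, just the general idea of refreshing an entry when a query finds it missing or stale rather than on a fixed schedule.

        import time

        class UpdateOnDemandCache:
            """Minimal sketch of an update-on-demand metadata cache.

            An entry is refreshed only when a query asks for it and finds it
            missing or older than a freshness threshold; nothing is refreshed
            in the background.
            """

            def __init__(self, fetch_fn, max_age_seconds=60.0):
                self._fetch = fetch_fn        # callable that queries the remote source
                self._max_age = max_age_seconds
                self._entries = {}            # key -> (value, timestamp)

            def get(self, key):
                entry = self._entries.get(key)
                if entry is None or time.time() - entry[1] > self._max_age:
                    value = self._fetch(key)  # pull fresh metadata on demand
                    self._entries[key] = (value, time.time())
                    return value
                return entry[0]

        # Hypothetical usage: the lambda stands in for a query to one of the
        # distributed Grid information services.
        cache = UpdateOnDemandCache(lambda key: {"site": key, "status": "ok"},
                                    max_age_seconds=120.0)
        print(cache.get("site-A"))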

    Recent developments in MrBUMP: better search-model preparation, graphical interaction with search models, and solution improvement and assessment

    Increasing sophistication in molecular-replacement (MR) software and the rapid expansion of the PDB in recent years have allowed the technique to become the dominant method for determining the phases of a target structure in macromolecular X-ray crystallography. In addition, improvements in bioinformatic techniques for finding suitable homologous structures for use as MR search models, combined with developments in refinement and model-building techniques, have pushed the applicability of MR to lower sequence identities and made weak MR solutions more amenable to refinement and improvement. MrBUMP is a CCP4 pipeline which automates all stages of the MR procedure. Its scope covers everything from the sourcing and preparation of suitable search models right through to rebuilding of the positioned search model. Recent improvements to the pipeline include the adoption of more sensitive bioinformatic tools for sourcing search models, enhanced model-preparation techniques including better ensembling of homologues, and the use of phase improvement and model building on the resulting solution. The pipeline has also been deployed as an online service through CCP4 online, which allows its users to exploit large bioinformatic databases and coarse-grained parallelism to speed up the determination of a possible solution. Finally, the molecular-graphics application CCP4mg has been combined with MrBUMP to provide an interactive visual aid to the user during the process of selecting and manipulating search models for use in MR. Here, these developments in MrBUMP are described with a case study to explore how some of the enhancements to the pipeline and to CCP4mg can help to solve a difficult case.

    ActOn: A Semantic Information Service for EGEE

    We describe an information service that aggregates metadata available in hundreds of information sources of the EGEE Grid infrastructure. It uses an ontology-based information integration architecture (ActOn), which is suitable for the highly dynamic distributed information sources available in Grid systems, where information changes frequently and where the information of distributed sources has to be aggregated in order to solve complex queries. These two challenges are addressed by a metadata cache that works with an update-on-demand policy and by an information source selection module that selects the most suitable source at a given point in time, respectively. We have evaluated the quality of this information service and compared it with other similar services from the EGEE production testbed, with promising results.
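    The information-source selection step mentioned above can also be pictured with a small sketch. The scoring criteria, field names and example sources below are assumptions chosen for illustration, not the actual ActOn selection logic.

        from dataclasses import dataclass

        @dataclass
        class SourceDescription:
            """Assumed description of one distributed information source."""
            name: str
            provides: set             # kinds of metadata this source can answer
            freshness_seconds: float  # how long ago its data was updated
            latency_seconds: float    # typical response time

        def select_source(sources, required_kind):
            """Pick the most suitable source for one requested kind of metadata.

            Illustrative policy only: among sources that can provide the
            requested kind, prefer the freshest, breaking ties on latency.
            """
            candidates = [s for s in sources if required_kind in s.provides]
            if not candidates:
                raise LookupError(f"no source provides {required_kind!r}")
            return min(candidates,
                       key=lambda s: (s.freshness_seconds, s.latency_seconds))

        # Example usage with made-up sources:
        sources = [
            SourceDescription("top-level-index", {"site", "service"}, 300.0, 0.5),
            SourceDescription("site-local-index", {"service"}, 60.0, 0.2),
        ]
        print(select_source(sources, "service").name)  # -> "site-local-index"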

    Faculty of Computer Science

    Information about the Faculty of Computer Science of the Technische Universität Dresden: data and facts and a selection of current research projects, 2009.

    End user programming of awareness systems: addressing cognitive and social challenges for interaction with aware environments

    The thesis is put forward that social intelligence in awareness systems emerges from end-users themselves through the mechanisms that support them in the development and maintenance of such systems. For this intelligence to emerge, three challenges have to be addressed, namely the challenge of appropriate awareness abstractions, the challenge of supportive interactive tools, and the challenge of infrastructure. The thesis argues that in order to advance towards socially intelligent awareness systems, we should be able to interpret and predict the success or failure of such systems in relation to their communicational objectives and their implications for the social interactions they support. The FN-AAR (Focus-Nimbus Aspects Attributes Resources) model is introduced as a formal model which, by capturing the general characteristics of the awareness-systems domain, allows predictions about socially salient patterns pertaining to human communication and brings clarity to the discussion around relevant concepts such as social translucency, symmetry, and deception. The thesis recognizes that harnessing the benefits of context awareness can be problematic for end-users and other affected individuals, who may not always be able to anticipate, understand or appreciate system function, and who may thus feel their own sense of autonomy and privacy threatened. It introduces a set of tools and mechanisms that support end-user control, system intelligibility and accountability. This is achieved by minimizing the cognitive effort needed to handle the increased complexity of such systems and by enhancing the ability of people to configure and maintain intelligent environments. We show how these tools and mechanisms empower end-users to answer questions such as "how does the system behave", "why is something happening", "how would the system behave in response to a change in context", and "how can the system’s behaviour be altered" to achieve intelligibility, accountability, and end-user control. Finally, the thesis argues that awareness applications overall cannot be examined as static configurations of services and functions, and that they should instead be seen as the results of both implicit and explicit interaction with the user. Amelie is introduced as a supportive framework for the development of context-aware applications that encourages the design of the interactive mechanisms through which end-users can control, direct and advance such systems dynamically throughout their deployment. Following the recombinant computing approach, Amelie addresses the implications of infrastructure design decisions on user experience, while, by adopting the premises of the FN-AAR model, it supports the direct implementation of systems that allow end-users to meet social needs and to practice extant social skills.
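    Because the FN-AAR model is only described abstractly above, a toy rendering of its underlying focus/nimbus idea may help: an observer becomes aware of an aspect of another entity only where what the entity exposes (its nimbus) overlaps what the observer attends to (its focus). The representation below is an assumption made for illustration and does not reproduce the model's actual formalisation.

        def awareness(observer_focus, entity_nimbus):
            """Return, per aspect, the attributes the observer becomes aware of.

            observer_focus: dict aspect -> set of attributes the observer attends to
            entity_nimbus:  dict aspect -> set of attributes the entity exposes
            """
            result = {}
            for aspect, focused_attrs in observer_focus.items():
                exposed = entity_nimbus.get(aspect, set())
                shared = focused_attrs & exposed
                if shared:
                    result[aspect] = shared
            return result

        # Example: availability sharing between two household members.
        alice_focus = {"availability": {"at-home", "busy"}}
        bob_nimbus = {"availability": {"at-home"}, "location": {"city"}}
        print(awareness(alice_focus, bob_nimbus))  # {'availability': {'at-home'}}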

    Coastal risk adaptation: the potential role of accessible geospatial Big Data

    Increasing numbers of people are living in and using coastal areas. Combined with the presence of pervasive coastal threats, such as flooding and erosion, this is having widespread impacts on coastal populations, infrastructure and ecosystems. For the right adaptive strategies to be adopted, and planning decisions to be made, rigorous evaluation of the available options is required. This evaluation hinges on the availability and use of suitable datasets. For knowledge to be derived from coastal datasets, such data needs to be combined and analysed in an effective manner. This paper reviews a wide range of literature relating to data-driven approaches to coastal risk evaluation, revealing how limitations have been imposed on many of these methods due to restrictions in computing power and access to data. The rapidly emerging field of ‘Big Data’ can help overcome many of these hurdles. ‘Big Data’ involves powerful computer infrastructures, enabling storage, processing and real-time analysis of large volumes and varieties of data in a fast and reliable manner. Through consideration of examples of how ‘Big Data’ technologies are being applied to fields related to coastal risk, it becomes apparent that geospatial Big Data solutions hold clear potential to improve the process of risk-based decision making on the coast. ‘Big Data’ does not provide a stand-alone solution to the issues and gaps outlined in this paper, yet these technological methods hold the potential to optimise data-driven approaches, enabling robust risk profiles to be generated for coastal regions.

    Formative computer-based assessment in diagram-based domains

    This research argues that the formative assessment of student coursework in free-form, diagram-based domains can be automated using CBA techniques in a way which is both feasible and useful. Formative assessment is that form of assessment in which the objective is to assist the process of learning undertaken by the student. The primary deliverable associated with formative assessment is feedback. CBA courseware provides facilities to implement the full lifecycle of an exercise through an integrated, online system. This research demonstrates that CBA offers unique opportunities for student learning through formative assessment, including allowing students to correct their solutions over a larger number of submissions than it would be feasible to allow within the context of traditional assessment forms. The approach to research involves two main phases. The first phase involves designing and implementing an assessment course using the CourseMarker / DATsys CBA system. This system, in common with many other examples of CBA courseware, was intended primarily to conduct summative assessment. The benefits and limitations of the system are identified. The second phase identifies three extensions to the architecture which encapsulate the difference in requirements between summative assessment and formative assessment, presents a design for the extensions, documents their implementation as extensions to the CourseMarker / DATsys architecture and evaluates their contribution. The three extensions are novel extensions for free-form CBA which allow the assessment of the aesthetic layout of student diagrams, the marking of student solutions where multiple model solutions are acceptable, and the prioritisation and truncation of feedback prior to its presentation to the student. Evaluation results indicate that the student learning process can be assisted through formative assessment which is automated using CBA courseware. The students learn through an iterative process in which feedback upon a submitted coursework solution is used by the student to improve their solution, after which they may re-submit and receive further feedback.
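    The third extension, prioritisation and truncation of feedback, lends itself to a short sketch. The data shapes, priority convention and cut-off policy below are assumptions chosen only to illustrate the idea of limiting how much feedback a student sees per submission; they are not the actual CourseMarker / DATsys design.

        from dataclasses import dataclass

        @dataclass
        class FeedbackItem:
            """One comment produced by marking a diagram submission."""
            message: str
            priority: int  # lower number = more important (assumed convention)

        def prioritise_and_truncate(items, limit=5):
            """Order feedback by importance and keep only the top entries.

            Showing a bounded amount of high-priority feedback per submission
            supports iterative improvement without overwhelming the student.
            """
            ordered = sorted(items, key=lambda item: item.priority)
            return ordered[:limit]

        # Example usage with made-up marking output:
        feedback = [
            FeedbackItem("Association between Order and Customer is missing.", 1),
            FeedbackItem("Class names should start with a capital letter.", 3),
            FeedbackItem("Diagram elements overlap; improve the layout.", 2),
        ]
        for item in prioritise_and_truncate(feedback, limit=2):
            print(item.message)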

    CERA: coastal erosion risk analysis based on geographic information systems

    Coastal areas are important in human development, providing numerous economic and social benefits. On the other hand, these areas are affected by several natural hazards. Therefore, the identification of endangered areas is essential for thoughtful coastal management and for the mitigation of potential damages. Through the years, several methodologies of coastal risk assessment have been developed to support coastal managers in decision making. These methodologies assess areas for various types of coastal hazards, for variable extents and time scales, and return different final products, often based on different conceptions. This work intends to contribute to the further progress of coastal risk assessment methodologies with the development of CERA (Coastal Erosion Risk Assessment). CERA is a methodology developed to evaluate coastal erosion risk for a medium-term horizon (10 to 20 years). The methodology should be applicable to a wide range of coastal environments and scales, with considerable accuracy and efficiency. This method mainly targets governmental institutions from countries and regions where data and coastal management results are lacking. For the development of CERA, an extensive literature review of existing coastal risk methodologies was performed. This task provided knowledge on how to apply the methodologies and allowed the identification of the most common indicators, as well as of the adopted spatial scales and time frames. From the analysed methods, five were applied to the study sites selected for this work: Aveiro (Portugal), Macaneta spit (Mozambique) and Quintana Roo (Mexico). The applied methods (CERA1.0, CVI, Smartline, RISC-KIT CRAF1 and CHW) vary in terms of specific objective within coastal risk assessment, indicators considered, procedure and outputs. Consequently, the results of the various methodologies disagree on the hazard level attributed to the study areas. However, they generally agree in identifying the most endangered locations of each study area. The application of these methods provided specific takeaways to be followed in the development of the new proposal. The new methodology (CERA2.0) follows closely the Source-Pathway-Receptor-Consequence model by evaluating risk propagation in four modules: susceptibility, value, exposure and coastal erosion. Subsequently, these are combined to generate vulnerability, consequence and risk results. A total of 12 indicators are included. For easier application of the methodology, a QGIS plugin was developed. Given the required inputs, the plugin computes all CERA2.0 procedures and provides the results in a georeferenced format. The new procedure was also applied to the three case studies, obtaining a more realistic set of results.
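    How the four modules might feed into vulnerability, consequence and risk results can be pictured with the toy combination below. The aggregation formulae and the assumption of normalised scores are illustrative only and are not the actual CERA2.0 scoring scheme.

        def combine_cera_modules(susceptibility, value, exposure, coastal_erosion):
            """Toy combination of four module scores into vulnerability, consequence and risk.

            All inputs are assumed to be normalised scores in [0, 1]; the way they
            are aggregated here (a simple average and products) is an illustrative
            assumption, not the methodology's actual formulae.
            """
            vulnerability = (susceptibility + exposure) / 2.0
            consequence = vulnerability * value
            risk = consequence * coastal_erosion
            return {"vulnerability": vulnerability,
                    "consequence": consequence,
                    "risk": risk}

        # Example for one shoreline segment with made-up indicator scores:
        print(combine_cera_modules(susceptibility=0.7, value=0.6,
                                   exposure=0.8, coastal_erosion=0.9))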