
    Living Knowledge

    Diversity, especially as manifested in language and knowledge, is a function of local goals, needs, competences, beliefs, culture, opinions and personal experience. The Living Knowledge project treats diversity as an asset rather than a problem. Within the project, foundational ideas that emerged from the synergistic contribution of different disciplines, methodologies (with which many partners were previously unfamiliar) and technologies flowed into concrete diversity-aware applications such as the Future Predictor and the Media Content Analyser, which provide users with better-structured information while coping with Web-scale complexity. The key notions of diversity, fact, opinion and bias have been defined in relation to three methodologies: Media Content Analysis (MCA), which operates from a social-sciences perspective; Multimodal Genre Analysis (MGA), which operates from a semiotic perspective; and Facet Analysis (FA), which operates from a knowledge representation and organization perspective. A conceptual architecture that pulls all of them together has become the core of the tools for automatic extraction and of the way they interact. In particular, the conceptual architecture has been implemented in the Media Content Analyser application. The scientific and technological results obtained are described in what follows.
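    The project reports describe this architecture only in prose. As a purely illustrative sketch of the idea, the following combines three stubbed analyzers, one per methodology, into a single pipeline that annotates the same content with facts, opinions and bias; every name here (Annotation, media_content_analyser, and the stub behaviours) is a hypothetical assumption, not the project's API.

        # Illustrative sketch only; all names and behaviours are hypothetical.
        from dataclasses import dataclass
        from typing import Callable

        @dataclass
        class Annotation:
            span: str          # the content fragment being annotated
            kind: str          # "fact", "opinion" or "bias"
            methodology: str   # which analysis produced it: "MCA", "MGA" or "FA"

        def mca(text: str) -> list[Annotation]:
            # Media Content Analysis: social-science coding of claims (stubbed)
            return [Annotation(text, "opinion", "MCA")]

        def mga(text: str) -> list[Annotation]:
            # Multimodal Genre Analysis: semiotic reading of the genre (stubbed)
            return [Annotation(text, "bias", "MGA")]

        def fa(text: str) -> list[Annotation]:
            # Facet Analysis: knowledge-organization view of the content (stubbed)
            return [Annotation(text, "fact", "FA")]

        def media_content_analyser(text: str) -> list[Annotation]:
            """Merge the three methodological perspectives on one input."""
            analyzers: list[Callable[[str], list[Annotation]]] = [mca, mga, fa]
            return [ann for analyze in analyzers for ann in analyze(text)]

        for ann in media_content_analyser("Example news sentence."):
            print(ann)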

    Document image and zone classification through incremental learning


    GMES-service for assessing and monitoring subsidence hazards in coastal lowland areas around Europe. SubCoast D3.5.1

    This document is version two of the user requirements for SubCoast work package 3.5; it constitutes SubCoast deliverable 3.5.1. Work package 3.5 aims to provide a European integrated GIS product on subsidence and relative sea-level rise (RSLR). The first step of this process was to contact the European Environment Agency (EEA), as the main user, to discover its user requirements. This document presents those requirements, the outline methodology that will be used to carry out the integration, and the datasets that will be used. In outline, the main user requirements of the EEA are:
    1. A gridded approach using an INSPIRE-compliant grid.
    2. The grid would hold data on:
       a. Likely rate of subsidence
       b. RSLR
       c. Impact (vulnerability)
       d. Certainty (confidence map)
       e. Contribution of ground motion to RSLR
       f. A measure of certainty in the data provided
       g. Metadata
    3. Spatial coverage: ideally the entire coastline of all 37 member states.
       a. Spatial resolution: 1 km
    4. A measure of the degree of contribution of ground motion to RSLR.
    The European integration will be based around a GIS methodology. Datasets will be integrated and interpreted to provide information on the data values above, the main value being a likelihood of subsidence (a toy sketch of such a gridded overlay follows the dataset list below). This product will initially be developed at its lowest level of detail for the London area. BGS holds a wealth of data for London, which will enable this less detailed product to be validated and will also enable the generation of a more detailed product using the best data available. Once the methodology has been developed, it will be rolled out to other areas of the European coastline. The initial input data that have been reviewed for their suitability for the European integration are listed below. These are the datasets with Europe-wide availability; it is expected that more detailed datasets will be used in areas where they are available.
    1. Terrafirma data
    2. OneGeology
    3. OneGeology-Europe
    4. Population density (Geoland2)
    5. The Urban Atlas (Geoland2)
    6. Elevation data:
       a. SRTM
       b. GDEM
       c. GTOPO30
       d. NEXTMap Europe
    7. MyOcean sea-level data
    8. Storm-surge locations
    9. European Environment Agency:
       a. Elevation breakdown, 1 km
       b. Corine Land Cover 2000 (CLC2000) coastline
       c. Sediment discharges
       d. Shoreline
       e. Maritime boundaries
       f. Hydrodynamics and sea-level rise
       g. Geomorphology, geology, erosion trends and coastal defence works
       h. Corine Land Cover 1990
       i. Five-metre elevation contour line
    10. FutureCoast
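    The deliverable specifies no algorithm for the integration. As a rough, assumption-laden sketch of the gridded approach, the following overlays three invented raster layers on a toy grid and derives a per-cell subsidence likelihood and a simple confidence measure; the layer names, weights and scoring are all hypothetical, not SubCoast's method.

        # Hypothetical sketch of a gridded weighted overlay; not SubCoast code.
        import numpy as np

        rng = np.random.default_rng(0)
        # Toy 1 km grid: one 0..1 value per cell for each input layer
        # (e.g. Terrafirma motion rates, geology susceptibility, low elevation).
        ground_motion = rng.random((4, 5))
        geology = rng.random((4, 5))
        low_elevation = rng.random((4, 5))

        # Weighted overlay turns the stacked layers into a single
        # "likelihood of subsidence" surface; the weights are invented.
        likelihood = 0.5 * ground_motion + 0.3 * geology + 0.2 * low_elevation

        # Naive confidence map: the fraction of layers with data in each cell.
        layers = np.stack([ground_motion, geology, low_elevation])
        confidence = (~np.isnan(layers)).sum(axis=0) / layers.shape[0]

        print(np.round(likelihood, 2))
        print(np.round(confidence, 2))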

    The course of lectures on the discipline “Intellectual Property” (for 5th-year students of the specialty 8.03060101 “Management”)

    Approved at the meeting of the Department of Management of Innovation Activity and Entrepreneurship, minutes No. 1 of 27 August 2015. Recommended by the methodological commission of the Faculty of Management and Business in Production of the Ternopil Ivan Puluj National Technical University (TNTU), minutes No. 6 of 26 February 2016. In accordance with the working programme, these methodological guidelines present the lecture material of the discipline “Intellectual Property” for foreign students of the specialty 8.03060101 “Management of Organizations and Administration”. The guidelines are intended to help foreign students study the course “Intellectual Property” and contain the general theoretical background required for the course. They are recommended for foreign students of the specialty 8.03060101 “Management of Organizations and Administration” with the aim of consolidating, deepening and generalizing the knowledge acquired during their studies and applying it to the comprehensive solution of a specific professional task in the discipline “Intellectual Property”. Compiled with regard to the working programme of the course, the methodological materials of other higher-education institutions, and the sources given in the recommended literature.

    The concept of record in experiential, interactive and dynamic environments: can the InterPARES project address the ultimate archival challenge?

    This paper discusses the concept of the electronic record as articulated and used in the context of the InterPARES Project, a multinational and multidisciplinary research project that aims at developing the theoretical and methodological knowledge essential to the long-term preservation of authentic records created and/or maintained in digital form. This knowledge should provide the basis from which to formulate model policies, strategies and standards capable of ensuring the longevity of such material and the ability of its users to trust its authenticity. InterPARES has developed in two phases, the first of which was concerned with electronic records created and/or maintained in databases and document management systems, and the second with electronic records existing in experiential, interactive and dynamic digital systems. The paper describes the characteristics, elements, attributes and components of electronic records and, in doing so, shows how the concept of record in the electronic environment is at once much more precise than in the traditional one and in constant evolution.
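    InterPARES defines these notions in prose rather than code, but a rough data-model sketch may help clarify how a record's identity attributes, documentary elements and digital components relate; every field name below is an invented illustration, not the project's actual model.

        # Hypothetical illustration; the field names are invented for clarity.
        from dataclasses import dataclass, field

        @dataclass
        class DigitalComponent:
            """A stored digital object (file, database row, stream) carrying
            part of a record; one record may span several components."""
            location: str
            format: str

        @dataclass
        class ElectronicRecord:
            # Identity attributes: who, what, when
            author: str
            title: str
            date: str
            # Documentary-form elements and the archival bond to other records
            intrinsic_elements: dict[str, str] = field(default_factory=dict)
            archival_bond: list[str] = field(default_factory=list)
            components: list[DigitalComponent] = field(default_factory=list)

            def is_complete(self) -> bool:
                # A record needs identity and at least one stored component.
                return bool(self.author and self.date and self.components)

        record = ElectronicRecord(
            author="Registry Office", title="Permit 42", date="2004-06-01",
            components=[DigitalComponent("db://permits/42", "XML")])
        print(record.is_complete())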

    Feedback-Based Gameplay Metrics and Gameplay Performance Segmentation: An audio-visual approach for assessing player experience.

    Gameplay metrics is an approach, growing in popularity within the game studies research community, for assessing players' engagement with game systems. Yet little has been done, to date, to quantify players' responses to the feedback games employ to convey information to players, i.e., their audio-visual streams. The present thesis introduces a novel approach to player experience assessment - termed feedback-based gameplay metrics - which seeks to gather gameplay metrics from the audio-visual feedback streams presented to the player during play. So far, gameplay metrics - quantitative data about a game state and the player's interaction with the game system - have been logged directly via the game's source code. The need to access source code restricts the range of games that researchers can analyse. By applying audio-visual processing algorithms from computer science that have yet to be employed on gameplay footage, the present thesis seeks to extract similar metrics from the audio-visual streams themselves, thus circumventing the need for source-code access, while also proposing a method that focuses on describing the way gameplay information is broadcast to the player during play. In order to operationalise feedback-based gameplay metrics, the present thesis introduces the concept of gameplay performance segmentation, which describes how coherent segments of play can be identified and extracted from lengthy play sessions. Moreover, in order both to contextualise the method for processing metrics and to provide a conceptual framework for analysing the results of a feedback-based gameplay metric segmentation, a multi-layered architecture based on five gameplay concepts (system, game world instance, spatial-temporal, degree of freedom and interaction) is also introduced. Finally, based on data gathered from play sessions with participants, the present thesis discusses the validity of feedback-based gameplay metrics, gameplay performance segmentation and the multi-layered architecture. A software system has also been developed specifically to produce gameplay summaries based on feedback-based gameplay metrics, and examples of summaries (based on several games) are presented and analysed. The present thesis also demonstrates that feedback-based gameplay metrics can be analysed jointly with other forms of data (such as biometry) in order to build a more complete picture of the gameplay experience. Feedback-based gameplay metrics constitute a post-processing approach that allows the researcher or analyst to explore the data however, and as many times as, they wish. The method can process any audio-visual file and can therefore handle material from a range of audio-visual sources. This novel methodology brings together game studies and computer science: it extends the range of games that can be researched while also providing a viable account of the exact way players experience games.
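    The thesis's own software is not reproduced here. As a minimal sketch of the underlying idea, the following flags candidate segment boundaries in a recorded gameplay video wherever the visual feedback stream changes sharply, using simple frame differencing; OpenCV, the threshold value and the file name "gameplay.mp4" are all assumptions, not the author's implementation.

        # Minimal sketch: segment boundaries from visual-feedback changes.
        import cv2
        import numpy as np

        def segment_boundaries(path: str, threshold: float = 30.0) -> list[int]:
            """Return frame indices where mean inter-frame difference spikes."""
            cap = cv2.VideoCapture(path)
            boundaries, prev, i = [], None, 0
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                if prev is not None:
                    # Mean absolute pixel change between consecutive frames
                    if float(np.mean(cv2.absdiff(gray, prev))) > threshold:
                        boundaries.append(i)
                prev, i = gray, i + 1
            cap.release()
            return boundaries

        print(segment_boundaries("gameplay.mp4"))  # placeholder file name

    A real pipeline in this spirit would treat the audio track and richer visual cues the same way and then group the flagged frames into coherent play segments.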

    A Fuzzy Logic Based Novel Signature Verification System on Bank Cheque with Fractal Dimensions and Connected Components

    A signature plays an authorization role in almost every document, so proper care should be taken to verify the genuineness of signatures on legal documents. Signature verification schemes can be online or offline, depending on the acquisition type. A novel method for offline signature verification on bank cheques is proposed. It is found that using fractal dimensions for verification improves the accuracy rate. The fundamentals of the offline signature verification process are also discussed. The proposed system uses connected-components labelling, fractal dimensions and fuzzy logic for signature verification. The signature is scanned and preprocessed. Using connected-components labelling, the signature is split into regions, and each region is labelled uniquely. Feature values for each labelled region are extracted and normalised, and the fractal dimensions of the signature images are calculated. The extracted feature values and fractal dimensions are compared with the feature values of the sample signatures to assess genuineness. The fuzzy classifier separates genuine and forged signatures correctly to the fullest extent possible. Some signatures may contain more noise or be too complex for the system to identify or classify; those signatures may need manual intervention. The proposed verification system shows very good results with good sensitivity and specificity, with a maximum accuracy of 50%.
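    As a hedged sketch of two of the building blocks the abstract names, the following labels connected components and estimates a box-counting fractal dimension on a toy binary "signature" image; it is not the authors' system, the box sizes are arbitrary, and the fuzzy classification stage is omitted.

        # Sketch of connected-component labelling + box-counting dimension.
        import numpy as np
        from scipy import ndimage

        def box_counting_dimension(binary: np.ndarray) -> float:
            """Estimate the fractal (box-counting) dimension of a binary image."""
            sizes = [2, 4, 8, 16]
            counts = []
            for s in sizes:
                # Count s x s boxes containing at least one foreground pixel.
                h, w = binary.shape
                view = binary[:h - h % s, :w - w % s]
                boxes = view.reshape(h // s, s, w // s, s).any(axis=(1, 3))
                counts.append(boxes.sum())
            # Slope of log(count) vs log(1/size) estimates the dimension.
            coeffs = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
            return float(coeffs[0])

        # Toy "signature": a diagonal stroke on a blank page.
        img = np.zeros((64, 64), dtype=bool)
        np.fill_diagonal(img, True)

        labels, n_regions = ndimage.label(img)  # connected-component labelling
        print("regions:", n_regions)
        print("fractal dimension:", round(box_counting_dimension(img), 2))

    In a full system, per-region feature values and the dimension estimate would then be normalised and fed to the fuzzy classification stage.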

    Digital Image Access & Retrieval

    The 33rd Annual Clinic on Library Applications of Data Processing, held at the University of Illinois at Urbana-Champaign in March 1996, addressed the theme of "Digital Image Access & Retrieval." The papers from this conference cover a wide range of topics concerning digital imaging technology for visual resource collections. The papers fell into three general areas: (1) systems, planning, and implementation; (2) automatic and semi-automatic indexing; and (3) preservation, with the bulk of the conference focusing on indexing and retrieval.

    The ‘PAThs’ Project: An Effort to Represent the Physical Dimension of Coptic Literary Production (Third–Eleventh centuries)

    PAThs – Tracking Papyrus and Parchment Paths: An Archaeological Atlas of Coptic Literature. Literary Texts in their Geographical Context. Production, Copying, Usage, Dissemination and Storage is an ambitious digital project, based in Rome, working towards a new historical and archaeological geography of the Coptic literary tradition. This aim entails a number of auxiliary tasks and challenges, including the classification of authors, works, titles, colophons and codicological units, as well as the study and, wherever possible, the exact mapping of the relevant geographical sites related to the production, circulation and storage of manuscripts.