    Empirical analysis of impacts of instance-driven changes in ontologies

    Changes in the characterization of instances in digital content are one of the rationales for changing or evolving the ontologies that support a domain. Such changes can impact one or more interrelated ontologies. Before implementing changes, their impact on the target ontology, other dependent ontologies, and dependent systems should be analysed. We investigate three concerns in determining the impacts of changes in ontologies: representation of changes to ensure minimum impact, impact determination, and integrity determination. Key elements of our solution are the operationalization of change operations to minimize impacts, a parameterization approach for the determination of impacts, a categorization scheme for identified impacts, and a prioritization technique for change operations based on the severity of impacts.
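
    The abstract names four mechanisms: operationalized change operations, parameterized impact determination, impact categorization, and severity-based prioritization. The sketch below is one plausible way to wire these together; the operation names, severity weights, and impact categories are invented for illustration and are not the authors' actual scheme.

```python
# A minimal sketch under assumed names: ChangeOp, Impact, and the severity
# values are hypothetical, not taken from the paper.
from dataclasses import dataclass
from enum import Enum


class Impact(Enum):
    NONE = 0       # no observable effect
    LOCAL = 1      # confined to the target ontology
    DEPENDENT = 2  # propagates to dependent ontologies or systems


@dataclass
class ChangeOp:
    name: str      # e.g. "rename_class", "delete_property"
    target: str    # the ontology entity being changed
    severity: int  # parameterized severity estimate (higher = worse)


def categorize(op: ChangeOp, depended_on: set[str]) -> Impact:
    """Categorize an operation's impact given which entities others depend on."""
    if op.severity == 0:
        return Impact.NONE
    return Impact.DEPENDENT if op.target in depended_on else Impact.LOCAL


def prioritize(ops: list[ChangeOp]) -> list[ChangeOp]:
    """Order change operations so the most severe are reviewed first."""
    return sorted(ops, key=lambda o: o.severity, reverse=True)


deps = {"Person"}  # entities that dependent ontologies reference
ops = [ChangeOp("rename_class", "Person", 3),
       ChangeOp("add_comment", "Address", 0)]
for op in prioritize(ops):
    print(op.name, categorize(op, deps).name)
```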

    Archiving the Relaxed Consistency Web

    The historical, cultural, and intellectual importance of archiving the web has been widely recognized. Today, all countries with a high Internet penetration rate have established high-profile archiving initiatives to crawl and archive fast-disappearing web content for long-term use. As web technologies evolve, established web archiving techniques face challenges. This paper focuses on the potential impact of relaxed consistency web design on crawler-driven web archiving. Relaxed consistency websites may disseminate, albeit ephemerally, inaccurate and even contradictory information. If captured and preserved in web archives as historical records, such information will degrade the overall archival quality. To assess the extent of such quality degradation, we build a simplified feed-following application and simulate its operation with synthetic workloads. The results indicate that a non-trivial portion of a relaxed consistency web archive may contain observable inconsistency, and the inconsistency window may extend significantly longer than that observed at the data store. We discuss the nature of such quality degradation and propose a few possible remedies.
    Comment: 10 pages, 6 figures, CIKM 201
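
    As a rough intuition for the experiment described above, the toy simulation below (not the authors' simulator) models a data store whose follower feed applies writes after a random replication lag; a crawler that snapshots the feed can then record state that disagrees with the store. All parameters are made up for illustration.

```python
# Toy model: writes hit the store immediately but reach the follower's
# feed only after a random lag, so periodic crawls can capture stale data.
import random

random.seed(0)
store, feed = {}, {}   # authoritative data store vs. the follower's feed
queue = []             # (apply_time, key, value): writes awaiting propagation
inconsistent = 0
STEPS, CRAWL_EVERY = 1000, 5

for t in range(STEPS):
    key = random.randrange(10)
    store[key] = t                                      # write is immediate
    queue.append((t + random.randint(1, 20), key, t))   # feed lags behind

    # apply all propagations that are now due
    due = [e for e in queue if e[0] <= t]
    queue = [e for e in queue if e[0] > t]
    for _, k, v in due:
        feed[k] = max(feed.get(k, -1), v)  # never apply an older write

    # the crawler snapshots the feed and compares it with the store
    if t % CRAWL_EVERY == 0 and any(feed.get(k) != store[k] for k in store):
        inconsistent += 1

print(f"{inconsistent}/{STEPS // CRAWL_EVERY} snapshots observed inconsistency")
```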

    Basis Token Consistency: A Practical Mechanism for Strong Web Cache Consistency

    With web caching and cache-related services like CDNs and edge services playing an increasingly significant role in the modern internet, the problem of the weak consistency and coherence provisions in current web protocols is becoming increasingly significant and drawing the attention of the standards community [LCD01]. Toward this end, we present definitions of consistency and coherence for web-like environments, that is, distributed client-server information systems where the semantics of interactions with resources are more general than the read/write operations found in memory hierarchies and distributed file systems. We then present a brief review of proposed mechanisms which strengthen the consistency of caches in the web, focusing upon their conceptual contributions and their weaknesses in real-world practice. These insights motivate a new mechanism, which we call "Basis Token Consistency" or BTC; when implemented at the server, this mechanism allows any client (independent of the presence and conformity of any intermediaries) to maintain a self-consistent view of the server's state. This is accomplished by annotating responses with additional per-resource application information which allows client caches to recognize the obsolescence of currently cached entities and to identify responses from other caches that are already stale in light of what has previously been seen. The mechanism requires no deviation from the existing client-server communication model, and does not require servers to maintain any additional per-client state. We discuss how our mechanism could be integrated into a fragment-assembling Content Management System (CMS), and present a simulation-driven performance comparison between the BTC algorithm and the use of the Time-To-Live (TTL) heuristic.
    National Science Foundation (ANI-9986397, ANI-0095988)
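
    One way to read the BTC idea is that every response carries (resource, version) tokens, and a cache entry becomes unusable once the client has seen a newer token for any resource the entry depends on. The sketch below encodes that reading; the token format and the staleness rule are simplified assumptions, not the protocol's actual wire format.

```python
# Interpretive sketch of basis tokens: responses are annotated with
# per-resource version tokens, and the cache drops entries whose tokens
# are older than anything already observed.
class BTCCache:
    def __init__(self):
        self.entries = {}  # url -> (body, {resource: version})
        self.latest = {}   # highest version seen so far, per resource

    def observe(self, tokens: dict[str, int]) -> None:
        """Record tokens from any response, even one served by another cache."""
        for res, ver in tokens.items():
            self.latest[res] = max(self.latest.get(res, 0), ver)

    def store(self, url: str, body: str, tokens: dict[str, int]) -> None:
        self.observe(tokens)
        self.entries[url] = (body, tokens)

    def get(self, url: str):
        """Return a cached body only if none of its tokens are obsolete."""
        if url not in self.entries:
            return None
        body, tokens = self.entries[url]
        if any(self.latest.get(r, 0) > v for r, v in tokens.items()):
            del self.entries[url]  # self-consistency: discard the stale entry
            return None
        return body


cache = BTCCache()
cache.store("/page", "v1 of page", {"article:42": 1})
cache.observe({"article:42": 2})   # a newer token arrives on another response
assert cache.get("/page") is None  # cached copy is recognized as stale
```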

    eWOM: the effects of online consumer reviews on purchasing decision of electronic goods

    The Internet has become the primary source of information for a large number of consumers, and it has dramatically changed consumer behaviour. One of the main changes in modern consumer behaviour has been the transition from a passive to an active and informed consumer. The Internet enables customers to share their opinions on, and experiences with, goods and services with a multitude of other consumers. Online consumer reviews are used by prospective buyers who are interested in obtaining more information from people who have purchased and used a product of interest. Word-of-mouth (WOM) is one of the most important information sources when a consumer is making a purchase decision. The arrival and expansion of the Internet has extended consumers' options for gathering product information to include other consumers' comments posted online, and has given consumers opportunities to offer their own consumption-related advice by engaging in electronic word-of-mouth (eWOM). eWOM can be defined as all informal communications directed at consumers through Internet-based technology related to the usage or characteristics of particular goods and services, or their sellers. The aim of this study is to assess the impact of one type of eWOM, the online consumer review, on purchasing decisions for electronic products. This empirical study also focuses on the relationship between reviews and purchasing behaviour. An instrument was prepared to measure the proposed constructs, with questionnaire items taken from prior studies but adapted to fit the context of e-commerce. The survey was administered over the Internet to academics in Turkey, and the data were analyzed using the SPSS package. The results show that consumer reviews have a causal impact on consumer purchasing behaviour and influence which products consumers choose. Finally, the results and their implications are discussed.

    The systematic guideline review: method, rationale, and test on chronic heart failure

    Background: Evidence-based guidelines have the potential to improve healthcare. However, their de novo development requires substantial resources, especially for complex conditions, and adaptation may be biased by contextually influenced recommendations in source guidelines. In this paper we describe a new approach to guideline development, the systematic guideline review (SGR) method, and its application in the development of an evidence-based guideline for family physicians on chronic heart failure (CHF). Methods: A systematic search for guidelines was carried out. Evidence-based guidelines on CHF management in adults in ambulatory care, published in English or German between 2000 and 2004, were included. Guidelines on acute or right heart failure were excluded. Eligibility was assessed by two reviewers, the methodological quality of selected guidelines was appraised using the AGREE instrument, and a framework of relevant clinical questions for diagnostics and treatment was derived. Data were extracted into evidence tables, systematically compared by means of a consistency analysis, and synthesized in a preliminary draft. The most relevant primary sources were re-assessed to verify the cited evidence. Evidence and recommendations were summarized in a draft guideline. Results: Of the 16 included guidelines, five were of good quality. A total of 35 recommendations were systematically compared: 25/35 were consistent, 9/35 inconsistent, and 1/35 un-rateable (derived from a single guideline). Of the 25 consistencies, 14 were based on consensus, seven on evidence, and four differed in grading. Major inconsistencies were found in 3/9 of the inconsistent recommendations. We re-evaluated the evidence for 17 recommendations (evidence-based, differing evidence levels and minor inconsistencies); the majority was congruent. Incongruity was found where the stated evidence could not be verified in the cited primary sources, or where the evaluation in the source guidelines focused on treatment benefits and underestimated the risks. The draft guideline was completed in 8.5 man-months. The main limitation of this study was the lack of a second reviewer. Conclusion: The systematic guideline review, including framework development, consistency analysis, and validation, is an effective, valid, and resource-saving approach to the development of evidence-based guidelines.
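
    The consistency analysis amounts to bookkeeping over the recommendations extracted from each source guideline. The toy sketch below shows one way such a tally could work; the clinical questions, grades, and agreement rule are invented and far simpler than the paper's actual comparison.

```python
# Hypothetical tally: a question is "consistent" when all source guidelines
# give the same grade, and "un-rateable" when only one guideline covers it.
from collections import Counter

# clinical question -> {guideline id: recommendation grade}
extracted = {
    "ACE inhibitor in systolic CHF": {"G1": "A", "G2": "A", "G3": "A"},
    "routine echo follow-up":        {"G1": "B", "G2": "C"},
    "exercise training":             {"G1": "A"},
}

tally = Counter()
for question, recs in extracted.items():
    if len(recs) < 2:
        tally["un-rateable"] += 1  # derived from a single guideline
    elif len(set(recs.values())) == 1:
        tally["consistent"] += 1
    else:
        tally["inconsistent"] += 1

print(dict(tally))  # {'consistent': 1, 'inconsistent': 1, 'un-rateable': 1}
```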

    Ontology-based domain modelling for consistent content change management

    Ontology-based modelling of multi-formatted software application content is a challenging area in content management. When the number of software content units is huge and those units are in a continuous process of change, content change management becomes important. The management of content in this context requires targeted access and manipulation methods. We present a novel approach to deal with model-driven content-centric information systems and access to their content. At the core of our approach is an ontology-based semantic annotation technique for diversely formatted content that can improve the accuracy of access and systems evolution. Domain ontologies represent domain-specific concepts and conform to metamodels. Different ontologies, from application domain ontologies to software ontologies, capture and model the different properties of, and perspectives on, a software content unit. Interdependencies between the domain ontologies, the artifacts, and the content are captured through a trace model. The annotation traces are formalised, and a graph-based system is selected for the representation of the annotation traces.
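
    A trace model of this kind can be approximated by a simple graph linking ontology concepts to the content units annotated with them, so that the content affected by a concept change is found by traversal. The sketch below illustrates the idea; the concept and content identifiers are hypothetical.

```python
# Minimal graph of annotation traces: concept -> annotated content units.
from collections import defaultdict

traces = defaultdict(set)

def annotate(concept: str, content_unit: str) -> None:
    """Record an annotation trace from a concept to a content unit."""
    traces[concept].add(content_unit)

def impacted_content(changed_concept: str) -> set[str]:
    """Content units whose annotations reference the changed concept."""
    return traces.get(changed_concept, set())

annotate("domain:Invoice", "doc/billing.html")
annotate("domain:Invoice", "src/InvoiceService.java")
annotate("sw:RESTEndpoint", "src/InvoiceService.java")
print(impacted_content("domain:Invoice"))
# {'doc/billing.html', 'src/InvoiceService.java'}
```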

    Online service delivery models: an international comparison in the public sector

    Governments around the world face the challenge of responding to increased expectations from their customers with regard to public service delivery. Citizens, for example, expect governments to provide better and more efficient electronic services on the Web in an integrated way. Online portals have become the approach of choice in online service delivery to meet these requirements and to become more customer-focussed. This study describes and analyses existing variants of online service delivery models based upon an empirical study, and provides valuable insights for researchers and practitioners in government. For this study, we conducted interviews with senior management representatives from five international governments. Based on our findings, we distinguish three classes of service delivery models. We describe and characterise each of these models in detail and provide an in-depth discussion of the strengths and weaknesses of the approaches.

    The assessment of usability of electronic shopping: A heuristic evaluation

    Today there are thousands of electronic shops accessible via the Web. Some provide user-friendly features, whilst others seem not to consider usability factors at all. Yet it is critical that the electronic shopping interface be user-friendly so as to help users obtain their desired results. This study applied heuristic evaluation to examine the usability of current electronic shopping. In particular, it focused on four UK-based supermarkets offering electronic services: ASDA, Iceland, Sainsbury, and Tesco. The evaluation consisted of two stages: a free-flow inspection and a task-based inspection. The results indicate that the most significant and common usability problems lie within the areas of ‘User Control and Freedom’ and ‘Help and Documentation’. The findings of this study are applied to develop a set of usability guidelines to support the future design of effective interfaces for electronic shopping.

    Evaluation of e-learning web sites using fuzzy axiomatic design based approach

    A high-quality web site has been generally recognized as a critical enabler of online business. Numerous studies in the literature measure business performance in relation to web site quality. In this paper, an axiomatic design based approach for fuzzy group decision making is adopted to evaluate the quality of e-learning web sites. Another multi-criteria decision making technique, namely fuzzy TOPSIS, is applied in order to validate the outcome. The methodology proposed in this paper has the advantage of incorporating requirements and enabling reductions in the problem size, as compared to fuzzy TOPSIS. A case study focusing on Turkish e-learning web sites is presented, and based on the empirical findings, managerial implications and recommendations for future research are offered.
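
    In axiomatic design, an alternative's information content for a criterion is I = log2(system range / common range), and the preferred alternative minimizes total I. The sketch below illustrates that scoring rule with crisp intervals rather than the paper's triangular fuzzy numbers; the criteria, ranges, and site names are invented.

```python
# Crisp approximation of axiomatic-design scoring: the design range is what
# evaluators require, the system range is what a site delivers, and
# I = log2(system range / common range) penalizes poor overlap.
import math

def info_content(design: tuple[float, float], system: tuple[float, float]) -> float:
    lo, hi = max(design[0], system[0]), min(design[1], system[1])
    common = max(0.0, hi - lo)
    if common == 0:
        return math.inf  # the requirement cannot be met at all
    return math.log2((system[1] - system[0]) / common)

design_ranges = {"usability": (7, 10), "content": (6, 10)}
sites = {
    "siteA": {"usability": (6, 9), "content": (8, 10)},
    "siteB": {"usability": (4, 7), "content": (5, 9)},
}
scores = {s: sum(info_content(design_ranges[c], r) for c, r in crit.items())
          for s, crit in sites.items()}
print(min(scores, key=scores.get), scores)  # lowest total I wins
```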

    PriPeARL: A Framework for Privacy-Preserving Analytics and Reporting at LinkedIn

    Preserving privacy of users is a key requirement of web-scale analytics and reporting applications, and has witnessed a renewed focus in light of recent data breaches and new regulations such as GDPR. We focus on the problem of computing robust, reliable analytics in a privacy-preserving manner, while satisfying product requirements. We present PriPeARL, a framework for privacy-preserving analytics and reporting, inspired by differential privacy. We describe the overall design and architecture, and the key modeling components, focusing on the unique challenges associated with privacy, coverage, utility, and consistency. We perform an experimental study in the context of ads analytics and reporting at LinkedIn, thereby demonstrating the tradeoffs between privacy and utility needs, and the applicability of privacy-preserving mechanisms to real-world data. We also highlight the lessons learned from the production deployment of our system at LinkedIn.
    Comment: Conference information: ACM International Conference on Information and Knowledge Management (CIKM 2018)
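
    At its core, differential-privacy-style reporting adds calibrated Laplace noise to each true count; to keep repeated queries consistent, the noise can be seeded deterministically from the query. The sketch below illustrates that general pattern; the parameter values, query key format, and seeding scheme are assumptions for illustration, not PriPeARL's actual design.

```python
# Laplace noise on a count, seeded from the query so the same query always
# returns the same noisy answer (consistency across repeated requests).
import hashlib
import random

EPSILON = 1.0      # per-query privacy budget (assumed)
SENSITIVITY = 1.0  # one user changes a count by at most 1

def private_count(true_count: int, query_key: str) -> int:
    # deterministic seed derived from the query key
    seed = int.from_bytes(hashlib.sha256(query_key.encode()).digest()[:8], "big")
    rng = random.Random(seed)
    # difference of two Exp(eps/sens) draws is Laplace(0, sens/eps)
    noise = (rng.expovariate(EPSILON / SENSITIVITY)
             - rng.expovariate(EPSILON / SENSITIVITY))
    return max(0, round(true_count + noise))  # counts cannot be negative

print(private_count(130, "ad=123|demographic=title:engineer"))
print(private_count(130, "ad=123|demographic=title:engineer"))  # identical
```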