
    Improving the evaluation of web search systems

    Linkage analysis as an aid to web search has been assumed to be of significant benefit, and we know that it is being implemented by many major search engines. Why, then, have few TREC participants been able to scientifically demonstrate the benefits of linkage analysis over the past three years? In this paper we put forward reasons why disappointing results have been found, and we identify the linkage density a dataset requires to faithfully support experiments into linkage analysis. We also report a series of linkage-based retrieval experiments on a more densely linked dataset culled from the TREC web documents.
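    The linkage density the abstract refers to can be measured as the average number of links per document whose targets also lie inside the collection. A minimal sketch, with an invented toy link graph (the document IDs and graph are illustrative, not taken from the TREC data):

    ```python
    def linkage_density(outlinks, collection):
        """Average number of links per document whose target is also in the collection."""
        docs = set(collection)
        internal = sum(
            1
            for src, targets in outlinks.items() if src in docs
            for t in targets if t in docs
        )
        return internal / len(docs) if docs else 0.0

    # Toy graph: links to pages outside the collection do not count.
    graph = {
        "doc1": ["doc2", "doc3", "external.example/x"],
        "doc2": ["doc1"],
        "doc3": [],
    }
    print(linkage_density(graph, ["doc1", "doc2", "doc3"]))  # 3 internal links / 3 docs = 1.0
    ```

    A collection whose density is too low (most links pointing outside it) cannot meaningfully reward link-based ranking, which is the diagnosis the paper offers for the disappointing TREC results.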

    DMP online: the Digital Curation Centre’s web-based tool for creating, maintaining and exporting data management plans

    Funding bodies increasingly require researchers to produce Data Management Plans (DMPs). The Digital Curation Centre (DCC) has created DMP Online, a web-based tool which draws upon an analysis of funders' requirements to enable researchers to create and export customisable DMPs, both at the grant application stage and during the project's lifetime.

    Complete instrumentation requirements for performance analysis of web based technologies

    In this paper we present the eDragon environment, a research platform created to perform complete performance analysis of new Web-based technologies. eDragon enables an understanding of how application servers work on both sequential and parallel platforms, offering new insight into the usage of system resources. The environment is composed of a set of instrumentation modules, a performance analysis and visualization tool, and a set of experimental methodologies for complete performance analysis of Web-based technologies. This paper describes the design and implementation of this research platform and highlights some of its main functionalities. We also show how a detailed analytical view can be obtained through the application of a bottom-up strategy, starting with a group of system events and advancing to more complex performance metrics using a continuous derivation process.

    We acknowledge the European Center for Parallelism of Barcelona (CEPBA) and the CEPBA-IBM Research Institute (CIRI) for supplying the computing resources for our experiments. This work is supported by the Ministry of Science and Technology of Spain and the European Union (FEDER funds) under contract TIC2001-0995-C02-01 and by the Direcció General de Recerca of the Generalitat de Catalunya under grant 2001FI 00694 UPC APTIND.
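    The bottom-up strategy the abstract describes, deriving higher-level metrics from raw system events, can be illustrated with a small sketch. This is not eDragon's actual API; the event format and metric names are invented for illustration:

    ```python
    # Illustrative bottom-up derivation: raw (request_id, kind, timestamp) events
    # are paired into per-request latencies, then aggregated into summary metrics.
    def derive_metrics(events):
        starts, latencies = {}, []
        for rid, kind, ts in sorted(events, key=lambda e: e[2]):
            if kind == "start":
                starts[rid] = ts
            elif kind == "end" and rid in starts:
                latencies.append(ts - starts.pop(rid))
        if not latencies:
            return {"avg_latency": 0.0, "throughput": 0.0}
        span = max(ts for _, _, ts in events) - min(ts for _, _, ts in events)
        return {
            "avg_latency": sum(latencies) / len(latencies),
            "throughput": len(latencies) / span if span else float(len(latencies)),
        }

    events = [
        ("r1", "start", 0.0), ("r1", "end", 0.2),
        ("r2", "start", 0.1), ("r2", "end", 0.5),
    ]
    print(derive_metrics(events))  # avg_latency ≈ 0.3, throughput 4.0
    ```

    Each derivation step consumes only the layer below it, which is what makes the continuous, layered refinement from events to metrics possible.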

    Analyzing the BrowserID SSO System with Primary Identity Providers Using an Expressive Model of the Web

    BrowserID is a complex, real-world Single Sign-On (SSO) system for web applications recently developed by Mozilla. It employs new HTML5 features (such as web messaging and web storage) and cryptographic assertions to provide decentralized login, with the intent to respect users' privacy. It can operate in a primary and a secondary identity provider mode. While in the primary mode BrowserID runs with arbitrary identity providers (IdPs), in the secondary mode there is one IdP only, namely Mozilla's default IdP. We recently proposed an expressive general model for the web infrastructure and, based on this web model, analyzed the security of the secondary IdP mode of BrowserID. The analysis revealed several severe vulnerabilities. In this paper, we complement our prior work by analyzing the even more complex primary IdP mode of BrowserID. We study not only authentication properties, as before, but also privacy properties. During our analysis we discovered new and practical attacks that do not apply to the secondary mode: an identity injection attack, which violates a central authentication property of SSO systems, and attacks that break an important privacy promise of BrowserID and which do not seem to be fixable without a major redesign of the system. Some of our attacks on privacy make use of a browser side channel that has not gained a lot of attention so far. For the authentication bug, we propose a fix and formally prove, in a slight extension of our general web model, that the fixed system satisfies all the requirements we consider. This constitutes the most complex formal analysis of a web application based on an expressive model of the web infrastructure so far. As another contribution, we identify and prove important security properties of generic web features in the extended web model to facilitate future analysis efforts of web standards and web applications.

    Comment: arXiv admin note: substantial text overlap with arXiv:1403.186

    Goal-driven requirements analysis for hypermedia-intensive Web applications

    Requirements analysis for Web applications still needs to employ effective RE practices to accommodate some distinctive aspects: capturing high-level communication goals, considering several user profiles, defining hypermedia-specific requirements, bridging the gap between requirements and Web design, and reusing requirements for an effective usability evaluation. Techniques should be usable, informal, require little training effort, and show relative advantage to project managers. On the basis of the i* framework, this paper presents a proposal for defining hypermedia requirements (concerning aspects such as content, interaction, navigation, and presentation) for Web applications. The model adopts a goal-driven approach coupled with scenario-based techniques, introduces a hypermedia requirement taxonomy to facilitate Web conceptual design, and paves the way for systematic usability evaluation. Particular attention is paid to the empirical validation of the model based on the perceived quality attributes theory. A case study developed with industrial partners is discussed.

    Mastering the requirements analysis for communication-intensive websites

    Web application development still needs effective methods to accommodate some distinctive aspects of the requirements analysis process: capturing high-level communication goals, considering several user profiles and stakeholders, defining hypermedia-specific requirements (concerning navigation, content, information structure, and presentation), and reusing requirements for an effective usability evaluation. Techniques should be usable by both stakeholders and the design team, require little training effort, and show relative advantage to project managers. Over the last few years, requirements methodologies applied to web-based applications have considered mainly the transactional and operational aspects typical of traditional information systems; the communicational aspects of websites have been neglected by systematic requirements methods. This thesis, starting from key achievements in Requirements Engineering (hereafter RE), introduces AWARE (Analysis of Web Application Requirements), a model for defining and analyzing requirements for web applications conceived mainly as strategic communication means for an institution or organization. The model extends traditional goal- and scenario-based approaches for refining high-level goals into website requirements by introducing the analysis of ill-defined user goals and stakeholder communication goals, along with a hypermedia requirement taxonomy that facilitates web conceptual design and paves the way for systematic usability evaluation. AWARE comprises a conceptual toolkit and a notation for effective requirements documentation, which together support the elicitation, negotiation, analysis, and validation of requirements from the relevant stakeholders (users included). The empirical validation of the model is carried out in two ways. Firstly, the model has been employed in web projects in the field; these case studies and the lessons learnt are presented and discussed to assess the advantages and limits of the proposal. Secondly, a sample of web analysts and designers has been asked to study and apply the model; the feedback gathered is positive and encouraging for further improvement.

    Exploring The Responsibilities Of Single-Inhabitant Smart Homes With Use Cases

    DOI: 10.3233/AIS-2010-0076

    This paper makes a number of contributions to the field of requirements analysis for Smart Homes. It introduces Use Cases as a tool for exploring the responsibilities of Smart Homes, and it proposes a modification of the conventional Use Case structure to suit the particular requirements of Smart Homes. It presents a taxonomy of Smart-Home-related Use Cases with seven categories. It draws on those Use Cases as raw material for developing questions and conclusions about the design of Smart Homes for single elderly inhabitants, and it introduces the SHMUC repository, a web-based repository of Use Cases related to Smart Homes that anyone can exploit and to which anyone may contribute.

    A Requirement-centric Approach to Web Service Modeling, Discovery, and Selection

    Service-Oriented Computing (SOC) has gained considerable popularity for implementing Service-Based Applications (SBAs) in a flexible and effective manner. The basic idea of SOC is to understand users' requirements for SBAs first, and then discover and select relevant services (i.e., those that closely fit functional requirements) and offer a high Quality of Service (QoS). Understanding users' requirements is already achieved by existing requirement engineering approaches (e.g., TROPOS, KAOS, and MAP) which model SBAs in a requirement-driven manner. However, discovering and selecting relevant and high-QoS services are still challenging tasks that require time and effort due to the increasing number of available Web services. In this paper, we propose a requirement-centric approach which allows: (i) modeling users' requirements for SBAs with the MAP formalism and specifying required services using an Intentional Service Model (ISM); (ii) discovering services by querying the Web service search engine Service-Finder and using keywords extracted from the specifications provided by the ISM; and (iii) selecting automatically relevant and high-QoS services by applying Formal Concept Analysis (FCA). We validate our approach by performing experiments on an e-books application. The experimental results show that our approach allows the selection of relevant and high-QoS services with a high accuracy (the average precision is 89.41%) and efficiency (the average recall is 95.43%).
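    The FCA-based selection step can be sketched with the two basic derivation operations of Formal Concept Analysis: objects (services) map to attributes (features and QoS levels), and a query's extent is the set of services carrying every required attribute. The service names and attributes below are invented, not from the paper's e-books case study:

    ```python
    def common_attrs(objs, context):
        """Attributes shared by all given objects (the FCA derivation operator)."""
        attrs = None
        for o in objs:
            attrs = context[o] if attrs is None else attrs & context[o]
        return attrs or set()

    def select(context, required):
        """Extent of the query: services whose attribute set includes all required ones."""
        return {s for s, attrs in context.items() if required <= attrs}

    # Toy formal context: service -> set of functional/QoS attributes.
    services = {
        "BookSearchA": {"search", "isbn-lookup", "high-availability"},
        "BookSearchB": {"search", "isbn-lookup"},
        "PaymentX":    {"payment", "high-availability"},
    }
    print(sorted(select(services, {"search", "high-availability"})))  # ['BookSearchA']
    ```

    Pairing the two operators yields the formal concepts of the context; selection then amounts to reading off the extent of the concept generated by the required attributes.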