    DMP online: the Digital Curation Centre’s web-based tool for creating, maintaining and exporting data management plans

    Funding bodies increasingly require researchers to produce Data Management Plans (DMPs). The Digital Curation Centre (DCC) has created DMP Online, a web-based tool which draws upon an analysis of funders’ requirements to enable researchers to create and export customisable DMPs, both at the grant application stage and during the project’s lifetime

    Experiences in deploying metadata analysis tools for institutional repositories

    Current institutional repository software provides few tools to help metadata librarians understand and analyze their collections. In this article, we compare and contrast metadata analysis tools that were developed simultaneously, but independently, at two New Zealand institutions during a period of national investment in research repositories: the Metadata Analysis Tool (MAT) at The University of Waikato, and the Kiwi Research Information Service (KRIS) at the National Library of New Zealand. The tools have many similarities: they are convenient, online, on-demand services that harvest metadata using OAI-PMH; they were developed in response to feedback from repository administrators; and they both help pinpoint specific metadata errors as well as generate summary statistics. They also have significant differences: one is a dedicated tool whereas the other is part of a wider access tool; one gives a holistic view of the metadata whereas the other looks for specific problems; one seeks patterns in the data values whereas the other checks that those values conform to metadata standards. Both tools complement existing Web-based administration tools. We have observed that discovery and correction of metadata errors can be quickly achieved by switching Web browser views from the analysis tool to the repository interface, and back. We summarize the findings from both tools' deployment into a checklist of requirements for metadata analysis tools
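    As a rough illustration of the harvesting step both tools share, the following sketch pulls Dublin Core records from a repository's OAI-PMH endpoint and flags records that lack common fields. The endpoint URL and the particular fields checked are illustrative assumptions; neither MAT's nor KRIS's actual rule sets are reproduced here.

import requests
import xml.etree.ElementTree as ET

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

def harvest_and_check(base_url):
    """Fetch one page of Dublin Core records via OAI-PMH and flag missing fields.
    (Resumption tokens are ignored, so only the first batch is examined.)"""
    resp = requests.get(base_url, params={"verb": "ListRecords",
                                          "metadataPrefix": "oai_dc"})
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    problems = []
    for record in root.iter(OAI + "record"):
        identifier = record.findtext(OAI + "header/" + OAI + "identifier",
                                     default="(no identifier)")
        # Illustrative checks only: flag records missing a few core DC fields.
        for field in ("title", "creator", "date"):
            if record.find(".//" + DC + field) is None:
                problems.append((identifier, "missing dc:" + field))
    return problems

# Example against a hypothetical repository endpoint:
# for ident, issue in harvest_and_check("https://repository.example.org/oai"):
#     print(ident, issue)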

    Exploring The Responsibilities Of Single-Inhabitant Smart Homes With Use Cases

    DOI: 10.3233/AIS-2010-0076
    This paper makes a number of contributions to the field of requirements analysis for Smart Homes. It introduces Use Cases as a tool for exploring the responsibilities of Smart Homes and it proposes a modification of the conventional Use Case structure to suit the particular requirements of Smart Homes. It presents a taxonomy of Smart-Home-related Use Cases with seven categories. It draws on those Use Cases as raw material for developing questions and conclusions about the design of Smart Homes for single elderly inhabitants, and it introduces the SHMUC repository, a web-based repository of Use Cases related to Smart Homes that anyone can exploit and to which anyone may contribute

    State Analysis Database Tool

    The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture. A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are kept consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions
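    As an illustration only, the sketch below shows one way states, models, and goal-based constraints could be represented so that a simple consistency check can be run over them; the abstract does not describe the actual State Analysis Database schema, so every name here is an assumption.

from dataclasses import dataclass, field

@dataclass
class StateVariable:
    # A momentary condition of the evolving system, e.g. "battery_charge".
    name: str
    description: str = ""

@dataclass
class StateModel:
    """Describes how one state variable evolves and which states affect it."""
    models: str                          # name of the state variable it models
    affected_by: list = field(default_factory=list)

@dataclass
class Goal:
    """A goal-based operational constraint on a state variable."""
    constrains: str
    constraint: str                      # e.g. "charge stays above 20% during eclipse"

def check_consistency(states, models, goals):
    """Every model and goal must refer only to declared state variables."""
    names = {s.name for s in states}
    issues = [f"model refers to unknown state {m.models!r}"
              for m in models if m.models not in names]
    issues += [f"goal constrains unknown state {g.constrains!r}"
               for g in goals if g.constrains not in names]
    return issues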

    Web-based Visual Analytics for Social Media Data

    Social media data provides valuable information about different events, trends and happenings around the world. Visual data analysis tasks for social media data have large computational and storage space requirements. Due to these restrictions, subdividing data analysis tools into several layers, such as a data layer, a business logic or algorithms layer, and a presentation layer, is often necessary to make them accessible to a variety of clients. On the server side, social media data analysis algorithms can be implemented and published in the form of web services. The visual interface can then be implemented in the form of thin clients that call these web services for data querying, exploration, and analysis tasks. In our work, we have implemented a web-based visual analytics tool for social media data analysis. Initially, we extended our existing desktop-based Twitter data analysis application named “ScatterBlog” to create a web-services-based API that provides access to all the data analysis algorithms. In the second phase, we are creating a web-based visual interface that consumes these web services. Major components of the visual interface include a map view, a content lens view, an abnormal event detection view, a Tweets summary view, and a filtering / visual query module. The tool can then be used by parties from various fields of interest, requiring only a browser to perform social media data analysis tasks
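    The thin-client pattern described above amounts to the browser delegating all heavy querying and analysis to server-side web services. The sketch below shows what such calls might look like from a lightweight client; the endpoint paths and parameters are hypothetical, since the API derived from ScatterBlog is not documented in the abstract.

import requests

BASE = "https://analytics.example.org/api"   # hypothetical service root

def query_tweets(keyword, bbox, limit=100):
    """Ask the server-side service for tweets matching a keyword inside a
    geographic bounding box given as (west, south, east, north)."""
    resp = requests.get(f"{BASE}/tweets",
                        params={"q": keyword,
                                "bbox": ",".join(map(str, bbox)),
                                "limit": limit})
    resp.raise_for_status()
    return resp.json()

def detect_abnormal_events(start, end):
    """Delegate the computationally heavy event detection to the server."""
    resp = requests.post(f"{BASE}/events/abnormal",
                         json={"start": start, "end": end})
    resp.raise_for_status()
    return resp.json()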

    Mastering the requirements analysis for communication-intensive websites

    Web application development still needs to employ effective methods to accommodate some distinctive aspects of the requirements analysis process: capturing high-level communication goals, considering several user profiles and stakeholders, defining hypermedia-specific requirements (concerning navigation, content, information structure and presentation aspects), and reusing requirements for an effective usability evaluation. Techniques should be usable by both stakeholders and the design team, require little training effort, and show relative advantage to project managers. Over the last few years, requirements methodologies applied to web-based applications have considered mainly the transactional and operational aspects typical of traditional information systems; the communicational aspects of web sites have been neglected by systematic requirements methods. This thesis, starting from key achievements in Requirements Engineering (hereafter RE), introduces a model (AWARE) for defining and analyzing requirements for web applications mainly conceived as strategic communication means for an institution or organization. The model extends traditional goal- and scenario-based approaches for refining high-level goals into website requirements by introducing the analysis of ill-defined user goals and stakeholder communication goals, and a hypermedia requirement taxonomy that facilitates web conceptual design and paves the way for a systematic usability evaluation. AWARE comprises a conceptual toolkit and a notation for effective requirements documentation, which together support the elicitation, negotiation, analysis and validation of requirements from the relevant stakeholders (users included). The empirical validation of the model is carried out in two ways. Firstly, the model has been employed in web projects in the field; these case studies and the lessons learnt are presented and discussed to assess the advantages and limits of the proposal. Secondly, a sample of web analysts and designers has been asked to study and apply the model: the feedback gathered is positive and encouraging for further improvement.
    Web application development needs effective tools to manage some essential aspects of the requirements analysis process: identifying strategic communication goals, handling a variety of user profiles and stakeholders, defining hypermedia requirements (concerning navigation, interaction, content and presentation), and reusing requirements for effective planning of usability evaluation. Techniques are needed that are usable by both stakeholders and designers, that take little time to learn and apply effectively, and that show significant advantages to managers of complex projects. This thesis defines AWARE (Analysis of Web Application Requirements), a requirements analysis methodology specifically aimed at websites (and interactive applications) with strong communicative components. The methodology extends existing goal-oriented and scenario-based requirements analysis techniques by introducing a requirements taxonomy specific to websites (which provides structured input to the design activity), tools for identifying and analyzing ill-defined (generic or poorly specified) goals and communication goals, and methodological support for usability evaluation based on the application's requirements. The AWARE methodology has been evaluated in the field through projects with industry professionals (web designers and IT managers) and through training interventions at companies specializing in web communication
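    The refinement step AWARE supports, from high-level communication goals to requirements tagged with the hypermedia taxonomy mentioned above (navigation, content, information structure, presentation), can be pictured with a minimal sketch like the one below; class and field names are illustrative and do not reproduce the thesis notation.

from dataclasses import dataclass, field

# The four taxonomy categories named in the abstract.
TAXONOMY = {"navigation", "content", "structure", "presentation"}

@dataclass
class Requirement:
    text: str
    category: str                         # one of TAXONOMY

    def __post_init__(self):
        if self.category not in TAXONOMY:
            raise ValueError(f"unknown category: {self.category}")

@dataclass
class Goal:
    statement: str                        # e.g. a stakeholder communication goal
    refined_into: list = field(default_factory=list)

    def refine(self, text, category):
        """Refine this goal into a taxonomy-tagged website requirement."""
        req = Requirement(text, category)
        self.refined_into.append(req)
        return req

# Example: a hypothetical communication goal refined into requirements.
g = Goal("Convey the institution's research strengths to prospective students")
g.refine("Home page highlights three flagship research areas", "content")
g.refine("Every research area page links back to admissions", "navigation")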

    Analysis of U.S. Senate Web Sites For Disability Accessibility

    U.S. federal government web sites have significantly increased the level of services and information offered to various internal and external stakeholders. The Workforce Investment Act of 1998 amended Section 508 of the Rehabilitation Act of 1973, which complemented the intent and aims of the 1990 Americans with Disabilities Act (ADA). As a result, federal agencies and departments were mandated to provide disabled stakeholders with access to key information on federal web sites. However, since this enactment, some federal web sites still do not fully meet the legal requirements to accommodate users with disabilities. Additionally, the web sites of members of the U.S. Congress technically do not fall under this regulation. Without regulation, non-adherence to accessibility standards by congressional web sites may result in poor or ineffective utilization by citizen consumers or other stakeholders with disabilities. The purpose of this study is to examine the accessibility statistics for a pseudo-random sample of 50 web sites of U.S. Senators. The main web page of each site was evaluated with an online web site analysis tool, Truwex. Three sets of criteria were used to gauge the level of accessibility: Section 508, the WCAG 1.0 standards, and the WCAG 2.0 standards. Results suggest that the vast majority of the U.S. Senate web sites do not meet the federal legal guidelines that are otherwise imposed on other U.S. governmental agencies and departments. Many of the sites contain consistent patterns of non-compliance, and some minor changes could result in increased accessibility for disabled stakeholders
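    As a concrete illustration of the kind of criterion such evaluations apply, both Section 508 and WCAG require text alternatives for images, so a page can be flagged when img elements lack an alt attribute. The sketch below implements only that single check and is not a reconstruction of Truwex's rule set; the example URL is hypothetical.

from html.parser import HTMLParser
import urllib.request

class MissingAltChecker(HTMLParser):
    """Count <img> elements that have no alt attribute at all."""
    def __init__(self):
        super().__init__()
        self.violations = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.violations += 1

def check_page(url):
    """Return the number of <img> tags on the page without an alt attribute."""
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    checker = MissingAltChecker()
    checker.feed(html)
    return checker.violations

# Example (hypothetical URL):
# print(check_page("https://www.senate.example.gov/"))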

    Ecodesign tools for designers - defining the requirements

    This paper presents the findings from a research project which set out to understand the type of requirements that industrial designers have of ecodesign tools, through the use of a web-based prototype. Through qualitative data collection and analysis, a number of important criteria for ecodesign tools were identified. The conclusions recognise the importance of developing holistic tools for industrial designers, identifying that a combination of guidance, education and information, along with well-considered content, appropriate presentation and easy access, are all critical to their success. A framework for ecodesign tools for industrial designers is presented, along with the evolution of 'Information/Inspiration' into a fully working web-based tool