
    Combating e-discrimination in the North West - final report

    The Combating eDiscrimination in the North West project examined over 100 websites advertising job opportunities both regionally and nationally, and found the vast majority to be largely inaccessible. Professional standards, such as using valid W3C code and adhering to the W3C Web Content Accessibility Guidelines, were largely not followed. The project also conducted interviews with public and private sector web professionals, and focus groups of disabled computer users, to draw a broader picture of the accessibility of jobs websites. Interviews with leading web development companies in the Greater Manchester region showed a view that there should be no additional cost in making websites accessible, as the expertise to create a site professionally should be in place from the start, and that accessibility will follow from applying professional standards. However, through the process of creating a website for the project with such a company, it was found that following professional standards is not sufficient to catch all potential problems, and that user testing is an essential adjunct to professional practice. The main findings of the project are thus that:
    • Most websites in the job opportunities sector are not following professional standards of web development, and are largely inaccessible
    • Professional standards of web development need to be augmented with user testing to ensure proper accessibility.
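    The project's conclusion that standards compliance alone is not enough can be illustrated with the kind of narrow, automated check accessibility tooling performs. The following is a minimal sketch, not the project's own tooling: the URL is made up, the third-party requests and beautifulsoup4 packages are assumed, and the two WCAG-style rules chosen (missing alt attributes, missing page language) are only examples. By construction it cannot catch the problems that only user testing with disabled computer users reveals.

    ```python
    # Minimal sketch of an automated accessibility spot-check (illustrative URL,
    # assumes the third-party "requests" and "beautifulsoup4" packages).
    import requests
    from bs4 import BeautifulSoup

    def basic_wcag_checks(url: str) -> list:
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        problems = []

        # WCAG 1.1.1: images need a text alternative (alt="" is allowed for
        # purely decorative images, so only a missing attribute is flagged).
        for img in soup.find_all("img"):
            if img.get("alt") is None:
                problems.append(f"<img> with no alt attribute: {img.get('src', '?')}")

        # WCAG 3.1.1: the page language should be declared on the <html> element.
        html_tag = soup.find("html")
        if html_tag is None or not html_tag.get("lang"):
            problems.append("No lang attribute on the <html> element")

        return problems

    if __name__ == "__main__":
        for issue in basic_wcag_checks("https://example-jobs-site.invalid/vacancies"):
            print(issue)
    ```

    Passing checks like these is necessary but, as the report stresses, not sufficient for real accessibility.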

    On the Change in Archivability of Websites Over Time

    As web technologies evolve, web archivists work to keep up so that our digital history is preserved. Recent advances in web technologies have introduced client-side executed scripts that load data without a referential identifier or that require user interaction (e.g., content loading when the page has scrolled). These advances have made automating the capture of web pages more difficult. Because of the evolving schemes for publishing web pages, along with the progressive capability of web preservation tools, the archivability of pages on the web has varied over time. In this paper we show that the archivability of a web page can be deduced from the type of page being archived, which aligns with that page's accessibility with respect to dynamic content. We show concrete examples of when these technologies were introduced by referencing mementos of pages that have persisted through a long evolution of available technologies. Identifying why these web pages could not be archived in the past, with respect to accessibility, serves as a guide for ensuring that content intended to have longevity is published using good-practice methods that make it available for preservation. Comment: 12 pages, 8 figures, Theory and Practice of Digital Libraries (TPDL) 2013, Valletta, Malta
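    One practical way to reference mementos, as the paper does, is to ask an archive whether a snapshot of a page exists near a given date. The sketch below uses what I understand to be the Internet Archive's Wayback availability endpoint; the endpoint and response shape should be verified against current documentation, the example URL is illustrative, and the third-party requests package is assumed.

    ```python
    # Sketch: look up the memento closest to a given timestamp for a URL.
    import requests

    def closest_memento(url, timestamp="20130101"):
        """Return the closest archived snapshot to the timestamp, or None."""
        resp = requests.get(
            "https://archive.org/wayback/available",
            params={"url": url, "timestamp": timestamp},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json().get("archived_snapshots", {}).get("closest")

    if __name__ == "__main__":
        # A well-archived page reports a capture timestamp and URL; a page that
        # defeated the crawlers (e.g. script-loaded content) may have no snapshot,
        # or a snapshot missing its dynamically loaded parts.
        print(closest_memento("http://example.com/"))
    ```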

    A Contribution-based Framework for the Creation of Semantically-enabled Web Applications

    We present Fortunata, a wiki-based framework designed to simplify the creation of semantically-enabled web applications. This framework facilitates the management and publication of semantic data in web-based applications, to the extent that application developers do not need to be skilled in client-side technologies, and promotes application reuse by fostering collaboration among developers by means of wiki plugins. We illustrate the use of this framework with two Fortunata-based applications named OMEMO and VPOET, and we evaluate it with two experiments performed with usability evaluators and application developers respectively. These experiments show a good balance between the usability of the applications created with this framework and the effort and skills required by developers.
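    As an illustration of the kind of task such a framework's wiki plugins are meant to hide from application developers, the sketch below builds and serialises a small piece of semantic data as RDF. This is not Fortunata's actual API; the names and URIs are invented, and the third-party rdflib package is assumed.

    ```python
    # Illustrative sketch (not Fortunata's API): build and publish semantic data.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import FOAF, RDF

    EX = Namespace("http://example.org/app/")  # invented application namespace

    g = Graph()
    author = URIRef(EX["person/alice"])
    g.add((author, RDF.type, FOAF.Person))
    g.add((author, FOAF.name, Literal("Alice")))

    # Serialise to Turtle, ready to be exposed by a wiki page or web endpoint.
    print(g.serialize(format="turtle"))
    ```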

    A Content-Sensitive Wiki Help System

    Context-sensitive help is a software application component that enables users to open help pertaining to their state, location, or the action they are performing within the software. Context-sensitive wiki help, in turn, is help powered by a wiki system that retains all the features of context-sensitive help. A context-sensitive wiki help system aims to make context-sensitive help collaborative: in addition to seeking help, users can contribute to the help system directly. I have implemented a context-sensitive wiki help system in Yioop, an open-source search engine and software portal created by Dr. Chris Pollett, in order to measure the effectiveness of such a help system. An experimental evaluation study was performed with users of Yioop, and the results are discussed in this report.
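    The core mechanism behind context-sensitive help, wiki-backed or not, is a mapping from the user's current location in the application to a help resource. The sketch below is hypothetical and is not Yioop's actual code; the controller and activity names are invented for illustration.

    ```python
    # Hypothetical sketch: resolve the current application context to the name of
    # the wiki article that the help button should open (and let users edit).
    HELP_PAGE_FOR_CONTEXT = {
        ("search", "results"): "Search_Results_Help",
        ("admin", "manage_crawls"): "Managing_Crawls",
        ("admin", "manage_users"): "Managing_Users",
    }

    def help_page(controller, activity):
        """Return the wiki page name for the current context, or a default page."""
        return HELP_PAGE_FOR_CONTEXT.get((controller, activity), "General_Help")

    print(help_page("admin", "manage_crawls"))  # -> "Managing_Crawls"
    ```

    Because the targets are ordinary wiki pages, users who find a help article lacking can improve it in place, which is what makes the help collaborative.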

    Evaluating and improving web performance using free-to-use tools

    Fast website loading speeds can increase conversion rates and search engine rankings, and encourage users to explore a site further, among other benefits. The purpose of this study was to find and compare free-to-use tools that can both evaluate the performance (loading and rendering speed) of a website and give suggestions on how that performance could be improved. In addition, three tools were used to evaluate the performance of an existing WordPress site. Some of the performance improvement suggestions given by the tools were then acted upon, and the performance of the website was re-evaluated using the same tools. The research method was experimental research, and the research question was "How to evaluate and improve web performance using free-to-use tools?" There were also five sub-questions, of which the first two related to the tools and their features, and the last three to the case website.

    Eight free-to-use web performance evaluation tools were compared, focusing on which performance metrics they evaluate, what performance improvement suggestions they can give, and six other features that are useful to know in practice. In alphabetical order, the tools were: GTmetrix, Lighthouse, PageSpeed Insights, Pingdom Tools, Test My Site, WebPageTest, Website Speed Test (by Dotcom-Tools) and Website Speed Test (by Uptrends). The number of metrics evaluated by the tools ranged from one to fifteen. The performance improvement suggestions given by the tools fell into three categories, meaning that the suggestions largely overlapped between tools. All tools except Lighthouse were web-based.

    The performance of the case website was evaluated using GTmetrix, PageSpeed Insights and WebPageTest. On desktop, the performance was in the high-end range, though it varied between the three tools; on mobile, the performance was noticeably slower due to the challenges of mobile devices (e.g. lower processing power compared to desktop computers) and mobile networks (e.g. higher latency compared to broadband connections). The common bottlenecks, based on the suggestions given by the three tools, appeared to be the lack of a CDN (Content Delivery Network), serving unoptimized images and serving large amounts of JavaScript. The results of the performance re-evaluation were mixed, highlighting the importance of carefully considering each performance improvement suggestion. The main takeaways for practitioners are to use multiple tools to get a wide variety of performance metrics and suggestions, and to treat the suggestions and relative performance scores given by the tools only as guidelines, the main goal being to improve the time-based performance metrics.
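    Several of the compared tools can also be scripted rather than used through their web pages. The sketch below queries what I recall to be the v5 PageSpeed Insights API (which runs Lighthouse); the endpoint, parameters and response fields should be checked against current documentation, the example URL is illustrative, and the third-party requests package is assumed.

    ```python
    # Sketch: fetch a Lighthouse performance score via the PageSpeed Insights API.
    import requests

    def psi_performance_score(url, strategy="mobile"):
        """Return the 0.0-1.0 performance score reported for the given strategy."""
        resp = requests.get(
            "https://www.googleapis.com/pagespeedonline/v5/runPagespeed",
            params={"url": url, "strategy": strategy, "category": "performance"},
            timeout=60,
        )
        resp.raise_for_status()
        result = resp.json()["lighthouseResult"]
        return result["categories"]["performance"]["score"]

    if __name__ == "__main__":
        for strategy in ("desktop", "mobile"):
            print(strategy, psi_performance_score("https://example.com/", strategy))
    ```

    Running the same query for both strategies makes it easy to compare the desktop and mobile scores the study contrasts, and scripting several tools side by side supports its recommendation to rely on more than one tool.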

    SLIS Student Research Journal, Vol. 6, Iss. 1
