PERFORMANCE EVALUATION ON QUALITY OF ASIAN AIRLINES WEBSITES – AN AHP APPROACH
In recent years, many researchers have devoted their efforts to the issue of website quality. The concept of quality comprises many criteria: a quality-of-service perspective, a user perspective, a content perspective, and a usability perspective. Because of its potentially instant worldwide audience, a website's quality and reliability are crucial. The very special nature of web applications and websites poses unique software testing challenges. Webmasters, web application developers, and website quality assurance managers need tools and methods that can match these new needs. This research conducts tests to measure the quality of the websites of Asian flag-carrier airlines via online web diagnostic tools. We propose a methodology for determining and evaluating the best airline websites based on many criteria of website quality. The approach has been implemented using the Analytical Hierarchy Process (AHP): the proposed model uses AHP pairwise comparisons and its measurement scale to generate weights for the criteria, which better and more fairly reflect the relative preference of the criteria. The results of this study confirm that Asian airline websites neglect performance and quality criteria.
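As a minimal sketch of the AHP weighting step the abstract describes (the three criteria, the matrix entries, and the random-index constant are illustrative assumptions, not the paper's actual data), the criteria weights can be derived as the normalized principal eigenvector of a reciprocal pairwise comparison matrix:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three website-quality
# criteria (e.g. load time, accessibility, content). Entry A[i][j] is
# the judged importance of criterion i relative to criterion j on
# Saaty's 1-9 scale; the matrix is reciprocal (A[j][i] = 1 / A[i][j]).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority weights: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights = weights / weights.sum()

# Consistency check: CR = ((lambda_max - n) / (n - 1)) / RI, with
# Saaty's random index RI = 0.58 for n = 3; CR < 0.1 is conventionally
# taken to mean the pairwise judgments are acceptably consistent.
n = A.shape[0]
lambda_max = eigvals[k].real
CR = ((lambda_max - n) / (n - 1)) / 0.58
print(weights, CR)
```

With these example judgments the first criterion receives the largest weight, and the low consistency ratio indicates the judgments do not contradict each other.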
An Architecture for Integrated Intelligence in Urban Management using Cloud Computing
With the emergence of new methodologies and technologies it has now become
possible to manage large amounts of environmental sensing data and apply new
integrated computing models to acquire information intelligence. This paper
advocates the application of cloud capacity to support the information,
communication and decision making needs of a wide variety of stakeholders in
the complex business of the management of urban and regional development. The
complexity lies in the interactions and impacts embodied in the concept of the
urban-ecosystem at various governance levels. This highlights the need for more
effective integrated environmental management systems. This paper offers a
user-orientated approach based on requirements for an effective management of
the urban-ecosystem and the potential contributions that can be supported by
the cloud computing community. Furthermore, the commonality of the influence of
the drivers of change at the urban level offers the opportunity for the cloud
computing community to develop generic solutions that can serve the needs of
hundreds of cities in Europe and indeed globally.
Comment: 6 pages, 3 figures
An evaluation of DIADEM assisted online form completion
The DIADEM project aims to develop a web-based application in the form of an Expert System (ES) to assist cognitively impaired older-adult users in the task of interacting with and completing online transactions. Having recently developed the first experimental version of the application, this study reports on the preliminary findings of user trials carried out in three European countries to evaluate this early version of the application. Of the 94 users that took part in the trials, 77 were identified as users likely to present with some degree of mild cognitive impairment, and thus were included in the analysis stage. The key findings of the study indicate that users of DIADEM-assisted form filling seemed to report comparatively high levels of satisfaction, particularly when considered against what is regarded as a typical level of satisfaction for this user group. Furthermore, statistical analysis indicates that the application provides significantly increased levels of assistance for users presenting with higher levels of cognitive impairment, and therefore achieves its goal of catering for this particular target user group.
Website Design and Evaluation Workshop
Workbook on website design prepared for presentation at LIBRARIES IN THE DIGITAL AGE 2004: HUMAN INFORMATION BEHAVIOUR AND COMPETENCIES FOR DIGITAL LIBRARIES. Includes chapters on pre-planning, the card sort technique, focus groups, usability, site architecture, accessibility, and assessment. Unpublished; not peer reviewed.
Accessibility-based reranking in multimedia search engines
Traditional multimedia search engines retrieve results based mostly on the query submitted by the user, or use a log of previous searches to provide personalized results, while not considering the accessibility of the results for users with vision or other types of impairments. In this paper, a novel approach is presented which incorporates the accessibility of images for users with various vision impairments, such as color blindness, cataract and glaucoma, in order to rerank the results of an image search engine. The accessibility of individual images is measured through the use of vision simulation filters. Multi-objective optimization techniques utilizing the image accessibility scores are used to handle users with multiple vision impairments, while the impairment profile of a specific user is used to select one of the Pareto-optimal solutions. The proposed approach has been tested with two image datasets, using both simulated and real impaired users, and the results verify its applicability. Although the proposed method has been used for vision accessibility-based reranking, it can also be extended to other types of personalization contexts.
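The two-stage selection the abstract describes (Pareto-optimal filtering over per-impairment scores, then a choice driven by the user's impairment profile) can be sketched as follows; the image IDs, score values, and profile weights are illustrative assumptions, not the paper's data or API:

```python
# Hypothetical reranking sketch: each image carries one accessibility
# score per impairment (higher = more accessible to that user group).

def dominates(a, b):
    """True if score vector a Pareto-dominates b (>= everywhere, > somewhere)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(items):
    """Keep only images whose score vectors no other image dominates."""
    return [i for i in items
            if not any(dominates(j["scores"], i["scores"])
                       for j in items if j is not i)]

def pick_for_profile(front, profile):
    """Select one Pareto-optimal image by weighting its scores with the
    user's impairment profile (weights sum to 1)."""
    return max(front, key=lambda i: sum(w * s for w, s in zip(profile, i["scores"])))

# Scores per image: (color blindness, cataract, glaucoma) accessibility.
images = [
    {"id": "img1", "scores": (0.9, 0.4, 0.5)},
    {"id": "img2", "scores": (0.6, 0.8, 0.7)},
    {"id": "img3", "scores": (0.5, 0.3, 0.4)},  # dominated by img2, dropped
]

front = pareto_front(images)                      # img1 and img2 survive
best = pick_for_profile(front, (0.2, 0.5, 0.3))  # profile: mostly cataract
```

Here img3 is dominated by img2 on every impairment and is discarded, while img1 and img2 trade off against each other; the cataract-weighted profile then selects img2.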
What is usability in the context of the digital library and how can it be measured?
This paper reviews how usability has been defined in the context of the digital library, examines which methods have been applied and their applicability, and proposes an evaluation model and a suite of instruments for evaluating the usability of academic digital libraries. The model examines effectiveness, efficiency, satisfaction, and learnability. It finds that an interlocking relationship exists among effectiveness, efficiency, and satisfaction, and it also examines how learnability interacts with these three attributes.
Specification and implementation of mapping rule visualization and editing : MapVOWL and the RMLEditor
Visual tools are implemented to help users in defining how to generate Linked Data from raw data. This is possible thanks to mapping languages which enable detaching mapping rules from the implementation that executes them. However, no thorough research has been conducted so far on how to visualize such mapping rules, especially if they become large and require considering multiple heterogeneous raw data sources and transformed data values. In the past, we proposed the RMLEditor, a visual graph-based user interface, which allows users to easily create mapping rules for generating Linked Data from raw data. In this paper, we build on top of our existing work: we (i) specify a visual notation for graph visualizations used to represent mapping rules, (ii) introduce an approach for manipulating rules when large visualizations emerge, and (iii) propose an approach to uniformly visualize data fraction of raw data sources combined with an interactive interface for uniform data fraction transformations. We perform two additional comparative user studies. The first one compares the use of the visual notation to present mapping rules to the use of a mapping language directly, which reveals that the visual notation is preferred. The second one compares the use of the graph-based RMLEditor for creating mapping rules to the form-based RMLx Visual Editor, which reveals that graph-based visualizations are preferred to create mapping rules through the use of our proposed visual notation and uniform representation of heterogeneous data sources and data values. (C) 2018 Elsevier B.V. All rights reserved
A Brief History of Web Crawlers
Web crawlers visit internet applications, collect data, and learn about new
web pages from visited pages. Web crawlers have a long and interesting history.
Early web crawlers collected statistics about the web. In addition to
collecting statistics about the web and indexing the applications for search
engines, modern crawlers can be used to perform accessibility and vulnerability
checks on the application. The quick expansion of the web and the complexity
added to web applications have made the process of crawling very challenging.
Throughout the history of web crawling many researchers and industrial groups
addressed different issues and challenges that web crawlers face. Different
solutions have been proposed to reduce the time and cost of crawling.
Performing an exhaustive crawl is a challenging task. Additionally,
capturing the model of a modern web application and extracting data from it
automatically is another open question. What follows is a brief history of
the different techniques and algorithms used from the early days of crawling
up to the present. We introduce criteria to evaluate the relative performance
of web crawlers. Based on these criteria we plot the evolution of web
crawlers and compare their performance.