
    Challenges to describe QoS requirements for web services quality prediction to support web services interoperability in electronic commerce

    Quality of service (QoS) is essential for quality assurance of web service applications, and web services quality has contributed to the successful implementation of Electronic Commerce (EC) applications. However, QoS remains a major open issue in web services research and one of the main questions that needs to be explored. We believe that QoS should not only be measured but also predicted during the development and implementation stages. Determining and selecting QoS requirements for high-quality web services, however, is subject to challenges and constraints. This paper therefore highlights the challenges of QoS requirements prediction, which are not easy to identify given the many different perspectives and purposes of web services and the variety of prediction techniques for describing QoS requirements. Additionally, the paper introduces a metamodel as a concept of what makes a good web service.

    PERFORMANCE EVALUATION ON QUALITY OF ASIAN AIRLINES WEBSITES – AN AHP APPROACH

    In recent years, much effort has been devoted to the issue of Web site quality. The concept of quality comprises many criteria: a quality-of-service perspective, a user perspective, a content perspective, and a usability perspective. Because a Web site can instantly reach a worldwide audience, its quality and reliability are crucial. The special nature of web applications and websites poses unique software testing challenges, and webmasters, web application developers, and website quality assurance managers need tools and methods that match these new needs. This research runs a series of tests to measure the quality of the Web sites of Asian flag-carrier airlines via online web diagnostic tools. We propose a methodology for determining and evaluating the best airline websites based on multiple criteria of website quality, implemented with the Analytic Hierarchy Process (AHP): pairwise comparisons on the AHP measurement scale generate the criteria weights, which yields a fairer preference ordering of the criteria. The results of this study confirm that Asian airlines' websites neglect performance and quality criteria.
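    To make the weighting step concrete, here is a minimal sketch of how AHP derives criteria weights from a pairwise-comparison matrix. The 3x3 matrix and the three criteria named in the comments are illustrative assumptions, not the paper's actual judgments.

```python
import numpy as np

# Hypothetical pairwise judgments on Saaty's 1-9 scale for three assumed
# criteria: load time, accessibility, markup quality. A[i][j] says how
# strongly criterion i is preferred over criterion j.
A = np.array([
    [1.0, 3.0, 5.0],   # load time vs. the others
    [1/3, 1.0, 2.0],   # accessibility
    [1/5, 1/2, 1.0],   # markup quality
])

# Priority weights = principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights /= weights.sum()

# Consistency ratio (CR): Saaty's check that the judgments are not too
# contradictory; CR < 0.1 is the conventional threshold. RI = 0.58 is the
# random index for n = 3.
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
RI = 0.58
print("weights:", weights.round(3), "CR:", round(CI / RI, 3))
```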

    Archiving the Relaxed Consistency Web

    The historical, cultural, and intellectual importance of archiving the web has been widely recognized. Today, all countries with high Internet penetration rates have established high-profile archiving initiatives to crawl and archive the fast-disappearing web content for long-term use. As web technologies evolve, established web archiving techniques face challenges. This paper focuses on the potential impact of relaxed-consistency web design on crawler-driven web archiving. Relaxed-consistency websites may disseminate, albeit ephemerally, inaccurate and even contradictory information. If captured and preserved in web archives as historical records, such information will degrade the overall archival quality. To assess the extent of such quality degradation, we build a simplified feed-following application and simulate its operation with synthetic workloads. The results indicate that a non-trivial portion of a relaxed-consistency web archive may contain observable inconsistency, and the inconsistency window may extend significantly longer than that observed at the data store. We discuss the nature of such quality degradation and propose a few possible remedies.
    Comment: 10 pages, 6 figures, CIKM 201
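    As a rough intuition for the effect being measured (this is a toy model, not the paper's simulator), the sketch below assumes a feed update only becomes consistent after an exponentially distributed replication lag, while a crawler snapshots the page at a uniformly random offset within its crawl interval; any snapshot taken before the lag elapses archives a stale view. All parameters are invented for illustration.

```python
import random

random.seed(42)

def stale_fraction(n_updates=100_000, mean_lag_s=5.0, crawl_interval_s=60.0):
    """Estimate what fraction of crawler snapshots capture a stale view."""
    stale = 0
    for _ in range(n_updates):
        lag = random.expovariate(1.0 / mean_lag_s)         # time until the update is consistent
        snapshot_at = random.uniform(0, crawl_interval_s)  # crawler arrival after the update
        if snapshot_at < lag:                              # crawl landed inside the window
            stale += 1
    return stale / n_updates

print(f"estimated stale-snapshot fraction: {stale_fraction():.3f}")
```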

    Measuring University Web Site Quality: A Development of a User-Perceived Instrument and its Initial Implementation to Web sites of Accounting Departments in New Zealand's Universities

    The emergent popularity of Web technologies and their applications has created vast opportunities for organisations, including institutions of higher education, to reach broader customer bases and create greater networking relationships. The global and far-reaching nature of the Web, its various interactive capabilities, and the rapid growth of Web use worldwide have made university Web sites essential for promotional and commercial purposes. However, it has been acknowledged that a well-designed Web site is needed in order to gain the benefits of Web utilisation. Previous studies on the quality of Web sites are not lacking, but most have focused mainly on business Web sites; empirical research on the Web site quality of institutions of higher education has been scarce. In this study, an instrument for measuring university Web site quality was developed and validated by taking into account both the perspectives of the users and the importance of the site's informational content. The instrument was subsequently put to the test by using it to measure and rank the quality of the Web sites of Accounting Departments in New Zealand's universities. The results from this initial application substantiated the validity and reliability of the instrument.
    Keywords: University Web sites, Web site quality, Instrument development, Accounting Department Web sites ranking
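    The abstract does not say which reliability statistic was used, but a standard check when validating a multi-item instrument of this kind is Cronbach's alpha over respondents' item scores. A minimal sketch, with fabricated ratings purely for illustration:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                           # number of items on the scale
    item_var_sum = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical 5-point ratings: 4 respondents x 4 questionnaire items.
ratings = [[4, 5, 4, 4],
           [3, 3, 4, 3],
           [5, 5, 5, 4],
           [2, 3, 2, 3]]
print(round(cronbach_alpha(ratings), 3))  # alpha >= 0.7 is the usual acceptability bar
```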

    An Infrastructure for acquiring high quality semantic metadata

    Because the metadata that underlies semantic web applications is gathered from distributed and heterogeneous data sources, it is important to ensure its quality (i.e., to reduce duplicates, spelling errors, and ambiguities). However, current infrastructures that acquire and integrate semantic data have only marginally addressed the issue of metadata quality. In this paper we present our metadata acquisition infrastructure, ASDI, which pays special attention to ensuring that high quality metadata is derived. Central to the architecture of ASDI is a verification engine that relies on several semantic web tools to check the quality of the derived data. We tested our prototype in the context of building a semantic web portal for our lab, KMi. An experimental evaluation comparing the automatically extracted data against manual annotations indicates that the verification engine enhances the quality of the extracted semantic metadata.
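    The abstract does not expose ASDI's API, but one representative check such a verification engine might perform is flagging near-duplicate instance labels before they enter the knowledge base. A minimal sketch, with hypothetical names and an assumed similarity threshold:

```python
from difflib import SequenceMatcher

def near_duplicates(labels, threshold=0.9):
    """Return pairs of labels whose string similarity exceeds the threshold."""
    flagged = []
    for i, a in enumerate(labels):
        for b in labels[i + 1:]:
            ratio = SequenceMatcher(None, a.lower(), b.lower()).ratio()
            if ratio >= threshold:
                flagged.append((a, b, round(ratio, 2)))
    return flagged

# Hypothetical extracted person labels, one of which is a misspelled duplicate.
people = ["Alice Hartmann", "Alice Hartman", "Robert Ng"]
print(near_duplicates(people))  # -> [('Alice Hartmann', 'Alice Hartman', 0.96)]
```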

    DEVELOPING AND VALIDATING A QUALITY ASSESSMENT SCALE FOR WEB PORTALS

    The Web portals business model has spread rapidly over the last few years. Despite this, there have been very few scholarly findings about which services and characteristics make a Web site a portal and which dimensions determine the customers' evaluation of the portal's quality. Taking the example of financial portals, the authors develop a theoretical framework of the Web portal quality construct by determining the number and nature of its dimensions, which are: security and trust, basic services quality, cross-buying services quality, added values, transaction support, and relationship quality. To measure the six portal quality dimensions, multi-item measurement scales are developed and validated.
    Keywords: Construct Validation, Customer Retention, E-Banking, E-Loyalty, Service Quality, Web Portals

    Extended-Linking Services: towards a Quality Web

    A URL takes requesters from a citation to a destination… provided, of course, the URL is still valid. The current chaotic web is wonderful in its way. However, within this chaotic web, we believe there is a need for a high-quality web of vetted information. The emerging OpenURL standard is the cornerstone of a worldwide web with high-quality links that feature properties such as:
    • Persistence: increase the probable lifetime of citations.
    • Multiplicity: produce a menu of targeted services for each citation.
    • Context-Sensitivity: resolve a citation in a manner appropriate to the user and to the context.
    To encourage the development of extended-linking services, NISO formed a committee to develop a standard OpenURL syntax. Our immediate goal is to serve the scholarly-information community; however, the OpenURL technique is widely applicable, and we expect it to serve many other information communities.
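    For a feel of the syntax, here is a minimal sketch of an OpenURL 0.1-style link built in Python. The resolver host and the citation values are hypothetical; the key names (genre, aulast, atitle, ...) follow the draft conventions the NISO committee started from.

```python
from urllib.parse import urlencode

# Hypothetical institutional link resolver; only this base URL changes
# between institutions, which is what makes OpenURL context-sensitive.
BASE = "http://resolver.example.edu/openurl"

# Illustrative citation metadata using OpenURL 0.1-style keys.
citation = {
    "genre": "article",
    "aulast": "Doe",
    "atitle": "An example article title",
    "title": "Journal of Examples",
    "volume": "5",
    "issue": "4",
    "spage": "12",
    "date": "1999",
}

print(f"{BASE}?{urlencode(citation)}")
```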