Design and evaluation of improvement method on the web information navigation - A stochastic search approach
With the advent of the fast-growing Internet and World Wide Web (the Web), more and more companies enhance their business competitiveness by conducting electronic commerce. At the same time, more and more people gather or process information by surfing the Web. However, due to unbalanced Web traffic and poorly organized information, users suffer from slow communication and disordered information. To improve the situation, information providers can analyze the traffic and Uniform Resource Locator (URL) counters to adjust the information layering and organization; nevertheless, heterogeneous navigation patterns and dynamically fluctuating Web traffic complicate the improvement process. Alternatively, improvement can be made by giving direct guidance to surfers navigating the Web sites. In this paper, information retrieval on a Web site is modeled as a Markov chain associated with the corresponding dynamic Web traffic and designated information pages. We consider four models of information retrieval based on combinations of the level of skill or experience of the surfers as well as the degree of navigation support by the sites. Simulation is conducted to evaluate the performance of the different types of navigation guidance. In addition, we evaluate the four models of information retrieval in terms of complexity and applicability. The paper concludes with a research summary and a direction for future research efforts. © 2009 Elsevier B.V. All rights reserved.
Design and evaluation of improvement method on the Web information navigation - a stochastic search approach
With the advent of the fast-growing Internet and World Wide Web (WWW), more and more companies adopt electronic commerce to enhance their business competitiveness. On the other hand, more and more people surf the Web to gather and process information. Due to unbalanced traffic and poorly organized information, users suffer from slow communication and disordered information organization. The information provider can analyze the traffic and uniform resource locator (URL) counters to adjust the organization; however, heterogeneous navigation patterns and dynamically fluctuating Web traffic make the tuning process very complicated. Alternatively, the user may be provided with guidance to navigate through the Web pages efficiently. In this paper, a Web site is modeled as a Markov chain associated with the corresponding dynamic traffic and designated information pages. We consider four models: inexperienced surfers on guidance-less sites, experienced surfers on guidance-less sites, sites with mean-length guidance, and sites with known-first-arc guidance (generalized as sites with dynamic stochastic shortest-path guidance). Simulation is conducted to evaluate the performance of the different types of navigation guidance. We also propose a reformulation policy that highlights hyperlinks as steering guidance. The evaluation of complexity and applicability is also discussed as a design guideline for general improvement methods. The paper concludes with a summary and future directions.
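To illustrate the contrast these abstracts draw between an unguided surfer and one given shortest-path guidance, here is a minimal sketch on a hypothetical five-page site. It is not the papers' actual model (which weights the Markov chain by dynamic traffic); the graph, page numbers, and function names are illustrative assumptions:

```python
import random
from collections import deque

# Hypothetical 5-page site: adjacency lists of hyperlinks (page 4 is the target).
links = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: []}
target = 4

def random_walk_clicks(start, trials=20000, seed=0):
    """Inexperienced surfer: clicks a link uniformly at random (first-order Markov chain)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        page, clicks = start, 0
        while page != target:
            page = rng.choice(links[page])
            clicks += 1
        total += clicks
    return total / trials

def guided_clicks(start):
    """Surfer with shortest-path guidance: BFS distance from start to target."""
    dist = {start: 0}
    q = deque([start])
    while q:
        p = q.popleft()
        if p == target:
            return dist[p]
        for nxt in links[p]:
            if nxt not in dist:
                dist[nxt] = dist[p] + 1
                q.append(nxt)
    return None

print(guided_clicks(0))       # 3 clicks with guidance
print(random_walk_clicks(0))  # noticeably more clicks, on average, without guidance
```

Even on this tiny graph the gap is large (the expected unguided hitting time from page 0 is 13 clicks versus 3 with guidance), which is the intuition behind offering navigation guidance at all.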
Learning Visual Features from Snapshots for Web Search
When applying learning to rank algorithms to Web search, a large number of
features are usually designed to capture the relevance signals. Most of these
features are computed based on the extracted textual elements, link analysis,
and user logs. However, Web pages are not solely linked texts, but have
structured layout organizing a large variety of elements in different styles.
Such layout itself can convey useful visual information, indicating the
relevance of a Web page. For example, the query-independent layout (i.e., raw
page layout) can help identify the page quality, while the query-dependent
layout (i.e., page rendered with matched query words) can further tell rich
structural information (e.g., size, position and proximity) of the matching
signals. However, such visual information of layout has been seldom utilized in
Web search in the past. In this work, we propose to learn rich visual features
automatically from the layout of Web pages (i.e., Web page snapshots) for
relevance ranking. Both query-independent and query-dependent snapshots are
considered as the new inputs. We then propose a novel visual perception model
inspired by human's visual search behaviors on page viewing to extract the
visual features. This model can be learned end-to-end together with traditional
human-crafted features. We also show that such visual features can be
efficiently acquired in the online setting with an extended inverted indexing
scheme. Experiments on benchmark collections demonstrate that learning visual
features from Web page snapshots can significantly improve the performance of
relevance ranking in ad-hoc Web retrieval tasks.
Comment: CIKM 201
Issues in Evaluating Health Department Web-Based Data Query Systems: Working Papers
Compiles papers on conceptual and methodological topics to consider in evaluating state health department systems that provide aggregate data online, such as taxonomy, logic models, indicators, and design. Includes surveys and examples of evaluations.
Improving Reachability and Navigability in Recommender Systems
In this paper, we investigate recommender systems from a network perspective
and investigate recommendation networks, where nodes are items (e.g., movies)
and edges are constructed from top-N recommendations (e.g., related movies). In
particular, we focus on evaluating the reachability and navigability of
recommendation networks and investigate the following questions: (i) How well
do recommendation networks support navigation and exploratory search? (ii) What
is the influence of parameters, in particular different recommendation
algorithms and the number of recommendations shown, on reachability and
navigability? and (iii) How can reachability and navigability be improved in
these networks? We tackle these questions by first evaluating the reachability
of recommendation networks by investigating their structural properties.
Second, we evaluate navigability by simulating three different models of
information seeking scenarios. We find that with standard algorithms,
recommender systems are not well suited to navigation and exploration and
propose methods to modify recommendations to improve this. Our work extends
from one-click-based evaluations of recommender systems towards multi-click
analysis (i.e., sequences of dependent clicks) and presents a general,
comprehensive approach to evaluating navigability of arbitrary recommendation
networks.
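The reachability evaluation this abstract describes can be sketched on a toy recommendation network. The items, top-2 lists, and the coverage metric below are illustrative assumptions, not the paper's datasets or exact measures:

```python
from collections import deque

# Hypothetical top-2 recommendation lists for 6 items (item -> recommended items).
recs = {
    0: [1, 2], 1: [0, 2], 2: [0, 1],   # a tightly linked cluster...
    3: [4, 5], 4: [3, 5], 5: [3, 4],   # ...and a second cluster with no bridge
}

def reachable(start):
    """Items reachable from `start` by following recommendation links (multi-click)."""
    seen = {start}
    q = deque([start])
    while q:
        for nxt in recs[q.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                q.append(nxt)
    return seen

# Coverage: average fraction of the catalog a surfer can ever reach per start item.
coverage = sum(len(reachable(i)) for i in recs) / (len(recs) ** 2)
print(coverage)  # 0.5 -- each item reaches only its own 3-item cluster
```

The disconnected clusters show why standard top-N lists can hurt exploration: no click sequence crosses between them, so rewiring a few recommendations (as the paper proposes) can raise coverage dramatically.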
Towards effective Web site designs: A framework for modeling, design evaluation and enhancement
Effective Web site design is critical to the success of e-commerce. Therefore, the evaluation and enhancement of a Web site design is of great importance. In this vein, accessibility is important and has been examined by many researchers from different points of view. By and large, Web site accessibility is a structural problem and may be analytically investigated using a mathematical approach. We propose a framework for representing real-world design problems as generic Web site designs, which can then be mapped into accessibility models analyzable or solvable using established analytical techniques. The framework consists of generic design and graph models, with the necessary mapping. We describe a generic Web site design by its objective and constraints, which correspond to important design requirements. By representing design problems using well-defined structures and rigorous analysis methods, this framework measures Web site accessibility using systematic and quantifiable approaches rather than qualitative ad hoc practice. Hence, the framework facilitates the overall Web site design process, enhances design quality, and increases ease of analysis, implementation and continuous improvement. © 2005 IEEE.
Towards Analytical Approach to Effective Website Designs: A Framework for Modeling, Evaluation and Enhancement
Conference Theme: I.T. and Value Creation
Effective website design is critical to the success of electronic commerce and digital government. Most prior website design research has taken a computational or cognitive/behavioral approach, which may not yield the optimal designs demanded by specific requirements. We consider website design as a structural problem which can be examined using an analytical approach, such as mathematical optimization. Specifically, we propose a framework which classifies real-world design problems into generic website design categories and maps each resulting category into a graph model which can be analyzed or solved using appropriate analytical techniques. Our framework consists of generic designs and graph models, together with the necessary mapping. We classify Web site applications and review their features as proposed by previous research. We describe a generic website design category by its objective and key constraints, which correspond to important design requirements. By modeling website design problems using well-defined structures and rigorous analysis methods, this framework is able to measure website accessibility in a systematic and quantifiable manner, arguably more desirable than existing qualitative ad hoc practices. Overall, our framework can facilitate the website design process, enhance design quality, and increase ease of analysis, implementation and continuous improvement.
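One simple accessibility measure consistent with this graph-model view (though not the framework's actual formulation) is the click distance of every page from the home page. The site graph below is a hypothetical example:

```python
from collections import deque

# Hypothetical site graph: page -> outgoing hyperlinks (page 0 is the home page).
site = {0: [1, 2], 1: [3], 2: [3, 4], 3: [5], 4: [5], 5: []}

def click_distances(home=0):
    """BFS depth of every page from the home page (minimum clicks to reach it)."""
    dist = {home: 0}
    q = deque([home])
    while q:
        p = q.popleft()
        for nxt in site[p]:
            if nxt not in dist:
                dist[nxt] = dist[p] + 1
                q.append(nxt)
    return dist

d = click_distances()
print(max(d.values()))           # depth of the deepest page: 3
print(sum(d.values()) / len(d))  # mean click distance as an accessibility score: 1.5
```

A design change (say, adding a link from the home page to page 5) would lower both numbers, giving the quantifiable before/after comparison that an analytical framework enables.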
Retrospective Higher-Order Markov Processes for User Trails
Users form information trails as they browse the web, checkin with a
geolocation, rate items, or consume media. A common problem is to predict what
a user might do next for the purposes of guidance, recommendation, or
prefetching. First-order and higher-order Markov chains have been widely used
methods to study such sequences of data. First-order Markov chains are easy to
estimate, but lack accuracy when history matters. Higher-order Markov chains,
in contrast, have too many parameters and suffer from overfitting the training
data. Fitting these parameters with regularization and smoothing only offers
mild improvements. In this paper we propose the retrospective higher-order
Markov process (RHOMP) as a low-parameter model for such sequences. This model
is a special case of a higher-order Markov chain where the transitions depend
retrospectively on a single history state instead of an arbitrary combination
of history states. There are two immediate computational advantages: the number
of parameters is linear in the order of the Markov chain and the model can be
fit to large state spaces. Furthermore, by providing a specific structure to
the higher-order chain, RHOMPs improve the model accuracy by efficiently
utilizing history states without risks of overfitting the data. We demonstrate
how to estimate a RHOMP from data and we demonstrate the effectiveness of our
method on various real application datasets spanning geolocation data, review
sequences, and business locations. The RHOMP model uniformly outperforms
higher-order Markov chains, Kneser-Ney regularization, and tensor
factorizations in terms of prediction accuracy.
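A heavily simplified, count-based sketch of the retrospective idea: each lookback distance gets its own first-order transition table, and predictions mix them. The fixed uniform mixing weights and the toy trails below are assumptions for illustration; the actual RHOMP learns the weights:

```python
from collections import defaultdict

def fit_retrospective(trails, order=2):
    """One first-order count table per lookback distance -- parameters grow
    linearly in the order, unlike a full higher-order Markov chain."""
    tables = [defaultdict(lambda: defaultdict(int)) for _ in range(order)]
    for trail in trails:
        for t in range(1, len(trail)):
            for k in range(order):          # condition on the state k+1 steps back
                if t - 1 - k >= 0:
                    tables[k][trail[t - 1 - k]][trail[t]] += 1
    return tables

def predict(tables, history):
    """Score candidates by mixing the per-lookback conditional distributions
    with uniform weights (the real model learns these weights)."""
    scores = defaultdict(float)
    for k, table in enumerate(tables):
        if len(history) > k:
            row = table[history[-1 - k]]
            total = sum(row.values())
            if total:
                for nxt, c in row.items():
                    scores[nxt] += c / total / len(tables)
    return max(scores, key=scores.get) if scores else None

trails = [["a", "b", "c"], ["a", "b", "c"], ["x", "b", "y"]]
model = fit_retrospective(trails)
print(predict(model, ["a", "b"]))  # "c": both lookbacks agree after a, b
print(predict(model, ["x", "b"]))  # "y": the state 2 steps back disambiguates
```

The second prediction shows the point of retrospection: a pure first-order chain conditioned only on "b" would favor "c", but the single history state "x" corrects it without the parameter blow-up of a full second-order chain.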