    Incorporating web analysis into neural networks: An example in Hopfield net searching

    Neural networks have been used in various applications on the World Wide Web, but most of them rely only on the available input-output examples, without incorporating Web-specific knowledge, such as Web link analysis, into the network design. In this paper, we propose a new approach in which the Web is modeled as an asymmetric Hopfield Net. Each neuron in the network represents a Web page, and the connections between neurons represent the hyperlinks between Web pages. Web content analysis and Web link analysis are also incorporated into the model by adding a page content score function and a link score function to the weights of the neurons and the synapses, respectively. A simulation study was conducted to compare the proposed model with traditional Web search algorithms, namely a breadth-first search and a best-first search using PageRank as the heuristic. The results showed that the proposed model performed more efficiently and effectively in searching for domain-specific Web pages. We believe that the model can also be useful in other Web applications such as Web page clustering and search result ranking. © 2007 IEEE.
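    To make the activation mechanism concrete, the following is a minimal Python sketch of how such an asymmetric Hopfield-style search could be organized. The sigmoid transfer function, the threshold value, and the exact way content and link scores enter the weights are illustrative assumptions, not the authors' formulation.

        import numpy as np

        def hopfield_search(adjacency, content_score, link_score, seeds,
                            iterations=50, theta=0.1):
            """Spread activation from seed pages over a directed link graph.

            adjacency:     (n, n) matrix, adjacency[i, j] = 1 if page i links to page j
            content_score: (n,) relevance of each page's text to the target domain
            link_score:    (n, n) relevance of each hyperlink (e.g. its anchor text)
            seeds:         indices of known-relevant starting pages
            """
            n = adjacency.shape[0]
            # Synaptic weights combine the link structure with link content analysis.
            weights = adjacency * link_score
            activation = np.zeros(n)
            activation[seeds] = 1.0
            for _ in range(iterations):
                # Incoming activation for each page, biased by its own content score.
                incoming = weights.T @ activation + content_score
                # Continuous sigmoid transfer function with threshold theta.
                updated = 1.0 / (1.0 + np.exp(-(incoming - theta)))
                updated[seeds] = 1.0                     # keep seed pages clamped
                if np.allclose(updated, activation, atol=1e-6):
                    break                                # the network has converged
                activation = updated
            # Highly activated neurons correspond to candidate domain-specific pages.
            return np.argsort(-activation)

    Because the weight matrix need not be symmetric (page i can link to page j without a back-link), convergence is not guaranteed in general, which is why the sketch caps the number of iterations.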

    Web site structure mining using social network analysis

    Purpose – Web sites are typically designed according to a variety of criteria, yet web site structure determines browsing behavior and way-finding results. The aim of this study is to identify the main profiles of web sites' organizational structure by modeling them as graphs and considering several social network analysis features. Design/methodology/approach – A case study based on 80 institutional Spanish universities' web sites has been used for this purpose. For each root domain, two different networks have been considered: the first is the domain network, and the second is the page network. In both cases, several indicators related to social network analysis have been evaluated to characterize the web site structure. Factor analysis provides the statistical methodology to adequately extract the main web site profiles in terms of their internal structure. Findings – This paper allows the categorization of web site design styles and provides general guidelines to assist designers in identifying areas for creating and improving institutional web sites. The findings offer practical implications to web site designers for creating and maintaining an effective web presence and for improving usability. Research limitations/implications – The research is limited to 80 institutional Spanish universities' web sites. Institutional university web sites from other countries could be analyzed, and the conclusions compared or enlarged. Originality/value – This paper highlights the importance of web sites' internal structure and its implications for usability and way-finding results. Unlike previous research, the paper focuses on comparing the internal structure of institutional web sites, rather than analyzing the web as a whole or the interrelations among web sites. Funding: Ministerio de Educación y Ciencia DPI2007-60128; Junta de Andalucía, Consejería de Innovación, Ciencia y Empresa P07-TIC-0262.
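    As a rough illustration of the kind of features involved, the sketch below computes a few common social network analysis indicators over a toy internal link graph using networkx. The example graph and the particular indicators chosen are assumptions for illustration; the study's actual feature set may differ.

        import networkx as nx

        # Toy internal link graph for a single institutional web site.
        site = nx.DiGraph()
        site.add_edges_from([
            ("home", "about"), ("home", "courses"), ("home", "research"),
            ("courses", "home"), ("research", "home"), ("about", "contact"),
        ])

        indicators = {
            "density": nx.density(site),                     # how saturated the linking is
            "reciprocity": nx.reciprocity(site),             # share of mutual links
            "avg_in_degree": sum(d for _, d in site.in_degree()) / site.number_of_nodes(),
            "betweenness": nx.betweenness_centrality(site),  # pages that mediate way-finding
        }
        print(indicators)

    Indicators like these, computed per domain network and per page network, would then feed the factor analysis that extracts the main structural profiles.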

    Discussion on Teaching Reform of Advanced Computer Network Course

    “Advanced Computer Network” is a professional elective course intended to improve network skills. The purpose of this course is to enable students to understand the frontier problems, methods, developments, and trends in their fields of analysis and research. The rapid development of Internet technology has brought many challenges to the teaching of the “Advanced Computer Network” course: how to establish a complete theoretical and practical teaching framework for graduate students; how to design course content, including advanced network technology topics, basic web page design principles, practical network teaching methods, and paper-reading seminars and writing; and how to build an evaluation mechanism for managing the entire teaching process. The aim is to give equal emphasis to basic theory and cutting-edge research, and to theoretical teaching and practical training, so that graduate students truly understand and master the theoretical knowledge, improve their practical ability, and lay a solid foundation for scientific research and subsequent innovation.

    Network Monitoring System (NMS)

    Due to rapid changes and consequent new threats to computer networks, there is a need to design systems that enhance network security and make network administrators fully aware of the potential vulnerabilities of their networks. This paper designs a Network Monitoring System (NMS), an active defense and complex network surveillance platform designed for ISPs to meet their most rigorous security requirements. The system is motivated by the great need of government agencies, e-commerce companies, and Web development organizations to secure their computer networks. The proposed system also enables network administrators to understand the vulnerabilities affecting their computer networks and thereby improve network security. It is a lawful network traffic (Internet Service Provider IP traffic) interception system whose main task is to obtain network communications and give lawful authorities access to the intercepted traffic for data analysis and/or evidence. Such data generally consist of signaling, network management information, or the content of network communications. The intercepted IP traffic is gathered and analyzed for network vulnerabilities in real time. Then, the corresponding TCP/UDP traffic (web pages, email messages, VoIP calls, DHCP traffic, files transferred over the LAN such as HTML files, images, and video files, etc.) is rebuilt and displayed. Based on the results of the analysis of the rebuilt TCP/UDP traffic, an alarm can be generated if malicious behavior is detected. Experimental results show that the proposed system has many
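    A minimal sketch of the capture-and-inspect loop such a system implies, using Scapy in Python. The toy payload signatures and the per-packet check are assumptions for illustration; the proposed NMS rebuilds entire TCP/UDP sessions (web pages, emails, VoIP calls, transferred files) before analysis rather than inspecting isolated packets.

        from scapy.all import IP, TCP, UDP, Raw, sniff

        SUSPICIOUS = (b"DROP TABLE", b"/etc/passwd")  # toy signatures, not a real rule set

        def inspect(packet):
            """Flag packets whose payload matches a known-bad signature."""
            if IP in packet and Raw in packet:
                payload = bytes(packet[Raw].load)
                proto = "TCP" if TCP in packet else "UDP" if UDP in packet else "other"
                if any(sig in payload for sig in SUSPICIOUS):
                    # A full NMS would raise an alarm and log the rebuilt session here.
                    print(f"ALARM: suspicious {proto} payload from {packet[IP].src}")

        # Live capture requires administrative privileges (and, for an ISP,
        # lawful-interception authorization).
        sniff(filter="tcp or udp", prn=inspect, store=False)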

    A phenomenological analysis of an instructional systems design creative project

    This research paper is a phenomenological analysis of a creative project involving University of Northern Iowa undergraduate art students in the planning and creation of visual illustrations, graphic design concepts, .html documents, and imagery for a World Wide Web intranet/Internet virtual space. The analysis looks at instructional design as a creative process and at the phenomenology of the UNI Art/Cat (Art Resources Technology/Computer Assisted Training) computer laboratory. The mission, goals, and objectives of the creative project, experiential and experimental philosophies of education, and the phenomenologies of the instructional design process are the main considerations. The methodology of this thesis is primarily concerned with action research and research as lived experience. The generational aspects of computer hardware and software, and the affective aspects of the infrastructure's evolution upon instructional development, are examined. This generation of techno-apparatus includes the Macintosh G3 Personal Computer in a network environment, Agfa and Hewlett-Packard Flatbed Scanners, Polaroid Slide Scanners, Adobe Graphic Design Software, and Symantec Visual Page Web Design Software. Commentary on the social and bureaucratic considerations in this particular creative project, and discussion of the collaboration with UNI Art Department administration, faculty, and students, is included with the final conclusions and recommendations.

    Look back, look around: A systematic analysis of effective predictors for new outlinks in focused Web crawling

    Small and medium enterprises rely on detailed Web analytics to be informed about their market and competition. Focused crawlers meet this demand by crawling and indexing specific parts of the Web. Critically, a focused crawler must quickly find new pages that have not yet been indexed. Since a new page can be discovered only by following a new outlink, predicting new outlinks is very relevant in practice. In the literature, many feature designs have been proposed for predicting changes in the Web. In this work we provide a structured analysis of this problem, using new outlinks as our running prediction target. Specifically, we unify earlier feature designs in a taxonomic arrangement of features along two dimensions: static versus dynamic features, and features of a page versus features of the network around it. Within this taxonomy, complemented by our new (mainly, dynamic network) features, we identify the best predictors for new outlinks. Our main conclusion is that the most informative features are the recent history of new outlinks on a page itself and on its content-related pages. Hence, we propose a new 'look back, look around' (LBLA) model that uses only these features. With the obtained predictions, we design a number of scoring functions to guide a focused crawler to the pages with the most new outlinks, and compare their performance. The LBLA approach proved extremely effective, outperforming other models, including those that use the most complete set of features. One of the learners we use is the recent NGBoost method, which assumes a Poisson distribution for the number of new outlinks on a page and learns its parameters. This connects two previously unrelated avenues in the literature: predictions based on features of a page, and those based on probabilistic modelling. All experiments were carried out on an original dataset, made available by a commercial focused crawler.
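    To illustrate the probabilistic component, the sketch below fits NGBoost with a Poisson output distribution to synthetic counts of new outlinks. The two features stand in for the LBLA idea (the page's own recent new-outlink history, plus that of its content-related pages); the data-generating process is invented for the example and is not the paper's dataset.

        import numpy as np
        from ngboost import NGBRegressor
        from ngboost.distns import Poisson

        rng = np.random.default_rng(0)
        n = 1000
        # Feature 1: recent new outlinks on the page itself ("look back").
        # Feature 2: recent new outlinks on content-related pages ("look around").
        X = rng.poisson(lam=3.0, size=(n, 2)).astype(float)
        rate = np.maximum(0.5 * X[:, 0] + 0.3 * X[:, 1], 0.1)  # assumed latent rate
        y = rng.poisson(lam=rate)                               # observed new-outlink counts

        model = NGBRegressor(Dist=Poisson, n_estimators=200, verbose=False)
        model.fit(X, y)

        print(model.predict(X[:5]))    # expected new outlinks per page
        dist = model.pred_dist(X[:5])  # full Poisson distributions, usable for
                                       # uncertainty-aware crawl scheduling

    The predicted distribution, rather than a bare point estimate, is what lets a crawl scheduler trade off expected gain against uncertainty when ranking pages to revisit.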