
    Global Diffusion of the Internet XV: Web 2.0 Technologies, Principles, and Applications: A Conceptual Framework from Technology Push and Demand Pull Perspective

    Web 2.0, the current Internet evolution, can be described by several key features of an expanded Web: it is more interactive; allows easy social interaction through participation and collaboration from a variety of human sectors; responds more immediately to users' queries and needs; is easier to search; and provides a faster, smoother, more realistic and engaging search experience, often with automatic updates to users. The purpose of this study is three-fold. First, the primary goal is to propose a conceptual Web 2.0 framework that provides a better understanding of the Web 2.0 concept by classifying its current key components in a holistic manner. Second, using several selected key components from the conceptual framework, this study conducts case analyses of Web 2.0 applications to discuss how they have adopted the selected key features (i.e., participation, collaboration, rich user experience, social networking, semantics, and interactivity responsiveness) of the conceptual Web 2.0 framework. Finally, the study provides an insightful discussion of some challenges and opportunities presented by Web 2.0 to education, business, and social life.

    CUBA: Artificial conviviality and user-behaviour analysis in web-feeds

    Conviviality is a concept of great depth that plays an important role in any social interaction. A convivial relation between individuals is one that allows the participating individuals to behave and interact with each other following a set of conventions that are shared, commonly agreed upon, or at least understood. This presupposes an implicit or explicit regulation mechanism, based on consensus or social contracts, that applies to the behaviours and interactions of the participating individuals. With respect to an intelligent web-based system, one applicable social contribution is giving assistance to other users in unclear situations and guiding them to the right decision whenever a conflict arises. Such a convivial social biotope depends deeply on both the implicit and explicit co-operation and collaboration of natural users inside a community. Here, individual conviviality may benefit from “The Wisdom of Crowds”, which fosters a dynamic understanding of user behaviour and a strong influence of one individual’s well-being on others. The web-based system CUBA focuses on such behavioural analysis through profiling and demonstrates a convivial stay within a web-based feed system.
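    The crowd-driven side of such an analysis can be sketched in a few lines: aggregate the feed-reading behaviour of a community and use it to guide an individual user. The function and data below are hypothetical illustrations of this "wisdom of crowds" principle, not CUBA's actual algorithm, which the abstract does not reproduce:

    ```python
    from collections import Counter

    def crowd_recommend(user_clicks, all_clicks, k=2):
        """Suggest feeds the crowd reads that this user has not yet visited."""
        crowd = Counter()
        for clicks in all_clicks:
            crowd.update(clicks)          # aggregate community behaviour
        seen = set(user_clicks)
        # Rank by crowd popularity, skipping feeds the user already knows.
        return [feed for feed, _ in crowd.most_common() if feed not in seen][:k]

    # Hypothetical click logs of four community members.
    users = [["news", "tech"], ["news", "sport"], ["tech", "sport"], ["news"]]
    print(crowd_recommend(["tech"], users))  # ['news', 'sport']
    ```

    A real system would weight clicks by recency and dwell time rather than raw counts, but the aggregation step has this shape.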

    Emerging technologies for learning (volume 2)


    Cost-Aware Resource Management for Decentralized Internet Services

    Decentralized network services, such as naming systems, content distribution networks, and publish-subscribe systems, play an increasingly critical role and are required to provide high performance, low latency service, achieve high availability in the presence of network and node failures, and handle a large volume of users. Judicious utilization of expensive system resources, such as memory space, network bandwidth, and number of machines, is fundamental to achieving the above properties. Yet, current network services typically rely on less-informed, heuristic-based techniques to manage scarce resources, and often fall short of expectations. This thesis presents a principled approach for building high performance, robust, and scalable network services. The key contribution of this thesis is to show that resolving the fundamental cost-benefit tradeoff between resource consumption and performance through mathematical optimization is practical in large-scale distributed systems, and enables decentralized network services to efficiently meet system-wide performance goals. This thesis presents a practical approach for resource management in three stages: analytically model the cost-benefit tradeoff as a constrained optimization problem, determine a near-optimal resource allocation strategy on the fly, and enforce the derived strategy through light-weight, decentralized mechanisms. It builds on self-organizing structured overlays, which provide failure resilience and scalability, and complements them with stronger performance guarantees and robustness under sudden changes in workload. This work enables applications to meet system-wide performance targets, such as low average response times, high cache hit rates, and small update dissemination times with low resource consumption. Alternatively, applications can make the maximum use of available resources, such as storage and bandwidth, and derive large gains in performance.
    I have implemented an extensible framework called Honeycomb to perform cost-aware resource management on structured overlays based on the above approach and built three critical network services using it. These services consist of a new name system for the Internet called CoDoNS that distributes data associated with domain names, an open-access content distribution network called CobWeb that caches web content for faster access by users, and an online information monitoring system called Corona that notifies users about changes to web pages. Simulations and performance measurements from a planetary-scale deployment show that these services provide unprecedented performance improvement over the current state of the art.
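    The core idea, trading resource cost against performance benefit under a budget, can be illustrated with a toy greedy allocator. This is only a sketch of the general cost-benefit shape of such problems, not the optimization machinery that Honeycomb actually implements; all names and numbers below are invented for illustration:

    ```python
    def allocate(objects, budget):
        """Pick objects to replicate under a memory budget, greedily
        maximizing benefit (e.g. expected cache hits) per unit cost.
        objects: list of (name, benefit, cost) tuples."""
        chosen, spent = [], 0
        # Consider objects in order of benefit density, best first.
        for name, benefit, cost in sorted(objects, key=lambda o: o[1] / o[2], reverse=True):
            if spent + cost <= budget:
                chosen.append(name)
                spent += cost
        return chosen

    # Three equally sized objects with different expected benefit.
    items = [("popular", 90, 30), ("warm", 60, 30), ("cold", 30, 30)]
    print(allocate(items, 60))  # ['popular', 'warm']
    ```

    A principled system would solve the constrained optimization exactly or near-optimally and re-derive the allocation as the workload shifts; the greedy density heuristic only conveys why "benefit per unit of resource" is the quantity being traded off.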

    Design of an aggregator for managing informative big data

    The design and characteristics of a new open-source content aggregation program, AXYZ, are described. Several features of the program stand out, including its processing engine for syndication channels, its capability to monitor information retrieval in real time, the configurability of the aggregator's behavior, automatic content classification, and new models for representing information through relational interactive maps. In addition, the aggregation program is designed to manage thousands of syndication channels in the RSS format. It also provides statistics that can be used to study the output of any information producer and the impact of the information published in other sources. The AXYZ modules are capable of comparing the relationship between news items or information from different sources and the degree of influence, which is detected through patterns.
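    The abstract does not show AXYZ's internals. As a minimal illustration of the kind of RSS 2.0 channel processing such an aggregator performs, Python's standard library alone can parse a feed and extract its items (the feed content here is a made-up example):

    ```python
    import xml.etree.ElementTree as ET

    RSS = """<?xml version="1.0"?>
    <rss version="2.0"><channel>
      <title>Example feed</title>
      <item><title>First post</title><link>http://example.org/1</link></item>
      <item><title>Second post</title><link>http://example.org/2</link></item>
    </channel></rss>"""

    def parse_items(xml_text):
        """Return (title, link) pairs for every <item> in an RSS 2.0 document."""
        root = ET.fromstring(xml_text)
        return [(i.findtext("title"), i.findtext("link")) for i in root.iter("item")]

    for title, link in parse_items(RSS):
        print(title, link)
    ```

    An aggregator handling thousands of channels would add scheduled polling, conditional HTTP requests, and de-duplication on top of this parsing step.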

    A series of case studies to enhance the social utility of RSS

    RSS (really simple syndication, rich site summary or RDF site summary) is a dialect of XML that provides a method of syndicating on-line content, where postings consist of frequently updated news items, blog entries and multimedia. RSS feeds, produced by organisations or individuals, are often aggregated, and delivered to users for consumption via readers. The semi-structured format of RSS also allows the delivery/exchange of machine-readable content between different platforms and systems. Articles on web pages frequently include icons that represent social media services which facilitate social data. Amongst these, RSS feeds deliver data which is typically presented in the journalistic style of headline, story and snapshot(s). Consequently, applications and academic research have employed RSS on this basis. Therefore, within the context of social media, the question arises: can the social function, i.e. utility, of RSS be enhanced by producing from it data which is actionable and effective? This thesis is based upon the hypothesis that the fluctuations in the keyword frequencies present in RSS can be mined to produce actionable and effective data, to enhance the technology's social utility. To this end, we present a series of laboratory-based case studies which demonstrate two novel and logically consistent RSS-mining paradigms. Our first paradigm allows users to define mining rules to mine data from feeds. The second paradigm employs a semi-automated classification of feeds and correlates this with sentiment. We visualise the outputs produced by the case studies for these paradigms, where they can benefit users in real-world scenarios, varying from statistics and trend analysis to mining financial and sporting data. 
The contributions of this thesis to web engineering and text mining are the demonstration of the proof of concept of our paradigms, through the integration of an array of open-source, third-party products into a coherent and innovative, alpha-version prototype software implemented in a Java JSP/servlet-based web application architecture.
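    The keyword-fluctuation idea behind the first paradigm can be sketched as follows. The thesis's actual mining rules live in a Java prototype; this Python fragment, with invented headlines, only illustrates the principle of flagging keywords whose frequency spikes from one time window to the next:

    ```python
    from collections import Counter

    def keyword_counts(headlines):
        """Count keyword frequencies over one window of RSS item titles."""
        counts = Counter()
        for text in headlines:
            counts.update(w.lower().strip(".,") for w in text.split())
        return counts

    def spikes(prev, curr, factor=2.0, min_count=3):
        """Keywords whose frequency at least doubled between windows."""
        return sorted(w for w, c in curr.items()
                      if c >= min_count and c >= factor * max(prev.get(w, 0), 1))

    day1 = keyword_counts(["Markets steady", "Rain expected", "Markets open flat"])
    day2 = keyword_counts(["Markets surge", "Markets rally on news",
                           "Tech markets jump", "Markets close higher"])
    print(spikes(day1, day2))  # ['markets']
    ```

    Real mining rules would also filter stop words and normalise for feed volume, but the window-over-window frequency comparison is the actionable signal being extracted.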

    Sept. 2005


    CIRA annual report FY 2016/2017

    Reporting period April 1, 2016-March 31, 2017

    Journalism as usual: The use of social media as a newsgathering tool in the coverage of the Iranian elections in 2009

    The Iranian elections of June 2009 and the ensuing protests were hailed as the 'Twitter revolution' by the media in the United Kingdom. However, this study of the use of sources by journalists covering the events shows that, despite their rhetoric about the importance of social media in alerting the global community to events in Iran, journalists themselves did not turn to social media for their own information, but relied mostly on traditional sourcing practices: political statements, expert opinion and a handful of 'man on the street' quotes for colour. This study shows that although the mythology of the Internet as a place where all voices are equal and have equal access to public discourse – a kind of idealized 'public sphere' – persists, the sourcing practices of journalists and the traditions of coverage continue to ensure that traditional voices and sources are heard above the crowd.