
    Quality assessment technique for ubiquitous software and middleware

    Ubiquitous computing systems represent a new paradigm for computing and information systems. Because the field's issues are largely technology-oriented, researchers have concentrated on feasibility studies of the technologies rather than on building quality-assurance indices or guidelines. In this context, measuring quality is the key to developing high-quality ubiquitous computing products. Various quality models have therefore been defined, adopted and enhanced over the years; for example, the recognised standard quality model ISO/IEC 9126 is the result of a consensus on a software quality model with three levels: characteristics, sub-characteristics, and metrics. However, this scheme is unlikely to be directly applicable to ubiquitous computing environments, which differ considerably from conventional software, so much attention is being given to reformulating existing methods and, especially, to elaborating new assessment techniques for ubiquitous computing environments. This paper selects appropriate quality characteristics for the ubiquitous computing environment, which can be used as the quality target for both ubiquitous computing product evaluation processes and development processes. Each of the quality characteristics has been expanded with evaluation questions and metrics, in some cases with measures. In addition, the quality model has been applied in an industrial ubiquitous computing setting. This application revealed that, while the approach is sound, some parts need further development.
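
    The three-level structure mentioned above (characteristics, sub-characteristics, metrics, each with evaluation questions and optional measures) can be pictured with a minimal data-structure sketch. The class names, fields, and the example entry below are illustrative assumptions, not taken from the paper.

        from dataclasses import dataclass, field

        @dataclass
        class Metric:
            # A single measurable item, e.g. "sensor-failure recovery rate".
            name: str
            question: str   # evaluation question the metric answers
            unit: str = ""  # optional measure/unit, e.g. "%" or "ms"

        @dataclass
        class SubCharacteristic:
            name: str
            metrics: list = field(default_factory=list)

        @dataclass
        class Characteristic:
            # Top level of an ISO/IEC 9126-style model:
            # characteristic -> sub-characteristics -> metrics.
            name: str
            sub_characteristics: list = field(default_factory=list)

        # Hypothetical entry for a ubiquitous-computing quality target.
        reliability = Characteristic(
            name="Reliability",
            sub_characteristics=[
                SubCharacteristic(
                    name="Fault tolerance",
                    metrics=[Metric(
                        name="Sensor-failure recovery rate",
                        question="What fraction of sensor failures does the system mask?",
                        unit="%",
                    )],
                )
            ],
        )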

    Quantum Algorithms for the Most Frequently String Search, Intersection of Two String Sequences and Sorting of Strings Problems

    We study algorithms for solving three problems on strings. The first is the Most Frequently String Search Problem: given a sequence of $n$ strings of length $k$, find the string that occurs in the sequence most often. We propose a quantum algorithm with query complexity $\tilde{O}(n\sqrt{k})$, a speed-up over the deterministic algorithm, which requires $\Omega(nk)$ queries. The second problem is searching for the intersection of two sequences of strings, all of length $k$, where the first set has size $n$ and the second has size $m$. We propose a quantum algorithm with query complexity $\tilde{O}((n+m)\sqrt{k})$, a speed-up over the deterministic algorithm, which requires $\Omega((n+m)k)$ queries. The third problem is sorting $n$ strings of length $k$. On the one hand, it is known that quantum algorithms cannot sort arbitrary objects asymptotically faster than classical ones; on the other hand, we focus on sorting strings, which are not arbitrary objects. We propose a quantum algorithm with query complexity $O(n(\log n)^2\sqrt{k})$, a speed-up over the deterministic algorithm (radix sort), which requires $\Omega((n+d)k)$ queries, where $d$ is the size of the alphabet. Comment: The paper was presented at TPNC 201
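
    For reference, the query-complexity bounds claimed above can be set side by side; all symbols are as defined in the abstract.

        \begin{align*}
        \text{Most frequent string:} &\quad \tilde{O}(n\sqrt{k}) \text{ (quantum)} &&\text{vs.}\quad \Omega(nk) \text{ (deterministic)}\\
        \text{Intersection of two sequences:} &\quad \tilde{O}((n+m)\sqrt{k}) &&\text{vs.}\quad \Omega((n+m)k)\\
        \text{Sorting of } n \text{ strings:} &\quad O(n(\log n)^2\sqrt{k}) &&\text{vs.}\quad \Omega((n+d)k) \text{ (radix sort)}
        \end{align*}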

    Communities at a Crossroads. Material semiotics for online sociability in the fade of cyberculture

    How should online sociability be conceptualized in the 21st century? To answer this question, Communities at a Crossroads looks back at the mid-2000s. With the burst of the creative-entrepreneur alliance, the territorialization of the internet and the commercialization of interpersonal ties, that period constituted a turning point for digital communitarian cultures. Many of the techno-libertarian culture's utopias underpinning ideas of online sociability faced systematic counter-evidence. This paradigm change still has consequences today. Avoiding both empty invocations of community and swift conclusions of doom, Annalisa Pelizza investigates the theories of action that have underpinned the development of techno-social digital assemblages after the 'golden age' of online communities. Communities at a Crossroads draws upon an analysis of Ars Electronica's Digital Communities archive, the largest of its kind worldwide, and in doing so presents a multi-faceted picture of internet sociability across the turn of the century. Privileging an anti-essentialist, performative approach over sociological understandings of online communities, Communities at a Crossroads proposes a radical epistemological turn. It argues that, in order to conceptualize contemporary online sociability, we need first to abandon the techno-libertarian communalist rhetoric. Then it is necessary to move beyond the foundational distinction between Gemeinschaft and Gesellschaft and adopt a material semiotic approach. In the end, we might have to relinquish the effort to define online or digital communities and engage in more meaningful mapping exercises.

    The film of tomorrow: a cultural history of videoblogging

    Videoblogging is a form of cultural production that emerged in the early 2000s as a result of the increasing availability of cheap digital recording equipment, new video-editing software, video website hosting and innovative distribution networks across the internet. This thesis explores the close entanglement of culture and technology in this early and under-examined area of media production, most notably in the self-definition and development of a specific community around video practices and technologies between 2004 and 2009. These videobloggers' digital works are presented as an original case study of material digital culture on the internet, one that also produced a distinctive aesthetic style. The thesis traces the discourses and technological infrastructures that were developed both within and around the community of videobloggers and that created the important pre-conditions for the video artefacts they produced. Through an ethnographically informed cultural history of the practices and technologies of videoblogging, the thesis engages with the way in which new forms of cultural and technical hybrids have emerged in an increasingly digital age. The ethnographic research is informed by histories of film and video, which contribute to the theoretical understanding and contextualisation of videoblogging as an early digital community, one that has been somewhat neglected in favour of research on mainstream online video websites such as YouTube. The thesis also contributes to scholarly understanding of contemporary digital video practices, and explores how the history of earlier amateur and semi-professional film and video has influenced the practices, technologies and aesthetic styles of the videobloggers. It is also shown how their aesthetic has been drawn on and amplified in network culture, mainstream media, and contemporary media and cultural production. Through a critical mapping of the socio-technical structures of videoblogging, the thesis argues that the trajectories of future media and cultural production draw heavily on the practices and aesthetics of these early hybrid networked cultural-technical communities.

    Tracing back Communities. An Analysis of Ars Electronica's Digital Communities archive from an ANT perspective

    Since long before the popularization of the Web, community-making has been a significant driving force for the development of the Internet. As a consequence, in the mid-1990s online communities became a key object of study at the intersection of the social sciences, organizational studies and computer science. Today, about fifteen years after these early studies, the concept of the 'online community' seems to be at stake. While communitarian ties enabled by digital media are invoked more and more, in the late 2000s the Internet is revealing itself to be a far more bureaucratic and profit-oriented domain than ever, to the point that it is not clear whether there exist online ties that are specific enough to be called 'communitarian'. In order to analyse such an opaque and unstable object of study as current techno-social assemblages, innovative methods specifically developed to study fuzzy objects have to be devised and some epistemological questions have to be addressed. This research starts from the impasse that digital communitarian culture is experiencing at the end of the 2000s and borrows some epistemological insights from Actor-Network Theory. By analyzing the entry forms submitted to the world's leading competition for digital communities, Prix Ars Electronica, the research calls into question some 'black-boxed' concepts like 'cyberculture', 'digital revolution', 'empowerment' and 'online community' itself. On the one hand, the results challenge both leading sociological positions and hype-generated commonplaces. On the other hand, they offer evidence for arguments according to which current ICT developments represent the beginning of a new phase of technological enclosure.

    Cross-layer latency-aware and -predictable data communication

    Cyber-physical systems are making their way into more aspects of everyday life. These systems are increasingly distributed and hence require networked communication to fulfil control tasks in a coordinated manner. Providing this in a robust and resilient manner demands latency-awareness and latency-predictability at all layers of the communication and computation stack. This thesis addresses how these two latency-related properties can be implemented at the transport layer to serve control applications in ways that traditional approaches such as TCP or RTP cannot. To this end, the Predictably Reliable Real-time Transport (PRRT) protocol is presented, including its unique features (e.g. partially reliable, ordered, in-time delivery, and latency-avoiding congestion control) and unconventional APIs. The protocol has been evaluated extensively using the X-Lap toolkit, which was developed specifically to help protocol designers improve the latency, timing, and energy characteristics of protocols in a cross-layer, intra-host fashion. PRRT effectively circumvents latency-inducing bufferbloat using X-Pace, an implementation of the cross-layer pacing approach presented in this thesis; this is shown using experimental evaluations on real Internet paths. Apart from PRRT, the thesis presents means to make TCP-based transport aware of individual link latencies and to increase the predictability of end-to-end delays using Transparent Transmission Segmentation.
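
    To illustrate the idea of partially reliable, in-time delivery mentioned above, here is a minimal sketch in Python. It does not use the real PRRT API; the function name, the deadline parameter and the retransmission rule are illustrative assumptions only, showing how a sender might suppress retransmissions that can no longer meet an application deadline.

        import time

        # Hypothetical sender-side rule for partial reliability with a delivery
        # deadline: retransmit a lost packet only if it can still arrive in time.

        def should_retransmit(send_time: float, deadline_ms: float, rtt_ms: float) -> bool:
            """Return True if a retransmission can still meet the deadline.

            send_time   -- time the packet was first sent (seconds, time.time())
            deadline_ms -- application deadline for this packet, relative to send_time
            rtt_ms      -- current round-trip-time estimate
            """
            elapsed_ms = (time.time() - send_time) * 1000.0
            # A retransmission needs roughly one more RTT to be delivered.
            return elapsed_ms + rtt_ms <= deadline_ms

        # Example: a sample sent 40 ms ago with a 100 ms deadline and a 30 ms RTT
        # is still worth retransmitting; the same sample 80 ms later is not.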

    DIVE on the internet

    This dissertation reports research and development of a platform for Collaborative Virtual Environments (CVEs). It focuses in particular on two major challenges: supporting the rapid development of scalable applications and easing their deployment on the Internet. The work employs a research method based on prototyping and refinement and promotes the use of this method for application development. A number of the solutions herein are in line with other CVE systems. One of the strengths of this work lies in its global approach to the issues raised by CVEs and in the recognition that such complex problems are best tackled with a multi-disciplinary approach that understands both user and system requirements. CVE application deployment is aided by an overlay network that is able to complement any IP multicast infrastructure in place. Apart from complementing a weakly deployed worldwide multicast, this infrastructure provides a degree of introspection, remote control and visualisation; as such, it forms an important aid in assessing the scalability of running applications. This scalability is further facilitated by specialised object distribution algorithms and an open framework for the implementation of novel partitioning techniques. CVE application development is eased by a scripting language, which enables rapid development and favours experimentation. This scripting language interfaces with many aspects of the system and enables the prototyping of distribution-related components as well as user interfaces. It is the key construct of a distributed environment to which components, written in different languages, connect and on which they operate in a network-abstracted manner. The solutions proposed are exemplified and strengthened by three collaborative applications. The Dive room system is a virtual environment modelled after the room metaphor and supporting asynchronous and synchronous cooperative work. WebPath is a companion application to a Web browser that seeks to make the current history of page visits more visible and usable. Finally, the London travel demonstrator supports travellers by providing an environment where they can explore the city, utilise group collaboration facilities, rehearse particular journeys and access tourist information data.
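
    The scalability mechanisms mentioned above (specialised object distribution and an open framework for partitioning techniques) can be pictured with a small sketch. This is not the system's actual implementation; the spatial grid, cell size and subscription rule below are illustrative assumptions showing how object updates might be delivered only to peers whose region of interest covers the object's cell.

        from collections import defaultdict

        # Hypothetical interest-based partitioning: the virtual world is divided
        # into grid cells, peers subscribe to the cells they can "see", and object
        # updates are forwarded only to subscribers of the object's current cell.

        CELL_SIZE = 10.0  # illustrative world units per grid cell

        def cell_of(x: float, y: float) -> tuple:
            return (int(x // CELL_SIZE), int(y // CELL_SIZE))

        class Partitioner:
            def __init__(self):
                self.subscribers = defaultdict(set)  # cell -> set of peer ids

            def subscribe(self, peer_id: str, x: float, y: float):
                self.subscribers[cell_of(x, y)].add(peer_id)

            def recipients(self, obj_x: float, obj_y: float) -> set:
                # Only peers interested in the object's cell receive its updates.
                return self.subscribers[cell_of(obj_x, obj_y)]

        # Usage: peers A and B watch different areas; an update at (12, 3)
        # reaches only A.
        p = Partitioner()
        p.subscribe("A", 14.0, 2.0)
        p.subscribe("B", 55.0, 40.0)
        assert p.recipients(12.0, 3.0) == {"A"}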