
    Digital libraries and minority languages

    Digital libraries have a pivotal role to play in the preservation and maintenance of international cultures in general and minority languages in particular. This paper outlines a software tool for building digital libraries that is well adapted for creating and distributing local information collections in minority languages, and describes some contexts in which it is used. The system can make multilingual documents available in structured collections and allows them to be accessed via multilingual interfaces. It is issued under a free open-source licence, which encourages participatory design of the software, and an end-user interface allows community-based localization of the various language interfaces, of which there are many.

    Proceedings of the ECSCW'95 Workshop on the Role of Version Control in CSCW Applications

    The workshop entitled "The Role of Version Control in Computer Supported Cooperative Work Applications" was held on September 10, 1995 in Stockholm, Sweden in conjunction with the ECSCW'95 conference. Version control, the ability to manage relationships between successive instances of artifacts, organize those instances into meaningful structures, and support navigation and other operations on those structures, is an important problem in CSCW applications. It has long been recognized as a critical issue for inherently cooperative tasks such as software engineering, technical documentation, and authoring. The primary challenge for versioning in these areas is to support opportunistic, open-ended design processes requiring the preservation of historical perspectives in the design process, the reuse of previous designs, and the exploitation of alternative designs. The primary goal of this workshop was to bring together a diverse group of individuals interested in examining the role of versioning in Computer Supported Cooperative Work. Participation was encouraged from members of the research community currently investigating the versioning process in CSCW as well as application designers and developers who are familiar with the real-world requirements for versioning in CSCW. Both groups were represented at the workshop resulting in an exchange of ideas and information that helped to familiarize developers with the most recent research results in the area, and to provide researchers with an updated view of the needs and challenges faced by application developers. In preparing for this workshop, the organizers were able to build upon the results of their previous one entitled "The Workshop on Versioning in Hypertext" held in conjunction with the ECHT'94 conference. The following section of this report contains a summary in which the workshop organizers report the major results of the workshop. 
The summary is followed by a section that contains the position papers that were accepted to the workshop. The position papers provide more detailed information describing recent research efforts of the workshop participants, as well as current challenges that are being encountered in the development of CSCW applications. A list of workshop participants is provided at the end of the report. The organizers would like to thank all of the participants for their contributions, which were, of course, vital to the success of the workshop. We would also like to thank the ECSCW'95 conference organizers for providing a forum in which this workshop was possible.

    Asynchronously Replicated Shared Workspaces for a Multi-Media Annotation Service over Internet

    This paper describes a worldwide collaboration system based on multimedia Post-its (user-generated annotations). DIANE is a service for creating multimedia annotations to any application output on the computer, as well as to existing multimedia annotations. Users collaborate by registering multimedia documents and user-generated annotations in shared workspaces. However, DIANE only allows effective participation in a shared workspace over a high-performance network (ATM, fast Ethernet), since it deals with large multimedia objects. When only slow or unreliable connections are available between a DIANE terminal and server, useful work becomes impossible. To overcome these restrictions we need to replicate DIANE servers so that users do not suffer a degradation in quality of service. We use the asynchronous replication service ODIN to replicate the shared workspaces to every interested site in a way that is transparent to users. ODIN provides cost-effective object replication by building a dynamic virtual network over the Internet. The topology of this virtual network optimizes the use of network resources while satisfying the changing requirements of the users.
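The abstract does not specify ODIN's actual replication protocol, so as a rough sketch of asynchronous workspace replication in general, the following minimal example (all names hypothetical, not ODIN's API) lets two replicas accept annotation updates independently and converge by last-writer-wins on a logical timestamp, regardless of when they synchronize:

```python
# Illustrative last-writer-wins replication of a shared annotation
# workspace. Replica, put, and sync are invented names for this sketch.
class Replica:
    def __init__(self):
        self.store = {}  # annotation id -> (logical timestamp, value)

    def put(self, key, ts, value):
        # Accept an update only if it is newer than what we already hold.
        cur = self.store.get(key)
        if cur is None or ts > cur[0]:
            self.store[key] = (ts, value)

    def sync(self, other):
        # Asynchronous exchange: push every entry both ways; the order
        # of delivery does not affect the converged state.
        for key, (ts, value) in list(self.store.items()):
            other.put(key, ts, value)
        for key, (ts, value) in list(other.store.items()):
            self.put(key, ts, value)

a, b = Replica(), Replica()
a.put("note-1", 1, "draft")
b.put("note-1", 2, "revised")   # concurrent, newer edit elsewhere
a.sync(b)
assert a.store == b.store == {"note-1": (2, "revised")}
```

A real service like ODIN would additionally manage the virtual-network topology over which these exchanges flow; the point here is only that asynchronous pairwise exchange suffices for replicas to converge.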

    Narrative and Hypertext 2011 Proceedings: a workshop at ACM Hypertext 2011, Eindhoven


    A model for structured document retrieval: empirical investigations

    Documents often display a structure, e.g., several sections, each with several subsections and so on. Taking into account the structure of a document allows the retrieval process to focus on those parts of the document that are most relevant to an information need. In previous work, we developed a model for the representation and the retrieval of structured documents. This paper reports the first experimental study of the effectiveness and applicability of the model.
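The paper's own model is not given in this abstract; as a hedged illustration of the general idea (scoring the parts of a section tree so retrieval can focus on the most relevant subsection), the toy sketch below, with invented names and an arbitrary 0.5 mixing weight, blends a node's own term-frequency evidence with its best child's score:

```python
# Toy structured-document scoring: each node carries its own text;
# a node's score mixes its own evidence with its best child's score.
# All names and the 0.5 child weight are illustrative assumptions,
# not the model evaluated in the paper.
def score_text(text, query_terms):
    words = text.lower().split()
    return sum(words.count(t) for t in query_terms) / max(len(words), 1)

def score_node(node, query_terms, child_weight=0.5):
    own = score_text(node.get("text", ""), query_terms)
    children = node.get("children", [])
    if not children:
        return own
    best_child = max(score_node(c, query_terms, child_weight) for c in children)
    return own + child_weight * best_child

doc = {
    "text": "introduction",
    "children": [
        {"text": "retrieval of structured documents"},
        {"text": "unrelated appendix material"},
    ],
}
top = score_node(doc, ["retrieval", "structured"])
# The relevant subsection lifts the whole document's score above zero,
# even though the root text matches no query term.
assert top > 0
```

The design point this sketch makes is the one in the abstract: propagating evidence through the structure lets the system return the best-matching subsection (or the document as a whole) rather than treating the document as one flat bag of words.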

    Vampiric Remediation: The Vampire as a Self-Reflexive Technique in ‘Dracula’ (1897), ‘Nosferatu’ (1922) and ‘Shadow of the Vampire’ (2000)

    This paper aims to describe the self-reflexive functions of the vampire through the lens of remediation. First, I will describe remediation as the central form of representation used in the novel Dracula (1897). Its epistolary form remediates various contemporary high-tech media that are compiled as typewritten pages: it uses a hypermedia strategy. Dracula, the creature, mirrors this technique, since he and his abilities are an amalgamation of the characteristics of contemporary media. Dracula tries to remediate (that is, to rehabilitate) itself in the shifting media-landscape of the outgoing 19th century and self-reflexively addresses this through the vampire’s connection to media. Second, Nosferatu: Eine Symphonie des Grauens (dir. Friedrich Wilhelm Murnau, 1922) deviates from this hypermedia strategy and argues for film’s immediacy. However, it also self-consciously addresses its state as an adaptation of Dracula and clearly acknowledges its medium when vampirism is involved within the film itself. Nosferatu connects vampirism with cinema and its techniques and, consequently, presents its vampire, ‘Count Orlok’, as a personification of film instead of an amalgamation of different media. Shadow of the Vampire (dir. Edmund Elias Merhige, 2000), then, is a refashioning within the medium: it is Nosferatu’s fictional making-of. Here, the borders between cinema and vampirism and between medium and reality collapse, as Shadow of the Vampire not only borrows the style and story of Nosferatu, but also incorporates the history and the myths surrounding the production of this seminal vampire movie. Consequently, it argues for film’s failure as a medium of immediacy facing the new hypermedia-landscape of the beginning 21st century.
These three iterations of the vampire and remediation demonstrate how the vampire has been functionalized as a self-reflexive technique to speak about the medium it is depicted in, be it on the brink of a changing media-landscape, at the beginning of movies as the medium of immediacy, or its existence as an established art form in the emerging digital age.

    Design Principles of Social Navigation

    8th Delos Workshop on "User Interfaces for Digital Libraries", SICS, Kista, Sweden, 21-23 October 1998 (on 21 October held in conjunction with the 4th ERCIM Workshop on "User Interfaces for All")

    Advanced Knowledge Technologies at the Midterm: Tools and Methods for the Semantic Web

    The University of Edinburgh and research sponsors are authorised to reproduce and distribute reprints and on-line copies for their purposes notwithstanding any copyright annotation hereon. The views and conclusions contained herein are the author’s and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of other parties.

In a celebrated essay on the new electronic media, Marshall McLuhan wrote in 1962:

"Our private senses are not closed systems but are endlessly translated into each other in that experience which we call consciousness. Our extended senses, tools, technologies, through the ages, have been closed systems incapable of interplay or collective awareness. Now, in the electric age, the very instantaneous nature of co-existence among our technological instruments has created a crisis quite new in human history. Our extended faculties and senses now constitute a single field of experience which demands that they become collectively conscious. Our technologies, like our private senses, now demand an interplay and ratio that makes rational co-existence possible. As long as our technologies were as slow as the wheel or the alphabet or money, the fact that they were separate, closed systems was socially and psychically supportable. This is not true now when sight and sound and movement are simultaneous and global in extent." (McLuhan 1962, p. 5, emphasis in original)

Over forty years later, the seamless interplay that McLuhan demanded between our technologies is still barely visible. McLuhan’s predictions of the spread, and increased importance, of electronic media have of course been borne out, and the worlds of business, science and knowledge storage and transfer have been revolutionised.
Yet the integration of electronic systems as open systems remains in its infancy.

Advanced Knowledge Technologies (AKT) aims to address this problem: to create a view of knowledge and its management across its lifecycle, and to research and create the services and technologies that such unification will require. Halfway through its six-year span, the results are beginning to come through, and this paper will explore some of the services, technologies and methodologies that have been developed. We hope to give a sense in this paper of the potential for the next three years, to discuss the insights and lessons learnt in the first phase of the project, and to articulate the challenges and issues that remain.

The WWW provided the original context that made the AKT approach to knowledge management (KM) possible. When AKT was initially proposed in 1999, it brought together an interdisciplinary consortium with the technological breadth and complementarity to create the conditions for a unified approach to knowledge across its lifecycle. The combination of this expertise, and the time and space afforded the consortium by the IRC structure, suggested the opportunity for a concerted effort to develop an approach to advanced knowledge technologies, based on the WWW as a basic infrastructure.

The technological context of AKT altered for the better in the short period between the development of the proposal and the beginning of the project itself, with the development of the semantic web (SW), which foresaw much more intelligent manipulation and querying of knowledge. The opportunities that the SW provided for, e.g., more intelligent retrieval put AKT at the centre of information technology innovation and knowledge management services; the AKT skill set would clearly be central for the exploitation of those opportunities.

The SW, as an extension of the WWW, provides an interesting set of constraints on the knowledge management services AKT tries to provide.
As a medium for the semantically-informed coordination of information, it has suggested a number of ways in which the objectives of AKT can be achieved, most obviously through the provision of knowledge management services delivered over the web, as opposed to the creation and provision of technologies to manage knowledge.

AKT is working on the assumption that many web services will be developed and provided for users. The KM problem in the near future will be one of deciding which services are needed and of coordinating them. Many of these services will be largely or entirely legacies of the WWW, and so the capabilities of the services will vary. As well as providing useful KM services in their own right, AKT will be aiming to exploit this opportunity by reasoning over services, brokering between them, and providing essential meta-services for SW knowledge service management.

Ontologies will be a crucial tool for the SW. The AKT consortium brings together considerable expertise on ontologies, and ontologies were always going to be a key part of the strategy. All kinds of knowledge sharing and transfer activities will be mediated by ontologies, and ontology management will be an important enabling task. Different applications will need to cope with inconsistent ontologies, or with the problems that will follow the automatic creation of ontologies (e.g. the merging of pre-existing ontologies to create a third). Ontology mapping, and the elimination of conflicts of reference, will be important tasks. All of these issues are discussed along with our proposed technologies.

Similarly, specifications of tasks will be used for the deployment of knowledge services over the SW, but in general it cannot be expected that in the medium term there will be standards for task (or service) specifications.
The brokering meta-services that are envisaged will have to deal with this heterogeneity.

The emerging picture of the SW is one of great opportunity, but it will not be a well-ordered, certain or consistent environment. It will comprise many repositories of legacy data, outdated and inconsistent stores, and requirements for common understandings across divergent formalisms. There is clearly a role for standards to play in bringing much of this context together, and AKT is playing a significant role in these efforts. But standards take time to emerge, they take political power to enforce, and they have been known to stifle innovation (in the short term). AKT is keen to understand the balance between principled inference and statistical processing of web content. Logical inference on the Web is tough: complex queries using traditional AI inference methods bring most distributed computer systems to their knees. Do we set up semantically well-behaved areas of the Web? Is any part of the Web in which semantic hygiene prevails interesting enough to reason in? These and many other questions need to be addressed if we are to provide effective knowledge technologies for our content on the web.

    An Investigation into World Wide Web Publishing with the Hypertext Markup Language

    The purpose of this thesis project was to test and demonstrate the World Wide Web as a publishing vehicle by creating a Web presence for the School of Printing Management and Sciences. In order to reach this goal, a full understanding of the Hypertext Markup Language first had to be gained. Once this was accomplished, issues regarding the integration of mixed-media elements within an HTML document were investigated. Once a prototype of the HTML document was complete, the mixed-media elements were tested and evaluated for proper integration and contextual cohesiveness. Many issues regarding the implementation of mixed-media elements, such as file size and file format, were addressed during testing. One of the additional goals of this project was a comprehensive description of the methodology for creating and maintaining a World Wide Web publishing presence. This addresses: navigational software, structuring HTML documents, hypertext linking, HTML style issues and limitations, effective integration of mixed-media elements, inline and external image issues, testing documents, advertising documents, strategies for determining proper file sizes and formats of mixed-media elements, integrating supplemental programs, World Wide Web server issues, installing HTML and mixed-media files onto a World Wide Web server, etc. The Web site located at http://www.rit.edu/~spms served as the vehicle for the investigation. Results of the study revealed the issues of providing data that serves users across a wide range of computer systems, with different bandwidth restrictions, utilizing a myriad of computer software. Specific standards help alleviate much of the guesswork; however, publishing on the Internet remains as challenging as it is rewarding.
The Web's format and the opportunity to reach millions of potential customers are creating new types of publishing ventures in true gold-rush fashion. The Web is being touted as the fourth medium, and some suggest it will have as great an impact on society as print, radio and television. The growth of the Web is explosive and will assuredly continue. Upon completion of this study, the author remains skeptical whether the World Wide Web is the medium of the future. It has, however, created a trend which will forever reshape the publishing world and the way information seekers receive their data. Publishing will change from a commodity-based market, where prices are based upon cost, to a service market, where prices are based upon the value of the information. Each reader requiring information tailored to their specific needs will pay for what they select: no more paying for an entire magazine or newspaper and reading only one article. The future of information dissemination is electronic, interactive and selective. Whether the delivery mechanism will be the World Wide Web remains to be seen.

    Literature as a technique of recollection

    There is a caricature of Marcel Proust in which the despairing writer is consoled by a friend saying, 'But, but, mon cher Marcel, now just try to remember where you lost the time.' Literature in general, not only A La Recherche du Temps Perdu, deals with a different form of memory than that of mnemonics, in which the hints of places lead to a retrieval of what has been stored there before. Nevertheless, it is difficult to pinpoint the criteria that make this difference. How does literature transcend the technologically limited sense of memory in terms of a storage and retrieval system? ...
