Critique of Architectures for Long-Term Digital Preservation
Evolving technology and fading human memory threaten the long-term intelligibility of many kinds of documents. Furthermore, some records are susceptible to improper alterations that make them untrustworthy. Trusted Digital Repositories (TDRs) and Trustworthy Digital Objects (TDOs) seem to be the only broadly applicable digital preservation methodologies proposed. We argue that the TDR approach has shortfalls as a method for long-term digital preservation of sensitive information. Comparison of TDR and TDO methodologies suggests differentiating near-term preservation measures from what is needed for the long term.
TDO methodology addresses these needs, providing for making digital documents durably intelligible. It uses EDP standards for a few file formats and XML structures for text documents. For other information formats, intelligibility is assured by using a virtual computer. To protect sensitive information (content whose inappropriate alteration might mislead its readers), the integrity and authenticity of each TDO is made testable by embedded public-key cryptographic message digests and signatures. Key authenticity is protected recursively in a social hierarchy. The proper focus for long-term preservation technology is signed packages that each combine a record collection with its metadata and that also bind context: Trustworthy Digital Objects.
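The integrity mechanism described above can be illustrated with a minimal sketch. This shows only the message-digest half (a real TDO would additionally sign the embedded digest with the curator's private key, and the package layout, field names, and example content here are assumptions for illustration, not the TDO format itself):

```python
import hashlib
import json

def make_package(content: bytes, metadata: dict) -> dict:
    """Bundle a record with its metadata and an embedded SHA-256
    digest over both. A real TDO would also embed a public-key
    signature over this digest; that step is omitted here."""
    digest = hashlib.sha256(
        content + json.dumps(metadata, sort_keys=True).encode()
    ).hexdigest()
    return {"content": content, "metadata": metadata, "digest": digest}

def verify_package(pkg: dict) -> bool:
    """Recompute the digest and compare with the embedded one,
    so any alteration of content or metadata becomes testable."""
    expected = hashlib.sha256(
        pkg["content"] + json.dumps(pkg["metadata"], sort_keys=True).encode()
    ).hexdigest()
    return expected == pkg["digest"]

pkg = make_package(b"land deed, 1867", {"creator": "county clerk"})
assert verify_package(pkg)
pkg["content"] = b"land deed, 1967"   # an improper alteration
assert not verify_package(pkg)        # is detected
```

The same check works regardless of file format, which is why the digest travels inside the package rather than in a separate registry.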
Digital library research : current developments and trends
This column gives an overview of current trends in digital library research under the following headings: digital library architecture, systems, tools and technologies; digital content and collections; metadata; interoperability; standards; knowledge organisation systems; users and usability; legal, organisational, economic, and social issues in digital libraries.
The Convergence of Digital-Libraries and the Peer-Review Process
Pre-print repositories have seen a significant increase in use over the past fifteen years across multiple research domains. Researchers are beginning to develop applications capable of using these repositories to assist the scientific community above and beyond the pure dissemination of information. The contribution set forth by this paper emphasizes a deconstructed publication model in which the peer-review process is mediated by an OAI-PMH peer-review service. This peer-review service uses a social-network algorithm to determine potential reviewers for a submitted manuscript and to weight the relative influence of each participating reviewer's evaluations. This paper also suggests a set of peer-review-specific metadata tags that can accompany a pre-print's existing metadata record. The combination of these contributions provides a unique repository-centric peer-review model that fits within the widely deployed OAI-PMH framework.

Comment: Journal of Information Science [in press]
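One way a social-network algorithm could both select reviewers and weight their evaluations is by distance in a co-authorship graph. The abstract does not specify the algorithm, so the following is a hypothetical sketch (graph, names, and the inverse-distance weighting are all assumptions): direct co-authors are excluded as conflicts of interest, and more distant researchers receive lower weight:

```python
from collections import deque

def bfs_distance(graph, src, dst):
    """Shortest-path length between two researchers in an
    undirected co-authorship graph (None if disconnected)."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, d = queue.popleft()
        if node == dst:
            return d
        for nbr in graph.get(node, ()):
            if nbr not in seen:
                seen.add(nbr)
                queue.append((nbr, d + 1))
    return None

def reviewer_weights(graph, author, candidates):
    """Exclude direct co-authors (distance 1) as conflicts of
    interest; weight remaining candidates by inverse distance."""
    weights = {}
    for r in candidates:
        d = bfs_distance(graph, author, r)
        if d is not None and d >= 2:
            weights[r] = 1.0 / d
    return weights

coauthors = {
    "alice": {"bob"},
    "bob": {"alice", "carol"},
    "carol": {"bob", "dave"},
    "dave": {"carol"},
}
print(reviewer_weights(coauthors, "alice", ["bob", "carol", "dave"]))
# → {'carol': 0.5, 'dave': 0.3333333333333333}; bob is excluded
```

The resulting weights could then populate the peer-review metadata tags attached to the pre-print's record.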
Architectural implications for context adaptive smart spaces
Buildings and spaces are complex entities containing rich social structures and interactions. A smart space is a composite of the users that inhabit it, the IT infrastructure that supports it, and the sensors and appliances that service it. Rather than separating the IT from the buildings and from the appliances that inhabit them and treating them as separate systems, pervasive computing combines them and allows them to interact. We outline a reactive context architecture that supports this vision of integrated smart spaces and explore some implications for building large-scale pervasive systems.
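The "reactive" part of such an architecture can be sketched as a publish/subscribe context broker: sensors publish context updates, and appliances react through callbacks. This is a minimal illustration only (the class, key names, and lighting rule are assumptions, not the paper's architecture), omitting the persistence, inference, and access control a real smart-space system would need:

```python
from collections import defaultdict

class ContextBroker:
    """Minimal reactive context broker: sensors publish keyed
    context values; subscribed appliances react via callbacks."""
    def __init__(self):
        self._subs = defaultdict(list)
        self.context = {}

    def subscribe(self, key, callback):
        self._subs[key].append(callback)

    def publish(self, key, value):
        self.context[key] = value
        for cb in self._subs[key]:
            cb(value)

broker = ContextBroker()
events = []
# A lighting appliance reacting to an occupancy sensor:
broker.subscribe("room.occupancy",
                 lambda n: events.append("lights on" if n > 0 else "lights off"))
broker.publish("room.occupancy", 3)
broker.publish("room.occupancy", 0)
print(events)  # → ['lights on', 'lights off']
```

Keeping sensing, context state, and reaction in one loop is what lets the IT, the building, and its appliances interact rather than run as separate systems.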
A component-based product line architecture for workflow management systems
This paper presents a component-based product line for workflow management systems. The process followed to design the product line was based on the Catalysis method. Extensions were made to represent variability across the process. The domain of workflow management systems has been shown to be appropriate for the product line approach, as there is a standard architecture, together with models established by a regulatory board, the Workflow Management Coalition. In addition, there is a demand for similar workflow management systems that differ in some features. The product line architecture was evaluated with Rapide simulation tools. The evaluation was based on selected scenarios, thus avoiding implementation issues. The strategy that has been used to populate the architecture and experiment with the product line is shown. In particular, the design of the workflow execution manager component is described.
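The core idea of a product line with explicit variation points can be sketched briefly. This is a generic illustration, not the paper's Catalysis-based design: an execution-manager core is shared across products, and each product plugs in a different scheduling policy at a declared variation point (all class and task names below are hypothetical):

```python
from abc import ABC, abstractmethod

class SchedulingPolicy(ABC):
    """Variation point: how the execution manager orders ready tasks."""
    @abstractmethod
    def order(self, tasks): ...

class FifoPolicy(SchedulingPolicy):
    """Variant for products that execute tasks in arrival order."""
    def order(self, tasks):
        return list(tasks)

class PriorityPolicy(SchedulingPolicy):
    """Variant for products that execute high-priority tasks first."""
    def order(self, tasks):
        return sorted(tasks, key=lambda t: t[1], reverse=True)

class ExecutionManager:
    """Common core shared by every product in the line."""
    def __init__(self, policy: SchedulingPolicy):
        self.policy = policy

    def run(self, tasks):
        return [name for name, _prio in self.policy.order(tasks)]

tasks = [("approve", 1), ("audit", 5), ("notify", 3)]
print(ExecutionManager(FifoPolicy()).run(tasks))      # → ['approve', 'audit', 'notify']
print(ExecutionManager(PriorityPolicy()).run(tasks))  # → ['audit', 'notify', 'approve']
```

Deriving a product then amounts to selecting variants for each variation point rather than reimplementing the manager.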
Illinois Digital Scholarship: Preserving and Accessing the Digital Past, Present, and Future
Since the University's establishment in 1867, its scholarly output has been issued primarily in print, and the University Library and Archives have been readily able to collect, preserve, and provide access to that output. Today, technological, economic, political and social forces are buffeting all means of scholarly communication. Scholars, academic institutions and publishers are engaged in debate about the impact of digital scholarship and open access publishing on the promotion and tenure process. The upsurge in digital scholarship affects many aspects of the academic enterprise, including how we record, evaluate, preserve, organize and disseminate scholarly work. The result has left the Library with no ready means by which to archive digitally produced publications, reports, presentations, and learning objects, much of which cannot be adequately represented in print form. In this incredibly fluid environment of digital scholarship, the critical question of how we will collect, preserve, and manage access to this important part of the University scholarly record demands a rational and forward-looking plan - one that includes perspectives from diverse scholarly disciplines, incorporates significant research breakthroughs in information science and computer science, and makes effective projections for future integration within the Library and computing services as a part of the campus infrastructure.

Prepared jointly by the University of Illinois Library and CITES at the University of Illinois at Urbana-Champaign
Benefits of tightly coupled architectures for the integration of GNSS receiver and VANET transceiver
Vehicular ad hoc networks (VANETs) are an emerging type of network that will enable a broad range of applications such as public safety, traffic management, traveler information support and entertainment. Whether wireless access is asynchronous or synchronous (respectively, as in the upcoming IEEE 802.11p standard or in some alternative emerging solutions), synchronization among nodes is required. Moreover, information on position is needed to let vehicular services work and to correctly forward messages. As a result, timing and positioning are strong prerequisites of VANETs. The diffusion of enhanced GNSS navigators also paves the way to the integration between GNSS receivers and VANET transceivers. This position paper presents an analysis of the potential benefits of a tight coupling between the two: the discussion is meant to show to what extent Intelligent Transportation System (ITS) services could benefit from the proposed architecture.
Secure Architectures for Mobile Applications
The paper presents security issues and architectures for mobile applications and GSM infrastructure. The article also introduces the idea of a new secure architecture for an inter-sector electronic wallet used in payments: STP4EW (Secure Transmission Protocol for Electronic Wallet).
Keywords: secure architecture, m-application, smart-cards, 3G Mobile
Systematizing Decentralization and Privacy: Lessons from 15 Years of Research and Deployments
Decentralized systems are a subset of distributed systems where multiple authorities control different components and no authority is fully trusted by all. This implies that any component in a decentralized system is potentially adversarial. We review fifteen years of research on decentralization and privacy, and provide an overview of key systems, as well as key insights for designers of future systems. We show that decentralized designs can enhance privacy, integrity, and availability but also require careful trade-offs in terms of system complexity, properties provided, and degree of decentralization. These trade-offs need to be understood and navigated by designers. We argue that a combination of insights from cryptography, distributed systems, and mechanism design, aligned with the development of adequate incentives, is necessary to build scalable and successful privacy-preserving decentralized systems.
Interface groups and financial transfer architectures
Analytic execution architectures have been proposed by the same authors as a means to conceptualize the cooperation between heterogeneous collectives of components such as programs, threads, states and services. Interface groups have been proposed as a means to formalize interface information concerning analytic execution architectures. These concepts are adapted to organization architectures with a focus on financial transfers. Interface groups (and monoids) now provide a technique to combine interface elements into interfaces, with the flexibility to distinguish between directions of flow dependent on entity naming.

The main principle exploiting interface groups is that when composing a closed system from a collection of interacting components, the sum of their interfaces must vanish in the interface group modulo reflection. This certainly matters for financial transfer interfaces.

As an example, we specify an interface group and, within it, some specific interfaces concerning the financial transfer architecture for a part of our local academic organization. Financial transfer interface groups arise as a special case of more general service architecture interfaces.

Comment: 22 pages
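The vanishing-sum principle can be made concrete with a small sketch. Interfaces are modeled as elements of a free abelian group over transfer channels, where reversing a channel's direction (reflection) negates its coefficient; a closed system is one whose interfaces compose to zero. This is an illustrative reading of the principle, not the paper's formalism, and the entity and channel names are invented:

```python
from collections import Counter

def interface(*elements):
    """An interface as a group element: each element is
    (source, destination, channel, multiplicity). Channels are
    stored in a canonical orientation, so the reflected direction
    carries a negated coefficient."""
    total = Counter()
    for src, dst, chan, mult in elements:
        if src <= dst:                      # canonical orientation
            total[(src, dst, chan)] += mult
        else:                               # reflection negates
            total[(dst, src, chan)] -= mult
    return Counter({k: v for k, v in total.items() if v})

def compose(*interfaces):
    """Group sum of component interfaces; a closed system must
    compose to the zero element (an empty Counter)."""
    total = Counter()
    for itf in interfaces:
        for k, v in itf.items():
            total[k] += v
    return Counter({k: v for k, v in total.items() if v})

# A department paying a service unit, and the service unit's
# reflected view of the same financial transfer:
dept = interface(("dept", "services", "budget", 1))
services = interface(("services", "dept", "budget", 1))
assert not compose(dept, services)  # sum vanishes: system is closed
```

A nonzero composite pinpoints exactly which transfer channel is unbalanced, which is what makes the check useful for financial transfer architectures.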