Specification of an extensible and portable file format for electronic structure and crystallographic data
To allow different software applications, which are in constant evolution, to
interact and exchange data, flexible file formats are needed. A file format
specification for different types of content has been elaborated to allow
communication of data for the software developed within the European Network of
Excellence "NANOQUANTA", focusing on first-principles calculations of materials
and nanosystems. It might be used by other software as well, and is described
here in detail. The format relies on the NetCDF binary input/output library,
already used in many different scientific communities, which provides
flexibility as well as portability across languages and platforms. Thanks to
NetCDF, the content can be accessed by keywords, ensuring that the file format
is extensible and backward compatible.
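The keyword-based access that NetCDF provides can be illustrated with a short Python sketch using SciPy's classic-NetCDF reader. The variable and dimension names below (`primitive_vectors` etc.) are hypothetical examples, not names taken from the specification described above.

```python
# Sketch of keyword-based NetCDF access (hypothetical variable names).
import numpy as np
from scipy.io import netcdf_file

# Write a small file: a 3x3 array of lattice vectors under a named key.
f = netcdf_file("example.nc", "w")
f.createDimension("number_of_vectors", 3)
f.createDimension("cartesian_dim", 3)
v = f.createVariable("primitive_vectors", "d",
                     ("number_of_vectors", "cartesian_dim"))
v[:] = np.eye(3)
v.units = b"atomic units"          # attributes travel with the variable
f.close()

# Read it back by keyword; a reader simply ignores variables it does not
# know, which is what makes such a format extensible and backward compatible.
f = netcdf_file("example.nc", "r", mmap=False)
vecs = f.variables["primitive_vectors"][:].copy()
units = f.variables["primitive_vectors"].units
f.close()

print(vecs.shape)   # (3, 3)
```

Because lookup is by name rather than by byte offset, new variables can be added to the format without breaking existing readers.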
A study of a Java based framework for telecommunications services : a dissertation submitted in fulfilment of the requirements for the degree of Master of Science in Computer Science, Massey University, New Zealand
In this report, we study some of the general issues surrounding the area of telecommunications service development, including the history of telecommunications services, current service creation techniques, and the network used by services. We also discuss the lack of service portability and the reasons for it. The JAIN framework – a set of Java APIs for integrated networks – is introduced as an approach that elegantly addresses this problem. We present a survey of recent work in telecommunications services that relates to JAIN. This includes a discussion of the feature interaction problem, an overview of the Telecommunications Information Networking Architecture, in particular its relationship with JAIN, and the rapidly advancing field of Internet Telephony. To demonstrate the effectiveness of the JAIN approach, we present designs for two advanced services that use the JAIN framework: Internet Call Waiting and Click-to-Dial. Finally, areas for future research are introduced.
Libpsht - algorithms for efficient spherical harmonic transforms
Libpsht (or "library for Performant Spherical Harmonic Transforms") is a
collection of algorithms for efficient conversion between spatial-domain and
spectral-domain representations of data defined on the sphere. The package
supports transforms of scalars as well as spin-1 and spin-2 quantities, and can
be used for a wide range of pixelisations (including HEALPix, GLESP and ECP).
It will take advantage of hardware features like multiple processor cores and
floating-point vector operations, if available. Even without this additional
acceleration, the employed algorithms are among the most efficient (in terms of
CPU time as well as memory consumption) currently being used in the
astronomical community.
The library is written in strictly standard-conforming C90, ensuring
portability to many different hard- and software platforms, and allowing
straightforward integration with codes written in various programming languages
like C, C++, Fortran, Python etc.
Libpsht is distributed under the terms of the GNU General Public License
(GPL) version 2 and can be downloaded from
http://sourceforge.net/projects/libpsht.
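The spatial-to-spectral conversion that libpsht performs efficiently can be illustrated with a brute-force toy in Python: recovering the monopole coefficient of a constant map by direct quadrature on an equiangular (ECP-like) grid. This is a didactic sketch only, not libpsht's fast algorithm or its API.

```python
# Toy spatial -> spectral conversion on the sphere: recover the monopole
# coefficient a_00 of a constant map by direct quadrature.
import numpy as np

nth, nph = 64, 128                      # equiangular (ECP-like) grid
theta = (np.arange(nth) + 0.5) * np.pi / nth
phi = np.arange(nph) * 2 * np.pi / nph
th, ph = np.meshgrid(theta, phi, indexing="ij")

f_map = np.full_like(th, 3.0)           # constant map f(theta, phi) = 3

Y00 = 1.0 / np.sqrt(4 * np.pi)          # spherical harmonic Y_0^0
dA = np.sin(th) * (np.pi / nth) * (2 * np.pi / nph)   # area element

# a_00 = integral of f * conj(Y_0^0) over the sphere
a00 = np.sum(f_map * Y00 * dA)

print(a00)                              # ~ 3 * sqrt(4*pi)
```

A direct sum like this scales poorly with band limit; fast libraries such as libpsht exploit the separability of the transform in the azimuthal angle (via FFTs) and efficient Legendre recurrences to make large transforms tractable.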
Data standardization
With data rapidly becoming the lifeblood of the global economy, the ability to improve its use significantly affects both social and private welfare. Data standardization is key to facilitating and improving the use of data when data portability and interoperability are needed. Absent data standardization, a “Tower of Babel” of different databases may be created, limiting synergetic knowledge production. Based on interviews with data scientists, this Article identifies three main technological obstacles to data portability and interoperability: metadata uncertainties, data transfer obstacles, and missing data. It then explains how data standardization can remove at least some of these obstacles and lead to smoother data flows and better machine learning. The Article then identifies and analyzes additional effects of data standardization. As shown, data standardization has the potential to support a competitive and distributed data collection ecosystem and to lead to easier policing in cases where rights are infringed or unjustified harms are created by data-fed algorithms. At the same time, increasing the scale and scope of data analysis can create negative externalities in the form of better profiling, increased harms to privacy, and cybersecurity harms. Standardization also has implications for investment and innovation, especially if lock-in to an inefficient standard occurs. The Article then explores whether market-led standardization initiatives can be relied upon to increase welfare, and the role government-facilitated data standardization should play, if any.
VXA: A Virtual Architecture for Durable Compressed Archives
Data compression algorithms change frequently, and obsolete decoders do not
always run on new hardware and operating systems, threatening the long-term
usability of content archived using those algorithms. Re-encoding content into
new formats is cumbersome, and highly undesirable when lossy compression is
involved. Processor architectures, in contrast, have remained comparatively
stable over recent decades. VXA, an archival storage system designed around
this observation, archives executable decoders along with the encoded content
it stores. VXA decoders run in a specialized virtual machine that implements an
OS-independent execution environment based on the standard x86 architecture.
The VXA virtual machine strictly limits access to host system services, making
decoders safe to run even if an archive contains malicious code. VXA's adoption
of a "native" processor architecture instead of type-safe language technology
allows reuse of existing "hand-optimized" decoders in C and assembly language,
and permits decoders access to performance-enhancing architecture features such
as vector processing instructions. The performance cost of VXA's virtualization
is typically less than 15% compared with the same decoders running natively.
The storage cost of archived decoders, typically 30-130KB each, can be
amortized across many archived files sharing the same compression method.
An ontology framework for developing platform-independent knowledge-based engineering systems in the aerospace industry
This paper presents the development of a novel knowledge-based engineering (KBE) framework for implementing platform-independent, knowledge-enabled product design systems within the aerospace industry. The aim of the KBE framework is to strengthen the structure, reuse and portability of knowledge consumed within KBE systems, in view of supporting the cost-effective and long-term preservation of knowledge within such systems. The proposed KBE framework uses an ontology-based approach for semantic knowledge management and adopts a model-driven architecture style from the software engineering discipline. Its main phases are: (1) capture of the knowledge required for the KBE system; (2) construction of the ontology model of the KBE system; (3) platform-independent model (PIM) technology selection and implementation; and (4) integration of the PIM KBE knowledge with a computer-aided design system. A rigorous methodology is employed, comprising five qualitative phases: requirement analysis for the KBE framework, identification of software and ontological engineering elements, integration of both elements, a proof-of-concept prototype demonstrator, and finally expert validation. A case study investigating four primitive three-dimensional geometry shapes is used to quantify the applicability of the KBE framework in the aerospace industry. Additionally, experts within the aerospace and software engineering sectors validated the strengths, benefits and limitations of the KBE framework. The major benefits of the developed approach are the reduction of man-hours required for developing KBE systems within the aerospace industry and the maintainability and abstraction of the knowledge required for developing such systems. This approach strengthens knowledge reuse and eliminates platform-specific approaches to developing KBE systems, ensuring the preservation of KBE knowledge for the long term.
A Brief History of BioPerl
Large-scale open-source projects face a litany of pitfalls and difficulties. Problems of contribution quality, credit for contributions, project coordination, funding, and mission-creep are ever-present. Of these, long-term funding and project coordination can interact to form a particularly difficult problem for open-source projects in an academic environment.
BioPerl was chosen as an example of a successful academic open-source project. Several of the roadblocks and hurdles encountered and overcome in the development of BioPerl are examined through the telling of the history of the project. Along the way, key points of open-source law are explained, such as license choice and copyright.
The current status of the BioPerl project is then analyzed, and four different strategies typically employed by traditional open-source projects at this stage are examined as future directions. Strategies such as soliciting donations, securing grants, providing dual licenses to enhance commercial interest, and the paid provision of support have all been employed in various traditional open-source projects with success, but each has drawbacks when applied to the academy. Finally, the construction of a successful long-term strategy for BioPerl, and for other academic open-source projects, is proposed so that such projects can navigate these difficulties.