Modular Web Queries – From Rules to Stores
Even with all the progress in Semantic technology, accessing Web
data remains a challenging issue with new Web query languages and approaches
appearing regularly. Yet most of these languages, including W3C approaches
such as XQuery and SPARQL, do little to cope with the explosion of the data
size and schemata diversity and richness on the Web. In this paper we propose
a straightforward step toward the improvement of this situation that is simple to
realize and yet effective: advanced module systems that make it possible to partition
(a) the evaluation and (b) the conceptual design of complex Web queries.
They provide the query programmer with a powerful but easy-to-use high-level
abstraction for packaging, encapsulating, and reusing conceptually related parts
(in our case, rules) of a Web query. The proposed module system combines ease
of use thanks to a simple core concept, the partitioning of rules and their consequences
in flexible “stores”, with ease of deployment thanks to a reduction
semantics. We focus on extending the rule-based Semantic Web query language
Xcerpt with such a module system, though the same approach can be applied to
other (rule-based) languages as well.
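The core idea above can be illustrated with a small, hypothetical Python sketch (not Xcerpt syntax; the class names, the rule representation, and the prefixing scheme are invented for illustration): rules are grouped into named stores, and a reduction flattens the modular program into one rule set that a standard single-scope engine could evaluate.

```python
# Hypothetical sketch of partitioning rules into "stores" with a
# reduction semantics. Not Xcerpt: names and encoding are invented.

class Module:
    """A named store holding conceptually related rules."""
    def __init__(self, name):
        self.name = name
        self.rules = []  # list of (head, body) pairs; predicates as strings

    def add_rule(self, head, body):
        self.rules.append((head, body))

def reduce_modules(modules):
    """Reduction semantics: qualify every predicate with its store's name,
    yielding a single flat rule set for a conventional rule engine."""
    flat = []
    for m in modules:
        qualify = lambda p, name=m.name: f"{name}.{p}"
        for head, body in m.rules:
            flat.append((qualify(head), [qualify(p) for p in body]))
    return flat

# A tiny module encapsulating price-related rules:
prices = Module("prices")
prices.add_rule("cheap(X)", ["book(X)", "price_below_10(X)"])
flat = reduce_modules([prices])
```

After reduction, every rule and its consequences live in the namespace of their store, which is one simple way the "partitioning of rules and their consequences" described above can be realized on top of an unmodified evaluator.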
Optimizing Frameworks Performance Using C++ Modules Aware ROOT
ROOT is a data analysis framework broadly used in and outside of High Energy
Physics (HEP). Since HEP software frameworks always strive for performance
improvements, ROOT was extended with experimental support of runtime C++
Modules. C++ Modules are designed to improve the performance of C++ code
parsing. C++ Modules offer a promising way to improve ROOT's runtime
performance by saving the C++ header parsing time that currently occurs during ROOT
runtime. This paper presents the results and challenges of integrating C++
Modules into ROOT.
Comment: 8 pages, 3 figures, 6 listings, CHEP 2018 - 23rd International
Conference on Computing in High Energy and Nuclear Physics
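The performance idea behind C++ Modules (parse once, reuse a binary cache instead of re-parsing text) can be sketched by analogy in Python, whose bytecode caching works the same way; this is an analogy only, not ROOT or Clang code, and the file names are invented:

```python
# Analogy for C++ Modules' precompiled-module caching: parse and compile
# source once, serialize the result, and later load the binary cache
# instead of re-parsing the text (as ROOT must do with textual headers).
import ast
import marshal
import os
import tempfile

source = "def radius(x, y):\n    return (x * x + y * y) ** 0.5\n"

# First use: parse + compile the source (the expensive step).
code = compile(ast.parse(source), "<demo>", "exec")

# Cache the compiled form, much as a .pcm file caches a parsed C++ header.
cache = os.path.join(tempfile.mkdtemp(), "demo.cache")
with open(cache, "wb") as f:
    marshal.dump(code, f)

# Later use: load the binary cache; no text parsing happens here.
with open(cache, "rb") as f:
    cached = marshal.load(f)
ns = {}
exec(cached, ns)
print(ns["radius"](3, 4))  # prints 5.0
```

The saving ROOT targets is exactly the skipped parsing step: with modules enabled, header contents are deserialized from a precompiled cache rather than re-parsed at runtime.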
Modules program structures and the structuring of operating systems
In this paper some views are presented on the way in which complex systems, such as Operating Systems and the programs to be interfaced with them, can be constructed, and how such systems may become heavily library oriented. Although such systems have a dynamic nature, all interfacing within and among modules can be checked statically. It will be shown that the concepts presented are equally valid for single-user systems, multi-programming systems and even distributed systems. The ideas have been spurred by the implementation of a modular version of Pascal and a supporting Operating System, currently nearing completion at Twente University of Technology, The Netherlands.
Constraining application behaviour by generating languages
Writing a platform for reactive applications which enforces operational
constraints is difficult, and has been approached in various ways. In this
experience report, we detail an approach using an embedded DSL which can be
used to specify the structure and permissions of a program in a given
application domain. Once the developer has specified which components an
application will consist of, and which permissions each one needs, the
specification itself evaluates to a new, tailored, language. The final
implementation of the application is then written in this specialised
environment where precisely the API calls associated with the permissions that
have been granted are made available.
Our prototype platform targets the domain of mobile computing, and is
implemented using Racket. It demonstrates resource access control (e.g.,
camera, address book, etc.) and tries to prevent leaking of private data.
Racket is shown to be an extremely effective platform for designing new
programming languages and their run-time libraries. We demonstrate that this
approach allows reuse of an inter-component communication layer, is convenient
for the application developer because it provides high-level building blocks to
structure the application, and provides increased control to the platform
owner, preventing certain classes of errors by the developer.
Comment: 8 pages, 8th European Lisp Symposium
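The mechanism described above can be sketched in Python (the real system is an embedded DSL in Racket; the permission names, API entries, and function names here are invented for illustration): a specification of components and their permissions evaluates to a tailored environment that exposes only the granted API calls.

```python
# Hypothetical sketch: a component/permission specification evaluates to
# per-component namespaces containing exactly the granted API calls.
# Permission names and API entries are invented, not from the paper.

# Full platform API, keyed by the permission that guards each call.
PLATFORM_API = {
    "camera":       {"take_photo": lambda: "photo"},
    "address_book": {"list_contacts": lambda: ["alice", "bob"]},
}

def make_environment(spec):
    """Turn a {component: [permissions]} spec into per-component
    namespaces; ungranted calls simply do not exist in a namespace."""
    envs = {}
    for component, permissions in spec.items():
        api = {}
        for perm in permissions:
            api.update(PLATFORM_API[perm])
        envs[component] = api
    return envs

envs = make_environment({"ui": ["camera"], "sync": []})
# The ui component may take photos but cannot read contacts:
assert "take_photo" in envs["ui"]
assert "list_contacts" not in envs["ui"]
```

Making forbidden calls unavailable by construction, rather than checking them at call time, is the design choice that lets the platform rule out whole classes of developer errors statically.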
Knowledge based cloud FE simulation of sheet metal forming processes
The use of Finite Element (FE) simulation software to adequately predict the outcome of sheet metal forming processes is crucial to enhancing the efficiency and lowering the development time of such processes, whilst reducing costs involved in trial-and-error prototyping. Recent focus on the substitution of steel components with aluminum alloy alternatives in the automotive and aerospace sectors has increased the need to simulate the forming behavior of such alloys for ever more complex component geometries. However, these alloys, and in particular their high strength variants, exhibit limited formability at room temperature, and high temperature manufacturing technologies have been developed to form them. Consequently, advanced constitutive models are required to reflect the associated temperature and strain rate effects. Simulating such behavior is computationally very expensive using conventional FE simulation techniques. This paper presents a novel Knowledge Based Cloud FE (KBC-FE) simulation technique that combines advanced material and friction models with conventional FE simulations in an efficient manner, thus enhancing the capability of commercial simulation software packages. The application of these methods is demonstrated through two example case studies, namely: the prediction of a material's forming limit under hot stamping conditions, and the tool life prediction under multi-cycle loading conditions.
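The KBC-FE division of labour described above can be sketched as follows; this is a hedged illustration only, with the module name, field names, and the limit formula all invented (the paper's actual material models are far more sophisticated): the FE solver stays conventional, and an advanced criterion is evaluated in a separate module on exported field data.

```python
# Illustration of the cloud-module idea: an advanced criterion runs
# outside the FE solver on exported per-element field data.
# Field names and the limit formula below are invented, not from the paper.

def forming_limit_module(elements):
    """Hypothetical cloud-side module: flag elements whose major strain
    exceeds a (made-up) temperature-dependent forming limit."""
    failed = []
    for eid, e in elements.items():
        limit = 0.30 + 0.0002 * (e["temperature"] - 20.0)  # illustrative only
        if e["major_strain"] > limit:
            failed.append(eid)
    return failed

# Field data as a conventional FE package might export it per increment:
fields = {
    1: {"major_strain": 0.45, "temperature": 450.0},
    2: {"major_strain": 0.10, "temperature": 450.0},
}
print(forming_limit_module(fields))  # prints [1]
```

Keeping the advanced model outside the solver is what lets KBC-FE add capability to closed commercial FE packages without modifying them, at the cost of an export/import step per increment.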
Software process modelling as relationships between tasks
Systematic formulation of software process models is currently a challenging problem in software engineering. We present an approach to define models covering the phases of specification, design, implementation and testing of software systems in the component programming framework, taking into account non-functional aspects of software (efficiency, etc.), automatic reusability of implementations in systems and also prototyping techniques involving both specifications and implementations. Our proposal relies on the identification of a catalogue of tasks that appear during these phases which satisfy some relationships concerning their order of execution. A software process model can be defined as the addition of more relationships over these tasks using a simple, modular process language. We have also developed a formal definition of correctness of a software development with respect to a software process model, based on the formulation of models as graphs.
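The task-relationship idea above admits a minimal sketch (task names and the trace encoding are invented; the paper's process language is richer): tasks carry ordering relationships, and a recorded development trace is checked against them, which is one simple reading of the graph-based correctness definition.

```python
# Minimal sketch: a process model as ordering relationships over tasks
# from a catalogue, plus a conformance check for a development trace.
# Task names are invented for illustration.

RELATIONSHIPS = {
    # (before, after): task pairs that must occur in this order
    ("specify", "design"),
    ("design", "implement"),
    ("implement", "test"),
}

def conforms(trace, relationships=RELATIONSHIPS):
    """Check that a development trace respects every ordering
    relationship whose tasks both appear in the trace."""
    position = {task: i for i, task in enumerate(trace)}
    return all(
        position[a] < position[b]
        for a, b in relationships
        if a in position and b in position
    )

assert conforms(["specify", "design", "implement", "test"])
assert not conforms(["design", "specify", "implement", "test"])
```

Defining a concrete process model as "the addition of more relationships" then amounts to passing a larger relationship set to the same check.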
Interoperability and Standards: The Way for Innovative Design in Networked Working Environments
Organised by: Cranfield University
In today's networked economy, strategic business partnerships and outsourcing have become the dominant
paradigm where companies focus on core competencies and skills, as creative design, manufacturing, or
selling. However, achieving seamless interoperability is an ongoing challenge these networks are facing,
due to their distributed and heterogeneous nature. Part of the solution relies on adoption of standards for
design and product data representation, but for sectors predominantly characterized by SMEs, such as the
furniture sector, implementations need to be tailored to reduce costs. This paper recommends a set of best
practices for the fast adoption of the ISO funStep standard modules and presents a framework that enables
the usage of visualization data as a way to reduce costs in manufacturing and electronic catalogue design.
Mori Seiki – The Machine Tool Company
Assembling and enriching digital library collections
People who create digital libraries need to gather together the raw material, add metadata as necessary, and design and build new collections. This paper sets out the requirements for these tasks and describes a new tool that supports them interactively, making it easy for users to create their own collections from electronic files of all types. The process involves selecting documents for inclusion, coming up with a suitable metadata set, assigning metadata to each document or group of documents, designing the form of the collection in terms of document formats, searchable indexes, and browsing facilities, building the necessary indexes and data structures, and putting the collection in place for others to use. Moreover, different situations require different workflows, and the system must be flexible enough to cope with these demands. Although the tool is specific to the Greenstone digital library software, the underlying ideas should prove useful in more general contexts
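The collection-building workflow described above can be sketched as a pair of data transformations; this is a sketch only (function and field names are invented, and this is not the Greenstone API): select documents, assign metadata from a chosen metadata set, and build searchable indexes over it.

```python
# Sketch of the workflow: attach per-document metadata, then build a
# simple inverted index over the collection's searchable fields.
# Names are invented for illustration; not the Greenstone API.

def build_collection(documents, metadata_set, metadata):
    """Attach metadata to each selected document and index the
    collection over every field in the chosen metadata set."""
    collection = []
    index = {field: {} for field in metadata_set}
    for doc_id in documents:
        record = {"id": doc_id, **metadata.get(doc_id, {})}
        collection.append(record)
        for field in metadata_set:
            value = record.get(field)
            if value is not None:
                index[field].setdefault(value, []).append(doc_id)
    return collection, index

docs = ["report.pdf", "notes.txt"]
meta = {"report.pdf": {"title": "Annual report", "year": 2003}}
collection, index = build_collection(docs, ["title", "year"], meta)
assert index["year"][2003] == ["report.pdf"]
```

Different workflows then reduce to varying the order and grouping of these steps, e.g. assigning one metadata record to a whole group of documents before indexing.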