makeSense: Simplifying the Integration of Wireless Sensor Networks into Business Processes
A wide gap exists between the state of the art in developing Wireless Sensor Network (WSN) software and current practices concerning the design, execution, and maintenance of business processes. WSN software is most often developed based on low-level OS abstractions, whereas business process development leverages high-level languages and tools. This state of affairs places WSNs at the fringe of industry. The makeSense system addresses this problem by simplifying the integration of WSNs into business processes. Developers use BPMN models extended with WSN-specific constructs to specify the application behavior across both traditional business process execution environments and the WSN itself, which is to be equipped with application-specific software. We compile these models into a high-level intermediate language—also directly usable by WSN developers—and then into OS-specific, deployment-ready binaries. Key to this process is the notion of meta-abstraction, which we define to capture fundamental patterns of interaction with and within the WSN. The concrete realization of meta-abstractions is application-specific; developers tailor the system configuration by selecting concrete abstractions from the existing codebase or by providing their own. Our evaluation of makeSense shows that i) users perceive our approach as a significant advance over the state of the art, providing evidence of increased developer productivity when using makeSense; ii) in large-scale simulations, our prototype exhibits acceptable system overhead and good scaling properties, demonstrating the general applicability of makeSense; and iii) our prototype—including the complete tool-chain and underlying system support—sustains a real-world deployment where estimates by domain specialists indicate the potential for drastic reductions in the total cost of ownership compared to wired and conventional WSN-based solutions.
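The meta-abstraction pattern described above—an abstract interaction pattern whose concrete realization is selected per application—can be illustrated with a minimal sketch. This is not the makeSense toolchain; all class and registry names here are hypothetical, and a trivial "collect readings" pattern stands in for the paper's WSN interaction patterns.

```python
from abc import ABC, abstractmethod

class CollectAbstraction(ABC):
    """Meta-abstraction: a fundamental pattern for gathering data from sensor nodes."""
    @abstractmethod
    def collect(self, readings: list[float]) -> float: ...

class AverageCollect(CollectAbstraction):
    """Concrete abstraction: report the mean of the node readings."""
    def collect(self, readings: list[float]) -> float:
        return sum(readings) / len(readings)

class MaxCollect(CollectAbstraction):
    """Concrete abstraction: report only the maximum reading."""
    def collect(self, readings: list[float]) -> float:
        return max(readings)

# The system configuration selects a concrete abstraction by name;
# developers may also register their own implementations here.
REGISTRY: dict[str, type[CollectAbstraction]] = {
    "average": AverageCollect,
    "max": MaxCollect,
}

def configure(name: str) -> CollectAbstraction:
    """Instantiate the concrete abstraction chosen in the configuration."""
    return REGISTRY[name]()
```

The compiled intermediate code would program against `CollectAbstraction` only, so swapping `average` for `max` (or a user-supplied class) changes behavior without touching the generated code.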
An implementation and analysis of the Abstract Syntax Notation One and the basic encoding rules
The details of the Abstract Syntax Notation One (ASN.1) standard and the Basic Encoding Rules (BER) standard, which collectively solve the problem of data transfer across incompatible host environments, are presented, and a compiler that was built to automate their use is described. Experiences with this compiler are also discussed, providing a quantitative analysis of the performance costs associated with applying these standards. An evaluation is offered of how well suited ASN.1 and BER are to solving the common data representation problem.
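BER's core idea is a tag–length–value (TLV) byte layout that any host can parse regardless of its native word size or endianness. The following sketch hand-encodes a single ASN.1 INTEGER in definite short form; it is an illustration of the TLV scheme, not the paper's compiler, and it only handles content shorter than 128 bytes.

```python
def ber_encode_integer(value: int) -> bytes:
    """Encode an int as a BER INTEGER: tag 0x02, short-form length, two's-complement value."""
    # Find the minimal two's-complement width, as X.690 requires.
    length = 1
    while True:
        try:
            content = value.to_bytes(length, "big", signed=True)
            break
        except OverflowError:
            length += 1
    # Short-form length octet (valid only while the content is under 128 bytes).
    return bytes([0x02, len(content)]) + content

def ber_decode_integer(data: bytes) -> int:
    """Decode a short-form BER INTEGER produced by ber_encode_integer."""
    assert data[0] == 0x02, "expected INTEGER tag"
    length = data[1]
    return int.from_bytes(data[2:2 + length], "big", signed=True)
```

Note the extra leading zero octet for values like 128: without it, the high bit would flip the sign on decode, which is exactly the kind of representation detail BER pins down across incompatible hosts.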
ERIC: An Efficient and Practical Software Obfuscation Framework
Modern cloud computing systems distribute software executables over a network to keep the software sources, which are typically compiled in a security-critical cluster, secret. We develop ERIC, a new, efficient, and general software obfuscation framework. ERIC protects software against (i) static analysis, by making only an encrypted version of software executables available to the human eye, no matter how the software is distributed, and (ii) dynamic analysis, by guaranteeing that an encrypted executable can only be correctly decrypted and executed by a single authenticated device. ERIC comprises key hardware and software components to provide efficient software obfuscation support: (i) a hardware decryption engine (HDE) enables efficient decryption of encrypted executables in the target device, and (ii) the compiler can seamlessly encrypt software executables given only a unique device identifier. Both the hardware and software components are ISA-independent, making ERIC general. The key idea of ERIC is to use physical unclonable functions (PUFs), unique device identifiers, as secret keys in encrypting software executables. Malicious parties that cannot access the PUF in the target device cannot perform static or dynamic analyses on the encrypted binary. We develop ERIC's prototype on an FPGA to evaluate it end-to-end. Our prototype extends the RISC-V Rocket Chip with the hardware decryption engine (HDE) to minimize the overheads of software decryption. We augment a custom LLVM-based compiler to enable partial/full encryption of RISC-V executables. The HDE incurs minor FPGA resource overheads: it requires 2.63% more LUTs and 3.83% more flip-flops compared to the Rocket Chip baseline. LLVM-based software encryption increases compile time by 15.22% and the executable size by 1.59%. ERIC is publicly available and can be downloaded from https://github.com/kasirgalabs/ERIC
Comment: DSN 2022 - The 52nd Annual IEEE/IFIP International Conference on Dependable Systems and Networks
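The device-bound encryption idea—only the device holding the secret key can turn the distributed ciphertext back into a runnable binary—can be sketched in software. This toy model is not ERIC's cipher or its HDE: a fixed byte string stands in for the PUF response, and an HMAC-SHA256 counter-mode keystream stands in for a real cipher; the `keystream` and `encrypt_executable` names are hypothetical.

```python
import hashlib
import hmac

def keystream(device_key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a counter-mode keystream with HMAC-SHA256 (toy stand-in for a real cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        block = hmac.new(device_key, nonce + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

def encrypt_executable(device_key: bytes, nonce: bytes, binary: bytes) -> bytes:
    """XOR the binary with a device-bound keystream; decryption is the same operation."""
    ks = keystream(device_key, nonce, len(binary))
    return bytes(a ^ b for a, b in zip(binary, ks))
```

Because the keystream is a function of the device-unique secret, a party without that secret sees only ciphertext (defeating static analysis), and running the blob on any other device decrypts to garbage (defeating dynamic analysis)—the same two properties the abstract claims, modulo the toy cipher.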
Life of occam-Pi
This paper considers some questions prompted by a brief review of the history of computing. Why is programming so hard? Why is concurrency considered an “advanced” subject? What’s the matter with Objects? Where did all the Maths go? In searching for answers, the paper looks at some concerns over fundamental ideas within object orientation (as represented by modern programming languages), before focussing on the concurrency model of communicating processes and its particular expression in the occam family of languages. In that focus, it looks at the history of occam, its underlying philosophy (Ockham’s Razor), its semantic foundation on Hoare’s CSP, its principles of process oriented design and its development over almost three decades into occam-π (which blends in the concurrency dynamics of Milner’s π-calculus). Also presented will be an urgent need for rationalisation – occam-π is an experiment that has demonstrated significant results, but now needs time to be spent on careful review and implementing the conclusions of that review. Finally, the future is considered. In particular, is there a future?
Does Code Generation Promote or Prevent Optimizations?
This paper addresses the problem of code optimization for Real-Time and Embedded Systems (RTES). Such systems are designed using the Model-Based Development (MBD) approach, which consists of three major steps: building models, generating code from them, and compiling the generated code. During code generation, an important part of the modeling-language semantics that could be useful for optimization is lost, making some optimizations impossible. This paper shows how adding a new level of optimization at the model level results in more compact code. It also discusses the impact of code generation on optimization: whether this step promotes or prevents optimizations. We conclude with a proposal for a new MBD approach containing only the steps that advance optimization: modeling and compiling.
On the engineering of crucial software
The various aspects of the conventional software development cycle are examined. This cycle was the basis of the augmented approach contained in the original grant proposal, but it was found inadequate for crucial software development, and the justification for this opinion is presented. Several possible enhancements to the conventional software cycle are discussed. Software fault tolerance, a possible enhancement of major importance, is discussed separately. Formal verification using mathematical proof is considered, and automatic programming is discussed as a radical alternative to the conventional cycle. Recommendations for a comprehensive approach are presented, and various experiments which could be conducted in AIRLAB are described.
Portable implementation of computer aided design environment for composite structures
Composite materials are widely used due to their low weight, long durability, and the ability to tailor their properties to specific design requirements. Their wide range of applications requires a solid understanding of their behavior under different load conditions. The calculations needed for efficient and accurate design of composites can be exhaustive and time-consuming; therefore, an efficient computer program that facilitates effective design becomes a key factor in supporting commercial use of composite materials. A portable software tool was developed in the Java programming environment for design analysis of composite materials. This tool is superior to its predecessor, the Computer Aided Design Environment for Composites (CADEC) software, which is written in Toolbook: the Java version is an exact replica of CADEC, with the only major difference being that it is rewritten in a portable language. The software imposes no restrictions on accessibility or usability; it can be accessed via the Internet and run on any operating system, as long as a Java-enabled browser is available. The tool has been evaluated using example problems taken from the textbook Introduction to Composite Materials Design by Dr. Ever J. Barbero, and the results obtained with the new tool are sufficiently close to the textbook results.