
    Synthetic biology—putting engineering into biology

    Synthetic biology is interpreted as the engineering-driven building of increasingly complex biological entities for novel applications. Encouraged by progress in the design of artificial gene networks, de novo DNA synthesis and protein engineering, we review the case for this emerging discipline. Key aspects of an engineering approach are purpose-orientation, deep insight into the underlying scientific principles, a hierarchy of abstraction including suitable interfaces between and within the levels of the hierarchy, standardization, and the separation of design and fabrication. Synthetic biology investigates how these requirements can be implemented in the process of engineering biological systems. This is illustrated on the DNA level by the implementation of engineering-inspired artificial operations such as toggle switching, oscillation or the production of spatial patterns. On the protein level, the functionally self-contained domain structure of a number of proteins suggests possibilities for essentially Lego-like recombination, which can be exploited for reprogramming DNA-binding domain specificities or signaling pathways. Alternatively, computational design is emerging as a route to rationally reprogramming enzyme function. Finally, the increasing facility of de novo DNA synthesis—synthetic biology’s system fabrication process—makes it possible to implement novel designs for ever more complex systems. Some of these elements have merged to realize the first tangible synthetic biology applications in the manufacture of pharmaceutical compounds.
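
    A minimal sketch of the toggle-switch idea mentioned above, assuming the standard two-repressor model; the parameter values and initial conditions below are illustrative assumptions, not taken from the review:

        # Two mutually repressing genes: the classic bistable toggle switch.
        # Parameter values and initial conditions are illustrative assumptions.
        from scipy.integrate import solve_ivp

        ALPHA1, ALPHA2 = 10.0, 10.0   # maximal synthesis rates of the two repressors
        BETA, GAMMA = 2.0, 2.0        # cooperativity of repression

        def toggle(t, y):
            u, v = y                              # repressor concentrations
            du = ALPHA1 / (1.0 + v**BETA) - u     # synthesis repressed by v, linear decay
            dv = ALPHA2 / (1.0 + u**GAMMA) - v    # synthesis repressed by u, linear decay
            return [du, dv]

        # Starting with repressor 1 slightly ahead, the network settles into the
        # high-u/low-v state -- one of the switch's two stable "memory" states.
        sol = solve_ivp(toggle, (0.0, 50.0), [1.1, 1.0])
        print(f"steady state: u = {sol.y[0, -1]:.2f}, v = {sol.y[1, -1]:.2f}")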

    Putting Instruction Sequences into Effect

    An attempt is made to define the concept of execution of an instruction sequence. It is found to be a special case of directly putting an instruction sequence into effect. Directly putting an instruction sequence into effect comprises interpretation as well as execution. Directly putting into effect is in turn a special case of putting into effect, whose other special cases are classified as indirectly putting into effect.
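
    As a toy illustration of directly putting an instruction sequence into effect by interpretation (this is not the paper's formal framework; the instruction set and its semantics below are assumptions made purely for illustration):

        # A direct interpreter: putting a finite instruction sequence into effect
        # by stepping through it one instruction at a time.
        # The instruction set and its semantics are assumed for illustration only.
        def interpret(sequence):
            registers = {}
            for instruction in sequence:
                op, *args = instruction
                if op == "set":            # set a register to a constant
                    name, value = args
                    registers[name] = value
                elif op == "add":          # add the source register into the destination
                    dst, src = args
                    registers[dst] += registers[src]
                elif op == "halt":         # stop putting the sequence into effect
                    break
                else:
                    raise ValueError(f"unknown instruction: {op!r}")
            return registers

        program = [("set", "x", 3), ("set", "y", 4), ("add", "x", "y"), ("halt",)]
        print(interpret(program))          # {'x': 7, 'y': 4}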

    A model driven approach to analysis and synthesis of sequence diagrams

    Software design is a vital phase in the software development life cycle, as it creates a blueprint for the implementation of the software. It is crucial that software designs are error-free, since any unresolved design errors could lead to costly implementation errors. To minimize these errors, the software community adopted the concept of modelling from various other engineering disciplines. Modelling provides a platform to create and share abstract or conceptual representations of the software system – leading to various modelling languages, among them the Unified Modelling Language (UML) and Petri Nets. While Petri Nets' strong mathematical capability allows various formal analyses to be performed on the models, UML's user-friendly nature presents a more appealing platform for system designers. Using Multi-Paradigm Modelling, this thesis presents an approach where system designers may have the best of both worlds: SD2PN, a model transformation that maps UML Sequence Diagrams into Petri Nets, allows system designers to perform modelling in UML while still using Petri Nets to perform the analysis. Multi-Paradigm Modelling also provides a platform for a well-established Petri Net theory, synthesis, to be adopted into Sequence Diagrams as a method of putting together different Sequence Diagrams based on a set of techniques and algorithms.
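
    A minimal sketch of the intuition behind such a transformation (this is not the SD2PN rule set itself; the encoding below, one transition per message chained by control-flow places, is an illustrative assumption):

        # Illustrative mapping from Sequence Diagram messages to a Petri net
        # fragment: one transition per message, chained by control-flow places.
        from dataclasses import dataclass, field

        @dataclass
        class PetriNet:
            places: list = field(default_factory=list)
            transitions: list = field(default_factory=list)
            arcs: list = field(default_factory=list)   # (source, target) pairs

        def messages_to_net(messages):
            net = PetriNet(places=["p0"])
            for i, msg in enumerate(messages):
                t, p_next = f"t_{msg}", f"p{i + 1}"
                net.transitions.append(t)
                net.places.append(p_next)
                net.arcs += [(f"p{i}", t), (t, p_next)]   # place -> transition -> place
            return net

        net = messages_to_net(["login", "query", "logout"])   # a toy interaction
        print(net.places)        # ['p0', 'p1', 'p2', 'p3']
        print(net.transitions)   # ['t_login', 't_query', 't_logout']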

    Beyond Good and Evil: Formalizing the Security Guarantees of Compartmentalizing Compilation

    Compartmentalization is good security-engineering practice. By breaking a large software system into mutually distrustful components that run with minimal privileges, restricting their interactions to conform to well-defined interfaces, we can limit the damage caused by low-level attacks such as control-flow hijacking. When used to defend against such attacks, compartmentalization is often implemented cooperatively by a compiler and a low-level compartmentalization mechanism. However, the formal guarantees provided by such compartmentalizing compilation have seen surprisingly little investigation. We propose a new security property, secure compartmentalizing compilation (SCC), that formally characterizes the guarantees provided by compartmentalizing compilation and clarifies its attacker model. We reconstruct our property by starting from the well-established notion of fully abstract compilation, then identifying and lifting three important limitations that make standard full abstraction unsuitable for compartmentalization. The connection to full abstraction allows us to prove SCC by adapting established proof techniques; we illustrate this with a compiler from a simple unsafe imperative language with procedures to a compartmentalized abstract machine.
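
    The compartmentalization idea itself can be sketched informally: components expose only a declared interface, and cross-component calls outside that interface are rejected. The sketch below is a conceptual illustration only (Python provides no real isolation or low-level enforcement), not the SCC property defined in the paper, and all names are assumptions:

        # Conceptual sketch of compartmentalization: mutually distrustful
        # components may interact only through well-defined exported interfaces.
        # Illustration only; no real low-level enforcement is modelled here.
        class Compartment:
            def __init__(self, name, exports):
                self._name = name
                self._exports = dict(exports)    # the component's declared interface

            def call(self, entry_point, *args):
                """Cross-compartment calls go only through exported entry points."""
                if entry_point not in self._exports:
                    raise PermissionError(f"{self._name} does not export {entry_point!r}")
                return self._exports[entry_point](*args)

        _secret = "hunter2"   # private to the store compartment (by convention only)
        store = Compartment("store", {"check": lambda guess: guess == _secret})

        print(store.call("check", "letmein"))    # False: allowed interface call
        try:
            store.call("dump_secret")            # not exported, so it is rejected
        except PermissionError as err:
            print(err)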

    Alloy meets the algebra of programming: a case study

    Relational algebra offers software engineering the same degree of conciseness and calculational power that linear algebra offers other engineering disciplines. Binary relations play the role of matrices, with a similar emphasis on multiplication and transposition. This matches Alloy’s motto “everything is a relation” and the relational basis of the Algebra of Programming (AoP). Altogether, it provides a simple and coherent approach to checking and calculating programs from abstract models. In this paper, we put Alloy and the Algebra of Programming together in a case study originating from the Verifiable File System mini-challenge put forward by Joshi and Holzmann: verifying the refinement of an abstract file store model into a journaled (FLASH) data model catering to wear leveling and recovery from power loss. Our approach relies on diagrams to graphically express typed assertions. It interweaves model checking (in Alloy) with calculational proofs in a way which offers the best of both worlds. This provides ample evidence of the positive impact in software verification of Alloy’s focus on relations, complemented by induction-free proofs about data structures such as stores and lists.
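
    The relation-as-matrix analogy can be made concrete with a small sketch, here encoding binary relations as sets of pairs (an illustrative encoding, not the one used in the paper):

        # Binary relations as sets of (input, output) pairs: composition plays
        # the role of matrix multiplication, converse the role of transposition.
        def compose(r, s):
            """Relational composition r ; s."""
            return {(a, c) for (a, b) in r for (b2, c) in s if b == b2}

        def converse(r):
            """Relational converse, the analogue of matrix transposition."""
            return {(b, a) for (a, b) in r}

        parent = {("ana", "bea"), ("bea", "carl")}    # toy data
        grandparent = compose(parent, parent)         # {('ana', 'carl')}
        child = converse(parent)

        # One small calculational law: (r ; s)^o = s^o ; r^o (converse is contravariant).
        assert converse(compose(parent, parent)) == compose(converse(parent), converse(parent))
        print(grandparent, child)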

    A Study of Text Mining Framework for Automated Classification of Software Requirements in Enterprise Systems

    Text Classification is a rapidly evolving area of Data Mining, while Requirements Engineering is a less-explored area of Software Engineering that deals with the process of defining, documenting and maintaining a software system's requirements. When researchers blended these two streams, work emerged on automating the classification of software requirement statements into categories that developers can easily comprehend, enabling faster development and delivery; until now this has mostly been done manually by software engineers - indeed a tedious job. However, most of that research focused on the classification of non-functional requirements pertaining to intangible features such as security, reliability and quality. It is a challenging task to automatically classify functional requirements, those pertaining to how the system will function, especially when they belong to different, large enterprise systems; this requires exploiting text mining capabilities. This thesis investigates the results of text classification applied to functional software requirements by creating a framework in R and making use of algorithms and techniques such as k-nearest neighbors, support vector machines, boosting, bagging, maximum entropy, neural networks and random forests in an ensemble approach. The study was conducted by collecting and visualizing relevant enterprise data that had been manually classified previously and was subsequently used for training the model. Key factors in training included the frequency of terms in the documents and the level of cleanliness of the data. The model was applied to test data and validated by studying and comparing metrics such as precision, recall and accuracy.
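
    The thesis implements its framework in R; the sketch below illustrates the same ensemble idea in Python with scikit-learn (TF-IDF features feeding k-NN, an SVM and a random forest, combined by majority vote). The toy requirement statements, category labels and parameters are placeholders, not the thesis's data or configuration:

        # Illustrative ensemble classifier for functional requirement statements.
        # Toy data, labels and parameters are placeholders only.
        from sklearn.ensemble import RandomForestClassifier, VotingClassifier
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import SVC

        requirements = [
            "The system shall generate a monthly billing report",
            "Users shall be able to reset their password via email",
            "The warehouse module shall update stock levels on shipment",
            "The portal shall let customers track their order status",
        ]
        labels = ["billing", "accounts", "inventory", "orders"]   # placeholder categories

        model = make_pipeline(
            TfidfVectorizer(ngram_range=(1, 2), stop_words="english"),
            VotingClassifier(
                estimators=[
                    ("knn", KNeighborsClassifier(n_neighbors=1)),
                    ("svm", SVC(kernel="linear")),
                    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
                ],
                voting="hard",    # majority vote across the three models
            ),
        )

        model.fit(requirements, labels)
        print(model.predict(["Customers shall receive an invoice each month"]))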

    Putting formal specifications under the magnifying glass: Model-based testing for validation

    A software development process is effectively an abstract form of model transformation, starting from an end-user model of requirements and moving through to a system model from which code can be automatically generated. The success (or failure) of such a transformation depends substantially on obtaining a correct, well-formed initial model that captures user concerns. Model-based testing automates black-box testing based on the model of the system under analysis. This paper proposes and evaluates a novel model-based testing technique that aims to reveal specification/requirement-related errors by generating test cases from a test model and exercising them on the design model. The case study outlined in the paper shows that a separate test model not only increases the level of objectivity of the requirements, but also supports the validation of the system under test through test case generation. The results obtained from the case study support the hypothesis that there may be discrepancies between the formal specification of the system as modelled at the developer's end and the problem to be solved, and that formal verification methods alone may not be sufficient to reveal them. The approach presented in this paper aims at providing means to obtain greater confidence in the design model that is used as the basis for code generation.
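
    A minimal sketch of generating test cases from a separate test model and exercising them on the design model (the turnstile state machines, events and the seeded defect below are assumptions, not the paper's case study):

        # Model-based testing in miniature: derive event sequences from a test
        # model (expected behaviour) and compare the design model's behaviour.
        # Both state machines are illustrative assumptions.
        import itertools

        TEST_MODEL = {                     # expected: a coin unlocks, a push relocks
            ("locked", "coin"): "unlocked",
            ("unlocked", "push"): "locked",
            ("locked", "push"): "locked",
        }
        DESIGN_MODEL = {                   # design under validation, with a seeded defect
            ("locked", "coin"): "unlocked",
            ("unlocked", "push"): "unlocked",   # defect: fails to relock after a push
            ("locked", "push"): "locked",
        }

        def run(model, events, start="locked"):
            state = start
            for event in events:
                state = model.get((state, event), state)   # unspecified events change nothing
            return state

        alphabet = sorted({event for (_, event) in TEST_MODEL})
        for length in (1, 2):              # generate all test sequences up to length 2
            for seq in itertools.product(alphabet, repeat=length):
                expected, actual = run(TEST_MODEL, seq), run(DESIGN_MODEL, seq)
                if expected != actual:
                    print(f"discrepancy on {seq}: expected {expected}, got {actual}")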

    Towards a new vision of Information System Engineering.

    Information Technologies bear the potential for new uses. These uses give rise to new forms of organization, which in turn call for a new vision of software engineering. Under the influence of globalization, and the impact of Information and Communication Technologies (ICT) that radically modify our relationship with space and time, the hierarchical company locked within its local borders becomes an Extended Company: without borders, open and adaptable. In this context, this paper proposes a shift in the way the design of information systems is viewed, so that the digital information system and the potential user are in harmony right from the design stage of the system. The goal is to help design systems that are useful. The emphasis is therefore on a distributed intelligence of the situation, framed in terms of interactions and cooperating partners rather than a more passive user. This means putting at the disposal of the user, seen as a "partner", a system that will help him or her think more efficiently about a situation. The approach adopted is a global philosophy based on business process management within the framework of the overall methodological principles. The research described here is therefore a contribution to software engineering.
    Keywords: User; Software Engineering; Information System; Business Process Management; Extended Company; Digital Information System

    What is Conceptual Engineering and What Should it Be?

    Conceptual engineering is the design, implementation, and evaluation of concepts. Conceptual engineering includes or should include de novo conceptual engineering (designing a new concept) as well as conceptual re-engineering (fixing an old concept). It should also include heteronymous (different-word) as well as homonymous (same-word) conceptual engineering. I discuss the importance and the difficulty of these sorts of conceptual engineering in philosophy and elsewhere.

    Democracy, Ideology and Process Re-Engineering: Realising the Benefits of e-Government in Singapore

    The re-engineering of governmental processes is a necessary condition for the realisation of the benefits of e-government. Several obstacles to such re-engineering exist. These include: (1) information processing thrives on transparency and amalgamation of data, whilst governments are constrained by principles of privacy and data separation; (2) top-down re-engineering may be resisted effectively from the bottom up. This paper analyses these obstacles in the way of re-engineering in Singapore – a democratic one-party state where legislative and executive power lies with the People’s Action Party – and considers how that hegemony has aided the development of e-government.