
    Reduced order modeling of distillation systems

    The concept of distillation separation feasibility is investigated using reduced-order models. Three models of nonequilibrium, rate-based packed distillation columns are developed, each with a progressively higher level of complexity. The final model is the most complex and is based on the Maxwell-Stefan theory of mass transfer; the first and second models serve as building blocks on the way to it, as various simplifying assumptions are systematically relaxed. All models are developed using orthogonal collocation, whose order-reduction properties are well documented. A low-order model is desirable because it makes the subsequent generation of the data required for assessing separation feasibility fast. The first model is the simplest, as constant molar overflow is assumed; this assumption is relaxed in the subsequent models. The second and third models differ in their treatment of mass and energy transfer: the second uses a constant bulk-phase approximation for an overall gas-phase transfer coefficient, while the third uses rigorous Maxwell-Stefan mass transfer coefficients, which vary throughout the column. In all models, the bootstrap equation for the energy balance across the two-phase film is used after the appropriate modifications are made based on the system assumptions.

    Starting-point solutions and minimum height and flow analyses are presented for all models. The first model is used to develop a methodology for identifying and characterizing azeotropic pinches. Different numerical techniques are compared, and the accuracy of orthogonal collocation is verified. Ternary and pseudo McCabe-Thiele diagrams are used to represent the results for the multicomponent models 2 and 3. The results for models 2 and 3 are similar, as expected, since they differ only in their mass and heat transfer definitions. An argument is made for a specific definition of an objective function for models 2 and 3, which is subsequently used to generate separation surfaces. This function is defined such that a solution always exists, and for this reason it is deemed superior to the alternatives considered. Feasible regions are identified using a grid projection of the relevant sections of the separation surfaces; the data set contained within the feasible region will be used in an optimizer in future work.

    In general, this work involves understanding and applying collocation mathematics to distillation systems, and a thorough grasp of distillation systems, the associated mathematics, and the degrees of freedom is essential. A large section of the work is devoted to explaining and manipulating the available degrees of freedom so that the desired end result, a feasible region for a specific separation, can be obtained. Other complicating factors include the collocation boundary conditions and their relationship to the overall degrees of freedom of the system. In the literature, collocation is largely applied to staged columns, and the resulting feed-stage discontinuities are smoothed out using various interpolation routines. Both of these practices are shown to be incorrect: applying collocation to staged columns is fundamentally flawed given the underlying theory of staged distillation and the implications of the collocation assumptions, and the feed discontinuities present in all the results are intrinsic features of the system that should be preserved.

    It is further concluded that models 2 and 3 are consistent with each other. Finally, it is shown that separation feasibility can be successfully determined using the proposed objective function, a success resting on the accuracy and order reduction achieved through collocation. Future work will involve optimizing the data found in the feasible region using nonlinear programming.
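
    For orientation, the rigorous mass-transfer description in the third model rests on the Maxwell-Stefan relations. Their standard textbook form for an n-component mixture is given below as a reference point only; the working equations of this thesis, which are coupled with the bootstrap energy balance across the film, are not reproduced here.

        % Maxwell-Stefan relations (standard form), with x_i the mole fraction
        % and N_i the molar flux of species i, c_t the total molar concentration,
        % and D_{ij} the Maxwell-Stefan diffusivity of the i-j pair:
        -\frac{x_i}{RT}\,\frac{\mathrm{d}\mu_i}{\mathrm{d}z}
            = \sum_{j \ne i} \frac{x_j N_i - x_i N_j}{c_t D_{ij}},
            \qquad i = 1, \dots, n-1

    For an ideal mixture the left-hand side reduces to -\mathrm{d}x_i/\mathrm{d}z, recovering the familiar composition-gradient form. Separately, the sketch below illustrates the order-reduction idea behind collocation in plain NumPy; it is hypothetical (the node choice and test function are invented for the demonstration, not taken from this work):

        import numpy as np

        # Sketch of the order-reduction idea behind orthogonal collocation:
        # represent a profile by its values at a few well-chosen nodes and
        # differentiate the interpolating polynomial exactly at those nodes.
        def lagrange_diff_matrix(z):
            """First-derivative matrix D such that D @ f(z) approximates
            f'(z) for the polynomial interpolant through the nodes z."""
            n = len(z)
            # Barycentric weights of the Lagrange interpolant.
            w = np.array([1.0 / np.prod([z[i] - z[j] for j in range(n) if j != i])
                          for i in range(n)])
            D = np.zeros((n, n))
            for i in range(n):
                for j in range(n):
                    if i != j:
                        D[i, j] = (w[j] / w[i]) / (z[i] - z[j])
                D[i, i] = -D[i].sum()  # rows sum to zero: constants have zero slope
            return D

        # A handful of nodes already differentiates a smooth profile to roughly
        # 1e-4 accuracy, which is the sense in which collocation yields a
        # low-order model of a continuous column profile.
        n = 6
        z = 0.5 * (1.0 - np.cos(np.pi * np.arange(n) / (n - 1)))  # nodes on [0, 1]
        D = lagrange_diff_matrix(z)
        print(np.max(np.abs(D @ np.exp(z) - np.exp(z))))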

    A comparison of two SPLE tools : Pure::Variants and Clafer tools

    In software product line engineering (SPLE), parts of the developed software are made variable so that a whole range of software products can be built at the same time. This is widely known to have a number of potential benefits, such as cost savings when the product line is large enough. However, managing variability in software introduces challenges that are not well addressed by the tools used in conventional software engineering, and specialized tools are needed. Research questions: 1) What are the most important requirements for SPLE tools in a small-to-medium sized organisation aiming to experiment with SPLE? 2) How well are those requirements met by two specific SPLE tools, Pure::Variants and Clafer tools? 3) How do the studied tools compare against each other in their suitability for the chosen context (a digital board game platform)? 4) How can common requirements for SPLE tools be generalized to apply to both graphical and text-based tools? A list of requirements is first obtained from the literature and then used as the basis for an experiment in which support for each requirement is tried out with both tools. Part of an example product line is then developed with both tools and the experiences are reported. Both tools were found to support the listed requirements quite well, although there were some usability problems and not everything could be tested due to technical issues. Based on developing the example, both tools were found to have their own strengths and weaknesses, probably resulting in part from one being GUI-based and the other textual. ACM Computing Classification System (CCS): (1) CCS → Software and its engineering → Software creation and management → Software development techniques → Reusability → Software product lines (2) CCS → Software and its engineering → Software notations and tools → Software configuration management and version control systems
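
    As a rough illustration of the kind of variability such tools manage, the sketch below encodes a tiny feature model with one cross-tree constraint in plain Python. It is hypothetical: the feature names for a board game platform are invented, and neither Pure::Variants nor Clafer uses this notation (both provide dedicated modeling languages and automated consistency checking).

        # Hypothetical feature model for a digital board game product line.
        # Feature names and the single "requires" constraint are invented for
        # illustration; real SPLE tools express these in their own notations.
        FEATURES = {"dice", "cards", "timer", "online_play", "leaderboard"}

        # Cross-tree constraint: a leaderboard requires online play.
        REQUIRES = {"leaderboard": {"online_play"}}

        def is_valid_product(selected):
            """A product is valid if it selects only known features and every
            selected feature's required features are also selected."""
            if not selected <= FEATURES:
                return False
            return all(REQUIRES.get(f, set()) <= selected for f in selected)

        # Two configurations derived from the same product line:
        print(is_valid_product({"dice", "timer"}))         # True
        print(is_valid_product({"cards", "leaderboard"}))  # False: online_play missing

    Checking such constraints automatically over models far larger than this is precisely what specialized SPLE tooling exists to do.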

    Advanced manufacturing development of a composite empennage component for L-1011 aircraft

    This is the final report of technical work conducted during the fourth phase of a multiphase program whose objective is the design, development, and flight evaluation of an advanced composite empennage component, manufactured in a production environment at a cost competitive with that of its metal counterpart and at a weight savings of at least 20 percent. The empennage component selected for this program is the vertical fin box of the L-1011 aircraft. The box structure extends from the fuselage production joint to the tip rib and includes the front and rear spars. During Phase 4 of the program, production-quality tooling was designed and manufactured to produce three sets of covers, ribs, spars, miscellaneous parts, and subassemblies for assembling three complete advanced composite vertical fin (ACVF) units. Recurring and nonrecurring cost data were compiled and documented in the updated producibility/design-to-cost plan. Nondestructive inspections, quality control tests, and quality acceptance tests were performed in accordance with the quality assurance plan and the structural integrity control plan. Records were maintained to provide traceability of material and parts throughout the manufacturing development phase. It was also determined that additional tooling would not be required to support the current and projected L-1011 production rate.

    A Reference Framework for Variability Management of Software Product Lines

    Variability management (VM) in software product line engineering (SPLE) is introduced as an abstraction that enables the reuse and customization of assets. VM is a complex task involving the identification, representation, and instantiation of variability for specific products, as well as the evolution of variability itself. This work compares and contrasts existing VM approaches using qualitative meta-synthesis to determine the underlying perspectives, metaphors, and concepts of existing methods. A common frame of reference for VM is proposed as the result of this analysis. Putting the metaphors in the context of the dimensions in which variability occurs, and identifying their key concepts, provides a better understanding of variability management and enables several analysis and evaluation opportunities. Finally, the proposed framework was evaluated using a qualitative study approach. The results of the evaluation phase suggest that organizations in practice focus on only one dimension. The presented frame of reference will help organizations to close this gap in practice.

    Interaction Design: Foundations, Experiments

    Interaction Design: Foundations, Experiments is the result of a series of projects, experiments and curricula aimed at investigating the foundations of interaction design in particular and design research in general. The first part of the book, Foundations, deals with foundational theoretical issues in interaction design. An analysis of two categorical mistakes, the empirical and interactive fallacies, forms a background to a discussion of interaction design as act design and of computational technology as material in design. The second part of the book, Experiments, describes a range of design methods, programs and examples that have been used to probe foundational issues through systematic questioning of what is given. Based on experimental design work such as Slow Technology, Abstract Information Displays, Design for Sound Hiders, Zero Expression Fashion, and IT+Textiles, this section also explores how design experiments can play a central role when developing new design theory.

    What’s in a Convention? Process and substance in the project of European constitution-building. IHS Political Science Series: 2003, No. 89

    The paper studies aspects of the process and substance of the deliberations of the Convention on the Future of the Union, against the backdrop of the longer-term development of a Constitution for the European Union. It examines some of the issues which have arisen over the course of the longer-term debate about European constitutionalism, including the normative basis of a putative Constitution for the EU. In the main part of the paper, the primary objective is to elaborate in more detail the ways in which the Convention's work was structured by the complex procedural and substantive heritage of the Union's constitutional acquis. It focuses on the Convention as an addition to an already complex and multi-faceted constitution-building process, and looks at some of the principles which it has proposed to bring into the constitutional architecture, such as the explicit articulation of the supremacy principle. It concludes that at times the fit between the 'old' and the 'new' in the constitutional process and substance developed by the Convention is far from satisfactory.