
    On Modeling and Analyzing Cost Factors in Information Systems Engineering

    Introducing enterprise information systems (EIS) is usually associated with high costs. It is therefore crucial to understand those factors that determine or influence these costs. Though software cost estimation has received considerable attention during the last decades, it is difficult to apply existing approaches to EIS. This difficulty particularly stems from the inability of these methods to deal with the dynamic interactions of the many technological, organizational and project-driven cost factors which specifically arise in the context of EIS. Addressing this problem, we introduce the EcoPOST framework to investigate the complex cost structures of EIS engineering projects through qualitative cost evaluation models. This paper extends previously described concepts and introduces design rules and guidelines for cost evaluation models in order to enhance the development of meaningful and useful EcoPOST cost evaluation models. A case study illustrates the benefits of our approach. Most importantly, our EcoPOST framework is an important tool supporting EIS engineers in gaining a better understanding of the critical factors determining the costs of EIS engineering projects.
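    As a rough illustration of the kind of dynamic cost-factor interaction such evaluation models are meant to capture, the sketch below simulates how one evolving factor can inflate accumulated project cost. All names, rates, and the linear influence model are illustrative assumptions and are not taken from the EcoPOST framework itself.

```python
# Minimal, hypothetical sketch: one dynamic cost factor ("process knowledge")
# inflating the accumulated cost of an EIS project over time. Factor names,
# rates, and the linear influence model are illustrative assumptions only.

def simulate_cost(months: int = 12,
                  base_monthly_cost: float = 10.0,
                  initial_knowledge: float = 0.2,
                  learning_rate: float = 0.08) -> float:
    """Accumulate project cost; low process knowledge adds rework overhead."""
    total_cost = 0.0
    knowledge = initial_knowledge
    for _ in range(months):
        overhead = base_monthly_cost * (1.0 - knowledge)   # rework driven by low knowledge
        total_cost += base_monthly_cost + overhead
        knowledge = min(1.0, knowledge + learning_rate)    # factor evolves, feeding back
    return total_cost

if __name__ == "__main__":
    print(f"Estimated project cost: {simulate_cost():.1f} cost units")
```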

    Unraveling radial dependency effects in fiber thermal drawing

    Fiber-based devices with advanced functionalities are emerging as promising solutions for various applications in flexible electronics and bioengineering. Multimaterial thermal drawing, in particular, has attracted strong interest for its ability to generate fibers with complex architectures. Thus far, however, the understanding of its fluid dynamics has only been applied to single-material preforms, for which higher-order effects, such as the radial dependency of the axial velocity, could be neglected. With complex multimaterial preforms, such effects must be taken into account, as they can affect the architecture and the functional properties of the resulting fiber device. Here, we propose a versatile model of the thermal drawing of fibers which takes into account a radially varying axial velocity. Unlike the commonly used cross-section averaged approach, our model is capable of predicting radial variations of functional properties caused by the deformation during drawing. This is demonstrated for two observed effects: the deformation of initially straight, transversal lines in the preform, and the dependence of the in-fiber electrical conductivity of polymer nanocomposites, an important class of materials for emerging fiber devices, on the draw ratio and radial position. This work establishes a thus far missing theoretical and practical understanding of multimaterial fiber processing to better engineer advanced fibers and textiles for sensing, health care, robotics, or bioengineering applications.
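    To make the contrast concrete, the notation below (assumed here, not taken verbatim from the paper) compares a cross-section averaged description, in which only the draw ratio matters, with a radially resolved axial velocity carrying an additional correction profile.

```latex
% Illustrative notation only: averaged vs. radially resolved axial velocity.
\begin{align}
  \bar{v}(z)\,\pi R(z)^2 &= \text{const}, \qquad
    DR = \frac{v_{\mathrm{fiber}}}{v_{\mathrm{preform}}}
       = \left(\frac{R_{\mathrm{preform}}}{R_{\mathrm{fiber}}}\right)^{2}
    && \text{(cross-section averaged, mass conservation)} \\
  v_z(r,z) &= \bar{v}(z)\left[1 + \varepsilon(z)\,f\!\left(\tfrac{r}{R(z)}\right)\right]
    && \text{(radially resolved, with an assumed correction profile } f\text{)}
\end{align}
```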

    A common trajectory recapitulated by urban economies

    Is there a general economic pathway recapitulated by individual cities over and over? Identifying such an evolution structure, if any, would provide a quantitative baseline for models for the assessment, maintenance, and forecasting of urban sustainability and economic success. This premise seems to contradict the existing body of empirical evidence for path-dependent growth shaping the unique history of individual cities. And yet, recent empirical evidence and theoretical models have pointed to universal, mostly size-dependent patterns, expressing many urban quantities as a set of simple scaling laws. Here, we provide a mathematical framework to integrate repeated cross-sectional data, each frozen in the time dimension, into a frame of reference for the longitudinal evolution of individual cities in time. Using data on over 100 million employees in a thousand business categories between 1998 and 2013, we decompose each city's evolution into a pre-factor and relative changes to eliminate national and global effects. In this way, we show that the longitudinal dynamics of individual cities recapitulate the observed cross-sectional regularity. Larger cities are not only scaled-up versions of their smaller peers but also of their own past. In addition, our model shows that both specialization and diversification are attributed to the distribution of industries' scaling exponents, resulting in a critical population of 1.2 million at which a city makes an industrial transition into innovative economies.
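    The scaling-law baseline referred to above is commonly written as Y = Y0 * N^beta for an urban quantity Y and city population N. The sketch below, using synthetic data and an assumed exponent rather than the paper's dataset, shows how such an exponent and pre-factor can be estimated with a log-log fit.

```python
# Illustrative sketch with synthetic data: estimating an urban scaling exponent
# beta in Y = Y0 * N**beta by least squares in log-log space. The "true" values
# below are assumptions for the demo, not results from the paper.
import numpy as np

rng = np.random.default_rng(0)
population = 10 ** rng.uniform(4, 7, size=500)            # city sizes, 10^4 .. 10^7
true_beta, true_y0 = 1.15, 0.02                           # assumed for illustration
employment = true_y0 * population ** true_beta * np.exp(rng.normal(0, 0.2, 500))

beta_hat, log_y0_hat = np.polyfit(np.log(population), np.log(employment), 1)
print(f"estimated beta = {beta_hat:.3f}, pre-factor Y0 = {np.exp(log_y0_hat):.3f}")
```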

    Analysis of Software Binaries for Reengineering-Driven Product Line Architecture – An Industrial Case Study

    This paper describes a method for the recovery of software architectures from a set of similar (but unrelated) software products in binary form. One intention is to drive refactoring into software product lines and to combine architecture recovery with runtime binary analysis and existing clustering methods. Using our runtime binary analysis, we create graphs that capture the dependencies between different software parts. These are clustered into smaller component graphs, which group software parts with high interaction into larger entities. The component graphs serve as a basis for further software product line work. In this paper, we concentrate on the analysis part of the method and the graph clustering. We apply the graph clustering method to a real application in the context of automation/robot configuration software tools. Comment: In Proceedings FMSPLE 2015, arXiv:1504.0301
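    As a rough illustration of the clustering step described above (not the paper's own algorithm), the sketch below builds a small, made-up runtime dependency graph and groups strongly interacting binaries into candidate components using an off-the-shelf modularity-based method.

```python
# Hypothetical sketch: clustering a runtime dependency graph recovered from
# binaries into candidate components. Node names and edge weights are made up;
# the modularity-based clustering stands in for the paper's own method.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.Graph()
# (caller, callee, interaction count observed at run time)
G.add_weighted_edges_from([
    ("ui.dll", "config.dll", 12), ("ui.dll", "render.dll", 30),
    ("config.dll", "io.dll", 25), ("io.dll", "parser.dll", 40),
    ("render.dll", "math.dll", 18), ("parser.dll", "math.dll", 3),
])

for i, component in enumerate(greedy_modularity_communities(G, weight="weight")):
    print(f"candidate component {i}: {sorted(component)}")
```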

    Co-simulation of Continuous Systems: A Tutorial

    Co-simulation consists of the theory and techniques that enable global simulation of a coupled system via the composition of simulators. Despite the large number of applications and growing interest in the challenges, the field remains fragmented into multiple application domains, with limited sharing of knowledge. This tutorial aims to introduce co-simulation of continuous systems and is targeted at researchers new to the field.
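    A minimal sketch of the composition idea, under the simplest possible assumptions (fixed communication step, explicit Euler inside each simulator, zero-order hold on exchanged inputs), is shown below; it is not taken from the tutorial itself.

```python
# Minimal sketch of a fixed-step (Jacobi-style) co-simulation of two coupled
# first-order subsystems. Each "simulator" advances over a communication step H
# using the other's last output held constant (zero-order hold).
def sim1_step(x1, u, H):        # dx1/dt = -x1 + u
    return x1 + H * (-x1 + u)

def sim2_step(x2, u, H):        # dx2/dt = -2*x2 + u
    return x2 + H * (-2.0 * x2 + u)

x1, x2, H = 1.0, 0.0, 0.01
for _ in range(500):
    y1, y2 = x1, x2             # outputs exchanged at the communication point
    x1 = sim1_step(x1, y2, H)
    x2 = sim2_step(x2, y1, H)

print(f"x1 = {x1:.4f}, x2 = {x2:.4f} after {500 * H:.1f} s of co-simulated time")
```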

    Ontology modelling methodology for temporal and interdependent applications

    The increasing adoption of Semantic Web technology by several classes of applications in recent years has made ontology engineering a crucial part of application development. Nowadays, the abundant availability of interdependent information from multiple sources, representing various fields such as health, transport, and banking, further evidences the growing need for utilising ontologies in the development of Web applications. While there have been several advances in the adoption of ontologies for application development, less emphasis has been placed on modelling methodologies for representing modern-day applications that are characterised by the temporal nature of the data they process, which is captured from multiple sources. Taking into account the benefits of a methodology in system development, we propose a novel methodology for modelling ontologies representing Context-Aware Temporal and Interdependent Systems (CATIS). CATIS is an ontology development methodology for modelling temporal, interdependent applications, designed to achieve the desired results when modelling sophisticated applications with temporal and interdependent attributes that suit today's application requirements.
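    As a purely hypothetical illustration of the kind of temporal, interdependent facts such an ontology has to represent (not the CATIS methodology itself), the snippet below encodes a time-stamped transport event that affects a health-domain appointment using rdflib; the namespace and property names are invented for the example.

```python
# Hypothetical sketch (not CATIS itself): a time-stamped, interdependent fact
# spanning two domains, expressed with rdflib. Namespace and property names
# are invented for illustration.
from rdflib import Graph, Namespace, Literal, RDF, XSD

EX = Namespace("http://example.org/catis#")   # illustrative namespace
g = Graph()
g.bind("ex", EX)

g.add((EX.DelayEvent42, RDF.type, EX.TransportEvent))
g.add((EX.DelayEvent42, EX.hasTimestamp,
       Literal("2023-05-01T08:30:00", datatype=XSD.dateTime)))
# Interdependency: the transport event influences a health-domain appointment.
g.add((EX.DelayEvent42, EX.affects, EX.Appointment7))
g.add((EX.Appointment7, RDF.type, EX.HealthAppointment))

print(g.serialize(format="turtle"))
```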

    Reasoning About the Reliability of Multi-version, Diverse Real-Time Systems

    This paper is concerned with the development of reliable real-time systems for use in high-integrity applications. It advocates the use of diverse replicated channels, but does not require the dependencies between the channels to be evaluated. Rather, it develops and extends the approach of Littlewood and Rushby (for general systems) by investigating a two-channel system in which one channel, A, is produced to a high level of reliability (i.e. has a very low failure rate), while the other, B, employs various forms of static analysis to sustain an argument that it is perfect (i.e. it will never miss a deadline). The first channel is fully functional; the second contains a more restricted computational model and contains only the critical computations. Potential dependencies between the channels (and their verification) are evaluated in terms of aleatory and epistemic uncertainty. At the aleatory level, the events "A fails" and "B is imperfect" are independent. Moreover, unlike the general case, independence at the epistemic level is also proposed for common forms of implementation and analysis for real-time systems and their temporal requirements (deadlines). As a result, a systematic approach is advocated that can be applied in a real engineering context to produce highly reliable real-time systems, and to support numerical claims about the level of reliability achieved.
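    The aleatory-level argument sketched above leads to a simple multiplicative bound; the notation below is assumed for illustration and follows the general two-channel reasoning rather than this paper's exact formulation.

```latex
% p_A : probability that channel A fails on a randomly selected demand
% p_B : (epistemic) probability that channel B is not perfect
% With "A fails" and "B is imperfect" independent at the aleatory level,
% the system failure probability on a demand is bounded by
\[
  P(\text{system fails on demand}) \;\le\; p_A \cdot p_B .
\]
```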

    Simplification of UML/OCL schemas for efficient reasoning

    Ensuring the correctness of a conceptual schema is an essential task in order to avoid the propagation of errors during software development. The kind of reasoning required to perform such a task is known to be exponential for UML class diagrams alone and even harder when considering OCL constraints. Motivated by this issue, we propose an innovative method aimed at removing constraints and other UML elements of the schema to obtain a simplified one that preserves the same reasoning outcomes. In this way, we can reason about the correctness of the initial artifact by reasoning on a simplified version of it. Thus, the efficiency of the reasoning process is significantly improved. In addition, since our method is independent of the reasoning engine used, any reasoning method may benefit from it.

    Development of filtered Euler–Euler two-phase model for circulating fluidised bed: High resolution simulation, formulation and a priori analyses

    Euler–Euler two-phase model simulations are usually performed with mesh sizes larger than the small-scale structure size of gas–solid flows in industrial fluidised beds because of computational resource limitations. Thus, these simulations do not fully account for the particle segregation effect at the small scale, which causes poor prediction of bed hydrodynamics. An appropriate modelling approach accounting for the influence of unresolved structures needs to be proposed for practical simulations. For this purpose, computational grids are refined to a cell size of a few particle diameters to obtain mesh-independent results, requiring up to 17 million cells in a 3D periodic circulating fluidised bed. These mesh-independent results are filtered by volume averaging and used to perform a priori analyses on the filtered phase balance equations. Results show that filtered momentum equations can be used for practical simulations but must take into account a drift velocity, due to the sub-grid correlation between the local fluid velocity and the local particle volume fraction, and particle sub-grid stresses, due to the filtering of the non-linear convection term. This paper proposes models for the sub-grid drift velocity and particle sub-grid stresses and assesses these models by a priori tests.
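    For orientation, one common way the sub-grid drift velocity enters the filtered drag term in this class of models is sketched below; the notation is assumed here and may differ from the paper's exact formulation.

```latex
% Drift velocity: gas velocity seen by the particle phase minus the resolved
% (phase-weighted) gas velocity; it corrects the filtered drag closure.
\begin{align}
  \tilde{V}_{d,i} &= \frac{\overline{\alpha_p\, u_{g,i}}}{\overline{\alpha_p}}
                     - \frac{\overline{\alpha_g\, u_{g,i}}}{\overline{\alpha_g}} \\
  \overline{I_{gp,i}} &\approx \overline{\beta}\,
      \bigl(\tilde{u}_{g,i} - \tilde{u}_{p,i} + \tilde{V}_{d,i}\bigr)
\end{align}
```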