
    Two Case Studies of Subsystem Design for General-Purpose CSCW Software Architectures

    This paper discusses subsystem design guidelines for the software architecture of general-purpose computer-supported cooperative work systems, i.e., systems that are designed to be applicable in various application areas requiring explicit collaboration support. In our opinion, guidelines for subsystem-level design are rarely given; most guidelines currently available apply to the programming language level. We extract guidelines from a case study of the redesign and extension of an advanced commercial workflow management system and place them into the context of existing software engineering research. The guidelines are then validated against the design decisions made in the construction of a widely used web-based groupware system. Our approach is based on the well-known distinction between essential (logical) and physical architectures. We show how essential architecture design can be based on a direct mapping of abstract functional concepts, as found in general-purpose systems, to modules in the essential architecture. The essential architecture is then mapped to a physical architecture by applying software clustering and replication to achieve the required distribution and performance characteristics.
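
    A minimal, invented sketch of the mapping described above may help to make it concrete: abstract functional concepts become modules of the essential architecture, which a naive clustering step then groups into physical deployment units that could be replicated. The concept names, call relations, and clustering criterion are assumptions for illustration only, not details from the case studies.

        # Essential architecture: one module per abstract functional concept (illustrative names only).
        essential_modules = {
            "session_management": {"calls": ["user_management", "shared_workspace"]},
            "user_management":    {"calls": []},
            "shared_workspace":   {"calls": ["document_storage"]},
            "document_storage":   {"calls": []},
            "notification":       {"calls": ["session_management"]},
        }

        def cluster(modules):
            """Group each not-yet-placed module with its not-yet-placed callees (naive sketch)."""
            clusters, placed = [], set()
            for name, info in modules.items():
                if name in placed:
                    continue
                unit = {name} | {c for c in info["calls"] if c not in placed}
                placed |= unit
                clusters.append(sorted(unit))
            return clusters

        # Physical architecture: each unit can then be distributed and replicated as required.
        for unit in cluster(essential_modules):
            print(unit)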

    Encapsulation of Soft Computing Approaches within Itemset Mining: A Survey

    Data mining discovers patterns and trends by extracting knowledge from large databases. Soft computing techniques such as fuzzy logic, neural networks, genetic algorithms, and rough sets aim to exploit the tolerance for imprecision and uncertainty in order to achieve tractability, robustness, and low-cost solutions. Fuzzy logic and rough sets are suitable for handling different types of uncertainty. Neural networks provide good learning and generalization. Genetic algorithms provide efficient search strategies for selecting a model from mixed-media data. Data mining refers to information extraction, while soft computing is used for information processing. For effective knowledge discovery from large databases, soft computing and data mining can be merged. Association rule mining (ARM) and itemset mining focus on finding the most frequent itemsets and their corresponding association rules, as well as on extracting rare itemsets and incorporating temporal and fuzzy concepts in the discovered patterns. This survey paper explores the usage of soft computing approaches in itemset utility mining.
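
    As a point of reference for the crisp (non-fuzzy) baseline that the soft-computing extensions surveyed above build on, the sketch below enumerates frequent itemsets with a naive Apriori-style support count. It is not taken from any surveyed work; the transaction data and the min_support threshold are assumptions for illustration.

        from itertools import combinations

        def frequent_itemsets(transactions, min_support=0.5):
            """Return {itemset: support} for all itemsets meeting min_support (naive sketch)."""
            n = len(transactions)
            items = sorted({item for t in transactions for item in t})
            frequent, k = {}, 1
            while True:
                found_any = False
                for candidate in combinations(items, k):
                    support = sum(1 for t in transactions if set(candidate) <= set(t)) / n
                    if support >= min_support:
                        frequent[candidate] = support
                        found_any = True
                if not found_any:
                    break
                k += 1
            return frequent

        # Hypothetical market-basket transactions (illustrative only).
        baskets = [{"milk", "bread"}, {"milk", "bread", "butter"}, {"bread", "butter"}, {"milk", "butter"}]
        for itemset, support in frequent_itemsets(baskets).items():
            print(itemset, round(support, 2))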

    From Earth to Orbit: An assessment of transportation options

    The report assesses the requirements, benefits, technological feasibility, and roles of Earth-to-Orbit transportation systems and options that could be developed in support of future national space programs. Transportation requirements, including those for Mission to Planet Earth, Space Station Freedom assembly and operation, human exploration of space, space science missions, and other major civil space missions, are examined. These requirements are compared with existing, planned, and potential launch capabilities, including expendable launch vehicles (ELVs), the Space Shuttle, the National Launch System (NLS), and new launch options. In addition, the report examines propulsion systems in the context of various launch vehicles. These include the Advanced Solid Rocket Motor (ASRM), the Redesigned Solid Rocket Motor (RSRM), the Solid Rocket Motor Upgrade (SRMU), the Space Shuttle Main Engine (SSME), the Space Transportation Main Engine (STME), existing expendable launch vehicle engines, and liquid-oxygen/hydrocarbon engines. Consideration is given to systems that have been proposed to accomplish the national interests in relatively cost-effective ways, with the recognition that safety and reliability contribute to cost-effectiveness. Related resources, including technology, propulsion test facilities, and manufacturing capabilities, are also discussed.

    Model-Driven Engineering in the Large: Refactoring Techniques for Models and Model Transformation Systems

    Model-Driven Engineering (MDE) is a software engineering paradigm that aims to increase the productivity of developers by raising the abstraction level of software development. It envisions the use of models as key artifacts during design, implementation, and deployment. From the recent arrival of MDE in large-scale industrial software development – a trend we refer to as MDE in the large – a set of challenges emerges. First, models are now developed at distributed locations, by teams of teams. In such highly collaborative settings, the presence of large monolithic models gives rise to certain issues, such as their proneness to editing conflicts. Second, in large-scale system development, models are created using various domain-specific modeling languages. Combining these models in a disciplined manner calls for adequate modularization mechanisms. Third, the development of models is handled systematically by expressing the involved operations using model transformation rules. Such rules are often created by cloning, a practice related to performance and maintainability issues. In this thesis, we contribute three refactoring techniques, each aiming to tackle one of these challenges. First, we propose a technique to split a large monolithic model into a set of sub-models. The aim of this technique is to enable a separation of concerns within models, promoting a concern-based collaboration style: collaborators operate on the sub-models relevant for their task at hand. Second, we suggest a technique to encapsulate model components by introducing modular interfaces in a set of related models. The goal of this technique is to establish modularity in these models. Third, we introduce a refactoring to merge a set of model transformation rules exhibiting a high degree of similarity. The aim of this technique is to improve maintainability and performance by eliminating the drawbacks associated with cloning. The refactoring creates variability-based rules, a novel type of rule that captures variability by means of annotations. The refactoring techniques contributed in this work reduce the manual effort during the refactoring of models and transformation rules to a large extent. As indicated in a series of realistic case studies, the output produced by the techniques is comparable or, in the case of transformation rules, partly even preferable to the result of manual refactoring, yielding a promising outlook on their applicability in real-world settings.
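
    The idea behind variability-based rules can be illustrated with a small, language-agnostic sketch (not taken from the thesis): clone-like transformation rules are folded into a single rule whose elements carry variant annotations, and binding a variant name recovers the corresponding flat rule. The class names, the pullUpAttribute example, and the variant labels are invented for illustration.

        from dataclasses import dataclass, field

        @dataclass
        class Element:
            """One action of a transformation rule; an empty variant set means common to all variants."""
            action: str
            variants: frozenset = frozenset()

        @dataclass
        class VariabilityRule:
            """A merged rule consisting of common and variant-annotated elements."""
            name: str
            elements: list = field(default_factory=list)

            def flatten(self, variant):
                """Recover the flat (clone-like) rule for a single variant."""
                return [e.action for e in self.elements if not e.variants or variant in e.variants]

        # Hypothetical merge of two similar rules into one variability-based rule.
        rule = VariabilityRule("pullUpAttribute", [
            Element("match subclass with attribute a"),                      # shared
            Element("delete a from subclass"),                               # shared
            Element("create a in superclass", frozenset({"toSuperclass"})),  # variant-specific
            Element("create a in interface", frozenset({"toInterface"})),    # variant-specific
        ])
        print(rule.flatten("toSuperclass"))
        print(rule.flatten("toInterface"))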

    Introducing mobile edge computing capabilities through distributed 5G Cloud Enabled Small Cells

    Current trends in broadband mobile networks point towards placing different capabilities at the edge of the mobile network in a centralised way. On the one hand, the split of the eNB between baseband processing units and remote radio heads makes it possible to process some of the protocols in centralised premises, likely with virtualised resources. On the other hand, mobile edge computing makes use of processing and storage capabilities close to the air interface in order to deploy optimised services with minimum delay. The confluence of both trends is a hot topic in the definition of future 5G networks. The full centralisation of both technologies in cloud data centres imposes stringent requirements on the fronthaul connections in terms of throughput and latency; cells with limited network access would therefore not be able to offer these types of services. This paper proposes a solution for these cases, based on the placement of processing and storage capabilities close to the remote units, which is especially well suited for the deployment of clusters of small cells. The proposed cloud-enabled small cells include a highly efficient microserver with a limited set of virtualised resources offered to the cluster of small cells. As a result, a light data centre is created and commonly used for deploying centralised eNB and mobile edge computing functionalities. The paper covers the proposed architecture, with special focus on the integration of both aspects, and possible scenarios of application.
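
    As a rough illustration of the placement logic implied above (not taken from the paper), the sketch below decides whether a virtualised function can remain in the central cloud or should run on the local microserver of a cloud-enabled small-cell cluster, based on assumed fronthaul latency and throughput limits. All thresholds and field names are invented for the example.

        from dataclasses import dataclass

        @dataclass
        class Fronthaul:
            latency_ms: float       # round-trip latency towards the central data centre
            throughput_gbps: float  # available fronthaul capacity

        def place_function(required_latency_ms, required_gbps, fronthaul):
            """Return where a virtualised function should run for this small-cell cluster."""
            if fronthaul.latency_ms <= required_latency_ms and fronthaul.throughput_gbps >= required_gbps:
                return "central cloud data centre"
            return "local microserver (cloud-enabled small cell)"

        # Hypothetical cluster with limited fronthaul: an edge service needing 5 ms and 1 Gbps.
        print(place_function(5.0, 1.0, Fronthaul(latency_ms=12.0, throughput_gbps=0.6)))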