
    Quality-aware model-driven service engineering

    Service engineering and service-oriented architecture, as an integration and platform technology, form a recent approach to software systems integration. Quality aspects ranging from interoperability to maintainability to performance are of central importance for the integration of heterogeneous, distributed service-based systems. Architecture models can substantially influence the quality attributes of the implemented software systems. Besides the benefits of explicit architectures for maintainability and reuse, architectural constraints such as styles, reference architectures and architectural patterns can influence observable software properties such as performance. Empirical performance evaluation is the process of measuring and evaluating the performance of implemented software. We present an approach for addressing the quality of services and service-based systems at the model level, in the context of model-driven service engineering. The focus on architecture-level models is a consequence of the black-box character of services.

    Model-driven performance evaluation for service engineering

    Service engineering and service-oriented architecture, as an integration and platform technology, form a recent approach to software systems integration. Software quality aspects such as performance are of central importance for the integration of heterogeneous, distributed service-based systems. Empirical performance evaluation is the process of measuring and calculating performance metrics of the implemented software. We present an approach for the empirical, model-based performance evaluation of services and service compositions in the context of model-driven service engineering. Temporal database theory is utilised for the empirical performance evaluation of service systems developed in a model-driven way.
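    The abstract leaves the measurement machinery unspecified; the sketch below shows one minimal way to collect timestamped service-call measurements (the kind of interval-stamped records temporal database theory reasons about) and derive performance metrics from them. The service stub and the choice of metrics are illustrative assumptions, not the paper's implementation.

    ```python
    import time
    import statistics
    from dataclasses import dataclass

    @dataclass
    class Measurement:
        # Valid-time interval of one observation, in the spirit of
        # temporal databases: when the call started and when it ended.
        start: float  # wall-clock time at invocation (seconds)
        end: float    # wall-clock time at return (seconds)

        @property
        def duration(self) -> float:
            return self.end - self.start

    def measure(service, *args) -> Measurement:
        """Invoke a service operation and record its timestamped duration."""
        t0 = time.time()
        service(*args)
        t1 = time.time()
        return Measurement(start=t0, end=t1)

    # Hypothetical stub standing in for a remote service invocation.
    def example_service(payload):
        time.sleep(0.01)

    history = [measure(example_service, {"id": i}) for i in range(100)]
    durations = [m.duration for m in history]
    print(f"mean latency: {statistics.mean(durations):.4f} s")
    print(f"p95 latency:  {statistics.quantiles(durations, n=20)[-1]:.4f} s")
    ```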

    Database Systems - Present and Future

    Database systems nowadays play an increasingly important role in the knowledge-based society, in which computers have penetrated all fields of activity and the Internet continues to spread worldwide. In the current informatics context, developing applications with databases is the work of specialists. Using databases, accessing a database from various applications, and a number of related concepts have become accessible to all categories of IT users. This paper aims to summarize the curricular area covering the fundamental database systems issues that are necessary to train specialists in economic informatics higher education. Database systems integrate and interact with several informatics technologies and are therefore more difficult to understand and use. Thus, students should already know a minimum set of mandatory concepts and their practical implementation: computer systems, programming techniques, programming languages, and data structures. The article also presents the current trends in the evolution of database systems in the context of economic informatics.
    Keywords: database systems (DBS), database management systems (DBMS), databases (DB), programming languages, data models, database design, relational databases, object-oriented systems, distributed systems, advanced database systems
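    As a concrete anchor for the relational concepts listed among the prerequisites, here is a minimal, self-contained example using Python's built-in sqlite3 module; the schema and data are invented purely for illustration.

    ```python
    import sqlite3

    # In-memory relational database illustrating a schema, keys and a join.
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE student (
            id   INTEGER PRIMARY KEY,
            name TEXT NOT NULL
        );
        CREATE TABLE enrolment (
            student_id INTEGER REFERENCES student(id),
            course     TEXT NOT NULL
        );
    """)
    con.execute("INSERT INTO student VALUES (1, 'Ana')")
    con.execute("INSERT INTO enrolment VALUES (1, 'Economic Informatics')")

    # A join reconstructs the relationship between the two relations.
    for row in con.execute("""
            SELECT s.name, e.course
            FROM student s JOIN enrolment e ON e.student_id = s.id"""):
        print(row)  # ('Ana', 'Economic Informatics')
    ```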

    Criteria for the Diploma qualifications in information technology at levels 1, 2 and 3


    CMS Data Analysis: Current Status and Future Strategy

    We present the current status of the CMS data analysis architecture and describe work on future Grid-based distributed analysis prototypes. CMS has two main software frameworks related to data analysis: COBRA, the main framework, and IGUANA, the interactive visualisation framework. Software using these frameworks is used today in the world-wide production and analysis of CMS data. We describe their overall design and present examples of their current use, with emphasis on interactive analysis. CMS is currently developing remote analysis prototypes, including one based on Clarens, a Grid-enabled client-server tool. Use of the prototypes by CMS physicists will guide us in forming a Grid-enriched analysis strategy. The status of this work is presented, as is an outline of how we plan to leverage the power of our existing frameworks in the migration of CMS software to the Grid.
    Comment: 4 pages, 3 figures, contribution to the CHEP'03 conference
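    The abstract names Clarens as a Grid-enabled client-server tool without detailing its interface; the sketch below only illustrates the general client-server pattern such a tool enables, using Python's standard XML-RPC modules. The port, method name and payload are hypothetical, not Clarens's actual API.

    ```python
    import threading
    import xmlrpc.client
    from xmlrpc.server import SimpleXMLRPCServer

    # --- Server side: stands in for a Grid-enabled analysis service. ---
    def run_analysis(name, selection):
        # A real service would dispatch the request to the experiment's
        # analysis framework; here we just return a canned summary.
        return {"analysis": name,
                "events": selection.get("max_events", 0),
                "status": "done"}

    server = SimpleXMLRPCServer(("localhost", 8080), logRequests=False)
    server.register_function(run_analysis)
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # --- Client side: a physicist's session talking to the remote service,
    # receiving a summary instead of shipping raw events to the desktop. ---
    proxy = xmlrpc.client.ServerProxy("http://localhost:8080")
    print(proxy.run_analysis("dimuon_mass",
                             {"dataset": "example", "max_events": 1000}))
    ```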

    Kernel architecture for CAD/CAM in shipbuilding environments

    The capabilities of complex software products such as CAD/CAM systems are strongly supported by basic information technologies related to data management, visualization, communication and geometry modelling, and others related to the development process. These basic information technologies are involved in a continuous evolution process, but over recent years this evolution has been dramatic. The main reason for this has been that new hardware capabilities (including graphics cards) are available at very low cost, but a contributing factor has also been the evolution of the prices of basic software. To take advantage of these new features, existing CAD/CAM systems must undergo a complete and drastic redesign. This process is complicated but strategic for the future evolution of a system. There are several examples in the market of how a bad decision has led to a cul-de-sac (both technically and commercially). This paper describes what the authors consider to be the basic architectural components of a kernel for a CAD/CAM system oriented to shipbuilding. The proposed solution is a combination of in-house developed frameworks together with commercial products that are accepted as standard components. The proportion of in-house frameworks within this combination of products is a key factor, especially when considering CAD/CAM systems oriented to shipbuilding. General-purpose CAD/CAM systems are mainly oriented to the mechanical CAD market, and for this reason several basic products exist devoted to geometry modelling in that context. But these basic products are not well suited to the very specific geometry modelling requirements of a CAD/CAM system oriented to shipbuilding. The complexity of the ship model, the different model requirements throughout its short and changing life cycle, and the many different disciplines involved in the process are reasons for this inadequacy. Apart from these basic frameworks, specific shipbuilding frameworks are also required. This second layer is built over the basic technology components mentioned above. This paper describes in detail the technological frameworks which have been used to develop the latest FORAN version.
    Postprint (published version)
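    To make the two-layer idea concrete, here is a minimal sketch of a shipbuilding-specific framework built over an interchangeable basic geometry component, as the abstract describes; all class and method names are invented for illustration and do not reflect FORAN's actual design.

    ```python
    from abc import ABC, abstractmethod

    # Basic-technology layer: the interface a geometry kernel must provide.
    # Either a commercial component or an in-house framework can implement it.
    class GeometryKernel(ABC):
        @abstractmethod
        def loft_surface(self, sections: list) -> str: ...

    class InHouseKernel(GeometryKernel):
        def loft_surface(self, sections: list) -> str:
            # Placeholder for ship-specific surface lofting.
            return f"surface through {len(sections)} sections"

    # Shipbuilding-specific layer: built over the basic layer rather than on
    # a particular vendor product, so the underlying kernel can be swapped.
    class HullFormModeller:
        def __init__(self, kernel: GeometryKernel):
            self.kernel = kernel

        def fair_hull(self, frames: list) -> str:
            return self.kernel.loft_surface(frames)

    print(HullFormModeller(InHouseKernel()).fair_hull(["frame0", "frame1", "frame2"]))
    ```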

    ATLAS Data Challenge 1

    In 2002 the ATLAS experiment started a series of Data Challenges (DC) whose goals are the validation of the Computing Model, of the complete software suite and of the data model, and to ensure the correctness of the technical choices to be made. A major feature of the first Data Challenge (DC1) was the preparation and deployment of the software required for the production of large event samples for the High Level Trigger (HLT) and physics communities, and the production of those samples as a world-wide distributed activity. The first phase of DC1 was run during summer 2002 and involved 39 institutes in 18 countries. More than 10 million physics events and 30 million single-particle events were fully simulated. Over a period of about 40 calendar days, 71,000 CPU-days were used, producing 30 TB of data in about 35,000 partitions. In the second phase the next processing step was performed with the participation of 56 institutes in 21 countries (~4,000 processors used in parallel). The basic elements of the ATLAS Monte Carlo production system are described. We also present how the software suite was validated and how the participating sites were certified. These productions were already partly performed using different flavours of Grid middleware at ~20 sites.
    Comment: 10 pages; 3 figures; CHEP03 Conference, San Diego; Reference MOCT00
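    The quoted production figures imply some useful derived quantities; the short calculation below only restates the abstract's own numbers, no new data.

    ```python
    # Scale of DC1 phase 1, derived from the figures quoted in the abstract.
    cpu_days = 71_000
    calendar_days = 40
    data_tb = 30
    partitions = 35_000

    avg_parallel_cpus = cpu_days / calendar_days    # ~1775 CPUs busy on average
    avg_partition_gb = data_tb * 1024 / partitions  # ~0.88 GB per partition

    print(f"average concurrent CPUs: {avg_parallel_cpus:.0f}")
    print(f"average partition size:  {avg_partition_gb:.2f} GB")
    ```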