
    A repository for integration of software artifacts with dependency resolution and federation support

    While developing new IT products, reusing existing components is a key factor that can considerably improve the success rate. This has become even more important with the rise of the open source paradigm. However, integrating different products and technologies is not always an easy task. Different communities employ different standards and tools, and it is often unclear which dependencies a particular piece of software has. This is exacerbated by the transitive nature of these dependencies, making component integration a complicated affair. To help reduce this complexity, we propose a model-based repository capable of automatically resolving the required dependencies. This repository needs to be expandable, so that new constraints can be analyzed, and it must also have federation support for integration with other sources of artifacts. The solution we propose achieves this by working with OSGi components and using OSGi itself.
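    The transitive dependency problem the abstract describes can be illustrated with a minimal sketch. This is not the repository's actual API; the function and graph below are purely hypothetical, showing only why direct dependencies are insufficient and the full transitive closure must be computed.

    ```python
    # Hypothetical sketch of transitive dependency resolution over a
    # component graph; names and structure are illustrative only.
    from collections import deque

    def resolve(component, dependencies):
        """Return the transitive closure of a component's dependencies.

        `dependencies` maps a component name to the names of the
        components it directly requires.
        """
        resolved, queue = set(), deque([component])
        while queue:
            current = queue.popleft()
            for dep in dependencies.get(current, []):
                if dep not in resolved:
                    resolved.add(dep)
                    queue.append(dep)
        return resolved

    # "app" declares only two dependencies, but transitively needs three.
    deps = {
        "app":     ["logging", "http"],
        "http":    ["sockets"],
        "sockets": [],
        "logging": [],
    }
    print(sorted(resolve("app", deps)))  # ['http', 'logging', 'sockets']
    ```

    A federated repository would additionally consult external artifact sources for each unresolved name instead of a single in-memory map.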

    A Model-based Repository for Open Source Service and Component Integration.

    Open source is a software development paradigm that has seen a huge rise in recent years. It reduces IT costs and time to market while increasing security and reliability. However, the difficulty of integrating developments from different communities and stakeholders prevents this model from reaching its full potential. This is mainly due to the challenge of determining and locating the correct dependencies for a given software artifact. To solve this problem, we propose the development of an extensible, model-based software component repository. This repository should be capable of resolving the dependencies between several components and of working with existing repositories to access the needed artifacts transparently. It will also be easily expandable, enabling the creation of modules that support new kinds of dependencies or other existing repository technologies. The proposed solution will work with OSGi components and use OSGi itself.

    Effecting Data Quality Through Data Governance: a Case Study in the Financial Services Industry

    One of the most significant challenges faced by senior management today is implementing a data governance program to ensure that data is an asset to an organization's mission. New legislation, continual regulatory oversight, growing data volumes, and the desire to improve data quality for decision making are the driving forces behind data governance initiatives. Data governance involves reshaping existing processes and the way people view data, along with the information technology required to create consistent, secure, and well-defined processes for handling the quality of an organization's data. In examining attempts to make data an asset in organizations, the term data governance helps to conceptualize the break with existing ad hoc, siloed, and improper data management practices. This research considers a case study of a large financial services company to examine data governance policies and procedures. It seeks to bring information to bear on the drivers of data governance, the processes used to ensure data quality, the technologies and people involved in those processes, and the use of data governance in decision making. This research also addresses some core questions surrounding data governance, such as the viability of a golden source record, ownership of and responsibility for data, and the optimum placement of a data governance department. The findings provide a model for financial services companies hoping to take the initial steps towards better data quality and, ultimately, a data governance capability.

    Towards Multilingual Programming Environments

    Software projects consist of different kinds of artifacts: build files, configuration files, markup files, source code in different software languages, and so on. At the same time, however, most integrated development environments (IDEs) are focused on a single (programming) language. Even if a programming environment supports multiple languages (e.g., Eclipse), IDE features such as cross-referencing, refactoring, or debugging do not often cross language boundaries. What would it mean for a programming environment to be truly multilingual? In this short paper we sketch a vision of a system that integrates IDE support across language boundaries. We propose to build this system on a foundation of unified source code models and metaprogramming. Nevertheless, a number of important and hard research questions still need to be addressed.
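    The idea of a unified source code model can be sketched in miniature. This is a toy illustration, not the paper's actual design: the node shape and field names are invented here, showing only how program elements from different languages might share one representation so that an IDE feature such as "find references" can cross language boundaries.

    ```python
    # Toy unified source model: elements from different languages share
    # one node shape, so cross-language links can be walked uniformly.
    # All names here are illustrative assumptions, not the paper's API.
    from dataclasses import dataclass, field

    @dataclass
    class Node:
        language: str   # e.g. "java", "xml", "sql"
        kind: str       # e.g. "method", "build-target"
        name: str
        references: list = field(default_factory=list)  # cross-language links

    # A Java method that references a target in an XML build file.
    build_target = Node("xml", "build-target", "compile")
    java_method = Node("java", "method", "Main.main",
                       references=[build_target])

    # "Find references" now crosses the Java/XML boundary by walking
    # the shared model rather than per-language indexes.
    print(java_method.references[0].language)  # xml
    ```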

    Gateway Modeling and Simulation Plan

    This plan institutes direction across the Gateway Program and the Element Projects to ensure that Cross Program M&S are produced in a manner that (1) generates the artifacts required for NASA-STD-7009 compliance, (2) ensures interoperability of M&S exchanged and integrated across the program, and (3) drives integrated development efforts to provide cross-domain integrated simulation of the Gateway elements, space environment, and operational scenarios. This direction is flowed down via contractual enforcement to prime contractors and includes both the GMS requirements specified in this plan and the NASA-STD-7009 derived requirements necessary for compliance. Grounding principles for management of Gateway Models and Simulations (M&S) are derived from the Columbia Accident Investigation Board (CAIB) report and the Diaz team report, A Renewed Commitment to Excellence. As an outcome of these reports, and in response to Action 4 of the Diaz team report, the NASA Standard for Models and Simulations, NASA-STD-7009, was developed. The standard establishes M&S requirements for development and use activities to ensure proper capture and communication of M&S pedigree and credibility information to Gateway program decision makers. Through the course of the Gateway program life cycle, M&S will be heavily relied upon to conduct analysis, test products, support operations activities, enable informed decision making, and ultimately to certify the Gateway with an acceptable level of risk to crew and mission. To reduce risk associated with M&S-influenced decisions, this plan applies the NASA-STD-7009 requirements to produce the artifacts that support credibility assessments and ensure the information is communicated to program management.

    Identifying the Presence of Known Vulnerabilities in the Versions of a Software Project

    As the world continues to embrace a completely digital society in all aspects of life, the ever-present threat of a security flaw in a software system looms. Especially with a stream of high-profile security flaws and breaches, the public is more aware of the risk now than ever before. However, the reality of any software project is that there are engineering concerns of the utmost importance that all demand simultaneous attention. To balance and manage these challenges, software engineering has developed patterns of industry activities and best practices. Yet even as engineers rely on these practices to stay afloat, managing security can become elusive in a tangled mess of complex relationships between systems. Modern software projects rely upon other software to do their job; only the most niche and specialized software lives in isolation in today's industry. In this work, we present an approach to help alleviate one of the aspects of actively managing security in a software project. The objectives of this approach are 1) to establish the presence of a known vulnerability in a software project version and 2) to develop a set of versions of a software project which identify vulnerability status. We tested the approach on three Apache Software Foundation projects, for a total of eleven vulnerabilities tested. In the analysis of the results, we find that the approach is conservative in marking a particular version not vulnerable, but when it does so, it is completely consistent with the evaluation results. This conservative nature is a beneficial characteristic of the approach when considering the context of software security in which it operates.
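    The simplest baseline for the first objective, classifying a version against a known vulnerability, is a range check on version identifiers. The sketch below is a hypothetical illustration of that baseline, not the paper's actual technique, which works on the project versions themselves; the function names and range convention (inclusive introduction, exclusive fix) are assumptions made here.

    ```python
    # Illustrative baseline (not the paper's method): flag a version
    # as vulnerable when it falls inside a known affected range,
    # [introduced, fixed), using a simple x.y.z comparison.
    def parse(version):
        """Turn 'x.y.z' into a tuple of ints for lexicographic comparison."""
        return tuple(int(part) for part in version.split("."))

    def status(version, introduced, fixed):
        """Classify one version against one known vulnerability."""
        v = parse(version)
        if parse(introduced) <= v < parse(fixed):
            return "vulnerable"
        return "not vulnerable"

    print(status("2.4.1", introduced="2.0.0", fixed="2.5.0"))  # vulnerable
    ```

    Such metadata-only checks are exactly where false negatives arise (e.g., a backported fix), which is why a conservative, evaluation-driven marking of "not vulnerable" is valuable.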

    Automated tools and techniques for distributed Grid Software: Development of the testbed infrastructure

    Grid technology is becoming more and more important as the new paradigm for sharing computational resources across different organizations in a secure way. The great power of this solution requires the definition of a generic stack of services and protocols, and this is the scope of the different Grid initiatives. As a result of international collaborations for its development, the Open Grid Forum created the Open Grid Services Architecture (OGSA), which aims to define the common set of services that will enable interoperability across the different implementations. This master's thesis has been developed in this framework, as part of the two European-funded projects ETICS and OMII-Europe. The main objective is to contribute to the design and maintenance of large distributed development projects with automated tools that enable the implementation of Software Engineering techniques aimed at achieving an acceptable level of quality in the release process. Specifically, this thesis develops the testbed concept as the virtual, production-like scenario in which to perform compliance tests. As a proof of concept, the OGSA Basic Execution Service has been chosen in order to implement and execute conformance tests within the ETICS automated testbed framework.