4,251 research outputs found

    Identifying Redundancies and Gaps Across Testing Levels During Verification of Automotive Software

    Testing of automotive systems usually follows the V-Model, a process where sequential testing activities progress from low-level code structures to high-level integrated systems. In theory, the V-Model should reduce redundant testing and prevent gaps in verification. To assess whether these benefits translate into practice, we developed a framework, in a case study at Scania CV AB, to identify redundancies and gaps in test cases across V-Model test levels. Our framework identified both redundancies and gaps in Scania's scripted testing efforts. Deviating cases were also identified where, e.g., requirements were outdated or contained incorrect details. Factors contributing to redundancy include re-verification in a new context, difficulties mapping requirements across levels, and a lack of test case documentation. Both redundancies and gaps result from a lack of communication and traceability of test results across test levels. We recommend active collaboration across levels, as well as the use of coverage matrices, to alleviate these issues. We offer our framework to help refine testing practices and to inspire process improvements.
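
    The abstract does not describe the framework's implementation. As a rough, hypothetical sketch of the underlying idea (all test-case and requirement IDs below are invented), one can cross-reference test cases against requirement IDs per test level and flag requirements verified at several levels (candidate redundancies) or at none (gaps):

```python
from collections import defaultdict

# Hypothetical mapping: test level -> list of (test_case_id, verified requirement IDs)
test_cases = {
    "unit":        [("TC-U1", {"REQ-1", "REQ-2"})],
    "integration": [("TC-I1", {"REQ-2"}), ("TC-I2", {"REQ-3"})],
    "system":      [("TC-S1", {"REQ-2", "REQ-4"})],
}
all_requirements = {"REQ-1", "REQ-2", "REQ-3", "REQ-4", "REQ-5"}

# Coverage matrix: requirement -> set of test levels that verify it
coverage = defaultdict(set)
for level, cases in test_cases.items():
    for _, reqs in cases:
        for req in reqs:
            coverage[req].add(level)

# Requirements verified at several levels are candidate redundancies;
# requirements verified at no level are verification gaps.
redundancies = {req: levels for req, levels in coverage.items() if len(levels) > 1}
gaps = all_requirements - coverage.keys()

print("Candidate redundancies:", redundancies)  # e.g. REQ-2 covered at three levels
print("Verification gaps:", sorted(gaps))       # e.g. REQ-5 uncovered at all levels
```

    As the abstract itself notes, an overlap flagged this way is not automatically waste: re-verification in a new context may be intentional, so flagged cases still need manual review.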

    Research and Education in Computational Science and Engineering

    Over the past two decades the field of computational science and engineering (CSE) has penetrated both basic and applied research in academia, industry, and laboratories to advance discovery, optimize systems, support decision-makers, and educate the scientific and engineering workforce. Informed by centuries of theory and experiment, CSE performs computational experiments to answer questions that neither theory nor experiment alone is equipped to answer. CSE provides scientists and engineers of all persuasions with algorithmic inventions and software systems that transcend disciplines and scales. Carried on a wave of digital technology, CSE brings the power of parallelism to bear on troves of data. Mathematics-based advanced computing has become a prevalent means of discovery and innovation in essentially all areas of science, engineering, technology, and society; and the CSE community is at the core of this transformation. However, a combination of disruptive developments---including the architectural complexity of extreme-scale computing, the data revolution that engulfs the planet, and the specialization required to follow the applications to new frontiers---is redefining the scope and reach of the CSE endeavor. This report describes the rapid expansion of CSE and the challenges to sustaining its bold advances. The report also presents strategies and directions for CSE research and education for the next decade. Comment: Major revision, to appear in SIAM Review.

    COLLABORATIVE TESTING ACROSS SHARED SOFTWARE COMPONENTS

    Large component-based systems are often built from many of the same components. As individual component-based software systems are developed, tested and maintained, these shared components are repeatedly manipulated. As a result, there are often significant overlaps and synergies across and among the different test efforts of different component-based systems. However, in practice, testers of different systems rarely collaborate, taking a test-all-by-yourself approach. As a result, redundant effort is spent testing common components, and important information that could be used to improve testing quality is lost. The goal of this research is to demonstrate that, if done properly, testers of shared software components can save effort by avoiding redundant work, and can improve the test effectiveness for each component as well as for each component-based software system by using information obtained when testing across multiple components. To achieve this goal, I have developed collaborative testing techniques and tools for developers and testers of component-based systems with shared components, applied the techniques to subject systems, and evaluated the cost and effectiveness of applying the techniques. The dissertation research is organized into three parts. First, I investigated current testing practices for component-based software systems to find the testing overlap and synergy we conjectured exists. Second, I designed and implemented infrastructure and related tools to facilitate communication and data sharing between testers. Third, I designed two testing processes to implement different collaborative testing algorithms and applied them to large, actively developed software systems. This dissertation has shown the benefits of collaborative testing across component developers who share their components. With collaborative testing, researchers can design algorithms and tools to support collaboration processes, achieve better efficiency in testing configurations, and discover inter-component compatibility faults within a minimal time window after they are introduced.
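
    The dissertation's actual collaborative-testing algorithms are not reproduced in the abstract. As a minimal, hypothetical sketch of the core idea (all names invented), testers of different systems could record outcomes for already-tested component configurations in a shared registry and skip configurations another team has covered:

```python
from itertools import product

# Shared across teams: component configuration -> recorded outcome
shared_results: dict[tuple, str] = {}

def run_test(config: tuple) -> str:
    # Placeholder for actually exercising one configuration of shared components.
    return "pass"

def test_system(team: str, component_versions: dict[str, list[str]]) -> None:
    # Enumerate this system's configurations, but reuse results other teams produced.
    for config in product(*component_versions.values()):
        if config in shared_results:
            print(f"{team}: reusing result for {config}: {shared_results[config]}")
            continue
        shared_results[config] = run_test(config)
        print(f"{team}: tested {config} -> {shared_results[config]}")

test_system("team_A", {"parser": ["1.0", "1.1"], "network": ["2.0"]})
test_system("team_B", {"parser": ["1.1"], "network": ["2.0", "2.1"]})
```

    A scheme like this only saves effort if the teams agree on how a configuration is identified and where results live; the infrastructure for communication and data sharing described in the abstract addresses that coordination problem.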

    Rule-Based System Architecting of Earth Observing Systems: Earth Science Decadal Survey

    This paper presents a methodology to explore the architectural trade space of Earth observing satellite systems, and applies it to the Earth Science Decadal Survey. The architecting problem is formulated as a combinatorial optimization problem with three sets of architectural decisions: instrument selection, assignment of instruments to satellites, and mission scheduling. A computational tool was created to automatically synthesize architectures based on valid combinations of options for these three decisions and evaluate them according to several figures of merit, including satisfaction of program requirements, data continuity, affordability, and proxies for fairness, technical risk, and programmatic risk. A population-based heuristic search algorithm is used to search the trade space. The novelty of the tool is that it uses a rule-based expert system to model the knowledge-intensive components of the problem, such as scientific requirements, and to capture the nonlinear positive and negative interactions between instruments (synergies and interferences), which drive both requirement satisfaction and cost. The tool is first demonstrated on the past NASA Earth Observing System program and then applied to the Decadal Survey. Results suggest that the Decadal Survey architecture is dominated by other more distributed architectures in which DESDYNI and CLARREO are consistently broken down into individual instruments. Funding: "La Caixa" Foundation, Charles Stark Draper Laboratory, Goddard Space Flight Center.
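
    The paper's rule-based evaluation and its actual figures of merit are not reproduced here. A toy sketch of the architecture encoding and of a population-based heuristic search (all instrument names, scores, and weights below are invented stand-ins) might look like:

```python
import random

INSTRUMENTS = ["radar", "lidar", "radiometer", "spectrometer"]  # hypothetical

def random_architecture():
    # Three decisions: which instruments fly, which satellite hosts each selected
    # instrument, and a launch year per satellite (toy stand-ins for the real model).
    selected = [i for i in INSTRUMENTS if random.random() < 0.7]
    assignment = {i: random.randint(0, 2) for i in selected}
    schedule = {sat: random.randint(2025, 2035) for sat in set(assignment.values())}
    return selected, assignment, schedule

def score(arch):
    # Crude single figure of merit: benefit proxy minus cost proxy. The actual tool
    # uses a rule-based expert system to model requirement satisfaction and the
    # synergies and interferences between instruments.
    selected, assignment, _schedule = arch
    benefit = len(selected)
    cost = 2.0 * len(set(assignment.values()))  # proxy for number of missions flown
    return benefit - cost

# Population-based heuristic search: keep the best half, refill with fresh samples.
population = [random_architecture() for _ in range(20)]
for _ in range(50):
    population.sort(key=score, reverse=True)
    population = population[:10] + [random_architecture() for _ in range(10)]

best = max(population, key=score)
print("Best architecture found:", best, "score:", score(best))
```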

    SPATIOTEMPORAL IMPACT OF PHAGE EXPOSURE ON BIOFILM SYSTEMS

    When single-celled prokaryotic organisms, among the simplest forms of life, develop complex emergent properties such as social cooperation, resource capture, and enhanced survivability, they overcome individual limitations that would otherwise constrain them. Emergent properties of biofilms such as matrix production, quorum sensing, and a coordinated lifecycle offer structural and functional advantages that make them highly successful at evading destruction by antimicrobials and immune defenses. With few, if any, novel antibiotics in the clinical pipeline, there is a resurgence of interest in alternatives such as phage therapy, the practice of using bacterial viruses, known as bacteriophages, that infect and lyse bacteria to treat infections. In this thesis, we explore the understudied impact of phage titer on biofilm dynamics and outcomes. We determined that the biofilm developmental stage at the time of phage addition modulates its response. These responses vary as a function of the phage dose and can be broadly organized into four distinct classes. In each of these classes, we observe that high phage doses restrain the biofilm from transitioning into the next stage of its developmental cycle. A paradoxical aspect of this result is that mature biofilms exposed to high phage titers are enhanced by phage treatment. Despite this apparently unwanted outcome, the inhibition of biofilm dispersion in phage-treated samples could potentially minimize the further spread of infection to other locations. These results comprehensively demonstrate that biofilm outcomes are predictable from phage dosage and biofilm age, and will provide guidance in advancing phage-based personalized medicine when generalized treatments fail. Collectively, this dissertation derives insights on the advantages and limitations of phages to inhibit, control, and eliminate biofilms. Ph.D.

    Advanced Knowledge Technologies at the Midterm: Tools and Methods for the Semantic Web

    The University of Edinburgh and research sponsors are authorised to reproduce and distribute reprints and on-line copies for their purposes notwithstanding any copyright annotation hereon. The views and conclusions contained herein are the author's and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of other parties.

    In a celebrated essay on the new electronic media, Marshall McLuhan wrote in 1962: "Our private senses are not closed systems but are endlessly translated into each other in that experience which we call consciousness. Our extended senses, tools, technologies, through the ages, have been closed systems incapable of interplay or collective awareness. Now, in the electric age, the very instantaneous nature of co-existence among our technological instruments has created a crisis quite new in human history. Our extended faculties and senses now constitute a single field of experience which demands that they become collectively conscious. Our technologies, like our private senses, now demand an interplay and ratio that makes rational co-existence possible. As long as our technologies were as slow as the wheel or the alphabet or money, the fact that they were separate, closed systems was socially and psychically supportable. This is not true now when sight and sound and movement are simultaneous and global in extent." (McLuhan 1962, p. 5, emphasis in original)

    Over forty years later, the seamless interplay that McLuhan demanded between our technologies is still barely visible. McLuhan's predictions of the spread, and increased importance, of electronic media have of course been borne out, and the worlds of business, science and knowledge storage and transfer have been revolutionised. Yet the integration of electronic systems as open systems remains in its infancy.

    Advanced Knowledge Technologies (AKT) aims to address this problem: to create a view of knowledge and its management across its lifecycle, and to research and create the services and technologies that such unification will require. Halfway through its six-year span, the results are beginning to come through, and this paper will explore some of the services, technologies and methodologies that have been developed. We hope to give a sense in this paper of the potential for the next three years, to discuss the insights and lessons learnt in the first phase of the project, and to articulate the challenges and issues that remain.

    The WWW provided the original context that made the AKT approach to knowledge management (KM) possible. AKT was initially proposed in 1999; it brought together an interdisciplinary consortium with the technological breadth and complementarity to create the conditions for a unified approach to knowledge across its lifecycle. The combination of this expertise, and the time and space afforded the consortium by the IRC structure, suggested the opportunity for a concerted effort to develop an approach to advanced knowledge technologies, based on the WWW as a basic infrastructure.

    The technological context of AKT altered for the better in the short period between the development of the proposal and the beginning of the project itself with the development of the semantic web (SW), which foresaw much more intelligent manipulation and querying of knowledge. The opportunities that the SW provided, for example for more intelligent retrieval, put AKT at the centre of information technology innovation and knowledge management services; the AKT skill set would clearly be central to the exploitation of those opportunities.

    The SW, as an extension of the WWW, provides an interesting set of constraints on the knowledge management services AKT tries to provide. As a medium for the semantically-informed coordination of information, it has suggested a number of ways in which the objectives of AKT can be achieved, most obviously through the provision of knowledge management services delivered over the web, as opposed to the creation and provision of technologies to manage knowledge.

    AKT is working on the assumption that many web services will be developed and provided for users. The KM problem in the near future will be one of deciding which services are needed and of coordinating them. Many of these services will be largely or entirely legacies of the WWW, and so the capabilities of the services will vary. As well as providing useful KM services in their own right, AKT will be aiming to exploit this opportunity by reasoning over services, brokering between them, and providing essential meta-services for SW knowledge service management.

    Ontologies will be a crucial tool for the SW. The AKT consortium brings a great deal of expertise on ontologies together, and ontologies were always going to be a key part of the strategy. All kinds of knowledge sharing and transfer activities will be mediated by ontologies, and ontology management will be an important enabling task. Different applications will need to cope with inconsistent ontologies, or with the problems that will follow the automatic creation of ontologies (e.g. the merging of pre-existing ontologies to create a third). Ontology mapping, and the elimination of conflicts of reference, will be important tasks. All of these issues are discussed along with our proposed technologies.

    Similarly, specifications of tasks will be used for the deployment of knowledge services over the SW, but in general it cannot be expected that in the medium term there will be standards for task (or service) specifications. The brokering meta-services that are envisaged will have to deal with this heterogeneity.

    The emerging picture of the SW is one of great opportunity, but it will not be a well-ordered, certain or consistent environment. It will comprise many repositories of legacy data, outdated and inconsistent stores, and requirements for common understandings across divergent formalisms. There is clearly a role for standards to play in bringing much of this context together, and AKT is playing a significant role in these efforts. But standards take time to emerge, they take political power to enforce, and they have been known to stifle innovation (in the short term). AKT is keen to understand the balance between principled inference and statistical processing of web content. Logical inference on the Web is tough: complex queries using traditional AI inference methods bring most distributed computer systems to their knees. Do we set up semantically well-behaved areas of the Web? Is any part of the Web in which semantic hygiene prevails interesting enough to reason in? These and many other questions need to be addressed if we are to provide effective knowledge technologies for our content on the web.
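
    As a small illustration of the ontology-mapping task mentioned above (toy ontologies and URIs, not AKT's actual technologies), two independently defined classes can be merged into one graph and declared equivalent; a reasoning layer is then needed before instances of one class are also recognised as instances of the other:

```python
from rdflib import Graph, Namespace, URIRef
from rdflib.namespace import OWL, RDF

# Two hypothetical ontologies that describe the same concept under different names.
A = Namespace("http://example.org/ontoA#")
B = Namespace("http://example.org/ontoB#")

g = Graph()
g.add((A.Person, RDF.type, OWL.Class))
g.add((B.Human, RDF.type, OWL.Class))
g.add((URIRef("http://example.org/alice"), RDF.type, A.Person))

# Mapping step: assert that the two classes refer to the same concept.
g.add((A.Person, OWL.equivalentClass, B.Human))

# Without a reasoner the equivalence is just a stored triple; an OWL reasoner (or a
# simple rule) is needed to conclude that alice is also a B.Human.
for s, _p, o in g.triples((None, OWL.equivalentClass, None)):
    print(f"Mapped {s} <-> {o}")
```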

    Research and Education in Computational Science and Engineering

    This report presents challenges, opportunities, and directions for computational science and engineering (CSE) research and education for the next decade. Over the past two decades the field of CSE has penetrated both basic and applied research in academia, industry, and laboratories to advance discovery, optimize systems, support decision-makers, and educate the scientific and engineering workforce. Informed by centuries of theory and experiment, CSE performs computational experiments to answer questions that neither theory nor experiment alone is equipped to answer. CSE provides scientists and engineers with algorithmic inventions and software systems that transcend disciplines and scales. CSE brings the power of parallelism to bear on troves of data. Mathematics-based advanced computing has become a prevalent means of discovery and innovation in essentially all areas of science, engineering, technology, and society, and the CSE community is at the core of this transformation. However, a combination of disruptive developments---including the architectural complexity of extreme-scale computing, the data revolution and increased attention to data-driven discovery, and the specialization required to follow the applications to new frontiers---is redefining the scope and reach of the CSE endeavor. With these many current and expanding opportunities for the CSE field, there is a growing demand for CSE graduates and a need to expand CSE educational offerings. This need includes CSE programs at both the undergraduate and graduate levels, as well as continuing education and professional development programs, exploiting the synergy between computational science and data science. Yet, as institutions consider new and evolving educational programs, it is essential to consider the broader research challenges and opportunities that provide the context for CSE education and workforce development.

    Creating New Ventures: A review and research agenda

    Creating new ventures is one of the most central topics in entrepreneurship and is a critical step from which many theories of management, organizational behavior, and strategic management build. Therefore, this review and proposed research agenda are relevant not only to entrepreneurship scholars but also to other management scholars who wish to challenge some of the implicit assumptions of their current streams of research and extend the boundaries of their current theories to earlier stages in the organization's life. Given that the last systematic review of the topic was published 16 years ago, and that the topic has evolved rapidly over this time, an overview and research outlook are long overdue. From our review, we inductively generated ten sub-topics: (1) Lead founder, (2) Founding team, (3) Social relationships, (4) Cognitions, (5) Emergent organizing, (6) New venture strategy, (7) Organizational emergence, (8) New venture legitimacy, (9) Founder exit, and (10) Entrepreneurial environment. These sub-topics are then organized into three major stages of the entrepreneurial process: co-creating, organizing, and performing. Taken together, the framework provides a cohesive story of the past and a road map for future research on creating new ventures, focusing on the links connecting these sub-topics.