    Testing real-time multi input-output systems

    In formal testing, the assumption of input enabling is typically made. This assumption requires all inputs to be enabled at all times. In addition, the useful concept of quiescence is sometimes applied. Briefly, a system is in a quiescent state when it cannot produce outputs. In this paper, we relax the input enabling assumption, and allow some input sets to be enabled while others remain disabled. Moreover, we also relax the general bound M used in timed systems to detect quiescence, and allow different bounds for different sets of outputs. By considering the tioco^M theory, an enriched theory for timed testing with repetitive quiescence, and allowing the partition of input sets and output sets, we introduce the mtioco^M relation. A test derivation procedure which is nondeterministic and parameterized is further developed, and shown to be sound and complete with respect to mtioco^M.

    Testing multi input-output real-time systems (Extended version)

    In formal testing, the assumption of input enabling is typically made. This assumption requires all inputs to be enabled at all times. In addition, the useful concept of quiescence is sometimes applied. Briefly, a system is in a quiescent state when it cannot produce outputs. In this paper, we relax the input enabling assumption, and allow some input sets to be enabled while others remain disabled. Moreover, we also relax the general bound M used in timed systems to detect quiescence, and allow different bounds for different sets of outputs. By considering the tioco^M theory, an enriched theory for timed testing with repetitive quiescence, and allowing the partition of input sets and output sets, we introduce the mtioco^M relation. A test derivation procedure which is nondeterministic and parameterized is further developed, and shown to be sound and complete with respect to mtioco^M.
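    As a rough, hedged illustration of the relaxed quiescence bounds described in these abstracts (all names below are hypothetical and not taken from the papers), a tester could keep a separate bound M_j per output set O_j and declare a set quiescent only if none of its outputs was observed within M_j time units of the last stimulus:

```python
# Hypothetical sketch: per-output-set quiescence detection with individual
# bounds M_j, loosely following the idea of relaxing the single bound M.
from dataclasses import dataclass

@dataclass
class Observation:
    time: float    # time since the last input was applied
    output: str    # output action observed

def quiescent_sets(observations, output_sets, bounds, elapsed):
    """Return the names of the output sets currently judged quiescent.

    output_sets: dict mapping set name -> collection of output actions
    bounds:      dict mapping set name -> bound M_j for that set
    elapsed:     time elapsed since the last input was applied
    """
    result = set()
    for name, actions in output_sets.items():
        m_j = bounds[name]
        # Only decide once the bound for this set has expired.
        if elapsed < m_j:
            continue
        produced = any(o.output in actions and o.time <= m_j for o in observations)
        if not produced:
            result.add(name)
    return result

# Example: two output sets with different quiescence bounds.
obs = [Observation(0.4, "ack")]
sets = {"ctrl": {"ack", "nack"}, "data": {"frame"}}
m = {"ctrl": 1.0, "data": 5.0}
print(quiescent_sets(obs, sets, m, elapsed=6.0))  # {'data'}
```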

    Test Derivation from Timed Automata

    A real-time system is a discrete system whose state changes occur in real-numbered time [AH97]. For testing real-time systems, specification languages must be extended with constructs for expressing real-time constraints, the implementation relation must be generalized to consider the temporal dimension, and the data structures and algorithms used to generate tests must be revised to operate on a potentially infinite set of states.

    Work-in-progress Assume-guarantee reasoning with ioco

    This paper presents a combination of the assume-guarantee paradigm and the testing relation ioco. The assume-guarantee paradigm is a "divide and conquer" technique that decomposes the verification of a system into smaller tasks that involve the verification of its components. The principal aspect of assume-guarantee reasoning is to consider each component separately, while taking into account assumptions about the context of the component. The testing relation ioco is a formal conformance relation for model-based testing that works on labeled transition systems. Our main result shows that, with certain restrictions, assume-guarantee reasoning can be applied in the context of ioco. This enables testing ioco-conformance of a system by testing its components separately.
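    To make the ioco relation itself concrete, here is a minimal, hypothetical sketch (not the construction used in the paper; it ignores internal actions and quiescence and bounds the trace depth): for every trace of the specification, the outputs the implementation can produce after that trace must be among the outputs the specification allows.

```python
# Minimal, hypothetical ioco-style check for finite LTSs represented as
# dicts state -> list of (action, next_state); illustrative only.
def after(lts, start, trace):
    """Set of states reachable from `start` by the given trace."""
    states = {start}
    for action in trace:
        states = {t for s in states for (a, t) in lts.get(s, []) if a == action}
    return states

def out(lts, states, outputs):
    """Output actions enabled in any of the given states."""
    return {a for s in states for (a, _) in lts.get(s, []) if a in outputs}

def ioco_bounded(impl, spec, start_i, start_s, outputs, depth):
    """Check out(impl after t) <= out(spec after t) for spec traces up to `depth`."""
    frontier = [()]
    for _ in range(depth + 1):
        next_frontier = []
        for trace in frontier:
            spec_states = after(spec, start_s, trace)
            if not spec_states:
                continue
            impl_states = after(impl, start_i, trace)
            if not out(impl, impl_states, outputs) <= out(spec, spec_states, outputs):
                return False, trace
            # Extend the trace with every action the specification allows next.
            for s in spec_states:
                for (a, _) in spec.get(s, []):
                    next_frontier.append(trace + (a,))
        frontier = next_frontier
    return True, None

# Toy example: spec allows only output "ok" after input "req"; impl also emits "err".
spec = {0: [("req", 1)], 1: [("ok", 2)]}
impl = {0: [("req", 1)], 1: [("ok", 2), ("err", 2)]}
print(ioco_bounded(impl, spec, 0, 0, outputs={"ok", "err"}, depth=3))
# (False, ('req',))
```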

    A Semantic Framework for Test Coverage (Extended Version)

    Since testing is inherently incomplete, test selection is of vital importance. Coverage measures evaluate the quality of a test suite and help the tester select test cases with maximal impact at minimum cost. Existing coverage criteria for test suites are usually defined in terms of syntactic characteristics of the implementation under test or its specification. Typical black-box coverage metrics are state and transition coverage of the specification. White-box testing often considers statement, condition and path coverage. A disadvantage of this syntactic approach is that different coverage figures are assigned to systems that are behaviorally equivalent, but syntactically different. Moreover, those coverage metrics do not take into account that certain failures are more severe than others, and that more testing effort should be devoted to uncover the most important bugs, while less critical system parts can be tested less thoroughly. This paper introduces a semantic approach to test coverage. Our starting point is a weighted fault model, which assigns a weight to each potential error in an implementation. We define a framework of coverage measures that express how well a test suite covers such a specification, taking into account the error weight. Since our notions are semantic, they are insensitive to replacing a specification by one with equivalent behaviour. We present several algorithms that, given a certain minimality criterion, compute a minimal test suite with maximal coverage. These algorithms work on a syntactic representation of weighted fault models as fault automata. They are based on existing and novel optimization problems. Finally, we illustrate our approach by analyzing and comparing a number of test suites for a chat protocol.
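    A hedged sketch of the weighted-coverage idea (hypothetical names; the paper's actual definitions are stated over fault automata, not plain sets): score a test suite by the total weight of the potential errors it can detect, optionally normalised by the weight of all potential errors.

```python
# Hypothetical sketch of weighted-fault coverage: each potential error carries a
# weight, and a test suite is scored by the weight of the errors it can detect.
def absolute_coverage(fault_weights, test_suite):
    """fault_weights: dict fault -> weight; test_suite: list of sets of faults
    that each test can detect. Returns the total weight of detectable faults."""
    detected = set().union(*test_suite) if test_suite else set()
    return sum(w for f, w in fault_weights.items() if f in detected)

def relative_coverage(fault_weights, test_suite):
    """Detected weight divided by the total weight of all potential errors."""
    total = sum(fault_weights.values())
    return absolute_coverage(fault_weights, test_suite) / total if total else 0.0

# Example: two tests covering different (overlapping) faults.
weights = {"lost_msg": 10.0, "wrong_ack": 5.0, "late_reply": 1.0}
suite = [{"lost_msg"}, {"lost_msg", "wrong_ack"}]
print(absolute_coverage(weights, suite))   # 15.0
print(relative_coverage(weights, suite))   # 0.9375
```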

    A Semantic Framework for Test Coverage

    Since testing is inherently incomplete, test selection is of vital importance. Coverage measures evaluate the quality of a test suite and help the tester select test cases with maximal impact at minimum cost. Existing coverage criteria for test suites are usually defined in terms of syntactic characteristics of the implementation under test or its specification. Typical black-box coverage metrics are state and transition coverage of the specification. White-box testing often considers statement, condition and path coverage. A disadvantage of this syntactic approach is that different coverage figures are assigned to systems that are behaviorally equivalent, but syntactically different. Moreover, those coverage metrics do not take into account that certain failures are more severe than others, and that more testing effort should be devoted to uncover the most important bugs, while less critical system parts can be tested less thoroughly. This paper introduces a semantic approach to test coverage. Our starting point is a weighted fault model, which assigns a weight to each potential error in an implementation. We define a framework of coverage measures that express how well a test suite covers such a specification, taking into account the error weight. Since our notions are semantic, they are insensitive to replacing a specification by one with equivalent behaviour. We present several algorithms that, given a certain minimality criterion, compute a minimal test suite with maximal coverage. These algorithms work on a syntactic representation of weighted fault models as fault automata. They are based on existing and novel optimization problems. Finally, we illustrate our approach by analyzing and comparing a number of test suites for a chat protocol.

    Morphometric Analysis to Characterize the Differentiation of Mesenchymal Stem Cells into Smooth Muscle Cells in Response to Biochemical and Mechanical Stimulation

    The morphology and biochemical phenotype of cells are closely linked. This relationship is important in progenitor cell bioengineering, which generates functional, tissue-specific cells from uncommitted precursors. Advances in biofabrication have demonstrated that cell shape can regulate cell behavior and alter phenotype-specific functions. Establishing accessible and rigorous techniques for quantifying cell shape will therefore facilitate assessment of cellular responses to environmental stimuli, and will enable more comprehensive understanding of developmental, pathological, and regenerative processes. For progenitor cells being induced into specific lineages, this ability becomes a pertinent means for validating their degree of differentiation and may lead to novel strategies for controlling cell phenotype. In our approach, we used the differentiation of adult human mesenchymal stem cells (MSCs) into smooth muscle cells (SMCs) as a model system to investigate the relationship between cell shape and phenotype. These cell types are responsive to mechanical and biochemical stimuli and the shape of SMCs is a recognized marker of differentiated state, providing a system in which morphological and biochemical phenotype are both understood and inducible. By applying exogenous stimuli, we changed cell shape and examined the corresponding cellular phenotype. In the first Aim, we applied stretch to MSCs on 2D collagen sheets to promote differentiation. Using mathematical shape factors, we quantified the morphological changes in response to defined stretch parameters. In the second Aim, we investigated the use of input energy as a means of controlling cell shape and corresponding differentiation. We examined how combinations of stretch parameters that produce equal energy input impacted morphology, and postulated that cell shape is a function of energy input. In the third Aim, we translated our method of quantifying shape factors into 3D culture, and validated the method by investigating the differentiation of MSCs into SMCs by mechanical and growth factor stimulation. We used the shape factors to quantify morphological differences and compared these changes to biochemical markers. Our results demonstrate that mechanical stretch influences multiple aspects of MSC phenotype, including cell morphology. Shape factors described these changes objectively and quantitatively, and enabled the identification of relationships between SMC shape and differentiated state. Similar morphological responses could be induced using different combinations of stretch parameters that resulted in equal energy input. Cell shape followed a linear relationship with energy input despite the variance introduced by using MSCs from different patients. Only one SMC gene marker directly exhibited this relationship; however, partial least squares regression analysis revealed that other genes were also associated with shape factors. Translation of the shape quantification method into 3D systems revealed that while the additional dimensionality hindered comparison of morphology between 2D and 3D samples, these shape factors were still applicable within 3D systems. Differences in cell morphology caused by growth factors and mechanical stretch in 3D constructs were elucidated by shape analysis, and these phenotypic changes were corroborated through biochemical assays. Taken together, these results validate the use of cell shape as a means of characterizing phenotype and the process of progenitor cell differentiation.
    The automated method we developed generates a robust set of morphological parameters that provide a way to characterize the differentiation of MSCs into SMCs. This work has implications for our understanding of the relationship between cell morphology and phenotype, and may lead to new ways to control and improve differentiation efficiency in a variety of cell and tissue systems.
    PhD, Biomedical Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. https://deepblue.lib.umich.edu/bitstream/2027.42/145833/1/brandanw_1.pd
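    For readers unfamiliar with the "mathematical shape factors" mentioned above, the sketch below shows the kind of standard descriptors such analyses typically rely on (hypothetical code, not the authors' automated pipeline): circularity, aspect ratio, and elongation computed from a cell outline's area, perimeter, and fitted-ellipse axes.

```python
# Hypothetical sketch of common morphometric shape factors; not the authors'
# pipeline, just the standard formulas such descriptors are usually based on.
import math

def circularity(area, perimeter):
    """4*pi*A / P^2: equals 1.0 for a perfect circle, smaller for irregular shapes."""
    return 4.0 * math.pi * area / (perimeter ** 2)

def aspect_ratio(major_axis, minor_axis):
    """Ratio of the fitted ellipse's major to minor axis; values > 1 indicate elongation."""
    return major_axis / minor_axis

def elongation(major_axis, minor_axis):
    """1 - minor/major: 0 for a circle, approaching 1 for a spindle-like cell."""
    return 1.0 - minor_axis / major_axis

# Example: an elongated, SMC-like outline (illustrative numbers in pixels).
print(circularity(area=500.0, perimeter=120.0))        # ~0.44
print(aspect_ratio(major_axis=60.0, minor_axis=12.0))  # 5.0
print(elongation(major_axis=60.0, minor_axis=12.0))    # 0.8
```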

    Management of Coastal Navigation Channels Based on Vessel Underkeel Clearance in Transit

    The United States Army Corps of Engineers (USACE) spends approximately $2 billion annually to investigate, construct, and maintain projects in its portfolio of coastal navigation infrastructure. Of that expenditure, approximately $1 billion is spent annually on maintenance dredging to increase the depth of maintained channels. The USACE prioritizes maintenance funding using a variety of metrics reflecting the amount of cargo moving through maintained projects but does not directly consider the reduction in the likelihood for the bottom of a vessel's hull to make contact with the bottom of the channel that results from maintenance dredging investments. Net underkeel clearance, which remains between the channel bottom and the vessel’s keel after considering several important factors that act to increase the necessary keel depth, is used as an indicator of potential reduction of navigation safety. This dissertation presents a model formulated to estimate net underkeel clearance using archival Automatic Identification System (AIS) data and applies it to the federal navigation project in Charleston, South Carolina. Observations from 2011, including 3,961 vessel transits, are used to determine the probability that a vessel will have less than 0 feet of net underkeel clearance as it transits from origin to destination. The probability that a vessel had net underkeel clearance greater than or equal to 0 feet was 0.993. A Monte-Carlo approach is employed to prioritize the order of reach maintenance improvements. A value heuristic is used to rank 7,500 dredging alternatives; 159 options were identified that meet an arbitrarily selected minimum reliability of 0.985. Cost reductions associated with options that met the minimum reliability requirement ranged from 7.7% to 42.6% on an annualized basis. Fort Sumter Range, Hog Island Reach, and Wando Lower Reach are identified as the most important reaches to maintain. The underkeel clearance reliability model developed in this work provides a more accurate representation of the waterway users’ ability to safely transit dredged channels with respect to available depth than is currently available to USACE waterway managers. The transit reliability metric developed provides an accurate representation of the benefit obtained from channel dredging investments, and directly relates the benefit to dredging cost.
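    The transit-reliability idea lends itself to a simple Monte-Carlo sketch (the parameters and distributions below are purely illustrative placeholders, not the dissertation's AIS-calibrated model): sample the factors that consume keel clearance, compute net underkeel clearance, and estimate the probability that it stays at or above zero.

```python
# Hypothetical Monte-Carlo sketch of net underkeel clearance (UKC) reliability;
# all depths, drafts, and distributions are illustrative, not calibrated values.
import random

def net_ukc(channel_depth, tide, draft, squat, wave_allowance):
    """Water remaining under the keel after subtracting everything that consumes it."""
    return (channel_depth + tide) - (draft + squat + wave_allowance)

def transit_reliability(n_samples=100_000, seed=1):
    """Estimate P(net UKC >= 0 ft) for a single illustrative reach."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n_samples):
        tide = rng.uniform(0.0, 5.0)    # ft above datum at transit time
        draft = rng.gauss(42.0, 1.5)    # ft, vessel static draft
        squat = rng.uniform(0.5, 2.5)   # ft, speed-dependent sinkage
        waves = rng.uniform(0.0, 1.5)   # ft, vertical wave motion allowance
        if net_ukc(channel_depth=45.0, tide=tide, draft=draft,
                   squat=squat, wave_allowance=waves) >= 0.0:
            ok += 1
    return ok / n_samples

print(transit_reliability())  # e.g. roughly 0.9 with these illustrative numbers
```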

    OPSE 310 - 101: Virtual Instrumentation
