    Towards Integrated Variant Management in Global Software Engineering: An Experience Report

    In the automotive domain, customer demands and market constraints are increasingly realized by electric/electronic components and the corresponding software. Variant traceability in software product lines (SPLs) is crucial for tasks such as change impact analysis, especially in complex global software projects. In addition, traceability concepts must be extended by partly automated variant configuration mechanisms that handle restrictions and dependencies between variants. Such a variant configuration mechanism helps to reduce complexity when configuring a valid variant and to establish explicit documentation of the dependencies between components. However, integrated variant management has not been sufficiently addressed so far. In particular, the increasing number of software variants requires an examination of traceable and configurable software variants over the software lifecycle. This paper emphasizes variant traceability achievements in a large global software engineering project, elaborates on existing challenges, and evaluates the industrial use of integrated variant management based on these experiences.
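    To make the idea of a partly automated variant configuration mechanism concrete, the following minimal sketch checks a selected set of components against requires/excludes dependencies. The component names and rule format are hypothetical illustrations and are not taken from the project reported here.

```python
# Minimal sketch of a variant configuration validity check. Component names
# and the requires/excludes rules are hypothetical, for illustration only.

REQUIRES = {
    "adaptive_cruise_control": {"radar_sensor"},    # this variant needs the radar component
    "lane_keeping": {"front_camera"},
}
EXCLUDES = {
    "base_infotainment": {"premium_infotainment"},  # mutually exclusive variants
}

def validate_variant(selected):
    """Return human-readable violations for a selected set of components."""
    violations = []
    for comp in selected:
        for dep in REQUIRES.get(comp, set()):
            if dep not in selected:
                violations.append(f"{comp} requires {dep}")
        for ex in EXCLUDES.get(comp, set()):
            if ex in selected:
                violations.append(f"{comp} excludes {ex}")
    return violations

# Example: the radar component was forgotten, so the configuration is invalid.
print(validate_variant({"adaptive_cruise_control", "lane_keeping", "front_camera"}))
# -> ['adaptive_cruise_control requires radar_sensor']
```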

    Supporting Multi-Criteria Decision Support Queries over Disparate Data Sources

    In the era of the big data revolution, marked by an exponential growth of information, extracting value from data enables analysts and businesses to address challenging problems such as drug discovery, fraud detection, and earthquake prediction. Multi-Criteria Decision Support (MCDS) queries are at the core of big-data analytics, giving rise to several classes of MCDS queries such as OLAP, top-k, Pareto-optimal, and nearest-neighbor queries. The intuitive nature of specifying multi-dimensional preferences has made Pareto-optimal queries, also known as skyline queries, popular. Existing skyline algorithms, however, do not address several crucial issues, such as performing skyline evaluation over disparate sources, progressively generating skyline results, or robustly handling workloads with multiple skyline-over-join queries. In this dissertation we thoroughly investigate topics in the area of skyline-aware query evaluation. We first propose a novel execution framework called SKIN that treats skylines over joins as first-class citizens during query processing. This is in contrast to existing techniques that treat skylines as an add-on, loosely integrated with query processing by being placed on top of the query plan. SKIN is effective in exploiting the skyline characteristics of the tuples within individual data sources as well as across disparate sources. This enables SKIN to significantly reduce two primary costs, namely the cost of generating the join results and the cost of the skyline comparisons needed to compute the final results. Second, we address the crucial business need to report results early, as soon as they are generated, so that users can formulate competitive decisions in near real time. On top of SKIN, we built a progressive query evaluation framework, ProgXe, that transforms the execution of queries involving skylines over joins into a non-blocking process, i.e., one that progressively generates results early and often. By exploiting SKIN's principle of processing queries at multiple levels of abstraction, ProgXe is able to: (1) extract the output dependencies in the output space by analyzing both the input and output spaces, and (2) exploit this knowledge of abstract-level relationships to guarantee the correctness of early output. Third, real-world applications handle query workloads with diverse Quality of Service (QoS) requirements, also referred to as contracts. Time-sensitive queries, such as fraud detection, require results to be output progressively with minimal delay, while ad-hoc and reporting queries can tolerate delay. Building on the principles of ProgXe, we propose the Contract-Aware Query Execution (CAQE) framework to support the open problem of contract-driven multi-query processing. CAQE employs an adaptive execution strategy that continuously monitors the run-time satisfaction of queries and aggressively takes corrective steps whenever the contracts are not being met. Lastly, to demonstrate the portability of the core principle of this dissertation, namely reasoning and query processing at different levels of data abstraction, we apply it to an orthogonal research question: auto-generating recommendation queries that help users explore a complex database system. User queries are often too strict or too broad, requiring a frustrating trial-and-error refinement process to meet the desired result cardinality while preserving the original query semantics. Based on the principles of SKIN, we propose CAPRI to automatically generate refined queries that: (1) attain the desired cardinality and (2) minimize changes to the original query intentions. In our comprehensive experimental study of each part of this dissertation, we demonstrate the superiority of the proposed strategies over state-of-the-art techniques in both efficiency and resource consumption.
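    As background for the skyline (Pareto-optimal) queries discussed above, the following minimal block-nested-loops sketch shows the dominance test at the heart of skyline computation. It illustrates only the basic operator, not the SKIN, ProgXe, CAQE, or CAPRI machinery of the dissertation, and the hotel example data are invented.

```python
# Minimal block-nested-loops skyline (Pareto-optimal) computation.
# Smaller values are assumed to be preferred on every criterion.

def dominates(a, b):
    """a dominates b if a is no worse in every dimension and strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(points):
    result = []
    for p in points:
        if any(dominates(q, p) for q in result):
            continue                                          # p is dominated: discard it
        result = [q for q in result if not dominates(p, q)]   # drop points that p dominates
        result.append(p)
    return result

# Example: hotels as (price, distance); cheaper and closer is better.
print(skyline([(100, 5), (80, 7), (120, 2), (80, 6)]))
# -> [(100, 5), (120, 2), (80, 6)]   # (80, 7) is dominated by (80, 6)
```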

    Forecasting dose-time profiles of solar particle events using a dosimetry-based Bayesian forecasting methodology

    A dosimetry-based Bayesian methodology for forecasting astronaut radiation doses in deep space due to radiologically significant solar particle event proton fluences is developed. Three non-linear sigmoidal growth curves (Gompertz, Weibull, logistic) are used with hierarchical, non-linear regression models to forecast solar particle event dose-time profiles from doses obtained early in the development of the event. Since there are no detailed measurements of dose versus time for actual events, surrogate dose data are provided by calculational methods. Proton fluence data are used as input to the deterministic, coupled neutron-proton space radiation computer code BRYNTRN for transporting protons and their reaction products (protons, neutrons, ²H, ³H, ³He, and ⁴He) through aluminum shielding material and water. Calculated doses and dose rates for ten historical solar particle events are used as the input data, grouping similar historical solar particle events with asymptotic dose and maximum dose rate as the grouping criteria. These historical data are then used to lend strength to predictions of dose and dose rate-time profiles for new solar particle events. Bayesian inference techniques are used to make parameter estimates and predictive forecasts. Due to the difficulty of performing the numerical integrations necessary to calculate posterior parameter distributions and posterior predictive distributions, Markov chain Monte Carlo (MCMC) methods are used to sample from the posterior distributions. Hierarchical, non-linear regression models provide useful predictions of asymptotic dose and dose-time profiles for the November 8, 2000 and August 12, 1989 solar particle events. Predicted dose rate-time profiles are adequate for the November 8, 2000 solar particle event. Predictions of dose rate-time profiles for the August 12, 1989 solar particle event suffer due to a more complex dose rate-time profile. Model assessment indicates adequate fits of the data. Model comparison results clearly indicate a preference for the Weibull model for both events. Forecasts provide a valuable tool to space operations planners when making recommendations concerning operations in which radiological exposure might jeopardize personal safety or mission completion. This work demonstrates that Bayesian inference methods can be used to make forecasts of dose and dose rate-time profiles early in the evolution of solar particle events. Bayesian inference methods provide a coherent methodology for quantifying uncertainty. Hierarchical models provide a natural framework for the prediction of new solar particle event dose and dose rate-time profiles.
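    For reference, one common parameterization of the three sigmoidal growth curves named above is shown below; the dissertation may use different but equivalent forms, and the symbols here (the asymptotic dose and the rate/onset parameters) are assumptions for illustration.

```latex
% Assumed sigmoidal dose-time parameterizations, with D_infinity the asymptotic
% dose and b, k, lambda, t_0 rate and onset parameters (forms may differ from
% those used in the dissertation):
\begin{align}
  D(t) &= D_\infty \, \exp\!\left(-b \, e^{-k t}\right)       && \text{(Gompertz)} \\
  D(t) &= D_\infty \left(1 - e^{-(t/\lambda)^{k}}\right)      && \text{(Weibull)} \\
  D(t) &= \frac{D_\infty}{1 + e^{-k (t - t_0)}}               && \text{(logistic)}
\end{align}
```

    In the hierarchical Bayesian setting described above, the curve parameters for a new event receive priors informed by the group of similar historical events, and MCMC sampling of the joint posterior yields the predictive dose-time and dose rate-time profiles.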

    Microsimulation as an Instrument to Evaluate Economic and Social Programmes

    In recent years, microsimulation models (MSMs) have been increasingly applied in quantitative analyses of the individual impacts of economic and social programme policies. This paper discusses the suitability of microsimulation as an instrument to analyze main and side policy impacts at the individual level by characterizing the general approach and principles of the two main microsimulation approaches, static and dynamic (cross-section and life-cycle) microsimulation; the structure of MSMs with institutional regulations and behavioural response; panel data and behavioural change; deterministic and stochastic microsimulation; and the 4M-strategy of combining microtheory, microdata, microestimation and microsimulation; and by pinpointing applications and recent developments. To demonstrate the evaluation of economic and social programmes by microsimulation, two examples are briefly described: a dynamic (cross-section and life-cycle) microsimulation of the German retirement pension reform, and a combined static/dynamic microsimulation of the recent German tax reform with its behavioural impacts on the formal and informal economic activities of private households. An assessment of evaluating economic and social programmes with microsimulation models is finally followed by concluding remarks on some future developments.
    Keywords: microsimulation, evaluation of economic and social-political programmes
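    As a toy illustration of the static microsimulation idea discussed above (applying a policy rule to individual micro-units and aggregating, without modelling behavioural response), the sketch below compares a stylised status-quo tax rule with a hypothetical reform; all records, rates, and thresholds are invented.

```python
# Toy static microsimulation: apply a stylised tax rule to individual
# household records and compare aggregate revenue under two scenarios.
# Records, rates, and thresholds are invented for illustration only.

households = [
    {"id": 1, "income": 18_000},
    {"id": 2, "income": 42_000},
    {"id": 3, "income": 95_000},
]

def tax(income, allowance, rate):
    """Flat-rate tax on income above a basic allowance (deliberately simple)."""
    return max(0.0, income - allowance) * rate

def simulate(allowance, rate):
    """Aggregate tax revenue over the micro-units under one rule."""
    return sum(tax(h["income"], allowance, rate) for h in households)

baseline = simulate(allowance=10_000, rate=0.30)   # stylised status-quo rule
reform   = simulate(allowance=12_000, rate=0.28)   # hypothetical reform scenario
print(f"aggregate revenue change: {reform - baseline:+.2f}")
```

    A dynamic microsimulation would, in addition, age the micro-units over time and model behavioural responses to the changed rules.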

    Curracurrong: a stream processing system for distributed environments

    Advances in technology have given rise to applications that are deployed on wireless sensor networks (WSNs), the cloud, and the Internet of Things. Many emerging applications, such as sensor-based monitoring, web traffic processing, and network monitoring, collect large amounts of data as an unbounded sequence of events and process them to generate new sequences of events. Such applications need an adequate programming model that can process large amounts of data with minimal latency; for this purpose, stream programming, among other paradigms, is ideal. However, stream programming needs to be adapted to meet the challenges inherent in running it in distributed environments. These challenges include the need for a modern domain-specific language (DSL), the placement of computations in the network to minimise energy costs, and timeliness in real-time applications. To overcome these challenges we developed a stream programming model that provides an easy-to-use programming interface, energy-efficient actor placement, and timeliness. This thesis presents Curracurrong, a stream data processing system for distributed environments. In Curracurrong, a query is represented as a stream graph of stream operators and communication channels. Curracurrong provides an extensible stream operator library and adapts to a wide range of applications. It uses an energy-efficient placement algorithm that optimises communication and computation. We extend the placement problem to support dynamically changing networks, and develop a dynamic program with polynomially bounded runtime to solve the placement problem. In many stream-based applications, real-time data processing is essential. We propose an approach that measures time delays in stream query processing; this model measures the total computational time from input to output of a query, i.e., the end-to-end delay.
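    The following minimal sketch illustrates, in a much simplified form, the idea of a query as a stream graph of operators connected by channels, together with an end-to-end delay measurement from input to output. The operator names, the windowed-average operator, and the timing code are illustrative assumptions and not Curracurrong's actual API.

```python
# Minimal sketch of a stream query as operators connected by channels,
# with end-to-end delay measured from input arrival to result output.
# Names and structure are hypothetical, not Curracurrong's API.

import time
from collections import deque

class Channel:
    """FIFO communication channel between two stream operators."""
    def __init__(self):
        self.buffer = deque()
    def push(self, item):
        self.buffer.append(item)
    def pull(self):
        return self.buffer.popleft() if self.buffer else None

def source(out, readings):
    """Source operator: stamp each reading with its arrival time."""
    for r in readings:
        out.push((time.monotonic(), r))

def avg_filter(inp, out, window=3):
    """Tumbling-window average; forwards the newest input timestamp."""
    buf = []
    while (item := inp.pull()) is not None:
        t0, value = item
        buf.append(value)
        if len(buf) == window:
            out.push((t0, sum(buf) / window))
            buf.clear()

def sink(inp):
    """Sink operator: report results and their end-to-end delay."""
    while (item := inp.pull()) is not None:
        t0, value = item
        delay = time.monotonic() - t0
        print(f"result={value:.2f} end_to_end_delay={delay * 1e6:.1f}us")

# Wire the stream graph source -> avg_filter -> sink and run it.
c1, c2 = Channel(), Channel()
source(c1, [21.0, 22.5, 23.0, 24.0, 22.0, 21.5])
avg_filter(c1, c2)
sink(c2)
```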

    Model-Based Engineering of Collaborative Embedded Systems

    This open-access book presents the results of the "Collaborative Embedded Systems" (CrESt) project, which adapted and complemented the modeling techniques of the SPES development methodology to cope with the challenges posed by the dynamic structures of collaborative embedded systems (CESs). To manage the high complexity of the individual systems and of the interaction structures that form dynamically at runtime, advanced and powerful development methods are required that extend the current state of the art in the development of embedded and cyber-physical systems. The methodological contributions of the project support the effective and efficient development of CESs in dynamic and uncertain contexts, with special emphasis on the reliability and variability of individual systems and on the creation of networks of such systems at runtime. The project was funded by the German Federal Ministry of Education and Research (BMBF), and the case studies were therefore selected from areas that are highly relevant for Germany's economy (automotive, industrial production, power generation, and robotics). The project also supports the digitalization of complex and transformable industrial plants in the context of the German government's "Industry 4.0" initiative, and the project results provide a solid foundation for implementing the German government's high-tech strategy "Innovations for Germany" in the coming years.